Search results for: accuracy assessment.
7952 Comparative Life Cycle Assessment of an Extensive Green Roof with a Traditional Gravel-Asphalted Roof: An Application for the Lebanese Context
Authors: Makram El Bachawati, Rima Manneh, Thomas Dandres, Carla Nassab, Henri El Zakhem, Rafik Belarbi
Abstract:
A vegetative roof, also called a garden roof, is a "roofing system that endorses the growth of plants on a rooftop". Garden roofs serve several purposes for a building, such as embellishing the roofing system, enhancing water management, and reducing energy consumption and heat island effects. Lebanon is a Middle Eastern country that lacks a sustainable energy system: it imports 98% of its non-renewable energy from neighboring countries and suffers flooding during heavy rains. The objective of this paper is to determine whether vegetative roofs actually perform better than traditional roofs in the Lebanese context. A Life Cycle Assessment (LCA) is performed to compare an existing extensive green roof with a traditional gravel-asphalted roof. The life cycle inventory (LCI) was established and modeled using the SimaPro 8.0 software, while the environmental impacts were classified using the IMPACT 2002+ methodology. Results indicated that, for the existing extensive green roof, the waterproofing membrane and the growing medium were the highest contributors to the potential environmental impacts. When comparing the vegetative roof to the traditional roof, results showed that, for all impact categories, the extensive green roof had the lower environmental impact.
Keywords: life cycle assessment, green roofs, vegetative roof, environmental impact
Procedia PDF Downloads 462
7951 The Taiwan Environmental Impact Assessment Act Contributes to the Water Resources Saving
Authors: Feng-Ming Fan, Xiu-Hui Wen
Abstract:
Shortage of water resources is a crucial problem to be solved in Taiwan. However, the lack of effective and mandatory regulation on water recovery and recycling means that water resources are not effectively controlled at present. Although existing legislation sets standards for water recovery, the implementation and enforcement of that legislation face challenges. To break through this dilemma, this study aims to find enforcement tools, improve inspection skills, and develop an inspection system in order to achieve sustainable development of precious water resources. The Taiwan Environmental Impact Assessment Act (EIA Act) was promulgated in 1994. The aim of the EIA Act is to protect the environment by preventing and mitigating the adverse impact of development activities on the environment. During the EIA process, standards can be set that require enterprises to reach a certain percentage of water recycling based on the characteristics of each case, promoting sewage source reduction and water-saving benefits. Next, we inspected how the enterprises handle their wastewater and perform water recovery against their environmental assessment commitments, in order to review and measure the implementation efficiency of water recycling and reuse, an eco-friendly measure. We invited leading experts in related fields to lecture on water recycling, strengthened law enforcement officials' inspection knowledge, and wrote an inspection reference manual to serve as the basis of enforcement. The manual was finalized by reaching mutual agreement between the experts and the relevant agencies. We then inspected 65 high-tech companies, each with daily water consumption of over 1,000 tons, located in three science parks set up by the Ministry of Science and Technology. A great achievement in water recycling was reached, amounting to 400 million tons per year, equivalent to 2.5 months of water usage by the general public in Taiwan. The amount is equal to 710 billion 600 ml bottles of cola, 170 thousand international-standard swimming pools of 2,500 tons, the irrigation water applied to 40 thousand hectares of rice fields, or 1.7 times the storage of the Taipei Feitsui Reservoir. This study demonstrated the promoting effect of environmental impact assessment commitments on water recycling and, therefore, on the sustainable development of water resources. It also confirms the value of the EIA Act for environmental protection. Economic development should go hand in hand with environmental protection, and this is now the mainstream view. The EIA regulation clearly minimizes the harmful effects of development activities on the environment while pursuing the sustainable development of water resources.
Keywords: the environmental impact assessment act, water recycling environmental assessment commitment, water resource sustainable development, water recycling, water reuse
Procedia PDF Downloads 246
7950 Dynamic Fault Tree Analysis of Dynamic Positioning System through Monte Carlo Approach
Authors: A. S. Cheliyan, S. K. Bhattacharyya
Abstract:
The Dynamic Positioning System (DPS) is employed in marine vessels of the offshore oil and gas industry. It is a computer-controlled system that automatically maintains a ship's position and heading using its own thrusters. Its reliability can be assessed through a conventional fault tree. However, complex behaviours such as sequence-dependent failures, redundancy management, and the priority of failure events cannot be analyzed by conventional fault trees. The Dynamic Fault Tree (DFT) addresses these shortcomings of the conventional fault tree by defining additional gates called dynamic gates. A Monte Carlo based simulation approach has been adopted for the dynamic gates. This realistic modeling of the DPS gives meaningful insight into the system's reliability and into how it can be improved.
Keywords: dynamic positioning system, dynamic fault tree, Monte Carlo simulation, reliability assessment
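To make the Monte Carlo treatment of a dynamic gate concrete, the sketch below simulates a priority-AND (PAND) gate, one of the dynamic gates a DFT adds; the failure rates and mission time are invented for illustration and are not taken from the paper.

```python
import random

# Monte Carlo estimate of the failure probability of a priority-AND (PAND)
# gate: the gate fails only if component A fails before component B, and both
# fail within the mission time. Rates and mission time are hypothetical.
RATE_A, RATE_B = 1e-3, 5e-4   # assumed failure rates [1/h]
MISSION = 1000.0              # assumed mission time [h]
N = 100_000                   # number of Monte Carlo trials

failures = 0
for _ in range(N):
    t_a = random.expovariate(RATE_A)   # exponentially distributed lifetimes
    t_b = random.expovariate(RATE_B)
    if t_a < t_b <= MISSION:
        failures += 1

print(f"Estimated PAND failure probability: {failures / N:.4f}")
```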
Procedia PDF Downloads 771
7949 Examining Reading Comprehension Skills Based on Different Reading Comprehension Frameworks and Taxonomies
Authors: Seval Kula-Kartal
Abstract:
Developing students' reading comprehension skills is an aim that is difficult to accomplish and requires long-term, systematic teaching and assessment processes. In these processes, teachers need tools that guide them on what reading comprehension is and which comprehension skills they should develop. Due to the lack of clear and evidence-based frameworks defining reading comprehension skills, especially in Turkiye, teachers and students mostly follow various classroom processes without a clear idea of what their comprehension goals are or what those goals mean. Since teachers and students do not have a clear view of the comprehension targets and of the strengths and weaknesses in students' comprehension skills, formative feedback processes cannot be managed effectively. Detecting and defining influential comprehension skills may therefore provide guidance to both teachers and students during the feedback process. In the current study, some of the reading comprehension frameworks that define comprehension skills operationally were examined. The aim of the study is to develop a simple and clear framework that teachers and students can use during their teaching, learning, assessment, and feedback processes. The current study is qualitative research in which documents related to reading comprehension skills were analyzed. The study group therefore consisted of resources and frameworks that have made major contributions to the theoretical and operational definitions of reading comprehension. A content analysis was conducted on the resources included in the study group. To determine the validity of the themes and sub-categories revealed by the content analysis, three educational assessment experts were asked to examine the results. The Fleiss' Kappa coefficient indicated consistency among the themes and categories defined by the three experts. The content analysis of the reading comprehension frameworks revealed that comprehension skills can be examined under four themes. The first and second themes focus on understanding information given explicitly or implicitly within a text. The third theme includes the skills readers use to make connections between their personal knowledge and the information given in the text. Lastly, the fourth theme focuses on the skills readers use to examine the text with a critical view. The results suggest that fundamental reading comprehension skills can be examined under four themes, and teachers are recommended to use these themes in their reading comprehension teaching and assessment processes. Acknowledgment: This research is supported by the Pamukkale University Scientific Research Unit within the project titled Developing A Reading Comprehension Rubric.
Keywords: reading comprehension, assessing reading comprehension, comprehension taxonomies, educational assessment
Procedia PDF Downloads 82
7948 The Best Prediction Data Mining Model for Breast Cancer Probability in Women Residents in Kabul
Authors: Mina Jafari, Kobra Hamraee, Saied Hossein Hosseini
Abstract:
The prediction of breast cancer disease is one of the challenges in medicine. In this paper, we collected 528 records of women living in Kabul, including demographic, lifestyle, diet, and pregnancy data. There are many classification algorithms for breast cancer prediction, and we tried to find the best model with the most accurate results and the lowest error rate. We evaluated several common supervised data mining algorithms to find the best model for predicting breast cancer among Afghan women living in Kabul, with the mammography result as the target variable. To evaluate these algorithms, we used cross-validation, a well-established method for measuring model performance. After comparing the error rate and accuracy of three models (Decision Tree, Naive Bayes, and Rule Induction), the Decision Tree, with an accuracy of 94.06% and an error rate of 15%, was found to be the best model for predicting breast cancer based on the health care records.
Keywords: decision tree, breast cancer, probability, data mining
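As a rough illustration of the evaluation procedure described above, the sketch below runs k-fold cross-validation on a decision tree classifier; the file name, target column, and fold count are placeholders, not the study's actual data or settings.

```python
import pandas as pd
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Hypothetical dataset: the study's 528 records are not publicly referenced here.
df = pd.read_csv("kabul_breast_cancer.csv")
X = df.drop(columns=["mammography_result"])   # assumed target column name
y = df["mammography_result"]

model = DecisionTreeClassifier(random_state=0)
scores = cross_val_score(model, X, y, cv=10, scoring="accuracy")
print(f"Mean accuracy: {scores.mean():.4f}, error rate: {1 - scores.mean():.4f}")
```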
Procedia PDF Downloads 136
7947 Optimized Deep Learning-Based Facial Emotion Recognition System
Authors: Erick C. Valverde, Wansu Lim
Abstract:
Facial emotion recognition (FER) systems have recently been developed for more advanced computer vision applications. The ability to identify human emotions would enable smart healthcare facilities to diagnose mental health illnesses (e.g., depression and stress) and would support better human social interaction with smart technologies. The FER system involves two steps: 1) a face detection task and 2) a facial emotion recognition task. It classifies human expressions into categories such as angry, disgust, fear, happy, sad, surprise, and neutral. Such a system requires intensive research to address issues arising from human diversity, unique human expressions, and the variety of human facial features due to age differences. These issues generally limit the ability of FER systems to detect human emotions with high accuracy. Early FER systems used simple supervised classification algorithms such as K-nearest neighbors (KNN) and artificial neural networks (ANN). These conventional FER systems suffer from low accuracy because they cannot efficiently extract the significant features of several human emotions. To increase the accuracy of FER systems, deep learning (DL)-based methods, such as convolutional neural networks (CNN), have been proposed. These methods can find more complex features in the human face through the deeper connections within their architectures. However, the inference speed and computational cost of a DL-based FER system are often disregarded in exchange for higher accuracy. To cope with this drawback, an optimized DL-based FER system is proposed in this study. An extreme version of Inception V3, known as the Xception model, is leveraged by applying different network optimization methods. Specifically, network pruning and quantization are used to lower computational costs and reduce memory usage, respectively. To support low resource requirements, a 68-landmark face detector from Dlib is used in the early step of the FER system. Furthermore, a DL compiler is utilized to apply advanced optimization techniques to the Xception model and improve the inference speed of the FER system. In comparison to VGG-Net and ResNet50, the proposed optimized DL-based FER system experimentally demonstrates the benefits of the network optimization methods used. As a result, the proposed approach can be used to create an efficient, real-time FER system.
Keywords: deep learning, face detection, facial emotion recognition, network optimization methods
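The sketch below illustrates the two optimization steps named above, magnitude pruning and post-training quantization, applied to a Keras Xception model; the sparsity target, input size, and class count are assumptions, and the paper's DL-compiler step and exact training pipeline are not reproduced.

```python
import tensorflow as tf
import tensorflow_model_optimization as tfmot

# Assumed setup: 7 emotion classes and the minimum Xception input size.
base = tf.keras.applications.Xception(weights=None, classes=7,
                                      input_shape=(71, 71, 3))

# 1) Network pruning: zero out low-magnitude weights (50% sparsity assumed).
pruned = tfmot.sparsity.keras.prune_low_magnitude(
    base,
    pruning_schedule=tfmot.sparsity.keras.ConstantSparsity(0.5, begin_step=0))
pruned.compile(optimizer="adam", loss="categorical_crossentropy")
# pruned.fit(train_ds, epochs=..., callbacks=[tfmot.sparsity.keras.UpdatePruningStep()])

# 2) Post-training quantization: export a reduced-precision TFLite model.
final = tfmot.sparsity.keras.strip_pruning(pruned)
converter = tf.lite.TFLiteConverter.from_keras_model(final)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
with open("fer_xception_optimized.tflite", "wb") as f:
    f.write(converter.convert())
```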
Procedia PDF Downloads 118
7946 A Comparative Analysis of Classification Models with Wrapper-Based Feature Selection for Predicting Student Academic Performance
Authors: Abdullah Al Farwan, Ya Zhang
Abstract:
In today's educational arena, it is critical to understand educational data and to be able to evaluate important aspects of it, particularly data on student achievement. Educational Data Mining (EDM) is a research area that focuses on uncovering patterns and information in data from educational institutions. If teachers are able to predict their students' class performance, they can use this information to improve their teaching. Such knowledge can be used for a wide range of objectives; for example, it can inform a strategic plan for delivering high-quality education. Based on previous data, this paper recommends employing data mining techniques to forecast students' final grades. In this study, five data mining methods, Decision Tree, JRip, Naive Bayes, Multi-layer Perceptron, and Random Forest with wrapper feature selection, were used on two datasets relating to Portuguese language and mathematics lessons. The results showed the effectiveness of data mining learning methodologies in predicting student academic success. The classification accuracy achieved with the selected algorithms lies in the range of 80-94%. Among the selected classification algorithms, the lowest accuracy, approximately 70.45%, is achieved by the Multi-layer Perceptron, and the highest, approximately 94.10%, by the Random Forest. This work can assist educational administrators in identifying poorly performing students at an early stage and perhaps in implementing motivational interventions to improve their academic success and prevent educational dropout.
Keywords: classification algorithms, decision tree, feature selection, multi-layer perceptron, Naïve Bayes, random forest, students' academic performance
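A minimal sketch of the wrapper-based feature selection described above, wrapped around one of the listed classifiers (Random Forest), is given below; the file name, pass/fail threshold, and number of selected features are assumptions based on the public UCI student performance data rather than the paper's exact setup.

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.model_selection import cross_val_score

df = pd.read_csv("student-por.csv", sep=";")          # hypothetical path
X = pd.get_dummies(df.drop(columns=["G3"]))            # G3 = final grade
y = (df["G3"] >= 10).astype(int)                       # assumed pass/fail threshold

rf = RandomForestClassifier(n_estimators=200, random_state=0)
# Wrapper selection: greedily add the features that most improve CV accuracy.
selector = SequentialFeatureSelector(rf, n_features_to_select=10, cv=5)
X_sel = selector.fit_transform(X, y)

print("Selected features:", list(X.columns[selector.get_support()]))
print("CV accuracy:", cross_val_score(rf, X_sel, y, cv=5).mean())
```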
Procedia PDF Downloads 165
7945 Investigation a New Approach "AGM" to Solve of Complicate Nonlinear Partial Differential Equations at All Engineering Field and Basic Science
Authors: Mohammadreza Akbari, Pooya Soleimani Besheli, Reza Khalili, Davood Domiri Danji
Abstract:
Our aims are accuracy, capability, and power in solving complicated nonlinear partial differential equations. Our purpose is to enhance the ability to solve such nonlinear differential equations in basic science and engineering, and similar problems, with a simple and innovative approach. Most engineering systems behave nonlinearly in practice (especially in basic science and engineering), and solving these problems analytically (rather than numerically) is difficult, complex, and sometimes impossible; fluid and gas wave problems, for example, cannot be solved numerically when no boundary conditions are available. Accordingly, we present an innovative approach, which we have named the Akbari-Ganji Method (AGM), that can solve sets of coupled nonlinear differential equations (ODEs, PDEs) with high accuracy and simple solutions; this is demonstrated by comparing the obtained solutions with those of a numerical method (fourth-order Runge-Kutta). We argue that AGM, together with its coding system, can be of great use to researchers, professors, and students worldwide, since the software allows complicated linear and nonlinear partial differential equations to be solved analytically, so that solving nonlinear differential equations presents no further difficulty. The advantages and abilities of this method (AGM) are as follows: (a) Nonlinear differential equations (ODE, PDE) are directly solvable by this method. (b) In most cases, equations can be solved for any number of boundary or initial conditions without a nondimensionalization procedure. (c) AGM is always convergent with respect to the boundary or initial conditions. (d) Exponential, trigonometric, and logarithmic terms in the nonlinear differential equation need no Taylor expansion, which yields high solution precision. (e) AGM is very flexible in its coding system and can easily solve a variety of nonlinear differential equations with acceptably high accuracy. (f) An important advantage of this method is the high-accuracy analytical solution of partial differential equations, such as vibration in solids and waves in water and gas, using the minimum initial and boundary conditions needed to solve the problem. (g) It is very important to present a general and simple approach for solving most highly nonlinear differential equation problems in the engineering sciences, especially in civil engineering, and to compare the output with a numerical method (fourth-order Runge-Kutta) and with exact solutions.
Keywords: new approach, AGM, sets of coupled nonlinear differential equation, exact solutions, numerical
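The abstract repeatedly uses a fourth-order Runge-Kutta (RK4) solution as the numerical reference against which AGM results are compared. The sketch below is a minimal RK4 integrator for one illustrative nonlinear ODE chosen here for its known exact solution; it is not a problem taken from the paper, and AGM itself is not implemented.

```python
# RK4 reference solution for u' = -u^2, u(0) = 1, whose exact solution is
# u(t) = 1 / (1 + t); the example equation is an assumption for illustration.
def rk4(f, u0, t0, t1, n):
    h = (t1 - t0) / n
    t, u = t0, u0
    for _ in range(n):
        k1 = f(t, u)
        k2 = f(t + h / 2, u + h * k1 / 2)
        k3 = f(t + h / 2, u + h * k2 / 2)
        k4 = f(t + h, u + h * k3)
        u += h * (k1 + 2 * k2 + 2 * k3 + k4) / 6
        t += h
    return u

f = lambda t, u: -u ** 2
print("RK4:", rk4(f, 1.0, 0.0, 1.0, 100), "exact:", 1 / (1 + 1.0))
```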
Procedia PDF Downloads 461
7944 Prediction and Analysis of Human Transmembrane Transporter Proteins Based on SCM
Authors: Hui-Ling Huang, Tamara Vasylenko, Phasit Charoenkwan, Shih-Hsiang Chiu, Shinn-Ying Ho
Abstract:
Knowledge of human transporters is still limited due to the technically demanding crystallization procedure required for the structural characterization of transporters by spectroscopic methods. It is therefore desirable to develop bioinformatics tools for the effective analysis of available sequences in order to identify human transmembrane transporter proteins (HMTPs). This study proposes a scoring card method (SCM) based approach for predicting HMTPs. We estimated a set of propensity scores of dipeptides to be HMTPs using SCM from the training dataset (HTS732), consisting of 366 HMTPs and 366 non-HMTPs. Using the estimated propensity scores of the 20 amino acids and 400 dipeptides, SCM has a training accuracy of 87.63% and a test accuracy of 66.46%. The five top-ranked dipeptides are LD, NV, LI, KY, and MN, with scores of 996, 992, 989, 987, and 985, respectively. The five amino acids with the highest propensity scores are Ile, Phe, Met, Gly, and Leu, indicating that hydrophobic residues are mostly highly scored. Furthermore, the obtained propensity scores were used to analyze the physicochemical properties of human transporters.
Keywords: dipeptide composition, physicochemical property, human transmembrane transporter proteins, human transmembrane transporters binding propensity, scoring card method
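The dipeptide composition underlying the SCM propensity scores can be computed directly from a protein sequence, as in the sketch below; the example sequence is invented and the normalization choice is an assumption, not the paper's exact preprocessing.

```python
from itertools import product

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
DIPEPTIDES = ["".join(p) for p in product(AMINO_ACIDS, repeat=2)]   # 400 dipeptides

def dipeptide_composition(seq):
    """Count each of the 400 dipeptides in a sequence and normalize."""
    counts = {dp: 0 for dp in DIPEPTIDES}
    for i in range(len(seq) - 1):
        pair = seq[i:i + 2]
        if pair in counts:
            counts[pair] += 1
    total = max(len(seq) - 1, 1)
    return {dp: c / total for dp, c in counts.items()}

comp = dipeptide_composition("MKTLLILAVVAAALA")   # toy sequence, not real data
print(sorted(comp.items(), key=lambda kv: -kv[1])[:5])
```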
Procedia PDF Downloads 367
7943 Assessment of Noise Pollution in the City of Biskra, Algeria
Authors: Tallal Abdel Karim Bouzir, Nourdinne Zemmouri, Djihed Berkouk
Abstract:
In this research, a quantitative assessment of the urban sound environment of the city of Biskra, Algeria, was conducted. The quality of the soundscape was determined from in-situ measurements, using a Landtek SL5868P sound level meter at 47 points identified to represent the whole city. The results show that the urban noise level varies from 55.3 dB to 75.8 dB during weekdays and from 51.7 dB to 74.3 dB during the weekend. Moreover, 70.20% of the weekday measurements and 55.30% of the weekend measurements show sound intensity levels that exceed the levels allowed by Algerian law and the recommendations of the World Health Organization. These very high urban noise levels affect quality of life and acoustic comfort and may even pose multiple risks to people's health.
Keywords: road traffic, noise pollution, sound intensity, public health
Procedia PDF Downloads 265
7942 Adaptive Threshold Adjustment of Clear Channel Assessment in LAA Down Link
Authors: Yu Li, Dongyao Wang, Xiaobao Sun, Wei Ni
Abstract:
In long-term evolution (LTE), the carriers around 5 GHz are planned to be utilized without licenses to further enlarge system capacity. This feature is termed licensed assisted access (LAA). Channel sensing (clear channel assessment, CCA) is required before any transmission on these unlicensed carriers in order to ensure the harmonious coexistence of LAA with other radio access technologies in the unlicensed band. The CCA threshold is therefore critical: it decides whether the transmission right following CCA is granted in time and without collisions. An improper CCA threshold may cause buffer overflow at eNodeBs that are heavily loaded with traffic. To solve these problems, we propose an adaptive threshold adjustment method for CCA in the LAA downlink that considers both the load and the transmission opportunities. The trend of the LAA throughput as the threshold varies is obtained, and this guides the threshold adjustment. The coexistence of LAA and Wi-Fi is specifically tested. The results from system-level simulation confirm the merits of our design, especially in heavy traffic cases.
Keywords: LTE, LAA, CCA, threshold adjustment
Procedia PDF Downloads 139
7941 Machine Learning-Driven Prediction of Cardiovascular Diseases: A Supervised Approach
Authors: Thota Sai Prakash, B. Yaswanth, Jhade Bhuvaneswar, Marreddy Divakar Reddy, Shyam Ji Gupta
Abstract:
Across the globe there are many chronic diseases, and heart disease stands out as one of the most perilous. Sadly, many lives are lost to this condition, even though early intervention could prevent such tragedies. However, identifying heart disease in its initial stages is not easy. To address this challenge, we propose an automated system aimed at predicting the presence of heart disease using advanced techniques. By doing so, we hope to empower individuals with the knowledge needed to take proactive measures against this potentially fatal illness. Our approach involves meticulous data preprocessing and the development of predictive models utilizing classification algorithms such as Support Vector Machines (SVM), Decision Tree, and Random Forest. We assess the efficiency of every model based on metrics such as accuracy, ensuring that we select the most reliable option. Additionally, we conduct a thorough data analysis to reveal the importance of the different attributes. Among the models considered, Random Forest emerges as the standout performer, with an accuracy rate of 96.04% in our study.
Keywords: support vector machines, decision tree, random forest
Procedia PDF Downloads 377940 KCBA, A Method for Feature Extraction of Colonoscopy Images
Authors: Vahid Bayrami Rad
Abstract:
In recent years, the use of artificial intelligence techniques, tools, and methods in processing medical images and in health-related applications has been highlighted, and a great deal of research has been done in this regard. For example, colonoscopy and the diagnosis of colon lesions are cases in which the diagnostic process can be improved by using image processing and artificial intelligence algorithms, which greatly assist doctors. Due to the lack of accurate measurements and the variety of lesions in colonoscopy images, diagnosing the type of lesion is somewhat difficult even for expert doctors. Therefore, with suitable software and image processing, doctors can be helped to increase the accuracy of their observations and ultimately improve their diagnoses; automatic methods can likewise improve the process of diagnosing the type of disease. In this paper, a deep learning framework called KCBA is proposed to classify colonoscopy lesions; it is composed of several methods, including K-means clustering, a bag of features, and a deep autoencoder. Finally, the experimental results depict the proposed method's performance in classifying colonoscopy images in terms of the accuracy criterion.
Keywords: colorectal cancer, colonoscopy, region of interest, narrow band imaging, texture analysis, bag of feature
Procedia PDF Downloads 527939 Creating a Multilevel ESL Learning Community for Adults
Authors: Gloria Chen
Abstract:
When offering conventional level-appropriate ESL classes for adults is not feasible, a multilevel adult ESL class can be formed to benefit those who need to learn English for daily function. This paper examines the rationale, the process, the contents, and the outcomes of a multilevel ESL class for adults. The action research discusses a variety of assessments, lesson plans, and teaching strategies that facilitate lifelong language learning. In small towns where adult ESL learners number only a handful, advanced and inexperienced students often have to be placed in one class. Such a class might not be viewed as desirable, but with ongoing assessments, careful lesson plans, and purposeful strategies, a multilevel ESL class for adults can overcome the obstacles and help learners reach a higher level of English proficiency. This research explores some hands-on strategies, such as group rotation, cooperative learning, and modifying textbook contents for practical purposes, and evaluates their effectiveness. The data collected in this research include a Needs Assessment (beginning of the class term), a Mid-term Self-Assessment (5 months into the class term), an End-of-term Student Reflection (10 months into the class), and an End-of-term Assessment from the Instructor (10 months into the class). A descriptive analysis of the data explains the practice of this particular learning community and reveals the areas for improvement and enrichment. This research answers the following questions: (1) How do the assessments help both learners and instructors? (2) How do the learning strategies prepare students to become independent, lifelong English learners? (3) How do materials, grouping, and class schedule enhance the learning? The results of the research contribute to the field of language teaching and learning, not limited to English, by (a) examining strategies for conducting a multilevel adult class, (b) involving adult language learners with various backgrounds and learning styles in reflection and feedback, and (c) improving teaching and learning strategies based on the research methods and results. One unique feature of this research is how students can work together with the instructor to form a learning community, seeking and exploring the resources available to them, to become lifelong language learners.
Keywords: adult language learning, assessment, multilevel, teaching strategies
Procedia PDF Downloads 351
7938 Keyframe Extraction Using Face Quality Assessment and Convolution Neural Network
Authors: Rahma Abed, Sahbi Bahroun, Ezzeddine Zagrouba
Abstract:
Due to the huge amount of data in videos, extracting the relevant frames has become a necessity and an essential step prior to performing face recognition. In this context, we propose a method for extracting keyframes from videos based on face quality and deep learning for a face recognition task. The method has two steps. We start by generating face quality scores for each face image based on three face feature extractors: Gabor, LBP, and HOG. The second step consists of training a deep convolutional neural network in a supervised manner in order to select the frames that have the best face quality. The obtained results show the effectiveness of the proposed method compared to state-of-the-art methods.
Keywords: keyframe extraction, face quality assessment, face in video recognition, convolution neural network
Procedia PDF Downloads 229
7937 Multimedia Data Fusion for Event Detection in Twitter by Using Dempster-Shafer Evidence Theory
Authors: Samar M. Alqhtani, Suhuai Luo, Brian Regan
Abstract:
Data fusion technology can be the best way to extract useful information from multiple sources of data. It has been widely applied in various applications. This paper presents a multimedia data fusion approach for event detection in Twitter using Dempster-Shafer evidence theory. The methodology applies a mining algorithm to detect the event. Two types of data are fused. The first is the set of features extracted from text using the bag-of-words method, weighted by the term frequency-inverse document frequency (TF-IDF). The second is the set of visual features extracted by applying the scale-invariant feature transform (SIFT). The Dempster-Shafer theory of evidence is applied in order to fuse the information from these two sources. Our experiments indicate that, compared to approaches using an individual data source, the proposed data fusion approach increases the prediction accuracy for event detection. The experimental results show that the proposed method achieved a high accuracy of 0.97, compared with 0.93 using text only and 0.86 using images only.
Keywords: data fusion, Dempster-Shafer theory, data mining, event detection
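The core fusion step named above is Dempster's rule of combination. The sketch below combines two mass functions over a two-element frame ("event" vs. "no event"); the mass values are invented for illustration, where in the paper they would come from the TF-IDF-based and SIFT-based classifiers.

```python
FRAME = ("event", "no_event")

def dempster_combine(m1, m2):
    """Dempster's rule for two mass functions over subsets given as frozensets."""
    combined, conflict = {}, 0.0
    for a, wa in m1.items():
        for b, wb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + wa * wb
            else:
                conflict += wa * wb   # mass assigned to contradictory hypotheses
    # Normalize by the non-conflicting mass (Dempster's normalization).
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

m_text  = {frozenset({"event"}): 0.7, frozenset({"no_event"}): 0.2, frozenset(FRAME): 0.1}
m_image = {frozenset({"event"}): 0.6, frozenset({"no_event"}): 0.3, frozenset(FRAME): 0.1}
print(dempster_combine(m_text, m_image))
```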
Procedia PDF Downloads 409
7936 Selection of Optimal Reduced Feature Sets of Brain Signal Analysis Using Heuristically Optimized Deep Autoencoder
Authors: Souvik Phadikar, Nidul Sinha, Rajdeep Ghosh
Abstract:
In brainwave research using electroencephalogram (EEG) signals, finding the most relevant and effective feature set for identifying activities in the human brain remains a major challenge because of the random nature of the signals. The feature extraction method is a key issue in solving this problem. Finding features that are distinctive across different activities yet similar within the same activity is very difficult, especially as the number of activities grows. Classifier accuracy depends on the quality of this feature set; moreover, more features result in high computational complexity, while fewer features compromise performance. In this paper, a novel approach to selecting an optimal feature set using a heuristically optimized deep autoencoder is presented. Using various feature extraction methods, a vast number of features are extracted from the EEG signals and fed to the deep autoencoder network. The autoencoder encodes the input features into a small set of codes. To avoid the vanishing gradient problem and the need to normalize the dataset, a meta-heuristic search algorithm is used to minimize the mean square error (MSE) between the encoder input and the decoder output. To reduce the feature set to a smaller one, 4 hidden layers are used in the autoencoder network; hence it is called the Heuristically Optimized Deep Autoencoder (HO-DAE). In this method, no features are rejected; all the features are combined into the responses of the hidden layer. The results reveal that higher accuracy can be achieved using the optimal reduced features. The proposed HO-DAE is also compared with a regular autoencoder to test the performance of both. The performance of the proposed method is validated and compared with two other methods recently reported in the literature, which reveals that the proposed method is far better than the other two in terms of classification accuracy.
Keywords: autoencoder, brainwave signal analysis, electroencephalogram, feature extraction, feature selection, optimization
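A minimal sketch of the core idea, a deep autoencoder whose bottleneck code serves as the reduced feature set, is shown below; the layer sizes, input dimension, and random placeholder data are assumptions, and a standard gradient optimizer stands in for the meta-heuristic search described above.

```python
import numpy as np
import tensorflow as tf

n_features, n_code = 512, 32                       # assumed dimensions
inputs = tf.keras.Input(shape=(n_features,))
x = tf.keras.layers.Dense(256, activation="relu")(inputs)
x = tf.keras.layers.Dense(128, activation="relu")(x)
code = tf.keras.layers.Dense(n_code, activation="relu", name="code")(x)
x = tf.keras.layers.Dense(128, activation="relu")(code)
x = tf.keras.layers.Dense(256, activation="relu")(x)
outputs = tf.keras.layers.Dense(n_features)(x)

autoencoder = tf.keras.Model(inputs, outputs)
autoencoder.compile(optimizer="adam", loss="mse")   # MSE between input and reconstruction

X = np.random.rand(1000, n_features).astype("float32")   # placeholder EEG feature matrix
autoencoder.fit(X, X, epochs=5, batch_size=64, verbose=0)

encoder = tf.keras.Model(inputs, code)
reduced = encoder.predict(X)                        # reduced feature set for a classifier
print(reduced.shape)                                # (1000, 32)
```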
Procedia PDF Downloads 112
7935 Reverse Supply Chain Analysis of Lithium-Ion Batteries Considering Economic and Environmental Aspects
Authors: Aravind G., Arshinder Kaur, Pushpavanam S.
Abstract:
There is a strong global emphasis on shifting to electric vehicles (EVs) to reduce the impact on global warming, following the Paris climate accord. Lithium-ion batteries (LIBs) are predominantly used in EVs, and these can be a significant threat to the environment if not disposed of safely. Lithium is also a valuable resource that is not widely available. Several research groups are working on developing an efficient recycling process for LIBs. Two routes, pyrometallurgical and hydrometallurgical processes, have been proposed for recycling LIBs. In this paper, we focus on life cycle assessment (LCA) as a tool to quantify the environmental impact of these recycling processes. We have defined the boundary of the LCA to include only the recycling phase of the end-of-life (EoL) stage of the battery life cycle. The inventory analysis assumes ideal conditions for the hydrometallurgical process and for a combined hydrometallurgical and pyrometallurgical process. The CML-IA method is used to quantify the impact assessment across eleven indicators. Our results show that the cathode, anode, and foil contribute significantly to the impact. The environmental impacts of the hydrometallurgical and combined recycling processes are similar across all indicators. Further, the results of the LCA are used to develop a multi-objective optimization model for the design of a lithium-ion battery recycling network, in which greenhouse gas emissions and cost are the two objectives to be minimized.
Keywords: life cycle assessment, lithium-ion battery recycling, multi-objective optimization, network design, reverse supply chain
Procedia PDF Downloads 154
7934 A Local Invariant Generalized Hough Transform Method for Integrated Circuit Visual Positioning
Authors: Wei Feilong
Abstract:
In this study, a local invariant generalized Hough transform (LI-GHT) method is proposed for integrated circuit (IC) visual positioning. The original generalized Hough transform (GHT) is robust to external noise; however, it is not suitable for the visual positioning of IC chips because the four-dimensional (4D) parameter space leads to substantial storage requirements and high computational complexity. The proposed LI-GHT method reduces the dimensionality of the parameter space to 2D thanks to the rotational invariance of local invariant geometric features, and it can accurately estimate the position and rotation angle of IC chips in real time under the influence of noise and blur. The experimental results show that the proposed LI-GHT can estimate the position and rotation angle of IC chips with high accuracy and speed. The proposed LI-GHT algorithm was implemented in the IC visual positioning system of radio frequency identification (RFID) packaging equipment.
Keywords: integrated circuit visual positioning, generalized Hough transform, local invariant generalized Hough transform, IC packaging equipment
Procedia PDF Downloads 263
7933 The Inattentional Blindness Paradigm: A Breaking Wave for Attentional Biases in Test Anxiety
Authors: Kritika Kulhari, Aparna Sahu
Abstract:
Test anxiety results from concerns about failure in examinations or evaluative situations. Attentional biases are known to accentuate the symptomatic expression of test anxiety. In recent times, the inattentional blindness (IB) paradigm has shown promise as an attention bias modification treatment (ABMT) for anxiety by overcoming the practice and expectancy effects that preexisting paradigms fail to counter. The IB paradigm assesses an individual's failure to attend to a stimulus that appears suddenly while the individual is engaged in a perceptual discrimination task. The present study incorporated an IB task with three critical items (book, face, and triangle) appearing randomly in the perceptual discrimination task. Attentional biases were assessed as the detection and identification of the critical item. The sample (N = 50) consisted of low test anxiety (LTA) and high test anxiety (HTA) groups based on Reactions to Tests scale scores. Test threat was manipulated with pre- and post-test assessment of test anxiety using the State Test Anxiety Inventory. A mixed factorial design with gender, test anxiety, presence or absence of test threat, and critical item was used to assess their effects on attentional biases. Results showed only a significant main effect of test anxiety on detection, with higher detection accuracy of the critical item for the LTA group. The study presents promising results in the realm of ABMT for test anxiety.
Keywords: attentional bias, attentional bias modification treatment, inattentional blindness, test anxiety
Procedia PDF Downloads 224
7932 Household Food Wastage Assessment: A Case Study in South Africa
Authors: Fhumulani R. Ramukhwatho, Roelien du Plessis, Suzan H. H. Oelofse
Abstract:
There is a growing number of scientific papers, journals, and reports on household food waste, because food waste has become a significant global issue that is costing billions of Rands in resources. Reducing food waste in a sustainable manner requires an understanding of how food waste is generated. This paper assesses household food wastage in the City of Tshwane Metropolitan Municipality (CTMM). A total of 210 participants were interviewed face-to-face using a structured questionnaire, and the food actually wasted by households was quantified using a kitchen weighing scale. Fifty-nine percent of respondents agreed that they wasted food, while 41% thought they did not waste food at all. Households wasted an average of 6 kg of food per household per week. The study concluded that households buy and prepare more food than they consume, and the excess ends up wasted.
Keywords: assessment, developing country, food waste, household
Procedia PDF Downloads 317
7931 Digital Image Correlation: Metrological Characterization in Mechanical Analysis
Authors: D. Signore, M. Ferraiuolo, P. Caramuta, O. Petrella, C. Toscano
Abstract:
Digital Image Correlation (DIC) is a newly developed optical technique that is spreading across all engineering sectors because it allows the non-destructive estimation of the entire surface deformation without any contact with the component under analysis. These characteristics make DIC very appealing in all cases where the global deformation state must be known without using strain gauges, which are the most widely used measuring devices. DIC is applicable to any material subjected to distortion caused by either thermal or mechanical load, allowing high-definition maps of displacements and deformations to be obtained. That is why, in the civil and transportation industries, DIC is very useful for studying the behavior of metallic as well as composite materials. DIC is also used in the medical field to characterize the local strain field of the surface of vascular tissues subjected to uniaxial tensile loading. DIC can be carried out in two-dimensional mode (2D DIC) if a single camera is used or in three-dimensional mode (3D DIC) if two cameras are involved. Each point of the test surface framed by the cameras can be associated with a specific pixel of the image, and the coordinates of each point are calculated from the relative distance between the two cameras together with their orientation. In both arrangements, when a component is subjected to a load, several images corresponding to different deformation states can be acquired through the cameras. Dedicated software analyzes the images via the mutual correlation between the reference image (obtained without any applied load) and those acquired during deformation, giving the relative displacements. In this paper, a metrological characterization of digital image correlation is performed on aluminum and composite targets under both static and dynamic loading conditions by comparing DIC and strain gauge measurements. In the static test, interesting results have been obtained thanks to excellent agreement between the two measuring techniques. In addition, the deformation detected by DIC is consistent with the result of an FEM simulation. In the dynamic test, DIC was able to follow the periodic deformation of the specimen with good accuracy, giving results consistent with those of the FEM simulation. In both situations, the DIC measurement accuracy was seen to depend on several parameters, such as the optical focusing, the parameters chosen to perform the mutual correlation between the images, and the reference points on the image to be analyzed. In the future, the influence of these parameters will be studied, and a method to increase the accuracy of the measurements will be developed in accordance with the requirements of industry, especially the aerospace sector.
Keywords: accuracy, deformation, image correlation, mechanical analysis
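The mutual correlation step that DIC software performs can be illustrated with a zero-normalized cross-correlation (ZNCC) search: a reference subset is matched against the deformed image at integer-pixel offsets. The synthetic images and rigid shift below are placeholders for real speckle-pattern photographs, and real DIC adds subpixel interpolation and shape functions that are omitted here.

```python
import numpy as np

def zncc(a, b):
    """Zero-normalized cross-correlation between two equally sized subsets."""
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

rng = np.random.default_rng(0)
reference = rng.random((200, 200))
deformed = np.roll(reference, shift=(3, -2), axis=(0, 1))   # synthetic rigid shift

y0, x0, half = 100, 100, 15
subset = reference[y0 - half:y0 + half, x0 - half:x0 + half]

best = max(
    ((dy, dx, zncc(subset, deformed[y0 + dy - half:y0 + dy + half,
                                    x0 + dx - half:x0 + dx + half]))
     for dy in range(-5, 6) for dx in range(-5, 6)),
    key=lambda t: t[2])
print("Estimated displacement (dy, dx):", best[:2])   # expected (3, -2)
```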
Procedia PDF Downloads 310
7930 Improving Chest X-Ray Disease Detection with Enhanced Data Augmentation Using Novel Approach of Diverse Conditional Wasserstein Generative Adversarial Networks
Authors: Malik Muhammad Arslan, Muneeb Ullah, Dai Shihan, Daniyal Haider, Xiaodong Yang
Abstract:
Chest X-rays are instrumental in the detection and monitoring of a wide array of diseases, including viral infections such as COVID-19, tuberculosis, pneumonia, lung cancer, and various cardiac and pulmonary conditions. To enhance the accuracy of diagnosis, artificial intelligence (AI) algorithms, particularly deep learning models such as Convolutional Neural Networks (CNNs), are employed. However, these deep learning models demand a substantial and varied dataset to attain optimal precision. Generative Adversarial Networks (GANs) can be employed to create new data, thereby supplementing the existing dataset and enhancing the accuracy of deep learning models. Nevertheless, GANs have limitations, such as issues related to stability, convergence, and the ability to distinguish between authentic and fabricated data. To overcome these challenges and advance the detection and classification of normal and abnormal chest X-ray (CXR) images, this study introduces a distinctive technique known as DCWGAN (Diverse Conditional Wasserstein GAN) for generating synthetic CXR images. The study evaluates the effectiveness of the DCWGAN technique using the ResNet50 model and compares its results with those obtained using the traditional GAN approach. The findings reveal that the ResNet50 model trained on the DCWGAN-generated dataset outperformed the model trained on the classic GAN-generated dataset. Specifically, the ResNet50 model utilizing DCWGAN synthetic images achieved impressive performance metrics, with an accuracy of 0.961, precision of 0.955, recall of 0.970, and F1-measure of 0.963. These results indicate promising potential for the early detection of diseases in CXR images using this approach.
Keywords: CNN, classification, deep learning, GAN, Resnet50
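The sketch below shows the Wasserstein objective with gradient penalty that a conditional WGAN such as the one described above typically builds on; the tiny critic network, image size, and random tensors are placeholders, and the paper's actual architecture, diversity mechanism, and conditioning scheme are not reproduced.

```python
import tensorflow as tf

def build_critic(img_shape=(64, 64, 1), n_classes=2):
    img = tf.keras.Input(shape=img_shape)
    lbl = tf.keras.Input(shape=(n_classes,))
    x = tf.keras.layers.Conv2D(16, 4, strides=2, padding="same", activation="relu")(img)
    x = tf.keras.layers.Flatten()(x)
    x = tf.keras.layers.Concatenate()([x, lbl])   # condition the critic on the class label
    out = tf.keras.layers.Dense(1)(x)             # unbounded Wasserstein critic score
    return tf.keras.Model([img, lbl], out)

critic = build_critic()
real = tf.random.uniform([8, 64, 64, 1])          # stand-ins for real / generated CXR batches
fake = tf.random.uniform([8, 64, 64, 1])
labels = tf.one_hot(tf.random.uniform([8], maxval=2, dtype=tf.int32), 2)

# Gradient penalty on random interpolations between real and fake images.
eps = tf.random.uniform([8, 1, 1, 1])
interp = eps * real + (1.0 - eps) * fake
with tf.GradientTape() as tape:
    tape.watch(interp)
    d_interp = critic([interp, labels])
grads = tape.gradient(d_interp, interp)
grad_norm = tf.sqrt(tf.reduce_sum(tf.square(grads), axis=[1, 2, 3]) + 1e-12)
gp = tf.reduce_mean(tf.square(grad_norm - 1.0))

# Critic loss: mean fake score minus mean real score, plus the penalty term.
critic_loss = (tf.reduce_mean(critic([fake, labels]))
               - tf.reduce_mean(critic([real, labels])) + 10.0 * gp)
print(float(critic_loss))
```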
Procedia PDF Downloads 85
7929 Factors in a Sustainability Assessment of New Types of Closed Cavity Facades
Authors: Zoran Veršić, Josip Galić, Marin Binički, Lucija Stepinac
Abstract:
With the current increase in CO₂ emissions and global warming, the sustainability of both existing and new solutions must be assessed on a wide scale. As the implementation of closed cavity facades (CCF) is on the rise, a variety of factors must be included in the analysis of new types of CCF. This paper aims to cover the relevant factors included in the sustainability assessment of new types of CCF. Several mathematical models are used to describe the physical behavior of CCF. Depending on the type of CCF, they cover the main factors affecting the durability of the façade: the thermal behavior of the various elements in the façade, the stress and deflection of the glass panels, the pressure inside the cavity, the exchange rate, and the moisture buildup in the cavity. CCF itself represents a complex system in which all the mentioned factors must be considered together. Still, the façade is only the envelope of a more complex system, the building. The choice of façade dictates heat loss and heat gain, the thermal comfort of the interior space, natural lighting, and ventilation. The annual energy consumption for heating, cooling, and lighting, together with maintenance costs, will reveal the operational advantages or disadvantages of the chosen façade system in both economic and environmental terms. Still, the operational viewpoint alone is not all-inclusive. As building codes demand ever higher energy efficiency and a transition to renewable energy sources, the ratio of embodied to lifetime operational energy footprint of buildings is changing. With the drop in operational CO₂ emissions, embodied energy emissions represent a larger and larger share of the lifecycle emissions of the building. Taking all of this into account, the sustainability assessment of a façade, as well as of other major building elements, should include all the mentioned factors over the lifecycle of the element. The challenge of such an approach is the timescale. Depending on the climatic conditions at the building site, the expected lifetime of a CCF can exceed 25 years. Over such a time span, some factors can be estimated more precisely than others; those depending on socio-economic conditions are harder to predict than natural ones such as climatic loads. This work recognizes and summarizes the relevant factors needed for the assessment of new types of CCF, considering the entire lifetime of a façade element as well as economic and environmental aspects.
Keywords: assessment, closed cavity façade, life cycle, sustainability
Procedia PDF Downloads 191
7928 Feasibility Assessment of High-Temperature Superconducting AC Cable Lines Implementation in Megacities
Authors: Andrey Kashcheev, Victor Sytnikov, Mikhail Dubinin, Elena Filipeva, Dmitriy Sorokin
Abstract:
Various technical solutions aimed at improving the reliability of the power supply to consumers of a 110 kV substation are considered. For each technical solution, the results of the calculation and analysis of the electrical modes and short-circuit currents in the electrical network are presented. The electric energy consumed as losses within the boundaries of the substation reconstruction was estimated in accordance with the methodology for determining the standards of technological losses of electricity during its transmission through electric networks. The technical and economic feasibility of using high-temperature superconducting cable lines (HTS CL) was assessed in comparison with a complex reconstruction of the 110 kV substation. It is shown that the use of high-temperature superconducting AC cable lines is a possible alternative to the traditional technical solutions used in the reconstruction of substations.
Keywords: superconductivity, cable lines, superconducting cable, AC cable, feasibility
Procedia PDF Downloads 95
7927 Partially Knowing of Least Support Orthogonal Matching Pursuit (PKLS-OMP) for Recovering Signal
Authors: Israa Sh. Tawfic, Sema Koc Kayhan
Abstract:
Given a large sparse signal, the goal is to reconstruct the signal precisely and accurately from as few measurements as possible. Although this seems possible in theory, the difficulty lies in building an algorithm that reconstructs the signal accurately and efficiently. This paper proposes a new, proven method to reconstruct sparse signals that merges the Least Support Orthogonal Matching Pursuit (LS-OMP) method with the theory of Partial Knowing Support (PSK), yielding a new method called Partially Knowing of Least Support Orthogonal Matching Pursuit (PKLS-OMP). The method relies on a greedy algorithm to compute the support, which depends on the number of iterations; to make it faster, PKLS-OMP adds the idea of partial knowing support to the algorithm. The method recovers the original signal efficiently, simply, and accurately if the sampling matrix satisfies the Restricted Isometry Property (RIP). Simulation results also show that it outperforms many algorithms, especially for compressible signals.
Keywords: compressed sensing, least support orthogonal matching pursuit, partial knowing support, restricted isometry property, signal reconstruction
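For reference, the sketch below implements plain Orthogonal Matching Pursuit (OMP), the greedy baseline that the proposed PKLS-OMP extends; a partially known support could be injected simply by pre-loading the `support` list. The problem sizes and random sensing matrix are illustrative assumptions.

```python
import numpy as np

def omp(A, y, k, support=None):
    """Recover a k-sparse x from y = A @ x by greedy support selection."""
    residual, support = y.copy(), list(support or [])
    for _ in range(k):
        # Pick the column most correlated with the current residual.
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        # Least-squares fit on the selected columns, then update the residual.
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x_hat = np.zeros(A.shape[1])
    x_hat[support] = coef
    return x_hat

rng = np.random.default_rng(0)
n, m, k = 256, 80, 5                                  # signal length, measurements, sparsity
A = rng.standard_normal((m, n)) / np.sqrt(m)
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
print("Reconstruction error:", np.linalg.norm(x - omp(A, A @ x, k)))
```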
Procedia PDF Downloads 239
7926 An Analysis of Anxious/Depressed Behaviors of Chinese Adolescents
Authors: Zhidong Zhang, Zhi-Chao Zhang, Georgiana Duarte
Abstract:
This study explored early adolescents' anxious and depressed syndromes in Northeast China. Specifically, the study examined anxious and depressed behaviors and their relationship to educational environments. The purpose is to examine how elements of the educational environment and the early adolescents' behaviors, as independent variables, influence and possibly predict the adolescents' anxious/depressed problems. The Achenbach System of Empirically Based Assessment (ASEBA) was the instrument used for data collection. A stratified sampling method was utilized to collect data from 2532 participants in seven schools. The results indicated that several background variables influenced the anxious/depressed problems; specifically, age, grade, sports activities, and hobbies were related to the anxious/depressed variable.
Keywords: anxious/depressed problems, CBCL, empirically-based assessment, internalizing problems
Procedia PDF Downloads 322
7925 Flexible Capacitive Sensors Based on Paper Sheets
Authors: Mojtaba Farzaneh, Majid Baghaei Nejad
Abstract:
This article proposes new flexible capacitive tactile sensors based on paper sheets. The method combines the parameters of the sensor's material and dielectric and forms a new model of flexible capacitive sensor. The article presents a practical explanation of the method's application and advantages. With this new method, it is possible to make a more flexible and accurate sensor than the current models. To assess its performance, a common capacitive sensor is simulated, and the proposed model is evaluated against one of the existing models. The results indicate that the proposed model can enhance the speed and accuracy of the tactile sensor and has less error than the current models. Based on the results of this study, it can be claimed that, in comparison with current models, the proposed model provides more flexibility and more accurate output parameters when the sensor is touched, especially in abnormal situations and on uneven surfaces, and increases accuracy and practicality.
Keywords: capacitive sensor, paper sheets, flexible, tactile, uneven
Procedia PDF Downloads 351
7924 Hate Speech Detection in Tunisian Dialect
Authors: Helmi Baazaoui, Mounir Zrigui
Abstract:
This study addresses the challenge of hate speech detection in Tunisian Arabic text, a critical issue for online safety and moderation. Leveraging the strengths of the AraBERT model, we fine-tuned and evaluated its performance against the Bi-LSTM model across four distinct datasets: T-HSAB, TNHS, TUNIZI-Dataset, and a newly compiled dataset with diverse labels such as Offensive Language, Racism, and Religious Intolerance. Our experimental results demonstrate that AraBERT significantly outperforms Bi-LSTM in terms of Recall, Precision, F1-Score, and Accuracy across all datasets. The findings underline the robustness of AraBERT in capturing the nuanced features of Tunisian Arabic and its superior capability in classification tasks. This research not only advances the technology for hate speech detection but also provides practical implications for social media moderation and policy-making in Tunisia. Future work will focus on expanding the datasets and exploring more sophisticated architectures to further enhance detection accuracy, thus promoting safer online interactions.
Keywords: hate speech detection, Tunisian Arabic, AraBERT, Bi-LSTM, Gemini annotation tool, social media moderation
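A hedged sketch of fine-tuning an AraBERT checkpoint for binary hate-speech classification with the Hugging Face Trainer API is shown below; the checkpoint name is one publicly available AraBERT release, while the CSV file, column names, label scheme, and hyperparameters are placeholders rather than the paper's settings.

```python
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

checkpoint = "aubmindlab/bert-base-arabertv2"           # a public AraBERT checkpoint
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=2)

# Hypothetical CSV with "text" and "label" columns (0 = not hateful, 1 = hateful).
ds = load_dataset("csv", data_files="tunisian_hate_speech.csv")["train"]
ds = ds.map(lambda b: tokenizer(b["text"], truncation=True, padding="max_length",
                                max_length=128), batched=True)
ds = ds.train_test_split(test_size=0.2, seed=0)

args = TrainingArguments(output_dir="arabert-hate-speech", num_train_epochs=3,
                         per_device_train_batch_size=16, learning_rate=2e-5)
trainer = Trainer(model=model, args=args,
                  train_dataset=ds["train"], eval_dataset=ds["test"])
trainer.train()
print(trainer.evaluate())
```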
Procedia PDF Downloads 6
7923 Analytical Similarity Assessment of Bevacizumab Biosimilar Candidate MB02 Using Multiple State-of-the-Art Assays
Authors: Marie-Elise Beydon, Daniel Sacristan, Isabel Ruppen
Abstract:
MB02 (Alymsys®) is a candidate biosimilar to bevacizumab, which was developed against the reference product (RP) Avastin® sourced from both the European Union (EU) and the United States (US). MB02 has been extensively characterized against Avastin® at the physicochemical and biological levels using sensitive, orthogonal, state-of-the-art analytical methods. MB02 has been demonstrated to be similar to the RP with regard to its primary and higher-order structure and its post- and co-translational profiles, such as glycosylation, charge variants, and size variants. A specific focus has been placed on the characterization of Fab-related activities, such as binding to VEGF A 165, which directly reflect the bevacizumab mechanism of action. Fc-related functionality was also investigated, including binding to FcRn, which is indicative of an antibody's half-life. The data generated during the analytical similarity assessment demonstrate the high analytical similarity of MB02 to its RP.
Keywords: analytical similarity, bevacizumab, biosimilar, MB02
Procedia PDF Downloads 286