Search results for: chemical classification
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6561


5841 Towards Real-Time Classification of Finger Movement Direction Using Encephalography Independent Components

Authors: Mohamed Mounir Tellache, Hiroyuki Kambara, Yasuharu Koike, Makoto Miyakoshi, Natsue Yoshimura

Abstract:

This study explores the practicality of using electroencephalographic (EEG) independent components to predict eight-direction finger movements in pseudo-real-time. Six healthy participants with individual-head MRI images performed finger movements in eight directions with two different arm configurations. The analysis was performed in two stages. The first stage consisted of using independent component analysis (ICA) to separate the signals representing brain activity from non-brain activity signals and to obtain the unmixing matrix. The resulting independent components (ICs) were checked, and those reflecting brain-activity were selected. Finally, the time series of the selected ICs were used to predict eight finger-movement directions using Sparse Logistic Regression (SLR). The second stage consisted of using the previously obtained unmixing matrix, the selected ICs, and the model obtained by applying SLR to classify a different EEG dataset. This method was applied to two different settings, namely the single-participant level and the group-level. For the single-participant level, the EEG dataset used in the first stage and the EEG dataset used in the second stage originated from the same participant. For the group-level, the EEG datasets used in the first stage were constructed by temporally concatenating each combination without repetition of the EEG datasets of five participants out of six, whereas the EEG dataset used in the second stage originated from the remaining participants. The average test classification results across datasets (mean ± S.D.) were 38.62 ± 8.36% for the single-participant, which was significantly higher than the chance level (12.50 ± 0.01%), and 27.26 ± 4.39% for the group-level which was also significantly higher than the chance level (12.49% ± 0.01%). 
The classification accuracy within [–45°, 45°] of the true direction was 70.03 ± 8.14% for the single-participant level and 62.63 ± 6.07% for the group level, which may be promising for some real-life applications. Clustering and contribution analyses further revealed the brain regions involved in finger movement and the temporal aspects of their contribution to the classification. These results show the possibility of combining the ICA-based method with other methods to build a real-time system to control prostheses.
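The two-stage pipeline described above (learn an unmixing matrix with ICA, then reuse it to classify new data with sparse logistic regression) can be sketched as follows. This is a minimal illustration on synthetic signals, not the authors' implementation: the source/channel counts, the label rule, and the use of scikit-learn's L1-penalized LogisticRegression as a stand-in for SLR are all assumptions.

```python
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical sizes: 8 latent sources mixed into 16 "EEG" channels.
n_train, n_test, n_sources, n_channels = 400, 200, 8, 16
A = rng.normal(size=(n_sources, n_channels))  # unknown mixing matrix

def make_trials(n):
    S = rng.laplace(size=(n, n_sources))      # super-Gaussian sources
    # Toy 8-direction label derived from the signs of three sources.
    y = 4 * (S[:, 0] > 0) + 2 * (S[:, 1] > 0) + (S[:, 2] > 0)
    return S @ A, y

X_train, y_train = make_trials(n_train)
X_test, y_test = make_trials(n_test)

# Stage 1: learn the unmixing matrix, then train an L1-penalized
# (sparse) logistic regression on the independent components.
ica = FastICA(n_components=n_sources, random_state=0, max_iter=2000)
ics_train = ica.fit_transform(X_train)
clf = LogisticRegression(penalty="l1", solver="saga", max_iter=5000)
clf.fit(ics_train, y_train)

# Stage 2: reuse the frozen unmixing matrix on new data and classify.
acc = clf.score(ica.transform(X_test), y_test)
chance = 1 / 8  # eight movement directions
```

On this noiseless toy problem the held-out accuracy lands far above the 12.5% chance level, mirroring the comparison the abstract makes.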

Keywords: brain-computer interface, electroencephalography, finger motion decoding, independent component analysis, pseudo real-time motion decoding

Procedia PDF Downloads 137
5840 An Application to Predict the Best Study Path for Information Technology Students in Learning Institutes

Authors: L. S. Chathurika

Abstract:

Early prediction of student performance is an important factor in achieving academic excellence. Whatever the study stream in secondary education, students lay the foundation for higher studies during the first year of their degree or diploma program in Sri Lanka. The information technology (IT) field has introduced certain improvements in the education domain by offering specialization areas in which students can demonstrate their talents and skills. These specializations can be software engineering, network administration, database administration, multimedia design, etc. After completing the first year, students attempt to select the best path by considering numerous factors. The purpose of this experiment is to predict the best study path using machine learning algorithms. Five classification algorithms: decision tree, support vector machine, artificial neural network, Naïve Bayes, and logistic regression are selected and tested. The support vector machine obtained the highest accuracy, 82.4%. The most influential features are then identified to recommend the best study path.
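A comparison of the five named classifiers can be sketched with scikit-learn's cross-validation utilities. The synthetic dataset and the specific hyperparameters below are illustrative assumptions; the study's actual features (first-year marks, etc.) and tuning are not reproduced here.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Toy stand-in: 4 specialization "paths" predicted from 10 features.
X, y = make_classification(n_samples=300, n_features=10,
                           n_informative=6, n_classes=4, random_state=0)

models = {
    "decision tree": DecisionTreeClassifier(random_state=0),
    "SVM": make_pipeline(StandardScaler(), SVC()),
    "ANN": make_pipeline(StandardScaler(),
                         MLPClassifier(max_iter=1500, random_state=0)),
    "naive Bayes": GaussianNB(),
    "logistic regression": make_pipeline(StandardScaler(),
                                         LogisticRegression(max_iter=1000)),
}

# 5-fold cross-validated accuracy for each candidate algorithm.
scores = {name: cross_val_score(m, X, y, cv=5).mean()
          for name, m in models.items()}
```

Ranking the resulting `scores` dictionary reproduces the kind of head-to-head comparison from which the study selected the SVM.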

Keywords: algorithm, classification, evaluation, features, testing, training

Procedia PDF Downloads 118
5839 Analysis, Evaluation and Optimization of Food Management: Minimization of Food Losses and Food Wastage along the Food Value Chain

Authors: G. Hafner

Abstract:

A method developed at the University of Stuttgart will be presented: ‘Analysis, Evaluation and Optimization of Food Management’. A major focus is the quantification of food losses and food waste, as well as their classification and evaluation with regard to system optimization through waste prevention. For the quantification and accounting of food, food losses and food waste along the food chain, a clear definition of core terms is required at the outset. This includes their methodological classification and demarcation within sectors of the food value chain. The food chain is divided into agriculture, industry and crafts, trade and consumption (at home and out of home). To align these core terms, the authors have cooperated with relevant stakeholders in Germany toward the goal of holistic and agreed definitions for the whole food chain. This includes the modeling of subsystems within the food value chain, the definition of terms, the differentiation between food losses and food wastage, and the methodological approaches. ‘Food Losses’ and ‘Food Wastes’ are assigned to individual sectors of the food chain, including a description of the respective methods. The method for the analysis, evaluation and optimization of food management systems consists of the following parts: Part I: Terms and Definitions. Part II: System Modeling. Part III: Procedure for Data Collection and Accounting. Part IV: Methodological Approaches for Classification and Evaluation of Results. Part V: Evaluation Parameters and Benchmarks. Part VI: Measures for Optimization. Part VII: Monitoring of Success. The method will be demonstrated using the example of an investigation of food losses and food wastage in the Federal State of Bavaria, including an extrapolation of the respective results to quantify food wastage in Germany.

Keywords: food losses, food waste, resource management, waste management, system analysis, waste minimization, resource efficiency

Procedia PDF Downloads 402
5838 Issues in Translating Hadith Terminologies into English: A Critical Approach

Authors: Mohammed Riyas Pp

Abstract:

This study investigates major issues in translating Arabic Hadith terminologies into English, focusing on choosing the most appropriate translation for each by reviewing major Hadith works in English. The study is confined to twenty terminologies concerning the classification of Hadith based on authority, strength, number of transmitters and connections in the Isnad. Almost all available translations were collected and analyzed to find the most appropriate translation based on linguistic and translational values. In the researcher's view, many translations reflect an imprecise understanding of either Hadith terminologies or the English language, and varying methodologies have produced varying translations. This study provides a classification of translational and conceptual issues. Translational issues relate to the translatability of these terminologies and their equivalence. Conceptual issues comprise a list of misunderstandings caused by incorrect translations of terminologies. The study concludes with a suggestion for unified translations of these terminologies, based on a convention of Muslim scholars with a sound understanding of both Hadith terminologies and the English language.

Keywords: english language, hadith terminologies, equivalence in translation, problems in translation

Procedia PDF Downloads 184
5837 Bioremediation as a Treatment of Aromatic Hydrocarbons in Wastewater

Authors: Hen Friman, Alex Schechter, Yeshayahu Nitzan, Rivka Cahan

Abstract:

The treatment of aromatic hydrocarbons in wastewater resulting from oil spills and chemical manufacturing is becoming a key concern in many modern countries. Benzene, ethylbenzene, toluene and xylene (BETX) contaminate groundwater as well as soil. These compounds have acute effects on human health and are known to be carcinogenic. Conventional removal of these toxic materials involves separation and burning of the wastes; however, such chemical treatment is very costly and energy consuming. Bioremediation methods for the removal of toxic organic compounds constitute an attractive alternative to conventional chemical or physical techniques. Bioremediation methods use microorganisms to reduce the concentration and toxicity of various chemical pollutants. Although toluene is biodegradable both aerobically and anaerobically, it can inhibit microbial growth at elevated concentrations, even in species that can use it as a substrate. In this research, a culture of Pseudomonas putida was grown in a batch bioreactor (BBR) with toluene (100 mg/l) as the sole carbon source under a constant voltage of 125 mV, 250 mV or 500 mV. The culture grown in the BBR reached 0.8 OD₆₆₀, while the control culture grown without external voltage reached only 0.6 OD₆₆₀. The residual toluene concentration after 147 h in the BBR operated under an external voltage of 125 mV was 22% on average, while in the control BBR it was 81% on average.

Keywords: bioremediation, aromatic hydrocarbons, BETX, toluene, pseudomonas putida

Procedia PDF Downloads 310
5836 Chemical Properties of Yushania alpina and Bamusa oldhamii Bamboo Species

Authors: Getu Dessalegn Asfaw, Yalew Dessalegn Asfaw

Abstract:

This research examines the chemical composition of bamboo species in Ethiopia as affected by age and culm height; this composition has not been investigated so far. Ranked from highest to lowest cellulose and hemicellulose contents, the sources are Injibara (Y. alpina), Mekaneselam (Y. alpina), and Kombolcha (B. oldhamii), whereas for lignin, extractives, and ash contents the order is Kombolcha, Mekaneselam, and Injibara, respectively. The highest and lowest cellulose, hemicellulose and lignin contents occur at the ages of 2 and 1 years, respectively, whereas extractive and ash contents decreased as the culm matured. The cellulose, hemicellulose, lignin, and ash contents of the culm increase from bottom to top along the height; however, extractive contents decrease from the bottom to the top position. The cellulose content of Injibara, Kombolcha, and Mekaneselam bamboo was 51±1.7–53±1.8%, 45±1.6–48±1.5%, and 48±1.8–51±1.6%; hemicellulose content was 20±1.2–23±1.1%, 17±1.0–19±0.9%, and 18±1.0–20±1.0%; lignin content was 19±1.0–21±1.1%, 27±1.2–29±1.1%, and 21±1.1–24±1.1%; extractive content was 3.9±0.2–4.5±0.2%, 6.6±0.3–7.8±0.4%, and 4.7±0.2–5.2±0.1%; and ash content was 1.6±0.1–2.1±0.1%, 2.8±0.1–3.5±0.2%, and 1.9±0.1–2.5±0.1% at the ages of 1–3 years, respectively. These results demonstrate that bamboo species in Ethiopia can be a source of feedstock for lignocellulosic ethanol and bamboo composite production, since they have high cellulose content.

Keywords: age, bamboo species, culm height, chemical composition

Procedia PDF Downloads 99
5835 Diversity in Finance Literature Revealed through the Lens of Machine Learning: A Topic Modeling Approach on Academic Papers

Authors: Oumaima Lahmar

Abstract:

This paper aims to define a structured topography for finance researchers seeking to navigate the body of knowledge in their extrapolation of finance phenomena. To make sense of the body of knowledge in finance, a probabilistic topic modeling approach is applied to 6000 abstracts of academic articles published in three top finance journals between 1976 and 2020. This approach combines machine learning techniques and natural language processing to statistically identify the conjunctions between research articles and their shared topics, each described by relevant keywords. The topic modeling analysis reveals 35 coherent topics that depict the finance literature well and provide a comprehensive structure for the ongoing research themes. Comparing the extracted topics to the Journal of Economic Literature (JEL) classification system, a significant similarity was highlighted between the characterizing keywords. On the other hand, we identify other topics that do not match the JEL classification despite being relevant in the finance literature.

Keywords: finance literature, textual analysis, topic modeling, perplexity

Procedia PDF Downloads 166
5834 A Framework for Auditing Multilevel Models Using Explainability Methods

Authors: Debarati Bhaumik, Diptish Dey

Abstract:

Multilevel models, increasingly deployed in industries such as insurance, food production, and entertainment within functions such as marketing and supply chain management, need to be transparent and ethical. Applications usually result in binary classification within groups or hierarchies based on a set of input features. Using open-source datasets, we demonstrate that popular explainability methods, such as SHAP and LIME, consistently underperform in accuracy when interpreting these models. They fail to predict the order of feature importance, the magnitudes, and occasionally even the nature of the feature contribution (negative versus positive contribution to the outcome). Besides accuracy, the computational intractability of SHAP for binomial classification is a cause for concern. For transparent and ethical applications of these hierarchical statistical models, sound audit frameworks need to be developed. In this paper, we propose an audit framework for the technical assessment of multilevel regression models focusing on three aspects: (i) model assumptions and statistical properties, (ii) model transparency using different explainability methods, and (iii) discrimination assessment. To this end, we undertake a quantitative approach and compare intrinsic model methods with SHAP and LIME. The framework comprises a shortlist of KPIs, such as PoCE (Percentage of Correct Explanations) and MDG (Mean Discriminatory Gap) per feature, for each of these three aspects. A traffic-light risk assessment method is furthermore coupled to these KPIs. The audit framework will assist regulatory bodies in performing conformity assessments of AI systems using multilevel binomial classification models at businesses. It will also benefit businesses deploying multilevel models to be future-proof and aligned with the European Commission’s proposed Regulation on Artificial Intelligence.

Keywords: audit, multilevel model, model transparency, model explainability, discrimination, ethics

Procedia PDF Downloads 88
5833 Large Neural Networks Learning From Scratch With Very Few Data and Without Explicit Regularization

Authors: Christoph Linse, Thomas Martinetz

Abstract:

Recent findings have shown that neural networks generalize even in over-parameterized regimes with zero training error. This is surprising, since it runs completely against traditional machine learning wisdom. In our empirical study, we corroborate these findings in the domain of fine-grained image classification. We show that very large convolutional neural networks with millions of weights do learn with only a handful of training samples and without image augmentation, explicit regularization or pretraining. We train the architectures ResNet18, ResNet101 and VGG19 on subsets of the difficult benchmark datasets Caltech101, CUB_200_2011, FGVCAircraft, Flowers102 and StanfordCars with 100 classes and more, perform a comprehensive comparative study and draw implications for the practical application of CNNs. Finally, we show that VGG19 with 140 million weights learns to distinguish airplanes and motorbikes with up to 95% accuracy using only 20 training samples per class.

Keywords: convolutional neural networks, fine-grained image classification, generalization, image recognition, over-parameterized, small data sets

Procedia PDF Downloads 85
5832 Use of Microbial Fuel Cell for Metal Recovery from Wastewater

Authors: Surajbhan Sevda

Abstract:

Metal-containing wastewater is generated in large quantities due to rapid industrialization. Generally, the metals present in wastewater are not biodegradable and can accumulate in the tissue of living animals, humans and plants, causing disorders and diseases. Conventional metal recovery methods include chemical, physical and biological processes, but these are chemical- and energy-intensive. Recent developments in microbial fuel cell (MFC) technology provide a new approach to metal recovery; this technology offers a flexible platform for both reduction- and oxidation-oriented processes. The use of MFCs will be a new platform for a more efficient, low-energy approach to metal recovery from wastewater. So far, metal recovery has been extensively studied using chemical, physical and biological methods. MFCs present a new and efficient approach for removing and recovering metals from different wastewaters, suggesting that the use of different electrodes for metal recovery can be a new, efficient and effective approach.

Keywords: metal recovery, microbial fuel cell, wastewater, bioelectricity

Procedia PDF Downloads 211
5831 Developing an Advanced Algorithm Capable of Classifying News, Articles and Other Textual Documents Using Text Mining Techniques

Authors: R. B. Knudsen, O. T. Rasmussen, R. A. Alphinas

Abstract:

The purpose of this research is to develop an algorithm capable of classifying news articles from the automobile industry, according to the competitive actions that they entail, with the use of Text Mining (TM) methods. To preprocess the data properly, pipelines are prepared that fit each algorithm best. The pipelines are tested along with nine different classification algorithms in the realm of regression, support vector machines, and neural networks. Preliminary testing to identify the optimal pipelines and algorithms resulted in the selection of two algorithms with two different pipelines: Logistic Regression (LR) and an Artificial Neural Network (ANN). These algorithms are optimized further by testing several parameters of each. The best result is achieved with the ANN. The final model yields an accuracy of 0.79, a precision of 0.80, a recall of 0.78, and an F1 score of 0.76. By removing three of the classes that created noise, the final algorithm is capable of reaching an accuracy of 94%.
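A preprocessing-plus-classifier pipeline of the kind described can be sketched as follows. The toy headlines, the two "competitive action" class names, and the TF-IDF settings are all assumptions for illustration; the study's actual nine-algorithm comparison and class set are not reproduced.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical competitive-action classes and toy automobile headlines.
train_docs = [
    "carmaker cuts prices on entry models",
    "dealer discount campaign lowers prices",
    "price reduction across the lineup",
    "new electric model unveiled at auto show",
    "brand launches redesigned suv model",
    "next generation model launch announced",
]
train_labels = ["pricing", "pricing", "pricing",
                "launch", "launch", "launch"]

# One candidate pipeline: word + bigram TF-IDF feeding logistic regression.
pipe = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2), sublinear_tf=True),
    LogisticRegression(max_iter=1000),
)
pipe.fit(train_docs, train_labels)

pred = pipe.predict(["aggressive price cuts announced",
                     "company unveils new model"])
```

In practice each candidate algorithm would get its own pipeline variant (different tokenization, n-gram range, weighting), and the variants would be compared by cross-validated accuracy, precision, recall and F1.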

Keywords: Artificial Neural network, Competitive dynamics, Logistic Regression, Text classification, Text mining

Procedia PDF Downloads 119
5830 Enhancing the Interpretation of Group-Level Diagnostic Results from Cognitive Diagnostic Assessment: Application of Quantile Regression and Cluster Analysis

Authors: Wenbo Du, Xiaomei Ma

Abstract:

With the empowerment of Cognitive Diagnostic Assessment (CDA), various domains of language testing and assessment have been investigated to extract more diagnostic information. Notably, most extant empirical CDA-based research emphasizes individual-level diagnosis, with very little concern for learners’ group-level performance. Even though personalized diagnostic feedback is the unique feature that differentiates CDA from other assessment tools, group-level diagnostic information cannot be overlooked, as it may be more practical in classroom settings. Additionally, the group-level diagnostic information obtained via current CDA often results in a “flat pattern”, that is, mastery or non-mastery of all tested skills accounts for the two highest proportions. In that case, the outcome offers little benefit beyond the original total score. To address these issues, the present study applies cluster analysis for group classification and quantile regression analysis to pinpoint learners’ performance at different proficiency levels (beginner, intermediate and advanced), thus enhancing the interpretation of CDA results extracted from a group of EFL learners’ reading performance on a diagnostic reading test designed by the PELDiaG research team from a key university in China. The results show that the EM method in cluster analysis yields more appropriate classification results than CDA, and quantile regression analysis paints a more insightful picture of learners with different reading proficiencies. The findings are helpful and practical for instructors refining EFL reading curricula and instructional plans tailored to the group classification and quantile regression results. Meanwhile, these statistical methods could also make up for the deficiencies of CDA and push forward the development of language testing and assessment.

Keywords: cognitive diagnostic assessment, diagnostic feedback, EFL reading, quantile regression

Procedia PDF Downloads 145
5829 Modeling Engagement with Multimodal Multisensor Data: The Continuous Performance Test as an Objective Tool to Track Flow

Authors: Mohammad H. Taheri, David J. Brown, Nasser Sherkat

Abstract:

Engagement is one of the most important factors in determining successful outcomes and deep learning in students. Existing approaches to detect student engagement involve periodic human observations that are subject to inter-rater reliability. Our solution uses real-time multimodal multisensor data labeled by objective performance outcomes to infer the engagement of students. The study involves four students with a combined diagnosis of cerebral palsy and a learning disability who took part in a 3-month trial over 59 sessions. Multimodal multisensor data were collected while they participated in a continuous performance test. Eye gaze, electroencephalogram, body pose, and interaction data were used to create a model of student engagement through objective labeling from the continuous performance test outcomes. In order to achieve this, a type of continuous performance test is introduced, the Seek-X type. Nine features were extracted including high-level handpicked compound features. Using leave-one-out cross-validation, a series of different machine learning approaches were evaluated. Overall, the random forest classification approach achieved the best classification results. Using random forest, 93.3% classification for engagement and 42.9% accuracy for disengagement were achieved. We compared these results to outcomes from different models: AdaBoost, decision tree, k-Nearest Neighbor, naïve Bayes, neural network, and support vector machine. We showed that using a multisensor approach achieved higher accuracy than using features from any reduced set of sensors. We found that using high-level handpicked features can improve the classification accuracy in every sensor mode. Our approach is robust to both sensor fallout and occlusions. The single most important sensor feature to the classification of engagement and distraction was shown to be eye gaze. 
It has been shown that we can accurately predict the level of engagement of students with learning disabilities in a real-time approach that is not subject to inter-rater reliability, human observation or reliant on a single mode of sensor input. This will help teachers design interventions for a heterogeneous group of students, where teachers cannot possibly attend to each of their individual needs. Our approach can be used to identify those with the greatest learning challenges so that all students are supported to reach their full potential.
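The random-forest-with-leave-one-out evaluation described above can be sketched as follows. The nine synthetic features and the label rule (dominated by an "eye gaze" feature, echoing the finding that eye gaze was the most important sensor) are toy assumptions, not the study's recorded data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(0)

# Toy stand-ins for the nine multimodal features (eye gaze, EEG band
# power, body pose, interaction counts, handpicked compound features).
n = 60
X = rng.normal(size=(n, 9))
# Objective engagement label driven mainly by the "eye gaze" feature 0.
y = (X[:, 0] + 0.3 * rng.normal(size=n) > 0).astype(int)

# Leave-one-out cross-validation, as used in the study.
scores = cross_val_score(
    RandomForestClassifier(n_estimators=100, random_state=0),
    X, y, cv=LeaveOneOut(),
)
acc = scores.mean()

# Feature importances point back at the dominant sensor mode.
rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
most_important = int(np.argmax(rf.feature_importances_))
```

Leave-one-out is a natural choice here because with only 59 sessions from four students, every held-out sample matters.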

Keywords: affective computing in education, affect detection, continuous performance test, engagement, flow, HCI, interaction, learning disabilities, machine learning, multimodal, multisensor, physiological sensors, student engagement

Procedia PDF Downloads 86
5828 Improved Rare Species Identification Using Focal Loss Based Deep Learning Models

Authors: Chad Goldsworthy, B. Rajeswari Matam

Abstract:

The use of deep learning for species identification in camera trap images has revolutionised our ability to study, conserve and monitor species in a highly efficient and unobtrusive manner, with state-of-the-art models surpassing the accuracy of manual human classification. The high imbalance of camera trap datasets, however, results in poor accuracies for minority (rare or endangered) species due to their relative insignificance to the overall model accuracy. This paper investigates the use of focal loss, in comparison to the traditional cross-entropy loss function, to improve the identification of minority species in the “255 Bird Species” dataset from Kaggle. The results show that, although focal loss slightly decreased the accuracy on the majority species, it increased the F1-score by 0.06 and improved the identification of the bottom two, five and ten (minority) species by 37.5%, 15.7% and 10.8%, respectively, as well as improving overall accuracy by 2.96%.
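The focal loss compared against cross-entropy above down-weights well-classified examples by the factor (1 − pₜ)^γ, so that rare, hard classes dominate the gradient. A minimal NumPy sketch of the standard formulation (with the usual α and γ parameters; the choice of α = 0.25, γ = 2 is the common default, not necessarily the paper's setting):

```python
import numpy as np

def cross_entropy(probs, labels, eps=1e-9):
    """Mean cross-entropy. probs: (N, C) softmax outputs; labels: (N,) ints."""
    p_t = probs[np.arange(len(labels)), labels]   # probability of true class
    return float(np.mean(-np.log(p_t + eps)))

def focal_loss(probs, labels, gamma=2.0, alpha=0.25, eps=1e-9):
    """Mean focal loss: -alpha * (1 - p_t)^gamma * log(p_t).

    The (1 - p_t)^gamma modulating factor shrinks the loss of examples the
    model already classifies confidently, focusing training on hard ones."""
    p_t = probs[np.arange(len(labels)), labels]
    return float(np.mean(-alpha * (1.0 - p_t) ** gamma * np.log(p_t + eps)))

# Confident correct prediction vs. a hard (minority-class-like) one.
probs = np.array([[0.9, 0.1],    # easy example, p_t = 0.9
                  [0.6, 0.4]])   # harder example, p_t = 0.6
labels = np.array([0, 0])
```

With γ = 0 and α = 1 the modulating factor disappears and focal loss reduces exactly to cross-entropy, which is a convenient sanity check when wiring it into a training loop.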

Keywords: convolutional neural networks, data imbalance, deep learning, focal loss, species classification, wildlife conservation

Procedia PDF Downloads 185
5827 Spatial Data Mining by Decision Trees

Authors: Sihem Oujdi, Hafida Belbachir

Abstract:

Existing data mining methods cannot be applied directly to spatial data because spatial data require the consideration of spatial specificities, such as spatial relationships. This paper focuses on classification with decision trees, one of the data mining techniques. We propose an extension of the C4.5 algorithm for spatial data, based on two different approaches: join materialization and querying the different tables on the fly. Similar work has been done on these two main approaches; the first, join materialization, favors processing time at the expense of memory space, whereas the second, querying the different tables on the fly, favors memory space at the expense of processing time. The modified C4.5 algorithm requires three input tables: a target table, a neighbor table, and a spatial join index that contains the possible spatial relationships among the objects in the target table and those in the neighbor table. The proposed algorithms are applied to a spatial dataset in the accidentology domain. A comparative study of our approach with other work on classification by spatial decision trees is detailed.
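At the core of any C4.5 extension is the split criterion: information gain normalized by split information (the gain ratio). A minimal sketch, where `groups` could just as well come from a spatial predicate evaluated against the join index (that spatial wiring is omitted here):

```python
import numpy as np

def entropy(y):
    """Shannon entropy (bits) of a label vector."""
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def gain_ratio(y, groups):
    """C4.5 split criterion: information gain divided by split information.

    groups: list of index arrays partitioning y, e.g. the rows satisfying
    each outcome of a candidate (possibly spatial) test."""
    n = len(y)
    remainder = sum(len(g) / n * entropy(y[g]) for g in groups)
    gain = entropy(y) - remainder
    # Split information = entropy of the partition sizes themselves.
    split_info = entropy(np.concatenate(
        [np.full(len(g), i) for i, g in enumerate(groups)]))
    return gain / split_info if split_info > 0 else 0.0

y = np.array([0, 0, 1, 1])
perfect = [np.array([0, 1]), np.array([2, 3])]   # separates the classes
useless = [np.array([0, 2]), np.array([1, 3])]   # mixes the classes
```

C4.5 evaluates `gain_ratio` for every candidate test and splits on the best one; in the spatial extension, the candidate tests additionally include relationships (e.g. adjacency, distance) looked up via join materialization or on-the-fly queries.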

Keywords: C4.5 algorithm, decision trees, S-CART, spatial data mining

Procedia PDF Downloads 610
5826 Study of Chemical State Analysis of Rubidium Compounds in Lα, Lβ₁, Lβ₃,₄ and Lγ₂,₃ X-Ray Emission Lines with Wavelength Dispersive X-Ray Fluorescence Spectrometer

Authors: Harpreet Singh Kainth

Abstract:

Rubidium salts have been commonly used as an electrolyte additive to improve the cycle efficiency of Li-ion batteries. In recent years, they have been implemented on a larger scale in further technological advances to improve the performance rate and cyclability of batteries. X-ray absorption spectroscopy (XAS) is a powerful tool for obtaining information on the electronic structure involved in the chemical state analysis of the active materials used in batteries. However, this technique is not well suited to industrial applications because it needs a synchrotron X-ray source and a special sample cell for in-situ measurements. In contrast, conventional wavelength dispersive X-ray fluorescence (WDXRF) spectrometry is a nondestructive technique used to study the chemical shift in all transitions (K, L, M, …) and does not require any special sample pre-preparation. In the present work, the fluorescent Lα, Lβ₁, Lβ₃,₄ and Lγ₂,₃ X-ray spectra of rubidium in different chemical forms (Rb₂CO₃, RbCl, RbBr, and RbI) have been measured for the first time with a high-resolution wavelength dispersive X-ray fluorescence (WDXRF) spectrometer (Model: S8 TIGER, Bruker, Germany), equipped with an Rh-anode X-ray tube (4 kW, 60 kV and 170 mA). In ₃₇Rb compounds, the measured energy shifts are in the range (−0.45 to −1.71) eV for the Lα X-ray peak, (0.02 to 0.21) eV for Lβ₁, (0.04 to 0.21) eV for Lβ₃, (0.15 to 0.43) eV for Lβ₄ and (0.22 to 0.75) eV for the Lγ₂,₃ X-ray emission lines. The chemical shifts in rubidium compounds have been measured taking Rb₂CO₃ as the standard reference. A Voigt function is used to determine the central peak position for all compounds. Both positive and negative shifts have been observed in the L-shell emission lines: in the Lα X-ray emission line all compounds show a negative shift, while in the Lβ₁, Lβ₃,₄, and Lγ₂,₃ X-ray emission lines all compounds show a positive shift.
These positive and negative shifts correspond to increases or decreases in the X-ray emission energies. It appears that the ligands attached to the central metal atom attract or repel electrons toward or away from the parent nucleus; this pulling and pushing character affects the central peak position of the compounds, which causes a chemical shift. To understand the chemical effect in more detail, factors such as electronegativity, line intensity ratio, effective charge and bond length are considered in the chemical state analysis of rubidium compounds. The effective charge has been calculated by the Suchet and Pauling methods, while the line intensity ratio has been calculated from the area under the relevant emission peak. In the present work, it has been observed that electronegativity, effective charge and the intensity ratios (Lβ₁/Lα, Lβ₃,₄/Lα and Lγ₂,₃/Lα) are inversely proportional to the chemical shift (RbCl > RbBr > RbI), while bond length is directly proportional to the chemical shift (RbI > RbBr > RbCl).

Keywords: chemical shift in L emission lines, bond length, electro-negativity, effective charge, intensity ratio, Rubidium compounds, WDXRF spectrometer

Procedia PDF Downloads 501
5825 A Robust System for Foot Arch Type Classification from Static Foot Pressure Distribution Data Using Linear Discriminant Analysis

Authors: R. Periyasamy, Deepak Joshi, Sneh Anand

Abstract:

Foot posture assessment is important to evaluate foot type, which causes gait and postural defects in all age groups. Although different methods are used for classification of foot arch type in clinical and research examinations, there is no clear approach for selecting the most appropriate measurement system. Therefore, the aim of this study was to develop a system for evaluation of foot type as a clinical decision-making aid for the diagnosis of flat and normal arches, based on the Arch Index (AI) and a foot pressure distribution parameter, the Power Ratio (PR). The accuracy of the system was evaluated for 27 subjects with ages ranging from 24 to 65 years. Foot area measurements (hindfoot, midfoot, and forefoot) were acquired simultaneously from foot pressure intensity images using a portable PedoPowerGraph system, and the images were analyzed in the frequency domain to obtain the PR data. From our results, we obtain 100% classification accuracy for normal and flat feet using the linear discriminant analysis method. We observe no misclassification of foot types because foot pressure distribution data are incorporated instead of only the arch index (AI). We found that the midfoot pressure distribution ratio data and the arch index (AI) correlate well with foot arch type based on visual analysis. Therefore, this paper suggests that the proposed system is accurate and easy to use for determining foot arch type from the arch index (AI) together with midfoot pressure distribution ratio data, rather than from the physical area of contact alone. Hence, such a computational-tool-based system can help clinicians assess foot structure and cross-check a diagnosis of flat foot against the midfoot pressure distribution.
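Classifying arch type from the two features (AI and PR) with linear discriminant analysis can be sketched as below. The 27-subject split and the AI/PR value ranges are invented for illustration; they are not the study's measurements.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)

# Hypothetical feature values for 27 subjects: 14 normal-arch, 13 flat.
# AI and PR centers/spreads are illustrative, not the paper's data.
ai_normal = rng.normal(0.23, 0.02, 14)
pr_normal = rng.normal(0.30, 0.05, 14)
ai_flat = rng.normal(0.32, 0.02, 13)
pr_flat = rng.normal(0.60, 0.05, 13)

X = np.column_stack([np.concatenate([ai_normal, ai_flat]),
                     np.concatenate([pr_normal, pr_flat])])
y = np.array([0] * 14 + [1] * 13)   # 0 = normal arch, 1 = flat foot

# Two-feature LDA: finds the linear boundary separating the arch types.
lda = LinearDiscriminantAnalysis().fit(X, y)
acc = lda.score(X, y)
```

With well-separated AI/PR distributions the LDA boundary classifies every subject correctly, mirroring the 100% accuracy reported; on real data one would of course validate on held-out subjects rather than the training set.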

Keywords: arch index, computational tool, static foot pressure intensity image, foot pressure distribution, linear discriminant analysis

Procedia PDF Downloads 494
5824 The Effect of Traffic on Harmful Metals and Metalloids in the Street Dust and Surface Soil from Urban Areas of Tehran, Iran: Levels, Distribution and Chemical Partitioning Based on Single and Sequential Extraction Procedures

Authors: Hossein Arfaeinia, Ahmad Jonidi Jafari, Sina Dobaradaran, Sadegh Niazi, Mojtaba Ehsanifar, Amir Zahedi

Abstract:

Street dust and surface soil samples were collected from very heavy, heavy, medium, and low traffic areas and from a natural site in Tehran, Iran. The samples were analyzed for selected physical-chemical features and for the total content and chemical speciation of selected metals and metalloids (Zn, Al, Sr, Pb, Cu, Cr, Cd, Co, Ni, and V) to study the effect of traffic on their mobility and accumulation in the environment. The pH, electrical conductivity (EC), carbonate, and organic carbon (OC) values were similar in soil and dust samples from areas with similar traffic. Traffic increases the EC of both dust and soil matrices but has no effect on metal and metalloid concentrations in soil samples, whereas metal and metalloid levels in dust samples rise with traffic. Moreover, traffic increases the percentage of the acid-soluble fraction and of the Fe- and Mn-oxide-associated fractions of Pb and Zn. The mobilization of Cu, Zn, Pb, and Cr was easier in dust samples than in soil. The speciation of metals and metalloids except Cd is mainly controlled by physicochemical features in soil, whereas in dust samples speciation was affected by the total metal and metalloid content (except for chromium and nickel).

Keywords: street dust, surface soil, traffic, metals, metalloids, chemical speciation

Procedia PDF Downloads 252
5823 Modified Naive Bayes-Based Prediction Modeling for Crop Yield Prediction

Authors: Kefaya Qaddoum

Abstract:

Most greenhouse growers need reliable yield estimates in order to meet market requirements accurately. The purpose of this paper is to build a simple but often satisfactory supervised classification method. The original naive Bayes classifier has a serious weakness: it retains redundant predictors. In this paper, a regularization technique is used to obtain a computationally efficient classifier based on naive Bayes. The suggested construction uses an L1 penalty and is capable of removing redundant predictors; a modification of the LARS algorithm is devised to solve the resulting optimization problem, making the method applicable to a wide range of data. In the experimental section, a study was conducted to examine the effect of redundant and irrelevant predictors and to test the method on a WSG dataset of tomato yields, in which there are many more predictors than data points and the pressing need is to predict weekly yield. Finally, the modified approach is compared with several naive Bayes variants and other classification algorithms (SVM and kNN) and is shown to perform fairly well.
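The redundancy weakness mentioned above is easy to demonstrate: because naive Bayes multiplies per-predictor likelihoods as if they were independent, a duplicated predictor double-counts its evidence and inflates the posterior. Below is a minimal Gaussian naive Bayes sketch with toy parameters; it illustrates the problem the paper targets, not the paper's L1/LARS construction, which needs a penalized solver:

```python
import math

def gaussian_nb_posterior(x, classes):
    """classes: {label: (prior, [(mu, sigma), ...])}; returns P(label | x)
    under the naive (conditional independence) assumption."""
    scores = {}
    for label, (prior, params) in classes.items():
        log_p = math.log(prior)
        for xi, (mu, sigma) in zip(x, params):
            log_p += (-0.5 * ((xi - mu) / sigma) ** 2
                      - math.log(sigma * math.sqrt(2 * math.pi)))
        scores[label] = log_p
    z = max(scores.values())
    total = sum(math.exp(s - z) for s in scores.values())
    return {label: math.exp(s - z) / total for label, s in scores.items()}

# One informative predictor (toy Gaussians)...
single = {"high": (0.5, [(10.0, 2.0)]), "low": (0.5, [(6.0, 2.0)])}
# ...and the same predictor duplicated: naive Bayes treats the copy as
# independent evidence and becomes over-confident.
dupl = {"high": (0.5, [(10.0, 2.0), (10.0, 2.0)]),
        "low": (0.5, [(6.0, 2.0), (6.0, 2.0)])}
```

Evaluating both models on the same observation shows the duplicated predictor pushing the posterior further from 0.5, which is exactly the behavior an L1 penalty suppresses by zeroing the redundant copy.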

Keywords: tomato yield prediction, naive Bayes, redundancy, WSG

Procedia PDF Downloads 228
5822 Earthquake Classification in Molluca Collision Zone Using Conventional Statistical Methods

Authors: H. J. Wattimanela, U. S. Passaribu, A. N. T. Puspito, S. W. Indratno

Abstract:

The Molluca Collision Zone is located at the junction of the Eurasian, Australian, Pacific, and Philippine plates. Between the Sangihe arc to the west of the collision zone and the Halmahera arc to the east, collision is active and the zone is convex toward the Molluca Sea. This research analyzes the behavior of earthquake occurrence in the Molluca Collision Zone: the distribution of earthquakes in each partitioned region, the type of distribution of earthquake occurrence in each region, the mean earthquake occurrence per region, and the correlations between the regions. We count earthquakes using a partition method and analyze their behavior using conventional statistical methods. The data are shallow earthquakes with magnitudes ≥ 4 on the Richter scale for the period 1964-2013 in the Molluca Collision Zone. From the results, we can classify the partitioned regions by correlation into two classes, strong and very strong. This classification can be used in an early warning system for disaster management.
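The correlation-based grouping can be sketched as follows. The Pearson formula is standard, but the strong/very-strong cut-offs below are illustrative defaults, since the abstract does not state the thresholds used:

```python
import math

def pearson(xs, ys):
    """Pearson correlation between two equal-length count series,
    e.g. yearly earthquake counts in two partitioned regions."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def correlation_class(r, strong=0.60, very_strong=0.80):
    """Map |r| to a qualitative class. Cut-offs are illustrative,
    not the paper's published thresholds."""
    r = abs(r)
    if r >= very_strong:
        return "very strong"
    if r >= strong:
        return "strong"
    return "weak"
```

In the paper's setting, each pair of partitioned regions would be scored this way and the pair assigned to the strong or very strong class.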

Keywords: molluca collision zone, partition regions, conventional statistical methods, earthquakes, classifications, disaster management

Procedia PDF Downloads 490
5821 Modeling of Processes Running in Radical Clusters Formed by Ionizing Radiation with the Help of Continuous Petri Nets and Oxygen Effect

Authors: J. Barilla, M. Lokajíček, H. Pisaková, P. Simr

Abstract:

The final biological effect of ionizing particles may be strongly influenced by chemical substances present in cells, mainly in the case of low-LET radiation. The influence of oxygen may be particularly important because oxygen is always present in living cells; the corresponding processes run mainly in the chemical stage of the radiobiological mechanism. The radical clusters formed by the densely ionizing ends of primary or secondary charged particles are mainly responsible for the final biological effect. The damage then depends on the radical concentration at the time the cluster meets a DNA molecule. It may be strongly influenced by oxygen present in the cell, as oxygen can act in different directions: at small concentrations, interaction with hydrogen radicals prevails, while at higher concentrations additional efficient oxygen radicals may be formed. The radical concentration in individual clusters diminishes under two parallel processes: chemical reactions and diffusion of the corresponding clusters. This simultaneous evolution may be modeled and analyzed well with the help of continuous Petri nets, and the influence of other substances present in cells during irradiation may be studied as well. Some results concerning the impact of oxygen content will be presented.
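A toy version of the two competing processes, written as the coupled rate equations a continuous Petri net would integrate, can look like the sketch below. One place (radical concentration) and two transitions (recombination, diffusive dilution); all rate constants are illustrative, not fitted, and the oxygen reactions discussed above would enter as further transitions:

```python
def simulate_cluster(c0=1.0, k_react=0.5, k_diff=0.3, dt=0.01, steps=1000):
    """Euler integration of a toy radical-cluster model:
    radicals are consumed by pairwise recombination (rate k_react * c^2,
    one transition) and diluted as the cluster diffusively expands
    (rate k_diff * c, a second transition). Continuous Petri nets
    integrate exactly this kind of coupled rate system."""
    c = c0
    history = [c]
    for _ in range(steps):
        c += dt * (-k_react * c * c - k_diff * c)
        history.append(c)
    return history
```

The trajectory decays monotonically toward zero, so whether a DNA molecule is damaged depends on how early the cluster reaches it, which is the qualitative point of the model.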

Keywords: radiobiological mechanism, chemical phase, DSB formation, Petri nets

Procedia PDF Downloads 308
5820 Disentangling Biological Noise in Cellular Images with a Focus on Explainability

Authors: Manik Sharma, Ganapathy Krishnamurthi

Abstract:

The cost of some drugs and medical treatments has risen so much in recent years that many patients have to go without them. One of the more surprising reasons behind this cost is how long it takes to bring new treatments to market: despite improvements in technology and science, research and development continue to lag, and finding a new treatment takes, on average, more than 10 years and costs hundreds of millions of dollars. A classification project could make researchers more efficient. If successful, we could dramatically improve the industry's ability to model cellular images according to their relevant biology, in turn greatly decreasing the cost of treatments and ensuring they reach patients faster. This work aims at solving part of this problem by creating a cellular image classification model that can decipher genetic perturbations in cells, whether occurring naturally or induced artificially. Another interesting question addressed is what makes the deep-learning model decide in a particular fashion, which can help demystify the mechanism of action of certain perturbations and paves the way toward the explainability of the deep-learning model.
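One simple, model-agnostic explainability technique consistent with this goal is occlusion sensitivity: mask a patch of the input image and measure how much the classifier's score drops. The dependency-free sketch below uses a toy scoring function as a stand-in for the trained network; it is an illustration of the idea, not the paper's method:

```python
def occlusion_map(image, score_fn, patch=2, fill=0.0):
    """For each patch, replace it with a neutral value and record how much
    the classifier's score drops; large drops mark influential regions."""
    base = score_fn(image)
    rows, cols = len(image), len(image[0])
    drops = {}
    for r in range(0, rows, patch):
        for c in range(0, cols, patch):
            masked = [row[:] for row in image]
            for rr in range(r, min(r + patch, rows)):
                for cc in range(c, min(c + patch, cols)):
                    masked[rr][cc] = fill
            drops[(r, c)] = base - score_fn(masked)
    return drops

# Toy "classifier": responds only to the top-left quadrant of a 4x4 image.
def toy_score(img):
    return sum(img[r][c] for r in range(2) for c in range(2)) / 4.0
```

Run on a uniform image, the map correctly attributes all of the score to the top-left patch, mimicking how the technique localizes the image regions that drive a perturbation call.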

Keywords: cellular images, genetic perturbations, deep-learning, explainability

Procedia PDF Downloads 105
5819 Detection and Classification of Rubber Tree Leaf Diseases Using Machine Learning

Authors: Kavyadevi N., Kaviya G., Gowsalya P., Janani M., Mohanraj S.

Abstract:

Hevea brasiliensis, also known as the rubber tree, is one of the world's most valuable crops. Among its benefits for air oxygenation, the rubber plant can reduce the likelihood of an individual developing respiratory allergies such as asthma. To construct a system that can properly identify crop diseases and pests, and then build a database of insecticides for each pest and disease, the illness must first be detected so that treatment can be given. This article primarily examines three economically damaging leaf diseases: bird's eye spot, algal spot, and powdery mildew. The proposed work focuses on disease identification on rubber tree leaves, accomplished with a high-performing algorithm. The processing pipeline comprises input, preprocessing, image segmentation, feature extraction, and classification, in place of the time-consuming manual procedures currently used to detect the disease. Accordingly, the main ailments, their underlying causes, and the signs and symptoms of diseases that harm the rubber tree are covered in this study.
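The five pipeline stages can be sketched end to end as below. Everything here is a hypothetical stand-in for illustration: the threshold, the single lesion-fraction feature, and the rule-based classifier that takes the place of the trained CNN:

```python
def preprocess(img):
    """Stage 2: normalize pixel values to [0, 1]."""
    hi = max(max(row) for row in img) or 1
    return [[p / hi for p in row] for row in img]

def segment(img, threshold=0.5):
    """Stage 3: binary mask of suspected lesion pixels (dark spots)."""
    return [[1 if p < threshold else 0 for p in row] for row in img]

def extract_features(mask):
    """Stage 4: single illustrative feature, the lesion area fraction."""
    total = sum(len(row) for row in mask)
    return sum(map(sum, mask)) / total

def classify(lesion_fraction):
    """Stage 5: placeholder rule standing in for the CNN's decision."""
    if lesion_fraction > 0.25:
        return "diseased"
    if lesion_fraction > 0.05:
        return "suspect"
    return "healthy"

def pipeline(img):
    """Stage 1 (input) through stage 5, chained."""
    return classify(extract_features(segment(preprocess(img))))
```

A real system would replace the last two stages with learned CNN features and a per-disease output (bird's eye spot, algal spot, powdery mildew), but the data flow is the same.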

Keywords: image processing, python, convolutional neural network (CNN), machine learning

Procedia PDF Downloads 74
5818 Classifications of Sleep Apnea (Obstructive, Central, Mixed) and Hypopnea Events Using Wavelet Packet Transform and Support Vector Machines (SVM)

Authors: Benghenia Hadj Abd El Kader

Abstract:

Sleep apnea events, whether obstructive, central, mixed, or hypopnea, are characterized by frequent breathing cessations or reductions in upper airflow during sleep. An advanced method for analyzing the patterning of biomedical signals to recognize obstructive sleep apnea and hypopnea is presented. To extract the characteristic parameters used for classifying the above-stated (obstructive, central, mixed) sleep apnea and hypopnea events, the proposed method first analyzes polysomnography signals such as the electrocardiogram (ECG) and electromyogram (EMG), and then performs the classification. The analysis is carried out using the wavelet transform technique to extract characteristic parameters, and the classification is carried out by applying the support vector machine (SVM) technique. The obtained results show good recognition rates with these characteristic parameters.
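The feature-extraction stage can be sketched with a two-level Haar wavelet packet decomposition, using per-band energies as the characteristic parameters; this is a minimal stand-in (in practice deeper trees, other wavelets, and a library-trained SVM would follow):

```python
def haar_step(signal):
    """One level of the orthonormal Haar transform: approximation + detail."""
    approx = [(signal[i] + signal[i + 1]) / 2 ** 0.5
              for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 ** 0.5
              for i in range(0, len(signal), 2)]
    return approx, detail

def wavelet_packet_energies(signal, levels=2):
    """Energy of every leaf of a small wavelet packet tree. Unlike the
    plain wavelet transform, the packet transform splits both the
    approximation and the detail branch at each level. These per-band
    energies are the kind of characteristic parameters fed to the SVM."""
    nodes = [signal]
    for _ in range(levels):
        next_nodes = []
        for node in nodes:
            a, d = haar_step(node)
            next_nodes += [a, d]
        nodes = next_nodes
    return [sum(x * x for x in node) for node in nodes]
```

Because the Haar transform is orthonormal, the leaf energies sum to the signal energy, so the feature vector redistributes rather than loses information.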

Keywords: obstructive, central, mixed, sleep apnea, hypopnea, ECG, EMG, wavelet transform, SVM classifier

Procedia PDF Downloads 367
5817 Low-Temperature Silanization of Medical Vials: Chemical Bonding and Performance

Authors: Yuanping Yang, Ruolin Zhou, Xingyu Liu, Lianbin Wu

Abstract:

Silanization of pharmaceutical glass packaging faces several challenges. The conventional high-temperature silicone-oil baking method consumes a great deal of energy, and because the silicone oil is generally only physically adsorbed on the inner surface of the vial, proteins can adsorb onto the oil layer and flake off, increasing the particle count in the drug solution and posing potential risks to patients. In this paper, a new silanization method is proposed: efficient silanization is achieved by chemically bonding trimethylsilyl groups to the inner surface of medical vials at low temperature. The inner wall of the vial thereby acquires stable hydrophobicity, with a water contact angle of 100°-110°. As the concentration of the silanizing reagent increases, the water resistance of the treated vials increases correspondingly. The treatment also effectively reduces the risk of pH increase and sodium ion leaching.

Keywords: low-temperature silanization, medical vials, chemical bonding, hydrophobicity

Procedia PDF Downloads 75
5816 Thermodynamics of Water Condensation on an Aqueous Organic-Coated Aerosol Aging via Chemical Mechanism

Authors: Yuri S. Djikaev

Abstract:

A large subset of aqueous aerosols can be initially (immediately upon formation) coated with various organic amphiphilic compounds whereof the hydrophilic moieties are attached to the aqueous aerosol core while the hydrophobic moieties are exposed to the air thus forming a hydrophobic coating thereupon. We study the thermodynamics of water condensation on such an aerosol whereof the hydrophobic organic coating is being concomitantly processed by chemical reactions with atmospheric reactive species. Such processing (chemical aging) enables the initially inert aerosol to serve as a nucleating center for water condensation. The most probable pathway of such aging involves atmospheric hydroxyl radicals that abstract hydrogen atoms from hydrophobic moieties of surface organics (first step), the resulting radicals being quickly oxidized by ubiquitous atmospheric oxygen molecules to produce surface-bound peroxyl radicals (second step). Taking these two reactions into account, we derive an expression for the free energy of formation of an aqueous droplet on an organic-coated aerosol. The model is illustrated by numerical calculations. The results suggest that the formation of aqueous cloud droplets on such aerosols is most likely to occur via Kohler activation rather than via nucleation. The model allows one to determine the threshold parameters necessary for their Kohler activation. Numerical results also corroborate previous suggestions that one can neglect some details of aerosol chemical composition in investigating aerosol effects on climate.
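For context, the Kohler activation mentioned above is usually written in the textbook form below; this is a standard reference expression, not the free-energy formula derived in the paper. Here sigma is the surface tension, M_w and rho_w the molar mass and density of water, R the gas constant, T the temperature, and n_s the moles of dissolved solute:

```latex
% Equilibrium saturation ratio S over a solution droplet of radius r:
% Kelvin (curvature) term minus Raoult (solute) term.
\ln S(r) = \frac{A}{r} - \frac{B}{r^{3}},
\qquad
A = \frac{2\sigma M_w}{R T \rho_w},
\qquad
B = \frac{3 n_s M_w}{4\pi \rho_w}
% Setting dS/dr = 0 gives the activation (critical) point:
% r_c = \sqrt{3B/A}, \qquad \ln S_c = \sqrt{4A^{3}/(27B)}
```

A droplet that grows past r_c at supersaturation S_c activates and grows without further barrier, which is the Kohler-activation pathway the paper contrasts with nucleation.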

Keywords: aqueous aerosols, organic coating, chemical aging, cloud condensation nuclei, Kohler activation, cloud droplets

Procedia PDF Downloads 388
5815 Electrical Properties of CVD-Graphene on SiC

Authors: Bilal Jabakhanji, Dimitris Kazazis, Adrien Michon, Christophe Consejo, Wilfried Desrat, Benoit Jouault

Abstract:

In this paper, we investigate the electrical properties of graphene grown by Chemical Vapor Deposition (CVD) on the Si face of SiC substrates. Depending on the growth condition, hole or electron doping can be achieved, down to a few 10¹¹ cm⁻². The high homogeneity of the graphene and the low intrinsic carrier concentration allow the remarkable observation of the Half Integer Quantum Hall Effect, typical of graphene, at the centimeter scale.
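The half-integer quantum Hall effect referred to above is the hallmark quantization sequence of monolayer graphene (a standard result, quoted here for context):

```latex
% Hall conductivity plateaus of monolayer graphene: the factor 4 is the
% spin-valley degeneracy, and the 1/2 offset is the "half-integer"
% signature that distinguishes graphene from conventional 2D systems.
\sigma_{xy} = \pm \, 4 \left( N + \tfrac{1}{2} \right) \frac{e^{2}}{h},
\qquad N = 0, 1, 2, \dots
```

Observing this sequence at the centimeter scale is what certifies the homogeneity of the CVD graphene layer.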

Keywords: graphene, quantum hall effect, chemical vapor, deposition, silicon carbide

Procedia PDF Downloads 662
5814 Effect of Thistle Ecotype in the Physical-Chemical and Sensorial Properties of Serra da Estrela Cheese

Authors: Raquel P. F. Guiné, Marlene I. C. Tenreiro, Ana C. Correia, Paulo Barracosa, Paula M. R. Correia

Abstract:

The objective of this study was to evaluate the physical and chemical characteristics of Serra da Estrela cheese and to compare these results with those of a sensory analysis. Six samples of Serra da Estrela cheese, produced with six different ecotypes of thistle, were taken from a dairy in Penalva do Castelo. The chemical properties evaluated were moisture content, protein, fat, ash, chlorides, and pH; the physical properties studied were color and texture; finally, a sensory evaluation was undertaken. The results showed moisture varying in the range 40-48%, protein 15-20%, fat 41-45%, ash 3.9-5.0%, and chlorides 1.2-3.0%. The pH varied from 4.8 to 5.4. The textural properties revealed that crust hardness is relatively low (maximum 7.3 N), although greater than flesh firmness (maximum 1.7 N), and that these cheeses are indeed of the soft-paste type, with measurable stickiness and intense adhesiveness. The color analysis showed that the crust is relatively light (L* over 50) and predominantly yellow (b* around 20 or over), although with a slight greenish tone (negative a*). The sensory analysis did not show great variability for most of the attributes measured, although some differences were found in attributes such as crust thickness, crust uniformity, and flesh creaminess.

Keywords: chemical composition, color, sensorial analysis, Serra da Estrela cheese, texture

Procedia PDF Downloads 299
5813 Discrimination and Classification of Vestibular Neuritis Using Combined Fisher and Support Vector Machine Model

Authors: Amine Ben Slama, Aymen Mouelhi, Sondes Manoubi, Chiraz Mbarek, Hedi Trabelsi, Mounir Sayadi, Farhat Fnaiech

Abstract:

Vertigo is a sensation of being off balance; the cause of this symptom is very difficult to interpret and requires a complementary examination. Vertigo is generally caused by an ear problem; some of the most common causes include benign paroxysmal positional vertigo (BPPV), Meniere's disease, and vestibular neuritis (VN). In clinical practice, different tests of the videonystagmographic (VNG) technique are used to detect the presence of vestibular neuritis. The topographic diagnosis of this disease presents great diversity in its characteristics, which complicates the usual etiological analysis methods. In this study, a vestibular neuritis analysis method is proposed for videonystagmography (VNG) applications, using an estimation of pupil movements in the case of uncontrolled motion, to obtain efficient and reliable diagnostic results. First, the pupil displacement vectors are estimated using the Hough Transform (HT) to approximate the location of the pupil region. Then, temporal and frequency features are computed from the variation of the rotation angle of the pupil motion. Finally, an optimized feature subset is selected using the Fisher criterion for discrimination and classification of VN. Experimental results are analyzed in two categories, normal and pathologic; by classifying the reduced feature set with a Support Vector Machine (SVM), a classification accuracy of 94% is achieved. Compared with recent studies, the proposed expert system is extremely helpful and highly effective in resolving the problem of VNG analysis and provides accurate diagnostics for medical devices.
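The Fisher-criterion selection step can be sketched per feature. The score below is the usual univariate two-class form, (m0 - m1)^2 / (v0 + v1); the sample feature values are invented for illustration:

```python
def fisher_score(feature0, feature1):
    """Fisher criterion for one feature over two classes (e.g. normal vs
    pathologic): squared mean separation over pooled variance.
    Larger = more discriminative."""
    def stats(xs):
        n = len(xs)
        m = sum(xs) / n
        v = sum((x - m) ** 2 for x in xs) / n
        return m, v
    m0, v0 = stats(feature0)
    m1, v1 = stats(feature1)
    return (m0 - m1) ** 2 / (v0 + v1)

def select_features(X0, X1, k):
    """Rank feature columns by Fisher score and keep the top-k indices;
    the reduced feature set would then go to the SVM."""
    scores = [(fisher_score([r[j] for r in X0], [r[j] for r in X1]), j)
              for j in range(len(X0[0]))]
    scores.sort(reverse=True)
    return [j for _, j in scores[:k]]
```

A feature whose class means coincide scores zero and is dropped, which is how the method discards uninformative temporal or frequency features before classification.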

Keywords: nystagmus, vestibular neuritis, videonystagmographic system, VNG, Fisher criterion, support vector machine, SVM

Procedia PDF Downloads 134
5812 Machine Learning Techniques in Bank Credit Analysis

Authors: Fernanda M. Assef, Maria Teresinha A. Steiner

Abstract:

The aim of this paper is to compare and discuss classifier algorithm options for credit risk assessment by applying different machine learning techniques. Using records from a Brazilian financial institution, this study draws on a database of 5,432 companies that are clients of the bank: 2,600 clients are classified as non-defaulters, 1,551 as defaulters, and 1,281 as temporarily defaulters, meaning that those clients are overdue on their payments for up to 180 days. For each case, a total of 15 attributes was considered in a one-against-all assessment using four techniques: Artificial Neural Networks Multilayer Perceptron (ANN-MLP), Artificial Neural Networks Radial Basis Functions (ANN-RBF), Logistic Regression (LR), and Support Vector Machines (SVM). Numerical attributes were coded in thermometer code and nominal attributes in dummy coding. Each method was then evaluated over different parameter settings, and the best result of each technique was compared in terms of accuracy, false positives, false negatives, true positives, and true negatives. This comparison showed that the best method in terms of accuracy was ANN-RBF (79.20% for non-defaulter classification, 97.74% for defaulters, and 75.37% for temporarily defaulters). However, the best accuracy does not always indicate the best technique: on the classification of temporarily defaulters, ANN-RBF was surpassed in terms of false positives by SVM, which had the lowest false-positive rate (0.07%). These details are discussed in light of the results found, and an overview is given in the conclusion of this study.
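The one-against-all protocol can be sketched with a deliberately simple stand-in classifier (nearest centroid rather than the ANN/LR/SVM models compared in the paper). The three labels mirror the client classes; all feature values are invented:

```python
def centroid(rows):
    """Mean feature vector of a set of rows."""
    n = len(rows)
    return [sum(r[i] for r in rows) / n for i in range(len(rows[0]))]

def dist2(a, b):
    """Squared Euclidean distance."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def one_vs_all_fit(data):
    """data: {label: [feature rows]}. For each label, build one binary
    'target vs rest' model, here a pair of centroids standing in for a
    trained classifier."""
    models = {}
    for label, rows in data.items():
        rest = [r for other, rs in data.items() if other != label for r in rs]
        models[label] = (centroid(rows), centroid(rest))
    return models

def one_vs_all_predict(models, x):
    """Score each binary model by how much closer x is to 'target' than
    to 'rest'; the label with the strongest margin wins."""
    return max(models,
               key=lambda lb: dist2(x, models[lb][1]) - dist2(x, models[lb][0]))
```

In the paper's setup, each of the three binary models would instead be an ANN-MLP, ANN-RBF, LR, or SVM classifier, but the decision rule over the binary scores is the same.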

Keywords: artificial neural networks (ANNs), classifier algorithms, credit risk assessment, logistic regression, machine learning, support vector machines

Procedia PDF Downloads 101