Search results for: mining methods
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 15925


13915 A Similar Image Retrieval System for Auroral All-Sky Images Based on Local Features and Color Filtering

Authors: Takanori Tanaka, Daisuke Kitao, Daisuke Ikeda

Abstract:

The aurora is a striking phenomenon, but its overall mechanism is difficult to understand. A data-intensive approach may be effective for elucidating such a complex phenomenon. To pursue it, we need labeled data indicating when auroras appeared and of what types. In this paper, we propose an image retrieval system for auroral all-sky images, some of which include discrete or diffuse aurora while others contain no aurora. The proposed system retrieves images similar to a query image by using a popular image recognition method. Using 300 all-sky images obtained at Tromso, Norway, we evaluate two image recognition methods with and without our original color filtering method. The best performance is achieved when SIFT is combined with the color filtering: accuracy is 81.7% for discrete auroras and 86.7% for diffuse auroras.
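The abstract does not describe the color filtering step itself. As a purely illustrative assumption, a pre-filter that keeps only green-dominant (aurora-like) pixels before feature extraction might look like the following sketch; the function name and the `green_margin` threshold are hypothetical.

```python
def color_filter(pixels, green_margin=30):
    """Hypothetical aurora color filter: keep pixels whose green channel
    dominates both red and blue by at least green_margin; zero out the rest.
    pixels is a list of (r, g, b) tuples in 0-255 range."""
    filtered = []
    for (r, g, b) in pixels:
        if g > r + green_margin and g > b + green_margin:
            filtered.append((r, g, b))
        else:
            filtered.append((0, 0, 0))
    return filtered
```

In a full pipeline such a mask would be applied to the all-sky image before SIFT keypoints are extracted, so that star-field and background pixels contribute no features.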

Keywords: data-intensive science, image classification, content-based image retrieval, aurora

Procedia PDF Downloads 445
13914 An Adaptive Oversampling Technique for Imbalanced Datasets

Authors: Shaukat Ali Shahee, Usha Ananthakumar

Abstract:

A data set exhibits the class imbalance problem when one class has very few examples compared to the other; this is also referred to as between-class imbalance. Traditional classifiers fail to classify minority class examples correctly due to their bias towards the majority class. Apart from between-class imbalance, within-class imbalance, where classes are composed of different numbers of sub-clusters each containing a different number of examples, also deteriorates classifier performance. Many methods have previously been proposed for handling the imbalanced dataset problem. These methods can be classified into four categories: data preprocessing, algorithmic methods, cost-based methods, and ensembles of classifiers. Data preprocessing techniques have shown great potential, as they attempt to improve the data distribution rather than the classifier. They handle class imbalance either by increasing the minority class examples or by decreasing the majority class examples. Decreasing the majority class examples leads to loss of information, and when the minority class is absolutely rare, removing majority class examples is generally not recommended. Existing methods for handling class imbalance do not address between-class and within-class imbalance simultaneously. In this paper, we propose a method that handles both simultaneously for the binary classification problem. Removing both imbalances at once eliminates the classifier's bias towards bigger sub-clusters by minimizing their domination of the total error. The proposed method uses model-based clustering to find sub-clusters or sub-concepts in the dataset. The number of examples oversampled within each sub-cluster is determined by the complexity of the sub-cluster. The method also takes into consideration the scatter of the data in the feature space and adaptively copes with unseen test data using the Lowner-John ellipsoid, increasing classifier accuracy. A neural network is used as the classifier, since it minimizes total error; removing both imbalances simultaneously helps it give equal weight to all sub-clusters irrespective of class. The proposed method is validated on 9 publicly available data sets and compared with three existing oversampling techniques that rely on the spatial location of minority class examples in the Euclidean feature space. The experimental results show the proposed method to be statistically significantly superior to the other methods on various accuracy measures. It can thus serve as a good alternative for imbalanced problem domains such as credit scoring, customer churn prediction, and financial distress prediction.
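The abstract does not give the oversampling rule in detail. A minimal sketch of the general idea, equalizing sub-cluster sizes by SMOTE-style interpolation between random member pairs, is shown below; the function name, the equal-size target, and the interpolation scheme are assumptions, not the authors' exact method (which additionally weights by sub-cluster complexity and uses the Lowner-John ellipsoid).

```python
import random

def oversample_subclusters(subclusters, target_size, seed=0):
    """Grow each minority sub-cluster to target_size by interpolating between
    random member pairs, so smaller sub-clusters receive proportionally more
    synthetic examples. Each sub-cluster is a list of coordinate tuples."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    balanced = []
    for cluster in subclusters:
        synth = list(cluster)
        while len(synth) < target_size:
            a, b = rng.sample(cluster, 2)        # pick two real members
            t = rng.random()                     # random point on the segment
            synth.append(tuple(ai + t * (bi - ai) for ai, bi in zip(a, b)))
        balanced.append(synth)
    return balanced
```

Because synthetic points lie on segments between real members, they stay inside the convex hull of each sub-cluster, which is the spirit of the ellipsoid-based containment the abstract describes.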

Keywords: classification, imbalanced dataset, Lowner-John ellipsoid, model based clustering, oversampling

Procedia PDF Downloads 411
13913 Determination of Water Pollution and Water Quality with Decision Trees

Authors: Çiğdem Bakır, Mecit Yüzkat

Abstract:

With the increasing emphasis on water quality worldwide, the search for, and market for, new and intelligent monitoring systems have grown. The current method is a laboratory process in which samples are taken from bodies of water and tested in laboratories; this is time-consuming, labour-intensive, and uneconomical. To address this problem, we used machine learning methods to detect water pollution. Using the Orange3 software, we built decision trees and sought to determine all the factors that cause water pollution. An automatic prediction model based on water quality was developed from many model inputs, such as water temperature, pH, transparency, conductivity, dissolved oxygen, and ammonia nitrogen. The proposed approach consists of three stages: preprocessing of the data, feature detection, and classification. We evaluated the model with several accuracy metrics and present the results comparatively. The decision tree achieved approximately 98% accuracy.
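A trained decision tree on such inputs reduces to nested threshold tests. The toy classifier below illustrates the shape of such a tree on two of the abstract's features; the thresholds and class names are invented for illustration, not taken from the paper.

```python
def classify_water(sample):
    """Toy two-level decision tree over water-quality features.
    Thresholds are hypothetical, chosen only to show the structure a
    learned tree (e.g. from Orange3) would have."""
    if sample["dissolved_oxygen"] < 4.0:   # low oxygen -> likely polluted
        return "polluted"
    if sample["ammonia_nitrogen"] > 1.0:   # high ammonia -> likely polluted
        return "polluted"
    return "clean"
```

In practice each split point would be chosen by an impurity criterion (e.g. information gain or Gini) over the training data rather than fixed by hand.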

Keywords: decision tree, water quality, water pollution, machine learning

Procedia PDF Downloads 77
13912 Flavonoids and Phenolic Acids from the Aerial Parts of Alyssum alyssoides

Authors: Olga St. Tsiftsoglou, Diamanto M. Lazari, Eugene L. Kokkalou

Abstract:

Most Alyssum species of the Brassicaceae family have been studied mainly for their contribution to ecology. In this study, A. alyssoides was examined for its chemical constituents. The methanol extract of its aerial parts was fractionated by liquid-liquid extraction (distribution) with four solvents of increasing polarity: diethyl ether, ethyl acetate, 1-butanol and water. The diethyl ether and ethyl acetate extracts were further studied for their chemical composition. So far, secondary metabolites belonging to the phenolics have been isolated using several chromatographic methods (C.C. and HPLC) and identified using spectroscopic methods (UV/Vis, NMR and MS): two phenolic acids (p-hydroxy-benzoic acid and 3-methoxy-4-hydroxy-benzoic acid (vanillic acid)), and five flavonoids, all derivatives of flavonol: kaempferol 3-O-β-D-glucopyranoside (astragalin), kaempferol 3-O-(6′′-α-L-rhamnopyranosyl)-β-D-glucopyranoside (nicotiflorin), quercetin 3-O-β-D-glucopyranoside (isoquercetin), isorhamnetin 3-O-β-D-glucopyranoside, and isorhamnetin 3-O-(6′′-α-L-rhamnopyranosyl)-β-D-glucopyranoside (narcissin).

Keywords: Alyssum, chemical constituents, flavonoids, phenolic acids

Procedia PDF Downloads 316
13911 An Overview of College English Writing Teaching Studies in China Between 2002 and 2022: Visualization Analysis Based on CiteSpace

Authors: Yang Yiting

Abstract:

This paper employs CiteSpace to conduct a visualization analysis of the literature on college English writing teaching published in core journals from the CNKI database and CSSCI journals between 2002 and 2022. It aims to explore the characteristics of research on college English writing teaching and its future directions. The study yielded the following major findings: the field primarily focuses on innovative writing teaching models and methods, the integration of traditional classroom teaching with information technology, and instructional strategies to enhance students' writing skills. Future research is anticipated to involve a hybrid writing teaching approach combining online and offline methods, leveraging the "Internet+" digital platform to elevate students' writing proficiency. The paper also presents a prospective outlook for college English writing teaching research in China.

Keywords: citespace, college English, writing teaching, visualization analysis

Procedia PDF Downloads 67
13910 Generic Hybrid Models for Two-Dimensional Ultrasonic Guided Wave Problems

Authors: Manoj Reghu, Prabhu Rajagopal, C. V. Krishnamurthy, Krishnan Balasubramaniam

Abstract:

A thorough understanding of guided ultrasonic wave behavior in structures is essential for the application of existing Non-Destructive Evaluation (NDE) technologies, as well as for the development of new methods. However, the analysis of guided wave phenomena is challenging because of their complex dispersive and multimodal nature. Although numerical solution procedures have proven very useful in this regard, the increasing complexity of features and defects to be considered, as well as the desire to improve inspection accuracy, often imposes a large computational cost. Hybrid models that combine numerical solutions for wave scattering with faster alternative methods for wave propagation have long been considered a solution to this problem. However, such models usually require modification of the base code of the solution procedure. Here we aim to develop generic hybrid models that can be applied directly to any two different solution procedures. With this goal in mind, a Numerical Hybrid model and an Analytical-Numerical Hybrid model have been developed. The concept and implementation of these hybrid models are discussed in this paper.

Keywords: guided ultrasonic waves, Finite Element Method (FEM), Hybrid model

Procedia PDF Downloads 459
13909 Parkinson’s Disease Detection Analysis through Machine Learning Approaches

Authors: Muhtasim Shafi Kader, Fizar Ahmed, Annesha Acharjee

Abstract:

Machine learning and data mining are crucial in health care, medical informatics, and detection. Machine learning approaches are now being utilized for a variety of critical health issues, including diabetes detection, neuron cell tumor diagnosis, COVID-19 identification, and so on. Parkinson's disease primarily affects senior citizens, including in Bangladesh. Its symptoms are progressive and worsen with time: as the condition advances, patients have trouble walking and communicating. Patients can also experience psychological and behavioral changes, sleep problems, depression, memory loss, and fatigue. Parkinson's disease occurs in both men and women, although women are affected at roughly half the rate of men. In this research, we aim to identify the most accurate machine learning algorithm for detecting the disease on a benchmark dataset. Nine classifiers are compared: Naive Bayes, Adaptive Boosting, Bagging, Decision Tree, Random Forest, XGB, K-Nearest Neighbor, Support Vector Machine, and Gradient Boosting.
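The abstract does not state the evaluation protocol for comparing the nine classifiers. A generic k-fold cross-validation harness, shown here with a trivial majority-class baseline standing in for the real classifiers, could be sketched as follows; the function names are illustrative.

```python
def kfold_accuracy(fit, predict, X, y, k=5):
    """Mean accuracy over k contiguous folds. `fit(X_train, y_train)` returns
    a model; `predict(model, x)` returns a label. Any of the nine classifiers
    in the abstract could be plugged in through this interface."""
    n = len(X)
    fold = n // k
    scores = []
    for i in range(k):
        lo, hi = i * fold, (i + 1) * fold if i < k - 1 else n
        X_tr, y_tr = X[:lo] + X[hi:], y[:lo] + y[hi:]
        X_te, y_te = X[lo:hi], y[lo:hi]
        model = fit(X_tr, y_tr)
        preds = [predict(model, x) for x in X_te]
        scores.append(sum(p == t for p, t in zip(preds, y_te)) / len(y_te))
    return sum(scores) / k

def fit_majority(X, y):
    """Baseline: remember the most frequent training label."""
    return max(set(y), key=y.count)

def predict_majority(model, x):
    return model
```

Real comparisons would additionally shuffle or stratify the folds so class proportions are preserved in each split.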

Keywords: naive bayes, adaptive boosting, bagging classifier, decision tree classifier, random forest classifier, XGB classifier, k nearest neighbor classifier, support vector classifier, gradient boosting classifier

Procedia PDF Downloads 124
13908 Process Data-Driven Representation of Abnormalities for Efficient Process Control

Authors: Hyun-Woo Cho

Abstract:

Unexpected operational events or abnormalities in industrial processes have a serious impact on the quality of the final product. In terms of statistical process control, fault detection and diagnosis is one of the essential tasks needed to run a process safely. In this work, a nonlinear representation of process measurement data is presented and evaluated using a simulated process. The effect of different representation methods on diagnosis performance is tested in terms of computational efficiency and data handling. The results show that the nonlinear representation technique produced more reliable diagnosis results and outperforms linear methods. A data filtering step improved computational speed and diagnosis performance on the test data sets. The presented scheme differs from existing ones in that it attempts to extract the fault pattern in the reduced space, not in the original process variable space, which helps reduce the sensitivity of empirical models to noise.
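Fault detection in a reduced space typically means scoring each new observation against the normal-operation statistics of the projected data. The sketch below uses a diagonal stand-in for the Hotelling T² statistic common in statistical process monitoring; the function names and the diagonal simplification are assumptions for illustration, not the paper's method.

```python
def diag_t2_score(sample, mean, inv_var):
    """Sum of per-dimension squared deviations from the normal-operation
    mean, each scaled by the inverse variance of that reduced-space
    coordinate (a diagonal simplification of Hotelling's T^2)."""
    return sum(iv * (x - m) ** 2 for x, m, iv in zip(sample, mean, inv_var))

def is_faulty(sample, mean, inv_var, threshold):
    """Flag a fault when the monitoring statistic exceeds a control limit."""
    return diag_t2_score(sample, mean, inv_var) > threshold
```

The control limit `threshold` would normally be set from the distribution of scores on fault-free training data (e.g. a high percentile).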

Keywords: fault diagnosis, nonlinear technique, process data, reduced spaces

Procedia PDF Downloads 244
13907 Improved Processing Speed for Text Watermarking Algorithm in Color Images

Authors: Hamza A. Al-Sewadi, Akram N. A. Aldakari

Abstract:

Copyright protection and ownership proof for digital multimedia are achieved nowadays by digital watermarking techniques. This paper proposes a text watermarking algorithm for protecting the property rights and supporting ownership judgment of color images. Embedding is achieved by inserting text elements randomly into the color image as noise. The YIQ image processing model is found to be faster than other image processing models and is hence adopted for the embedding process. Optionally, the text watermark may be encrypted before embedding (where required by an application) using any enciphering technique, adding further difficulty for attackers. Experiments showed an embedding speed more than double that of the other systems considered (such as the least significant bit method and separate color code methods), and a fairly acceptable peak signal-to-noise ratio (PSNR) with low mean square error values for watermarking purposes.
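The RGB-to-YIQ conversion the embedding relies on is the standard NTSC linear transform. A minimal per-pixel version (channel values in [0, 1]) is:

```python
def rgb_to_yiq(r, g, b):
    """Classic NTSC RGB -> YIQ transform. Y carries luminance; I and Q carry
    chrominance, where watermark noise is less perceptible."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    i = 0.596 * r - 0.274 * g - 0.322 * b
    q = 0.211 * r - 0.523 * g + 0.312 * b
    return y, i, q
```

Python's standard library exposes the same transform as `colorsys.rgb_to_yiq`; watermark bits would be added to the I/Q components and the inverse transform applied before saving the image.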

Keywords: steganography, watermarking, time complexity measurements, private keys

Procedia PDF Downloads 141
13906 Studying the Theoretical and Laboratory Design of a Concrete Frame and Optimizing Its Design for Impact and Earthquake Resistance

Authors: Mehrdad Azimzadeh, Seyed Mohammadreza Jabbari, Mohammadreza Hosseinzadeh Alherd

Abstract:

This paper presents experimental results and analytical studies on increasing the resistance of single-span reinforced concrete frames to impact loads, modeling them according to optimization methods and optimizing their behavior under impact loading. In this study, about 30 designs for different frames were modeled and built using specialized software such as ANSYS and SAP, and their behavior was examined under variable impacts. Suitable strategies were then proposed for the frames in terms of concrete mixing in order to optimize the frame design. To reduce the weight of the frames, fine-grained aggregate had to be used. After designing about eight variants of each frame type, three samples were designed with the aim of controlling the impact strength parameters. A favorable frame shape for impact resistance emerged: a solid frame with strong legs spaced as far apart as possible, with a 3-degree gradient in the upper part of the beam.

Keywords: optimization, reinforced concrete, optimization methods, impact load, earthquake

Procedia PDF Downloads 178
13905 Effects of Different Processing Methods on Composition, Physicochemical and Morphological Properties of MR263 Rice Flour

Authors: R. Asmeda, A. Noorlaila, M. H. Norziah

Abstract:

This research investigated the effects of different grinding techniques during milling of rice grains on the physicochemical characteristics of the resulting rice flour. Dry grinding, semi-wet grinding, and wet grinding were employed to produce the flour. The results indicated that the grinding method significantly (p ≤ 0.05) affected the physicochemical and functional properties of the starch, except for carbohydrate content, x-ray diffraction pattern, and breakdown viscosity. Dry grinding caused the highest percentage of starch damage compared to semi-wet and wet grinding. Protein, fat, and ash content were highest in rice flour obtained by dry grinding. Wet grinding produced flour with the smallest average particle size (8.52 µm), resulting in the highest process yield (73.14%). Pasting profiles revealed that dry grinding produced flour with the lowest pasting temperature and highest setback viscosity.

Keywords: average particle size, grinding techniques, physicochemical characteristics, rice flour

Procedia PDF Downloads 191
13904 Physicochemical Studies and Screening of Aflatoxins and Pesticide Residues in Some 'Honey Pastes' Marketed in Jeddah, Saudi Arabia

Authors: Rashad Al-Hindi

Abstract:

The study aimed at investigating and screening for contaminants in some honey-based products. Sixty-nine 'honey paste' samples marketed in Jeddah, Saudi Arabia, were subjected to physicochemical studies and screened for aflatoxins and pesticide residues. The physicochemical parameters studied were: moisture content, total sugars, total ash, total nitrogen, fibres, total acidity as citric acid, and pH, all investigated using standard methods of analysis. Mycotoxins (aflatoxins) and pesticide residues were screened by enzyme-linked immunosorbent assay (ELISA) according to official methods. Results revealed mean values for the examined criteria of 15.44±0.36%, 74±4.30%, 0.40±0.062%, 0.22±0.05%, 6.93±1.30%, 2.53±0.161 mmol/kg, and 4.10±0.158, respectively. Overall, all tested honey paste samples were free from mycotoxins (aflatoxins) and pesticide residues. We therefore conclude that 'honey pastes' marketed in Jeddah city, Saudi Arabia, are safe for human consumption.

Keywords: aflatoxins, honey mixtures, pesticide residues, physicochemical

Procedia PDF Downloads 170
13903 Architectural Design Studio (ADS) as an Operational Synthesis in Architectural Education

Authors: Francisco A. Ribeiro Da Costa

Abstract:

Those responsible for teaching architecture must consider various ways of engaging students in learning, deploying a range of pedagogical tools to streamline the creative process. The Architectural Design Studio (ADS) should become a holistic, systemic process responding to the complexity of our world. This essay corresponds to a deep reflection developed by the author on the teaching of architecture. The outcomes achieved are the corollary of experimentation, discussion, and application of pedagogical methods that allowed the creativity applied by students to be consolidated. The purpose is to present the conjectures that have proved effective in creating an intellectual environment that nurtures the Architectural Design Studio (ADS) as an operational synthesis in the final stage of the degree. These assumptions, which are part of the proposed model, draw on theories and teaching methodologies that seek to respect the learning process based on Kolb's student learning styles, ensuring their latent specificities and formulating the structure of the ADS discipline. In addition, assessment methods are proposed that consider the Architectural Design Studio as an operational synthesis in the teaching of architecture.

Keywords: teaching-learning, architectural design studio, architecture, education

Procedia PDF Downloads 385
13902 Weighted-Distance Sliding Windows and Cooccurrence Graphs for Supporting Entity-Relationship Discovery in Unstructured Text

Authors: Paolo Fantozzi, Luigi Laura, Umberto Nanni

Abstract:

The problem of entity relation discovery, a well-covered topic in the literature, consists in searching within unstructured sources (typically, text) in order to find connections among entities. The entities can come from a whole dictionary or from a specific collection of named items. In many cases, machine learning and/or text mining techniques are used for this goal. These approaches might be unfeasible in computationally challenging settings, such as processing massive data streams. A faster approach consists in collecting the cooccurrences of any two words (entities) in order to build a graph of relations: a cooccurrence graph. Each cooccurrence indicates some degree of semantic correlation between the words, because related words are more commonly found close to each other than at opposite ends of a text. Some authors have used sliding windows for this problem, counting all occurrences within a sliding window running over the whole text. In this paper we generalise that technique to a weighted-distance sliding window, where each occurrence of two named items within the window is counted with a weight depending on the distance between them: a closer distance implies stronger evidence of a relationship. We develop an experiment to support this intuition, applying the technique to a data set consisting of the text of the Bible, split into verses.
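The weighted-distance window can be sketched directly. The abstract only says the weight depends on the distance between items; the inverse-distance weight `1/d` below is one plausible choice, not necessarily the authors' exact function.

```python
from collections import defaultdict

def weighted_cooccurrence(tokens, window=5):
    """Build an undirected cooccurrence graph: for each token, look at the
    next window-1 tokens and add weight 1/distance to the pair's edge, so
    closer pairs accumulate more evidence of a relationship."""
    graph = defaultdict(float)
    for i, a in enumerate(tokens):
        for j in range(i + 1, min(i + window, len(tokens))):
            b = tokens[j]
            if a != b:                        # ignore self-cooccurrence
                key = tuple(sorted((a, b)))   # undirected edge
                graph[key] += 1.0 / (j - i)
    return dict(graph)
```

With a plain (unweighted) sliding window every pair in the window would contribute 1 regardless of distance; here `("a", "x")` at distance 1 counts four times as much as at distance 4.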

Keywords: cooccurrence graph, entity relation graph, unstructured text, weighted distance

Procedia PDF Downloads 146
13901 WWSE School Development in German Christian Schools Revisited: Organizational Development Taken to a Test

Authors: Marco Sewald

Abstract:

WWSE School Development (Wahrnehmungs- und wertorientierte Schulentwicklung) comprises surveys of pupils, teachers, and parents, enabling schools to align their development with the requirements of these three stakeholder groups. WWSE includes a derivative set of questions for Christian schools, meeting their specific needs. This research on WWSE reflects contemporary questions in school development, examining the quality of the implementation of results from past WWSE surveys in Christian schools in Germany. The research focused on questions of organizational development, including leadership and change management, in contrast to the two other areas of WWSE: human resources development and development of school teaching methods. The research methods were: (1) a quantitative triangulation of three data sets: data from an evaluation taken in 2011, data from a second evaluation of the same school conducted in 2014, and a structured survey of the teachers, headmasters, and members of the school board carried out within this research; (2) interviews with teachers and headmasters, conducted as a second stage to corroborate the results of the quantitative first stage. Results: WWSE supports modern school development. While organizational development, leadership, and change management proved important for modern school development, these areas are widely underestimated by teachers and headmasters, especially in comparison to human resource development and, to an even greater extent, the development of school teaching methods. The research concluded that additional efforts in organizational development are necessary to meet modern demands, and it also shows which areas are the most important.

Keywords: school as a social organization, school development, school leadership, WWSE, Wahrnehmungs- und wertorientierte Schulentwicklung

Procedia PDF Downloads 219
13900 A Dynamic Solution Approach for Heart Disease Prediction

Authors: Walid Moudani

Abstract:

The healthcare environment is generally perceived as information rich yet knowledge poor: there is a lack of effective analysis tools to discover hidden relationships and trends in data. Valuable knowledge can, however, be discovered by applying data mining techniques to healthcare systems. This study presents a methodology for extracting significant patterns from coronary heart disease warehouses for heart attack prediction, a condition that unfortunately continues to be a leading cause of mortality worldwide. For this purpose, we propose to enumerate dynamically the optimal subsets of the reduced features of high interest, using rough set techniques combined with dynamic programming, and to validate the classification using a Random Forest (RF) decision tree ensemble to identify risky heart disease cases. The work is based on a large amount of data collected from several clinical institutions, drawing on patients' medical profiles. Moreover, experts' knowledge in this field has been taken into consideration to define the disease and its risk factors, and to establish significant knowledge relationships among the medical factors. A computer-aided system was developed for this purpose based on a population of 525 adults. The performance of the proposed model is analyzed and evaluated against a set of benchmark techniques applied to this classification problem.
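The rough-set feature reduction step looks for "reducts": minimal feature subsets that still discriminate the decision classes. The brute-force sketch below illustrates the concept on tiny data; the authors use dynamic programming to make the enumeration tractable, which is not reproduced here, and the function names are illustrative.

```python
from itertools import combinations

def is_consistent(rows, labels, features):
    """Rough-set consistency check: a feature subset is consistent if no two
    rows agree on all selected features yet carry different labels."""
    seen = {}
    for row, lab in zip(rows, labels):
        key = tuple(row[f] for f in features)
        if key in seen and seen[key] != lab:
            return False
        seen[key] = lab
    return True

def minimal_reducts(rows, labels, n_features):
    """Enumerate the smallest consistent feature subsets (the reducts)
    by exhaustive search over subset sizes."""
    for size in range(1, n_features + 1):
        found = [c for c in combinations(range(n_features), size)
                 if is_consistent(rows, labels, c)]
        if found:
            return found
    return []
```

The classifier (here a Random Forest in the paper) is then trained only on the features of a chosen reduct.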

Keywords: multi-classifier decisions tree, features reduction, dynamic programming, rough sets

Procedia PDF Downloads 406
13899 A Photographic Look on the Socio-Educational Inclusion of Young Refugees and Asylum-Seekers

Authors: Mara Gabrielli, Jordi Pamies Rovira

Abstract:

From a theoretical and interdisciplinary approach grounded in visual ethnography and visual anthropology, this small-scale, in-depth study explores the potential of photography as a participatory ethnographic method for deeply understanding the socio-educational integration of young refugees and asylum-seekers in the host society, as regards their daily experiences, needs, desires, expectations, and future goals. Qualitative data were collected by the author by observing 12 young participants aged 12-24 weekly for 12 months. The data consist of field notes, participant observation, in-depth interviews with professionals, and visual participatory ethnographic methods. The young participants build their stories through two participatory photographic methods, the 'photo-diary' and 'photo-elicitation', which allow them to analyse and narrate their social and educational experiences from their own perspectives, thus collaborating in the construction of knowledge during the different stages of the research. Preliminary findings show the high resilience and social adaptability of young refugees and asylum-seekers in achieving their goals and overcoming structural and socio-cultural barriers. However, the uncertainty of their administrative situation while their asylum application is pending, and the lack of specific resources, may negatively affect their educational pathways and transition to the labour market. Finally, the study highlights the benefits of participatory photographic methods in ethnographic research: they positively affect the well-being of these young people, help them develop critical thinking, and allow information to be accessed more respectfully when painful experiences are narrated.

Keywords: photo-diary, photo-elicitation, resilience, strategies, visual methodologies, young refugees and asylum seekers

Procedia PDF Downloads 118
13898 Large-Scale Electroencephalogram Biometrics through Contrastive Learning

Authors: Mostafa ‘Neo’ Mohsenvand, Mohammad Rasool Izadi, Pattie Maes

Abstract:

EEG-based biometrics (user identification) has been explored on small datasets of no more than 157 subjects. Here we show that the accuracy of modern supervised methods falls rapidly as the number of users increases to a few thousand. Moreover, supervised methods require a large amount of labeled data for training which limits their applications in real-world scenarios where acquiring data for training should not take more than a few minutes. We show that using contrastive learning for pre-training, it is possible to maintain high accuracy on a dataset of 2130 subjects while only using a fraction of labels. We compare 5 different self-supervised tasks for pre-training of the encoder where our proposed method achieves the accuracy of 96.4%, improving the baseline supervised models by 22.75% and the competing self-supervised model by 3.93%. We also study the effects of the length of the signal and the number of channels on the accuracy of the user-identification models. Our results reveal that signals from temporal and frontal channels contain more identifying features compared to other channels.
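Contrastive pre-training of this kind is typically driven by an InfoNCE-style loss: for each anchor, a softmax cross-entropy over similarity scores pushes the positive pair's similarity above the negatives'. The abstract does not give the exact formulation, so the per-anchor sketch below is a generic version, not necessarily the authors'.

```python
import math

def info_nce(sim_row, pos_index, temperature=0.1):
    """InfoNCE loss for one anchor. sim_row holds the anchor's similarity to
    each candidate (one positive, the rest negatives); pos_index marks the
    positive. Lower loss means the positive stands out more."""
    logits = [s / temperature for s in sim_row]
    m = max(logits)                          # subtract max for stability
    exps = [math.exp(l - m) for l in logits]
    return -math.log(exps[pos_index] / sum(exps))
```

When all similarities are equal the loss is log(n) (chance level); it approaches 0 as the positive's similarity dominates. In the full system the similarities come from encoded EEG windows, and the pre-trained encoder is then fine-tuned for user identification with few labels.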

Keywords: brainprint, contrastive learning, electroencephalogram, self-supervised learning, user identification

Procedia PDF Downloads 151
13897 A Pattern Practice for Awareness Education on Information Security: Information Security Project

Authors: Fatih Apaydin

Abstract:

Education technology is an area of constant change and innovation. As circumstances change, societies inclined toward improvement keep up with these innovations by using the methods and strategies designed for education technology. Education technology has thus taken on the responsibility of helping individuals improve themselves and of teaching effective methods by filling the gaps between theoretical information, information security, and practice. Technology has moved to the core of our lives, its importance growing day by day, and has cemented its position in computer-based environments. As a result, individuals must be made 'ready for technological innovations, with improved computer-based talent, information, ability and attitude'. However, securing and reinforcing this information is today quite hard. Information obtained illegally harms society in every respect, especially education. This study addresses how, and to what extent, innovative appliances such as computers should be used in computer-based education, and the information security of these appliances. As computer use becomes ever more prevalent in our country, neither education nor computing will become outdated, so how computer-based education affects our lives, and the study of information security for this type of education, are important topics.

Keywords: computer, information security, education, technology, development

Procedia PDF Downloads 590
13896 Multi-Level Air Quality Classification in China Using Information Gain and Support Vector Machine

Authors: Bingchun Liu, Pei-Chann Chang, Natasha Huang, Dun Li

Abstract:

Machine learning and data mining are two important tools for extracting useful information and knowledge from large datasets. In machine learning, classification is a widely used technique for predicting qualitative variables and is generally preferred over regression from an operational point of view. Due to the enormous increase in air pollution in various countries, especially China, air quality classification has become one of the most important topics in air quality research and modelling. This study introduces a hybrid classification model based on information theory and the Support Vector Machine (SVM), using air quality data for four cities in China, namely Beijing, Guangzhou, Shanghai, and Tianjin, from Jan 1, 2014 to April 30, 2016. China's Ministry of Environmental Protection classifies daily air quality into six levels, namely Serious Pollution, Severe Pollution, Moderate Pollution, Light Pollution, Good, and Excellent, based on Air Quality Index (AQI) values. Using information theory, information gain (IG) is calculated and feature selection performed for both categorical and continuous numeric features. The SVM algorithm is then applied to the selected features with cross-validation. The final evaluation reveals that the IG and SVM hybrid model outperforms SVM alone, Artificial Neural Network (ANN), and K-Nearest Neighbours (KNN) models in terms of both accuracy and complexity.
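The information gain of a categorical feature is the parent label entropy minus the size-weighted entropy of each feature value's label subset. A minimal implementation of that standard definition:

```python
import math

def entropy(labels):
    """Shannon entropy (bits) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n)
                for c in (labels.count(v) for v in set(labels)))

def information_gain(feature_values, labels):
    """IG = H(labels) - sum_v P(v) * H(labels | feature = v)."""
    n = len(labels)
    remainder = 0.0
    for v in set(feature_values):
        subset = [l for f, l in zip(feature_values, labels) if f == v]
        remainder += len(subset) / n * entropy(subset)
    return entropy(labels) - remainder
```

Features are then ranked by IG and the top-ranked ones passed to the SVM; continuous features such as AQI components would first be discretized into bins before this computation applies.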

Keywords: machine learning, air quality classification, air quality index, information gain, support vector machine, cross-validation

Procedia PDF Downloads 232
13895 Effect of Two Cooking Methods on Kinetics of Polyphenol Content, Flavonoid Content and Color of a Tunisian Meal: Molokheiya (Corchorus olitorius)

Authors: S. Njoumi, L. Ben Haj Said, M. J. Amiot, S. Bellagha

Abstract:

The main objective of this research was to establish the kinetics of variation of total polyphenol content (TPC) and total flavonoid content (TFC) in Tunisian Corchorus olitorius powder and in a traditional home-cooked meal (Molokheiya) when using stewing and stir-frying as cooking methods, and to compare the effect of these two common cooking practices on water content, TPC, TFC and color. The L*, a* and b* coordinate values of the Molokheiya varied from 24.955±0.039 to 21.301±0.036, from -1.556±0.048 to 0.23±0.026 and from 5.675±0.052 to 6.313±0.103 when using stewing, and from 21.328±0.025 to 20.56±0.021, from -1.093±0.011 to 0.121±0.007 and from 5.708±0.020 to 6.263±0.007 when using stir-frying, respectively. TPC and TFC increased during cooking. TPC of Molokheiya varied from 29.852±0.866 mg GAE/100 g to 220.416±0.519 mg GAE/100 g after 150 min of stewing and from 25.257±0.259 mg GAE/100 g to 208.897±0.173 mg GAE/100 g after 150 min of stir-frying. TFC of Molokheiya varied from 48.229±1.47 mg QE/100 g to 843.802±1.841 mg QE/100 g when using stewing and from 37.031±0.368 mg QE/100 g to 775.312±0.736 mg QE/100 g when using stir-frying. Kinetics followed similar curves in all cases but resulted in different final TPC and TFC values. The shape of the kinetic curves suggests zero-order kinetics. The mathematical relations and the numerical approach used to model the kinetics of polyphenol and flavonoid contents in Molokheiya are described.
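Zero-order kinetics means the content rises linearly with cooking time, C(t) = C0 + k·t, so the rate constant k can be recovered by a straight-line fit. A minimal sketch follows; only the 0 min and 150 min stewing TPC values come from the abstract, the intermediate time points are hypothetical interpolations added for illustration.

```python
# Fit zero-order kinetics C(t) = C0 + k*t to TPC-vs-time data.
import numpy as np

t = np.array([0.0, 30.0, 60.0, 90.0, 120.0, 150.0])          # cooking time, min
tpc = np.array([29.85, 68.0, 106.0, 144.0, 182.0, 220.42])   # mg GAE/100 g

# Linear least-squares: slope k is the zero-order rate constant,
# intercept c0 is the initial polyphenol content.
k, c0 = np.polyfit(t, tpc, 1)
print(round(k, 3), round(c0, 2))
```

A near-constant slope across the whole time range is exactly the signature of zero-order behaviour the abstract reports.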

Keywords: Corchorus olitorius, Molokheiya, phenolic compounds, kinetic

Procedia PDF Downloads 349
13894 In vitro Skin Model for Enhanced Testing of Antimicrobial Textiles

Authors: Steven Arcidiacono, Robert Stote, Erin Anderson, Molly Richards

Abstract:

There are numerous standard test methods for antimicrobial textiles that measure activity against specific microorganisms. However, these results often do not translate to the performance of treated textiles when worn by individuals. Standard test methods apply a single target organism, grown under optimal conditions, to a textile, then recover the organism to quantitate and determine activity; this does not reflect the actual performance environment, which consists of polymicrobial communities in less than optimal conditions, or the interaction of the textile with the skin substrate. Here we propose the development of an in vitro skin model method to bridge the gap between lab testing and wear studies. The model will consist of a defined polymicrobial community of 5-7 commensal microbes simulating the skin microbiome, seeded onto a solid tissue platform to represent the skin. The protocol would entail adding a non-commensal test organism of interest to the defined community and applying a textile sample to the solid substrate. Following incubation, the textile would be removed and the organisms recovered, which would then be quantitated to determine antimicrobial activity. Important parameters to consider include identification and assembly of the defined polymicrobial community, growth conditions that allow the establishment of a stable community, and choice of skin surrogate. This model could answer the following questions: 1) Is the treated textile effective against the target organism? 2) How is the defined community affected? 3) Does the textile cause unwanted effects on the skin simulant? The proposed model would determine activity under conditions comparable to the intended application and provide expanded knowledge relative to current test methods.

Keywords: antimicrobial textiles, defined polymicrobial community, in vitro skin model, skin microbiome

Procedia PDF Downloads 129
13893 Sensory Evaluation of Meat from Broiler Birds Fed Detoxified Jatropha Curcas and Those Fed Conventional Feed

Authors: W. S. Lawal, T. A. Akande

Abstract:

Four (4) different methods were employed to detoxify Jatropha curcas: a physical method (soaking and drying), a chemical method (use of methylated spirit, hexane and methanol), a biological method (use of Aspergillus niger with sun-drying for 7 days, then Bacillus licheniformis), and finally a combined method (a combination of all these methods). Phorbol ester analysis was carried out after detoxification, and the combined method was found to be better (P>0.05). 100 broiler birds were then used to further test the effect of Jatropha detoxified by the combined method: 50 birds were fed Jatropha-based feed at 10 birds per treatment, replicated five times, and another 50 birds were fed conventional feed. The Jatropha-based feed was compounded at an 8% inclusion level. At the end of the 8th week, 8 birds were sacrificed from each treatment, and one bird each from both the conventional and the Jatropha-fed groups was fried, roasted, boiled and grilled and served to panelists for evaluation. It was found that feeding Jatropha to poultry birds has no effect on the taste of the meat.

Keywords: phorbol ester, inclusion level, tolerance level, Jatropha curcas

Procedia PDF Downloads 417
13892 Breast Cancer Survivability Prediction via Classifier Ensemble

Authors: Mohamed Al-Badrashiny, Abdelghani Bellaachia

Abstract:

This paper presents a classifier ensemble approach for predicting the survivability of breast cancer patients using the latest database version of the Surveillance, Epidemiology, and End Results (SEER) Program of the National Cancer Institute. The system consists of two main components: a feature selection component and a classifier ensemble component. The feature selection component divides the features in the SEER database into four groups and then tries to find the most important features among the four groups, those that maximize the weighted average F-score of a given classification algorithm. The ensemble component uses three different classifiers, each of which models a different set of features from SEER selected through the feature selection module. On top of them, another classifier gives the final decision based on the output decisions and confidence scores of the underlying classifiers. Different classification algorithms have been examined; the best setup found uses the decision tree, Bayesian network, and Naïve Bayes algorithms for the underlying classifiers and Naïve Bayes for the classifier ensemble step. The system outperforms all systems published to date when evaluated against the exact same SEER data (period of 1973-2002), giving an 87.39% weighted average F-score compared to 85.82% and 81.34% for the other published systems. By increasing the data size to cover the whole database (period of 1973-2014), the overall weighted average F-score jumps to 92.4% on the held-out unseen test set.
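The two-stage ensemble above can be sketched as a stacking classifier. This is an illustrative assumption, not the authors' exact system: scikit-learn has no Bayesian network classifier, so a second Naïve Bayes model stands in for it, and a synthetic dataset replaces the SEER features.

```python
# Sketch of a stacked ensemble: base classifiers feed a Naive Bayes
# meta-learner that issues the final decision (synthetic data).
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=800, n_features=12, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.25, random_state=0)

# The meta-learner combines the base models' predicted probabilities
# (their "confidence scores") into the final survivability decision.
stack = StackingClassifier(
    estimators=[("tree", DecisionTreeClassifier(max_depth=5, random_state=0)),
                ("nb", GaussianNB())],
    final_estimator=GaussianNB(),
    stack_method="predict_proba",
    cv=5,
)
stack.fit(Xtr, ytr)
f1 = f1_score(yte, stack.predict(Xte), average="weighted")
print(round(f1, 3))
```

The weighted average F-score is the same evaluation metric the abstract reports.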

Keywords: classifier ensemble, breast cancer survivability, data mining, SEER

Procedia PDF Downloads 320
13891 Finding Bicluster on Gene Expression Data of Lymphoma Based on Singular Value Decomposition and Hierarchical Clustering

Authors: Alhadi Bustaman, Soeganda Formalidin, Titin Siswantining

Abstract:

DNA microarray technology is used to analyze thousands of gene expression profiles simultaneously and is a very important tool for drug development and testing, function annotation, and cancer diagnosis. Various clustering methods have been used for analyzing gene expression data. However, when analyzing very large and heterogeneous collections of gene expression data, conventional clustering methods often cannot produce a satisfactory solution. Biclustering algorithms have been used as an alternative approach to identifying structures in gene expression data. In this paper, we introduce a transform technique based on singular value decomposition to obtain a normalized matrix of gene expression data, followed by the Mixed-Clustering algorithm and the Lift algorithm, inspired by the node-deletion and node-addition phases proposed by Cheng and Church, based on Agglomerative Hierarchical Clustering (AHC). An experimental study on standard datasets demonstrated the effectiveness of the algorithm on gene expression data.
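The SVD normalization step can be sketched as a low-rank reconstruction of the expression matrix before biclustering. The matrix here is random noise standing in for lymphoma expression data, and the rank cut-off r=2 is an illustrative assumption.

```python
# SVD-based normalization: project the genes x conditions matrix onto its
# leading singular vectors to suppress noise before biclustering.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 20))        # stand-in for genes x conditions

U, s, Vt = np.linalg.svd(X, full_matrices=False)
r = 2
# Rank-r approximation keeps the dominant expression patterns.
X_norm = U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]

# Relative reconstruction error: how much of the matrix the top r
# singular components capture. Biclustering would then run on X_norm.
err = np.linalg.norm(X - X_norm) / np.linalg.norm(X)
print(round(err, 3))
```

On real microarray data the leading singular components typically carry far more of the variance than they do for pure noise, which is what makes this a useful preprocessing step.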

Keywords: agglomerative hierarchical clustering (AHC), biclustering, gene expression data, lymphoma, singular value decomposition (SVD)

Procedia PDF Downloads 273
13890 Refined Procedures for Second Order Asymptotic Theory

Authors: Gubhinder Kundhi, Paul Rilstone

Abstract:

Refined procedures for higher-order asymptotic theory for non-linear models are developed. These include a new method for deriving stochastic expansions of arbitrary order, new methods for evaluating the moments of polynomials of sample averages, and a new method for deriving the approximate moments of the stochastic expansions; an application of these techniques to obtaining improved inferences under the weak instruments problem is considered. It is well established that Instrumental Variable (IV) estimators in the presence of weak instruments can behave poorly and, in particular, be quite biased in finite samples. In our application, finite sample approximations to the distributions of these estimators are obtained using Edgeworth and saddlepoint expansions. Departures from normality of the distributions of these estimators are analyzed using higher-order analytical corrections in these expansions. In a Monte Carlo experiment, the performance of these expansions is compared to the first-order approximation and to other methods commonly used in finite samples, such as the bootstrap.
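The kind of higher-order correction involved can be illustrated by the textbook first-order Edgeworth expansion for a standardized sample mean (a standard form, not the authors' specific expansion for IV estimators):

```latex
% First-order Edgeworth expansion of the distribution of a standardized
% sample mean; \lambda_3 is the skewness of the summands, \Phi and \phi
% are the standard normal CDF and density.
P\!\left(\frac{\sqrt{n}\,(\bar{X}_n-\mu)}{\sigma}\le x\right)
  = \Phi(x) \;-\; \phi(x)\,\frac{\lambda_3}{6\sqrt{n}}\left(x^2-1\right)
  \;+\; O\!\left(n^{-1}\right)
```

The correction term captures the leading departure from normality and vanishes at rate n^{-1/2}, which is why such expansions improve on the first-order normal approximation in small samples.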

Keywords: edgeworth expansions, higher order asymptotics, saddlepoint expansions, weak instruments

Procedia PDF Downloads 275
13889 Influence of Multi-Walled Carbon Nanotube on Interface Fracture of Sandwich Composite

Authors: Alak Kumar Patra, Nilanjan Mitra

Abstract:

The interface fracture toughness of glass-epoxy (G/E) PVC core sandwich composites with and without MWCNT has been investigated through experimental methods. Results demonstrate an improvement in interface fracture toughness values (GC) of samples with certain percentages of MWCNT. In addition, dispersion of MWCNT in epoxy resin through sonication, followed by mixing of hardener and the vacuum assisted resin transfer method (VARTM) used in this study, is an easy and cost-effective methodology in comparison to other previously adopted methods limited to laminated composites. The study also identifies the optimum weight percentage of MWCNT addition in the resin system for maximum gain in interfacial fracture toughness. The results are supported by high resolution transmission electron microscope (HRTEM) analysis and fracture micrographs from field emission scanning electron microscope (FESEM) investigation.

Keywords: carbon nanotube, foam, glass-epoxy, interfacial fracture, sandwich composite

Procedia PDF Downloads 427
13888 Computational Approaches for Ballistic Impact Response of Stainless Steel 304

Authors: A. Mostafa

Abstract:

This paper presents a numerical study on the determination of the ballistic limit velocity (V50) of stainless steel 304 (SS 304) used in manufacturing security screens. The simulated ballistic impact tests were conducted on clamped sheets of different thicknesses using the ABAQUS/Explicit nonlinear finite element (FE) package. The ballistic limit velocity was determined using three approaches, namely: numerical tests based on material properties, FE-calculated residual velocities, and FE-calculated residual energies. The Johnson-Cook plasticity model and failure criterion were utilized to simulate the dynamic behaviour of the SS 304 under various strain rates, while the well-known Lambert-Jonas equation was used for regression of the residual velocity and energy data. Good agreement between the investigated numerical methods was achieved. Additionally, the dependence of the ballistic limit velocity on the sheet thickness was observed. The proposed approaches present viable and cost-effective assessment methods of the ballistic performance of SS 304, which will support the development of robust security screen systems.
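The Lambert-Jonas regression step can be sketched as a non-linear fit of v_r = a(v_i^p − v_bl^p)^{1/p} to impact/residual velocity pairs, with v_bl the ballistic limit (V50). The data below are synthetic, generated from assumed parameters, not measured SS 304 results.

```python
# Fit the Lambert-Jonas residual-velocity relation to recover the
# ballistic limit from (impact, residual) velocity pairs.
import numpy as np
from scipy.optimize import curve_fit

def lambert_jonas(v_i, a, p, v_bl):
    # Residual velocity is zero below the ballistic limit (clip guards
    # against negative arguments during the fit).
    return a * np.clip(v_i**p - v_bl**p, 0.0, None) ** (1.0 / p)

v_bl_true = 180.0                      # assumed ballistic limit, m/s
v_i = np.linspace(200.0, 400.0, 9)     # impact velocities above the limit
v_r = lambert_jonas(v_i, 0.9, 2.2, v_bl_true)

popt, _ = curve_fit(lambert_jonas, v_i, v_r, p0=[1.0, 2.0, 150.0])
a, p, v_bl = popt
print(round(v_bl, 1))                  # recovered V50
```

In practice the (v_i, v_r) pairs would come from the FE simulations, and the fitted v_bl is compared across the three approaches the abstract lists.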

Keywords: ballistic velocity, stainless steel, numerical approaches, security screen

Procedia PDF Downloads 158
13887 Calibration of the Discrete Element Method Using a Large Shear Box

Authors: C. J. Coetzee, E. Horn

Abstract:

One of the main challenges in using the Discrete Element Method (DEM) is to specify the correct input parameter values. In general, the models are sensitive to the input parameter values, and accurate results can only be achieved if the correct values are specified. For the linear contact model, micro-parameters such as the particle density, stiffness and coefficient of friction, as well as the particle size and shape distributions, are required. There is a need for a procedure to accurately calibrate these parameters before any attempt can be made to accurately model a complete bulk materials handling system. Since DEM is often used to model applications in the mining and quarrying industries, a calibration procedure was developed for materials that consist of relatively large (up to 40 mm in size) particles. A coarse crushed aggregate was used as the test material. Using a specially designed large shear box with a diameter of 590 mm, the confined Young's modulus (bulk stiffness) and internal friction angle of the material were measured by means of the confined compression test and the direct shear test, respectively. DEM models of the experimental setup were developed and the input parameter values were varied iteratively until a close correlation between the experimental and numerical results was achieved. The calibration process was validated by modelling the pull-out of an anchor from a bed of material. The model results compared well with the experimental measurements.
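The iterative calibration loop can be sketched as a root search on a micro-parameter until the simulated macro response matches the shear-box measurement. Everything here is a toy stand-in: `simulate_confined_modulus` is a hypothetical placeholder for a full DEM run, and the target modulus and bounds are illustrative numbers, not the paper's values.

```python
# Toy calibration loop: bisect a DEM micro-parameter (particle stiffness)
# until the simulated bulk stiffness matches the measured value.

def simulate_confined_modulus(particle_stiffness):
    # Hypothetical monotone response curve; a real DEM solver goes here.
    return 0.04 * particle_stiffness ** 0.9

target = 85.0        # measured confined Young's modulus, MPa (illustrative)
lo, hi = 1e3, 1e6    # particle stiffness search bounds, N/m

# Bisection works because the macro response grows monotonically with
# the micro-parameter over the search interval.
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if simulate_confined_modulus(mid) < target:
        lo = mid
    else:
        hi = mid
calibrated = 0.5 * (lo + hi)
print(round(simulate_confined_modulus(calibrated), 2))
```

In practice each "function evaluation" is an expensive DEM simulation, so the real procedure varies parameters more sparingly, but the match-the-measurement logic is the same.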

Keywords: Discrete Element Method (DEM), calibration, shear box, anchor pull-out

Procedia PDF Downloads 290
13886 Leveraging Laser Cladding Technology for Eco-Friendly Solutions and Sustainability in Equipment Refurbishment

Authors: Rakan A. Ahmed, Raja S. Khan, Mohammed M. Qahtani

Abstract:

This paper explores the transformative impact of laser cladding technology on the circular economy, emphasizing its role in reducing environmental impact compared to traditional welding methods. Laser cladding, an innovative manufacturing process, optimizes resource efficiency and sustainability by significantly decreasing power consumption and minimizing material waste. The study examines how laser cladding operates within the framework of the circular economy, promoting energy efficiency, waste reduction, and emissions control. Through a comparative analysis of energy and material consumption between laser cladding and conventional welding methods, the paper demonstrates the significant strides in environmental conservation and resource optimization made possible by laser cladding. The findings underscore the potential of this technology to revolutionize industrial practices and propel a more sustainable and eco-friendly manufacturing landscape.

Keywords: laser cladding, circular economy, carbon emission, energy

Procedia PDF Downloads 71