Search results for: computer assisted classification
4240 Multivariate Analysis of Spectroscopic Data for Agriculture Applications
Authors: Asmaa M. Hussein, Amr Wassal, Ahmed Farouk Al-Sadek, A. F. Abd El-Rahman
Abstract:
In this study, a multivariate analysis of potato spectroscopic data is presented to detect the presence or absence of brown rot disease. Near-Infrared (NIR) spectroscopy (1,350-2,500 nm) combined with multivariate analysis was used as a rapid, non-destructive technique for the detection of brown rot disease in potatoes. Spectral measurements were performed on 565 samples, chosen randomly at the infection site on the potato slice. In total, 254 infected and 311 uninfected (brown rot-free) samples were analyzed using different advanced statistical analysis techniques. The discrimination performance of different multivariate analysis techniques, including classification, pre-processing, and dimension reduction, was compared. A random forest classifier applied to the raw spectra with different pre-processing techniques had the best performance, achieving a total classification accuracy of 98.7% in discriminating infected potatoes from controls.
Keywords: brown rot disease, NIR spectroscopy, potato, random forest
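A minimal sketch of the classification step described above, assuming a standard normal variate (SNV) pre-treatment and synthetic spectra standing in for the 565 real samples (an illustration, not the authors' code):

```python
# Illustrative sketch only: synthetic spectra stand in for the real 565 samples,
# and the band count (600) and SNV pre-treatment are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_infected, n_healthy, n_bands = 254, 311, 600
X = rng.normal(size=(n_infected + n_healthy, n_bands))
X[:n_infected] += 0.3                                  # toy class difference
y = np.r_[np.ones(n_infected), np.zeros(n_healthy)]    # 1 = infected, 0 = healthy

def snv(spectra):
    """Standard normal variate: centre and scale each spectrum individually."""
    return (spectra - spectra.mean(axis=1, keepdims=True)) / spectra.std(axis=1, keepdims=True)

clf = RandomForestClassifier(n_estimators=500, random_state=0)
print("cross-validated accuracy:", cross_val_score(clf, snv(X), y, cv=5).mean())
```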
Procedia PDF Downloads 190
4239 Prevalence and the Results of the Czech Nationwide Survey and Personality Traits of Adolescence Playing Computer Games
Authors: Jaroslava Sucha, Martin Dolejs, Helena Pipova, Panajotis Cakirpaloglu
Abstract:
The paper introduces a research project focused on evaluating the level of pathological relationship towards computer or video game playing (including any games played on a screen, such as a mobile phone or a tablet). The study involves a representative sample of Czech adolescents between the ages of 11 and 19. This poster presents the psychometric indicators of a new psychological assessment method (mean, standard deviation, reliability, validity) that will be able to detect an acceptable level of game playing and, at the same time, detect and describe the level of gaming which might be potentially risky. The prevalence of risky computer game playing among Czech adolescents aged 11 to 19 will be reported. The research study also aims to describe the personality profile of problematic players of digital games. The research area encompasses risky behaviour, aggression, level of self-esteem, impulsivity, anxiety and depression. The contribution introduces a new test method for the assessment of pathological computer game playing. The research will provide the first screening information on computer game playing by adolescents aged 11-19 in the Czech Republic. The results clarify what relationship exists between playing computer games and selected personality characteristics (describing the personality of gamers who fall into the category of ‘pathological computer game playing’).
Keywords: adolescence, computer games, personality traits, risk behaviour
Procedia PDF Downloads 239
4238 Application of Change Detection Techniques in Monitoring Environmental Phenomena: A Review
Authors: T. Garba, Y. Y. Babanyara, T. O. Quddus, A. K. Mukatari
Abstract:
Human activities cause environmental parameters to keep changing globally. While some changes are necessary and beneficial to flora and fauna, others have serious consequences, threatening the survival of their natural habitat if these changes are not properly monitored and mitigated. In-situ assessments are characterized by many challenges due to the absence of time series data, and sometimes the areas to be observed or monitored are inaccessible. Satellite remote sensing provides digital images of the same geographic areas at pre-defined intervals. This makes it possible to monitor and detect changes in environmental phenomena. This paper, therefore, reviews the commonly used change detection techniques, such as image differencing, image ratioing, image regression, vegetation index difference, change vector analysis, principal components analysis, multidate classification, post-classification comparison, and visual interpretation. The paper concludes by suggesting the use of more than one technique.
Keywords: environmental phenomena, change detection, monitor, techniques
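For illustration, a short sketch of three of the reviewed techniques (image differencing, image ratioing and change vector analysis) applied to two synthetic co-registered multi-band images; the band count and the thresholding rule are assumptions:

```python
# Sketch of image differencing, image ratioing and change vector analysis on
# two co-registered multi-band images; random arrays stand in for real scenes,
# and the mean + 2*std threshold is only a common rule of thumb.
import numpy as np

rng = np.random.default_rng(1)
t1 = rng.random((4, 256, 256))                 # bands x rows x cols, date 1
t2 = t1 + rng.normal(0, 0.05, t1.shape)        # date 2 with small changes

difference = t2 - t1                           # image differencing
ratio = t2 / np.clip(t1, 1e-6, None)           # image ratioing
magnitude = np.linalg.norm(t2 - t1, axis=0)    # change vector analysis (per pixel)

threshold = magnitude.mean() + 2 * magnitude.std()
changed = magnitude > threshold
print(f"changed pixels: {changed.sum()} of {changed.size}")
```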
Procedia PDF Downloads 274
4237 Precoding-Assisted Frequency Division Multiple Access Transmission Scheme: A Cyclic Prefixes-Available Modulation-Based Filter Bank Multi-Carrier Technique
Authors: Ying Wang, Jianhong Xiang, Yu Zhong
Abstract:
The Offset Quadrature Amplitude Modulation-based Filter Bank Multi-Carrier (FBMC) system provides superior spectral properties over Orthogonal Frequency Division Multiplexing. However, because it is seriously affected by imaginary interference, its performance is hampered in many areas. In this paper, we propose a Precoding-Assisted Frequency Division Multiple Access (PA-FDMA) modulation scheme. By spreading FBMC symbols into the frequency domain and transmitting them with a precoding matrix, the impact of imaginary interference can be eliminated. Specifically, we first generate the coding pre-solution matrix with a nonuniform Fast Fourier Transform and pick the best columns by introducing auxiliary factors. Secondly, according to the column indexes, we obtain the precoding matrix for one symbol and impose scaling factors to ensure that the power is approximately constant throughout the transmission time. Finally, we map the precoding matrix of one symbol to multiple symbols and transmit multiple data frames, thus achieving frequency-division multiple access. Additionally, observing the interference between adjacent frames, we mitigate it by adding frequency Cyclic Prefixes (CPs) and evaluate it with a signal-to-interference ratio. Note that PA-FDMA can be considered a CP-available FBMC technique because the underlying strategy is FBMC. Simulation results show that the proposed scheme has better performance compared to Single Carrier Frequency Division Multiple Access (SC-FDMA) and other schemes.
Keywords: PA-FDMA, SC-FDMA, FBMC, non-uniform fast Fourier transform
Procedia PDF Downloads 64
4236 Internal Combustion Engine Fuel Composition Detection by Analysing Vibration Signals Using ANFIS Network
Authors: M. N. Khajavi, S. Nasiri, E. Farokhi, M. R. Bavir
Abstract:
Alcohol fuels are renewable, have low pollution and a high octane number; therefore, they are important as fuels in internal combustion engines. Detecting the percentage of these alcohol fuels blended with gasoline is a complicated, time-consuming, and expensive process. Nowadays, these analyses are done in equipped laboratories, based on international standards. The aim of this research is to determine the percentage composition of different fuels based on vibration analysis of engine block signals. By doing so, considerable savings in time and cost can be achieved. Five different fuels consisting of pure gasoline (G) as the base fuel and combinations of this fuel with different percentages of ethanol and methanol were prepared. For example, a volumetric combination of pure gasoline with 10 percent ethanol is called E10. By this convention, M10 (10% methanol plus 90% pure gasoline), E30 (30% ethanol plus 70% pure gasoline), and M30 (30% methanol plus 70% pure gasoline) were prepared. To simulate real working conditions for this experiment, the vehicle was mounted on a chassis dynamometer and run at 1900 rpm under a 30 kW load. To measure the engine block vibration, a three-axis accelerometer was mounted between cylinders 2 and 3. After acquisition of the vibration signal, eight time-domain features of these signals were used as inputs to an Adaptive Neuro Fuzzy Inference System (ANFIS). The designed ANFIS was trained to classify these five different fuels. The results show suitable classification ability of the designed ANFIS network, with 96.3 percent correct classification.
Keywords: internal combustion engine, vibration signal, fuel composition, classification, ANFIS
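A hedged sketch of the feature-extraction step only: eight common time-domain statistics computed from a vibration record. The abstract does not list the exact eight features, so this particular set is an assumption, and the ANFIS classifier itself is not reproduced here:

```python
# Feature-extraction sketch: the exact eight features used in the study are an
# assumption here, and the ANFIS classifier itself is not reproduced.
import numpy as np
from scipy.stats import skew, kurtosis

def time_features(signal):
    rms = np.sqrt(np.mean(signal ** 2))
    peak = np.max(np.abs(signal))
    return {
        "mean": np.mean(signal),
        "std": np.std(signal),
        "rms": rms,
        "peak": peak,
        "crest_factor": peak / rms,
        "skewness": skew(signal),
        "kurtosis": kurtosis(signal),
        "impulse_factor": peak / np.mean(np.abs(signal)),
    }

vibration = np.random.default_rng(2).normal(size=10_000)  # stand-in for a block record
print(time_features(vibration))
```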
Procedia PDF Downloads 401
4235 A Taxonomy of Routing Protocols in Wireless Sensor Networks
Authors: A. Kardi, R. Zagrouba, M. Alqahtani
Abstract:
The Internet of Everything (IoE) presents today a very attractive and motivating field of research. It is basically based on Wireless Sensor Networks (WSNs), in which routing is the major analysis topic. In fact, routing directly affects the effectiveness and the lifetime of the network. This paper, developed from recent works and based on extensive research, proposes a taxonomy of routing protocols in WSNs. Our main contribution is a classification model based on nine classes, namely application type, delivery mode, initiator of communication, network architecture, path establishment (route discovery), network topology (structure), protocol operation, next hop selection, and latency-aware and energy-efficient routing protocols. In order to provide a complete classification pattern to serve as a reference for network designers, each class is subdivided into possible subclasses, presented, and discussed using different parameters such as purposes and characteristics.
Keywords: routing, sensor, survey, wireless sensor networks, WSNs
Procedia PDF Downloads 182
4234 In vitro Effects of Berberine on the Vitality and Oxidative Profile of Bovine Spermatozoa
Authors: Eva Tvrdá, Hana Greifová, Peter Ivanič, Norbert Lukáč
Abstract:
The aim of this study was to evaluate the dose- and time-dependent in vitro effects of berberine (BER), a natural alkaloid with numerous biological properties, on bovine spermatozoa during three time periods (0 h, 2 h, 24 h). Bovine semen samples were diluted and cultivated in physiological saline solution containing 0.5% DMSO together with 200, 100, 50, 10, 5, and 1 μmol/L BER. Spermatozoa motility was assessed using a computer-assisted semen analyzer. The viability of spermatozoa was assessed by the metabolic (MTT) assay, production of superoxide radicals was quantified using the nitroblue tetrazolium (NBT) test, and chemiluminescence was used to evaluate the generation of reactive oxygen species (ROS). Cell lysates were prepared, and the extent of lipid peroxidation (LPO) was evaluated using the TBARS assay. The results of the movement activity showed a significant increase in motility during long-term cultivation at concentrations ranging between 1 and 10 μmol/L BER (P < 0.01; P < 0.001; 24 h). At the same time, supplementation of 1, 5 and 10 μmol/L BER led to a significant preservation of cell viability (P < 0.001; 24 h). BER addition in the range of 1-50 μmol/L also provided significantly higher protection against superoxide (P < 0.05) and ROS (P < 0.001; P < 0.01) overgeneration as well as LPO (P < 0.01; P < 0.05) after a 24 h cultivation. We may suggest that supplementation of BER to bovine spermatozoa, particularly at concentrations ranging between 1 and 50 μmol/L, may offer protection to the motility, viability and oxidative status of the spermatozoa, particularly notable at 24 h.
Keywords: berberine, bulls, motility, oxidative profile, spermatozoa, viability
Procedia PDF Downloads 130
4233 A Heart Arrhythmia Prediction Using Machine Learning’s Classification Approach and the Concept of Data Mining
Authors: Roshani S. Golhar, Neerajkumar S. Sathawane, Snehal Dongre
Abstract:
Background and objectives: Cardiovascular illnesses are increasing and becoming a leading cause of mortality worldwide, killing a large number of people each year. Arrhythmia is a type of cardiac illness characterized by a change in the regularity of the heartbeat. The goal of this study is to develop novel deep learning algorithms for successfully interpreting arrhythmia using a single one-second segment. Because the ECG signal reflects unique electrical heart activity across time, considerable changes between time intervals are detected. Such variances, as well as the limited amount of learning data available for each arrhythmia, make standard learning methods difficult to apply and so impede their generalization. Conclusions: The proposed method was able to outperform several state-of-the-art methods. The proposed technique is also an effective and convenient deep learning approach to heartbeat interpretation that could probably be used in real-time healthcare monitoring systems.
Keywords: electrocardiogram, ECG classification, neural networks, convolutional neural networks, portable document format
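As an illustration of the kind of model the abstract refers to, the sketch below defines a small 1-D CNN for one-second ECG segments in PyTorch; the segment length of 360 samples, the five classes and the layer sizes are assumptions, not the paper's architecture:

```python
# Assumed architecture for illustration only: 360-sample segments, five classes.
import torch
import torch.nn as nn

class ECGNet(nn.Module):
    def __init__(self, n_classes=5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=7, padding=3), nn.ReLU(), nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x):                 # x: (batch, 1, 360)
        return self.classifier(self.features(x).squeeze(-1))

segments = torch.randn(8, 1, 360)         # eight one-second segments (360 Hz assumed)
print(ECGNet()(segments).shape)           # -> torch.Size([8, 5])
```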
Procedia PDF Downloads 69
4232 Improvement of Microscopic Detection of Acid-Fast Bacilli for Tuberculosis by Artificial Intelligence-Assisted Microscopic Platform and Medical Image Recognition System
Authors: Hsiao-Chuan Huang, King-Lung Kuo, Mei-Hsin Lo, Hsiao-Yun Chou, Yusen Lin
Abstract:
The most robust and economical method for laboratory diagnosis of TB is to identify mycobacterial bacilli (AFB) under acid-fast staining, despite its disadvantages of low sensitivity and labor-intensiveness. Though digital pathology has become popular in medicine, an automated microscopic system for microbiology is still not available. A new AI-assisted automated microscopic system, consisting of a microscopic scanner and a recognition program powered by big data and deep learning, may significantly increase the sensitivity of TB smear microscopy. Thus, the objective is to evaluate such an automatic system for the identification of AFB. A total of 5,930 smears were enrolled for this study. An intelligent microscope system (TB-Scan, Wellgen Medical, Taiwan) was used for microscopic image scanning and AFB detection. 272 AFB smears were used for transfer learning to increase the accuracy. Referee medical technicians were used as the gold standard for resolving result discrepancies. Results showed that, on a total of 1,726 AFB smears, the automated system's accuracy, sensitivity and specificity were 95.6% (1,650/1,726), 87.7% (57/65), and 95.9% (1,593/1,661), respectively. Compared to culture, the sensitivity for human technicians was only 33.8% (38/142); however, the automated system achieved 74.6% (106/142), which is significantly higher than human technicians, and this is the first such automated microscope system for TB smear testing in a controlled trial. This automated system could achieve higher TB smear sensitivity and laboratory efficiency and may complement molecular methods (e.g., GeneXpert) to reduce the total cost of TB control. Furthermore, such an automated system is capable of remote access via the internet and can be deployed in areas with limited medical resources.
Keywords: TB smears, automated microscope, artificial intelligence, medical imaging
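The performance figures above are reported as raw counts; the short snippet below simply recomputes the automated system's metrics from those counts to make the definitions explicit:

```python
# Recomputing the automated system's reported metrics from the raw counts.
def percent(hits, total):
    return 100.0 * hits / total

print(f"accuracy:               {percent(1650, 1726):.1f}%")  # 95.6%
print(f"sensitivity:            {percent(57, 65):.1f}%")      # 87.7%
print(f"specificity:            {percent(1593, 1661):.1f}%")  # 95.9%
print(f"sensitivity vs culture: {percent(106, 142):.1f}%")    # 74.6%
```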
Procedia PDF Downloads 229
4231 Accuracy/Precision Evaluation of Excalibur I: A Neurosurgery-Specific Haptic Hand Controller
Authors: Hamidreza Hoshyarmanesh, Benjamin Durante, Alex Irwin, Sanju Lama, Kourosh Zareinia, Garnette R. Sutherland
Abstract:
This study reports on a proposed method to evaluate the accuracy and precision of Excalibur I, a neurosurgery-specific haptic hand controller designed and developed at Project neuroArm. Efficient and successful robot-assisted telesurgery is considerably contingent on how accurately and precisely a haptic hand controller (master/local robot) is able to transfer the kinematic indices of motion, i.e., position and orientation, from the surgeon’s upper limb to the slave/remote robot. A proposed test rig is designed and manufactured according to standard ASTM F2554-10 to determine the accuracy and precision range of Excalibur I at four different locations within its workspace: central workspace, extreme forward, far left and far right. The test rig is metrologically characterized by a coordinate measuring machine (accuracy and repeatability < ± 5 µm). Only the serial linkage of the haptic device is examined due to the use of the Structural Length Index (SLI). The results indicate that accuracy decreases when moving from the central area of the workspace towards its borders. In a comparative study, Excalibur I performs on par with the PHANToM Premium™ 3.0 and is more accurate/precise than the PHANToM Premium™ 1.5. The error in the Cartesian coordinate system shows a dominant component in one direction (δx, δy or δz) for movements on horizontal, vertical and inclined surfaces. The average error magnitude of three attempts is recorded, considering all three error components. This research is the first promising step towards quantifying the kinematic performance of Excalibur I.
Keywords: accuracy, advanced metrology, hand controller, precision, robot-assisted surgery, tele-operation, workspace
Procedia PDF Downloads 336
4230 Medical Neural Classifier Based on Improved Genetic Algorithm
Authors: Fadzil Ahmad, Noor Ashidi Mat Isa
Abstract:
This study introduces an improved genetic algorithm procedure that focuses the search around near-optimal solutions corresponding to a group of elite chromosomes. This is achieved through a novel crossover technique known as Segmented Multi Chromosome Crossover. It preserves the highly important information contained in a gene segment of an elite chromosome and allows an offspring to carry information from gene segments of multiple chromosomes. In this way, the algorithm has a better chance of effectively exploring the solution space. The improved GA is applied to the automatic and simultaneous parameter optimization and feature selection of an artificial neural network for pattern recognition of medical problems, namely cancer and diabetes. The experimental results show that the average classification accuracy on the cancer and diabetes datasets improved by 0.1% and 0.3%, respectively, using the new algorithm.
Keywords: genetic algorithm, artificial neural network, pattern classification, classification accuracy
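One possible reading of Segmented Multi Chromosome Crossover, sketched under the assumption that the offspring is assembled segment by segment from several elite parents; the segment length and donor selection are illustrative choices, not the paper's exact procedure:

```python
# Interpretation for illustration: segment length and donor selection are assumptions.
import random

def segmented_multi_chromosome_crossover(elite_parents, segment_length):
    chrom_length = len(elite_parents[0])
    offspring = []
    for start in range(0, chrom_length, segment_length):
        donor = random.choice(elite_parents)           # one elite donor per segment
        offspring.extend(donor[start:start + segment_length])
    return offspring

random.seed(0)
elites = [[random.randint(0, 1) for _ in range(12)] for _ in range(3)]
print(segmented_multi_chromosome_crossover(elites, segment_length=4))
```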
Procedia PDF Downloads 474
4229 An Approach Based on Statistics and Multi-Resolution Representation to Classify Mammograms
Authors: Nebi Gedik
Abstract:
Breast cancer is one of the significant and continual public health problems in the world. Early detection is very important to fight the disease, and mammography has been one of the most common and reliable methods to detect the disease in the early stages. However, it is a difficult task, and computer-aided diagnosis (CAD) systems are needed to assist radiologists in providing both accurate and uniform evaluation of masses in mammograms. In this study, a multiresolution statistical method to classify digitized mammograms as normal or abnormal is used to construct a CAD system. The mammogram images are represented by the wave atom transform, and this representation is made up of certain groups of coefficients, treated independently. The CAD system is designed by calculating some statistical features from each group of coefficients. The classification is performed using a support vector machine (SVM).
Keywords: wave atom transform, statistical features, multi-resolution representation, mammogram
Procedia PDF Downloads 222
4228 Applying Unmanned Aerial Vehicle on Agricultural Damage: A Case Study of the Meteorological Disaster on Taiwan Paddy Rice
Authors: Chiling Chen, Chiaoying Chou, Siyang Wu
Abstract:
Taiwan is located in the western Pacific Ocean at the intersection of continental and marine climates. Typhoons frequently strike Taiwan and bring meteorological disasters, i.e., heavy flooding, landslides, loss of life and property, etc. Global climate change brings even more extreme meteorological disasters. Therefore, techniques to improve disaster prevention and mitigation need to be developed, and improving rescue processes and rehabilitation is important as well. In this study, UAVs (Unmanned Aerial Vehicles) are applied to take instant images for improving disaster investigation and rescue processes. Paddy rice fields in central Taiwan are the study area; they were struck by heavy rain during the monsoon season in June 2016. UAV images provide high ground resolution (3.5 cm) together with 3D point clouds, which are used to develop image discrimination techniques and a digital surface model (DSM) of rice lodging. Firstly, supervised image classification with the Maximum Likelihood Method (MLD) is used to delineate the area of rice lodging. Secondly, 3D point clouds generated by Pix4D Mapper are used to develop a DSM for classifying the lodging levels of paddy rice. As a result, the discrimination accuracy of rice lodging is 85% by supervised image classification, and the classification accuracy of lodging level is 87% by DSM. Therefore, UAVs not only provide instant images of agricultural damage after a meteorological disaster, but the image discrimination of rice lodging also reaches acceptable accuracy (>85%). In the future, UAV technologies and image discrimination will be applied to different crop fields. The results of image discrimination will be overlaid with the administrative boundaries of paddy rice to establish a GIS-based assistance system for agricultural damage discrimination. Therefore, the time and labor for damage detection and monitoring would be greatly reduced.
Keywords: monsoon, supervised classification, Pix4D, 3D point clouds, discriminate accuracy
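A compact sketch of the supervised Maximum Likelihood classification step, assuming each class is modelled as a multivariate Gaussian fitted to training pixels; the band count, training data and image are synthetic assumptions for illustration:

```python
# Assumed band count, class statistics and image; only the ML assignment rule is shown.
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(3)
bands = 3
train = {                                   # class label -> training pixels
    "lodged":     rng.normal(0.2, 0.05, (200, bands)),
    "not_lodged": rng.normal(0.5, 0.05, (200, bands)),
}
models = {c: multivariate_normal(p.mean(axis=0), np.cov(p, rowvar=False))
          for c, p in train.items()}

image = rng.normal(0.35, 0.15, (100, 100, bands))    # stand-in UAV orthomosaic
pixels = image.reshape(-1, bands)
loglik = np.column_stack([m.logpdf(pixels) for m in models.values()])
labels = np.array(list(models))[loglik.argmax(axis=1)].reshape(100, 100)
print("fraction classified as lodged:", (labels == "lodged").mean())
```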
Procedia PDF Downloads 300
4227 Transformation of Positron Emission Tomography Raw Data into Images for Classification Using Convolutional Neural Network
Authors: Paweł Konieczka, Lech Raczyński, Wojciech Wiślicki, Oleksandr Fedoruk, Konrad Klimaszewski, Przemysław Kopka, Wojciech Krzemień, Roman Shopa, Jakub Baran, Aurélien Coussat, Neha Chug, Catalina Curceanu, Eryk Czerwiński, Meysam Dadgar, Kamil Dulski, Aleksander Gajos, Beatrix C. Hiesmayr, Krzysztof Kacprzak, Łukasz Kapłon, Grzegorz Korcyl, Tomasz Kozik, Deepak Kumar, Szymon Niedźwiecki, Dominik Panek, Szymon Parzych, Elena Pérez Del Río, Sushil Sharma, Shivani Shivani, Magdalena Skurzok, Ewa Łucja Stępień, Faranak Tayefi, Paweł Moskal
Abstract:
This paper develops the transformation of non-image data into 2-dimensional matrices as a preparation stage for classification based on convolutional neural networks (CNNs). In positron emission tomography (PET) studies, a CNN may be applied directly to the reconstructed distribution of radioactive tracers injected into the patient's body, as a pattern recognition tool. Nonetheless, much PET data still exists in non-image format, and this fact opens the question of whether they can be used for training a CNN. The main focus of this contribution is the problem of processing vectors with a small number of features in comparison to the number of pixels in the output images. The proposed methodology was applied to the classification of PET coincidence events.
Keywords: convolutional neural network, kernel principal component analysis, medical imaging, positron emission tomography
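A minimal sketch of the general idea of mapping a short feature vector onto a 2-dimensional matrix suitable for CNN input, by zero-padding to the next perfect square and reshaping; the paper's actual mapping of coincidence-event features to pixels may differ:

```python
# The padding-and-reshape mapping is an assumption; the feature values are hypothetical.
import numpy as np

def vector_to_matrix(features):
    side = int(np.ceil(np.sqrt(len(features))))
    padded = np.zeros(side * side)
    padded[:len(features)] = features
    return padded.reshape(side, side)

event = np.array([12.4, 3.1, 0.7, 55.0, 1.9])   # hypothetical coincidence-event features
print(vector_to_matrix(event))
```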
Procedia PDF Downloads 143
4226 The Effect of the Base Computer Method on Repetitive Behaviors and Communication Skills
Authors: Hoorieh Darvishi, Rezaei
Abstract:
Introduction: This study investigates the efficacy of computer-based interventions for children with Autism Spectrum Disorder, specifically targeting communication deficits and repetitive behaviors. The research evaluates novel software applications designed to enhance narrative capabilities and sensory integration through structured, progressive intervention protocols. Method: The study evaluated two intervention software programs designed for children with autism, focusing on narrative speech and sensory integration. Twelve children aged 5-11 participated in the two-month intervention, attending three 45-minute weekly sessions, with pre- and post-tests measuring speech, communication, and behavioral outcomes. The narrative speech software incorporated 14 stories using the Cohen model. It progressively reduced software assistance as children improved their storytelling abilities, ultimately enabling independent narration. The process involved story comprehension questions and guided story completion exercises. The sensory integration software featured approximately 100 exercises progressing from basic classification to complex cognitive tasks. The program included attention exercises, auditory memory training (advancing from single to four-syllable words), problem-solving, decision-making, reasoning, working memory, and emotion recognition activities. Each module was accompanied by frequency- and pitch-adjusted music that the child enjoys, to enhance learning through multiple sensory channels (visual, auditory, and tactile). Conclusion: The results indicated that the use of these software programs significantly improved communication and narrative speech scores in children, while also reducing scores related to repetitive behaviors. Findings: These findings highlight the positive impact of computer-based interventions on enhancing communication skills and reducing repetitive behaviors in children with autism.
Keywords: autism, communication_skills, repetitive_behaviors, sensory_integration
Procedia PDF Downloads 9
4225 Using Probabilistic Neural Network (PNN) for Extracting Acoustic Microwaves (Bulk Acoustic Waves) in Piezoelectric Material
Authors: Hafdaoui Hichem, Mehadjebia Cherifa, Benatia Djamel
Abstract:
In this paper, we propose a new method for the detection of bulk acoustic waves during the propagation of acoustic microwaves in a piezoelectric substrate (lithium niobate, LiNbO3). We used classification by a probabilistic neural network (PNN) as a means of numerical analysis, in which we classify all the values of the real part and the imaginary part of the attenuation coefficient together with the acoustic velocity in order to build a model from which the bulk waves can easily be identified. These singularities inform us of the presence of bulk waves in piezoelectric materials, from which we obtain accurate values of the attenuation coefficient and acoustic velocity for bulk waves. This study will be very interesting for the modeling and realization of acoustic microwave (ultrasound) devices based on the propagation of acoustic microwaves.
Keywords: piezoelectric material, probabilistic neural network (PNN), classification, acoustic microwaves, bulk waves, attenuation coefficient
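A hedged sketch of a probabilistic neural network in the Parzen-window sense, classifying points described by a few numerical features with Gaussian kernels; the training data, feature layout and smoothing parameter are synthetic assumptions, not values from the study:

```python
# Synthetic training data and smoothing parameter; features could be, e.g.,
# (Re(attenuation), Im(attenuation), velocity) as described in the abstract.
import numpy as np

def pnn_predict(X_train, y_train, X_test, sigma=0.1):
    preds = []
    for x in X_test:
        scores = {}
        for label in np.unique(y_train):
            members = X_train[y_train == label]
            d2 = np.sum((members - x) ** 2, axis=1)
            scores[label] = np.mean(np.exp(-d2 / (2 * sigma ** 2)))  # Parzen estimate
        preds.append(max(scores, key=scores.get))
    return np.array(preds)

rng = np.random.default_rng(4)
X_train = np.vstack([rng.normal(0, 0.2, (50, 3)), rng.normal(1, 0.2, (50, 3))])
y_train = np.r_[np.zeros(50), np.ones(50)]       # 1 = bulk-wave singularity
print(pnn_predict(X_train, y_train, rng.normal(0.5, 0.5, (5, 3))))
```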
Procedia PDF Downloads 432
4223 Local Interpretable Model-agnostic Explanations (LIME) Approach to Email Spam Detection
Authors: Rohini Hariharan, Yazhini R., Blessy Maria Mathew
Abstract:
Detecting email spam is a very important task in the era of digital technology, which needs effective ways of curbing unwanted messages. This paper presents an approach aimed at making email spam categorization algorithms transparent, reliable and more trustworthy by incorporating Local Interpretable Model-agnostic Explanations (LIME). Our technique provides interpretable explanations for specific classifications of emails to help users understand the model's decision-making process. In this study, we developed a complete pipeline that incorporates LIME into the spam classification framework and allows the creation of simplified, interpretable models tailored to individual emails. LIME identifies influential terms, pointing out key elements that drive classification results, thus reducing the opacity inherent in conventional machine learning models. Additionally, we suggest a visualization scheme for displaying keywords that improves users' understanding of categorization decisions. We test our method on a diverse email dataset and compare its performance with various baseline models, such as Gaussian Naive Bayes, Multinomial Naive Bayes, Bernoulli Naive Bayes, Support Vector Classifier, K-Nearest Neighbors, Decision Tree, and Logistic Regression. Our testing results show that our model surpasses all other models, achieving an accuracy of 96.59% and a precision of 99.12%.
Keywords: text classification, LIME (local interpretable model-agnostic explanations), stemming, tokenization, logistic regression
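A sketch of the kind of pipeline the abstract describes, assuming a TF-IDF plus logistic regression classifier wrapped with LIME's text explainer (requires the third-party lime package); the toy corpus stands in for the real email dataset:

```python
# Toy corpus and model; requires the third-party `lime` package.
from sklearn.pipeline import make_pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from lime.lime_text import LimeTextExplainer

emails = ["win a free prize now", "cheap meds click here",
          "meeting agenda attached", "lunch tomorrow at noon"]
labels = [1, 1, 0, 0]                                  # 1 = spam, 0 = ham

pipeline = make_pipeline(TfidfVectorizer(), LogisticRegression())
pipeline.fit(emails, labels)

explainer = LimeTextExplainer(class_names=["ham", "spam"])
explanation = explainer.explain_instance("claim your free prize today",
                                         pipeline.predict_proba, num_features=4)
print(explanation.as_list())                           # influential terms and weights
```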
Procedia PDF Downloads 47
4222 Evaluation of the Self-Efficacy and Learning Experiences of Final year Students of Computer Science of Southwest Nigerian Universities
Authors: Olabamiji J. Onifade, Peter O. Ajayi, Paul O. Jegede
Abstract:
This study aimed at investigating the preparedness of undergraduate final-year students of Computer Science as the next entrants into the workplace. It assessed their self-efficacy in computational tasks and examined the relationship between their self-efficacy and their learning experiences in Southwest Nigerian universities. The study employed a descriptive survey research design. The population of the study comprises all final-year students of Computer Science. A purposive sampling technique was adopted in selecting a representative sample of interest from the final-year students of Computer Science. The Students’ Computational Task Self-Efficacy Questionnaire (SCTSEQ) was used to collect data. Mean, standard deviation, frequency, percentages, and linear regression were used for data analysis. The results revealed that the final-year students of Computer Science were moderately confident in performing computational tasks, and that there is a significant relationship between the learning experiences of the students and their self-efficacy. The study recommends that the curriculum be improved to accommodate industry experts as lecturers in some of the courses, that provision be made for more practical sessions, and that the learning experiences of students be considered an important component in the undergraduate Computer Science curriculum development process.
Keywords: computer science, learning experiences, self-efficacy, students
Procedia PDF Downloads 143
4221 Early Stage Suicide Ideation Detection Using Supervised Machine Learning and Neural Network Classifier
Authors: Devendra Kr Tayal, Vrinda Gupta, Aastha Bansal, Khushi Singh, Sristi Sharma, Hunny Gaur
Abstract:
In today's world, suicide is a serious problem. In order to save lives, early detection and prevention of suicide attempts should be addressed. A good number of at-risk people utilize social media platforms to talk about their issues or find information on related topics. Twitter and Reddit are two of the most common platforms used for expressing oneself. Extensive research has already been done in this field. Using supervised classification techniques such as Naive Bayes, Bernoulli Naive Bayes, and Multilayer Perceptron on a Reddit dataset, we demonstrate the early recognition of suicidal ideation. We also performed a comparative analysis of these approaches using accuracy, recall, F1 score, and precision.
Keywords: machine learning, suicide ideation detection, supervised classification, natural language processing
Procedia PDF Downloads 90
4220 Knowledge Discovery and Data Mining Techniques in Textile Industry
Authors: Filiz Ersoz, Taner Ersoz, Erkin Guler
Abstract:
This paper addresses the issues and techniques of applying data mining in the textile industry. Data mining has been applied to stitching data for garment products obtained from a textile company. The data mining techniques applied to the data included the CHAID algorithm, the CART algorithm, regression analysis, and artificial neural networks. Classification-based analyses were used in the data mining, and a decision model for production per person, together with the variables affecting production, was derived by this method. In the study, the results show that as daily working time increases, production per person decreases. In addition, the relationship between total daily working time and production per person is negative, and it is the strongest negative relationship observed.
Keywords: data mining, textile production, decision trees, classification
Procedia PDF Downloads 349
4219 A Dataset of Program Educational Objectives Mapped to ABET Outcomes: Data Cleansing, Exploratory Data Analysis and Modeling
Authors: Addin Osman, Anwar Ali Yahya, Mohammed Basit Kamal
Abstract:
Datasets or collections are becoming important assets by themselves and can now be accepted as a primary intellectual output of research. The quality and usage of a dataset depend mainly on the context under which it has been collected, processed, analyzed, validated, and interpreted. This paper aims to present a collection of program educational objectives mapped to student outcomes, collected from self-study reports prepared by 32 engineering programs accredited by ABET. The manual mapping (classification) of this data is a notoriously tedious, time-consuming process. In addition, it requires experts in the area, who are mostly not available. The operational settings under which the collection has been produced are described. The collection has been cleansed and preprocessed, some features have been selected, and preliminary exploratory data analysis has been performed so as to illustrate the properties and usefulness of the collection. Finally, the collection has been benchmarked using nine of the most widely used supervised multi-label classification techniques (Binary Relevance, Label Powerset, Classifier Chains, Pruned Sets, Random k-label sets, Ensemble of Classifier Chains, Ensemble of Pruned Sets, Multi-Label k-Nearest Neighbors and Back-Propagation Multi-Label Learning). The techniques have been compared to each other using five well-known measurements (Accuracy, Hamming Loss, Micro-F, Macro-F, and Macro-F). The Ensemble of Classifier Chains and Ensemble of Pruned Sets achieved encouraging performance compared to the other multi-label classification methods examined. The Classifier Chains method showed the worst performance. To recap, the benchmark has achieved promising results by utilizing preliminary exploratory data analysis performed on the collection, proposing new trends for research and providing a baseline for future studies.
Keywords: ABET, accreditation, benchmark collection, machine learning, program educational objectives, student outcomes, supervised multi-class classification, text mining
Procedia PDF Downloads 172
4218 Role of Cellulose Fibers in Tuning the Microstructure and Crystallographic Phase of α-Fe₂O₃ and α-FeOOH Nanoparticles
Authors: Indu Chauhan, Bhupendra S. Butola, Paritosh Mohanty
Abstract:
It is very well known that the properties of materials change as their size approaches the nanoscale, due to the high surface-area-to-volume ratio. However, in the last few decades, the tenet ‘structure dictates function’ has quickly been adopted by researchers working with nanomaterials. The design and exploitation of nanoparticles with tailored shape and size has become one of the primary goals of materials science researchers seeking to expose the properties of nanostructures. To date, various methods, including soft/hard template/surfactant-assisted hydrothermal reaction, seed-mediated growth, capping molecule-assisted synthesis, the polyol process, etc., have been adopted to synthesize nanostructures with controlled size, shape and monodispersity. However, controlling the shape and size of nanoparticles remains an ultimate challenge of modern materials research. In particular, many efforts have been devoted to the rational and skillful control of hierarchical and complex nanostructures. Thus, in our research work, the role of cellulose in manipulating nanostructures has been discussed. Nanoparticles of α-Fe₂O₃ (diameter ca. 15 to 130 nm) were immobilized on the cellulose fiber surface by a single-step in situ hydrothermal method. However, nanoflakes of α-FeOOH having a thickness of ca. 25 nm and a length of ca. 250 nm were obtained by the same method in the absence of cellulose fibers. A possible nucleation and growth mechanism for the formation of nanostructures on cellulose fibers has been proposed. The covalent bond formation between the cellulose fibers and the nanostructures has been discussed with supporting evidence from spectroscopic and other analytical studies such as Fourier transform infrared spectroscopy and X-ray photoelectron spectroscopy.
Keywords: cellulose fibers, α-Fe₂O₃, α-FeOOH, hydrothermal, nanoflakes, nanoparticles
Procedia PDF Downloads 150
4217 Early Diagnosis of Myocardial Ischemia Based on Support Vector Machine and Gaussian Mixture Model by Using Features of ECG Recordings
Authors: Merve Begum Terzi, Orhan Arikan, Adnan Abaci, Mustafa Candemir
Abstract:
Acute myocardial infarction is a major cause of death in the world. Therefore, its fast and reliable diagnosis is a major clinical need. ECG is the most important diagnostic methodology used to make decisions about the management of cardiovascular diseases. In patients with acute myocardial ischemia, temporary chest pain together with changes in the ST segment and T wave of the ECG occur shortly before the start of myocardial infarction. In this study, a technique that detects changes in the ST/T sections of the ECG is developed for the early diagnosis of acute myocardial ischemia. For this purpose, a database of real ECG recordings is constituted, containing records from 75 patients presenting symptoms of chest pain who underwent elective percutaneous coronary intervention (PCI). 12-lead ECGs of the patients were recorded before and during the PCI procedure. Two ECG epochs, the pre-inflation ECG, which is acquired before any catheter insertion, and the occlusion ECG, which is acquired during balloon inflation, are analyzed for each patient. By using the pre-inflation and occlusion recordings, ECG features that are critical in the detection of acute myocardial ischemia are identified, and the most discriminative features for the detection of acute myocardial ischemia are extracted. A classification technique based on the support vector machine (SVM) approach, operating with linear and radial basis function (RBF) kernels, is developed to detect ischemic events by using ST-T derived joint features from the non-ischemic and ischemic states of the patients. The dataset is randomly divided into training and testing sets, and the training set is used to optimize SVM hyperparameters by using the grid-search method and 10-fold cross-validation. SVMs are designed specifically for each patient by tuning the kernel parameters in order to obtain the optimal classification performance. As a result of applying the developed classification technique to real ECG recordings, it is shown that the proposed technique provides highly reliable detection of anomalies in ECG signals. Furthermore, to develop a detection technique that can be used in the absence of an ECG recording obtained during the healthy stage, the detection of acute myocardial ischemia based on ECG recordings of the patients obtained during ischemia is also investigated. For this purpose, a Gaussian mixture model (GMM) is used to represent the joint pdf of the most discriminating ECG features of myocardial ischemia. Then, a Neyman-Pearson type of approach is developed to provide detection of outliers that would correspond to acute myocardial ischemia. The Neyman-Pearson decision strategy is applied by computing the average log-likelihood values of ECG segments and comparing them with a range of different threshold values. For different discrimination threshold values and numbers of ECG segments, probability of detection and probability of false alarm values are computed, and the corresponding ROC curves are obtained. The results indicate that an increasing number of ECG segments provides higher performance for GMM-based classification. Moreover, the comparison between the performances of SVM- and GMM-based classification showed that SVM provides higher classification performance over the ECG recordings of a considerable number of patients.
Keywords: ECG classification, Gaussian mixture model, Neyman–Pearson approach, support vector machine
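A hedged sketch of the two strategies described above: an RBF-kernel SVM tuned by grid search with 10-fold cross-validation, and a GMM whose log-likelihood is thresholded in a Neyman-Pearson fashion; the synthetic features, the parameter grid and the 5% quantile threshold are assumptions for illustration:

```python
# Synthetic ST/T-like features; the grid, kernel choices and the 5% quantile
# threshold are assumptions for illustration.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(5)
X = np.vstack([rng.normal(0, 1, (100, 6)), rng.normal(1.5, 1, (100, 6))])
y = np.r_[np.zeros(100), np.ones(100)]       # 0 = pre-inflation, 1 = occlusion

# (i) SVM with grid-searched hyperparameters and 10-fold cross-validation
grid = GridSearchCV(SVC(kernel="rbf"),
                    {"C": [0.1, 1, 10], "gamma": ["scale", 0.01, 0.1]}, cv=10)
grid.fit(X, y)
print("best SVM params:", grid.best_params_)

# (ii) GMM of one state's features; a log-likelihood threshold fixed at a target
# false-alarm rate flags segments that look like outliers.
gmm = GaussianMixture(n_components=2, random_state=0).fit(X[y == 1])
threshold = np.quantile(gmm.score_samples(X[y == 1]), 0.05)
print("flagged as outliers:", (gmm.score_samples(X[y == 0]) < threshold).mean())
```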
Procedia PDF Downloads 162
4216 Effect of Ultrasonic Assisted High Pressure Soaking of Soybean on Soymilk Properties
Authors: Rahul Kumar, Pavuluri Srinivasa Rao
Abstract:
This study investigates the effect of ultrasound-assisted high pressure (HP) treatment on the soaking characteristics of soybeans and the quality of the extracted soy milk. The soybean (variety) was subjected to sonication (US) at ambient temperature for 15 and 30 min, followed by HP treatment in the range of 200-400 MPa for dwell times of 5-10 min. The bean samples were also compared with HPP samples (200-400 MPa; 5-10 min), overnight-soaked samples (12-15 h) and thermally treated samples (100°C/30 min) followed by overnight soaking for 12-15 h. Rapid soaking within 40 min was achieved by the combined US-HPP treatment, reducing the soaking time by about 25 times in comparison to overnight soaking or thermal treatment followed by soaking. Reducing the soaking time of soybeans is expected to suppress the development of the undesirable beany flavor of soy milk that develops during normal soaking and milk extraction. The optimum moisture uptake by the sonicated, pressure-treated soybeans was 60-62% (w.b.), similar to that obtained after overnight soaking for 12-15 h or thermal treatment followed by overnight soaking. The pH of the soy milk was not much affected by the different US-HPP treatments and overnight soaking, remaining around 6.6-6.7, much like normal cow milk. For milk extracted from thermally treated soy samples, the pH was reduced to 6.2. Total soluble solids (TSS) were found to be maximum for the normal overnight-soaked soy samples, in the range of 10.3-10.6. For the HPP-treated soy milk, the TSS was reduced to 7.4, while sonication further reduced it to 6.2. TSS was found to decrease with increasing ultrasonication time. Further reduction in TSS to 2.3 was observed in soy milk produced from thermally treated samples following overnight soaking. Our results conclude that milk from thermally treated beans is less stable and more acidic, while soaking with the combined treatment is very rapid compared to overnight soaking; hence milk productivity can be enhanced with less development of undesirable beany flavor.
Keywords: beany flavor, high pressure processing, high pressure, soybean, soaking, milk, ultrasound, wet basis
Procedia PDF Downloads 255
4215 Modular Robotics and Terrain Detection Using Inertial Measurement Unit Sensor
Authors: Shubhakar Gupta, Dhruv Prakash, Apoorv Mehta
Abstract:
In this project, we design a modular robot capable of using and switching between multiple methods of propulsion and of classifying terrain based on Inertial Measurement Unit (IMU) input. We wanted to make a robot that is not only intelligent in its functioning but also versatile in its physical design. The advantage of a modular robot is that it can be designed to hold several movement apparatuses, such as wheels, legs for a hexapod or quadpod setup, propellers for underwater locomotion, and any other solution that may be needed. The robot takes roughness input from the gyroscope and accelerometer in the IMU and, based on the terrain classification from an artificial neural network, decides which method of propulsion would best optimize its movement. This provides the bot with adaptability over a set of terrains, which means it can optimize its locomotion on a terrain based on its roughness. A feature like this would be a great asset in autonomous exploration or research drones.
Keywords: modular robotics, terrain detection, terrain classification, neural network
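An illustrative sketch of the terrain-classification idea, assuming roughness features computed from windowed accelerometer and gyroscope readings and a small scikit-learn neural network; the window length, feature set and terrain labels are assumptions, not the project's implementation:

```python
# Assumed window length, feature set and terrain labels.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(6)

def roughness_features(window):                # window: (samples, 6) accel + gyro axes
    return np.r_[window.std(axis=0), np.abs(np.diff(window, axis=0)).mean(axis=0)]

smooth = [roughness_features(rng.normal(0, 0.1, (100, 6))) for _ in range(50)]
rough = [roughness_features(rng.normal(0, 0.5, (100, 6))) for _ in range(50)]
X = np.vstack([smooth, rough])
y = np.r_[np.zeros(50), np.ones(50)]           # 0 = smooth terrain, 1 = rough terrain

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X, y)
print(clf.predict([roughness_features(rng.normal(0, 0.45, (100, 6)))]))  # expect [1.]
```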
Procedia PDF Downloads 145
4214 ICanny: CNN Modulation Recognition Algorithm
Authors: Jingpeng Gao, Xinrui Mao, Zhibin Deng
Abstract:
To address the low recognition rate for composite signal modulation at low signal-to-noise ratio (SNR), this paper proposes a modulation recognition algorithm based on ICanny-CNN. Firstly, the radar signal is transformed into a time-frequency image by the Choi-Williams Distribution (CWD). Secondly, we propose an image processing algorithm using the guided filter and a threshold selection method, combined with hole filling and a mask operation. Finally, a shallow convolutional neural network (CNN) is combined with the ideas of depth-wise convolution (Dw Conv) and point-wise convolution (Pw Conv). The proposed CNN is designed to complete image classification and realize modulation recognition of radar signals. The simulation results show that the proposed algorithm can reach 90.83% accuracy at 0 dB and 71.52% at -8 dB. Therefore, the proposed algorithm has good classification and anti-noise performance in radar signal modulation recognition and other fields.
Keywords: modulation recognition, image processing, composite signal, improved Canny algorithm
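A minimal PyTorch sketch of the depth-wise plus point-wise convolution pairing (Dw Conv followed by Pw Conv) mentioned above, applied to a time-frequency image; the channel sizes are arbitrary, and this is not the paper's full network:

```python
# Arbitrary channel sizes; only the Dw Conv + Pw Conv pairing is illustrated.
import torch
import torch.nn as nn

class DwPwBlock(nn.Module):
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.depthwise = nn.Conv2d(in_ch, in_ch, kernel_size=3, padding=1, groups=in_ch)
        self.pointwise = nn.Conv2d(in_ch, out_ch, kernel_size=1)
        self.act = nn.ReLU()

    def forward(self, x):
        return self.act(self.pointwise(self.depthwise(x)))

tf_image = torch.randn(1, 8, 64, 64)        # batch x channels x freq x time
print(DwPwBlock(8, 16)(tf_image).shape)     # -> torch.Size([1, 16, 64, 64])
```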
Procedia PDF Downloads 191
4213 The Awareness of Computer Science Students Regarding the Security of Location Based Games
Authors: Jacques Barnard, Magda Huisman, Gunther R. Drevin
Abstract:
Rapid expansion and development in the mobile technology market has created an opportunity for users to participate in location-based games. As a consequence of this fast-expanding market and new technology, it is important to be aware of the implications this has for security. This paper measures the security awareness of games’ participants, as well as that of university-level students, with regard to their year of study and gamer classification. This serves to provide insight into discernible differences in awareness of the security implications concerning these technologies. The data were accumulated via a web questionnaire that was to be completed yearly by students from the respective year groups. Results signify a meaningful disparity in security awareness among students across the various study years. This awareness, however, does not always have an impact on gamers.
Keywords: gamer classifications, location based games, location based data, security awareness
Procedia PDF Downloads 292
4212 Efficient Manageability and Intelligent Classification of Web Browsing History Using Machine Learning
Authors: Suraj Gururaj, Sumantha Udupa U.
Abstract:
Browsing the Web has emerged as the de facto activity performed on the Internet. Although browsing gets tracked, the manageability of Web browsing history is very poor. In this paper, we present a workable solution implemented using machine learning and natural language processing techniques for efficient manageability of a user’s browsing history. The significance of adding such a capability to a Web browser is that it ensures efficient and quick information retrieval from browsing history, which currently is very challenging. Our solution guarantees that any important websites visited in the past remain easily accessible thanks to intelligent and automatic classification. In a nutshell, our solution provides an implementation as a browser extension that intelligently classifies the browsing history into the most relevant category automatically, without any user intervention. This guarantees that no information is lost and increases productivity by saving the time spent revisiting websites that were of much importance.
Keywords: adhoc retrieval, Chrome extension, supervised learning, tile, Web personalization
Procedia PDF Downloads 376
4211 Ultrasound Assisted Alkaline Potassium Permanganate Pre-Treatment of Spent Coffee Waste
Authors: Rajeev Ravindran, Amit K. Jaiswal
Abstract:
Lignocellulose is the largest reservoir of inexpensive, renewable carbon. It is composed of lignin, cellulose and hemicellulose. Cellulose and hemicellulose are composed of the reducing sugars glucose, xylose and several other monosaccharides, which can be metabolised by microorganisms to produce several value-added products such as biofuels, enzymes, amino acids, etc. Enzymatic treatment of lignocellulose leads to the release of monosaccharides such as glucose and xylose. However, factors such as the presence of lignin, crystalline cellulose, acetyl groups, pectin, etc. contribute to recalcitrance, restricting the effective enzymatic hydrolysis of cellulose and hemicellulose. In order to overcome these problems, pre-treatment of lignocellulose is generally carried out, which essentially facilitates better degradation of lignocellulose. A range of pre-treatment strategies is commonly employed based on their mode of action, viz. physical, chemical, biological and physico-chemical. However, existing pre-treatment strategies result in lower sugar yields and the formation of inhibitory compounds. In order to overcome these problems, we propose a novel pre-treatment which utilises the superior oxidising capacity of alkaline potassium permanganate, assisted by ultra-sonication, to break the covalent bonds in spent coffee waste and remove recalcitrant compounds such as lignin. The pre-treatment was conducted for 30 minutes using 2% (w/v) potassium permanganate at room temperature with a solid-to-liquid ratio of 1:10. The pre-treated spent coffee waste (SCW) was subjected to enzymatic hydrolysis using the enzymes cellulase and hemicellulase. Shake flask experiments were conducted with a working volume of 50 mL of buffer containing 1% substrate. The results showed that the novel pre-treatment strategy yielded 7 g/L of reducing sugar as compared to 3.71 g/L obtained from biomass that had undergone dilute acid hydrolysis after 24 hours. From the results obtained, it is fairly certain that ultrasonication assists the oxidation of recalcitrant components in lignocellulose by potassium permanganate. Enzyme hydrolysis studies suggest that ultrasound-assisted alkaline potassium permanganate pre-treatment is far superior to treatment by dilute acid. Furthermore, SEM, XRD and FTIR were carried out to analyse the effect of the new pre-treatment strategy on the structure and crystallinity of pre-treated spent coffee waste. This novel one-step pre-treatment strategy was implemented under mild conditions and exhibited high efficiency in the enzymatic hydrolysis of spent coffee waste. Further study and scale-up are in progress in order to realise future industrial applications.
Keywords: spent coffee waste, alkaline potassium permanganate, ultra-sonication, physical characterisation
Procedia PDF Downloads 357