Search results for: tree detection

1563 A Network Approach to Analyzing Financial Markets

Authors: Yusuf Seedat

Abstract:

The necessity to understand global financial markets has increased following the unfortunate spread of the recent financial crisis around the world. Financial markets are considered to be complex systems consisting of highly volatile movements whose indexes fluctuate without any clear pattern. Analytic methods for stock prices have been proposed in which financial markets are modeled using common network analysis tools and methods. It has been found that two key components of social network analysis are relevant to modeling financial markets, allowing accurate predictions of stock prices within the financial market. Financial markets have a number of interacting components, leading to complex behavioral patterns. This paper describes a social network approach to analyzing financial markets as a viable approach to studying the way complex stock markets function. We also look at how social network analysis techniques and metrics are used to gain an understanding of the evolution of financial markets, as well as how community detection can be used to qualify and quantify influence within a network.
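
As a minimal illustration of the network approach described above (a sketch, not the author's implementation), the following builds a correlation-based stock graph with networkx and runs greedy modularity community detection; the tickers, synthetic returns, and the 0.6 edge threshold are assumptions.

```python
# Sketch only (not the author's code): correlation network of stocks plus
# community detection. Tickers, returns, and the 0.6 threshold are hypothetical.
import numpy as np
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

rng = np.random.default_rng(0)
tickers = ["A1", "A2", "A3", "B1", "B2", "B3"]            # placeholder symbols
factor_a, factor_b = rng.normal(size=(2, 250))            # two "sector" factors
returns = np.column_stack(
    [factor_a + 0.4 * rng.normal(size=250) for _ in range(3)]
    + [factor_b + 0.4 * rng.normal(size=250) for _ in range(3)]
)
corr = np.corrcoef(returns.T)                             # pairwise correlations

G = nx.Graph()
G.add_nodes_from(tickers)
for i in range(len(tickers)):
    for j in range(i + 1, len(tickers)):
        if abs(corr[i, j]) > 0.6:                         # keep strong co-movements
            G.add_edge(tickers[i], tickers[j], weight=abs(corr[i, j]))

print("degree centrality:", nx.degree_centrality(G))
print("communities:", [sorted(c) for c in greedy_modularity_communities(G)])
```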

Keywords: network analysis, social networks, financial markets, stocks, nodes, edges, complex networks

Procedia PDF Downloads 192
1562 Artificial Intelligence-Based Detection of Individuals Suffering from Vestibular Disorder

Authors: Dua Hişam, Serhat İkizoğlu

Abstract:

Identifying the problem behind a balance disorder is one of the most interesting topics in the medical literature. This study has considerably enhanced the development of artificial intelligence (AI) algorithms by applying multiple machine learning (ML) models to gait sensor data collected from humans in order to classify individuals as healthy or as suffering from Vestibular System (VS) problems. Although AI is widely utilized as a diagnostic tool in medicine, AI models have not been used to perform feature extraction and identify VS disorders through training on raw data. In this study, three ML models, the Random Forest Classifier (RF), Extreme Gradient Boosting (XGB), and K-Nearest Neighbor (KNN), were trained to detect VS disorder, and the performance of the algorithms was compared using accuracy, recall, precision, and F1-score. With an accuracy of 95.28%, the Random Forest Classifier (RF) was the most accurate model.
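
A minimal sketch of the reported model comparison, using synthetic data in place of the gait sensor measurements; scikit-learn's GradientBoostingClassifier stands in for XGBoost here, and the hyperparameters are illustrative.

```python
# Sketch of the RF / boosting / KNN comparison on synthetic data (the gait
# dataset is not public here). GradientBoostingClassifier stands in for XGB.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

X, y = make_classification(n_samples=600, n_features=20, random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=42)

models = {
    "RF": RandomForestClassifier(n_estimators=200, random_state=42),
    "GB (XGB stand-in)": GradientBoostingClassifier(random_state=42),
    "KNN": KNeighborsClassifier(n_neighbors=5),
}
for name, model in models.items():
    y_pred = model.fit(X_tr, y_tr).predict(X_te)
    print(f"{name}: acc={accuracy_score(y_te, y_pred):.3f} "
          f"prec={precision_score(y_te, y_pred):.3f} "
          f"rec={recall_score(y_te, y_pred):.3f} "
          f"f1={f1_score(y_te, y_pred):.3f}")
```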

Keywords: vestibular disorder, machine learning, random forest classifier, k-nearest neighbor, extreme gradient boosting

Procedia PDF Downloads 70
1561 GPU Based High Speed Error Protection for Watermarked Medical Image Transmission

Authors: Md Shohidul Islam, Jongmyon Kim, Ui-pil Chong

Abstract:

Medical images are an integral part of e-health care and e-diagnosis systems. Medical image watermarking is widely used to protect patients' information from malicious alteration and manipulation. The watermarked medical images are transmitted over the internet among patients and primary and referred physicians. The images are highly prone to corruption in the wireless transmission medium due to various noises, deflections, and refractions. Distortion in the received images leads to faulty watermark detection and inappropriate disease diagnosis. To address the issue, this paper utilizes an error correction code (ECC), the (8, 4) Hamming code, in an existing watermarking system. In addition, we implement the computationally complex ECC on a graphics processing unit (GPU) to accelerate it and support real-time requirements. Experimental results show that the GPU achieves considerable speedup over the sequential CPU implementation while maintaining 100% ECC efficiency.
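
For readers unfamiliar with (8, 4) Hamming coding, the plain-Python sketch below shows extended-Hamming encoding with single-error correction and double-error detection on the CPU; the bit ordering is one common convention and is independent of the paper's GPU implementation.

```python
# CPU sketch of (8,4) extended Hamming coding (Hamming(7,4) plus an overall
# parity bit): corrects any single-bit error and detects double-bit errors.
# The bit ordering is one common convention, not necessarily the paper's.

def hamming84_encode(d):                        # d = [d1, d2, d3, d4]
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    code7 = [p1, p2, d1, p3, d2, d3, d4]        # positions 1..7
    p0 = 0
    for b in code7:
        p0 ^= b                                 # overall parity bit
    return code7 + [p0]

def hamming84_decode(c):                        # c = 8 received bits
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]              # checks positions 1,3,5,7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]              # checks positions 2,3,6,7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]              # checks positions 4,5,6,7
    syndrome = s1 + 2 * s2 + 4 * s3             # 1-indexed error position
    overall = 0
    for b in c:
        overall ^= b                            # 0 if overall parity holds
    c = list(c)
    if syndrome and overall:                    # single error in bits 1..7
        c[syndrome - 1] ^= 1
    elif syndrome and not overall:              # two errors: detectable only
        return None
    # syndrome == 0 with overall == 1 means only the parity bit was hit
    return [c[2], c[4], c[5], c[6]]             # recovered data bits

codeword = hamming84_encode([1, 0, 1, 1])
codeword[5] ^= 1                                # inject a single-bit error
print(hamming84_decode(codeword))               # -> [1, 0, 1, 1]
```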

Keywords: medical image watermarking, e-health system, error correction, Hamming code, GPU

Procedia PDF Downloads 292
1560 Comparison of Concentration of Heavy Metals in PM2.5 Analyzed in Three Different Global Research Institutions Using X-Ray Fluorescence

Authors: Sungroul Kim, Yeonjin Kim

Abstract:

This study compared the concentrations of heavy metals analyzed from the same samples with three X-ray fluorescence (XRF) spectrometers in three different global research institutions: PAN (a branch of Malvern Panalytical, Seoul, South Korea), RTI (Research Triangle Institute, NC, U.S.A.), and the aerosol laboratory at Harvard University, Boston, U.S.A. To achieve our research objectives, indoor air filter samples were collected at the homes (n=24) of adult or child asthmatics and then analyzed at PAN, followed by Harvard University and RTI consecutively. Descriptive statistics were computed for data comparison, together with correlation and simple regression analysis using R version 4.0.3. As a result, the detection rates of most heavy metals analyzed in the three institutions were about 90%. Of the 25 elements commonly analyzed among those institutions, 16 elements showed an R² (coefficient of determination) of 0.7 or higher (10 components were 0.9 or higher). The findings of this study demonstrate that XRF is a useful technique ensuring reproducibility and compatibility for measuring heavy metals in PM2.5 collected from the indoor air of asthmatics' homes.

Keywords: heavy metals, indoor air quality, PM2.5, X-ray fluorescence

Procedia PDF Downloads 201
1559 A Review of HVDC Modular Multilevel Converters Subjected to DC and AC Faults

Authors: Jude Inwumoh, Adam P. R. Taylor, Kosala Gunawardane

Abstract:

Modular multilevel converters (MMC) exhibit a highly scalable and modular characteristic with good voltage/power expansion, fault tolerance capability, low output harmonic content, good redundancy, and a flexible front-end configuration. Fault detection, location, and isolation, as well as maintaining fault ride-through (FRT), are major challenges to MMC reliability and power supply sustainability. Different papers have been reviewed to seek the best MMC configuration with fault-handling capability. DC faults are the most common, while the probability of an AC fault occurring in a modular multilevel converter (MMC) is low; however, the consequences of AC faults are severe. This paper reviews several MMC topologies and modulation techniques for tackling faults. These fault control strategies are compared based on cost, complexity, controllability, and power loss. A meshed network of the half-bridge (HB) MMC topology was optimal in providing fault ride-through compared with other MMC topologies, but only when combined with DC circuit breakers (CBs), AC CBs, and fault current limiters (FCLs).

Keywords: MMC-HVDC, DC faults, fault current limiters, control scheme

Procedia PDF Downloads 140
1558 Single Cell and Spatial Transcriptomics: A Beginner's Viewpoint from the Conceptual Pipeline

Authors: Leo Nnamdi Ozurumba-Dwight

Abstract:

Messenger ribonucleic acid (mRNA) molecules encode proteins. These protein-encoding mRNA molecules (which collectively constitute the transcriptome), when analyzed by RNA sequencing (RNAseq), unveil the nature of gene expression. The gene expression obtained provides clues to cellular traits and their dynamics, which can be studied in relation to function and responses. RNAseq is a practical concept in genomics, as it enables the detection and quantitative analysis of mRNA molecules. Single cell and spatial transcriptomics both present avenues for exposing the genomic characteristics of single cells and pooled cells in disease conditions such as cancer, auto-immune diseases, and hematopoietic diseases, among others, from investigated biological tissue samples. Single cell transcriptomics enables a direct assessment of each building unit of a tissue (the cell) during diagnosis and molecular gene expression studies. A typical technique to achieve this is single-cell RNA sequencing (scRNAseq), which supports high-throughput gene expression studies. However, this technique generates gene expression data for several cells that lack the cells' positional coordinates within the tissue. As science develops, the use of complementary, pre-established tissue reference maps built with molecular and bioinformatics techniques has sprung forth and is now used to resolve this setback, producing both levels of data in one shot of scRNAseq analysis. This is an emerging conceptual approach in methodology for integrative and progressively dependable transcriptomics analysis. It can support in-situ analysis for a better understanding of tissue functional organization, unveil new biomarkers for early-stage detection of diseases and for therapeutic targets in drug development, and exposit the nature of cell-to-cell interactions. These are also vital genomic signatures and characterizations for clinical applications. Over the past decades, RNAseq has generated a wide array of information that is igniting bespoke breakthroughs and innovations in biomedicine. Spatial transcriptomics, on the other hand, is tissue-level based and is utilized to study biological specimens with heterogeneous features. It exposits the gross identity of the investigated mammalian tissues, which can then be used to study cell differentiation, track cell lineage trajectory patterns and behavior, and examine regulatory homeostasis in disease states. It also requires referenced positional analysis to build up the genomic signatures that will be assessed from the single cells in the tissue sample. Given these two approaches to RNA transcriptomics, which address varying quantities of cells and offer avenues for appropriate resolution, the study of gene expression from mRNA molecules has become interesting, progressive, and developmental, and is helping to tackle health challenges head-on.

Keywords: transcriptomics, RNA sequencing, single cell, spatial, gene expression

Procedia PDF Downloads 124
1557 Detecting Cyberbullying, Spam and Bot Behavior and Fake News in Social Media Accounts Using Machine Learning

Authors: M. D. D. Chathurangi, M. G. K. Nayanathara, K. M. H. M. M. Gunapala, G. M. R. G. Dayananda, Kavinga Yapa Abeywardena, Deemantha Siriwardana

Abstract:

Due to the growing popularity of social media platforms at present, there are various concerns, most notably cyberbullying, spam, bot accounts, and the spread of incorrect information. To develop a risk score calculation system as a thorough method for deciphering and exposing unethical social media profiles, this research explores, to the best of our knowledge, the most suitable algorithms for detecting the concerns mentioned. Multiple models, such as Naïve Bayes, CNN, KNN, Stochastic Gradient Descent, and Gradient Boosting classifiers, among others, were examined, and the best results were taken into the development of the risk score system. For cyberbullying, the Logistic Regression algorithm achieved an accuracy of 84.9%, while the spam-detecting MLP model reached 98.02% accuracy. The Random Forest algorithm identifying bot accounts obtained 91.06% accuracy, and 84% accuracy was achieved for fake news detection using SVM.

Keywords: cyberbullying, spam behavior, bot accounts, fake news, machine learning

Procedia PDF Downloads 40
1556 Tax Evasion with Mobility between the Regular and Irregular Sectors

Authors: Xavier Ruiz Del Portal

Abstract:

This paper incorporates mobility between the legal and black economies into a model of tax evasion with endogenous labor supply, in which underreporting is possible in one sector but impossible in the other. We find that the results for the effects along the extensive margin (number of evaders) are more robust and conclusive than those along the intensive margin (hours of illegal work) usually considered in the literature. In particular, it is shown that the following policies reduce the number of evaders: (a) larger and more progressive evasion penalties; (b) higher detection probabilities; (c) an increase in the legal sector wage rate; (d) a decrease in the moonlighting wage rate; (e) higher costs of creating opportunities to evade; (f) lower opportunities to evade; and (g) greater psychological costs of tax evasion. When tax concealment and illegal work are also taken into account, the effects do not vary significantly under the assumptions in Cowell (1985), except that policies (a) and (b) only hold for low- and middle-income groups and policies (e) and (f) only for high-income groups.

Keywords: income taxation, tax evasion, extensive margin responses, the penalty system

Procedia PDF Downloads 156
1555 Career Guidance System Using Machine Learning

Authors: Mane Darbinyan, Lusine Hayrapetyan, Elen Matevosyan

Abstract:

Artificial Intelligence in Education (AIED) was created to help students get ready for the workforce, and over the past 25 years it has grown significantly, offering a variety of technologies to support academic, institutional, and administrative services. However, this is still challenging, especially considering the labor market's rapid change. While choosing a career, people face various obstacles because they do not take their own preferences into consideration, which might lead to many other problems such as shifting jobs, work stress, occupational infirmity, reduced productivity, and manual error. Besides preferences, people should properly evaluate their technical and non-technical skills, as well as their personalities. Professional counseling has become a difficult undertaking for counselors due to the wide range of career choices brought on by changing technological trends. It is necessary to close this gap by utilizing technology that makes sophisticated predictions about a person's career goals based on their personality. Hence, there is a need to create an automated model that would help in decision-making based on user inputs. Improving career guidance can be achieved by embedding machine learning into the career consulting ecosystem. There are various career guidance systems that work based on the same logic, such as the classification of applicants, matching applications with appropriate departments or jobs, making predictions, and providing suitable recommendations. Methodologies like KNN, neural networks, K-means clustering, decision trees (D-Tree), and many other advanced algorithms are applied to the data and produce results that help predict the right careers. Besides helping users with their career choice, these systems provide numerous opportunities that are very useful when making this hard decision. They help candidates recognize where they specifically lack sufficient skills so that they can improve those skills. They are also capable of offering an e-learning platform, taking into account the user's lack of knowledge. Furthermore, users can be provided with details on a particular job, such as the abilities required to excel in that industry.
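
A toy sketch of the decision-tree (D-Tree) idea mentioned above; the skill features, career labels, and scores are invented for illustration and are not taken from any real guidance system.

```python
# Illustrative only: a small decision tree mapping hypothetical skill scores
# to a suggested career track, in the spirit of the D-Tree approach above.
from sklearn.tree import DecisionTreeClassifier, export_text

# columns: [coding, math, communication, design]  (scores 0-10, made up)
X = [
    [9, 8, 4, 3], [8, 9, 5, 2], [7, 9, 3, 4],    # engineering-leaning profiles
    [3, 4, 9, 5], [2, 5, 8, 6], [4, 3, 9, 7],    # counseling/HR-leaning profiles
    [5, 3, 6, 9], [4, 2, 5, 9], [6, 4, 7, 8],    # design-leaning profiles
]
y = ["software", "software", "software",
     "hr", "hr", "hr",
     "design", "design", "design"]

model = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(export_text(model, feature_names=["coding", "math", "communication", "design"]))
print(model.predict([[8, 7, 5, 4]]))              # likely ['software'] on this toy data
```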

Keywords: career guidance system, machine learning, career prediction, predictive decision, data mining, technical and non-technical skills

Procedia PDF Downloads 81
1554 Grating Scale Thermal Expansion Error Compensation for Large Machine Tools Based on Multiple Temperature Detection

Authors: Wenlong Feng, Zhenchun Du, Jianguo Yang

Abstract:

To decrease the grating scale thermal expansion error, a novel method based on multiple temperature detection points is proposed. Several temperature sensors are installed on the grating scale, and their temperatures are recorded. The temperature at every point on the grating scale is calculated by interpolating between adjacent sensors. According to the thermal expansion principle, the grating scale thermal expansion error model can be established by integrating the variations of position and temperature. A novel compensation method is proposed in this paper. By applying the established error model, the grating scale thermal expansion error is decreased by 90% compared with no compensation. The residual positioning error of the grating scale is less than 15 µm/10 m, and the accuracy of the machine tool is significantly improved.
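
A small numerical sketch of the described model: sensor temperatures are interpolated along the scale, and the expansion error is obtained by integrating the product of the expansion coefficient and the temperature deviation over position; the sensor readings and the expansion coefficient are assumed values, not the paper's.

```python
# Sketch with assumed values: interpolate the temperature profile between
# discrete sensors, then integrate alpha * (T(x) - T_ref) along the scale.
import numpy as np

alpha = 11.5e-6     # assumed thermal expansion coefficient (steel), 1/K
T_ref = 20.0        # reference temperature, deg C
sensor_pos = np.array([0.0, 2.5, 5.0, 7.5, 10.0])     # sensor locations, m
sensor_T = np.array([21.2, 22.0, 23.1, 22.4, 21.6])   # recorded temperatures, deg C

x = np.linspace(0.0, 10.0, 1001)                      # evaluation points on the scale
T = np.interp(x, sensor_pos, sensor_T)                # interpolate between sensors

integrand = alpha * (T - T_ref)                       # local strain contribution
dx = x[1] - x[0]
# cumulative trapezoid-rule integral: expansion error accumulated from 0 to x
cum_error = np.concatenate(([0.0], np.cumsum((integrand[:-1] + integrand[1:]) * dx / 2)))

for p in (2.0, 5.0, 10.0):
    idx = np.searchsorted(x, p)
    print(f"position {p:4.1f} m -> thermal expansion error {cum_error[idx] * 1e6:7.2f} um")
```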

Keywords: thermal expansion error of grating scale, error compensation, machine tools, integral method

Procedia PDF Downloads 368
1553 Research on Teachers’ Perceptions on the Usability of Classroom Space: Analysis of a Nation-Wide Questionnaire Survey in Japan

Authors: Masayuki Mori

Abstract:

This study investigates the relationship between teachers' perceptions of the usability of classroom space and various elements of classroom environments, both physical and non-physical. With the introduction of the GIGA School funding program in Japan in 2019, understanding its impact on learning in classroom space is crucial. The program enabled local educational authorities (LEAs) to provide one PC/tablet for each student in both elementary and junior high schools. At the same time, the program also supported LEAs in purchasing other electronic devices for educational purposes, such as electronic whiteboards, large displays, and real-image projectors. A nationwide survey was conducted using random sampling among 100 junior high schools to collect data on classroom space; 60 schools responded to the survey. The survey covered approximately fifty items, including classroom space size, class size, and the educational electronic devices owned. After data compilation, statistical analysis was used to identify correlations between the variables and to explore the extent to which classroom environment elements influenced teachers' perceptions. Furthermore, decision tree analysis was applied to visualize the causal relationships between the variables. The findings indicate a significant negative correlation between class size and teachers' evaluation of usability. In addition to class size, the way students stored their belongings also influenced teachers' perceptions. As for the placement of educational electronic devices, the installation of a projector produced a small negative correlation with teachers' perceptions. The study suggests that while the GIGA School funding program is not significantly influential, traditional educational conditions such as class size have a greater impact on teachers' perceptions of the usability of classroom space. These results highlight the need for awareness and strategies that integrate the various elements of classroom learning environment design so that teachers and students can improve their learning experience.

Keywords: classroom space, GIGA School, questionnaire survey, teachers’ perceptions

Procedia PDF Downloads 26
1552 Career Guidance System Using Machine Learning

Authors: Mane Darbinyan, Lusine Hayrapetyan, Elen Matevosyan

Abstract:

Artificial Intelligence in Education (AIED) was created to help students get ready for the workforce, and over the past 25 years it has grown significantly, offering a variety of technologies to support academic, institutional, and administrative services. However, this is still challenging, especially considering the labor market's rapid change. While choosing a career, people face various obstacles because they do not take their own preferences into consideration, which might lead to many other problems such as shifting jobs, work stress, occupational infirmity, reduced productivity, and manual error. Besides preferences, people should properly evaluate their technical and non-technical skills, as well as their personalities. Professional counseling has become a difficult undertaking for counselors due to the wide range of career choices brought on by changing technological trends. It is necessary to close this gap by utilizing technology that makes sophisticated predictions about a person's career goals based on their personality. Hence, there is a need to create an automated model that would help in decision-making based on user inputs. Improving career guidance can be achieved by embedding machine learning into the career consulting ecosystem. There are various career guidance systems that work based on the same logic, such as the classification of applicants, matching applications with appropriate departments or jobs, making predictions, and providing suitable recommendations. Methodologies like KNN, neural networks, K-means clustering, decision trees (D-Tree), and many other advanced algorithms are applied to the data and produce results that help predict the right careers. Besides helping users with their career choice, these systems provide numerous opportunities that are very useful when making this hard decision. They help candidates recognize where they specifically lack sufficient skills so that they can improve those skills. They are also capable of offering an e-learning platform, taking into account the user's lack of knowledge. Furthermore, users can be provided with details on a particular job, such as the abilities required to excel in that industry.

Keywords: career guidance system, machine learning, career prediction, predictive decision, data mining, technical and non-technical skills

Procedia PDF Downloads 71
1551 Nondestructive Testing for Reinforced Concrete Buildings with Active Infrared Thermography

Authors: Huy Q. Tran, Jungwon Huh, Kiseok Kwak, Choonghyun Kang

Abstract:

The infrared thermography (IRT) technique has been proven to be a good method for the nondestructive evaluation of concrete. In buildings, it has been used in a broad range of applications, such as subsurface defect inspection, energy loss assessment, and moisture detection. The purpose of this research is to assess the qualitative and quantitative performance of the active infrared thermography technique for detecting deterioration in reinforced concrete. An experiment with three different heating regimes was conducted on a concrete slab in the laboratory. The thermal characteristics of the IRT method, i.e., absolute contrast and observation time, were investigated. A linear relationship between the observation time and the real depth was established, with a linear regression R-squared of 0.931. The results showed that the absolute contrast above a defective area increases with the size of the delamination and the heating time. In addition, the depth of delamination can be predicted by using the relationship proposed in this study.
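
A brief sketch of fitting the linear observation-time/depth relationship with scipy; the depth and time values are hypothetical stand-ins for the laboratory measurements.

```python
# Sketch of the linear fit (observation time vs. defect depth) on hypothetical
# measurements, not the authors' data.
from scipy.stats import linregress

depth_mm = [10, 20, 30, 40, 50, 60]                 # assumed delamination depths
obs_time_s = [55, 118, 170, 236, 288, 355]          # assumed peak-contrast times

fit = linregress(depth_mm, obs_time_s)
print(f"slope = {fit.slope:.2f} s/mm, intercept = {fit.intercept:.2f} s, "
      f"R^2 = {fit.rvalue ** 2:.3f}")

# invert the fit to predict depth from a new observation time
t_new = 200.0
depth_pred = (t_new - fit.intercept) / fit.slope
print(f"predicted depth for t = {t_new} s: {depth_pred:.1f} mm")
```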

Keywords: concrete building, infrared thermography, nondestructive evaluation, subsurface delamination

Procedia PDF Downloads 283
1550 Emotional Analysis for Text Search Queries on Internet

Authors: Gemma García López

Abstract:

The goal of this study is to analyze whether search queries carried out in search engines such as Google can offer emotional information about the user who performs them. Knowing the emotional state of the Internet user can be key to achieving maximum personalization of content and to detecting worrying behaviors. For this, two studies were carried out using tools with advanced natural language processing techniques. The first study determines whether a query can be classified as positive, negative, or neutral, while the second study extracts emotional content from words and applies the categorical and dimensional models for the representation of emotions. In addition, we use search queries in Spanish and English to establish similarities and differences between the two languages. The results revealed that text search queries performed by users on the Internet can be classified emotionally. This allows us to better understand the emotional state of the user at the time of the search, which could involve adapting the technology and personalizing the responses to different emotional states.

Keywords: emotion classification, text search queries, emotional analysis, sentiment analysis in text, natural language processing

Procedia PDF Downloads 142
1549 Formalizing a Procedure for Generating Uncertain Resource Availability Assumptions Based on Real Time Logistic Data Capturing with Auto-ID Systems for Reactive Scheduling

Authors: Lars Laußat, Manfred Helmus, Kamil Szczesny, Markus König

Abstract:

As one result of the project "Reactive Construction Project Scheduling using Real Time Construction Logistic Data and Simulation", a procedure for using data about uncertain resource availability assumptions in reactive scheduling processes has been developed. Prediction data about resource availability are generated in a formalized way using real-time monitoring data, e.g., from auto-ID systems on the construction site and in the supply chains. The paper focuses on the formalization of the procedure for monitoring construction logistic processes, for detecting disturbances, and for generating new, uncertain scheduling assumptions for the reactive resource-constrained simulation procedure that is and will be further described in other papers.

Keywords: auto-ID, construction logistic, fuzzy, monitoring, RFID, scheduling

Procedia PDF Downloads 516
1548 On-Line Data-Driven Multivariate Statistical Prediction Approach to Production Monitoring

Authors: Hyun-Woo Cho

Abstract:

Detection of incipient abnormal events in production processes is important for improving the safety and reliability of manufacturing operations and reducing losses caused by failures. The construction of calibration models for predicting faulty conditions is essential for deciding when to perform preventive maintenance. This paper presents a multivariate calibration monitoring approach based on the statistical analysis of process measurement data. The calibration model is used to predict faulty conditions from historical reference data. The approach utilizes variable selection techniques, and the predictive performance of several prediction methods is evaluated using real data. The results show that the calibration model based on a supervised probabilistic model yielded the best performance in this work. By adopting a proper variable selection scheme in calibration models, the prediction performance can be improved by excluding non-informative variables from the model building steps.

Keywords: calibration model, monitoring, quality improvement, feature selection

Procedia PDF Downloads 357
1547 Charging-Vacuum Helium Mass Spectrometer Leak Detection Technology in the Application of Space Products Leak Testing and Error Control

Authors: Jijun Shi, Lichen Sun, Jianchao Zhao, Lizhi Sun, Enjun Liu, Chongwu Guo

Abstract:

Because of its consistent pressure direction, shorter cycle time, and high sensitivity, charging-vacuum helium mass spectrometer leak testing is the most popular leak testing technology for the seal testing of spacecraft parts, especially small and medium-sized ones. Usually, an auxiliary pump is used, and the minimum detectable leak rate can reach 5E-9 Pa·m3/s, or even better on certain occasions. Relative error is the more important quantity when evaluating the results. The choice of reference leak, the background level of helium, and the recording format all affect the measured leak rate. Within the linearity range of the leak testing system, using a reference leak with a larger leak rate reduced the relative error by about 10%, and the relative error was reduced noticeably when the helium background was kept low, a decimal recording format was used, and more stable data were recorded.

Keywords: leak testing, spacecraft parts, relative error, error control

Procedia PDF Downloads 456
1546 Dynamic Process Monitoring of an Ammonia Synthesis Fixed-Bed Reactor

Authors: Bothinah Altaf, Gary Montague, Elaine B. Martin

Abstract:

This study involves the modeling and monitoring of an ammonia synthesis fixed-bed reactor using partial least squares (PLS) and its variants. The process exhibits complex dynamic behavior due to the presence of heat recycling and feed quench. One limitation of a static PLS model in this situation is that it does not take account of the process dynamics; hence, dynamic PLS was used. Although dynamic PLS showed superior performance to static PLS in terms of prediction, its monitoring scheme was inappropriate; hence, adaptive PLS was considered. A limitation of adaptive PLS is that non-conforming observations also contribute to the model; therefore, a new adaptive approach, robust adaptive dynamic PLS, was developed. This approach updates a dynamic PLS model and is robust to non-representative data. The developed methodology showed a clear improvement over existing approaches in terms of the modeling of the reactor and the detection of faults.
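
To illustrate the "dynamic" extension of PLS, the sketch below augments each observation with lagged measurements before fitting a PLS model; the data are synthetic, and the lag depth and component count are assumptions rather than the reactor study's settings.

```python
# Sketch of the dynamic PLS idea: stack lagged measurements so that process
# dynamics enter the model. Synthetic data, not the ammonia-reactor dataset.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
n, n_vars, lags = 300, 5, 2
X = rng.normal(size=(n, n_vars))                       # process measurements
y = X[:, 0] + 0.5 * np.roll(X[:, 1], 2) + 0.1 * rng.normal(size=n)  # dynamic target

def add_lags(X, lags):
    """Stack X(t), X(t-1), ..., X(t-lags) column-wise (drops the first `lags` rows)."""
    blocks = [X[lags - k : len(X) - k] for k in range(lags + 1)]
    return np.hstack(blocks)

X_dyn = add_lags(X, lags)                              # shape (n - lags, n_vars*(lags+1))
y_dyn = y[lags:]

pls = PLSRegression(n_components=4).fit(X_dyn, y_dyn)
print("R^2 on training data:", round(pls.score(X_dyn, y_dyn), 3))
```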

Keywords: ammonia synthesis fixed-bed reactor, dynamic partial least squares modeling, recursive partial least squares, robust modeling

Procedia PDF Downloads 393
1545 Early Detection of Major Earthquakes Using Broadband Accelerometers

Authors: Umberto Cerasani, Luca Cerasani

Abstract:

Methods for earthquake forecasting have been intensively investigated in recent decades, but there is still no universal solution agreed upon by seismologists. Rock failure is most often preceded by a tiny elastic movement in the failure area and by the appearance of micro-cracks. These micro-cracks can be detected at the soil surface and represent useful earthquake precursors. The aim of this study was to verify whether tiny raw acceleration signals (in the 10⁻¹ to 10⁻⁴ cm/s² range) prior to the arrival of the main primary waves could be exploited and related to earthquake magnitude. Mathematical tools such as the Fast Fourier Transform (FFT), moving averages, and wavelets were applied to raw acceleration data available on the ITACA web site, and the study focused on one of the most unpredictable earthquakes, the one that occurred in the central Italy area on August 24th, 2016, at 01:36. It appeared that these tiny acceleration signals preceding the main P-waves have different patterns in both the frequency and time domains for high-magnitude earthquakes compared to lower-magnitude ones.
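
A generic sketch of the signal-processing steps named above (moving average and FFT) applied to a synthetic acceleration trace; the sampling rate, amplitudes, and frequencies are assumed, and the ITACA records are not reproduced.

```python
# Generic sketch: moving average and FFT on a synthetic acceleration trace
# (assumed sampling rate and signal content, not the ITACA records).
import numpy as np

fs = 100.0                                   # sampling rate, Hz (assumed)
t = np.arange(0, 60, 1 / fs)                 # 60 s of signal
rng = np.random.default_rng(0)
# tiny oscillatory "precursor" buried in noise, amplitudes in the 1e-3 cm/s^2 range
accel = 1e-3 * np.sin(2 * np.pi * 3.0 * t) + 5e-4 * rng.normal(size=t.size)

def moving_average(x, window):
    """Simple moving average via convolution."""
    kernel = np.ones(window) / window
    return np.convolve(x, kernel, mode="same")

smooth = moving_average(accel, window=25)

spectrum = np.abs(np.fft.rfft(accel))          # amplitude spectrum of the raw trace
freqs = np.fft.rfftfreq(accel.size, d=1 / fs)
dominant = freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC bin
print(f"dominant precursor frequency ~ {dominant:.2f} Hz")
```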

Keywords: earthquake, accelerometer, earthquake forecasting, seism

Procedia PDF Downloads 146
1544 Hyper Tuned RBF SVM: Approach for the Prediction of the Breast Cancer

Authors: Surita Maini, Sanjay Dhanka

Abstract:

Machine learning (ML) involves developing algorithms and statistical models that enable computers to learn and make predictions or decisions based on data without being explicitly programmed. Because of its broad capabilities, ML is gaining popularity in medical sectors: medical imaging, electronic health records, genomic data analysis, wearable devices, disease outbreak prediction, disease diagnosis, etc. In the last few decades, many researchers have tried to diagnose breast cancer (BC) using ML, because early detection of any disease can save millions of lives. Working in this direction, the authors propose a hybrid ML technique, an RBF SVM, to predict BC at an earlier stage. The proposed method is implemented on the UCI breast cancer ML dataset with 569 instances and 32 attributes. The authors recorded the performance metrics of the proposed model, i.e., accuracy 98.24%, sensitivity 98.67%, specificity 97.43%, F1 score 98.67%, precision 98.67%, and run time 0.044769 seconds. The proposed method is validated by k-fold cross-validation.
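
A short sketch of an RBF-kernel SVM with 10-fold cross-validation on the same public UCI dataset as shipped with scikit-learn; the C and gamma values are illustrative, not the tuned hyperparameters behind the figures above.

```python
# Sketch: RBF-kernel SVM on the UCI breast cancer dataset with 10-fold CV.
# C and gamma are illustrative, not the paper's tuned values.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score, KFold
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))

cv = KFold(n_splits=10, shuffle=True, random_state=0)
scores = cross_val_score(model, X, y, cv=cv, scoring="accuracy")
print(f"10-fold accuracy: {scores.mean():.4f} +/- {scores.std():.4f}")
```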

Keywords: breast cancer, support vector classifier, machine learning, hyperparameter tuning

Procedia PDF Downloads 68
1543 A Memetic Algorithm Approach to Clustering in Mobile Wireless Sensor Networks

Authors: Masood Ahmad, Ataul Aziz Ikram, Ishtiaq Wahid

Abstract:

A wireless sensor network (WSN) is the interconnection of mobile wireless nodes with limited energy and memory. These networks can be deployed for many critical applications, such as military operations, rescue management, and fire detection. In a flat routing structure, every node plays the equal roles of sensor and router. The topology may change very frequently due to the mobile nature of the nodes in WSNs, and topology maintenance may produce more overhead messages. To avoid topology maintenance overhead messages, an optimized cluster-based mobile wireless sensor network using a memetic algorithm is proposed in this paper. The nodes in this network are first divided into clusters. The cluster leaders then transmit data to the base station. The network is validated through an extensive simulation study. The results show that the proposed technique is superior to existing techniques.
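
A toy sketch of a memetic search for cluster heads, combining a genetic loop with a greedy local-search "meme"; node positions, population size, and the fitness definition are invented for illustration and do not reproduce the paper's protocol.

```python
# Toy memetic-algorithm sketch for cluster-head selection: genetic search over
# candidate head subsets plus a greedy local-search step. All values are made up.
import math
import random

random.seed(0)
NODES = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(40)]
K = 5                                     # number of cluster heads

def fitness(heads):
    """Total distance from every node to its nearest cluster head (lower is better)."""
    return sum(min(math.dist(n, NODES[h]) for h in heads) for n in NODES)

def crossover(a, b):
    return random.sample(list(set(a) | set(b)), K)

def mutate(ind):
    ind = list(ind)
    ind[random.randrange(K)] = random.choice([i for i in range(len(NODES)) if i not in ind])
    return ind

def local_search(ind):
    """Meme: greedily try replacing each head with a better node (hill climbing)."""
    best, best_fit = list(ind), fitness(ind)
    for pos in range(K):
        for cand in random.sample(range(len(NODES)), 10):   # sampled neighborhood
            trial = list(best)
            if cand in trial:
                continue
            trial[pos] = cand
            f = fitness(trial)
            if f < best_fit:
                best, best_fit = trial, f
    return best

population = [random.sample(range(len(NODES)), K) for _ in range(20)]
for generation in range(30):
    population.sort(key=fitness)
    parents = population[:10]                               # selection
    children = [local_search(mutate(crossover(random.choice(parents),
                                              random.choice(parents))))
                for _ in range(10)]
    population = parents + children

best = min(population, key=fitness)
print("cluster heads:", sorted(best), "fitness:", round(fitness(best), 1))
```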

Keywords: WSN, routing, cluster based, meme, memetic algorithm

Procedia PDF Downloads 484
1542 Extraction of Polystyrene from Styrofoam Waste: Synthesis of Novel Chelating Resin for the Enrichment and Speciation of Cr(III)/Cr(VI) Ions in Industrial Effluents

Authors: Ali N. Siyal, Saima Q. Memon, Latif Elçi, Aydan Elçi

Abstract:

Polystyrene (PS) was extracted from Styrofoam (expanded polystyrene foam) waste, the so-called white pollutant. The PS was functionalized with the N,N-bis(2-aminobenzylidene)benzene-1,2-diamine (ABA) ligand through an azo spacer. The resin was characterized by FT-IR spectroscopy and elemental analysis. The PS-N=N-ABA resin was used for the enrichment and speciation of Cr(III)/Cr(VI) ions and for total Cr determination in aqueous samples by Flame Atomic Absorption Spectrometry (FAAS). The separation of Cr(III)/Cr(VI) ions was achieved at pH 2. Recovery of Cr(VI) ions of ≥ 95.0% was achieved at the optimum parameters: pH 2; resin amount 300 mg; flow rates of 2.0 mL min-1 for the solution and 2.0 mL min-1 for the eluent (2.0 mol L-1 HNO3). Total Cr was determined by oxidation of Cr(III) to Cr(VI) ions using H2O2. The limit of detection (LOD) and limit of quantification (LOQ) for Cr(VI) were found to be 0.40 and 1.20 μg L-1, respectively, with a preconcentration factor of 250. The total saturation and breakthrough capacities of the resin for Cr(VI) ions were found to be 0.181 and 0.531 mmol g-1, respectively. The proposed method was successfully applied for the preconcentration/speciation of Cr(III)/Cr(VI) ions and the determination of total Cr in industrial effluents.

Keywords: styrofoam waste, polymeric resin, preconcentration, speciation, Cr(III)/Cr(VI) ions, FAAS

Procedia PDF Downloads 296
1541 An ANOVA-based Sequential Forward Channel Selection Framework for Brain-Computer Interface Application based on EEG Signals Driven by Motor Imagery

Authors: Forouzan Salehi Fergeni

Abstract:

A brain-computer interface (BCI) system converts a person's movement intentions into commands for action using brain signals such as the electroencephalogram. When left- or right-hand motions are imagined, different patterns of brain activity appear, which can be employed as BCI control signals. To improve BCI systems, effective and accurate techniques for increasing the classification precision of motor imagery (MI) based on electroencephalography (EEG) are greatly needed. Subject dependency and non-stationarity are two features of EEG signals, so EEG signals must be processed effectively before being used in BCI applications. In the present study, after applying an 8-30 Hz band-pass filter, a CAR (common average reference) spatial filter is applied for denoising, and then an analysis-of-variance method is used to select the more appropriate and informative channels from a large set of channels. After ordering the channels by their efficiency, sequential forward channel selection is employed to choose just a few reliable ones. Features from the time and wavelet domains are extracted and shortlisted with the help of a statistical technique, namely the t-test. Finally, the selected features are classified with different machine learning and neural network classifiers, namely k-nearest neighbor, probabilistic neural network, support vector machine, extreme learning machine, decision tree, multi-layer perceptron, and linear discriminant analysis, with the purpose of comparing their performance in this application. Using a ten-fold cross-validation approach, tests are performed on a motor imagery dataset from BCI Competition III. The outcomes demonstrate that the SVM classifier achieved the greatest classification precision of 97% compared with the other approaches. The overall findings confirm that the suggested framework is reliable and computationally effective for the construction of BCI systems and surpasses existing methods.
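
A compact sketch of the two-stage channel selection described above: channels are ranked by ANOVA F-score and then passed to sequential forward selection wrapped around an SVM; synthetic features stand in for the filtered EEG channels of the competition dataset.

```python
# Sketch of ANOVA ranking followed by sequential forward selection with an SVM.
# Synthetic "channel" features, not the BCI competition III recordings.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import f_classif, SequentialFeatureSelector
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, n_features=22, n_informative=6,
                           random_state=0)           # 22 "channels"

F, p = f_classif(X, y)                               # ANOVA F-score per channel
ranked = np.argsort(F)[::-1][:10]                    # keep the 10 best channels
X_ranked = X[:, ranked]

sfs = SequentialFeatureSelector(SVC(kernel="rbf"), n_features_to_select=4,
                                direction="forward", cv=5)
sfs.fit(X_ranked, y)
selected = ranked[sfs.get_support()]
print("channels selected by ANOVA + forward selection:", selected)
```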

Keywords: brain-computer interface, channel selection, motor imagery, support-vector-machine

Procedia PDF Downloads 52
1540 Rapid Detection of MBL Genes by SYBR Green Based Real-Time PCR

Authors: Taru Singh, Shukla Das, V. G. Ramachandran

Abstract:

Objectives: To develop a SYBR green based real-time PCR assay to detect carbapenemase (NDM, IMP) genes in E. coli. Methods: A total of 40 E. coli isolates from stool samples were tested. Six were previously characterized as resistant to carbapenems and documented by PCR. The remaining 34 isolates previously tested susceptible to carbapenems and were negative for these genes. Bacterial RNA was extracted using a manual method. Real-time PCR was performed using the Light Cycler III 480 instrument (Roche), and specific primers for each carbapenemase target were used. Results: Each of the two carbapenemase genes tested presented a different melting curve after PCR amplification. The melting temperature (Tm) analysis of the amplicons identified was as follows: blaIMP type (Tm 82.18°C), blaNDM-1 (Tm 78.8°C). No amplification was detected among the negative samples. The results showed 100% concordance with the genotypes previously identified. Conclusions: The new assay was able to detect the presence of two different carbapenemase gene types by real-time PCR.

Keywords: resistance, b-lactamases, E. coli, real-time PCR

Procedia PDF Downloads 411
1539 Water Leakage Detection System of Pipeline Using Radial Basis Function Neural Network

Authors: A. Ejah Umraeni Salam, M. Tola, M. Selintung, F. Maricar

Abstract:

Clean water is an essential and fundamental human need; therefore, its supply must be assured by maintaining its quality, quantity, and pressure. In practice, however, leakage occurs in distribution systems and has become a common issue worldwide. One of the technical causes of leakage is a leaking pipe. The purpose of this research is to use a Radial Basis Function Neural Network (RBFNN) model to detect the location and magnitude of pipeline leakage rapidly and efficiently. In this study, the RBFNN is trained and tested on data from the EPANET hydraulic modeling system. The RBFNN method proved capable of detecting the location and magnitude of pipeline leakage: based on the RMSE (Root Mean Square Error) between predictions and actual measurements, the prediction accuracy approaches 0.000049 for the whole pipeline system.
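
A minimal RBF-network sketch in the spirit of the approach above: K-means picks the basis centers and the output layer is solved by linear least squares; the inputs and leak magnitudes are synthetic, not EPANET output.

```python
# Minimal RBF-network sketch: K-means centers + least-squares output weights.
# The "node pressure" inputs and leak magnitudes are synthetic, not EPANET data.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(400, 6))                    # e.g. pressures at 6 nodes
y = np.sin(3 * X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.05 * rng.normal(size=400)  # leak size

def rbf_features(X, centers, width):
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2 * width ** 2))               # Gaussian basis functions

centers = KMeans(n_clusters=20, n_init=10, random_state=0).fit(X).cluster_centers_
width = 0.3                                              # assumed basis width
Phi = rbf_features(X, centers, width)
Phi = np.hstack([Phi, np.ones((len(X), 1))])             # bias column
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)              # output-layer weights

pred = Phi @ w
rmse = np.sqrt(np.mean((pred - y) ** 2))
print(f"training RMSE: {rmse:.5f}")
```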

Keywords: radial basis function neural network, leakage pipeline, EPANET, RMSE

Procedia PDF Downloads 360
1538 Learning Fashion Construction and Manufacturing Methods from the Past: Cultural History and Genealogy at the Middle Tennessee State University Historic Clothing Collection

Authors: Teresa B. King

Abstract:

In the millennial age, with more students desiring a fashion major yet fewer having sewing and manufacturing knowledge, the demand on academicians to educate adequately has increased. While fashion museums have a prominent place in historical preservation, working collections of handmade or mass-manufactured apparel for teaching are lacking in most universities in the United States, especially in the Southern region. Created in 1988, Middle Tennessee State University's historic clothing collection provides opportunities to study apparel construction methods throughout history, to compare and apply them to today's construction and manufacturing methods, and to learn the cyclical nature and importance of historic styles for current and upcoming fashion. In 2019, a class exercise experiment was implemented in which students researched their family genealogy using Ancestry.com, identified the oldest visual media (photographs, etc.) available, and analyzed the garment represented in said media. Each student then located a comparable garment in the historic collection and evaluated the construction methods of the ancestor's time period. A class 'fashion' genealogy tree was created and mounted for public viewing/education. Results of this exercise indicated that student learning increased due to the personal/familial connection, as it triggered more interest in historical garments related to the student's own culture. Students better identified garments with regard to historical time period, fiber content, fabric, and the construction methods utilized, thus increasing learning and retention. Students also developed increased recognition of custom construction methods versus current mass manufacturing techniques, which impact today's fashion industry. A longitudinal effort will continue with the growth of the historic collection and as students continue to utilize the historic clothing collection.

Keywords: ancestry, clothing history, fashion history, genealogy, historic fashion museum collection

Procedia PDF Downloads 138
1537 Molecular Diversity of Forensically Relevant Insects from the Cadavers of Lahore

Authors: Sundus Mona, Atif Adnan, Babar Ali, Fareeha Arshad, Allah Rakha

Abstract:

Molecular diversity is the variation in the abundance of species. Forensic entomology is a neglected field in Pakistan. Insects collected from a crime scene should be handled by forensic entomologists, who are currently virtually non-existent in Pakistan. Correct identification of insect specimens, along with knowledge of their biodiversity, can aid in solving many problems related to complicated forensic cases. Inadequate morphological identification and insufficient thermal biology studies limit the entomological utility in forensic medicine. Recently, molecular identification of entomological evidence has gained attention globally. DNA barcoding is the latest and an established method for species identification. Only proper identification can provide a precise estimation of the postmortem interval. Arthropods are known to be the first visitors scavenging on decomposing dead matter. The objective of the proposed study was to identify species by molecular techniques and analyze their phylogenetic importance with barcoded necrophagous insect species of early succession on human cadavers. Based upon this identification, the study outcome will be the utilization of established DNA barcodes to identify carrion-feeding insect species for concordant estimation of the postmortem interval. A molecular identification method involving sequencing of a 658 bp 'barcode' fragment of the mitochondrial cytochrome oxidase subunit 1 (CO1) gene from collected specimens of unknown dipteran species from cadavers in Lahore was evaluated. Nucleotide sequence divergences were calculated using MEGA 7 and Arlequin, and a neighbor-joining phylogenetic tree was generated. Three species were identified, Chrysomya megacephala, Chrysomya saffranea, and Chrysomya rufifacies, with low genetic diversity. The fixation index was 0.83992, which suggests a need for further studies to identify and classify forensically relevant insects in Pakistan. There is an urgent demand for further research, especially when immature forms of arthropods are recovered from a crime scene.
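
For readers who prefer a scriptable route, the sketch below builds a neighbor-joining tree from an aligned set of COI barcodes with Biopython; the alignment file name and format are hypothetical, and the study itself used MEGA 7 rather than this code.

```python
# Sketch of a neighbor-joining tree from aligned COI barcode sequences with
# Biopython. The alignment file "coi_barcodes.aln" is a hypothetical input.
from Bio import AlignIO, Phylo
from Bio.Phylo.TreeConstruction import DistanceCalculator, DistanceTreeConstructor

alignment = AlignIO.read("coi_barcodes.aln", "clustal")   # aligned 658 bp fragments
dm = DistanceCalculator("identity").get_distance(alignment)
tree = DistanceTreeConstructor().nj(dm)                   # neighbor-joining tree
Phylo.draw_ascii(tree)
```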

Keywords: molecular diversity, DNA barcoding, species identification, forensically relevant

Procedia PDF Downloads 150
1536 Machine Learning Automatic Detection on Twitter Cyberbullying

Authors: Raghad A. Altowairgi

Abstract:

With the widespread use of social media platforms, young people tend to use them extensively as their first means of communication due to their ease and modernity. These platforms, however, often create a fertile ground for bullies to practice aggressive behavior against their victims. Platform usage cannot be reduced, but intelligent mechanisms can be implemented to reduce the abuse. This is where machine learning comes in: understanding and classifying text can help minimize acts of cyberbullying. Artificial intelligence techniques have expanded to provide an applied tool for addressing the phenomenon of cyberbullying. In this research, machine learning models are built to classify text into two classes, cyberbullying and non-cyberbullying. The data are preprocessed in four stages: removing characters that do not provide meaningful information to the models, tokenization, removing stop words, and lowercasing the text. BoW and TF-IDF are used as the main features for the five classifiers, which are logistic regression, Naïve Bayes, Random Forest, XGBoost, and CatBoost. They score 92%, 90%, 92%, 91%, and 86% accuracy, respectively.
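
A small sketch of the preprocessing and TF-IDF plus logistic-regression stage described above; the example texts and labels are invented placeholders, not the study's data, and the other four classifiers are omitted for brevity.

```python
# Sketch of cleaning + TF-IDF + logistic regression on invented placeholder
# texts; not the study's dataset or its full set of five classifiers.
import re
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

def clean(text):
    text = text.lower()                                  # lowercase
    text = re.sub(r"http\S+|@\w+|#", " ", text)          # strip URLs, mentions, '#'
    return re.sub(r"[^a-z\s]", " ", text)                # drop non-letter characters

texts = ["You are so stupid, nobody likes you",          # placeholder examples
         "Great game last night, congrats to the team!",
         "Shut up loser, go away",
         "Looking forward to the weekend with friends"]
labels = [1, 0, 1, 0]                                     # 1 = cyberbullying

model = make_pipeline(
    TfidfVectorizer(preprocessor=clean, stop_words="english"),  # tokenize + TF-IDF
    LogisticRegression(max_iter=1000),
)
model.fit(texts, labels)
print(model.predict(["you are so stupid and ugly"]))      # likely [1] on this toy data
```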

Keywords: cyberbullying, machine learning, Bag-of-Words, term frequency-inverse document frequency, natural language processing, Catboost

Procedia PDF Downloads 132
1535 Application of Advanced Remote Sensing Data in Mineral Exploration in the Vicinity of Heavy Dense Forest Cover Area of Jharkhand and Odisha State Mining Area

Authors: Hemant Kumar, R. N. K. Sharma, A. P. Krishna

Abstract:

The study was carried out on the Saranda area in Jharkhand and a part of Odisha state. Geospatial data from Hyperion, a hyperspectral remote sensing satellite, have been used. The study applied a wide variety of image processing techniques to enhance and extract the mining classes of Fe and Mn ores. Landsat-8 OLI sensor data have also been used to correctly explore the related minerals. In this way, various processes were applied to enhance the mineralogy classes, and a comparative evaluation with the related frequencies was carried out. The Hyperion hyperspectral dataset has been verified as an effective tool for mineral and rock information extraction within the shortwave infrared band range used. The abundant spatial and spectral information contained in hyperspectral images enables the differentiation of objects for targeted exploration applications such as mineral detection and mining.

Keywords: Hyperion, hyperspectral, sensor, Landsat-8

Procedia PDF Downloads 125
1534 Detection of Arterial Stiffness in Diabetes Using Photoplethysmograph

Authors: Neelamshobha Nirala, R. Periyasamy, Awanish Kumar

Abstract:

Diabetes is a metabolic disorder, and with the increase in the global prevalence of diabetes, cardiovascular diseases and mortality related to diabetes have also increased. Diabetes increases arterial stiffness through elusive hormonal and metabolic abnormalities. We used the photoplethysmograph (PPG), a simple non-invasive method, to study the change in arterial stiffness due to diabetes. Toe PPG signals were taken from 29 diabetic subjects with a mean age of 65±8.4 years and 21 non-diabetic subjects with a mean age of 49±14 years. The mean duration of diabetes was 12±8 years for the diabetic group. Rise time (RT) and area under the rise time (AUR) were calculated from the PPG signal of each subject, and Welch's t-test was used to test for a significant difference between the two groups. We obtained significant differences (p-values) of 0.0005 and 0.03 for RT and AUR, respectively, between diabetic and non-diabetic subjects. The average RT and AUR were 0.298±0.003 msec and 14.4±4.2 arbitrary units, respectively, for diabetic subjects, compared to 0.277±0.0005 msec and 13.66±2.3 a.u., respectively, for non-diabetic subjects. In conclusion, this study supports that arterial stiffness is increased in diabetes and can be detected early using PPG.
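
A brief sketch of the Welch's t-test comparison reported above, run on made-up rise-time samples rather than the study's measurements.

```python
# Sketch of the Welch's t-test on made-up rise-time values (not the study's data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
rt_diabetic = rng.normal(loc=0.298, scale=0.003, size=29)   # RT, diabetic group
rt_control = rng.normal(loc=0.277, scale=0.0005, size=21)   # RT, non-diabetic group

t_stat, p_value = stats.ttest_ind(rt_diabetic, rt_control, equal_var=False)  # Welch
print(f"t = {t_stat:.2f}, p = {p_value:.2g}")
```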

Keywords: area under rise-time, AUR, arterial stiffness, diabetes, photoplethysmograph, PPG, rise-time (RT)

Procedia PDF Downloads 260