Search results for: predictive accuracy
2969 Prevalence of Thyroid Disorders in Pregnancy in Northern Algeria
Authors: Samira Akdader-Oudahmane, Assia Kamel, Lynda Lakabi, Michael Bruce Zimmermann, Zohra Hamouli-Said, Djamila Meskine
Abstract:
Background: Iodine is a trace element whose adequate intake is essential during pregnancy to promote the correct growth and development of the fetus. Iodine deficiency causes several disorders of foetal development, and thyroid disorders during pregnancy are associated with an increased risk of miscarriage or premature birth. The aim of this study was to assess the iodine status and thyroid function of pregnant women (PW) in northern Algeria. Methods: Healthy PW were recruited from an urban area (Algiers). Spot urine and venous blood samples were collected to assess iodine status (urinary iodine concentration, UIC) and serum thyroid hormone (TSH, FT4) and anti-thyroid peroxidase antibody (TPO-Ab) concentrations. Results: The median UIC for the PW (n=172) in Algiers was 246.74 µg/L, 244.68 µg/L, and 220.63 µg/L, respectively, during the first, second, and third trimesters of pregnancy. Mean TSH and FT4 concentrations were within reference ranges in all groups of women. Among PW, 72.7%, 75.4%, and 75.5% in the first, second, and third trimesters were TPO-Ab+. Among TPO-Ab+ PW, 14%, 10%, and 10% in the first, second, and third trimesters, respectively, had subclinical hypothyroidism. Variations in the serum parameters (FT4, TSH, and TPO-Ab) were analyzed according to the UIC intervals and show that these markers are predictive of thyroid function. Conclusion: In northern Algeria, median UICs indicate iodine sufficiency in PW. About 75% of PW are TPO-Ab+, and the prevalence of subclinical hypothyroidism is high. Keywords: thyroid, pregnant woman, urinary iodine, subclinical hypothyroidism
Procedia PDF Downloads 79
2968 A Stochastic Model to Predict Earthquake Ground Motion Duration Recorded in Soft Soils Based on Nonlinear Regression
Authors: Issam Aouari, Abdelmalek Abdelhamid
Abstract:
For seismologists, the characterization of seismic demand should include both the amplitude and the duration of strong shaking in the system. The duration of ground shaking is one of the key parameters in earthquake-resistant design of structures. This paper proposes a nonlinear statistical model to estimate earthquake ground motion duration in soft soils using multiple seismicity indicators. Three definitions of ground motion duration proposed in the literature have been applied. Through a comparative study, we select the most significant definition to use for predicting the duration. A stochastic model is presented for the McCann and Shah method using nonlinear regression analysis based on a data set of moment magnitude, source-to-site distance, and site conditions. The data set is taken from the PEER strong motion databank and contains shallow earthquakes from different regions of the world, including America, Turkey, London, China, Italy, Chile, and Mexico. Main emphasis is placed on soft site conditions. The predictive relationship has been developed based on 600 records and three input indicators. Results have been compared with other published models. It has been found that the proposed model can predict earthquake ground motion duration in soft soils for different regions and site conditions. Keywords: duration, earthquake, prediction, regression, soft soil
Procedia PDF Downloads 153
2967 Artificial Intelligence Features in Canva
Authors: Amira Masood, Zainah Alshouri, Noor Bantan, Samira Kutbi
Abstract:
Artificial intelligence is continuously becoming more advanced and widespread, and it is present in many areas of our day-to-day lives as a means of assistance in numerous fields. A growing number of people, companies, and corporations are utilizing Canva and its AI tools as a method of quick and easy media production. Hence, in order to test the integrity of this rapid growth of AI, this paper explores the usefulness of Canva's advanced design features as well as their accuracy, determining user satisfaction through a survey-based research approach and investigating whether AI is successful enough to eliminate the need for human alterations. Keywords: artificial intelligence, canva, features, users, satisfaction
Procedia PDF Downloads 106
2966 The Effect of Artificial Intelligence on the Production of Agricultural Lands and Labor
Authors: Ibrahim Makram Ibrahim Salib
Abstract:
Agriculture plays an essential role in providing food for the world's population. It also offers numerous benefits to countries, including non-food products, transportation, and environmental balance. Precision agriculture, which employs advanced tools to monitor variability and manage inputs, can help achieve these benefits. The increasing demand for food security puts pressure on decision-makers to ensure sufficient food production worldwide. To support sustainable agriculture, unmanned aerial vehicles (UAVs) can be utilized to manage farms and increase yields. This paper aims to provide an understanding of UAV usage and its applications in agriculture. The objective is to review the various applications of UAVs in agriculture. Based on a comprehensive review of existing research, it was found that different sensors provide varying analyses for agriculture applications. Therefore, the purpose of the project must be determined before using UAV technology for better data quality and analysis. In conclusion, identifying a suitable sensor and UAV is crucial to gather accurate data and precise analysis when using UAVs in agriculture. Keywords: agriculture land, agriculture land loss, Kabul city, urban land expansion, urbanization agriculture yield growth, agriculture yield prediction, explorative data analysis, predictive models, regression models drone, precision agriculture, farmer income
Procedia PDF Downloads 74
2965 Langerian Mindfulness and School Manager’s Competencies: A Comprehensive Model in Khorasan Razavi Educational Province
Authors: Reza Taherian, Naziasadat Naseri, Elham Fariborzi, Faride Hashmiannejad
Abstract:
Effective management plays a crucial role in the success of educational institutions and training organizations. This study aims to develop and validate a professional competency model for managers in the education and training sector of Khorasan Razavi Province using a mindfulness approach based on Langerian theory. Employing a mixed exploratory design, the research involved qualitative data collection from experts and top national and provincial managers, as well as quantitative data collection using a researcher-developed questionnaire. The findings revealed that 81% of the competency of education and training managers is influenced by the dimensions of Langerian mindfulness, including engagement, seeking, producing, and flexibility. These dimensions were found to be predictive of the competencies of education and training managers, which encompass specialized knowledge, professional skills, pedagogical knowledge, commitment to Islamic values, personal characteristics, and creativity. This research provides valuable insights into the essential role of mindfulness in shaping the competencies of education and training managers, shedding light on the specific dimensions that significantly contribute to managerial success in Khorasan Razavi Province. Keywords: school managers, school manager’s competencies, mindfulness, Langerian mindfulness
Procedia PDF Downloads 54
2964 Identification of Potential Predictive Biomarkers for Early Diagnosis of Preeclampsia: Growth Factors to microRNAs
Authors: Sadia Munir
Abstract:
Preeclampsia contributes to worldwide maternal mortality with approximately 100,000 deaths a year. It complicates about 10% of all pregnancies and is the leading cause of maternal admission to intensive care units. Predicting preeclampsia is a major challenge in obstetrics. More importantly, no major progress has been achieved in the treatment of preeclampsia. As the placenta is the main cause of the disease, the only way to treat it is to deliver the placenta and the baby. In developed countries, the cost of an average case of preeclampsia is estimated at £9000. Interestingly, preeclampsia may have an impact on the health of mother or infant beyond the pregnancy. We performed a systematic search of PubMed including combinations of terms such as preeclampsia, biomarkers, treatment, hypoxia, inflammation, oxidative stress, vascular endothelial growth factor A, activin A, inhibin A, placental growth factor, transforming growth factor β-1, Nodal, placenta, trophoblast cells, and microRNAs. In this review, we have summarized current knowledge on the identification of potential biomarkers for the diagnosis of preeclampsia. Although these studies show promising data on the early diagnosis of preeclampsia, the current value of these factors as biomarkers, for the precise prediction of preeclampsia, has its limitations. Therefore, future studies need to be done to support some of the very promising and interesting data and to develop affordable and widely available tests for early detection and treatment of preeclampsia. Keywords: activin, biomarkers, growth factors, microRNA
Procedia PDF Downloads 442
2963 Fourier Transform and Machine Learning Techniques for Fault Detection and Diagnosis of Induction Motors
Authors: Duc V. Nguyen
Abstract:
Induction motors are widely used in different industry areas and can experience various kinds of faults in stators and rotors. In general, fault detection and diagnosis techniques for induction motors can be supported by measuring quantities such as noise, vibration, and temperature. The installation of mechanical sensors in order to assess the health condition of a machine is typically only done for expensive or load-critical machines, where the high cost of a continuous monitoring system can be justified. Nevertheless, induced current monitoring can be implemented inexpensively on machines of arbitrary size by using current transformers. In this regard, effective and low-cost fault detection techniques can be implemented, hence reducing the maintenance and downtime costs of motors. This work proposes a method for fault detection and diagnosis of induction motors which combines the classical fast Fourier transform with modern machine learning techniques. The proposed method is validated on real-world data and achieves a precision of 99.7% for fault detection and 100% for fault classification with minimal expert knowledge requirements. In addition, this approach allows users to balance risks and maintenance costs to achieve the highest benefit based on their requirements. These are the key requirements of a robust prognostics and health management system. Keywords: fault detection, FFT, induction motor, predictive maintenance
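The FFT front end of such a pipeline can be sketched as follows. This is a hypothetical illustration, not the authors' code: it simulates one second of motor current with a small fault sideband, and extracts the spectral features (peak frequency, sideband-to-fundamental ratio) that a classifier would consume. The sampling rate and sideband frequency are assumptions.

```python
# Hypothetical sketch: FFT feature extraction from a motor current signal
# as a front end for a fault classifier. Signal and fault sideband simulated.
import numpy as np

fs = 10_000                      # sampling rate, Hz (assumed)
t = np.arange(0, 1.0, 1 / fs)    # 1 s of signal
# Healthy 50 Hz supply component plus a small simulated fault sideband at 35 Hz
current = np.sin(2 * np.pi * 50 * t) + 0.05 * np.sin(2 * np.pi * 35 * t)

spectrum = np.abs(np.fft.rfft(current)) / len(t)
freqs = np.fft.rfftfreq(len(t), d=1 / fs)

# The dominant spectral peak should sit at the 50 Hz supply frequency
peak_freq = freqs[np.argmax(spectrum)]
# Sideband amplitude relative to the fundamental is a typical fault feature
fundamental = spectrum[np.argmin(np.abs(freqs - 50))]
sideband = spectrum[np.argmin(np.abs(freqs - 35))]
ratio = sideband / fundamental
```

With a one-second window the FFT bin spacing is exactly 1 Hz, so both components fall on exact bins and the features are leakage-free; real current signatures would need windowing.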
Procedia PDF Downloads 170
2962 Design of an Automated Deep Learning Recurrent Neural Networks System Integrated with IoT for Anomaly Detection in Residential Electric Vehicle Charging in Smart Cities
Authors: Wanchalerm Patanacharoenwong, Panaya Sudta, Prachya Bumrungkun
Abstract:
The paper focuses on the development of a system that combines Internet of Things (IoT) technologies and deep learning algorithms for anomaly detection in residential Electric Vehicle (EV) charging in smart cities. With the increasing number of EVs, ensuring efficient and reliable charging systems has become crucial. The aim of this research is to develop an integrated IoT and deep learning system for detecting anomalies in residential EV charging and enhancing EV load profiling and event detection in smart cities. The approach utilizes IoT devices equipped with infrared cameras to collect thermal images and household EV charging profiles from the database of the Thailand utility, subsequently transmitting these data to a cloud database for comprehensive analysis. The methodology includes advanced deep learning techniques such as Recurrent Neural Network (RNN) and Long Short-Term Memory (LSTM) algorithms, and feature-based Gaussian mixture models are used for EV load profiling and event detection. The research findings demonstrate the effectiveness of the developed system in detecting anomalies and critical profiles in EV charging behavior. The system provides timely alarms to users regarding potential issues and categorizes the severity of detected problems based on a health index for each charging device. It also outperforms existing models in event detection accuracy. This research contributes to the field by showcasing the potential of integrating IoT and deep learning techniques in managing residential EV charging in smart cities.
This comprehensive method aids in identifying unique power consumption patterns among EV owners. In summary, the research concludes that integrating IoT and deep learning techniques can effectively detect anomalies in residential EV charging and enhance EV load profiling and event detection accuracy. The developed system ensures operational safety and efficiency, contributing to sustainable energy management in smart cities. Keywords: cloud computing framework, recurrent neural networks, long short-term memory, IoT, EV charging, smart grids
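The Gaussian scoring behind such profiling can be illustrated with a deliberately simplified single-component sketch (the paper uses Gaussian mixtures and deep networks; the data, threshold, and kilowatt values below are assumptions): fit a Gaussian to normal charging power and flag low-likelihood readings.

```python
# Hypothetical simplified sketch: score EV charging-power samples against a
# single Gaussian model of normal behaviour; flag low-likelihood readings.
import numpy as np

rng = np.random.default_rng(0)
normal_power = rng.normal(loc=7.2, scale=0.3, size=1000)  # kW, simulated normal charging

mu, sigma = normal_power.mean(), normal_power.std()

def anomaly_score(x):
    """Negative log-likelihood under the fitted Gaussian (higher = more anomalous)."""
    return 0.5 * ((x - mu) / sigma) ** 2 + np.log(sigma * np.sqrt(2 * np.pi))

threshold = anomaly_score(mu + 4 * sigma)   # flag beyond ~4 standard deviations
readings = np.array([7.1, 7.3, 11.0])       # last value simulates a charging fault
flags = anomaly_score(readings) > threshold
```

A mixture model generalizes this to several normal charging modes (e.g. slow and fast charging) by scoring against the best-fitting component instead of a single Gaussian.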
Procedia PDF Downloads 64
2961 The Unscented Kalman Filter Implementation for the Sensorless Speed Control of a Permanent Magnet Synchronous Motor
Authors: Justas Dilys
Abstract:
This paper addresses the implementation and optimization of an Unscented Kalman Filter (UKF) for Permanent Magnet Synchronous Motor (PMSM) sensorless control using an ARM Cortex-M3 microcontroller. Various optimization levels based on arithmetic calculation reduction were implemented on the ARM Cortex-M3 microcontroller. The execution time of the UKF estimator was up to 90 µs without loss of accuracy. Moreover, simulation studies on the Unscented Kalman filter were carried out using Matlab to explore the usability of the UKF in a sensorless PMSM drive. Keywords: unscented Kalman filter, ARM, PMSM, implementation
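The core of any UKF is the unscented transform: propagating a Gaussian through a nonlinear function via deterministically chosen sigma points. A minimal scalar sketch (not the paper's embedded C implementation; the scaling parameters are the usual textbook choices, assumed here) looks like this:

```python
# Hypothetical sketch: the unscented transform at the core of a UKF,
# propagating a 1-D Gaussian through a nonlinear function via sigma points.
import numpy as np

def unscented_transform(mu, var, f, alpha=1e-3, kappa=0.0):
    n = 1                                   # state dimension (scalar case)
    lam = alpha**2 * (n + kappa) - n
    spread = np.sqrt((n + lam) * var)
    sigma_pts = np.array([mu, mu + spread, mu - spread])
    wm = np.array([lam / (n + lam), 0.5 / (n + lam), 0.5 / (n + lam)])
    y = f(sigma_pts)                        # propagate sigma points
    return np.dot(wm, y)                    # weighted mean of transformed points

# For f(x) = x^2 and x ~ N(0, 1), the true mean of f(x) is exactly 1.0,
# and the unscented transform recovers it from only three sigma points.
est = unscented_transform(0.0, 1.0, lambda x: x**2)
```

Because the sigma points match the mean and covariance exactly, the transform is accurate to second order, which is why a UKF avoids the Jacobians an extended Kalman filter needs and suits fixed-cost microcontroller implementations.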
Procedia PDF Downloads 167
2960 Phenotype Prediction of DNA Sequence Data: A Machine and Statistical Learning Approach
Authors: Mpho Mokoatle, Darlington Mapiye, James Mashiyane, Stephanie Muller, Gciniwe Dlamini
Abstract:
Great advances in high-throughput sequencing technologies have resulted in the availability of huge amounts of sequencing data in public and private repositories, enabling a holistic understanding of complex biological phenomena. Sequence data are used for a wide range of applications such as gene annotation, expression studies, personalized treatment, and precision medicine. However, this rapid growth in sequence data poses a great challenge which calls for novel data processing and analytic methods, as well as huge computing resources. In this work, a machine and statistical learning approach for DNA sequence classification based on a k-mer representation of sequence data is proposed. The approach is tested using whole genome sequences of Mycobacterium tuberculosis (MTB) isolates to (i) reduce the size of genomic sequence data, (ii) identify an optimum size of k-mers and utilize it to build classification models, (iii) predict the phenotype from whole genome sequence data of a given bacterial isolate, and (iv) demonstrate computing challenges associated with the analysis of whole genome sequence data in producing interpretable and explainable insights. The classification models were trained on 104 whole genome sequences of MTB isolates. Cluster analysis showed that k-mers may be used to discriminate phenotypes, and the discrimination becomes more concise as the size of k-mers increases. The best performing classification model had a k-mer size of 10 (the longest k-mer), with an accuracy, recall, precision, specificity, and Matthews correlation coefficient of 72.0%, 80.5%, 80.5%, 63.6%, and 0.4, respectively. This study provides a comprehensive approach for resampling whole genome sequencing data, objectively selecting a k-mer size, and performing classification for phenotype prediction.
The analysis also highlights the importance of increasing the k-mer size to produce more biologically explainable results, which brings to the fore the interplay that exists amongst accuracy, computing resources, and explainability of classification results. However, the analysis provides a new way to elucidate genetic information from genomic data and identify phenotype relationships, which are important especially in explaining complex biological mechanisms. Keywords: AWD-LSTM, bootstrapping, k-mers, next generation sequencing
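The k-mer representation underlying this kind of classifier can be sketched in a few lines (a hypothetical illustration, not the authors' pipeline; the sequences are toy examples): slide a window of length k over the sequence and count each substring.

```python
# Hypothetical sketch: turning a DNA sequence into a k-mer count vector,
# the representation on which classification models can be trained.
from collections import Counter

def kmer_counts(sequence, k):
    """Count all overlapping k-mers in a DNA sequence."""
    return Counter(sequence[i:i + k] for i in range(len(sequence) - k + 1))

seq = "ATGCGATGCA"
counts = kmer_counts(seq, 3)
# "ATG" occurs at positions 0 and 5, so its count is 2
```

The size trade-off in the abstract follows directly from this representation: larger k gives more specific, more biologically interpretable features, but the feature space grows as 4^k, driving up the computing cost.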
Procedia PDF Downloads 167
2959 Phenotype Prediction of DNA Sequence Data: A Machine and Statistical Learning Approach
Authors: Darlington Mapiye, Mpho Mokoatle, James Mashiyane, Stephanie Muller, Gciniwe Dlamini
Abstract:
Great advances in high-throughput sequencing technologies have resulted in the availability of huge amounts of sequencing data in public and private repositories, enabling a holistic understanding of complex biological phenomena. Sequence data are used for a wide range of applications such as gene annotation, expression studies, personalized treatment, and precision medicine. However, this rapid growth in sequence data poses a great challenge which calls for novel data processing and analytic methods, as well as huge computing resources. In this work, a machine and statistical learning approach for DNA sequence classification based on a k-mer representation of sequence data is proposed. The approach is tested using whole genome sequences of Mycobacterium tuberculosis (MTB) isolates to (i) reduce the size of genomic sequence data, (ii) identify an optimum size of k-mers and utilize it to build classification models, (iii) predict the phenotype from whole genome sequence data of a given bacterial isolate, and (iv) demonstrate computing challenges associated with the analysis of whole genome sequence data in producing interpretable and explainable insights. The classification models were trained on 104 whole genome sequences of MTB isolates. Cluster analysis showed that k-mers may be used to discriminate phenotypes, and the discrimination becomes more concise as the size of k-mers increases. The best performing classification model had a k-mer size of 10 (the longest k-mer), with an accuracy, recall, precision, specificity, and Matthews correlation coefficient of 72.0%, 80.5%, 80.5%, 63.6%, and 0.4, respectively. This study provides a comprehensive approach for resampling whole genome sequencing data, objectively selecting a k-mer size, and performing classification for phenotype prediction.
The analysis also highlights the importance of increasing the k-mer size to produce more biologically explainable results, which brings to the fore the interplay that exists amongst accuracy, computing resources, and explainability of classification results. However, the analysis provides a new way to elucidate genetic information from genomic data and identify phenotype relationships, which are important especially in explaining complex biological mechanisms. Keywords: AWD-LSTM, bootstrapping, k-mers, next generation sequencing
Procedia PDF Downloads 159
2958 Examining the Effects of Increasing Lexical Retrieval Attempts in Tablet-Based Naming Therapy for Aphasia
Authors: Jeanne Gallee, Sofia Vallila-Rohter
Abstract:
Technology-based applications are increasingly being utilized in aphasia rehabilitation as a means of increasing intensity of treatment and improving accessibility to treatment. These interactive therapies, often available on tablets, lead individuals to complete language and cognitive rehabilitation tasks that draw upon skills such as the ability to name items, recognize semantic features, count syllables, rhyme, and categorize objects. Tasks involve visual and auditory stimulus cues and provide feedback about the accuracy of a person’s response. Research has begun to examine the efficacy of tablet-based therapies for aphasia, yet much remains unknown about how individuals interact with these therapy applications. Thus, the current study aims to examine the efficacy of a tablet-based therapy program for anomia, further examining how strategy training might influence the way that individuals with aphasia engage with and benefit from therapy. Individuals with aphasia are enrolled in one of two treatment paradigms: traditional therapy or strategy therapy. For ten weeks, all participants receive 2 hours of weekly in-house therapy using Constant Therapy, a tablet-based therapy application. Participants are provided with iPads and are additionally encouraged to work on therapy tasks for one hour a day at home (home logins). For those enrolled in traditional therapy, in-house sessions involve completing therapy tasks while a clinician researcher is present. For those enrolled in the strategy training group, in-house sessions focus on limiting cue use in order to maximize lexical retrieval attempts and naming opportunities. The strategy paradigm is based on the principle that retrieval attempts may foster long-term naming gains. Data have been collected from 7 participants with aphasia (3 in the traditional therapy group, 4 in the strategy training group). 
We examine cue use, latency of responses, and accuracy through the course of therapy, comparing results across group and setting (in-house sessions vs. home logins). Keywords: aphasia, speech-language pathology, traumatic brain injury, language
Procedia PDF Downloads 203
2957 Topographic Characteristics Derived from UAV Images to Detect Ephemeral Gully Channels
Authors: Recep Gundogan, Turgay Dindaroglu, Hikmet Gunal, Mustafa Ulukavak, Ron Bingner
Abstract:
A majority of total soil losses in agricultural areas can be attributed to ephemeral gullies caused by heavy rains in conventionally tilled fields; however, ephemeral gully erosion is often ignored in conventional soil erosion assessments. Ephemeral gullies are easily filled in by normal soil tillage operations, which makes capturing existing ephemeral gullies in croplands difficult. This study was carried out to determine topographic features, including slope, aspect, and compound topographic index (CTI), and the initiation points of gully channels, using images obtained from an unmanned aerial vehicle (UAV). The study area was located in the Topçu stream watershed in the eastern Mediterranean Region, where intense rainfall events occur over very short time periods. The slope varied between 0.7 and 99.5%, and the average slope was 24.7%. The UAV (a multi-propeller hexacopter) was used as the carrier platform, and images were obtained with the RGB camera mounted on it. The digital terrain models (DTM) of the Topçu stream micro-catchment produced using UAV images and manual field Global Positioning System (GPS) measurements were compared to assess the accuracy of the UAV-based measurements. Eighty-one gully channels were detected in the study area. The mean slope and CTI values in the micro-catchment obtained from DTMs generated using UAV images were 19.2% and 3.64, respectively, and both slope and CTI values were lower than those obtained using GPS measurements. The total length and volume of the gully channels were 868.2 m and 5.52 m³, respectively. Topographic characteristics and information on ephemeral gully channels (location of initial point, volume, and length) were estimated with high accuracy using the UAV images.
The results reveal that UAV-based measuring techniques can be used in lieu of existing GPS and total station techniques by using images obtained with high-resolution UAVs. Keywords: aspect, compound topographic index, digital terrain model, initial gully point, slope, unmanned aerial vehicle
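The first step from a DTM to the topographic attributes above (slope, aspect, CTI) is a finite-difference gradient over the elevation grid. A hypothetical sketch, not the authors' workflow, using a synthetic tilted-plane DEM with an assumed 1 m cell size:

```python
# Hypothetical sketch: deriving slope and aspect rasters from a DEM grid
# with finite differences. The DEM is a synthetic plane for illustration.
import numpy as np

cellsize = 1.0                                  # metres per DEM cell (assumed)
x, y = np.meshgrid(np.arange(50), np.arange(50))
dem = 0.5 * x                                   # plane rising 0.5 m per cell in x

dz_dy, dz_dx = np.gradient(dem, cellsize)       # note: rows first, then columns
slope_deg = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
aspect_deg = np.degrees(np.arctan2(-dz_dy, dz_dx))  # one common aspect convention

# A uniform plane with gradient 0.5 has slope arctan(0.5) ≈ 26.57° everywhere
```

A CTI-style index would then combine the upslope contributing area with this slope raster; flow-accumulation routing is the part that dedicated terrain-analysis packages add on top of this gradient step.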
Procedia PDF Downloads 114
2956 A Longitudinal Study of Psychological Capital, Parent-Child Relationships, and Subjective Well-Beings in Economically Disadvantaged Adolescents
Authors: Chang Li-Yu
Abstract:
Purposes: The present research focuses on exploring the latent growth model of psychological capital in disadvantaged adolescents and assessing its relationship with subjective well-being. Methods: A longitudinal study design was utilized, and the data came from the Taiwan Database of Children and Youth in Poverty (TDCYP), using the student questionnaires from 2009, 2011, and 2013. Data analysis was conducted using both univariate and multivariate latent growth curve models. Results: This study finds that: (1) the initial state and growth rate of individual factors such as parent-child relationships, psychological capital, and subjective well-being in economically disadvantaged adolescents have a predictive impact; (2) there are positive interactive effects in the development among factors like parent-child relationships, psychological capital, and subjective well-being in economically disadvantaged adolescents; and (3) the initial state and growth rate of parent-child relationships and psychological capital in economically disadvantaged adolescents positively affect the initial state and growth rate of their subjective well-being. Recommendations: Based on these findings, this study concretely discusses the significance of psychological capital and family cohesion for the mental health of economically disadvantaged youth and offers suggestions for counseling, psychological therapy, and future research. Keywords: economically disadvantaged adolescents, psychological capital, parent-child relationships, subjective well-beings
Procedia PDF Downloads 57
2955 The Effect of Bilingualism on Prospective Memory
Authors: Aslı Yörük, Mevla Yahya, Banu Tavat
Abstract:
It is well established that bilinguals outperform monolinguals on executive function tasks. However, the effects of bilingualism on prospective memory (PM), which also requires executive functions, have not been investigated extensively. This study aimed to compare bilingual and monolingual participants' PM performance in focal and non-focal PM tasks. Considering that bilinguals have greater executive function abilities than monolinguals, we predicted that bilinguals' PM performance would be higher than monolinguals' on the non-focal PM task, which requires controlled monitoring processes. To investigate these predictions, we administered the focal and non-focal PM tasks and measured PM and ongoing task performance. Forty-eight Turkish-English bilinguals residing in North Macedonia and forty-eight Turkish monolinguals living in Turkey, between the ages of 18 and 30, participated in the study. They were instructed to remember to respond to rarely appearing PM cues while engaged in an ongoing task, i.e., a spatial working memory task. The focality of the task was manipulated by giving different instructions for PM cues. In the focal PM task, participants were asked to remember to press the enter key whenever a particular target stimulus appeared in the working memory task; in the non-focal PM task, instead of responding to a specific target shape, participants were asked to remember to press the enter key whenever the background color of the working memory trials changed to a specific color (yellow). To analyze the data, we performed a 2 × 2 mixed factorial ANOVA with task (focal versus non-focal) as a within-subject variable and language group (bilinguals versus monolinguals) as a between-subject variable. The results showed no direct evidence for a bilingual advantage in PM. That is, the groups' performance did not differ in PM accuracy or ongoing task accuracy. However, bilinguals were overall faster in the ongoing task, yet this was not specific to the PM cue's focality.
Moreover, the results showed a reversed effect of the PM cue's focality on ongoing task performance. That is, both bilinguals and monolinguals showed enhanced performance in the non-focal PM cue task. These findings raise skepticism about the literature's prevalent findings and theoretical explanations. Future studies should investigate possible alternative explanations. Keywords: bilingualism, executive functions, focality, prospective memory
Procedia PDF Downloads 115
2954 A Three-modal Authentication Method for Industrial Robots
Authors: Luo Jiaoyang, Yu Hongyang
Abstract:
In this paper, we explore a method that can be used in the working scene of intelligent industrial robots to confirm the identity of operators, ensuring that the robot executes instructions in a sufficiently safe environment. The approach uses three information modalities, namely visible light, depth, and sound. We explored a variety of fusion modes for the three modalities and finally used a joint feature learning method, which improves the performance of the model under noise compared with the single-modal case: even at the maximum noise level in the experiment, it maintains an accuracy rate of more than 90%. Keywords: multimodal, kinect, machine learning, distance image
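One simple form of multimodal fusion is feature-level concatenation. The abstract does not specify its fusion architecture, so the following is a hypothetical minimal sketch with assumed embedding sizes: each modality's feature vector is L2-normalised before concatenation so that no single modality dominates the joint feature.

```python
# Hypothetical sketch: feature-level fusion of three modality embeddings
# (RGB, depth, audio) by per-modality L2 normalisation and concatenation.
import numpy as np

def l2_normalize(v, eps=1e-12):
    return v / (np.linalg.norm(v) + eps)

def fuse(rgb_feat, depth_feat, audio_feat):
    """Concatenate per-modality features after L2 normalisation."""
    return np.concatenate([l2_normalize(rgb_feat),
                           l2_normalize(depth_feat),
                           l2_normalize(audio_feat)])

rng = np.random.default_rng(0)
joint = fuse(rng.normal(size=128), rng.normal(size=64), rng.normal(size=32))
# joint has length 128 + 64 + 32 = 224, with each modality block unit-norm
```

Joint feature learning, as used in the paper, goes further by training the per-modality encoders together so the shared representation itself compensates when one modality is noisy.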
Procedia PDF Downloads 79
2953 Exploring Pre-Trained Automatic Speech Recognition Model HuBERT for Early Alzheimer’s Disease and Mild Cognitive Impairment Detection in Speech
Authors: Monica Gonzalez Machorro
Abstract:
Dementia is hard to diagnose because of the lack of early physical symptoms. Early dementia recognition is key to improving the living conditions of patients. Speech technology is considered a valuable biomarker for this challenge. Recent works have utilized conventional acoustic features and machine learning methods to detect dementia in speech. BERT-like classifiers have reported the most promising performance. One constraint, nonetheless, is that these studies are based either on human transcripts or on transcripts produced by automatic speech recognition (ASR) systems. The contribution of this research is to explore a method that does not require transcriptions to detect early Alzheimer’s disease (AD) and mild cognitive impairment (MCI). This is achieved by fine-tuning a pre-trained ASR model for the downstream early AD and MCI tasks. To do so, a subset of the thoroughly studied Pitt Corpus is customized. The subset is balanced for class, age, and gender. Data processing also involves cropping the samples into 10-second segments. For comparison purposes, a baseline model is defined by training and testing a Random Forest with 20 acoustic features extracted using the librosa library in Python: zero-crossing rate, MFCCs, spectral bandwidth, spectral centroid, root mean square, and short-time Fourier transform. The baseline model achieved 58% accuracy. To fine-tune HuBERT as a classifier, an average pooling strategy is employed to merge the 3D representations of the audio into 2D representations, and a linear layer is added. The pre-trained model used is ‘hubert-large-ls960-ft’. Empirically, the number of epochs selected is 5, and the batch size is 1. Experiments show that the proposed method reaches a 69% balanced accuracy.
This suggests that the self-supervised ASR-based model encodes linguistic and speech information that captures acoustic cues of AD and MCI. Keywords: automatic speech recognition, early Alzheimer’s recognition, mild cognitive impairment, speech impairment
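The average-pooling step described above can be sketched in isolation (a hypothetical illustration with assumed shapes; HuBERT-large produces 1024-dimensional hidden states, and the frame count depends on the segment length): the time axis of the hidden-state tensor is averaged away to give one fixed-size vector per segment for the linear classifier head.

```python
# Hypothetical sketch: collapsing a (batch, time, hidden) tensor of ASR
# hidden states into per-utterance (batch, hidden) vectors by averaging
# over time, before a linear classification head.
import numpy as np

rng = np.random.default_rng(0)
hidden_states = rng.normal(size=(4, 499, 1024))   # 4 clips, 499 frames, 1024 dims

pooled = hidden_states.mean(axis=1)               # average pooling over time
# pooled now has shape (4, 1024): one fixed-size vector per 10-second segment
```

Mean pooling discards temporal ordering, which is acceptable here because the downstream label (AD/MCI vs. control) applies to the whole segment rather than to individual frames.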
Procedia PDF Downloads 127
2952 A New Criterion Using Pose and Shape of Objects for Collision Risk Estimation
Authors: DoHyeung Kim, DaeHee Seo, ByungDoo Kim, ByungGil Lee
Abstract:
As collision risk estimation is increasingly applied in aviation and maritime domains, strong doubts have been raised concerning the reliability of existing estimates. It has been shown that using only the position and velocity of objects can lead to imprecise results. In this paper, therefore, a new approach to the estimation of collision risk using the pose and shape of objects is proposed. Simulation results are presented validating the accuracy of the new criterion when incorporated into a fuzzy-logic-based collision risk algorithm. Keywords: collision risk, pose, shape, fuzzy logic
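The fuzzy-logic back end mentioned above can be illustrated with a minimal Mamdani-style sketch. The two inputs (separation and closing time), the membership breakpoints, and the rule base below are invented for illustration only; they are not the pose-and-shape criterion the paper proposes.

```python
def tri(x, a, b, c):
    # Triangular membership: 0 outside (a, c), peaking at 1 when x == b.
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def collision_risk(separation, closing_time):
    # Mamdani-style inference with two rules and centroid-of-singletons
    # defuzzification. Breakpoints (metres / seconds) are illustrative.
    close = tri(separation, -1.0, 0.0, 500.0)      # fully "close" at 0 m
    far = tri(separation, 300.0, 1000.0, 1e9)
    soon = tri(closing_time, -1.0, 0.0, 120.0)     # fully "soon" at 0 s
    late = tri(closing_time, 60.0, 300.0, 1e9)
    # Rule 1: close AND soon -> high risk (1.0). Rule 2: far OR late -> low (0.0).
    w_high = min(close, soon)
    w_low = max(far, late)
    if w_high + w_low == 0.0:
        return 0.5  # no rule fires: indeterminate
    return (w_high * 1.0 + w_low * 0.0) / (w_high + w_low)

risk_near = collision_risk(separation=50.0, closing_time=10.0)
risk_far = collision_risk(separation=2000.0, closing_time=500.0)
```

The same inference skeleton can accept pose- and shape-derived quantities as inputs in place of raw separation and closing time.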
Procedia PDF Downloads 529
2951 Satellite Photogrammetry for DEM Generation Using Stereo Pair and Automatic Extraction of Terrain Parameters
Authors: Tridipa Biswas, Kamal Pandey
Abstract:
A Digital Elevation Model (DEM) is a representation of a surface in three-dimensional space, with elevation as the third dimension alongside the X and Y planimetric coordinates in a rectangular coordinate system. DEMs have wide applications in fields such as disaster management, hydrology and watershed management, geomorphology, urban development, map creation, and resource management. Cartosat-1, or IRS P5 (Indian Remote Sensing Satellite), is a state-of-the-art remote sensing satellite launched by ISRO on May 5, 2005, mainly intended for cartographic applications. Cartosat-1 is equipped with two panchromatic cameras capable of simultaneously acquiring images of 2.5 m spatial resolution. One camera looks +26 degrees forward while the other looks –5 degrees backward to acquire stereoscopic imagery with a base-to-height ratio of 0.62. The time difference between the acquisitions of the stereo pair images is approximately 52 seconds. These high-resolution stereo data have great potential to produce high-quality DEMs and are expected to have a significant impact on topographic mapping and watershed applications. The objective of the present study is the generation of a high-resolution DEM, quality evaluation in different elevation strata, generation of an ortho-rectified image, and the associated accuracy assessment from CARTOSAT-1 data based on Ground Control Points (GCPs) for the Aglar watershed (Tehri-Garhwal and Dehradun districts, Uttarakhand, India).
The present study reveals that the DEMs (10 m and 30 m) generated from the CARTOSAT-1 stereo pair are considerably more accurate than the existing ASTER and CARTO DEMs. Terrain parameters such as slope, aspect, drainage, and watershed boundaries derived from the generated DEMs likewise show better accuracy than those derived from the other two DEMs. Keywords: ASTER-DEM, CARTO-DEM, CARTOSAT-1, digital elevation model (DEM), ortho-rectified image, photogrammetry, RPC, stereo pair, terrain parameters
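As a sketch of the terrain-parameter step, slope and aspect can be derived from a DEM grid with finite differences. The snippet below uses plain NumPy central differences; the cell size, the toy grid, and the north-up axis convention are illustrative assumptions, not the study’s processing chain (GIS packages typically use Horn’s method).

```python
import numpy as np

def slope_aspect(dem, cellsize):
    # Slope (degrees) and aspect (degrees clockwise from north, i.e. the
    # downhill direction) from a gridded DEM via central differences.
    # Convention assumed here: axis 0 = y increasing northward,
    # axis 1 = x increasing eastward (real rasters are often north-down).
    dz_dy, dz_dx = np.gradient(dem, cellsize)
    slope = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
    aspect = np.degrees(np.arctan2(-dz_dx, -dz_dy)) % 360.0
    return slope, aspect

# A plane rising eastward by 1 m per 10 m cell: slope ~5.71 deg, west-facing.
dem = np.tile(np.arange(50, dtype=float), (50, 1))
slope, aspect = slope_aspect(dem, cellsize=10.0)
```

Drainage and watershed delineation build on these same gradients (flow direction follows the steepest descent), but need flow-accumulation bookkeeping beyond this sketch.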
Procedia PDF Downloads 309
2950 Evaluation and Assessment of Bioinformatics Methods and Their Applications
Authors: Fatemeh Nokhodchi Bonab
Abstract:
Bioinformatics, in its broad sense, involves the application of computational methods to solve biological problems. A wide range of computational tools is needed to process, effectively and efficiently, the large amounts of data generated by recent technological innovations in biology and medicine. A number of computational tools have been developed or adapted to deal with complex and multivariate experimental data and to support the transition from data collection to information and knowledge. These bioinformatics tools are being evaluated and applied in various medical areas, including early detection, risk assessment, classification, and prognosis of cancer. The goal of these efforts is to develop and identify bioinformatics methods with optimal sensitivity, specificity, and predictive capability. The recent flood of data from genome sequences and functional genomics has given rise to a new field, bioinformatics, which combines elements of biology and computer science. Bioinformatics conceptualizes biology in terms of macromolecules (in the physical-chemical sense) and applies "informatics" techniques (derived from disciplines such as applied mathematics, computer science, and statistics) to understand and organize the information associated with these molecules on a large scale. Here we propose a definition for this new field and review some of the research being pursued, particularly in relation to transcriptional regulatory systems. Keywords: methods, applications, transcriptional regulatory systems, techniques
Procedia PDF Downloads 127
2949 Comparing SVM and Naïve Bayes Classifier for Automatic Microaneurysm Detections
Authors: A. Sopharak, B. Uyyanonvara, S. Barman
Abstract:
Diabetic retinopathy is characterized by the development of retinal microaneurysms. The damage can be prevented if the disease is treated in its early stages. In this paper, we compare Support Vector Machine (SVM) and Naïve Bayes (NB) classifiers for automatic microaneurysm detection in images acquired through non-dilated pupils. The Nearest Neighbor classifier is used as a baseline for comparison. Detected microaneurysms are validated against expert ophthalmologists’ hand-drawn ground truths. The sensitivity, specificity, precision, and accuracy of each method are also compared. Keywords: diabetic retinopathy, microaneurysm, naive Bayes classifier, SVM classifier
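The three-way comparison described above can be sketched with scikit-learn. The synthetic feature matrix below stands in for real fundus-image features (which the abstract does not enumerate), so the dataset shape and classifier hyperparameters are illustrative assumptions.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import confusion_matrix

# Stand-in data: the real study used features of candidate microaneurysm
# regions extracted from retinal images.
X, y = make_classification(n_samples=600, n_features=8, n_informative=5,
                           random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)

def report(clf):
    # The four metrics compared in the paper, from the confusion matrix.
    tn, fp, fn, tp = confusion_matrix(yte, clf.fit(Xtr, ytr).predict(Xte)).ravel()
    return {"sensitivity": tp / (tp + fn), "specificity": tn / (tn + fp),
            "precision": tp / (tp + fp),
            "accuracy": (tp + tn) / (tp + tn + fp + fn)}

scores = {name: report(clf) for name, clf in
          [("SVM", SVC(kernel="rbf")), ("NB", GaussianNB()),
           ("kNN", KNeighborsClassifier(n_neighbors=5))]}
```

The Nearest Neighbor baseline appears here as kNN with k = 5; the value of k used in the study is not stated in the abstract.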
Procedia PDF Downloads 329
2948 An Experimental Modeling of Steel Surfaces Wear in Injection of Plastic Materials with SGF
Authors: L. Capitanu, V. Floresci, L. L. Badita
Abstract:
Starting from the idea that the greatest pressure and velocity of the melted composite occur in the die nozzle, an experimental nozzle was designed with wear samples whose sizes and weights could be measured with good precision. For greater measurement accuracy, an extremely accurate radiometric measuring method was used. Various nitriding steels and nitriding treatments were studied, as well as some special and alloyed steels. In addition, preliminary attempts were made to describe and verify the corrosive action of thermoplastics on metals. Keywords: plastics, composites with short glass fibres, moulding, wear, experimental modelling, glass fibres content influence
Procedia PDF Downloads 266
2947 Microstructures of Si Surfaces Fabricated by Electrochemical Anodic Oxidation with Agarose Stamps
Abstract:
This paper investigates the fabrication of microstructures on Si surfaces by electrochemical anodic oxidation with agarose stamps. The fabrication process is based on a selective anodic oxidation reaction that occurs in the contact area between the stamp and a Si substrate. The stamp, previously soaked in electrolyte, acts as a current flow channel. After the oxide patterns are formed as an etching mask, an aqueous KOH solution is used for wet etching of the Si. A complicated microstructure array of 1 cm2 was fabricated by this method with high accuracy. Keywords: microstructures, anodic oxidation, silicon, agarose stamps
Procedia PDF Downloads 305
2946 Detecting Covid-19 Fake News Using Deep Learning Technique
Authors: Anjali A. Prasad
Abstract:
Nowadays, social media plays an important role in spreading misinformation, or fake news. This study analyzes fake news related to the COVID-19 pandemic spread on social media. The paper evaluates and compares different approaches used to mitigate this issue, including popular deep learning models such as CNN, RNN, LSTM, and BERT, for classification. To evaluate model performance, we use accuracy, precision, recall, and F1-score as evaluation metrics, and finally compare which of the four algorithms shows the best results. Keywords: BERT, CNN, LSTM, RNN
Procedia PDF Downloads 205
2945 Affective Transparency in Compound Word Processing
Authors: Jordan Gallant
Abstract:
In the compound word processing literature, much attention has been paid to the relationship between a compound’s denotational meaning and that of its morphological whole-word constituents, which is referred to as ‘semantic transparency’. However, the parallel relationship between a compound’s connotation and that of its constituents has not been addressed at all. For instance, while a compound like ‘painkiller’ might be semantically transparent, it is not ‘affectively transparent’: both constituents have primarily negative connotations, while the whole compound has a positive one. This paper investigates the role of affective transparency in compound processing using two methodologies commonly employed in this field: a lexical decision task and a typing task. The critical stimuli used were 112 English bi-constituent compounds that differed in the affective transparency of their constituents. Of these, 36 stimuli contained constituents with connotations similar to the compound (e.g., ‘dreamland’), 36 contained constituents with more positive connotations (e.g., ‘bedpan’), and 36 contained constituents with more negative connotations (e.g., ‘painkiller’). The connotations of whole-word constituents and compounds were operationalized via valence ratings taken from an off-line ratings database. In Experiment 1, compound stimuli and matched non-word controls were presented visually to participants, who were asked to indicate whether each was a real word of English; response times and accuracy were recorded. In Experiment 2, participants typed compound stimuli presented to them visually; individual keystroke response times and typing accuracy were recorded. The results of both experiments provided positive evidence that compound processing is influenced by affective transparency.
In Experiment 1, compounds in which both constituents had more negative connotations than the compound itself were responded to significantly more slowly than compounds in which the constituents had similar or more positive connotations. Typed responses from Experiment 2 showed that inter-keystroke intervals at the morphological constituent boundary were significantly longer when the connotation of the head constituent was either more positive or more negative than that of the compound. The interpretation of this finding is discussed in the context of previous compound-typing research. Taken together, these findings suggest that affective transparency plays a role in the recognition, storage, and production of English compound words. This study provides a promising first step in a new direction for research on compound words. Keywords: compound processing, semantic transparency, typed production, valence
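The boundary measure used in the typing task can be sketched in a few lines. The helper name boundary_iki, the trial word, and the keystroke timestamps below are all hypothetical; they only illustrate how an inter-keystroke interval at the constituent boundary is computed.

```python
def boundary_iki(timestamps, word, boundary):
    # Inter-keystroke interval (ms) at the morphological boundary of a typed
    # compound: latency between the last key of the first constituent and
    # the first key of the second. timestamps[i] is the keydown time of word[i].
    assert len(timestamps) == len(word) and 0 < boundary < len(word)
    return timestamps[boundary] - timestamps[boundary - 1]

# Hypothetical trial: 'painkiller', with the boundary before 'k' (index 4).
ts = [0, 180, 350, 520, 940, 1100, 1260, 1410, 1580, 1730]
iki = boundary_iki(ts, "painkiller", 4)  # 940 - 520 = 420 ms
```

Averaging this quantity per condition (similar, more positive, more negative constituent connotation) gives the comparison reported in the abstract.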
Procedia PDF Downloads 127
2944 Efficiency of Geocell Reinforcement for Using in Expanded Polystyrene Embankments via Numerical Analysis
Authors: S. N. Moghaddas Tafreshi, S. M. Amin Ghotbi
Abstract:
This paper presents a numerical study investigating the effectiveness of geocell reinforcement in reducing pressure and settlement over EPS geofoam blocks in road embankments. A 3-D FEM model of the soil and geofoam was created in ABAQUS, and the geocell was modeled realistically using membrane elements. The accuracy of the model was tested by comparing its results with previous works. Sensitivity analyses showed that reinforcing the soil cover with geocell has a significant influence on reducing the stresses imposed on the geofoam and, consequently, its deformation. Keywords: EPS geofoam, geocell, reinforcement, road embankments, lightweight fill
Procedia PDF Downloads 273
2943 Wearable Antenna for Diagnosis of Parkinson’s Disease Using a Deep Learning Pipeline on Accelerated Hardware
Authors: Subham Ghosh, Banani Basu, Marami Das
Abstract:
Background: The development of compact, low-power antenna sensors has resulted in hardware restructuring, allowing for wireless ubiquitous sensing. Antenna sensors can create wireless body-area networks (WBAN) by linking wireless nodes across the human body. WBAN and IoT applications, such as remote health and fitness monitoring and rehabilitation, are becoming increasingly important. In particular, Parkinson’s disease (PD), a common neurodegenerative disorder, presents clinical features that are easily misdiagnosed. As a mobility disease, it may greatly benefit from the antenna’s near-field approach, with a variety of activities that can use WBAN and IoT technologies to increase diagnostic accuracy and improve patient monitoring. Methodology: This study investigates the feasibility of leveraging a single patch antenna, mounted with cloth on the dorsal wrist, to differentiate actual Parkinson’s disease (PD) from false PD using a small hardware platform. The semi-flexible antenna operates in the 2.4 GHz ISM band and collects reflection coefficient (Γ) data from patients performing five exercises designed to distinguish PD from other disorders such as essential tremor (ET) or physiological disorders caused by anxiety or stress. The data obtained are normalized and converted into 2-D representations using the Gabor wavelet transform (GWT). Data augmentation is then used to expand the dataset. A lightweight deep-learning (DL) model is developed to run on the GPU-enabled NVIDIA Jetson Nano platform; it processes the 2-D images for feature extraction and classification. Findings: The DL model was trained and tested on both the original and augmented datasets, thus doubling the dataset size. To ensure robustness, a 5-fold stratified cross-validation (5-FSCV) method was used.
The proposed framework, utilizing a DL model with 1.356 million parameters on the NVIDIA Jetson Nano, achieved an accuracy of 88.64%, an F1-score of 88.54%, and a recall of 90.46%, with a latency of 33 seconds per epoch. Keywords: antenna, deep-learning, GPU-hardware, Parkinson’s disease
Procedia PDF Downloads 7
2942 Recommendations Using Online Water Quality Sensors for Chlorinated Drinking Water Monitoring at Drinking Water Distribution Systems Exposed to Glyphosate
Authors: Angela Maria Fasnacht
Abstract:
Detection of anomalies caused by the presence of contaminants, also known as early detection in water treatment plants, has become a critical point that deserves in-depth study for improvement and adaptation to current requirements. The design of such systems requires detailed analysis and processing of the data in real time, so it is necessary to apply statistical methods appropriate to the data generated, such as Spearman's correlation, factor analysis, cross-correlation, and k-fold cross-validation. These methods allow the evaluation of large data sets to model the behavior of variables; in this sense, statistical treatment can be considered a vital step in developing advanced machine-learning models that allow optimized real-time data management, applied to early detection systems in water treatment processes. These techniques also facilitate the development of new technologies used in advanced sensors. In this work, the methods were applied to identify possible correlations between the measured parameters and the presence of the glyphosate contaminant in a single-pass system. The interaction between the initial glyphosate concentration and the location of the sensors on the readings of the reported parameters was studied. Keywords: glyphosate, emergent contaminants, machine learning, probes, sensors, predictive
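As a sketch of the statistical pipeline named above (Spearman's correlation plus k-fold cross-validation), the snippet below couples a hypothetical conductivity channel to a glyphosate-spike indicator. All sensor names, effect sizes, and the logistic-regression detector are invented for illustration; the study's actual parameters and models are not given in the abstract.

```python
import numpy as np
from scipy.stats import spearmanr
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score, KFold

rng = np.random.default_rng(1)
n = 200
# Hypothetical sensor log: glyphosate spikes shift conductivity and pH;
# ORP is left deliberately uncorrelated.
glyphosate = rng.binomial(1, 0.3, n)
conductivity = 450 + 40 * glyphosate + rng.normal(0, 10, n)
ph = 7.2 - 0.05 * glyphosate + rng.normal(0, 0.1, n)
orp = 650 + rng.normal(0, 25, n)

# Rank correlation between one measured parameter and the contaminant.
rho, pval = spearmanr(conductivity, glyphosate)

# k-fold cross-validated detector on all three channels.
X = np.column_stack([conductivity, ph, orp])
acc = cross_val_score(LogisticRegression(max_iter=1000), X, glyphosate,
                      cv=KFold(5, shuffle=True, random_state=0)).mean()
```

A high rank correlation flags which channel carries the contamination signal; the cross-validated score estimates how well an early-detection model built on those channels would generalize.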
Procedia PDF Downloads 121
2941 Quantification of Glucosinolates in Turnip Greens and Turnip Tops by Near-Infrared Spectroscopy
Authors: S. Obregon-Cano, R. Moreno-Rojas, E. Cartea-Gonzalez, A. De Haro-Bailon
Abstract:
The potential of near-infrared spectroscopy (NIRS) for screening the total glucosinolate (tGSL) content, as well as the aliphatic glucosinolates gluconapin (GNA), progoitrin (PRO), and glucobrassicanapin (GBN), in turnip greens and turnip tops was assessed. This crop is grown for its edible leaves and stems for human consumption. The reference glucosinolate values, obtained by high-performance liquid chromatography on the vegetable samples, were regressed against different spectral transformations by modified partial least-squares (MPLS) regression (calibration set: n = 350). The resulting models were satisfactory, with calibration coefficients from 0.72 (GBN) to 0.98 (tGSL). The predictive ability of the equations was tested using a set of samples (n = 70) independent of the calibration set. The determination coefficients and standard errors of prediction (SEP) obtained in the external validation were: GNA = 0.94 (SEP = 3.49); PRO = 0.41 (SEP = 1.08); GBN = 0.55 (SEP = 0.60); tGSL = 0.96 (SEP = 3.28). These results show that the equations developed for total glucosinolates and for gluconapin can be used for screening these compounds in the leaves and stems of this species, and that they are accurate enough for a fast, non-destructive, and reliable analysis of GNA and tGSL content directly from NIR spectra. The PRO and GBN equations can be employed to identify samples with high, medium, and low contents. Keywords: Brassica rapa, glucosinolates, gluconapin, NIRS, turnip greens
Procedia PDF Downloads 144
2940 The Use of a Novel Visual Kinetic Demonstration Technique in Student Skill Acquisition of the Sellick Cricoid Force Manoeuvre
Authors: L. Nathaniel-Wurie
Abstract:
The Sellick manoeuvre, also known as the application of cricoid force (CF), was first described by Brian Sellick in 1961. CF is the application of digital pressure against the cricoid cartilage so that the posterior force compresses the oesophagus against the vertebrae. This is designed to prevent passive regurgitation of gastric contents, a major cause of morbidity and mortality during emergency airway management inside and outside of the hospital. To the authors’ knowledge, there is no universally standardised training modality and, therefore, no reliable way to examine whether outcomes are appropriate. If force is not measured during training, one cannot surmise that appropriate, accurate, or precise amounts of force are being used routinely. Poor homogeneity in teaching and untested outcomes will correlate with reduced efficacy and increased adverse effects. For this study, the accuracy of force delivery in trained professionals was tested: 20 operating department practitioners (with a mean of 5.3 years of experience performing CF) were tested, and outcomes were contrasted with those of 40 novice students randomised into one of two arms. In Arm A, the procedure was explained and demonstrated, and participants were then asked to perform CF, with the force generated measured three times. Arm B followed the same process as Arm A, but before being tested participants had 10 N and 30 N applied to their hands to build an intuitive understanding of the required force; they were then asked to apply the equivalent force against a visible force meter and to hold that force for 20 seconds, allowing direct visualisation and correction of any over- or underestimation. Following this, Arm B participants were asked to perform the manoeuvre, and the force generated was measured three times.
This study shows a wide distribution of force produced by trained professionals and by novices performing the procedure for the first time. Our teaching methodology shows improved accuracy, precision, and homogeneity within the group when compared with novices, and even outperforms trained practitioners. In conclusion, if this methodology is adopted, it may correlate with better clinical outcomes, fewer adverse events, and more successful airway management in critical medical scenarios. Keywords: airway, cricoid, medical education, sellick
Procedia PDF Downloads 79