Search results for: classification framework
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7020

5970 Electrocardiogram-Based Heartbeat Classification Using Convolutional Neural Networks

Authors: Jacqueline Rose T. Alipo-on, Francesca Isabelle F. Escobar, Myles Joshua T. Tan, Hezerul Abdul Karim, Nouar Al Dahoul

Abstract:

Electrocardiogram (ECG) signal analysis and processing are crucial in the diagnosis of cardiovascular diseases, which are considered one of the leading causes of mortality worldwide. However, the traditional rule-based analysis of large volumes of ECG data is time-consuming, labor-intensive, and prone to human error. With advances in computing, machine learning algorithms have been increasingly used to analyze ECG signals. In this paper, various deep learning algorithms were adapted to classify five heartbeat classes. The dataset used in this work is a synthetic MIT-BIH Arrhythmia dataset produced with generative adversarial networks (GANs). Deep learning models including a ResNet-50 convolutional neural network (CNN), a 1-D CNN, and a long short-term memory (LSTM) network were evaluated and compared. ResNet-50 was found to outperform the other models in terms of recall and F1 score, with five-fold average scores of 98.88% and 98.87%, respectively. The 1-D CNN, on the other hand, achieved the highest average precision of 98.93%.
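The five-fold average recall and F1 reported above can be illustrated with a small sketch. The per-fold confusion counts below are made-up placeholders, not the paper's data; only the averaging scheme is shown.

```python
# Five-fold averaging of recall and F1 from per-fold confusion counts.
# The (tp, fp, fn) counts below are illustrative placeholders only.
def recall_f1(tp, fp, fn):
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return recall, f1

folds = [(990, 12, 10), (985, 9, 15), (992, 11, 8), (988, 10, 12), (991, 13, 9)]
recalls, f1s = zip(*(recall_f1(*f) for f in folds))
avg_recall = sum(recalls) / len(recalls)
avg_f1 = sum(f1s) / len(f1s)
print(f"five-fold average recall = {avg_recall:.4f}, F1 = {avg_f1:.4f}")
```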

Keywords: heartbeat classification, convolutional neural network, electrocardiogram signals, generative adversarial networks, long short-term memory, ResNet-50

Procedia PDF Downloads 124
5969 EEG-Based Classification of Psychiatric Disorders: Bipolar Mood Disorder vs. Schizophrenia

Authors: Han-Jeong Hwang, Jae-Hyun Jo, Fatemeh Alimardani

Abstract:

An accurate diagnosis of psychiatric diseases is a challenging issue, particularly when distinct symptoms of different diseases overlap, such as the delusions that appear in both bipolar mood disorder (BMD) and schizophrenia (SCH). In the present study, we propose a useful way to discriminate BMD from SCH using electroencephalography (EEG). A total of thirty BMD and SCH patients (15 vs. 15) took part in our experiment. EEG signals were measured with nineteen scalp electrodes placed according to the international 10-20 system while the patients were exposed to a visual stimulus flickering at 16 Hz for 95 s. The flickering visual stimulus induces a brain signal known as the steady-state visual evoked potential (SSVEP), whose amplitude differs between BMD and SCH patients because the two groups process the same visual information in their own distinct ways. To classify BMD and SCH patients, a machine learning approach with leave-one-out cross-validation was employed. The SSVEPs induced at the fundamental (16 Hz) and second harmonic (32 Hz) stimulation frequencies were extracted using the fast Fourier transform (FFT) and used as features. The most discriminative feature was selected using the Fisher score, and a support vector machine (SVM) was used as the classifier. From the analysis, we obtained a classification accuracy of 83.33%, showing the feasibility of discriminating patients with BMD and SCH using EEG. We expect that our approach can help psychiatrists diagnose the psychiatric disorders BMD and SCH more accurately.
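The SSVEP feature extraction described above can be sketched as follows. The sampling rate, recording length, and signal amplitudes are assumptions for illustration, not parameters from the study; only the idea of reading FFT amplitudes at the 16 Hz fundamental and 32 Hz harmonic is shown.

```python
import numpy as np

# Hypothetical SSVEP feature extraction: single-sided FFT amplitude at the
# 16 Hz fundamental and 32 Hz second harmonic of the flickering stimulus.
fs = 256                       # sampling rate (Hz), assumed
t = np.arange(0, 4, 1 / fs)    # 4 s of signal, assumed
eeg = 2.0 * np.sin(2 * np.pi * 16 * t) + 0.5 * np.sin(2 * np.pi * 32 * t)

spectrum = np.abs(np.fft.rfft(eeg)) / len(t) * 2   # single-sided amplitude
freqs = np.fft.rfftfreq(len(t), 1 / fs)

def amplitude_at(f_hz):
    """Amplitude at the FFT bin nearest to f_hz."""
    return spectrum[np.argmin(np.abs(freqs - f_hz))]

features = [amplitude_at(16), amplitude_at(32)]    # SSVEP feature vector
print(features)
```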

Keywords: bipolar mood disorder, electroencephalography, schizophrenia, machine learning

Procedia PDF Downloads 415
5968 Analytical Authentication of Butter Using Fourier Transform Infrared Spectroscopy Coupled with Chemometrics

Authors: M. Bodner, M. Scampicchio

Abstract:

Fourier transform infrared (FT-IR) spectroscopy coupled with chemometrics was used to distinguish butter samples from non-butter samples. Further, quantification of the margarine content in adulterated butter samples was investigated. The fingerprint region (1400-800 cm⁻¹) was used to develop unsupervised pattern recognition (principal component analysis, PCA), supervised modeling (soft independent modelling by class analogy, SIMCA), classification (partial least squares discriminant analysis, PLS-DA) and regression (partial least squares regression, PLS-R) models. PCA of the fingerprint region showed clustering of the two sample types. All samples were classified in their rightful class by the SIMCA approach; however, nine adulterated samples (between 1% and 30% w/w margarine) were assigned to both the butter class and the non-butter class. The two-class PLS-DA model (R2 = 0.73, RMSEP, root mean square error of prediction = 0.26% w/w) had a sensitivity of 71.4% and a positive predictive value (PPV) of 100%. Its threshold was calculated at 7% w/w margarine in adulterated butter samples. Finally, a PLS-R model (R2 = 0.84, RMSEP = 16.54%) was developed. PLS-DA proved a suitable classification tool and PLS-R a proper quantification approach. The results demonstrate that FT-IR spectroscopy combined with PLS-R can be used as a rapid, simple and safe method to distinguish pure butter samples from adulterated ones and to determine the degree of margarine adulteration in butter samples.
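The RMSEP and R2 figures reported for the regression model above are standard quantities; a minimal sketch of their computation follows. The predicted/reference margarine contents are made-up placeholders, not the study's data.

```python
# Illustrative computation of RMSEP and R^2 as reported for a PLS-R model;
# the y values below are placeholder margarine contents (% w/w).
def rmsep(y_true, y_pred):
    n = len(y_true)
    return (sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n) ** 0.5

def r_squared(y_true, y_pred):
    mean = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    return 1 - ss_res / ss_tot

y_true = [0, 1, 5, 10, 20, 30]          # reference margarine content
y_pred = [1.2, 0.8, 6.1, 9.0, 18.5, 31.0]  # model predictions (assumed)
print(rmsep(y_true, y_pred), r_squared(y_true, y_pred))
```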

Keywords: adulterated butter, margarine, PCA, PLS-DA, PLS-R, SIMCA

Procedia PDF Downloads 139
5967 A Holistic Conceptual Measurement Framework for Assessing the Effectiveness and Viability of an Academic Program

Authors: Munir Majdalawieh, Adam Marks

Abstract:

In today’s very competitive higher education industry, higher education institutions (HEIs) face the primary concern of developing, deploying, and sustaining high-quality academic programs. HEIs today operate within well-established accreditation systems endorsed by national legislation and institutions. An accreditation system is an educational pathway focused on the criteria and processes for evaluating educational programs. Although many aspects of the accreditation process highlight both the past and the present (“prove”), the program review assessment is forward-looking (“improve”) and thus transforms the process into a continuing assessment activity rather than a periodic event. The purpose of this study is to propose a conceptual measurement framework for program review to be used by HEIs to undertake a robust and targeted approach to proactively and continuously review their academic programs, to evaluate their practicality and effectiveness, and to improve the education of their students. The proposed framework consists of two main components: program review principles and the program review measurement matrix.

Keywords: academic program, program review principles, curriculum development, accreditation, evaluation, assessment, review measurement matrix, program review process, information technologies supporting learning, learning/teaching methodologies and assessment

Procedia PDF Downloads 235
5966 Revolutionizing Healthcare Facility Maintenance: A Groundbreaking AI, BIM, and IoT Integration Framework

Authors: Mina Sadat Orooje, Mohammad Mehdi Latifi, Behnam Fereydooni Eftekhari

Abstract:

The integration of cutting-edge Internet of Things (IoT) technologies with advanced Artificial Intelligence (AI) systems is revolutionizing healthcare facility management. However, the current landscape of hospital building maintenance suffers from slow, repetitive, and disjointed processes, leading to significant financial, resource, and time losses. Additionally, the potential of Building Information Modeling (BIM) in facility maintenance is hindered by a lack of data within digital models of built environments, necessitating a more streamlined data collection process. This paper presents a robust framework that harmonizes AI with BIM-IoT technology to elevate healthcare Facility Maintenance Management (FMM) and address these pressing challenges. The methodology begins with a thorough literature review and requirements analysis, providing insights into existing technological landscapes and associated obstacles. Extensive data collection and analysis efforts follow to deepen understanding of hospital infrastructure and maintenance records. Critical AI algorithms are identified to address predictive maintenance, anomaly detection, and optimization needs alongside integration strategies for BIM and IoT technologies, enabling real-time data collection and analysis. The framework outlines protocols for data processing, analysis, and decision-making. A prototype implementation is executed to showcase the framework's functionality, followed by a rigorous validation process to evaluate its efficacy and gather user feedback. Refinement and optimization steps are then undertaken based on evaluation outcomes. Emphasis is placed on the scalability of the framework in real-world scenarios and its potential applications across diverse healthcare facility contexts. Finally, the findings are meticulously documented and shared within the healthcare and facility management communities. 
This framework aims to significantly boost maintenance efficiency, cut costs, provide decision support, enable real-time monitoring, offer data-driven insights, and ultimately enhance patient safety and satisfaction. By tackling current challenges in healthcare facility maintenance management, it paves the way for the adoption of smarter and more efficient maintenance practices in healthcare facilities.

Keywords: artificial intelligence, building information modeling, healthcare facility maintenance, internet of things integration, maintenance efficiency

Procedia PDF Downloads 52
5965 Real-Time Classification of Hemodynamic Response by Functional Near-Infrared Spectroscopy Using an Adaptive Estimation of General Linear Model Coefficients

Authors: Sahar Jahani, Meryem Ayse Yucel, David Boas, Seyed Kamaledin Setarehdan

Abstract:

Near-infrared spectroscopy allows monitoring of the oxy- and deoxy-hemoglobin concentration changes associated with the hemodynamic response function (HRF). The HRF is usually contaminated by natural physiological hemodynamics (systemic interference) occurring in all body tissues, including brain tissue, which makes HRF extraction a very challenging task. In this study, we used a Kalman filter based on a general linear model (GLM) of brain activity to estimate the proportion of systemic interference in the brain hemodynamics. The performance of the proposed algorithm is evaluated in terms of the peak-to-peak error (Ep), mean square error (MSE), and Pearson’s correlation coefficient (R2) between the estimated and the simulated hemodynamic responses. The technique is also capable of real-time estimation of single-trial functional activations, as demonstrated by applying it to classify finger tapping versus resting state. The average real-time classification accuracy of 74% over 11 subjects demonstrates the feasibility of developing an effective functional near-infrared spectroscopy brain-computer interface (fNIRS-BCI).
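A minimal sketch of the adaptive idea above, not the authors' implementation: a Kalman filter recursively estimates GLM coefficients beta in y_t = x_t . beta + noise under a random-walk state model. The regressors, noise variances Q and R, and true coefficients are all assumptions for illustration.

```python
import numpy as np

# Kalman-filter estimation of GLM coefficients (random-walk state model).
rng = np.random.default_rng(0)
beta_true = np.array([1.5, -0.7])          # assumed ground truth
X = rng.standard_normal((500, 2))          # design matrix (regressors)
y = X @ beta_true + 0.1 * rng.standard_normal(500)

beta = np.zeros(2)          # state estimate (GLM coefficients)
P = np.eye(2)               # state covariance
Q = 1e-6 * np.eye(2)        # process noise: slow coefficient drift (assumed)
R = 0.01                    # observation noise variance (assumed)

for x_t, y_t in zip(X, y):
    P = P + Q                                # predict step
    S = x_t @ P @ x_t + R                    # innovation variance
    K = P @ x_t / S                          # Kalman gain
    beta = beta + K * (y_t - x_t @ beta)     # update coefficient estimate
    P = P - np.outer(K, x_t) @ P             # update covariance

print(beta)   # should approach beta_true as samples accumulate
```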

Keywords: hemodynamic response function, functional near-infrared spectroscopy, adaptive filter, Kalman filter

Procedia PDF Downloads 156
5964 Legal Regulation of Personal Information Data Transmission Risk Assessment: A Case Study of the EU’s DPIA

Authors: Cai Qianyi

Abstract:

In the midst of global digital revolution, the flow of data poses security threats that call China's existing legislative framework for protecting personal information into question. As a preliminary procedure for risk analysis and prevention, the risk assessment of personal data transmission lacks detailed guidelines for support. Existing provisions reveal unclear responsibilities for network operators and weakened rights for data subjects. Furthermore, the regulatory system's weak operability and a lack of industry self-regulation heighten data transmission hazards. This paper aims to compare the regulatory pathways for data information transmission risks between China and Europe from a legal framework and content perspective. It draws on the “Data Protection Impact Assessment Guidelines” to empower multiple stakeholders, including data processors, controllers, and subjects, while also defining obligations. In conclusion, this paper intends to solve China's digital security shortcomings by developing a more mature regulatory framework and industry self-regulation mechanisms, resulting in a win-win situation for personal data protection and the development of the digital economy.

Keywords: personal information data transmission, risk assessment, DPIA, internet service provider

Procedia PDF Downloads 55
5963 Human Resource Utilization Models for Graceful Ageing

Authors: Chuang-Chun Chiou

Abstract:

In this study, a systematic framework of graceful ageing is used to explore possible human resource utilization models for graceful ageing. The framework, rooted in Chinese culture, defines a ‘Nine-old’ target: ageing gracefully with feeding, accomplishment, usefulness, learning, entertainment, care, protection, dignity, and termination. This study focuses on two of these areas: accomplishment and usefulness. We examine current initiatives and laws promoting labor participation, focusing on how to increase the labor force participation rate of the middle-aged and the elderly and to help the elderly achieve graceful ageing. We then present the possible models that support graceful ageing.

Keywords: human resource utilization model, labor participation, graceful ageing, employment

Procedia PDF Downloads 387
5962 Hydrochemical Assessment and Quality Classification of Water in Torogh and Kardeh Dam Reservoirs, North-East Iran

Authors: Mojtaba Heydarizad

Abstract:

Khorasan Razavi is the second most important province in north-east Iran and faces a water shortage crisis due to recent droughts and heavy water consumption. The Kardeh and Torogh dam reservoirs in this province supply a notable part of the potable water needs of the Mashhad metropolitan area (more than 4.5 million inhabitants). Hydrochemical analyses of samples from these reservoirs demonstrate that the MgHCO3 water type dominates in the Kardeh reservoir, while CaHCO3 and, to a lesser extent, MgHCO3 types dominate in the Torogh reservoir. The Gibbs binary diagram demonstrates that rock weathering is the main factor controlling water quality in both reservoirs. Plotting the reservoir samples on Mg2+/Na+ and HCO3-/Na+ vs. Ca2+/Na+ diagrams shows that dissolution of evaporative and carbonate minerals is the dominant rock-weathering ion source. Cluster analysis (CA) likewise demonstrates the strong role of rock weathering (mainly carbonate and evaporative mineral dissolution) in the water quality of these reservoirs. Assessing water quality with the U.S. National Sanitation Foundation index (NSF-WQI), the Oregon Water Quality Index (OWQI), and the Canadian Water Quality Index (DWQI) indicates moderate to good quality.
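Indices such as the NSF-WQI combine parameter sub-index scores with fixed weights; a hedged sketch follows. The weights shown are the commonly cited NSF weights and the sub-index scores are illustrative placeholders, not values from this study.

```python
# Weighted water-quality index in the spirit of the NSF-WQI: each parameter's
# sub-index score q_i (0-100) is combined with a weight w_i summing to 1.
parameters = {
    # name: (weight, sub-index score 0-100; scores are illustrative)
    "dissolved_oxygen": (0.17, 85),
    "fecal_coliform":   (0.16, 70),
    "pH":               (0.11, 90),
    "BOD":              (0.11, 75),
    "temperature":      (0.10, 80),
    "total_phosphate":  (0.10, 65),
    "nitrate":          (0.10, 78),
    "turbidity":        (0.08, 88),
    "total_solids":     (0.07, 72),
}
wqi = sum(w * q for w, q in parameters.values())
print(f"WQI = {wqi:.1f}")   # 70-90 is often labelled 'good'
```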

Keywords: hydrochemistry, water quality classification, water quality indexes, Torogh and Kardeh dam reservoir

Procedia PDF Downloads 253
5961 Sex Estimation Using Cervical Measurements of Molar Teeth in an Iranian Archaeological Population

Authors: Seyedeh Mandan Kazzazi, Elena Kranioti

Abstract:

In the field of human osteology, sex estimation is an important step in developing a biological profile. A number of methods can be used to estimate the sex of human remains, varying from visual assessment to metric analysis of sexually dimorphic traits. Teeth are among the most durable physical elements in the human body that can be used for this purpose. The present study investigated the utility of cervical measurements for sex estimation through discriminant analysis. The permanent molar teeth of 75 skeletons (28 females and 52 males) from the Hasanlu site in north-western Iran were studied. Cervical mesiodistal and buccolingual measurements were taken from both maxillary and mandibular first and second molars. Discriminant analysis was used to evaluate the accuracy of each diameter in assessing sex. The results showed that males had statistically larger teeth than females for both maxillary and mandibular molars and both measurements (P < 0.05). The classification rate ranged from 75.7% to 85.5% for the original and cross-validated data. The most dimorphic teeth were the maxillary and mandibular second molars, providing 85.5% and 83.3% correct classification rates, respectively. The data generated from the present study suggest that cervical mesiodistal and buccolingual measurements of the molar teeth can be useful and reliable for sex estimation in Iranian archaeological populations.
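In the simplest univariate case, discriminant analysis of a single diameter with equal group variances reduces to a midpoint cutoff between the sex means. The sketch below uses simulated measurements with assumed means and spread, not the Hasanlu data.

```python
import numpy as np

# Toy univariate discriminant analysis for sex estimation from one cervical
# diameter; measurements are simulated (means/SDs assumed), not real data.
rng = np.random.default_rng(1)
males = rng.normal(10.8, 0.5, 52)      # cervical diameter (mm), assumed
females = rng.normal(10.0, 0.5, 28)

# With equal variances, Fisher's rule reduces to a midpoint cutoff.
cutoff = (males.mean() + females.mean()) / 2
pred_male = np.concatenate([males, females]) > cutoff
truth_male = np.concatenate([np.ones(52, bool), np.zeros(28, bool)])
accuracy = (pred_male == truth_male).mean()
print(f"cutoff = {cutoff:.2f} mm, correct classification = {accuracy:.1%}")
```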

Keywords: cervical measurements, Hasanlu, molars, sex estimation

Procedia PDF Downloads 327
5960 In Silico Study of Cell Surface Structures of Parabacteroides distasonis Involved in Its Maintenance Within the Gut Microbiota and Its Potential Pathogenicity

Authors: Jordan Chamarande, Lisiane Cunat, Corentine Alauzet, Catherine Cailliez-Grimal

Abstract:

The gut microbiota (GM) is now considered a new organ, mainly because of its specific biochemical interactions with its host. Although the mechanisms underlying host-microbiota interactions are not fully described, it is now well established that cell surface molecules and structures of GM members play a key role in this relationship. The study of surface structures of GM members is also fundamental because of their role in establishing species in the versatile and competitive environment of the digestive tract, and because they can act as virulence factors. Among these structures are capsular polysaccharides (CPS), fimbriae, pili and lipopolysaccharides (LPS), all well described for their central role in microorganism colonization and communication with the host epithelium. The health-promoting Parabacteroides distasonis, which is part of the core microbiome, has recently received a lot of attention, showing beneficial properties for its host and potential as a new biotherapeutic product. However, to the best of the authors’ knowledge, the cell surface molecules and structures of P. distasonis that allow its maintenance within the GM have not been identified. Moreover, although P. distasonis is widely recognized as an intestinal commensal species that benefits its host, it has also been described as an opportunistic pathogen. In this study, we report gene clusters potentially involved in the synthesis of the capsule, fimbriae-like and pili-like cell surface structures in 26 P. distasonis genomes, and we apply the new RfbA typing classification in order to better understand and characterize the beneficial/pathogenic behaviour of P. distasonis strains. In total, two different types of fimbriae, three types of pili, and up to 14 capsular polysaccharide loci were identified across the 26 genomes studied. Moreover, adding these data to the rfbA typing scheme modified its outcome, rearranging the rfbA genes and adding a fifth group to the classification.
In conclusion, this strain variability in external proteinaceous structures could explain the inter-strain differences previously observed in P. distasonis adhesion capacities and its potential pathogenicity.

Keywords: gut microbiota, Parabacteroides distasonis, capsular polysaccharide, fimbriae, pilus, O-antigen, pathogenicity, probiotic, comparative genomics

Procedia PDF Downloads 98
5959 Classification of Sequential Sports Using Automata Theory

Authors: Aniket Alam, Sravya Gurram

Abstract:

This paper proposes a categorization of sports based on the systems of rules that each sport must adhere to. We focus on these systems of rules to examine how a winner is produced in different sports. The rules of a sport dictate its game play and the direction it takes. We propose breaking the game play down into events. At this junction, we observe two kinds of events that constitute the game play of a sport: those that follow sequential logic and those that do not. Our focus is confined to sports composed of sequential events. To examine these events further and understand how a winner emerges, we use the finite-state automaton from the theory of computation (automata theory). We show how sequential sports can be represented as finite state machines, which we depict as state diagrams. We examine these state diagrams to observe how a team/player reaches the final states of the sport, with a special focus on the final state that determines the winner. This exercise was carried out for the following sports: hurdles, track, shot put, long jump, bowling, badminton, Pac-Man and weightlifting (snatch). Based on our observations of how this winning final state is achieved, we propose a categorization of sports.
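The finite-state-machine view of a sequential sport can be sketched briefly. The example below models a simplified rally-point badminton game to 21 points (deuce/extension rules deliberately omitted); states are score pairs, transitions are rally outcomes, and the accepting states are those where one player reaches 21.

```python
# Minimal FSM sketch of a sequential sport: simplified badminton to 21.
def play(rally_winners):
    """Run the FSM over a sequence of rally outcomes ('A' or 'B')."""
    state = (0, 0)                       # start state: score 0-0
    for w in rally_winners:
        a, b = state
        state = (a + 1, b) if w == "A" else (a, b + 1)
        if state[0] == 21:
            return "A", state            # accepting state: A wins
        if state[1] == 21:
            return "B", state            # accepting state: B wins
    return None, state                   # game unfinished: non-final state

winner, final = play(["A"] * 21)
print(winner, final)
```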

Keywords: sport classification, sport modelling, ontology, automata theory

Procedia PDF Downloads 116
5958 Machine Learning Predictive Models for Hydroponic Systems: A Case Study of the Nutrient Film Technique and Deep Flow Technique

Authors: Kritiyaporn Kunsook

Abstract:

Machine learning algorithms (MLAs) such as artificial neural networks (ANNs), decision trees, support vector machines (SVMs), Naïve Bayes, and a voting ensemble classifier are powerful data-driven methods that remain relatively underused for classifying hydroponic systems and have therefore not been thoroughly compared in this field. The performance of this series of MLAs in prospectively modeling hydroponic system type is compared based on the accuracy of each model. Classification covers test samples from vegetables grown with the nutrient film technique (NFT) and the deep flow technique (DFT). The features are characteristics of the vegetables: harvesting height, width, temperature, required light, and color. The results indicate classification accuracies of 98% for the ANN, 98% for the decision tree, 97.33% for the SVM, 96.67% for Naïve Bayes, and 98.96% for the voting ensemble classifier.
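The ensemble-by-voting idea used above can be sketched in a few lines: each base classifier votes a class label per sample and the majority wins. The base predictions below are stand-in values, not outputs of the study's trained ANN/tree/SVM/Naïve Bayes models.

```python
from collections import Counter

# Majority voting over base-classifier predictions for each sample.
def vote(predictions):
    """Return the most common label among one sample's predictions."""
    return Counter(predictions).most_common(1)[0][0]

# Per-sample labels from five hypothetical base classifiers.
per_sample = [
    ["NFT", "NFT", "DFT", "NFT", "NFT"],
    ["DFT", "DFT", "DFT", "NFT", "DFT"],
    ["NFT", "DFT", "NFT", "NFT", "DFT"],
]
ensemble = [vote(p) for p in per_sample]
print(ensemble)   # majority label for each sample
```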

Keywords: artificial neural networks, decision tree, support vector machines, naïve Bayes, ensemble classifier by voting

Procedia PDF Downloads 369
5957 Regional Analysis of Freight Movement by Vehicle Classification

Authors: Katerina Koliou, Scott Parr, Evangelos Kaisar

Abstract:

The surface transportation of freight is particularly vulnerable to storm and hurricane disasters, yet it is the primary transportation mode for delivering medical supplies, fuel, water, and other essential goods. To better plan for commercial vehicles during an evacuation, it is necessary to understand how these vehicles travel during an evacuation and whether this travel differs from that of the general public. While the literature on auto-based evacuations is extensive, freight travel has received little consideration. The goal of this research was to investigate the movement of vehicles by classification, with an emphasis on freight, during two major evacuation events: hurricanes Irma (2017) and Michael (2018). The methodology was divided into three phases: data collection and management, spatial analysis, and temporal comparison. The first phase obtained Florida's statewide continuous-count station traffic volumes by vehicle classification for both 2017 and 2018 and processed the data into a manageable format; the volumes were then compared between years to identify locations where traffic moved differently during the evacuations and the days on which traffic differed significantly. The second phase used geographic information systems (GIS) to display where and when traffic varied across the state. The third and final phase was a quantitative investigation into which vehicle classifications were statistically different, and on which dates, statewide. This phase used a two-sample, two-tailed t-test to compare sensor volume by classification on similar days between years.
Overall, increases in freight movement between years prevented a more precise paired analysis. This research sought to identify where and when different classes of vehicles were traveling leading up to hurricane landfall and during post-storm reentry. Among the more significant findings, the results showed that commercial-use vehicles may have underutilized rest areas during the evacuation, or perhaps these rest areas were closed; this may suggest that truckers drive longer distances, and possibly longer hours, before hurricanes. Another significant finding was that changes in traffic patterns for commercial-use vehicles occurred earlier and lasted longer than those for personal-use vehicles, suggesting that commercial vehicles evacuate in a fashion different from personal-use vehicles. This paper may serve as a foundation for future research into commercial travel during evacuations and the additional factors that may influence freight movements.
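The two-sample, two-tailed t-test used in the third phase can be sketched as follows, here in Welch's form (which does not assume equal variances between years; the paper does not state which variant was used). The daily sensor volumes are made-up placeholders.

```python
import math

# Welch's two-sample t statistic comparing sensor volumes between years.
def welch_t(x, y):
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    vx = sum((v - mx) ** 2 for v in x) / (nx - 1)   # sample variances
    vy = sum((v - my) ** 2 for v in y) / (ny - 1)
    return (mx - my) / math.sqrt(vx / nx + vy / ny)

vol_2017 = [1040, 980, 1010, 995, 1060, 1030, 990]     # placeholder counts
vol_2018 = [1150, 1120, 1180, 1090, 1160, 1135, 1110]
t = welch_t(vol_2017, vol_2018)
# For large samples, |t| > 1.96 roughly corresponds to p < 0.05 (two-tailed).
print(f"t = {t:.2f}, significant at 5%: {abs(t) > 1.96}")
```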

Keywords: evacuation, freight, travel time

Procedia PDF Downloads 64
5956 Web-Based Cognitive Writing Instruction (WeCWI): A Hybrid e-Framework for Instructional Design

Authors: Boon Yih Mah

Abstract:

Web-based Cognitive Writing Instruction (WeCWI) is a hybrid e-framework that consolidates instructional design and language development for the development of web-based instruction (WBI). WeCWI divides instructional design into macro and micro perspectives. In the macro perspective, a 21st-century educator is encouraged to disseminate knowledge and share ideas with in-class and global learners. By leveraging technology, WeCWI aims to transform the educator into an aggregator, curator, publisher, social networker and, finally, a web-based instructor. Since the most notable contribution of integrating technology is serving as a tool for teaching as well as a stimulus for learning, WeCWI focuses on the use of contemporary web tools based on the multiple roles played by the 21st-century educator. The micro perspective draws attention to pedagogical approaches focusing on three main aspects: reading, discussion, and writing. With the effective use of pedagogical approaches, technology adds new dimensions and expands the bounds of learning capacity. Lastly, WeCWI also imparts the fundamental theoretical concepts for web-based instructors’ awareness, such as interactionism, the e-learning interactional-based model, computer-mediated communication (CMC), cognitive theories, and learning style models.

Keywords: web-based cognitive writing instruction, WeCWI, instructional design, e-framework, web-based instructor

Procedia PDF Downloads 435
5955 Critical Mathematics Education and School Education in India: A Study of the National Curriculum Framework 2022 for Foundational Stage

Authors: Eish Sharma

Abstract:

Literature on mathematics education suggests that democratic attitudes can be strengthened through the teaching and learning of mathematics. Furthermore, connections between critical education and mathematics education are observed in the light of critical pedagogy, locating critical mathematics education (CME) as the theoretical framework. Critical pedagogy applied to mathematics education is identified as one of the key themes subsumed under CME. Through the application of critical pedagogy in mathematics, unequal power relations and social injustice can be identified, analyzed, and challenged. The research question is: have educational policies in India viewed critical pedagogy applied to mathematics education (i.e., critical mathematics education) as a means of ensuring social justice as an educational aim? The National Curriculum Framework (NCF) 2005 upholds education for democracy and the role of mathematics education in facilitating it. More than this, NCF 2005 rests on a critical pedagogy framework and recommends that critical pedagogy be practiced in all dimensions of school education. NCF 2005 visualizes critical pedagogy for the social sciences as well as the sciences, stating that the science curriculum, including mathematics, must be used as an “instrument for achieving social change to reduce the divide based on economic class, gender, caste, religion, and the region”. Furthermore, the implementation of NCF 2005 led to a reform of the syllabus and textbooks in school mathematics at the national level, and critical pedagogy was applied to mathematics textbooks at the primary level. This intervention brought ethnomathematics and critical mathematics education into the school curriculum in India for the first time at the national level.
In October 2022, the Ministry of Education launched the National Curriculum Framework for the Foundational Stage (NCF-FS), developed in light of the National Education Policy 2020, for children in the three-to-eight-years age group. I want to find out whether critical pedagogy-based education and critical pedagogy-based mathematics education are carried forward in NCF 2022. To this end, an argument analysis of specific sections of the NCF 2022 document is carried out. Des Gasper suggests two tables. The first contains four columns, namely text component, comments on meanings, possible reformulation of the same text, and identified conclusions and assumptions (both stated and unstated); this table, for understanding the components and meanings of the text, is based on Scriven’s model for understanding the components and meanings of words in a text. The second table contains four columns: claim identified, given data, warrant, and stated qualifier/rebuttal. This table describes the structure of the argument and how well its components fit together, and is called the ‘George Table diagram based on the Toulmin-Bunn Model’.

Keywords: critical mathematics education, critical pedagogy, social justice, ethnomathematics

Procedia PDF Downloads 78
5954 A Normalized Non-Stationary Wavelet-Based Analysis Approach for a Computer-Assisted Classification of Laryngoscopic High-Speed Video Recordings

Authors: Mona K. Fehling, Jakob Unger, Dietmar J. Hecker, Bernhard Schick, Joerg Lohscheller

Abstract:

Voice disorders originate from disturbances of the vibration patterns of the two vocal folds located within the human larynx. Consequently, the visual examination of vocal fold vibrations is an integral part of the clinical diagnostic process. For an objective analysis of the vocal fold vibration patterns, the two-dimensional vocal fold dynamics are captured during sustained phonation using an endoscopic high-speed camera. In this work, we present an approach that allows a fully automatic analysis of the high-speed video data, including a computerized classification of healthy and pathological voices. The approach is based on a wavelet-based analysis of so-called phonovibrograms (PVG), which are extracted from the high-speed videos and comprise the entire two-dimensional vibration pattern of each vocal fold individually. Using a principal component analysis (PCA) strategy, a low-dimensional feature set is computed from each phonovibrogram. From the PCA space, clinically relevant measures can be derived that objectively quantify vibration abnormalities. In the first part of the work, it is shown that, using a machine learning approach, the derived measures are suitable for automatically distinguishing between healthy and pathological voices. Within the approach, the formation of the PCA space, and consequently the extracted quantitative measures, depend on the clinical data used to compute the principal components. Therefore, in the second part of the work, we propose a strategy to normalize the PCA space by registering it to a coordinate system using a set of synthetically generated vibration patterns. The results show that, owing to the normalization step, potential ambiguity of the parameter space can be eliminated. The normalization further allows a direct comparison of research results based on PCA spaces obtained from different clinical subjects.
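The PCA dimensionality-reduction step above can be sketched via the singular value decomposition. The data matrix here is random stand-in data, not wavelet coefficients of real phonovibrograms; only the mechanics of projecting onto the leading components are shown.

```python
import numpy as np

# Minimal PCA sketch: reduce each recording's feature vector to k components.
rng = np.random.default_rng(0)
X = rng.standard_normal((40, 200))        # 40 recordings x 200 features

Xc = X - X.mean(axis=0)                   # center each feature
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 3
scores = Xc @ Vt[:k].T                    # coordinates in the PCA space
explained = (s[:k] ** 2) / (s ** 2).sum() # variance ratio per component

print(scores.shape, explained.round(3))
```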

Keywords: wavelet-based analysis, multiscale product, normalization, computer-assisted classification, high-speed laryngoscopy, vocal fold analysis, phonovibrogram

Procedia PDF Downloads 263
5953 Reducing Support Structures in Design for Additive Manufacturing: A Neural Networks Approach

Authors: Olivia Borgue, Massimo Panarotto, Ola Isaksson

Abstract:

This article presents a neural networks-based strategy for reducing the need for support structures when designing for additive manufacturing (AM). Additive manufacturing is a relatively new and immature industrial technology, and the information needed to make confident decisions when designing for AM is limited. This lack of information especially impacts the early stages of engineering design; for instance, it is difficult to actively consider the support structures needed for manufacturing a part. This difficulty is related to the challenge of designing a product geometry that accounts for customer requirements, manufacturing constraints, and the minimization of support structures. The approach presented in this article proposes an automated geometry modification technique for reducing the use of support structures while designing for AM. The strategy starts with neural network-based shape recognition to achieve product classification, using an STL file of the product as input. Based on the classification, an automatic part geometry modification implemented in MATLAB© is applied. At the end of the process, the strategy presents different geometry modification alternatives depending on the type of product to be designed. The geometry alternatives are then evaluated using a QFD-like decision support tool.
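A concrete way to see why support structures matter at the geometry level is the standard overhang rule of thumb: facets that face downward more steeply than roughly 45° usually need support. The sketch below applies that rule to unit facet normals such as those stored in an STL mesh; it is a simple heuristic for illustration, not the paper's neural-network classifier, and STL parsing itself is omitted.

```python
import numpy as np

def support_fraction(normals, threshold_deg=45.0):
    """Fraction of mesh facets likely to need support structures.

    normals: (n, 3) unit facet normals from an STL mesh. A facet is flagged
    when it faces downward more steeply than the threshold overhang angle,
    a common AM rule of thumb (an assumption here, not the paper's method).
    """
    down = np.array([0.0, 0.0, -1.0])
    cos_angle = normals @ down                  # cosine of angle to "down"
    limit = np.cos(np.radians(threshold_deg))
    return float(np.mean(cos_angle > limit))

normals = np.array([[0, 0, -1.0],   # flat downward facet -> needs support
                    [0, 0, 1.0],    # upward-facing facet -> no support
                    [1, 0, 0.0]])   # vertical wall -> no support
print(support_fraction(normals))    # ~0.333
```

A geometry modification step like the one the article describes could then be framed as searching for reorientations or shape changes that drive this fraction down.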

Keywords: additive manufacturing, engineering design, geometry modification optimization, neural networks

Procedia PDF Downloads 249
5952 Patterns of TV Simultaneous Interpreting of Emotive Overtones in Trump’s Victory Speech from English into Arabic

Authors: Hanan Al-Jabri

Abstract:

Simultaneous interpreting is deemed by many scholars to be the most challenging mode of interpreting. The special constraints involved in this task, including time pressure, different linguistic systems, and stress, pose a great challenge to most interpreters. These constraints are likely to intensify when the interpreting task is done live on TV. The TV interpreter is exposed to a wide variety of audiences with different backgrounds and needs and is mostly asked to interpret high-profile events, which raises his or her stress levels and further complicates the task. Under these constraints, which require fast and efficient performance, the TV interpreters of four TV channels were asked to render Trump's victory speech into Arabic. They also had to deal with the burden of rendering the English emotive overtones employed by the speaker into a wholly different linguistic system. The current study aims to investigate the way TV interpreters working in the simultaneous mode handled this task; it explores and evaluates the TV interpreters' linguistic choices and whether the original emotive effect was maintained, upgraded, downgraded, or abandoned in their renditions. It also explores the difficulties and challenges that emerged during this process and might have influenced the interpreters' linguistic choices. To achieve these aims, the study analysed Trump's victory speech delivered on November 9, 2016, along with four Arabic simultaneous interpretations produced by four TV channels: Al-Jazeera, RT, CBC News, and France 24. The analysis relied on two frameworks: a macro framework and a micro framework. The former presents an overview of the wider context of the English speech as well as the speaker and his political background, to help understand the linguistic choices he made in the speech; the latter investigates the linguistic tools the speaker employed to stir people's emotions.
These tools were investigated based on Shamaa's (1978) classification of emotive meaning according to linguistic level: the phonological, morphological, syntactic, and semantic/lexical levels. At this level, the study also investigates the patterns of rendition detected in the Arabic deliveries. The results identified different rendition patterns in the Arabic deliveries, including parallel rendition, approximation, condensation, elaboration, transformation, expansion, generalisation, explicitation, paraphrase, and omission. The emerging patterns, as suggested by the analysis, were influenced by factors such as the speedy and continuous delivery of some stretches and highly dense segments, among others. The study aims to contribute to a better understanding of TV simultaneous interpreting between English and Arabic, as well as the practices of TV interpreters when rendering emotiveness, especially since little is known about interpreting practices in the field of TV, particularly between Arabic and English.

Keywords: emotive overtones, interpreting strategies, political speeches, TV interpreting

Procedia PDF Downloads 158
5951 Enterprise Security Architecture: Approaches and a Framework

Authors: Amir Mohtarami, Hadi Kandjani

Abstract:

The amount of business-critical information in enterprises is growing at an extraordinary rate, and the ability to catalog that information and properly protect it using traditional security mechanisms is not keeping pace. Alongside information technology (IT), information security needs a holistic view in the enterprise. In other words, a comprehensive architectural approach is required, focusing on the information itself: understanding what the data are, who owns them, and which business and regulatory policies should be applied to them. Enterprise architecture frameworks provide useful tools to grasp the different dimensions of IT in organizations. Usually this is done through layered views of the IT architecture, but security has not received the requisite attention in these frameworks. In this paper, after a brief look at enterprise architecture (EA), we discuss the issue of security in the overall enterprise IT architecture. Due to the increasing importance of security, a rigorous EA program in an enterprise should be able to consider security architecture as an integral part of its processes and give a visible roadmap and blueprint for this aim.

Keywords: enterprise architecture, architecture framework, security architecture, information systems

Procedia PDF Downloads 702
5950 Human Digital Twin for Personal Conversation Automation Using Supervised Machine Learning Approaches

Authors: Aya Salama

Abstract:

Digital Twin is an emerging research topic that has attracted researchers over the last decade. It is used in many fields, such as smart manufacturing and smart healthcare, because it saves time and money. It is usually related to other technologies such as data mining, artificial intelligence, and machine learning. However, the Human Digital Twin (HDT), specifically, is still a novel idea whose feasibility remains to be proven. HDT expands the idea of the Digital Twin to human beings, who are living beings and differ from inanimate physical entities. The goal of this research was to create a Human Digital Twin responsible for automating real-time human replies by simulating human behavior. For this reason, clustering, supervised classification, topic extraction, and sentiment analysis were studied in this paper. The feasibility of the HDT for generating personal replies on social messaging applications was demonstrated in this work. The overall accuracy of the proposed approach was 63%, a very promising result that can open the way for researchers to expand the idea of HDT. This was achieved by using Random Forest for clustering the question database and matching new questions. K-nearest neighbor was also applied for sentiment analysis.
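The k-nearest-neighbor step mentioned above can be illustrated compactly. The sketch below classifies a new message's sentiment by majority vote among its nearest labelled neighbors; the 2-D toy embeddings and labels are invented for illustration, since the paper's actual text features are not reproduced here.

```python
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x, k=3):
    """Label a new point by majority vote among its k nearest training points."""
    dists = np.linalg.norm(X_train - x, axis=1)   # Euclidean distances
    nearest = np.argsort(dists)[:k]               # indices of k closest points
    return Counter(y_train[i] for i in nearest).most_common(1)[0][0]

# Toy 2-D sentiment embeddings (hypothetical, not from the paper's data)
X = np.array([[0.9, 0.1], [0.8, 0.2], [0.1, 0.9], [0.2, 0.8]])
y = ["positive", "positive", "negative", "negative"]
print(knn_predict(X, y, np.array([0.85, 0.15])))  # positive
```

In an HDT pipeline, the same vote could run over embeddings of past messages so that a generated reply matches the sentiment of similar earlier exchanges.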

Keywords: human digital twin, sentiment analysis, topic extraction, supervised machine learning, unsupervised machine learning, classification, clustering

Procedia PDF Downloads 84
5949 Linguistic Features for Sentence Difficulty Prediction in Aspect-Based Sentiment Analysis

Authors: Adrian-Gabriel Chifu, Sebastien Fournier

Abstract:

One of the challenges of natural language understanding is dealing with the subjectivity of sentences, which may express opinions and emotions that add layers of complexity and nuance. Sentiment analysis is a field that aims to extract and analyze these subjective elements from text, and it can be applied at different levels of granularity, such as document, paragraph, sentence, or aspect. Aspect-based sentiment analysis is a well-studied topic with many available datasets and models. However, there is no clear definition of what makes a sentence difficult for aspect-based sentiment analysis. In this paper, we explore this question by conducting an experiment with three datasets: "Laptops", "Restaurants", and "MTSC" (Multi-Target-dependent Sentiment Classification), as well as a merged version of the three. We study the impact of domain diversity and syntactic diversity on difficulty. We use a combination of classifiers to identify the most difficult sentences and analyze their characteristics. We employ two ways of defining sentence difficulty. The first is binary and labels a sentence as difficult if the classifiers fail to correctly predict its sentiment polarity. The second is a six-level scale based on how many of the top five best-performing classifiers can correctly predict the sentiment polarity. We also define nine linguistic features that, combined, aim to estimate difficulty at the sentence level.
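The six-level scale described above reduces to counting agreements among the top five classifiers. A minimal sketch follows; the direction of the mapping (0 correct = hardest level 5, 5 correct = easiest level 0) is an assumption, since the abstract only states that the scale is based on how many classifiers predict correctly.

```python
def difficulty_level(predictions, true_label):
    """Six-level sentence difficulty: 5 minus the number of the top five
    classifiers that predicted the correct sentiment polarity.

    predictions: the five classifiers' polarity outputs for one sentence.
    """
    correct = sum(1 for p in predictions if p == true_label)
    return 5 - correct  # 5 = all classifiers wrong, 0 = all correct

preds = ["positive", "negative", "positive", "positive", "neutral"]
print(difficulty_level(preds, "positive"))  # 2: three of five were correct
```

The binary definition is the special case that flags a sentence as difficult whenever this count indicates the classifiers failed.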

Keywords: sentiment analysis, difficulty, classification, machine learning

Procedia PDF Downloads 78
5948 Autogenous Diabetic Retinopathy Censor for Ophthalmologists - AKSHI

Authors: Asiri Wijesinghe, N. D. Kodikara, Damitha Sandaruwan

Abstract:

Diabetic Retinopathy (DR) is a rapidly growing concern around the world; it results from impaired glucose metabolism that causes long-term damage to the human retina. It is one of the leading causes of visual impairment and blindness in adults. Retinal pathological changes can be recognized using ocular fundus images. In this research, we focus on developing an automated diagnosis system to detect DR anomalies: severity-level classification of DR patients (the non-proliferative diabetic retinopathy approach) and vessel tortuosity measurement for the assessment of vessel anomalies (the proliferative diabetic retinopathy approach). The severity classification method obtained good results in terms of precision, recall, F-measure, and accuracy (exceeding 94%) under all forms of cross-validation. The ROC (Receiver Operating Characteristic) curves also showed a high AUC (Area Under Curve) percentage (exceeding 95%). A user-level evaluation of severity capture achieved high accuracy (85%) and fairly good values for all evaluation measures. Vessel detection for tortuosity measurement also produced good results with respect to sensitivity (85%), specificity (89%), and accuracy (87%).
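Vessel tortuosity can be quantified in several ways; a standard distance-metric definition from retinal image analysis is the ratio of a vessel centreline's arc length to its chord length (1.0 for a straight vessel, larger for a twisted one). The abstract does not specify which measure the authors use, so the sketch below is illustrative only.

```python
import numpy as np

def tortuosity(points):
    """Arc-length over chord-length tortuosity of a vessel centreline.

    points: (n, 2) ordered centreline coordinates, e.g. from a segmented
    fundus image. Returns 1.0 for a perfectly straight vessel.
    """
    points = np.asarray(points, dtype=float)
    arc = np.sum(np.linalg.norm(np.diff(points, axis=0), axis=1))
    chord = np.linalg.norm(points[-1] - points[0])
    return arc / chord

straight = [(0, 0), (1, 0), (2, 0)]
bent = [(0, 0), (1, 1), (2, 0)]
print(tortuosity(straight))  # 1.0
print(tortuosity(bent))      # ~1.414
```

Thresholding such a measure over segmented vessels is one plausible route to the normal/abnormal vessel assessment the abstract describes.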

Keywords: fundus image, exudates, microaneurisms, hemorrhages, tortuosity, diabetic retinopathy, optic disc, fovea

Procedia PDF Downloads 339
5947 Breast Cancer Metastasis Detection and Localization through Transfer-Learning Convolutional Neural Network Classification Based on Convolutional Denoising Autoencoder Stack

Authors: Varun Agarwal

Abstract:

Introduction: With the advent of personalized medicine, histopathological review of whole slide images (WSIs) for cancer diagnosis presents an exceedingly time-consuming, complex task. Specifically, detecting metastatic regions in WSIs of sentinel lymph node biopsies necessitates a fully scanned, holistic evaluation of the image. Thus, digital pathology, low-level image manipulation algorithms, and machine learning provide significant advancements in improving the efficiency and accuracy of WSI analysis. Using Camelyon16 data, this paper proposes a deep learning pipeline to automate and improve breast cancer metastasis localization and WSI classification. Methodology: The model broadly follows five stages: region-of-interest detection, WSI partitioning into image tiles, convolutional neural network (CNN) image-segment classification, probabilistic mapping of tumor localizations, and further processing for whole-WSI classification. Transfer learning is applied to the task with the implementation of Inception-ResNetV2, an effective CNN classifier that uses residual connections to enhance feature representation, adding convolved outputs in the inception unit to the proceeding input data. Moreover, in order to augment the performance of the transfer learning CNN, a stack of convolutional denoising autoencoders (CDAE) is applied to produce embeddings that enrich image representation. Through a saliency-detection algorithm, visual training segments are generated, which are then processed through a denoising autoencoder (primarily consisting of convolutional, leaky rectified linear unit, and batch normalization layers) and subsequently a contrast-normalization function. A spatial pyramid pooling algorithm extracts the key features from the processed image, creating a viable feature map for the CNN that minimizes spatial resolution and noise.
Results and Conclusion: The simplified and effective architecture of the fine-tuned transfer learning Inception-ResNetV2 network, enhanced with the CDAE stack, yields state-of-the-art performance in WSI classification and tumor localization, achieving AUC scores of 0.947 and 0.753, respectively. The convolutional feature retention and compilation with the residual connections to inception units, synergized with the input denoising algorithm, enable the pipeline to serve as an effective, efficient tool in the histopathological review of WSIs.
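The tiling stage of such a pipeline is easy to make concrete. The sketch below partitions an image array into non-overlapping square tiles; a plain NumPy array stands in for the gigapixel slide, which in practice would be read at a chosen magnification with a library such as OpenSlide. Tile size and the toy dimensions are illustrative assumptions.

```python
import numpy as np

def tile_image(img, tile=64):
    """Partition an image array into non-overlapping square tiles, mirroring
    the WSI-partitioning stage of the pipeline. Edge remainders smaller than
    one tile are dropped for simplicity.
    """
    h, w = img.shape[:2]
    return [img[r:r + tile, c:c + tile]
            for r in range(0, h - tile + 1, tile)
            for c in range(0, w - tile + 1, tile)]

slide = np.zeros((256, 192, 3), dtype=np.uint8)  # toy stand-in for a WSI region
tiles = tile_image(slide, tile=64)
print(len(tiles), tiles[0].shape)  # 12 (64, 64, 3)
```

Each tile would then be classified by the CNN, and the per-tile probabilities reassembled on the slide grid to form the tumor probability map described above.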

Keywords: breast cancer, convolutional neural networks, metastasis mapping, whole slide images

Procedia PDF Downloads 126
5946 Food Consumer Protection in Moroccan Legal System: A Systematic Review

Authors: Bouchaib Gazzaz, Mounir Mehdi

Abstract:

In order to ensure consumer food protection, the food industry has a legal obligation to provide food products that comply with the requirements of the legislation in force. National regulations in this area occupy an important place in the food control system in terms of consumer protection. This article discusses the legal and regulatory framework of food safety and consumer protection in Moroccan law. We used a doctrinal research approach, analyzing judicial, normative, and bibliographic legal sources. As a result, we were able to present the basic principles of consumer food protection by showing to what extent food safety law provides effective consumer protection in Morocco. We conclude that the food law reform has had an impact, in terms of consumers' legal protection, on the concept of food safety.

Keywords: food safety, Morocco, consumer protection, framework, food law

Procedia PDF Downloads 229
5945 Legal Interpretation of the Transplanted Law

Authors: Wahyu Kurniawan

Abstract:

Indonesia has developed its legal system radically since 1999. Several laws have been established, mostly as the result of transplantation. The laws were made general, but legal problems have kept growing. In legal enforcement, judges have the authority to interpret the laws. Authority and freedom are sources of corruption in the Indonesian courts. Therefore, a conceptual framework should be built for interpreting transplanted laws as the legal basis for deciding cases. This article describes legal development based on the interpretation of transplanted law in Indonesia, using the decisions of the Indonesian Supervisory Commission for Business Competition (KPPU) between 2000 and 2010 as the object of the research. The study used the theory of law as a system and theories of legal interpretation, especially static and dynamic interpretation. The research showed that the KPPU interpreted the concepts in the Competition Law using both static and dynamic interpretation. Static interpretation was used to interpret legal concepts on two grounds: the minutes of meetings during the law-making process and the definitions already recognized in the Indonesian legal system. Dynamic interpretation was used when the KPPU developed the definition of a legal concept. The general purpose of the law and the theories underlying it formed the conceptual framework for dynamic interpretation. There are two recommendations in this article. First, judges should interpret laws based on the correct conceptual framework. Second, the technique of interpreting laws can serve as a method of controlling judges.

Keywords: legal interpretation, legal transplant, competition law, KPPU

Procedia PDF Downloads 338
5944 A Cloud-Based Spectrum Database Approach for Licensed Shared Spectrum Access

Authors: Hazem Abd El Megeed, Mohamed El-Refaay, Norhan Magdi Osman

Abstract:

Spectrum scarcity is a challenging obstacle in wireless communication systems. It hinders the introduction of innovative wireless services and technologies that require larger bandwidth compared to legacy technologies. In addition, the current worldwide allocation of radio spectrum bands is already congested and cannot afford additional squeezing or optimization to accommodate new wireless technologies. This challenge results from the accumulated contributions of different factors, which will be discussed later in this paper. One of these factors is the radio spectrum allocation policy governed by national regulatory authorities today. The framework for this policy allocates a specified portion of the radio spectrum to a particular wireless service provider on an exclusive-utilization basis. This allocation is executed according to technical specifications determined by the standards bodies of each Radio Access Technology (RAT). Dynamic spectrum access is a framework for the flexible utilization of radio spectrum resources. In this framework, there is no exclusive allocation of radio spectrum, and even public safety agencies can share their spectrum bands according to a governing policy and service-level agreements. In this paper, we explore different methods for accessing the spectrum dynamically and their associated implementation challenges.

Keywords: licensed shared access, cognitive radio, spectrum sharing, spectrum congestion, dynamic spectrum access, spectrum database, spectrum trading, reconfigurable radio systems, opportunistic spectrum allocation (OSA)

Procedia PDF Downloads 423
5943 Positioning a Southern Inclusive Framework Embedded in the Social Model of Disability Theory Contextualised for Guyana

Authors: Lidon Lashley

Abstract:

This paper presents how the social model of disability can be used to reshape inclusive education practices in Guyana. Inclusive education in Guyana is metamorphosing but is still firmly held in the tenets of the medical model of disability, which influences the experiences of children with special education needs and/or disabilities (SEN/D). An ethnographic approach to data gathering was employed in this study. Qualitative data were gathered from the voices of children with and without SEN/D, as well as their mainstream teachers, to present the interplay of discourses and subjectivities in the situation. The data were analyzed using Adele Clarke's postmodern approach to grounded theory analysis, called situational analysis. The data suggest that it is possible, though challenging, to fully contextualize and adopt Loreman's synthesis and Booth and Ainscow's Index in the two mainstream schools studied. In addition, the data paved the way for the presentation of a social model framework specific to Guyana, called the 'Southern Inclusive Education Framework for Guyana', and its support tool, 'The Inclusive Checker', created for Southern mainstream primary classrooms.

Keywords: social model of disability, medical model of disability, subjectivities, metamorphosis, special education needs, postcolonial Guyana, inclusion, culture, mainstream primary schools, Loreman's synthesis, Booth and Ainscow's index

Procedia PDF Downloads 157
5942 Assessing the Applicability of Kevin Lynch’s Framework of ‘the Image of the City’ in the Case of a Walled City of Jaipur

Authors: Jay Patel

Abstract:

This research investigates the 'image' of the city and asks whether this 'image' holds any significance that can be changed. Kevin Lynch, in the book 'The Image of the City', develops a framework that breaks down the city's image into five physical elements. These elements (paths, edges, nodes, districts, and landmarks), according to Lynch, assess the legibility of urbanscapes; they emerged from his perception-based study of three different cities (Jersey City, Los Angeles, and Boston) in the USA. The aim of this research is to investigate whether Lynch's framework can be applied within an Indian context. If so, can the imageability of Indian cities be depicted through Lynch's physical elements, or does the framework demand an extension by either adding or subtracting a physical attribute? For this research project, the walled city of Jaipur was selected, as it is considered one of the most forward-looking planned cities in India. Another significant reason for choosing Jaipur is that it is a historically planned city with strong historical, touristic, and local importance, allowing an opportunity to understand the application of Lynch's elements to the city's image. In other words, it provides an opportunity to examine how the disadvantages of a city's implicit programme (its relics of bygone eras) can be converted into assets by improving the imageability of the city. To obtain data, a structured, semi-open-ended interview method was chosen. This method was selected explicitly to gain qualitative data from users rather than collecting quantitative data from closed-ended questions. This allowed an in-depth understanding of the applicability of Kevin Lynch's framework while assessing what needs to be added.
The interviews conducted in Jaipur yielded varied inferences that differed from the expected outcomes, highlighting the need to extend Lynch's physical elements to capture the city's image. While analyzing the data, a few attributes were found to define the image of Jaipur. These were categorized into two groups: physical aspects (street and arcade entities, natural features, temples, and temporary/informal activities) and associational aspects (history, culture and tradition, aids to wayfinding, and intangible aspects).

Keywords: imageability, Kevin Lynch, people’s perception, assessment, associational aspects, physical aspects

Procedia PDF Downloads 195
5941 Data Mining of Students' Performance Using Artificial Neural Network: Turkish Students as a Case Study

Authors: Samuel Nii Tackie, Oyebade K. Oyedotun, Ebenezer O. Olaniyi, Adnan Khashman

Abstract:

Artificial neural networks have been used in different fields of artificial intelligence, and more specifically in machine learning. Although other machine learning options are feasible in most situations, the ease with which neural networks lend themselves to different problems, including pattern recognition, image compression, classification, computer vision, and regression, has earned them a remarkable place in the machine learning field. This research exploits neural networks as a data mining tool for predicting the number of times a student repeats a course, considering some attributes relating to the course itself, the teacher, and the particular student. Neural networks were used in this work to map the relationship between attributes related to students' course assessment and the number of times a student will possibly repeat a course before passing. It is hoped that the ability to predict students' performance from such complex relationships can help facilitate the fine-tuning of academic systems and policies implemented in learning environments. To validate the power of neural networks in data mining, a Turkish students' performance database was used; feedforward and radial basis function networks were trained for this task, and the performance of these networks was evaluated in terms of achieved recognition rates and training time.
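The feedforward mapping described above can be sketched as a single forward pass through a one-hidden-layer network that takes student/course/teacher attributes and outputs a predicted repeat count. The weights and the four-attribute input below are illustrative stand-ins; the study's trained networks and the Turkish students' database are not reproduced.

```python
import numpy as np

def mlp_forward(x, W1, b1, W2, b2):
    """Forward pass of a single-hidden-layer feedforward network mapping
    course/teacher/student attributes to a predicted repeat count."""
    h = np.tanh(x @ W1 + b1)     # hidden layer with tanh activation
    return float(h @ W2 + b2)    # linear output: predicted number of repeats

rng = np.random.default_rng(1)
x = rng.normal(size=4)                          # e.g. 4 attributes for one student
W1, b1 = rng.normal(size=(4, 8)), np.zeros(8)   # untrained, illustrative weights
W2, b2 = rng.normal(size=8), 0.0
print(mlp_forward(x, W1, b1, W2, b2))
```

Training would fit W1, b1, W2, b2 against the recorded repeat counts (e.g. by backpropagation); a radial basis function network, the study's other model, would replace the tanh hidden layer with Gaussian kernels centred on prototype students.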

Keywords: artificial neural network, data mining, classification, students’ evaluation

Procedia PDF Downloads 610