Search results for: PROWESS database

1324 Choosing an Optimal Epsilon for Differentially Private Arrhythmia Analysis

Authors: Arin Ghazarian, Cyril Rakovski

Abstract:

Differential privacy has become the leading technique for protecting the privacy of individuals in a database while still allowing useful analysis to be performed and the results to be shared. It puts a guarantee on the amount of privacy loss in the worst-case scenario. Differential privacy is not a toggle between full privacy and zero privacy; it controls the tradeoff between the accuracy of the results and the privacy loss using a single key parameter called epsilon.
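
As a concrete illustration of the accuracy-privacy tradeoff governed by epsilon, the sketch below releases a differentially private mean using the standard Laplace mechanism; the heart-rate values and clamping bounds are hypothetical and are not taken from the paper.

```python
import numpy as np

def laplace_mean(values, lower, upper, epsilon):
    """Release an epsilon-differentially private mean of `values`.

    Clamping to [lower, upper] bounds each record's influence, so the
    sensitivity of the mean is (upper - lower) / n; Laplace noise with
    scale sensitivity / epsilon then gives epsilon-DP.
    """
    values = np.clip(values, lower, upper)
    sensitivity = (upper - lower) / len(values)
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return values.mean() + noise

# Smaller epsilon -> more noise -> stronger privacy, lower accuracy.
heart_rates = np.random.normal(75, 12, size=1000)   # hypothetical ECG-derived data
for eps in (0.1, 1.0, 10.0):
    print(eps, laplace_mean(heart_rates, 40, 180, eps))
```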

Keywords: arrhythmia, cardiology, differential privacy, ECG, epsilon, medical data, privacy preserving analytics, statistical databases

Procedia PDF Downloads 150
1323 A Prototype of an Information and Communication Technology Based Intervention Tool for Children with Dyslexia

Authors: Rajlakshmi Guha, Sajjad Ansari, Shazia Nasreen, Hirak Banerjee, Jiaul Paik

Abstract:

Dyslexia is a neurocognitive disorder affecting around fifteen percent of the Indian population. The symptoms include difficulty in reading the alphabet, words, and sentences; these difficulties can occur at the phonemic or recognition level and may further affect lexical structures. Therapeutic intervention of dyslexic children after assessment is generally done by special educators and psychologists through one-on-one interaction. Considering the large number of children affected and the scarcity of experts, access to care is limited in India. Moreover, the unavailability of resources and of timely communication with caregivers adds to the problem of proper intervention. With the development of educational technology and its use in India, access to information and care has improved in such a large and diverse country. In this context, this paper proposes an ICT-enabled home-based intervention program for dyslexic children which would support the child and provide an interactive interface between experts, parents, and students. The paper discusses the details of the database design and system layout of the program. It also highlights the development of the different technical aids required to build personalized Android applications for the Indian dyslexic population. These technical aids include speech database creation for children, an automatic speech recognition system, serious game development, and color-coded fonts. The paper also describes the games developed to assist the dyslexic child in cognitive training, primarily for attention, working memory, and spatial reasoning. In addition, it discusses the specific elements of the interactive intervention tool that make it effective for home-based intervention of dyslexia.

Keywords: Android applications, cognitive training, dyslexia, intervention

Procedia PDF Downloads 289
1322 An Assessment of Drainage Network System in Nigeria Urban Areas using Geographical Information Systems: A Case Study of Bida, Niger State

Authors: Yusuf Hussaini Atulukwu, Daramola Japheth, Tabitit S. Tabiti, Daramola Elizabeth Lara

Abstract:

Poorly constructed, and in some cases non-existent, drainage facilities in the township have resulted in incessant flooding in parts of the community, posing a threat to life, property, and the environment. The research seeks to address this issue by showing the spatial distribution of the drainage network in the Bida urban area using Geographic Information System (GIS) techniques. Relevant features were extracted from an existing base map of Bida using on-screen digitization, and x, y, z data of existing drainages were acquired using a handheld Global Positioning System (GPS) receiver. These data were uploaded into ArcGIS 9.2 software and stored in a relational database structure that was used to produce the spatial drainage network data of the township. The result revealed that about 40% of the drainages are blocked with sand and refuse, and about 35% are water-logged as a result of buildings constructed across erosion channels, dilapidated bridges, and the lack of drainage along major roads. The study thus concluded that the drainage network systems in the Bida community are not in good working condition and that urgent measures must be initiated in order to avoid future disasters, especially with the rainy season setting in. Based on the above findings, the study therefore recommends that people within the locality avoid dumping municipal waste within the drainage paths, and that sand-blocked or weed-blocked drains be cleared by the authorities concerned. In the same vein, the authorities should ensure that drainage construction contracts are awarded to professionals and that all natural drainage channels caused by erosion are addressed to avoid future disasters.

Keywords: drainage network, spatial, digitization, relational database, waste

Procedia PDF Downloads 330
1321 A Quality Index Optimization Method for Non-Invasive Fetal ECG Extraction

Authors: Lucia Billeci, Gennaro Tartarisco, Maurizio Varanini

Abstract:

Fetal cardiac monitoring by fetal electrocardiogram (fECG) can provide significant clinical information about the health condition of the fetus. Despite this potential, the use of fECG in clinical practice has so far been quite limited due to the difficulty of measuring it. The recovery of the fECG from signals acquired non-invasively by electrodes placed on the maternal abdomen is a challenging task because abdominal signals are a mixture of several components and the fetal one is very weak. This paper presents an approach for fECG extraction from abdominal maternal recordings which exploits the pseudo-periodicity of the fetal ECG. It consists of devising a quality index (fQI) for the fECG and of finding the linear combination of preprocessed abdominal signals that maximizes this fQI (quality index optimization, QIO). It aims at improving the performance of the most commonly adopted methods for fECG extraction, which are usually based on estimating and cancelling the maternal ECG (mECG). The procedure for fECG extraction and fetal QRS (fQRS) detection is completely unsupervised and based on the following steps: signal pre-processing; mECG extraction and maternal QRS detection; mECG component approximation and cancellation by weighted principal component analysis; fECG extraction by fQI maximization and fetal QRS detection. The proposed method was compared with our previously developed procedure, which obtained the highest score at the PhysioNet/Computing in Cardiology Challenge 2013. That procedure was based on removing from the abdominal signals the mECG estimated by principal component analysis (PCA) and applying independent component analysis (ICA) to the residual signals. Both methods were developed and tuned using 69 one-minute-long abdominal measurements with fetal QRS annotations from dataset A of the PhysioNet/Computing in Cardiology Challenge 2013. The QIO-based and the ICA-based methods were compared by analyzing two databases of abdominal maternal ECG available on the PhysioNet site. The first is the Abdominal and Direct Fetal Electrocardiogram Database (ADdb), which contains fetal QRS annotations and thus allows a quantitative performance comparison; the second is the Non-Invasive Fetal Electrocardiogram Database (NIdb), which does not contain fetal QRS annotations, so that the comparison between the two methods can only be qualitative. In particular, the comparison on the NIdb was performed by defining an index of quality for the fetal RR series. On the annotated ADdb database, the QIO method provided the performance indexes Sens = 0.9988, PPA = 0.9991, F1 = 0.9989, outperforming the ICA-based one, which provided Sens = 0.9966, PPA = 0.9972, F1 = 0.9969. On the NIdb, the index of quality was higher for the QIO-based method than for the ICA-based one in 35 out of 55 records. The QIO-based method gave very high performance with both databases. The results of this study support the application of the algorithm in a fully unsupervised way for implementation in wearable devices for self-monitoring of fetal health.
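
The core QIO step, finding channel weights that maximize a periodicity-based quality index, can be sketched as follows. This is not the authors' implementation: the quality index here is simply the strongest autocorrelation peak in a plausible fetal heart-rate lag range, and the sampling rate, heart-rate bounds, and optimizer are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def fqi(signal, fs, min_bpm=110, max_bpm=180):
    """Crude pseudo-periodicity quality index: the strongest normalized
    autocorrelation peak at lags compatible with a fetal heart rate."""
    x = signal - signal.mean()
    ac = np.correlate(x, x, mode="full")[len(x) - 1:]
    ac /= ac[0] + 1e-12
    lo = int(fs * 60.0 / max_bpm)
    hi = int(fs * 60.0 / min_bpm)
    return ac[lo:hi].max()

def extract_fecg(abdominal, fs):
    """Find channel weights w maximizing fQI of w @ abdominal (the QIO step).

    `abdominal` is an (n_channels, n_samples) array of preprocessed signals
    with the maternal ECG already cancelled, as in the pipeline above."""
    n_ch = abdominal.shape[0]
    neg_fqi = lambda w: -fqi(w @ abdominal, fs)
    res = minimize(neg_fqi, x0=np.ones(n_ch) / n_ch, method="Nelder-Mead")
    return res.x @ abdominal, res.x
```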

Keywords: fetal electrocardiography, fetal QRS detection, independent component analysis (ICA), optimization, wearable

Procedia PDF Downloads 278
1320 Ground Motion Modeling Using the Least Absolute Shrinkage and Selection Operator

Authors: Yildiz Stella Dak, Jale Tezcan

Abstract:

Ground motion models that relate a strong motion parameter of interest to a set of predictive seismological variables describing the earthquake source, the propagation path of the seismic wave, and the local site conditions constitute a critical component of seismic hazard analyses. When a sufficient number of strong motion records are available, ground motion relations are developed using statistical analysis of the recorded ground motion data. In regions lacking a sufficient number of recordings, a synthetic database is developed using stochastic, theoretical or hybrid approaches. Regardless of how the database was developed, ground motion relations are developed using regression analysis. Development of a ground motion relation is a challenging process which inevitably requires the modeler to make subjective decisions regarding the inclusion criteria for the recordings, the functional form of the model, and the set of seismological variables to be included in the model. Because these decisions are critically important to the validity and applicability of the model, there is continuing interest in procedures that facilitate the development of ground motion models. This paper proposes the use of the Least Absolute Shrinkage and Selection Operator (LASSO) in selecting the set of predictive seismological variables to be used in developing a ground motion relation. The LASSO can be described as a penalized regression technique with a built-in capability for variable selection. Similar to ridge regression, the LASSO is based on the idea of shrinking the regression coefficients to reduce the variance of the model. Unlike ridge regression, where the coefficients are shrunk but never set equal to zero, the LASSO sets some of the coefficients exactly to zero, effectively performing variable selection. Given a set of candidate input variables and the output variable of interest, LASSO allows ranking the input variables in terms of their relative importance, thereby facilitating the selection of the set of variables to be included in the model. Because the risk of overfitting increases as the ratio of the number of predictors to the number of recordings increases, selection of a compact set of variables is important in cases where a small number of recordings is available. In addition, identification of a small set of variables can improve the interpretability of the resulting model, especially when there is a large number of candidate predictors. A practical application of the proposed approach is presented, using more than 600 recordings from the Next Generation Attenuation (NGA) database, where the effect of a set of seismological predictors on the 5% damped maximum direction spectral acceleration is investigated. The candidate predictors considered are magnitude, Rrup, and Vs30. Using LASSO, the relative importance of the candidate predictors has been ranked. Regression models with increasing levels of complexity were constructed using the one, two, three, and four best predictors, and the models' ability to explain the observed variance in the target variable has been compared. The bias-variance trade-off in the context of model selection is discussed.
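
A minimal sketch of the LASSO-based ranking step, using scikit-learn on synthetic data standing in for the NGA records; the coefficients, data generation, and column names are illustrative assumptions, not values from the paper.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LassoCV
from sklearn.preprocessing import StandardScaler

# Hypothetical ground-motion table: magnitude, rupture distance and Vs30
# as predictors, log spectral acceleration as the target.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "Mw":   rng.uniform(4.5, 7.5, 600),
    "Rrup": rng.uniform(5, 200, 600),
    "Vs30": rng.uniform(180, 900, 600),
})
df["lnSa"] = 1.2 * df.Mw - 0.01 * df.Rrup - 0.002 * df.Vs30 + rng.normal(0, 0.5, 600)

# Standardizing puts all predictors on the same scale so the shrunk
# coefficients are directly comparable.
X = StandardScaler().fit_transform(df[["Mw", "Rrup", "Vs30"]])
lasso = LassoCV(cv=5).fit(X, df["lnSa"])

# Coefficients shrunk exactly to zero drop out of the model; the magnitude
# of the surviving coefficients ranks predictor importance.
for name, coef in zip(["Mw", "Rrup", "Vs30"], lasso.coef_):
    print(f"{name:5s} {coef:+.3f}")
```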

Keywords: ground motion modeling, least absolute shrinkage and selection operator, penalized regression, variable selection

Procedia PDF Downloads 328
1319 A Web Service Based Sensor Data Management System

Authors: Rose A. Yemson, Ping Jiang, Oyedeji L. Inumoh

Abstract:

The deployment of wireless sensor networks has increased rapidly; however, the increased capacity and diversity of sensors, with applications ranging from biological and environmental to military, generate tremendous volumes of data, and more attention has been placed on distributed sensing than on how to manage, analyze, retrieve and understand the data generated. This makes it difficult to process live sensor data and to run concurrent control and updates, because sensor data are heavyweight, complex, and slow to handle. This work will focus on developing a web service platform for automatic detection of sensors, acquisition of sensor data, storage of sensor data in a database, and processing of sensor data using reconfigurable software components. This work will also create a web service based sensor data management system to monitor the physical movement of an individual wearing wireless network sensor technology (SunSPOT). The sensor will detect the movement of that individual by sensing the acceleration along the X, Y and Z axes and will then send the sensed readings to a database interfaced with an internet platform. The collected data will be used to determine the posture of the person, such as standing, sitting or lying down. The system is designed using the Unified Modeling Language (UML) and implemented using Java, JavaScript, HTML and MySQL. This system allows an individual to be monitored closely in real time and their physical activity details to be obtained without being physically present for in-situ measurement, which enables remote monitoring instead of time-consuming in-person checks. These details can help in evaluating an individual's physical activity and in generating feedback on medication. It can also help in keeping track of any mandatory physical activities required to be done by the individual. These evaluations and feedback can help in maintaining a better health status of the individual and providing improved health care.
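
A rough idea of how a single three-axis acceleration sample could be mapped to a posture class; the axis orientation, the thresholds, and the coarse grouping of standing and sitting are purely illustrative assumptions and not the authors' classifier.

```python
import math

def classify_posture(ax, ay, az, motion_tol=0.4, upright_thresh=0.7):
    """Rough posture estimate from one trunk-worn accelerometer sample (in g).

    Assumes the Y axis points head-to-toe when upright; all thresholds
    are placeholders chosen for illustration only.
    """
    magnitude = math.sqrt(ax * ax + ay * ay + az * az)
    if abs(magnitude - 1.0) > motion_tol:      # far from 1 g: the wearer is moving
        return "moving"
    if abs(ay) > upright_thresh:               # gravity mostly along the body axis
        return "upright (standing or sitting)"
    return "lying down"

print(classify_posture(0.05, 0.98, 0.10))      # -> upright (standing or sitting)
print(classify_posture(0.02, 0.05, 1.01))      # -> lying down
```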

Keywords: HTML, Java, JavaScript, MySQL, SunSPOT, UML, web-based, wireless network sensor

Procedia PDF Downloads 210
1318 Speech Emotion Recognition: A DNN and LSTM Comparison in Single and Multiple Feature Application

Authors: Thiago Spilborghs Bueno Meyer, Plinio Thomaz Aquino Junior

Abstract:

Through speech, which privileges the functional and interactive nature of the text, it is possible to ascertain the spatiotemporal circumstances, the conditions of production and reception of the discourse, and explicit purposes such as informing, explaining, convincing, etc. These conditions allow bringing human-robot interaction closer to interaction between humans, making it natural and sensitive to information. However, it is not enough to understand what is said; it is necessary to recognize emotions for the desired interaction. The validity of using neural networks for feature selection and emotion recognition was verified. For this purpose, the use of neural networks and the comparison of models, such as recurrent neural networks and deep neural networks, are proposed in order to classify emotions from speech signals and verify the quality of recognition. This is expected to enable the deployment of robots in a domestic environment, such as the HERA robot from the RoboFEI@Home team, which focuses on autonomous service robots for the domestic environment. Tests were performed using only the Mel-Frequency Cepstral Coefficients (MFCCs), as well as tests with several features: Delta-MFCC, spectral contrast, and the Mel spectrogram. To carry out the training, validation and testing of the neural networks, the eNTERFACE'05 database was used, which has 42 speakers from 14 different nationalities speaking English. The data in the chosen database are videos that were converted into audio for use in the neural networks. As a result, a classification accuracy of 51.969% was obtained when using the deep neural network, while the recurrent neural network achieved an accuracy of 44.09%. The results are more accurate when only the Mel-Frequency Cepstral Coefficients are used for classification with the deep neural network; in only one case is greater accuracy observed for the recurrent neural network, which occurs when several features are used with a batch size of 73 and 100 training epochs.
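
A compact sketch of the MFCC-plus-DNN branch of such an experiment, using librosa for features and a scikit-learn multilayer perceptron as a stand-in for the deep network; the file paths, labels, and network size are assumptions, and the eNTERFACE'05 audio itself is not included.

```python
import numpy as np
import librosa
from sklearn.neural_network import MLPClassifier

def mfcc_features(path, n_mfcc=13):
    """Mean and std of MFCCs and their deltas over an utterance: a compact
    fixed-length feature vector for a non-sequential classifier."""
    y, sr = librosa.load(path, sr=16000)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)
    delta = librosa.feature.delta(mfcc)
    feats = np.vstack([mfcc, delta])
    return np.concatenate([feats.mean(axis=1), feats.std(axis=1)])

# Hypothetical usage, once wav_paths and emotion labels are available:
# X = np.array([mfcc_features(p) for p in wav_paths])
# clf = MLPClassifier(hidden_layer_sizes=(128, 64), max_iter=500).fit(X, labels)
```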

Keywords: emotion recognition, speech, deep learning, human-robot interaction, neural networks

Procedia PDF Downloads 168
1317 Assessment of Image Databases Used for Human Skin Detection Methods

Authors: Saleh Alshehri

Abstract:

Human skin detection is a vital step in many applications, some of which are critical, especially those related to security. This heightens the importance of a high-performance detection algorithm. To validate the accuracy of such an algorithm, image databases are usually used. However, the suitability of these image databases is still questionable. It is suggested that suitability can be measured mainly by the span of the color space that the database covers. This research investigates the validity of three well-known image databases.
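
One simple way to quantify the "span of the color space" criterion described above is the fraction of quantized color bins that a database's skin pixels occupy; the bin count and the choice of RGB are illustrative assumptions rather than the paper's actual measure.

```python
import numpy as np

def colour_space_coverage(skin_pixels_rgb, bins=32):
    """Fraction of a quantized RGB space occupied by a database's skin pixels.

    A database spanning a wider portion of the colour space (higher value)
    should stress a detector more thoroughly. `skin_pixels_rgb` is an (N, 3)
    array of 0-255 values; the bin count is illustrative.
    """
    hist, _ = np.histogramdd(skin_pixels_rgb, bins=(bins, bins, bins),
                             range=((0, 256),) * 3)
    return np.count_nonzero(hist) / hist.size

# Hypothetical usage: stack all annotated skin pixels from a database's images.
# pixels = np.vstack([img.reshape(-1, 3) for img in skin_images])
# print(colour_space_coverage(pixels))
```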

Keywords: image databases, image processing, pattern recognition, neural networks

Procedia PDF Downloads 270
1316 Utilization of Biodiversity of Peaces Herbals Used as Food and Treat the Path of Economic Phu Sing District in Sisaket Province Thailand

Authors: Nopparet Thammasaranyakun

Abstract:

The objectives of this research are: (1) to study the biodiversity of medicinal plants used for food and treatment along the economic tourism routes of Phu Sing district, Sisaket province; (2) to study the use of these medicinal plants for food and treatment; (3) to provide a database of information on the biodiversity of food and medicinal plants in the district; and (4) to create learning resources on the biodiversity of medicinal plants used for food and treatment along these routes. The study area was Phu Sing district. The population comprised the Population and Agricultural Development Center (Rayong Mun), under an initiative for local youth, government health officials, community leaders, teachers, students, schools, local people and tourists, local sages knowledgeable about herbs, and women's OTOP groups in Phu Sing district, Sisaket province, selected by purposive sampling. The study followed a participatory action research (PAR) process, a community-based research approach, with qualitative data collected through study of the community context and areas, interviews, tape recordings, observation, and focus groups; the data were analyzed using descriptive statistics. The findings were as follows: (1) surveys of the biodiversity of plants used for food and treatment along the district's tourism routes, carried out in the dry season and the rainy season, identified 251 species of medicinal plants used in 41 types of remedies; (2) the 251 species were found to have properties used locally for food and medicinal purposes across the 41 remedy types; (3) a database of the 251 medicinal species and 41 remedy types used for food and treatment in the district was compiled; and (4) learning resources on the biodiversity of medicinal plants used for food and treatment along the district's tourism routes were created.

Keywords: utilization of biodiversity, peaces herbals, used as food, Phu Sing district, Sisaket

Procedia PDF Downloads 357
1315 Cleaning of Scientific References in Large Patent Databases Using Rule-Based Scoring and Clustering

Authors: Emiel Caron

Abstract:

Patent databases contain patent-related data, organized in a relational data model, and are used to produce various patent statistics. These databases store raw data about scientific references cited by patents. For example, Patstat holds references to tens of millions of scientific journal publications and conference proceedings. These references might be used to connect patent databases with bibliographic databases, e.g., to study the relation between science, technology, and innovation in various domains. Problematic in such studies is the low data quality of the references, i.e., they are often ambiguous, unstructured, and incomplete. Moreover, a complete bibliographic reference is stored in only one attribute. Therefore, a computerized cleaning and disambiguation method for large patent databases is developed in this work. The method uses rule-based scoring and clustering. The rules are based on bibliographic metadata, retrieved from the raw data by regular expressions, and are transparent and adaptable. The rules, in combination with string similarity measures, are used to detect pairs of records that are potential duplicates. Due to the scoring, different rules can be combined to join scientific references, i.e., the rules reinforce each other. The scores are based on expert knowledge and initial method evaluation. After the scoring, pairs of scientific references that score above a certain threshold are clustered by means of a single-linkage clustering algorithm to form connected components. The method is designed to disambiguate all the scientific references in the Patstat database. The performance evaluation of the clustering method, on a large golden set of highly cited papers, shows on average 99% precision and 95% recall. The method is therefore accurate but careful, i.e., it favors precision over recall. Consequently, separate clusters of high precision are sometimes formed when there is not enough evidence for connecting scientific references, e.g., in the case of missing year and journal information for a reference. The clusters produced by the method can be used to directly link the Patstat database with bibliographic databases such as the Web of Science or Scopus.
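
The pipeline of metadata rules, pairwise scoring, thresholding, and single-linkage clustering can be illustrated with the toy sketch below; the parsing rules, weights, and threshold are placeholders and do not reproduce the paper's tuned scores.

```python
import re
from difflib import SequenceMatcher
from itertools import combinations
import networkx as nx

YEAR_RE = re.compile(r"\b(19|20)\d{2}\b")

def parse(ref):
    """Pull crude metadata out of a raw reference string (illustrative rule)."""
    year = YEAR_RE.search(ref)
    return {"year": year.group() if year else None,
            "text": re.sub(r"\W+", " ", ref).lower().strip()}

def pair_score(a, b):
    """Additive rule-based score; the weights are placeholders, not the paper's."""
    score = 0.0
    if a["year"] and a["year"] == b["year"]:
        score += 0.3                                   # matching year reinforces the pair
    score += 0.7 * SequenceMatcher(None, a["text"], b["text"]).ratio()
    return score

def cluster(refs, threshold=0.8):
    """Single-linkage clustering: connected components of above-threshold pairs."""
    parsed = [parse(r) for r in refs]
    g = nx.Graph()
    g.add_nodes_from(range(len(refs)))
    for i, j in combinations(range(len(refs)), 2):
        if pair_score(parsed[i], parsed[j]) >= threshold:
            g.add_edge(i, j)
    return list(nx.connected_components(g))

print(cluster(["Smith J, Nature 412, 2001, p. 34",
               "J. Smith (2001) Nature vol 412 34",
               "Doe A, J Appl Phys 1999"]))
```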

Keywords: clustering, data cleaning, data disambiguation, data mining, patent analysis, scientometrics

Procedia PDF Downloads 193
1314 Association of Clostridium difficile Infection and Bone Cancer

Authors: Daniela Prado, Lexi Frankel, Amalia Ardeljan, Lokesh Manjani, Matthew Cardeiro, Omar Rashid

Abstract:

Background: Clostridium difficile (C. diff) is a gram-positive bacterium that is known to cause life-threatening diarrhea and severe inflammation of the colon. It originates from an alteration of the gut microbiome and can be transmitted through spores. Recent studies have shown a high incidence of C. diff infection in cancer patients due to extensive hospitalization. However, research is lacking regarding the association of C. diff with the causation or prevention of cancer. The objective of this study was therefore to assess the correlation between Clostridium difficile infection (CDI) and the incidence of bone cancer. Methods: This retrospective analysis used data provided by a Health Insurance Portability and Accountability Act (HIPAA) compliant national database to evaluate patients infected versus patients not infected with C. diff using ICD-10 and ICD-9 codes. Access to the database was granted by Holy Cross Health, Fort Lauderdale, for the purpose of academic research. Standard statistical methods were used. Results: Between January 2010 and December 2019, the query identified 78,863 patients in each of the infected and control groups. The two groups were matched by age range and CCI score. The incidence of bone cancer was 659 patients (0.835%) in the C. diff group compared to 1,941 patients (2.461%) in the control group. The difference was statistically significant (P < 2.2x10^-16), with an odds ratio (OR) of 0.33 (95% confidence interval (CI) 0.31-0.37). Antibiotic treatment for CDI was also analyzed for both the C. diff infected and non-infected populations: 91 out of 16,676 (0.55%) antibiotic-treated patients with a prior C. diff infection were compared to 275 out of 16,676 (1.65%) antibiotic-treated patients with no history of CDI. The results remained statistically significant (P < 2.2x10^-16), with an OR of 0.42 (95% CI 0.37-0.48). Conclusion: The study shows a statistically significant correlation between C. diff infection and a reduced incidence of bone cancer. Further evaluation is recommended to assess the potential of C. difficile in reducing bone cancer incidence.
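
The headline unadjusted odds ratio can be reproduced directly from the counts quoted in the abstract; the small helper below computes an OR with a Wald 95% confidence interval (the abstract's matched estimate may differ slightly from this crude calculation).

```python
import math

def odds_ratio(exposed_cases, exposed_total, control_cases, control_total):
    """Odds ratio with a Wald 95% confidence interval from a 2x2 table."""
    a, b = exposed_cases, exposed_total - exposed_cases
    c, d = control_cases, control_total - control_cases
    or_ = (a / b) / (c / d)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Counts reported in the abstract: bone cancer in the C. diff vs control cohorts.
print(odds_ratio(659, 78863, 1941, 78863))   # ~ (0.33, 0.31, 0.37)
```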

Keywords: bone cancer, colitis, clostridium difficile, microbiome

Procedia PDF Downloads 276
1313 ROSgeoregistration: Aerial Multi-Spectral Image Simulator for the Robot Operating System

Authors: Andrew R. Willis, Kevin Brink, Kathleen Dipple

Abstract:

This article describes a software package called ROS-georegistration intended for use with the robot operating system (ROS) and the Gazebo 3D simulation environment. ROSgeoregistration provides tools for the simulation, test, and deployment of aerial georegistration algorithms and is available at github.com/uncc-visionlab/rosgeoregistration. A model creation package is provided which downloads multi-spectral images from the Google Earth Engine database and, if necessary, incorporates these images into a single, possibly very large, reference image. Additionally a Gazebo plugin which uses the real-time sensor pose and image formation model to generate simulated imagery using the specified reference image is provided along with related plugins for UAV relevant data. The novelty of this work is threefold: (1) this is the first system to link the massive multi-spectral imaging database of Google’s Earth Engine to the Gazebo simulator, (2) this is the first example of a system that can simulate geospatially and radiometrically accurate imagery from multiple sensor views of the same terrain region, and (3) integration with other UAS tools creates a new holistic UAS simulation environment to support UAS system and subsystem development where real-world testing would generally be prohibitive. Sensed imagery and ground truth registration information is published to client applications which can receive imagery synchronously with telemetry from other payload sensors, e.g., IMU, GPS/GNSS, barometer, and windspeed sensor data. To highlight functionality, we demonstrate ROSgeoregistration for simulating Electro-Optical (EO) and Synthetic Aperture Radar (SAR) image sensors and an example use case for developing and evaluating image-based UAS position feedback, i.e., pose for image-based Guidance Navigation and Control (GNC) applications.

Keywords: EO-to-EO, EO-to-SAR, flight simulation, georegistration, image generation, robot operating system, vision-based navigation

Procedia PDF Downloads 100
1312 Prioritizing Roads Safety Based on the Quasi-Induced Exposure Method and Utilization of the Analytical Hierarchy Process

Authors: Hamed Nafar, Sajad Rezaei, Hamid Behbahani

Abstract:

Safety analysis of roads through accident rates, one of the most widely used tools, has relied on the direct exposure method, which is based on vehicle-kilometers traveled and vehicle travel time. However, due to some fundamental flaws in its theory, difficulties in gaining access to the required data (such as traffic volume and trip distance and duration), and various problems in determining exposure for specific time, place, and driver categories, there is a need for an algorithm for prioritizing road safety so that, with a new exposure method, the problems of the previous approaches can be resolved. In this way, an efficient application may lead to more realistic comparisons, and the new method would be applicable to a wider range of time, place, and driver categories. Therefore, an algorithm was introduced to prioritize the safety of roads using the quasi-induced exposure method and the analytical hierarchy process, as sketched after this abstract. For this research, 11 provinces of Iran were chosen as case study locations. A rural accidents database was created for these provinces, the validity of the quasi-induced exposure method for Iran's accidents database was explored, and the involvement ratio for different characteristics of the drivers and the vehicles was measured. Results showed that the quasi-induced exposure method was valid in determining the real exposure in the provinces under study. Results also showed a significant difference between the prioritizations based on the new and traditional approaches. This difference mostly stems from the perspective of the quasi-induced exposure method in determining the exposure, the opinions of experts, and the quantity of accident data. Overall, the results of this research show that prioritization based on the new approach is more comprehensive and reliable compared to prioritization based on the traditional approach, which depends on various parameters including driver-vehicle characteristics.
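
The involvement-ratio calculation at the heart of quasi-induced exposure can be sketched as follows: not-at-fault drivers in two-vehicle crashes serve as a proxy for exposure, and categories whose at-fault share exceeds their not-at-fault share are over-involved. The column names and the toy records are illustrative assumptions, not the Iranian database.

```python
import pandas as pd

def involvement_ratio(crashes: pd.DataFrame) -> pd.Series:
    """Quasi-induced exposure: ratio of a category's share among at-fault
    drivers to its share among not-at-fault (innocent) drivers.
    Values above 1 indicate over-involvement relative to exposure."""
    at_fault = crashes.loc[crashes["at_fault"], "category"].value_counts(normalize=True)
    innocent = crashes.loc[~crashes["at_fault"], "category"].value_counts(normalize=True)
    return (at_fault / innocent).sort_values(ascending=False)

# Toy two-vehicle crash records, with driver age group as the category of interest.
records = pd.DataFrame({
    "category": ["18-24", "18-24", "18-24", "25-64", "25-64", "25-64", "65+", "65+"],
    "at_fault": [True, True, False, True, False, False, True, False],
})
print(involvement_ratio(records))   # 18-24 over-involved, 25-64 under-involved
```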

Keywords: road safety, prioritizing, Quasi-induced exposure, Analytical Hierarchy Process

Procedia PDF Downloads 337
1311 Epidemiology of Private Prehospital Calls over the Last Decade in South Africa

Authors: Rhodine Hickman, Craig Wylie, Michael G. McCaul

Abstract:

Introduction: The World Health Organisation has called on governments around the world to recognise emergency conditions as a global public health problem and to respond with appropriate steps towards effective preventative strategies. However, to understand the magnitude of the problem, good quality epidemiological data are required. This is especially challenging in low- and middle-income countries, where routine data are scarce, particularly in the prehospital setting. Methods: We conducted a retrospective cross-sectional study of a national prehospital private-sector EMS database. The database, the property of ER24 (a private Emergency Medical Services (EMS) company in South Africa), contains claims submitted by the majority of ambulance services in South Africa between 1 January 2008 and 28 March 2017. We used descriptive statistics and control charts to describe the data using STATA 14. Results: 299,257 calls were included in the analysis. The top clinical conditions requiring ambulance transport were transport accidents (10% of total call volume) and ischaemic heart disease (4.4%). The number of transport accidents consistently increased between 2009 and 2014 and went beyond the limit of normal variation in 2015. Victims of transport accidents required basic life support services 60% of the time, with 80% of injuries being minor to moderate. The frequency of ischaemic heart disease showed a steady incline from 2011 to 2016. Advanced life support services were required about 50% of the time, with 60% of patients needing urgent care. Conclusion: Transport accidents, followed by ischaemic heart disease, are the most prevalent conditions in South African private EMS. There is potential to address these conditions by developing the capacity of low- and mid-level providers in trauma and of advanced EMS providers in ischaemic heart disease.

Keywords: emergency care, emergency medicine, prehospital providers, South Africa

Procedia PDF Downloads 175
1310 GIS-Based Identification of Overloaded Distribution Transformers and Calculation of Technical Electric Power Losses

Authors: Awais Ahmed, Javed Iqbal

Abstract:

Pakistan has for many years been facing extreme challenges from an energy deficit due to the shortage of power generation relative to increasing demand. Part of this energy deficit is also contributed by the power lost in the transmission and distribution network. Unfortunately, distribution companies are not equipped with modern technologies and methods to identify and eliminate these losses. According to estimates, the total energy lost in the early 2000s was between 20 and 26 percent. To address this issue, the present research study was designed with the objectives of developing a standalone GIS application for distribution companies capable of loss calculation as well as identification of overloaded transformers. For this purpose, the Hilal Road feeder in the Faisalabad Electric Supply Company (FESCO) area was selected as the study area. An extensive GPS survey was conducted to identify each consumer, link it to the secondary pole of its transformer, geo-reference the equipment and document conductor sizes. To identify overloaded transformers, the accumulated kWh readings of the consumers on each transformer were compared with a threshold kWh. Technical losses of the 11 kV and 220 V lines were calculated using data from the substation and the resistance of the network calculated from the geo-database. To automate the process, a standalone GIS application was developed using ArcObjects with engineering analysis capabilities. The application uses the GIS database developed for the 11 kV and 220 V lines to display and query spatial data and to present results in the form of graphs. The results show a technical loss of about 14% on both the high-tension (HT) and low-tension (LT) networks, while about 4 out of 15 general-duty transformers were found to be overloaded. The study shows that GIS can be a very effective tool for distribution companies in the management and planning of their distribution networks.
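
The overload check and the I²R loss calculation described above are simple enough to sketch; the data structures, per-transformer thresholds, and three-phase loss formula below are illustrative assumptions rather than the actual ArcObjects implementation.

```python
def overloaded_transformers(consumer_kwh, transformer_of, threshold_kwh):
    """Flag transformers whose summed consumer energy exceeds a rating-based threshold.

    consumer_kwh: {consumer_id: kWh}, transformer_of: {consumer_id: transformer_id},
    threshold_kwh: {transformer_id: kWh limit}. All names are illustrative.
    """
    totals = {}
    for cid, kwh in consumer_kwh.items():
        t = transformer_of[cid]
        totals[t] = totals.get(t, 0.0) + kwh
    return {t: kwh for t, kwh in totals.items() if kwh > threshold_kwh[t]}

def line_loss_kw(current_amps, resistance_ohm_per_km, length_km, phases=3):
    """Technical (I^2 R) loss of a feeder segment, with resistance taken from the geodatabase."""
    return phases * current_amps ** 2 * resistance_ohm_per_km * length_km / 1000.0

print(overloaded_transformers(
    {"c1": 480.0, "c2": 350.0, "c3": 120.0},
    {"c1": "T1", "c2": "T1", "c3": "T2"},
    {"T1": 600.0, "T2": 500.0},
))                                                          # {'T1': 830.0}
print(line_loss_kw(current_amps=120, resistance_ohm_per_km=0.3, length_km=2.5))  # 32.4 kW
```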

Keywords: geographical information system, GIS, power distribution, distribution transformers, technical losses, GPS, SDSS, spatial decision support system

Procedia PDF Downloads 373
1309 Musical Instruments Classification Using Machine Learning Techniques

Authors: Bhalke D. G., Bormane D. S., Kharate G. K.

Abstract:

This paper presents the classification of musical instruments using machine learning techniques. The classification has been carried out using temporal, spectral, cepstral and wavelet features. A detailed feature analysis is carried out using separate and combined features. Further, instrument models have been developed using K-Nearest Neighbor (KNN) and Support Vector Machine (SVM) classifiers. The benchmark McGill University database has been used to test the performance of the system. Experimental results show that SVM performs better than the KNN classifier.

Keywords: feature extraction, SVM, KNN, musical instruments

Procedia PDF Downloads 479
1308 Risk of Fractures at Different Anatomic Sites in Patients with Irritable Bowel Syndrome: A Nationwide Population-Based Cohort Study

Authors: Herng-Sheng Lee, Chi-Yi Chen, Wan-Ting Huang, Li-Jen Chang, Solomon Chih-Cheng Chen, Hsin-Yi Yang

Abstract:

A variety of gastrointestinal disorders, such as Crohn's disease, ulcerative colitis, and coeliac disease, are recognized as risk factors for osteoporosis and osteoporotic fractures. One recent study suggests that individuals with irritable bowel syndrome (IBS) might also be at increased risk of osteoporosis and osteoporotic fractures. Up to now, the association between IBS and the risk of fractures at different anatomic sites has not been completely clear. We conducted a population-based cohort analysis to investigate the fracture risk of IBS in comparison with a non-IBS group. We identified 29,505 adults aged ≥ 20 years with newly diagnosed IBS using the Taiwan National Health Insurance Research Database in 2000-2012. A comparison group was constructed of patients without IBS who were matched according to gender and age. The occurrence of fracture was monitored until the end of 2013. We analyzed the risk of fracture events in IBS by using Cox proportional hazards regression models. Patients with IBS had a higher incidence of osteoporotic fractures compared with the non-IBS group (12.34 versus 9.45 per 1,000 person-years) and an increased risk of osteoporotic fractures (adjusted hazard ratio [aHR] = 1.27, 95% confidence interval [CI] = 1.20-1.35). Site-specific analysis showed that the IBS group had a higher risk of fractures of the spine, forearm, hip and hand than did the non-IBS group. With further stratification by gender and age, a higher aHR for osteoporotic fractures in the IBS group was seen across all age groups in males, but only in the elderly among females. In addition, female gender, old age, low income, hypertension, coronary artery disease, cerebrovascular disease, and depressive disorders were identified as independent osteoporotic fracture risk factors in IBS patients. IBS should be considered a risk factor for osteoporotic fractures, particularly in female individuals and for fracture sites located at the spine, forearm, hip and hand.
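
A Cox proportional hazards fit of the kind the abstract reports can be sketched with the lifelines package; the file name, column names, and covariate list are illustrative assumptions standing in for the insurance-database variables.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical cohort table: one row per subject in the matched cohort.
df = pd.read_csv("ibs_cohort.csv")   # placeholder file; columns below are assumed

cph = CoxPHFitter()
cph.fit(df[["followup_years", "fracture", "ibs", "age", "female", "low_income",
            "hypertension", "cad", "cvd", "depression"]],
        duration_col="followup_years", event_col="fracture")

# exp(coef) for the `ibs` indicator is the adjusted hazard ratio (aHR),
# adjusted for the other covariates in the model.
cph.print_summary()
```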

Keywords: irritable bowel syndrome, fracture, gender difference, longitudinal health insurance database, public health

Procedia PDF Downloads 228
1307 Investigating the Side Effects in Patients with Severe COVID-19 and Choosing Appropriate Medication Regimens to Manage Them

Authors: Rasha Ahmadi

Abstract:

In December 2019, a coronavirus, currently identified as SARS-CoV-2, produced a series of acute atypical respiratory illnesses in Wuhan, Hubei Province, China. The disease caused by this virus was named COVID-19. The virus is transmissible between humans and has caused a pandemic worldwide. The death toll continues to climb, and a large number of countries have been obliged to impose social isolation and lockdowns. The lack of focused therapy continues to be a problem. Epidemiological research showed that senior patients were more susceptible to severe disease, whereas children tend to have milder symptoms. In this study, we focus on other possible side effects of COVID-19 and more detailed treatment strategies. Using bioinformatics analysis, we first obtained the gene expression profiles of patients with severe COVID-19 from the GEO database; patients' blood samples were used in the GSE183071 dataset. We then categorized the genes into those with high and those with low expression. In the next step, we uploaded the two gene lists separately to the Enrichr database and evaluated our data for associated signs and symptoms as well as related medication regimens. The results showed that 138 genes with high expression and 108 genes with low expression were differentially observed in the severe COVID-19 versus control comparison. Symptoms and diseases such as embolism and thrombosis of the abdominal aorta, ankylosing spondylitis, suicidal ideation or attempt, and regional enteritis were associated with the highly expressed genes, while acute and subacute forms of ischemic heart disease, CNS infection and poliomyelitis, and synovitis and tenosynovitis were associated with the genes with low expression. Following the detection of these diseases and possible signs and symptoms, Carmustine, Bithionol and Leflunomide were evaluated as most significant for the high-expression genes, and Chlorambucil, Ifosfamide, Hydroxyurea and Bisphenol for the low-expression genes. In general, examining the different and less visible aspects of COVID-19 and identifying possible treatments can help us significantly in the emergency care and hospitalization of patients.

Keywords: phenotypes, drug regimens, gene expression profiles, bioinformatics analysis, severe COVID-19

Procedia PDF Downloads 139
1306 Molecular Docking Study of Rosmarinic Acid and Its Analog Compounds on Sickle Cell Hemoglobin

Authors: Roohallah Yousefi

Abstract:

Introduction: Voxelotor, also known as GBT 440, binds to the alpha cleft in HbS tetramers and promotes the stability of the relaxed or oxygenated state of HbS. This process hinders the conformational change of the HbS tetramers into the deoxygenated state. Voxelotor prevents interactions between HbS tetramers in the deoxygenated state, ultimately inhibiting the polymerization of HbS tetramers and resulting in significant clinical improvements, particularly in raising hemoglobin levels in patients. In this study, we have explored the use of herbal compound models, such as rosmarinic acid and compounds with similar structures that exhibit high binding affinity to Voxelotor's hemoglobin binding site. Materials and methods: The molecular model of hemoglobin (PDB: 5E83) was initially obtained from the RCSB PDB database. In addition, we collected 453 ligand models with structural similarity to rosmarinic acid from the PubChem database. To prepare these models for molecular docking, we utilized the Molegro Virtual Docker tool. Subsequently, we used the SwissADME web tool to predict the physicochemical properties and pharmacokinetics of these compounds. Results: We investigated the affinity and binding site of 453 compounds similar to rosmarinic acid on the hemoglobin model (PDB: 5E83). Our focus was on the alpha cleft between two alpha chains of the hemoglobin model (PDB: 5E83). The results showed that most compounds had molecular weights above 500 daltons, and some exhibited acceptable hydrophobicity. Furthermore, their solubility in aqueous solutions was good. None of the compounds were able to cross the blood-brain barrier or have gastrointestinal absorption. However, they did have varying inhibitory effects on CYP2C9 cytochromes. The skin penetration rate was generally low. Conclusion: Through our study, we identified three compounds (CID: 162739375, CID: 141386569, and CID: 24015539) with promising potential for further research. These compounds demonstrated high binding affinity to the hemoglobin model, favorable dissolution and digestive absorption rates, as well as suitable hydrophobicity, making them ideal candidates for continued laboratory investigation.

Keywords: voxelotor, binding site, hemoglobin, rosmarinic acid

Procedia PDF Downloads 6
1305 A Framework for an Automated Decision Support System for Selecting Safety-Conscious Contractors

Authors: Rawan A. Abdelrazeq, Ahmed M. Khalafallah, Nabil A. Kartam

Abstract:

Selection of competent contractors for construction projects is usually accomplished through competitive bidding or negotiated contracting in which the contract bid price is the basic criterion for selection. The evaluation of contractor’s safety performance is still not a typical criterion in the selection process, despite the existence of various safety prequalification procedures. There is a critical need for practical and automated systems that enable owners and decision makers to evaluate contractor safety performance, among other important contractor selection criteria. These systems should ultimately favor safety-conscious contractors to be selected by the virtue of their past good safety records and current safety programs. This paper presents an exploratory sequential mixed-methods approach to develop a framework for an automated decision support system that evaluates contractor safety performance based on a multitude of indicators and metrics that have been identified through a comprehensive review of construction safety research, and a survey distributed to domain experts. The framework is developed in three phases: (1) determining the indicators that depict contractor current and past safety performance; (2) soliciting input from construction safety experts regarding the identified indicators, their metrics, and relative significance; and (3) designing a decision support system using relational database models to integrate the identified indicators and metrics into a system that assesses and rates the safety performance of contractors. The proposed automated system is expected to hold several advantages including: (1) reducing the likelihood of selecting contractors with poor safety records; (2) enhancing the odds of completing the project safely; and (3) encouraging contractors to exert more efforts to improve their safety performance and practices in order to increase their bid winning opportunities which can lead to significant safety improvements in the construction industry. This should prove useful to decision makers and researchers, alike, and should help improve the safety record of the construction industry.

Keywords: construction safety, contractor selection, decision support system, relational database

Procedia PDF Downloads 279
1304 The Optimal Irrigation in the Mitidja Plain

Authors: Gherbi Khadidja

Abstract:

In the Mediterranean region, water resources are limited and very unevenly distributed in space and time. The main objective of this project is the development of a wireless network for the management of water resources in northern Algeria, in the Mitidja plain, which helps farmers irrigate in the most optimized way and addresses the problem of water shortage in the region. Therefore, we will develop an aid tool that can modernize and replace some traditional techniques, according to the real needs of the crops, the soil conditions and the climatic conditions (soil moisture, precipitation, characteristics of the unsaturated zone). These data are collected in real time by sensors, analyzed by an algorithm and displayed on a mobile application and a website. The results are essential information and alerts, with recommendations for action, given to farmers to ensure the sustainability of the agricultural sector under water shortage conditions. In the first part, we set up a wireless sensor network for precise management of water resources, using equipment that measures the water content of the soil, such as a Watermark probe connected via an acquisition card to an Arduino Uno, which collects the captured data and transmits them via a GSM module to a website, where they are stored in a database for later study. In the second part, we display the results on a website or a mobile application that uses the database to remotely manage our smart irrigation system, which allows the farmer to use this technology and offers growers the possibility of remote access via wireless communication to see the field conditions and the irrigation operation, at home or at the office. The tool to be developed will also rely on satellite imagery for land use and soil moisture. These tools will make it possible to follow the evolution of crop needs over time, but also in space, and to predict the impact on water resources. According to the references consulted, if such a tool is used, it can reduce irrigation volumes by up to 40%, which represents more than 100 million m3 of savings per year for the Mitidja. This volume is equivalent to that of a medium-sized dam.
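
A very small sketch of the server side of such a pipeline: an endpoint that the GSM module could post readings to, which stores them in a database and returns an irrigation recommendation when the soil is dry. The endpoint name, payload fields, and the tension threshold are all hypothetical and not taken from the project.

```python
import sqlite3
from flask import Flask, request, jsonify

app = Flask(__name__)
DB = "irrigation.db"                 # illustrative database file
DRY_THRESHOLD_KPA = 60               # Watermark probes report soil water tension; placeholder value

def init_db():
    with sqlite3.connect(DB) as con:
        con.execute("CREATE TABLE IF NOT EXISTS readings "
                    "(ts TEXT, field TEXT, tension_kpa REAL, rain_mm REAL)")

@app.route("/readings", methods=["POST"])
def store_reading():
    """Endpoint the GSM module posts to; the JSON field names are assumptions."""
    r = request.get_json()
    with sqlite3.connect(DB) as con:
        con.execute("INSERT INTO readings VALUES (?, ?, ?, ?)",
                    (r["ts"], r["field"], r["tension_kpa"], r.get("rain_mm", 0.0)))
    # Recommend irrigation only when the soil is dry and no rain was recorded.
    irrigate = r["tension_kpa"] > DRY_THRESHOLD_KPA and r.get("rain_mm", 0.0) == 0.0
    return jsonify({"irrigate": irrigate})

if __name__ == "__main__":
    init_db()
    app.run()
```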

Keywords: optimal irrigation, soil moisture, smart irrigation, water management

Procedia PDF Downloads 108
1303 A Research on Determining the Viability of a Job Board Website for Refugees in Kenya

Authors: Prince Mugoya, Collins Oduor Ondiek, Patrick Kanyi Wamuyu

Abstract:

The Refugee Job Board Website is a web-based application that provides a platform for organizations to post jobs specifically for refugees. Organizations upload job opportunities, and refugees can view them on the website. The website also allows refugees to input their skills and qualifications. The methodology used to develop this system is the waterfall (traditional) methodology. Software development tools include Brackets, which will be used to code the website, and phpMyAdmin to manage the database in which all the data are stored.

Keywords: information technology, refugee, skills, utilization, economy, jobs

Procedia PDF Downloads 164
1302 Montelukast Doesn’t Decrease the Risk of Cardiovascular Disease in Asthma Patients in Taiwan

Authors: Sheng Yu Chen, Shi-Heng Wang

Abstract:

Aim: Based on human and animal experiments and genetic studies, the cysteinyl leukotrienes LTC4, LTD4, and LTE4 are inflammatory substances produced from arachidonic acid by 5-lipoxygenase, and these substances trigger asthma. In addition, the synthetic pathway of cysteinyl leukotrienes is relevant to the increase in cardiovascular diseases such as myocardial ischemia and stroke. Given this, we aim to investigate whether the cysteinyl leukotriene receptor antagonist (LTRA) montelukast, which is used to treat asthma, has potential protective effects against cardiovascular disease. Method: We conducted a cohort study and enrolled participants newly diagnosed with asthma (ICD-9-CM code 493.x) between 2002 and 2011. The data source is the Taiwan National Health Insurance Research Database. Patients with a previous history of myocardial infarction or ischemic stroke were excluded. Among the remaining participants, every montelukast user was matched with two randomly selected non-users by sex and age. Incident cardiovascular diseases, including myocardial infarction and ischemic stroke, were regarded as outcomes. We followed the participants until the outcome occurred or until the end of the follow-up period, whichever came first. To explore the protective effect of montelukast on the risk of cardiovascular disease, we used multivariable Cox regression to estimate the hazard ratio with adjustment for potential confounding factors. Result: There were 55,876 newly diagnosed asthma patients who had at least one inpatient claim or at least three outpatient claims. We enrolled 5,350 montelukast users and 10,700 non-users in this cohort study. The mean (±SD) follow-up time was 5 (±2.19) years in the montelukast group and 5.47 (±2.64) years in the non-user group. Using multivariable Cox regression, our analysis indicated that the risk of incident cardiovascular disease was approximately equal between montelukast users (n = 43, 0.8%) and non-users (n = 111, 1.04%) (adjusted hazard ratio 0.992; P-value 0.9643). Conclusion: In this population-based study, we found that the use of montelukast is not associated with a decrease in incident myocardial infarction or ischemic stroke.

Keywords: asthma, inflammation, montelukast, insurance research database, cardiovascular diseases

Procedia PDF Downloads 81
1301 DHL CSI Solution Design Project

Authors: Mohammed Al-Yamani, Yaser Miaji

Abstract:

The DHL Customer Solutions and Innovation (CSI) department has been experiencing difficulties when comparing quotes for different customers in different years. Currently, employees process the data by opening several large Excel files containing the quotes and manually copying values into another Excel workbook where the comparison is made. This project consists of developing a new and effective database for the DHL CSI department so that all the information is stored together in the same catalog. To that end, we have been assigned to find an efficient algorithm that can deal with the different formats of the Excel workbooks to copy and store the express customer rates for the core products (DOX, WPX, IMP) for comparison purposes.
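
A minimal sketch of such a consolidation step in pandas: read every sheet of every quote workbook, normalize the headers, keep the core-product rows, and load them into a single database table. The folder name, column names, and SQLite target are assumptions for illustration, not DHL's actual formats.

```python
import glob
import sqlite3
import pandas as pd

frames = []
for path in glob.glob("quotes/*.xlsx"):            # illustrative folder of quote workbooks
    sheets = pd.read_excel(path, sheet_name=None)  # read every sheet, whatever its layout
    for sheet, df in sheets.items():
        df.columns = [str(c).strip().upper() for c in df.columns]   # normalize headers
        keep = df[df["PRODUCT"].isin(["DOX", "WPX", "IMP"])]        # assumed column name
        frames.append(keep.assign(source_file=path, sheet=sheet))

rates = pd.concat(frames, ignore_index=True)
# One consolidated table that comparisons can be run against directly.
rates.to_sql("express_rates", sqlite3.connect("csi_quotes.db"), if_exists="replace")
```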

Keywords: DHL, solution design, ORACLE, EXCEL

Procedia PDF Downloads 409
1300 Corporate Performance and Balance Sheet Indicators: Evidence from Indian Manufacturing Companies

Authors: Hussain Bohra, Pradyuman Sharma

Abstract:

This study highlights the significance of balance sheet indicators for corporate performance in the case of Indian manufacturing companies. Balance sheet indicators show the actual financial health of a company; they help external investors choose the right company for their investment and help external financing agencies extend finance to manufacturing companies more readily. The period of study is 2000 to 2014, covering 813 manufacturing companies for which continuous data are available throughout the study period. The data are collected from the PROWESS database maintained by the Centre for Monitoring Indian Economy Pvt. Ltd. Panel data methods such as fixed effect and random effect models are used for the analysis. The Likelihood Ratio test, Lagrange Multiplier test and Hausman test results establish the suitability of the fixed effect model for the estimation. Return on assets (ROA) is used as the proxy to measure corporate performance. ROA is a suitable proxy as it has already been used by most of the authors who have worked on corporate performance, and it reflects the return on firms' long-term investment projects. Ratios such as the current ratio, debt-equity ratio, receivable turnover ratio and solvency ratio have been used as proxies for the balance sheet indicators, with other firm-specific variables such as firm size and sales as control variables in the model. The empirical analysis found that all selected financial ratios have a significant and positive impact on corporate performance. Firm sales and firm size were also found to have a significant and positive impact on corporate performance. To check the robustness of the results, the sample was divided into firms with high and low debt-equity ratios, high and low current ratios, high and low receivable turnover ratios, and high and low solvency ratios. We find that the results are robust across all these groups of companies with different levels of the selected balance sheet indicators, and the results for the other variables are in line with those for the whole sample. These findings confirm that balance sheet indicators play a significant role in corporate performance in India. The findings of this study have implications for corporate managers, who should monitor the different ratios to maintain the minimum expected level of performance. Apart from that, they should also maintain adequate sales and total assets to improve corporate performance.
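
A fixed-effects versus random-effects comparison of the kind described can be sketched with the linearmodels package; the CSV file name and column names below are placeholders for the firm-year panel assembled from PROWESS, and the covariate list is only illustrative.

```python
import pandas as pd
from linearmodels.panel import PanelOLS, RandomEffects, compare

# Firm-year panel assembled from PROWESS (file and column names are assumptions).
df = pd.read_csv("prowess_panel.csv").set_index(["company", "year"])

exog = df[["current_ratio", "debt_equity", "receivable_turnover",
           "solvency_ratio", "log_sales", "log_assets"]]

fe = PanelOLS(df["roa"], exog, entity_effects=True).fit()   # firm fixed effects
re = RandomEffects(df["roa"], exog).fit()

# Side-by-side coefficients; a Hausman-type test on the FE/RE difference guides
# model choice, and the abstract reports that the fixed-effects model is preferred.
print(compare({"FE": fe, "RE": re}))
```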

Keywords: balance sheet, corporate performance, current ratio, panel data method

Procedia PDF Downloads 263
1299 Artificial Neural Networks and Hidden Markov Model in Landslides Prediction

Authors: C. S. Subhashini, H. L. Premaratne

Abstract:

Landslides are the most recurrent and prominent disaster in Sri Lanka. Sri Lanka has been subjected to a number of extreme landslide disasters that resulted in significant loss of life, material damage, and distress. Solutions for preparedness and mitigation need to be explored to reduce the recurrent losses associated with landslides. Artificial Neural Networks (ANNs) and Hidden Markov Models (HMMs) are now widely used in many computer applications spanning multiple domains. This research examines the effectiveness of using Artificial Neural Networks and Hidden Markov Models in landslide prediction and the possibility of applying this technology to predict landslides in a prominent geographical area in Sri Lanka. A thorough survey was conducted with the participation of resource persons from several national universities in Sri Lanka to identify and rank the factors influencing landslides. A landslide database was created using existing topographic, soil, drainage and land cover maps and historical data. The landslide-related factors, which include external factors (rainfall and number of previous occurrences) and internal factors (soil material, geology, land use, curvature, soil texture, slope, aspect, soil drainage, and soil effective thickness), were extracted from the landslide database. These factors are used to estimate the possibility of landslide occurrence using an ANN and an HMM. The models learn the relationship between the landslide factors and the hazard index during the training session. These models, with the landslide-related factors as inputs, are trained to predict three classes, namely 'landslide occurs', 'landslide does not occur' and 'landslide likely to occur'. Once trained, the models are able to predict the most likely class for the prevailing data. Finally, the two models were compared with regard to prediction accuracy, false acceptance rate and false rejection rate. This research indicates that the Artificial Neural Network could be used as a strong decision support system to predict landslides more efficiently and effectively than the Hidden Markov Model.

Keywords: landslides, influencing factors, neural network model, hidden markov model

Procedia PDF Downloads 383
1298 Associated Factors of Hypertension, Hypercholesterolemia and Double Burden Hypertension-Hypercholesterolemia in Patients With Congestive Heart Failure: Hospital Based Study

Authors: Pierre Mintom, William Djeukeu Asongni, Michelle Moni, William Dakam, Christine Fernande Nyangono Biyegue.

Abstract:

Background: In order to prevent congestive heart failure, control of hypertension and hypercholesterolemia is necessary, because those risk factors frequently occur in combination. Objective: The aim of the study is to determine the prevalence and risk factors of hypertension, hypercholesterolemia and the double burden of hypertension-hypercholesterolemia in patients with congestive heart failure. Methodology: A database of 98 patients suffering from congestive heart failure was used. The patients were recruited from August 15, 2017, to March 5, 2018, in the Cardiology department of Deido District Hospital, Douala. This database provides information on sociodemographic parameters, biochemical examinations, characteristics of the heart failure and food consumption. ESC/ESH and NCEP-ATP III definitions were used to define hypercholesterolemia (total cholesterol ≥ 200 mg/dl) and hypertension (SBP ≥ 140 mmHg and/or DBP ≥ 90 mmHg). The double burden of hypertension-hypercholesterolemia was defined as follows: total cholesterol ≥ 200 mg/dl, SBP ≥ 140 mmHg and DBP ≥ 90 mmHg. Results: The prevalences of hypertension, hypercholesterolemia and the double burden were 61.2%, 66.3% and 45.9%, respectively. No sociodemographic factor was associated with hypertension, hypercholesterolemia or the double burden, except that male gender was significantly associated (p < 0.05) with hypercholesterolemia. Hypo-HDLemia significantly increased the risk of hypercholesterolemia and of the double burden, by 19.664 times (p = 0.001) and 14.968 times (p = 0.021), respectively. Regarding dietary habits, the consumption of rice, of peanuts and peanut products, and of cottonseed oil was significantly (p < 0.05) associated with the occurrence of hypertension. The consumption of tomatoes, green bananas, corn and corn products, peanuts and peanut products, and cottonseed oil was significantly associated (p < 0.05) with the occurrence of hypercholesterolemia. The consumption of palm oil and cottonseed oil was associated with the occurrence of the double burden of hypertension-hypercholesterolemia. Consumption of eggs appeared protective against hypercholesterolemia, and consumption of peanuts and tomatoes appeared protective against the double burden. Conclusion: Hypercholesterolemia associated with hypertension appears to be a complicating factor of congestive heart failure. The key risk factors are mainly dietary, suggesting the importance of nutritional education for patients. New management protocols emphasizing diet should be considered.

Keywords: risk factors, hypertension, hypercholesterolemia, congestive heart failure

Procedia PDF Downloads 68
1297 Methodology to Achieve Non-Cooperative Target Identification Using High Resolution Range Profiles

Authors: Olga Hernán-Vega, Patricia López-Rodríguez, David Escot-Bocanegra, Raúl Fernández-Recio, Ignacio Bravo

Abstract:

Non-Cooperative Target Identification has become a key research domain in the defense industry since it provides the ability to recognize targets at long distance and under any weather condition. High Resolution Range Profiles, one-dimensional radar images in which the reflectivity of a target is projected onto the radar line of sight, are widely used for the identification of flying targets. Accordingly, an approach to Non-Cooperative Target Identification based on applying Singular Value Decomposition to a matrix of range profiles is presented. Target identification based on one-dimensional radar images compares a collection of profiles of a given target, namely the test set, with the profiles included in a pre-loaded database, namely the training set. Classification is improved by using Singular Value Decomposition since it allows each aircraft to be modelled as a subspace and recognition to be accomplished in a transformed domain where the main features are easier to extract, hence reducing unwanted information such as noise. Singular Value Decomposition makes it possible to define a signal subspace, which contains the highest percentage of the energy, and a noise subspace, which is discarded. This way, only the valuable information of each target is used in the recognition process. The identification algorithm is based on finding the target that minimizes the angle between subspaces and takes place in a transformed domain. Two metrics based on Singular Value Decomposition, F1 and F2, are applied in the identification process. In the case of F2, the angle is weighted, since the leading singular vectors set the importance of the contribution to the formation of the target signal; F1, in contrast, simply uses the unweighted angle. In order to build a wide database of radar signatures and evaluate the performance, range profiles are obtained through numerical simulation of seven civil aircraft at defined trajectories taken from an actual measurement. Given the nature of the datasets, the main drawback of using simulated profiles instead of measured ones is that the former imply an ideal identification scenario, since measured profiles suffer from noise, clutter, and other unwanted information while simulated profiles do not. In this case, the test and training samples have a similar nature and usually a similarly high signal-to-noise ratio; therefore, to assess the feasibility of the approach, noise is added before the creation of the test set. The identification results obtained with the unweighted and weighted metrics are analysed to determine which algorithm provides the best robustness against noise in a realistic scenario. To confirm the validity of the methodology, identification experiments on profiles coming from electromagnetic simulations are conducted, revealing promising results. Considering the dissimilarities between the test and training sets when noise is added, recognition performance improves when weighting is applied. Future experiments with larger sets are expected to be conducted with the aim of finally using actual profiles as test sets in a real hostile situation.
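
The following is a minimal sketch of the unweighted (F1-style) subspace-angle idea described above, not the authors' implementation: each target's training profiles are reduced by SVD to a signal subspace, and a test profile is assigned to the class whose subspace forms the smallest angle with it. The subspace rank, target names, and random data are assumptions for illustration.

import numpy as np

def signal_subspace(profiles, rank):
    # profiles: (n_range_bins, n_profiles) matrix of training HRRPs for one target.
    U, s, _ = np.linalg.svd(profiles, full_matrices=False)
    return U[:, :rank]          # left singular vectors spanning the signal subspace

def angle_to_subspace(x, U):
    # Angle between a test profile x and the subspace spanned by the columns of U.
    x = x / np.linalg.norm(x)
    proj = U @ (U.T @ x)        # orthogonal projection onto the subspace
    cos_theta = np.clip(np.linalg.norm(proj), 0.0, 1.0)
    return np.arccos(cos_theta)

def identify(test_profile, subspaces):
    # subspaces: dict mapping target label -> signal-subspace basis.
    return min(subspaces, key=lambda label: angle_to_subspace(test_profile, subspaces[label]))

# Illustrative usage with random data standing in for simulated HRRPs.
rng = np.random.default_rng(0)
training = {name: rng.normal(size=(128, 40)) for name in ["aircraft_1", "aircraft_2"]}
subspaces = {name: signal_subspace(P, rank=8) for name, P in training.items()}
print(identify(rng.normal(size=128), subspaces))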

Keywords: HRRP, NCTI, simulated/synthetic database, SVD

Procedia PDF Downloads 353
1296 System DietAdhoc® - A Fusion of Human-Centred Design and Agile Development for the Explainability of AI Techniques Based on Nutritional and Clinical Data

Authors: Michelangelo Sofo, Giuseppe Labianca

Abstract:

In recent years, the scientific community's interest in the exploratory analysis of biomedical data has increased exponentially. In the field of nutritional biology, the curative process based on the analysis of clinical data is a very delicate operation, because there are multiple solutions for the management of diet-related pathologies (for example, intolerances and allergies, cholesterol metabolism, diabetic pathologies, arterial hypertension, and even obesity and breathing and sleep problems). In this research work, a system was therefore created that is capable of evaluating various dietary regimes for specific patient pathologies. The system is founded on a mathematical-numerical model and has been tailored to the real working needs of an expert in human nutrition using human-centred design (ISO 9241-210); it therefore keeps pace with continuous scientific progress in the field and evolves through the experience of managed clinical cases (a machine learning process). DietAdhoc® is a decision support system for nutrition specialists treating patients of both sexes (from 18 years of age), developed with an agile methodology. Its task consists of drawing up the biomedical and clinical profile of a specific patient by applying two algorithmic optimization approaches to the nutritional data, as well as a symbolic solution obtained by transforming the relational database underlying the system into a deductive database. For all three solution approaches, particular emphasis has been given to the explainability of the suggested clinical decisions through flexible and customizable user interfaces. Furthermore, the system has multiple software modules based on time series and visual analytics techniques that allow the complete picture of the situation, and the evolution of the diet assigned for specific pathologies, to be evaluated.
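
As a purely illustrative sketch of the deductive-database idea mentioned above, the fragment below derives food exclusions from simple rules over relational facts. Every predicate, food, and pathology here is an invented assumption and none of it is taken from DietAdhoc® itself.

# Hypothetical relational facts about foods (would normally live in the database).
FOOD_FACTS = {
    "whole_milk": {"lactose": True, "saturated_fat": "high"},
    "oat_porridge": {"lactose": False, "saturated_fat": "low"},
}

def excluded(food, patient):
    # Deductive-style rules: derive exclusions from the relational facts.
    props = FOOD_FACTS[food]
    if "lactose_intolerance" in patient["pathologies"] and props["lactose"]:
        return True
    if "hypercholesterolemia" in patient["pathologies"] and props["saturated_fat"] == "high":
        return True
    return False

patient = {"pathologies": {"lactose_intolerance", "hypercholesterolemia"}}
print([f for f in FOOD_FACTS if not excluded(f, patient)])   # -> ['oat_porridge']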

Keywords: medical decision support, physiological data extraction, data-driven diagnosis, human-centered AI, symbiotic AI paradigm

Procedia PDF Downloads 22
1295 Comparison of Machine Learning-Based Models for Predicting Streptococcus pyogenes Virulence Factors and Antimicrobial Resistance

Authors: Fernanda Bravo Cornejo, Camilo Cerda Sarabia, Belén Díaz Díaz, Diego Santibañez Oyarce, Esteban Gómez Terán, Hugo Osses Prado, Raúl Caulier-Cisterna, Jorge Vergara-Quezada, Ana Moya-Beltrán

Abstract:

Streptococcus pyogenes is a gram-positive bacterium involved in a wide range of diseases and is a major human-specific bacterial pathogen. In Chile, the 'Ministerio de Salud' declared an alert this year due to the increase in strains throughout the year. This increase can be attributed to a multitude of factors, including antimicrobial resistance (AMR) and virulence factors (VF). Understanding these VF and AMR is crucial for developing effective strategies and improving public health responses. Moreover, experimental identification and characterization of these pathogenic mechanisms are labor-intensive and time-consuming. Therefore, new computational methods are required to provide robust techniques for accelerating this identification. Advances in machine learning (ML) algorithms represent an opportunity to refine and accelerate the discovery of VF associated with Streptococcus pyogenes. In this work, we evaluate the accuracy of various machine learning models in predicting the virulence factors and antimicrobial resistance of Streptococcus pyogenes, with the objective of providing new methods for identifying the pathogenic mechanisms of this organism. Our comprehensive approach involved downloading 32,798 GenBank files of S. pyogenes from the NCBI dataset, coupled with the incorporation of data from the Virulence Factor Database (VFDB) and the Comprehensive Antibiotic Resistance Database (CARD), which contains AMR gene sequences and resistance profiles. These datasets provided labeled examples of both virulent and non-virulent genes, enabling a robust foundation for feature extraction and model training. We employed preprocessing, characterization, and feature extraction techniques on the primary nucleotide/amino acid sequences and selected the optimal ones for model training. The feature set was constructed using sequence-based descriptors (e.g., k-mers and one-hot encoding) and functional annotations based on database prediction. The ML models compared are logistic regression, decision trees, support vector machines, and neural networks, among others. The results of this work show some differences in accuracy between the algorithms; these differences allow us to identify aspects that represent unique opportunities for a more precise and efficient characterization and identification of VF and AMR. This comparative analysis underscores the value of integrating machine learning techniques in predicting S. pyogenes virulence and AMR, offering potential pathways for more effective diagnostic and therapeutic strategies. Future work will focus on incorporating additional omics data, such as transcriptomics, and exploring advanced deep learning models to further enhance predictive capabilities.
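
As an illustrative sketch of this kind of pipeline, the fragment below extracts k-mer frequency features from nucleotide sequences and compares the classifier families named in the abstract. The synthetic sequences and labels are placeholders standing in for the VFDB/CARD-derived data, and the choice of k is an assumption.

from collections import Counter
from itertools import product
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier

def kmer_vector(seq, k=3):
    # Normalized k-mer frequency vector over the DNA alphabet.
    kmers = ["".join(p) for p in product("ACGT", repeat=k)]
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    total = max(sum(counts[m] for m in kmers), 1)
    return np.array([counts[m] / total for m in kmers])

# Placeholder data: random sequences and labels stand in for curated
# virulence/AMR genes (label 1) and other genes (label 0).
rng = np.random.default_rng(0)
sequences = ["".join(rng.choice(list("ACGT"), size=300)) for _ in range(40)]
labels = rng.integers(0, 2, size=40)

X = np.vstack([kmer_vector(s) for s in sequences])
y = np.array(labels)

models = {
    "logistic regression": LogisticRegression(max_iter=1000),
    "decision tree": DecisionTreeClassifier(),
    "support vector machine": SVC(),
    "neural network": MLPClassifier(max_iter=2000),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean accuracy {scores.mean():.3f}")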

Keywords: antibiotic resistance, Streptococcus pyogenes, virulence factors, machine learning

Procedia PDF Downloads 29