Search results for: database annotation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1716

1356 Next Generation Sequencing Analysis of Circulating MiRNAs in Rheumatoid Arthritis and Osteoarthritis

Authors: Khalda Amr, Noha Eltaweel, Sherif Ismail, Hala Raslan

Abstract:

Introduction: Osteoarthritis is the most common form of arthritis and involves the wearing away of the cartilage that caps the bones in the joints, whereas rheumatoid arthritis is an autoimmune disease in which the immune system attacks the joints, beginning with their lining. In this study, we aimed to identify the top deregulated miRNAs that might underlie the pathogenesis of both diseases. Methods: Eight subjects were recruited for this study: 4 rheumatoid arthritis (RA) patients, 2 osteoarthritis (OA) patients, and 2 healthy controls. Total RNA was isolated from plasma and subjected to miRNA profiling by NGS. Sequencing libraries were constructed using the NEBNext Ultra Small RNA Sample Prep Kit for Illumina (NEB, USA), according to the manufacturer's instructions. Sample quality was checked using FastQC and MultiQC. Results were compared between RA and controls and between OA and controls. Target gene prediction and functional annotation of the deregulated miRNAs were performed using MIENTURNET. The top deregulated miRNAs in each disease were selected for further validation using qRT-PCR. Results: The average number of sequencing reads per sample exceeded 2.2 million, of which approximately 57% mapped to the human reference genome. The top differentially expressed miRNAs (DEMs) in RA vs. controls were miR-6724-5p, miR-1469, and miR-194-3p (up) and miR-1468-5p and miR-486-3p (down). In comparison, the top DEMs in OA vs. controls were miR-1908-3p, miR-122b-3p, and miR-3960 (up) and miR-1468-5p and miR-15b-3p (down). Functional enrichment of the selected top deregulated miRNAs revealed highly enriched KEGG pathways and GO terms. Six of the deregulated miRNAs (miR-15b, -128, -194, -328, -542, and -3180) had multiple target genes in the RA pathway, so they are more likely to affect RA pathogenesis. Conclusion: Six of the studied deregulated miRNAs (miR-15b, -128, -194, -328, -542, and -3180) might be highly involved in disease pathogenesis. Further functional studies are crucial to assess their functions and actual target genes.
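
As a minimal illustration of how such a DEM ranking works, the sketch below computes log2 fold-changes between group means on a normalized count table; the file name and sample columns are hypothetical placeholders, not the study's data.

```python
# Minimal sketch: rank miRNAs by log2 fold-change between patient and
# control groups from a normalized count table. File and column names
# are hypothetical placeholders, not the study's data.
import numpy as np
import pandas as pd

counts = pd.read_csv("mirna_counts_cpm.csv", index_col="mirna")  # hypothetical file
ra_cols = ["RA1", "RA2", "RA3", "RA4"]   # 4 rheumatoid arthritis samples
ctrl_cols = ["C1", "C2"]                 # 2 healthy controls

pseudo = 0.5  # avoid log2(0) for miRNAs absent in one group
log2fc = np.log2(counts[ra_cols].mean(axis=1) + pseudo) \
       - np.log2(counts[ctrl_cols].mean(axis=1) + pseudo)

top_up = log2fc.sort_values(ascending=False).head(3)
top_down = log2fc.sort_values().head(2)
print("Top up-regulated:", list(top_up.index))
print("Top down-regulated:", list(top_down.index))
```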

Keywords: next generation sequencing, miRNAs, rheumatoid arthritis, osteoarthritis

Procedia PDF Downloads 96
1355 Exploring Simple Sequence Repeats within Conserved microRNA Precursors Identified from Tea Expressed Sequence Tag (EST) Database

Authors: Anjan Hazra, Nirjhar Dasgupta, Chandan Sengupta, Sauren Das

Abstract:

Tea (Camellia sinensis) has received substantial attention from the scientific world from time to time, not only for its commercial importance but also for its appeal to health-conscious people across the world as a potential source of antioxidant supplements. These health-benefit traits primarily rely on regulatory networks of different metabolic pathways. The development of microsatellite markers from conserved genomic regions is worthwhile for studying the genetic diversity of closely related or self-pollinated species. Although several SSR markers have been reported in tea, trait-specific Simple Sequence Repeats (SSRs), which could be used for marker-assisted breeding, are yet to be identified. MicroRNAs are endogenous, noncoding, short RNAs directly involved in regulating gene expression at the post-transcriptional level. It has been found that diversity in a miRNA gene interferes with the formation of its characteristic hairpin structure and its subsequent function. In the present study, the precursors of small regulatory RNAs (microRNAs) have been identified from the tea Expressed Sequence Tag (EST) database. Furthermore, the simple sequence repeat motifs within the putative miRNA precursor genes were identified in order to experimentally validate their existence and function. Genic SSR markers are already known to be a versatile and breeder-friendly resource for genetic diversity analysis, so the potential outcome of this in-silico study would provide novel clues for understanding miRNA-triggered polymorphic genic expression controlling the specific metabolic pathways accountable for tea quality.
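
A minimal sketch of the in-silico SSR mining step is given below: precursor sequences are scanned for mono- to hexa-nucleotide repeats with a backreference regex. The repeat-count thresholds and the input sequence are illustrative assumptions, not the study's exact criteria.

```python
# Minimal sketch of in-silico SSR mining: scan putative miRNA precursor
# sequences for mono- to hexa-nucleotide repeat motifs with a regex.
# Thresholds and the input sequence are illustrative assumptions.
import re

MIN_REPEATS = {1: 10, 2: 6, 3: 5, 4: 5, 5: 4, 6: 4}  # common genic-SSR cutoffs

def find_ssrs(seq):
    ssrs = []
    for unit_len, min_rep in MIN_REPEATS.items():
        # e.g. unit_len=2, min_rep=6 matches (AT){6,}-style repeats
        pattern = re.compile(r"(([ACGT]{%d})\2{%d,})" % (unit_len, min_rep - 1))
        for m in pattern.finditer(seq):
            ssrs.append((m.start(), m.group(2), len(m.group(1)) // unit_len))
    return ssrs  # (position, repeat unit, copy number)

precursor = "AUGC..."  # replace with an EST-derived pre-miRNA sequence
print(find_ssrs(precursor.upper().replace("U", "T")))
```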

Keywords: micro RNA, simple sequence repeats, tea quality, trait specific marker

Procedia PDF Downloads 311
1354 A Multifactorial Algorithm to Automate Screening of Drug-Induced Liver Injury Cases in Clinical and Post-Marketing Settings

Authors: Osman Turkoglu, Alvin Estilo, Ritu Gupta, Liliam Pineda-Salgado, Rajesh Pandey

Abstract:

Background: Hepatotoxicity can be linked to a variety of clinical symptoms and histopathological signs, posing a great challenge in the surveillance of suspected drug-induced liver injury (DILI) cases in a safety database. Additionally, the majority of such cases are rare, idiosyncratic, highly unpredictable, and tend to demonstrate unique individual susceptibility; these qualities, in turn, make for a pharmacovigilance monitoring process that is often tedious and time-consuming. Objective: To develop a multifactorial algorithm to assist pharmacovigilance physicians in identifying high-risk hepatotoxicity cases associated with DILI from the sponsor's safety database (Argus). Methods: Multifactorial selection criteria were established using Structured Query Language (SQL) and the TIBCO Spotfire® visualization tool, via a combination of word fragments, wildcard strings, and mathematical constructs, based on Hy's law criteria and the pattern of injury (R-value). These criteria excluded non-eligible cases from monthly line listings mined from the Argus safety database. The capabilities and limitations of these criteria were verified by comparing a manual review of all monthly cases with the system-generated monthly listings over six months. Results: On average, over a period of six months, the algorithm accurately identified 92% of DILI cases meeting the established criteria. The automated process easily compared liver enzyme elevations with baseline values, reducing the screening time to under 15 minutes, as opposed to the multiple hours spent on a cognitively laborious, manual process. Limitations of the algorithm include its inability to identify cases associated with non-standard laboratory tests, naming conventions, and/or incomplete or incorrectly entered laboratory values. Conclusions: The newly developed multifactorial algorithm proved to be extremely useful in detecting potential DILI cases while heightening the vigilance of the drug safety department. Additionally, the application of this algorithm may be useful in identifying a potential signal for DILI in drugs not yet known to cause liver injury (e.g., drugs in the initial phases of development). This algorithm also carries the potential for universal application, due to its product-agnostic data and keyword mining features. Plans for the tool include developing it into a fully automated application, thereby completely eliminating the manual screening process.
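
For illustration, the core of such a screen can be sketched as follows, using the standard Hy's law thresholds (ALT >= 3x ULN together with bilirubin >= 2x ULN) and the R-value; the DataFrame columns stand in for lab values mined from a line listing and are assumptions, not the actual Argus/SQL implementation.

```python
# Hedged sketch of the multifactorial screen: Hy's law thresholds and the
# R-value are standard DILI criteria; the DataFrame below is a placeholder
# for lab values mined from a safety database line listing.
import pandas as pd

cases = pd.DataFrame({
    "case_id": [1, 2, 3],
    "alt_xuln": [5.2, 1.4, 3.8],    # ALT as multiples of upper limit of normal
    "alp_xuln": [1.1, 2.5, 0.9],    # ALP as multiples of ULN
    "bili_xuln": [2.3, 0.8, 1.1],   # total bilirubin as multiples of ULN
})

# R-value characterizes the pattern of injury (hepatocellular vs cholestatic)
cases["r_value"] = cases["alt_xuln"] / cases["alp_xuln"]

# Hy's law: ALT >= 3x ULN together with bilirubin >= 2x ULN
cases["hys_law"] = (cases["alt_xuln"] >= 3) & (cases["bili_xuln"] >= 2)
flagged = cases[cases["hys_law"] | (cases["r_value"] >= 5)]  # hepatocellular pattern
print(flagged[["case_id", "r_value", "hys_law"]])
```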

Keywords: automation, drug-induced liver injury, pharmacovigilance, post-marketing

Procedia PDF Downloads 152
1353 Research on Health Emergency Management Based on Bibliometrics

Authors: Meng-Na Dai, Bao-Fang Wen, Gao-Pei Zhu, Chen-Xi Zhang, Jing Sun, Chang-Hai Tang, Zhi-Qiang Feng, Wen-Qiang Yin

Abstract:

Based on an analysis of the literature on health emergency management in China over the past 10 years, this paper discusses the current research hotspots, development trends, and shortcomings in this field in China, and provides references for scholars conducting follow-up research. CNKI (China National Knowledge Infrastructure), Weipu, and Wanfang were the databases searched. The keywords used in the database search were health, emergency, and management, covering the period from 2009 to 2018. Duplicate, non-academic, and unrelated documents were excluded, leaving 901 articles in the literature review database. The main indicators extracted were the number of articles published every year, authors, institutions, periodicals, etc. The analysis of the literature yielded several findings. Overall, the number of publications on health emergency management in China has shown a fluctuating downward trend over the past 10 years. Specifically, there is a lack of close cooperation between authors, and no core research team has yet formed among them. Meanwhile, the number of high-level periodicals and quality publications in this field is scarce. In addition, there are many research hotspots, such as emergency management systems, mechanism research, capacity evaluation index system research, and plans and capacity-building research. In the future, scientific research funding for health emergency management should be increased, collaborative innovation among authors in multi-disciplinary fields encouraged, and high-quality, high-impact journals created in this field. The state should encourage scholars in this field to carry out more academic cooperation and communication worldwide and to improve the research in breadth and depth. Generally speaking, research on health emergency management in China is still insufficient and needs to be improved.

Keywords: health emergency management, research situation, bibliometrics, literature

Procedia PDF Downloads 137
1352 Loss Function Optimization for CNN-Based Fingerprint Anti-Spoofing

Authors: Yehjune Heo

Abstract:

As biometric systems become widely deployed, identification systems can easily be attacked with various spoof materials. This paper contributes to finding a reliable and practical anti-spoofing method using Convolutional Neural Networks (CNNs), based on the choice of loss functions and optimizers. The CNNs used in this paper include AlexNet, VGGNet, and ResNet. By using various loss functions, including Cross-Entropy, Center Loss, Cosine Proximity, and Hinge Loss, and various optimizers, including Adam, SGD, RMSProp, Adadelta, Adagrad, and Nadam, we obtained significant performance changes. We find that choosing the correct loss function for each model is crucial, since different loss functions lead to different errors on the same evaluation. We validate our approach on a subset of the LivDet 2017 database to compare generalization power. It is important to note that we use the same subset of LivDet across all training and testing for each model; this way, we can compare the performance, in terms of generalization, on unseen data across all the different models. The best CNN (AlexNet), with the appropriate loss function and optimizer, results in more than a 3% performance gain over the other CNN models with the default loss function and optimizer. In addition to the highest generalization performance, this paper also reports the parameter counts and mean average error rates of the high-accuracy models, to find the model that consumes the least memory and computation time for training and testing. Although AlexNet has lower complexity than the other CNN models, it proved to be very efficient. For practical anti-spoofing systems, the deployed version should use a small amount of memory and run very fast with high anti-spoofing performance. For our deployed version on smartphones, additional processing steps, such as quantization and pruning algorithms, have been applied to our final model.
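
A hedged sketch of the experimental grid (not the authors' code) is shown below using Keras built-ins; the model architecture and input shape are placeholder assumptions, and only losses with built-in Keras string identifiers are swept (Center Loss, for instance, needs a custom implementation).

```python
# Illustrative sketch of sweeping loss functions and optimizers over one CNN;
# the architecture and 64x64 grayscale input shape are placeholder assumptions.
from tensorflow.keras import layers, models

def build_cnn():
    return models.Sequential([
        layers.Input((64, 64, 1)),
        layers.Conv2D(32, 3, activation="relu"), layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"), layers.MaxPooling2D(),
        layers.Flatten(), layers.Dense(2, activation="softmax"),
    ])

losses = ["categorical_crossentropy", "hinge", "cosine_similarity"]
optimizers = ["adam", "sgd", "rmsprop", "adadelta", "adagrad", "nadam"]

for loss in losses:
    for opt in optimizers:
        model = build_cnn()
        model.compile(optimizer=opt, loss=loss, metrics=["accuracy"])
        # model.fit(x_train, y_train, validation_data=(x_val, y_val), epochs=10)
        print(f"configured: loss={loss}, optimizer={opt}")
```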

Keywords: anti-spoofing, CNN, fingerprint recognition, loss function, optimizer

Procedia PDF Downloads 136
1351 Load Forecasting in Microgrid Systems with R and Cortana Intelligence Suite

Authors: F. Lazzeri, I. Reiter

Abstract:

Energy production optimization has traditionally been very important for utilities in order to improve resource consumption. However, load forecasting is a challenging task, as there are a large number of relevant variables that must be considered, and several strategies have been used to deal with this complex problem. This is especially true in microgrids, where many elements have to adjust their performance depending on future generation and consumption conditions. The goal of this paper is to present a solution for short-term load forecasting in microgrids, based on three machine learning experiments developed in R, and on web services built and deployed with different components of the Cortana Intelligence Suite: Azure Machine Learning, a fully managed cloud service that enables users to easily build, deploy, and share predictive analytics solutions; SQL Database, a Microsoft database service for app developers; and PowerBI, a suite of business analytics tools for analyzing data and sharing insights. Our results show that Boosted Decision Tree and Fast Forest Quantile regression methods can be very useful for predicting hourly short-term consumption in microgrids; moreover, we found that for these types of forecasting models, weather data (temperature, wind, humidity, and dew point) can play a crucial role in improving the accuracy of the forecasting solution. Data cleaning and feature engineering methods performed in R, and different types of machine learning algorithms (Boosted Decision Tree, Fast Forest Quantile, and ARIMA), will be presented, and the results and performance metrics discussed.
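
An analogous sketch in Python is given below (the paper's pipeline uses R and Azure ML); scikit-learn's gradient boosting with a quantile loss stands in for the Fast Forest Quantile idea, and the dataset and feature names are illustrative assumptions.

```python
# Analogous sketch of boosted-tree quantile regression for hourly load,
# with weather features as inputs; file and feature names are assumptions.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor

df = pd.read_csv("microgrid_hourly.csv")  # hypothetical cleaned dataset
features = ["temperature", "wind", "humidity", "dew_point", "hour", "weekday"]
X, y = df[features], df["load_kw"]

# loss="quantile" with alpha=0.5 predicts the median hourly consumption,
# mirroring the quantile-regression idea behind Fast Forest Quantile
model = GradientBoostingRegressor(loss="quantile", alpha=0.5, n_estimators=300)
model.fit(X[:-24], y[:-24])               # hold out the last day
forecast = model.predict(X[-24:])         # short-term 24-hour forecast
print(forecast.round(1))
```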

Keywords: time-series, features engineering methods for forecasting, energy demand forecasting, Azure Machine Learning

Procedia PDF Downloads 297
1350 A Survey of Digital Health Companies: Opportunities and Business Model Challenges

Authors: Iris Xiaohong Quan

Abstract:

The global digital health market reached 175 billion U.S. dollars in 2019 and is expected to grow at about 25% CAGR to over 650 billion USD by 2025. Different terms such as digital health, e-health, mHealth, and telehealth have been used in the field, which can sometimes cause confusion. The term digital health was originally introduced to refer specifically to the use of interactive media, tools, platforms, applications, and solutions that are connected to the Internet to address the health concerns of providers as well as consumers. While mHealth emphasizes the use of mobile phones in healthcare, telehealth means using technology to remotely deliver clinical health services to patients. According to the FDA, “the broad scope of digital health includes categories such as mobile health (mHealth), health information technology (IT), wearable devices, telehealth and telemedicine, and personalized medicine.” Some researchers believe that digital health is nothing other than the cultural transformation healthcare has been going through in the 21st century because of digital health technologies that provide data to both patients and medical professionals. As digital health is burgeoning but research in the area is still inadequate, our paper aims to clear up the definitional confusion and provide an overall picture of digital health companies. We further investigate how business models are designed and differentiated in the emerging digital health sector. Both quantitative and qualitative methods are adopted in the research. For the quantitative analysis, our research data came from two databases, Crunchbase and CBInsights, which are well-recognized information sources for researchers, entrepreneurs, managers, and investors. We searched a few keywords in the Crunchbase database based on companies' self-descriptions: digital health, e-health, and telehealth. A search for “digital health” returned 941 unique results, “e-health” returned 167 companies, and “telehealth” returned 427. We also searched the CBInsights database for similar information. After merging the results, removing duplicates, and cleaning up the database, we came up with a list of 1464 digital health companies. A qualitative method is used to complement the quantitative analysis: an in-depth case analysis of three successful unicorn digital health companies to understand how business models evolve and to discuss the challenges faced in this sector. Our research returned some interesting findings. For instance, we found that 86% of the digital health startups were founded in the decade since 2010, and 75% of the digital health companies have fewer than 50 employees, with almost 50% having fewer than 10. This shows that digital health companies are relatively young and small in scale. On the business model analysis, while traditional healthcare businesses emphasize the so-called “3P” (patient, physicians, and payer), digital health companies extend this to “5P” by adding patents, a result of technology requirements (such as the development of artificial intelligence models), and platform, an effective value-creation approach that brings the stakeholders together. Our case analysis details the 5P framework and contributes to the extant knowledge on business models in the healthcare industry.

Keywords: digital health, business models, entrepreneurship opportunities, healthcare

Procedia PDF Downloads 183
1349 The Current State of Human Gait Simulator Development

Authors: Stepanov Ivan, Musalimov Viktor, Monahov Uriy

Abstract:

This report examines the current state of human gait simulator development based on a model of the human hip joint. The unit will create a database of human gait types, useful for setting up and calibrating mechanotherapy devices, as well as for the creation of new rehabilitation systems, exoskeletons, and walking robots. The system offers ample opportunity to configure dimensions and stiffness while maintaining relative simplicity.

Keywords: hip joint, human gait, physiotherapy, simulation

Procedia PDF Downloads 406
1348 Research Trends in Using Virtual Reality for the Analysis and Treatment of Lower-Limb Musculoskeletal Injury of Athletes: A Literature Review

Authors: Hannah K. M. Tang, Muhammad Ateeq, Mark J. Lake, Badr Abdullah, Frederic A. Bezombes

Abstract:

There is little research applying virtual reality (VR) to the treatment of musculoskeletal injury in athletes, despite the prevalence of such injuries and their implications for physical and psychological health. Nevertheless, developments in wireless VR headsets better facilitate dynamic movement in VR environments (VREs), and more research is expected in this emerging field. This systematic review identified publications that used VR interventions for the analysis or treatment of lower-limb musculoskeletal injury in athletes. It established a search protocol and, through narrative discussion, identified existing trends. Database searches encompassed four term sets: 1) VR systems; 2) musculoskeletal injuries; 3) sporting population; 4) movement outcome analysis. Overall, 126 publications were identified through database searching, and twelve were included in the final analysis and discussion. Many of the studies were pilot and proof-of-concept work. Seven of the twelve publications were observational studies; however, these may provide preliminary data from which clinical trials will branch. Where specified, the focus of the literature was very narrow, with very similar population demographics and injuries. The trends in the literature emphasised the role of VR in attentional focus, the strategic manipulation of movement outcomes, and the transfer of skill to the real world. Causal inferences may have been undermined by design flaws, as most studies were limited by the practicality of conducting a two-factor clinical VR-based study. In conclusion, by assessing the exploratory studies, and combining this with numerous developments, techniques, and tools, a novel application could be established that utilises VR with dynamic movement for the effective treatment of specific musculoskeletal injuries in athletes.

Keywords: athletes, lower-limb musculoskeletal injury, rehabilitation, return-to-sport, virtual reality

Procedia PDF Downloads 233
1347 Ibrutinib and the Potential Risk of Cardiac Failure: A Review of Pharmacovigilance Data

Authors: Abdulaziz Alakeel, Roaa Alamri, Abdulrahman Alomair, Mohammed Fouda

Abstract:

Introduction: Ibrutinib is a selective, potent, and irreversible small-molecule inhibitor of Bruton's tyrosine kinase (BTK). It forms a covalent bond with a cysteine residue (CYS-481) at the active site of BTK, leading to inhibition of BTK enzymatic activity. The drug is indicated to treat certain types of cancer, such as mantle cell lymphoma (MCL), chronic lymphocytic leukaemia, and Waldenström's macroglobulinaemia (WM). Cardiac failure refers to the inability of the heart muscle to pump adequate blood to the body's organs. There are multiple types of cardiac failure, including left- and right-sided heart failure and systolic and diastolic heart failure. The aim of this review is to evaluate the risk of cardiac failure associated with the use of ibrutinib and to suggest regulatory recommendations if required. Methodology: The Signal Detection team at the National Pharmacovigilance Center (NPC) of the Saudi Food and Drug Authority (SFDA) performed a comprehensive signal review using its national database as well as the World Health Organization (WHO) database (VigiBase) to retrieve related information for assessing the causality between cardiac failure and ibrutinib. We used the WHO-Uppsala Monitoring Centre (UMC) criteria as the standard for assessing the causality of the reported cases. Results: Case review: 212 global individual case safety reports (ICSRs) for the drug/adverse drug reaction combination were retrieved as of July 2020. The reviewers selected and assessed causality for the well-documented ICSRs with completeness scores of 0.9 and above (35 ICSRs); the value 1.0 represents the highest score for the best-documented ICSRs. More than half of the reviewed cases provide a supportive association (four probable and 15 possible cases). Data mining: The disproportionality between the observed and expected reporting rates for the drug/adverse drug reaction pair is estimated using the information component (IC), a tool developed by the WHO-UMC to measure the reporting ratio. A positive IC reflects a higher statistical association, while negative values indicate a lower statistical association, with the null value equal to zero. The result (IC = 1.5) revealed a positive statistical association for the drug/ADR combination, meaning that “ibrutinib” with “cardiac failure” has been reported more often than expected when compared to other medications in the WHO database. Conclusion: Health regulators and health care professionals must be aware of the potential risk of cardiac failure associated with ibrutinib, and monitoring of any signs or symptoms in treated patients is essential. The weighted cumulative evidence identified from the causality assessment of the reported cases and from data mining is sufficient to support a causal association between ibrutinib and cardiac failure.
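
For illustration, the information component with the usual +0.5 shrinkage can be computed as below; the marginal counts are hypothetical values chosen only so the example reproduces IC of about 1.5, not the actual VigiBase tallies.

```python
# Sketch of the WHO-UMC information component with the usual +0.5 shrinkage;
# the marginal counts below are illustrative, not the actual VigiBase tallies.
import math

def information_component(n_drug_reaction, n_drug, n_reaction, n_total):
    expected = n_drug * n_reaction / n_total
    return math.log2((n_drug_reaction + 0.5) / (expected + 0.5))

# hypothetical counts: reports with the drug+ADR pair, with the drug,
# with the ADR, and in the whole database (chosen to give IC ~ 1.5)
ic = information_component(212, 10_000, 150_000, 20_000_000)
print(f"IC = {ic:.2f}  (positive values suggest higher-than-expected reporting)")
```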

Keywords: cardiac failure, drug safety, ibrutinib, pharmacovigilance, signal detection

Procedia PDF Downloads 129
1346 Unsupervised Classification of DNA Barcodes Species Using Multi-Library Wavelet Networks

Authors: Abdesselem Dakhli, Wajdi Bellil, Chokri Ben Amar

Abstract:

DNA barcodes are short mitochondrial DNA fragments whose nucleotides are each made up of three subunits: a phosphate group, a sugar, and one of four nucleic bases (A, T, C, and G). They provide a good source of the information needed to classify living species, an intuition confirmed by many experimental results. Species classification with DNA barcode sequences has been studied by several researchers. The classification problem assigns unknown species to known ones by analyzing their barcodes, a task that has to be supported with reliable methods and algorithms. To analyze specific regions or entire genomes, it becomes necessary to use sequence similarity methods. A large set of sequences can be compared simultaneously using Multiple Sequence Alignment, which is known to be NP-complete; to make this type of analysis feasible, heuristics like progressive alignment have been developed. Another tool for similarity search against a database of sequences is BLAST, which outputs short regions of high similarity between a query sequence and matched sequences in the database. However, all these methods are still computationally very expensive and require significant computational infrastructure. Our goal is to build predictive models that are highly accurate and interpretable; this method makes it possible to avoid the complex problem of form and structure in different classes of organisms. The approach is evaluated on empirical data, and its classification performance is compared with other methods. Our system consists of three phases. The first, called transformation, is composed of three steps: Electron-Ion Interaction Pseudopotential (EIIP) codification of the DNA barcodes, Fourier transform, and power spectrum signal processing. The second, called approximation, is empowered by the use of Multi-Library Wavelet Neural Networks (MLWNN). The third is the classification of DNA barcodes, realized by applying a hierarchical classification algorithm.
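
The transformation phase can be sketched as below: EIIP codification of a barcode followed by an FFT power spectrum; the input sequence is a placeholder, and the EIIP values are the standard published ones.

```python
# Minimal sketch of the transformation phase: EIIP codification of a DNA
# barcode followed by FFT and power spectrum (NumPy); the input sequence
# is a placeholder.
import numpy as np

EIIP = {"A": 0.1260, "C": 0.1340, "G": 0.0806, "T": 0.1335}

def power_spectrum(seq):
    signal = np.array([EIIP[b] for b in seq.upper()])
    signal = signal - signal.mean()          # remove the DC component
    spectrum = np.fft.rfft(signal)
    return np.abs(spectrum) ** 2             # power at each frequency

barcode = "ACGTACGTTAGCCGATACGT"             # placeholder barcode fragment
ps = power_spectrum(barcode)
print(ps[:5].round(6))
```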

Keywords: DNA barcode, electron-ion interaction pseudopotential, Multi Library Wavelet Neural Networks (MLWNN)

Procedia PDF Downloads 318
1345 Security of Database Using Chaotic Systems

Authors: Eman W. Boghdady, A. R. Shehata, M. A. Azem

Abstract:

Database (DB) security demands permitting the actions of authorized users on the DB and the objects inside it while prohibiting those of unauthorized users and intruders. Successful organizations demand the confidentiality of their DBs: they do not allow unauthorized access to their data/information, and they also demand assurance that their data is protected against any malicious or accidental modification. DB protection and confidentiality are the security concerns. There are four types of controls for obtaining DB protection: access control, information flow control, inference control, and cryptographic control. Cryptographic control is considered the backbone of DB security; it secures the DB by encryption during storage and communication. Current cryptographic techniques are classified into two types: traditional classical cryptography using standard algorithms (DES, AES, IDEA, etc.) and chaos cryptography using continuous (Chua, Rossler, Lorenz, etc.) or discrete (Logistic, Henon, etc.) systems. The important characteristic of chaos is its extreme sensitivity to the initial conditions of the system. In this paper, DB-security systems based on chaotic algorithms are described. The Pseudo-Random Number Generators (PRNGs) derived from the different chaotic algorithms are implemented using Matlab, and their statistical properties are evaluated using NIST and other statistical test suites. These algorithms are then used to secure a conventional DB (plaintext), where the statistical properties of the ciphertext are also tested. To increase the complexity of the PRNGs and to pass all the NIST statistical tests, we propose two hybrid PRNGs: one based on two chaotic Logistic maps and another based on two chaotic Henon maps, where the chaotic maps run side by side starting from random, independent initial conditions and parameters (the encryption keys). The resulting hybrid PRNGs passed the NIST statistical test suite.
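
A minimal sketch of the proposed hybrid idea follows: two logistic maps run side by side from independent keys and are XOR-combined into a keystream (shown in Python rather than Matlab); the parameter choices are illustrative, and no NIST-level quality is implied.

```python
# Sketch of the hybrid-PRNG idea: two logistic maps run side by side from
# independent keys (initial conditions/parameters), XOR-combined into a
# keystream. Parameter choices here are illustrative only.
def logistic_bits(x, r=3.99):
    while True:
        x = r * x * (1.0 - x)                # chaotic logistic map
        yield 1 if x > 0.5 else 0            # threshold to one bit

def hybrid_keystream(key1=0.123456, key2=0.654321, n_bytes=16):
    g1, g2 = logistic_bits(key1), logistic_bits(key2)
    out = bytearray()
    for _ in range(n_bytes):
        byte = 0
        for _ in range(8):
            byte = (byte << 1) | (next(g1) ^ next(g2))  # XOR the two maps
        out.append(byte)
    return bytes(out)

plaintext = b"record: salary=1"
key = hybrid_keystream(n_bytes=len(plaintext))
ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
print(ciphertext.hex())
```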

Keywords: algorithms and data structure, DB security, encryption, chaotic algorithms, Matlab, NIST

Procedia PDF Downloads 265
1344 Bioinformatic Identification of Rare Codon Clusters in Protein Structures of HBV

Authors: Abdorrasoul Malekpour, Mohammad Ghorbani, Mojtaba Mortazavi, Mohammadreza Fattahi, Mohammad Hassan Meshkibaf, Ali Fakhrzad, Saeid Salehi, Saeideh Zahedi, Amir Ahmadimoghaddam, Parviz Farzadnia, Mohammadreza Hajyani Asl

Abstract:

Hepatitis B is an infectious disease whose virus (HBV) has eight main genotypes (A–H). The aim of this study is to bioinformatically identify Rare Codon Clusters (RCCs) in the protein structures of HBV. To detect the protein family accession numbers (Pfam) of HBV proteins, the UniProt database and the Pfam search tool were used. The obtained Pfam IDs were analyzed in the Sherlocc program, and RCCs in HBV proteins were detected. Further, the structures of the TrEMBL entry proteins were studied in the PDB database, and the 3D structures of the HBV proteins and the locations of the RCCs were visualized and studied using the Swiss-PDB Viewer software. The Pfam search tool found nine significant hits and no insignificant hits in 3 frames. The results of the Pfams studied in the Sherlocc program show that the program did not identify RCCs in the external core antigen (PF08290) or the truncated HBeAg protein (PF08290). By contrast, RCCs were identified in the hepatitis core antigen (PF00906), large envelope protein S (PF00695), X protein (PF00739), DNA polymerase (viral) N-terminal domain (PF00242), and protein P (PF00336). In the HBV genome, seven RCCs were identified, found in the hepatitis core antigen, large envelope protein S, and DNA polymerase proteins; the protein structures of the TrEMBL entry sequences reported in the Sherlocc program outputs are not complete. Based on the locations of the RCCs in the structures of the HBV proteins, it is suggested that these RCCs are important in the HBV life cycle. We hope that this study provides a new and deep perspective for protein research and for drug design in the treatment of HBV.
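
As a hedged illustration of RCC detection in general (not the Sherlocc algorithm itself), the sketch below marks low-frequency codons and flags windows dense in them; the thresholds and the placeholder sequence are assumptions.

```python
# Hedged sketch of generic RCC detection: compute codon frequencies, mark
# rare codons, and flag sliding windows dense in them. Thresholds and the
# placeholder sequence are assumptions, not Sherlocc's actual method.
from collections import Counter

def rare_codon_clusters(cds, rare_quantile=0.1, window=10, min_rare=4):
    codons = [cds[i:i + 3] for i in range(0, len(cds) - 2, 3)]
    freqs = Counter(codons)
    cutoff = sorted(freqs.values())[max(0, int(len(freqs) * rare_quantile) - 1)]
    rare = {c for c, n in freqs.items() if n <= cutoff}
    clusters = []
    for i in range(len(codons) - window + 1):
        hits = sum(codon in rare for codon in codons[i:i + window])
        if hits >= min_rare:
            clusters.append((i, i + window, hits))  # codon positions
    return clusters

cds = "ATGGCC..."  # placeholder coding sequence of an HBV protein
print(rare_codon_clusters(cds.upper()))
```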

Keywords: rare codon clusters, hepatitis B virus, bioinformatic study, infectious disease

Procedia PDF Downloads 488
1343 Analysis of Human Toxicity Potential of Major Building Material Production Stage Using Life Cycle Assessment

Authors: Rakhyun Kim, Sungho Tae

Abstract:

Global environmental issues such as abnormal weather due to global warming, resource depletion, and ecosystem distortion have been escalating due to rapid population growth and the expansion of industrial and economic development. Accordingly, initiatives have been implemented by many countries to protect the environment through indirect regulation methods, such as Environmental Product Declarations (EPD), in addition to direct regulations such as various emission standards. Following this trend, life cycle assessment (LCA) techniques that provide quantitative environmental information, such as Human Toxicity Potential (HTP), for buildings are being developed in the construction industry. At present, however, studies on the environmental database of building materials are not sufficient to provide this support adequately. The purpose of this study is to analyze the human toxicity potential of the production stage of major building materials using life cycle assessment. For this purpose, a theoretical consideration of life cycle assessment and environmental impact categories was performed, and the direction of the study was set: the major materials were identified from a global warming potential perspective for the building, and a life cycle inventory database was selected. Classification was performed for 17 kinds of substances against the impact indices, such as human toxicity potential, specified in CML 2001. The human toxicity potential of the building material production stage was then calculated through characterization. Meanwhile, the environmental impacts of building materials in the same category were analyzed based on the characterization impacts calculated in this study. This study establishes environmental impact coefficients for major building materials in compliance with ISO 14040. Through this, it is believed the work can effectively support the decisions of stakeholders seeking to improve the environmental performance of buildings and provide a basis for the voluntary participation of architects in environmental consideration activities.
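
The characterization step reduces to a weighted sum: HTP equals each substance's emission multiplied by its characterization factor, summed over substances. The sketch below uses illustrative values, not the study's inventory database.

```python
# Minimal sketch of LCA characterization: HTP = sum of each substance's
# emission times its CML 2001 characterization factor. The emissions and
# factors below are illustrative placeholders, not the study's database.
emissions_kg = {"benzene": 0.004, "arsenic": 0.00002, "nickel": 0.0001}
htp_factors = {"benzene": 1900.0, "arsenic": 350000.0, "nickel": 35000.0}  # kg 1,4-DB eq/kg (illustrative)

htp = sum(m * htp_factors[s] for s, m in emissions_kg.items())
print(f"HTP of production stage: {htp:.2f} kg 1,4-DB eq per functional unit")
```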

Keywords: human toxicity potential, major building material, life cycle assessment, production stage

Procedia PDF Downloads 139
1342 A Convolutional Neural Network Based Vehicle Theft Detection, Location, and Reporting System

Authors: Michael Moeti, Khuliso Sigama, Thapelo Samuel Matlala

Abstract:

One of the principal challenges the world is confronted with is insecurity. The crime rate is increasing exponentially, and protecting physical assets, especially in the motoring industry, is becoming impossible through human effort alone. The need to develop technological solutions that detect and report theft without any human interference is therefore inevitable. This is critical, especially for vehicle owners, to ensure theft detection and speedy identification towards recovery efforts in cases where a vehicle is missing or attempted theft is taking place. The vehicle theft detection system uses a Convolutional Neural Network (CNN) to recognize the driver's face, captured using an installed mobile phone device. The location identification function uses the Global Positioning System (GPS) to determine the real-time location of the vehicle. Upon identification of the location, Global System for Mobile Communications (GSM) technology is used to report or notify the vehicle owner about the whereabouts of the vehicle. The installed mobile app was implemented in Python, a strong choice for machine learning because it allows easy access to machine learning algorithms through its widely developed library ecosystem. The graphical user interface was developed using Java, as it is better suited for mobile development. Google's online database (Firebase) was used as the means of storage for the application. The system integration test was performed using a simple percentage analysis: sixty (60) vehicle owners participated in this study as a sample, and questionnaires were used to establish the acceptability of the developed system. The results indicate the efficiency of the proposed system, and consequently the paper proposes that the system can effectively monitor a vehicle at any given place, even if it is driven outside its normal jurisdiction. Moreover, the system can be used as a database to detect, locate, and report missing vehicles to different security agencies.

Keywords: CNN, location identification, tracking, GPS, GSM

Procedia PDF Downloads 166
1341 Choosing an Optimal Epsilon for Differentially Private Arrhythmia Analysis

Authors: Arin Ghazarian, Cyril Rakovski

Abstract:

Differential privacy has become the leading technique for protecting the privacy of individuals in a database while allowing useful analyses to be done and the results to be shared. It puts a guarantee on the amount of privacy loss in the worst-case scenario. Differential privacy is not a toggle between full privacy and zero privacy: it controls the tradeoff between the accuracy of the results and the privacy loss using a single key parameter called epsilon (ε).
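
As an illustration of the role of epsilon, the sketch below applies the standard Laplace mechanism to a counting query (noise scale = sensitivity/epsilon); the query value is made up.

```python
# Sketch of how epsilon controls the accuracy/privacy tradeoff via the
# Laplace mechanism: noise scale = sensitivity / epsilon, so smaller
# epsilon means stronger privacy and noisier answers. Values are made up.
import numpy as np

def laplace_count(true_count, epsilon, sensitivity=1.0):
    # a counting query changes by at most 1 when one record is added/removed
    return true_count + np.random.laplace(0.0, sensitivity / epsilon)

true_count = 342  # e.g., patients with a given arrhythmia in the database
for eps in (0.1, 1.0, 10.0):
    noisy = np.array([laplace_count(true_count, eps) for _ in range(1000)])
    print(f"epsilon={eps}: mean abs error = {np.mean(np.abs(noisy - true_count)):.2f}")
```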

Keywords: arrhythmia, cardiology, differential privacy, ECG, epsilon, medical data, privacy preserving analytics, statistical databases

Procedia PDF Downloads 152
1340 A Prototype of an Information and Communication Technology Based Intervention Tool for Children with Dyslexia

Authors: Rajlakshmi Guha, Sajjad Ansari, Shazia Nasreen, Hirak Banerjee, Jiaul Paik

Abstract:

Dyslexia is a neurocognitive disorder affecting around fifteen percent of the Indian population. The symptoms include difficulty in reading letters, words, and sentences; these difficulties can occur at the phonemic or recognition level and may further affect lexical structures. Therapeutic intervention for dyslexic children after assessment is generally done by special educators and psychologists through one-on-one interaction. Considering the large number of children affected and the scarcity of experts, access to care is limited in India. Moreover, the unavailability of resources and of timely communication with caregivers adds to the problem of proper intervention. With the development of educational technology and its use in India, access to information and care has improved in this large and diverse country. In this context, this paper proposes an ICT-enabled, home-based intervention program for dyslexic children that would support the child and provide an interactive interface between experts, parents, and students. The paper discusses the details of the database design and system layout of the program. It also highlights the development of the different technical aids required to build personalized Android applications for the Indian dyslexic population. These technical aids include speech database creation for children, an automatic speech recognition system, serious game development, and color-coded fonts. The paper also describes the games developed to assist the dyslexic child in cognitive training, primarily for attention, working memory, and spatial reasoning. In addition, it discusses the specific elements of the interactive intervention tool that make it effective for home-based intervention of dyslexia.

Keywords: Android applications, cognitive training, dyslexia, intervention

Procedia PDF Downloads 291
1339 An Assessment of Drainage Network Systems in Nigerian Urban Areas Using Geographical Information Systems: A Case Study of Bida, Niger State

Authors: Yusuf Hussaini Atulukwu, Daramola Japheth, Tabitit S. Tabiti, Daramola Elizabeth Lara

Abstract:

In view of the recent problems faced by the township concerning poorly constructed and, in some cases, non-existent drainage facilities, which have resulted in incessant flooding in some parts of the community and pose a threat to life, property, and the environment, this research seeks to address the issue by showing the spatial distribution of the drainage network in Bida Urban using Geographic Information System techniques. Relevant features were extracted from an existing base map of Bida using on-screen digitization, and the x, y, z data of existing drainages were acquired using a handheld Global Positioning System (GPS) receiver. These data were uploaded into ArcGIS 9.2 software and stored in a relational database structure that was used to produce the spatial drainage network of the township. The results revealed that about 40% of the drainages are blocked with sand and refuse, 35% are water-logged as a result of buildings constructed across erosion channels, and bridges are dilapidated as a result of the lack of drainage along major roads. The study thus concluded that the drainage network system in the Bida community is not in good working condition and that urgent measures must be initiated in order to avoid future disasters, especially with the rainy season setting in. Based on the above findings, the study recommends that people within the locality avoid dumping municipal waste into the drainage paths, and that sand-blocked or weed-blocked drains be cleared by the authority concerned. In the same vein, the authority should ensure that contracts for drainage construction are awarded to professionals and that all the natural drainage channels formed by erosion are addressed to avoid future disasters.

Keywords: drainage network, spatial, digitization, relational database, waste

Procedia PDF Downloads 334
1338 Adapting Tools for Text Monitoring and for Scenario Analysis Related to the Field of Social Disasters

Authors: Svetlana Cojocaru, Mircea Petic, Inga Titchiev

Abstract:

Humanity is faced more and more often with different social disasters, which in turn can generate new accidents and catastrophes. To mitigate their consequences, it is important to obtain the earliest possible signals about events that are occurring or may occur and to prepare corresponding scenarios that could be applied. Our research is focused on solving two problems in this domain: identifying signals that an accident has occurred or may occur, and mitigating some consequences of disasters. To solve the first problem, methods of selecting and processing texts from the global Internet are developed; information in Romanian is of special interest to us. To obtain the mentioned tools, several steps are followed, divided into a preparatory stage and a processing stage. During the first stage, we manually collected 724 news articles and classified them into 10 categories of social disasters, constituting more than 150 thousand words. Using this information, a controlled vocabulary of more than 300 keywords was elaborated, which will help in the classification and identification of texts related to the field of social disasters. To solve the second problem, the Petri net formalism has been used. We deal with the problem of evacuating inhabitants in useful time. Analysis methods such as the reachability or coverability tree and the invariants technique will be used to determine dynamic properties of the modeled systems. To perform a case study of the properties of the evacuation system extended by adding time, the analysis modules of PIPE, such as Generalized Stochastic Petri Net (GSPN) analysis, simulation, state space analysis, and invariant analysis, have been used. These modules helped us obtain the average number of persons situated in the rooms and other quantitative properties and characteristics related to the system's dynamics.
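
A toy sketch of the Petri net firing rule used in such evacuation models is given below; the two-transition net and token counts are illustrative, not the paper's PIPE model.

```python
# Tiny sketch of the Petri net formalism: a transition fires when all its
# input places hold enough tokens. This two-room evacuation net is an
# illustrative toy, not the paper's model.
marking = {"room": 25, "corridor": 0, "exit": 0}
transitions = {
    "leave_room": {"in": {"room": 1},     "out": {"corridor": 1}},
    "reach_exit": {"in": {"corridor": 1}, "out": {"exit": 1}},
}

def fire(name):
    t = transitions[name]
    if all(marking[p] >= n for p, n in t["in"].items()):   # enabled?
        for p, n in t["in"].items():
            marking[p] -= n
        for p, n in t["out"].items():
            marking[p] += n
        return True
    return False

while fire("leave_room") or fire("reach_exit"):
    pass
print(marking)  # all tokens (persons) end in 'exit'
```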

Keywords: lexicon of disasters, modelling, Petri nets, text annotation, social disasters

Procedia PDF Downloads 197
1337 Ground Motion Modeling Using the Least Absolute Shrinkage and Selection Operator

Authors: Yildiz Stella Dak, Jale Tezcan

Abstract:

Ground motion models that relate a strong motion parameter of interest to a set of predictive seismological variables describing the earthquake source, the propagation path of the seismic wave, and the local site conditions constitute a critical component of seismic hazard analyses. When a sufficient number of strong motion records are available, ground motion relations are developed using statistical analysis of the recorded ground motion data. In regions lacking a sufficient number of recordings, a synthetic database is developed using stochastic, theoretical, or hybrid approaches. Regardless of the manner in which the database was developed, ground motion relations are developed using regression analysis. The development of a ground motion relation is a challenging process which inevitably requires the modeler to make subjective decisions regarding the inclusion criteria of the recordings, the functional form of the model, and the set of seismological variables to be included in the model. Because these decisions are critically important to the validity and the applicability of the model, there is continuous interest in procedures that will facilitate the development of ground motion models. This paper proposes the use of the Least Absolute Shrinkage and Selection Operator (LASSO) in selecting the set of predictive seismological variables to be used in developing a ground motion relation. The LASSO can be described as a penalized regression technique with a built-in capability for variable selection. Similar to ridge regression, the LASSO is based on the idea of shrinking the regression coefficients to reduce the variance of the model. Unlike ridge regression, where the coefficients are shrunk but never set equal to zero, the LASSO sets some of the coefficients exactly to zero, effectively performing variable selection. Given a set of candidate input variables and the output variable of interest, LASSO allows ranking the input variables in terms of their relative importance, thereby facilitating the selection of the set of variables to be included in the model. Because the risk of overfitting increases as the ratio of the number of predictors to the number of recordings increases, selection of a compact set of variables is important in cases where a small number of recordings are available. In addition, identification of a small set of variables can improve the interpretability of the resulting model, especially when there is a large number of candidate predictors. A practical application of the proposed approach is presented, using more than 600 recordings from the Next Generation Attenuation (NGA) database, where the effect of a set of seismological predictors on the 5% damped maximum direction spectral acceleration is investigated. The candidate predictors considered are magnitude, Rrup, and Vs30. Using LASSO, the relative importance of the candidate predictors has been ranked. Regression models of increasing complexity were constructed using the one, two, three, and four best predictors, and the models' ability to explain the observed variance in the target variable has been compared. The bias-variance trade-off in the context of model selection is discussed.
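
A minimal sketch of LASSO-based predictor ranking with scikit-learn follows; the synthetic data and functional form merely stand in for the NGA recordings and are assumptions.

```python
# Sketch of LASSO-based predictor ranking (scikit-learn); the synthetic
# data stand in for the NGA recordings, with magnitude, Rrup, and Vs30.
import numpy as np
from sklearn.linear_model import LassoCV
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 600
X = np.column_stack([
    rng.uniform(4, 8, n),          # magnitude
    rng.uniform(1, 200, n),        # Rrup (km)
    rng.uniform(150, 1500, n),     # Vs30 (m/s)
])
# synthetic log spectral acceleration with noise (illustrative functional form)
y = 1.2 * X[:, 0] - 1.5 * np.log(X[:, 1]) - 0.4 * np.log(X[:, 2]) \
    + rng.normal(0, 0.5, n)

Xs = StandardScaler().fit_transform(X)      # put predictors on one scale
lasso = LassoCV(cv=5).fit(Xs, y)
for name, coef in zip(["Magnitude", "Rrup", "Vs30"], lasso.coef_):
    print(f"{name:>9}: {coef:+.3f}")        # zeroed coefficients drop out
```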

Keywords: ground motion modeling, least absolute shrinkage and selection operator, penalized regression, variable selection

Procedia PDF Downloads 330
1336 A Web Service Based Sensor Data Management System

Authors: Rose A. Yemson, Ping Jiang, Oyedeji L. Inumoh

Abstract:

The deployment of wireless sensor networks has rapidly increased; however, the increased capacity and diversity of sensors, with applications ranging from biological and environmental to military, generate a tremendous volume of data, and more attention has been placed on distributed sensing than on how to manage, analyze, retrieve, and understand the data generated. This makes it quite difficult to process live sensor data and run concurrent control and updates, because sensor data are heavyweight, complex, and slow to process. This work focuses on developing a web service platform for the automatic detection of sensors, acquisition of sensor data, storage of sensor data in a database, and processing of sensor data using reconfigurable software components. It also creates a web-service-based sensor data management system to monitor the physical movement of an individual wearing a wireless network sensor (Sun SPOT). The sensor detects the movement of the individual by sensing the acceleration along the X, Y, and Z axes and then sends the sensed readings to a database interfaced with an Internet platform. The collected data determine the posture of the person, such as standing, sitting, or lying down. The system is designed using the Unified Modeling Language (UML) and implemented using Java, JavaScript, HTML, and MySQL. The system allows closely monitoring an individual in real time and obtaining details of their physical activity without being physically present for in-situ measurement, which enables working remotely instead of performing time-consuming in-person checks. These details can help in evaluating an individual's physical activity and generating feedback on medication. They can also help in keeping track of any mandatory physical activities the individual is required to do. These evaluations and this feedback can help maintain a better health status for the individual and provide improved health care.
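
The posture logic can be sketched as a simple gravity-axis rule on the tri-axial readings; the axis orientation and thresholds below are assumptions about the sensor mounting, not the system's actual classifier.

```python
# Hedged sketch of the posture rule: infer standing/sitting/lying from which
# axis carries gravity (~1 g) in the tri-axial accelerometer reading. Axis
# orientation and thresholds are assumptions about the sensor mounting.
def classify_posture(ax, ay, az, tol=0.3):
    if abs(ay - 1.0) < tol:          # gravity along torso axis: upright
        return "standing"
    if abs(az - 1.0) < tol:          # gravity through chest: lying down
        return "lying down"
    if abs(ay - 0.7) < tol:          # tilted torso: seated
        return "sitting"
    return "unknown"

for sample in [(0.02, 0.98, 0.05), (0.01, 0.12, 0.97), (0.05, 0.68, 0.60)]:
    print(sample, "->", classify_posture(*sample))
```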

Keywords: HTML, Java, JavaScript, MySQL, Sun SPOT, UML, web-based, wireless network sensor

Procedia PDF Downloads 212
1335 Speech Emotion Recognition: A DNN and LSTM Comparison in Single and Multiple Feature Application

Authors: Thiago Spilborghs Bueno Meyer, Plinio Thomaz Aquino Junior

Abstract:

Through speech, which privileges the functional and interactive nature of text, it is possible to ascertain spatiotemporal circumstances, the conditions of production and reception of the discourse, and explicit purposes such as informing, explaining, and convincing. These conditions allow the interaction between humans to be brought closer to human-robot interaction, making it natural and sensitive to information. However, it is not enough to understand what is said; it is necessary to recognize emotions for the desired interaction. We verified the validity of using neural networks for feature selection and emotion recognition. For this purpose, we propose the use of neural networks and a comparison of models, namely recurrent neural networks and deep neural networks, to classify emotions from speech signals and verify the quality of recognition. The goal is to enable the deployment of robots in a domestic environment, such as the HERA robot from the RoboFEI@Home team, which focuses on autonomous service robots for the home. Tests were performed using only the Mel-Frequency Cepstral Coefficients (MFCCs), as well as tests with several features: Delta-MFCC, spectral contrast, and the Mel spectrogram. For the training, validation, and testing of the neural networks, the eNTERFACE'05 database was used, which has 42 speakers of 14 different nationalities speaking English. The data in the chosen database are videos, which were converted into audio for use with the neural networks. As a result, a classification accuracy of 51.969% was found when using the deep neural network, while the recurrent neural network achieved an accuracy of 44.09%. The results are more accurate when only the Mel-Frequency Cepstral Coefficients are used for classification with the deep neural network; in only one case is greater accuracy observed with the recurrent neural network, which occurs when several features are used with a batch size of 73 and 100 training epochs.
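
A sketch of the single-feature extraction step with librosa is shown below; the audio path is a placeholder for an eNTERFACE'05 clip converted from video, and the frame-averaging is one simple way to obtain a fixed-length classifier input.

```python
# Sketch of the single-feature setup: extract Mel-Frequency Cepstral
# Coefficients (plus a Delta-MFCC variant) with librosa as classifier input.
# The WAV path is a placeholder for an eNTERFACE'05 clip converted to audio.
import librosa
import numpy as np

y, sr = librosa.load("enterface_clip.wav", sr=16000)   # placeholder file
mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)     # shape (13, frames)
delta = librosa.feature.delta(mfcc)                    # Delta-MFCC variant

# collapse frames to a fixed-length vector for a DNN input
features = np.concatenate([mfcc.mean(axis=1), delta.mean(axis=1)])
print(features.shape)  # (26,)
```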

Keywords: emotion recognition, speech, deep learning, human-robot interaction, neural networks

Procedia PDF Downloads 170
1334 Assessment of Image Databases Used for Human Skin Detection Methods

Authors: Saleh Alshehri

Abstract:

Human skin detection is a vital step in many applications, some of which are critical, especially those related to security. This underscores the importance of a high-performance detection algorithm. To validate the accuracy of an algorithm, image databases are usually used. However, the suitability of these image databases is still questionable. It is suggested that suitability can be measured mainly by the span of the color space that the database covers. This research investigates the validity of three well-known image databases.
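
The suggested suitability measure can be sketched as the fraction of a quantized RGB space occupied by a database's skin pixels; the bin size and random stand-in pixels below are assumptions.

```python
# Sketch of the suggested suitability measure: the fraction of a quantized
# RGB space covered by a database's skin pixels. Bin size is an assumption.
import numpy as np

def color_space_coverage(skin_pixels, bins=32):
    # skin_pixels: (N, 3) uint8 RGB values pooled from the database
    hist, _ = np.histogramdd(skin_pixels, bins=(bins,) * 3,
                             range=((0, 256),) * 3)
    return np.count_nonzero(hist) / hist.size

pixels = np.random.randint(0, 256, size=(100_000, 3))   # stand-in for real data
print(f"coverage: {color_space_coverage(pixels):.3f}")
```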

Keywords: image databases, image processing, pattern recognition, neural networks

Procedia PDF Downloads 271
1333 Utilization of the Biodiversity of Herbal Species Used as Food and Treatment along the Economic Tourism Routes of Phu Sing District in Sisaket Province, Thailand

Authors: Nopparet Thammasaranyakun

Abstract:

The objectives of this research are: 1) to study the biodiversity of medicinal plants used for food and treatment along the economic tourism routes of Phu Sing district, Sisaket province; 2) to study the use of these medicinal plants for food and treatment; 3) to provide a database of information on the biodiversity of food and medicinal plants of the district; and 4) to create learning resources on the biodiversity of medicinal plants used as food and treatment along these routes. The study area was Phu Sing district. The population included the Agricultural Development Center (Rayong Mun) under the initiative for local youth, government health officials, community leaders, teachers, students, schools, local people, and tourists, as well as local sages with knowledge of herbs and the OTOP women's groups of Phu Sing district, Sisaket province, selected purposively. The study followed a participatory action research (PAR) process, which is community-based research, with qualitative data collection; the tools used were community context surveys, area surveys, interviews, and tape recordings. Observation and focus group data were analyzed using descriptive statistics. The findings were as follows: 1) The study of the biodiversity of plants used for food and treatment along the tourism routes of Phu Sing district, Sisaket province, in both the dry season and the rainy season, found 251 species of medicinal plants covering 41 types of drugs. 2) The 251 species found have medicinal properties that allow them to be used for food and treatment in 41 types of drugs by the indigenous people of Phu Sing, Sisaket province. 3) A technology-based database of the biodiversity of food and medicinal plants used in local treatment in Phu Sing district, Sisaket province, was established, covering the 251 medicinal species and 41 types of drugs. 4) Learning resources on the biodiversity of medicinal plants used for food and treatment along the economic tourism routes of Phu Sing district, Sisaket province, were created.

Keywords: utilization of biodiversity, herbal species, food use, Phu Sing district, Sisaket

Procedia PDF Downloads 358
1332 Cleaning of Scientific References in Large Patent Databases Using Rule-Based Scoring and Clustering

Authors: Emiel Caron

Abstract:

Patent databases contain patent-related data, organized in a relational data model, and are used to produce various patent statistics. These databases store raw data about scientific references cited by patents. For example, Patstat holds references to tens of millions of scientific journal publications and conference proceedings. These references might be used to connect patent databases with bibliographic databases, e.g., to study the relation between science, technology, and innovation in various domains. Problematic in such studies is the low data quality of the references: they are often ambiguous, unstructured, and incomplete. Moreover, a complete bibliographic reference is stored in only one attribute. Therefore, a computerized cleaning and disambiguation method for large patent databases is developed in this work. The method uses rule-based scoring and clustering. The rules are based on bibliographic metadata, retrieved from the raw data by regular expressions, and are transparent and adaptable. The rules, in combination with string similarity measures, are used to detect pairs of records that are potential duplicates. Due to the scoring, different rules can be combined to join scientific references, i.e., the rules reinforce each other. The scores are based on expert knowledge and initial method evaluation. After the scoring, pairs of scientific references that are above a certain threshold are clustered by means of a single-linkage clustering algorithm to form connected components. The method is designed to disambiguate all the scientific references in the Patstat database. The performance evaluation of the clustering method, on a large golden set with highly cited papers, shows on average a 99% precision and a 95% recall. The method is therefore accurate but careful, i.e., it weighs precision over recall. Consequently, separate clusters of high precision are sometimes formed when there is not enough evidence for connecting scientific references, e.g., in the case of missing year and journal information for a reference. The clusters produced by the method can be used to directly link the Patstat database with bibliographic databases such as the Web of Science or Scopus.
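
A hedged sketch of the scoring-plus-single-linkage idea follows; the rules, weights, and threshold are illustrative stand-ins for the expert-tuned ones described above.

```python
# Hedged sketch of rule-based scoring and single-linkage clustering: score
# reference pairs with simple metadata rules, keep pairs above a threshold,
# and take connected components as clusters. Rules/weights are illustrative.
from difflib import SequenceMatcher
import networkx as nx

def score(a, b):
    s = 0.0
    if a.get("year") and a.get("year") == b.get("year"):
        s += 2.0                                     # matching year
    if a.get("journal") and a.get("journal") == b.get("journal"):
        s += 2.0                                     # matching journal
    s += 4.0 * SequenceMatcher(None, a["title"], b["title"]).ratio()
    return s

refs = [
    {"id": 1, "title": "gene expression in yeast", "year": 1999, "journal": "nature"},
    {"id": 2, "title": "gene expresion in yeast",  "year": 1999, "journal": None},
    {"id": 3, "title": "solar cell efficiency",    "year": 2005, "journal": "joule"},
]

g = nx.Graph()
g.add_nodes_from(r["id"] for r in refs)
for i, a in enumerate(refs):
    for b in refs[i + 1:]:
        if score(a, b) >= 5.0:                       # evidence threshold
            g.add_edge(a["id"], b["id"])
print(list(nx.connected_components(g)))              # [{1, 2}, {3}]
```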

Keywords: clustering, data cleaning, data disambiguation, data mining, patent analysis, scientometrics

Procedia PDF Downloads 194
1331 Association of Clostridium difficile Infection and Bone Cancer

Authors: Daniela Prado, Lexi Frankel, Amalia Ardeljan, Lokesh Manjani, Matthew Cardeiro, Omar Rashid

Abstract:

Background: Clostridium difficile (C. diff) is a gram-positive bacterium known to cause life-threatening diarrhea and severe inflammation of the colon. Infection originates as an alteration of the gut microbiome and can be transmitted through spores. Recent studies have shown a high association between the development of C. diff infection in cancer patients and extensive hospitalization. However, research is lacking regarding C. diff's association with the causation or prevention of cancer. The objective of this study was therefore to assess the correlation between Clostridium difficile infection (CDI) and the incidence of bone cancer. Methods: This retrospective analysis used data provided by a Health Insurance Portability and Accountability Act (HIPAA) compliant national database to compare patients infected with C. diff versus patients not infected, using ICD-10 and ICD-9 codes. Access to the database was granted by Holy Cross Health, Fort Lauderdale, for the purpose of academic research. Standard statistical methods were used. Results: Between January 2010 and December 2019, the query identified 78,863 patients in each of the infected and control groups. The two groups were matched by age range and CCI score. The incidence of bone cancer was 659 patients (0.835%) in the C. diff group compared to 1,941 patients (2.461%) in the control group. The difference was statistically significant, with a P-value < 2.2x10^-16 and an odds ratio (OR) = 0.33 (0.31-0.37) at a 95% confidence interval (CI). Treatment for CDI was analyzed for both the C. diff infected and non-infected populations: 91 out of 16,676 (0.55%) patients with a prior C. diff infection treated with antibiotics were compared to 275 out of 16,676 (1.65%) patients in the control group with no history of CDI who received antibiotic treatment. The results remained statistically significant, with a P-value < 2.2x10^-16 and an OR = 0.42 (0.37-0.48) at a 95% CI. Conclusion: The study shows a statistically significant correlation between C. diff infection and a reduced incidence of bone cancer. Further evaluation is recommended to assess the potential of C. difficile in reducing bone cancer incidence.
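
For illustration, the reported odds ratio and confidence interval can be reproduced from the 2x2 counts with the standard Woolf (log-OR) method, as sketched below.

```python
# Sketch of the odds ratio and 95% CI from a 2x2 table (Woolf method); the
# cell counts are taken from the abstract and reproduce the reported OR.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    # a/b: outcome yes/no in exposed; c/d: outcome yes/no in unexposed
    or_ = (a / b) / (c / d)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# bone cancer yes/no among C. diff cases vs matched controls
a, b = 659, 78_863 - 659
c, d = 1941, 78_863 - 1941
print("OR = %.2f (%.2f-%.2f)" % odds_ratio_ci(a, b, c, d))  # OR = 0.33 (0.31-0.37)
```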

Keywords: bone cancer, colitis, clostridium difficile, microbiome

Procedia PDF Downloads 277
1330 Energy Intensity: A Case of Indian Manufacturing Industries

Authors: Archana Soni, Arvind Mittal, Manmohan Kapshe

Abstract:

Energy has been recognized as one of the key inputs for the economic growth and social development of a country. High economic growth naturally means a high level of energy consumption. However, in the present energy scenario, where there is a wide gap between energy generation and energy consumption, it is extremely difficult to match demand with supply. As India is one of the largest and most rapidly growing developing countries, there is an impending energy crisis that requires immediate measures. In this situation, the concept of Energy Intensity comes under special focus as a way to ensure energy security in an environmentally sustainable manner. Energy Intensity is defined as the energy consumed per unit output in the context of industrial energy practices. It is a key determinant of projections of future energy demand, which assist in policy making. Energy Intensity is inversely related to energy efficiency: the less energy required to produce a unit of output or service, the greater the energy efficiency. The Energy Intensity of Indian manufacturing industries is among the highest in the world and reflects enormous energy consumption. Hence, reducing the Energy Intensity of Indian manufacturing industries is one of the best strategies to achieve a lower level of energy consumption and conserve energy. This study attempts to analyse the factors which influence the Energy Intensity of Indian manufacturing firms and how they can be used to reduce that intensity. The paper considers six of the largest energy-consuming manufacturing industries in India, viz. the aluminium, cement, iron & steel, textile, fertilizer, and paper industries, and conducts a detailed Energy Intensity analysis using data from the PROWESS database of the Centre for Monitoring Indian Economy (CMIE). A total of twelve independent explanatory variables, based on factors such as raw material, labour, machinery, repair and maintenance, production technology, outsourcing, research and development, number of employees, wages paid, profit margin, and capital invested, have been taken into consideration for the analysis.
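
In symbols, with E_{it} the energy consumed and Y_{it} the output of firm i in year t, Energy Intensity, together with one plausible log-linear specification over the twelve explanatory variables x_k (an assumption for illustration; the abstract does not state the estimation form), reads:

    EI_{it} = \frac{E_{it}}{Y_{it}}, \qquad
    \ln EI_{it} = \alpha + \sum_{k=1}^{12} \beta_k \, x_{k,it} + \varepsilon_{it}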

Keywords: energy intensity, explanatory variables, manufacturing industries, PROWESS database

Procedia PDF Downloads 329
1329 ROSgeoregistration: Aerial Multi-Spectral Image Simulator for the Robot Operating System

Authors: Andrew R. Willis, Kevin Brink, Kathleen Dipple

Abstract:

This article describes a software package called ROSgeoregistration, intended for use with the Robot Operating System (ROS) and the Gazebo 3D simulation environment. ROSgeoregistration provides tools for the simulation, testing, and deployment of aerial georegistration algorithms and is available at github.com/uncc-visionlab/rosgeoregistration. A model creation package is provided which downloads multi-spectral images from the Google Earth Engine database and, if necessary, incorporates these images into a single, possibly very large, reference image. Additionally, a Gazebo plugin is provided which uses the real-time sensor pose and an image formation model to generate simulated imagery from the specified reference image, along with related plugins for UAV-relevant data. The novelty of this work is threefold: (1) this is the first system to link the massive multi-spectral imaging database of Google’s Earth Engine to the Gazebo simulator, (2) this is the first example of a system that can simulate geospatially and radiometrically accurate imagery from multiple sensor views of the same terrain region, and (3) integration with other UAS tools creates a new holistic UAS simulation environment to support UAS system and subsystem development where real-world testing would generally be prohibitive. Sensed imagery and ground truth registration information are published to client applications, which can receive imagery synchronously with telemetry from other payload sensors, e.g., IMU, GPS/GNSS, barometer, and windspeed sensor data. To highlight functionality, we demonstrate ROSgeoregistration for simulating Electro-Optical (EO) and Synthetic Aperture Radar (SAR) image sensors and give an example use case for developing and evaluating image-based UAS position feedback, i.e., pose estimation for image-based Guidance, Navigation and Control (GNC) applications.
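
A client application can consume the published imagery and telemetry with a plain ROS subscriber. A minimal rospy sketch follows; the topic names are assumptions for illustration, not the package's documented interface (see github.com/uncc-visionlab/rosgeoregistration for the actual topics).

    #!/usr/bin/env python
    import rospy
    from sensor_msgs.msg import Image, NavSatFix

    def on_image(msg):
        # Simulated EO/SAR frame rendered from the reference image
        rospy.loginfo('frame %dx%d at t=%s', msg.width, msg.height,
                      msg.header.stamp)

    def on_fix(msg):
        # Telemetry arriving synchronously with the imagery
        rospy.loginfo('GNSS fix: %.6f, %.6f', msg.latitude, msg.longitude)

    if __name__ == '__main__':
        rospy.init_node('georegistration_client')
        rospy.Subscriber('/uav/camera/image_raw', Image, on_image)  # assumed topic
        rospy.Subscriber('/uav/gps/fix', NavSatFix, on_fix)         # assumed topic
        rospy.spin()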

Keywords: EO-to-EO, EO-to-SAR, flight simulation, georegistration, image generation, robot operating system, vision-based navigation

Procedia PDF Downloads 104
1328 Prioritizing Roads Safety Based on the Quasi-Induced Exposure Method and Utilization of the Analytical Hierarchy Process

Authors: Hamed Nafar, Sajad Rezaei, Hamid Behbahani

Abstract:

Safety analysis of roads through accident rates, one of the most widely used tools, stems from the direct exposure method, which is based on vehicle-kilometers traveled and vehicle travel time. However, due to some fundamental flaws in its theory, difficulties in gaining access to the required data, such as traffic volume and the distance and duration of trips, and various problems in determining exposure for specific time, place, and individual categories, an algorithm for prioritizing road safety is needed so that a new exposure method can resolve the problems of the previous approaches. Such a method would allow more realistic comparisons and be applicable to a wider range of time, place, and individual categories. Therefore, an algorithm was introduced to prioritize road safety using the quasi-induced exposure method and the analytical hierarchy process. For this research, 11 provinces of Iran were chosen as case study locations. A rural accident database was created for these provinces, the validity of the quasi-induced exposure method for Iran’s accident database was explored, and the involvement ratio for different driver and vehicle characteristics was measured. Results showed that the quasi-induced exposure method was valid in determining the real exposure in the provinces under study. Results also showed a significant difference between prioritization based on the new approach and that based on the traditional one. This difference mostly stems from the way the quasi-induced exposure method determines exposure, the opinions of experts, and the quantity of accident data. Overall, the results show that prioritization based on the new approach is more comprehensive and reliable than prioritization based on the traditional approach, which depends on various parameters including driver-vehicle characteristics.
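
The core quantity of the quasi-induced exposure method, the involvement ratio, can be sketched compactly: not-at-fault drivers in two-vehicle crashes proxy the exposure of the driving population, so a ratio above 1 indicates over-involvement of a group. A minimal Python sketch with hypothetical field names:

    from collections import Counter

    def involvement_ratios(crashes, attribute):
        # crashes: records with an 'at_fault' flag and the attribute of interest;
        # ratio = (share of at-fault drivers in group) /
        #         (share of not-at-fault drivers in group)
        fault = Counter(c[attribute] for c in crashes if c['at_fault'])
        clean = Counter(c[attribute] for c in crashes if not c['at_fault'])
        n_f, n_c = sum(fault.values()), sum(clean.values())
        return {g: (fault[g] / n_f) / (clean[g] / n_c)
                for g in fault if clean[g] > 0}

    crashes = [{'at_fault': True,  'age_group': '18-24'},
               {'at_fault': False, 'age_group': '25-64'},
               {'at_fault': True,  'age_group': '18-24'},
               {'at_fault': False, 'age_group': '18-24'}]
    print(involvement_ratios(crashes, 'age_group'))  # {'18-24': 2.0}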

Keywords: road safety, prioritizing, quasi-induced exposure, analytical hierarchy process

Procedia PDF Downloads 338
1327 Epidemiology of Private Prehospital Calls over the Last Decade in South Africa

Authors: Rhodine Hickman, Craig Wylie, Michael G. McCaul

Abstract:

Introduction: The World Health Organisation has called on governments around the world to recognise emergency conditions as a global public health problem and to respond with appropriate steps toward effective preventive strategies. However, to understand the magnitude of the problem, good-quality epidemiological data are required. This is especially challenging in low- and middle-income countries, where routine data are scarce, particularly in the prehospital setting. Methods: We conducted a retrospective cross-sectional study of a national private-sector prehospital EMS database. The database, the property of ER24 (a private Emergency Medical Services (EMS) company in South Africa), contains claims submitted by the majority of ambulance services in South Africa between 1 January 2008 and 28 March 2017. We used descriptive statistics and control charts to describe the data, using STATA 14. Results: 299,257 calls were included in the analysis. The top clinical conditions requiring ambulance transport were transport accidents (10% of total call volume) and ischaemic heart disease (4.4%). The number of transport accidents increased consistently between 2009 and 2014 and exceeded the limit of normal variation in 2015. Victims of transport accidents required basic life support services 60% of the time, with 80% of injuries being minor to moderate. The frequency of ischaemic heart disease showed a steady increase from 2011 to 2016. Advanced life support services were required about 50% of the time, with 60% of patients needing urgent care. Conclusion: Transport accidents, followed by ischaemic heart disease, are the most prevalent conditions in South African private EMS. There is potential to address these conditions by developing the capacity of low- and mid-level providers in trauma and of advanced EMS providers in ischaemic heart disease.
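
The control-chart logic used to flag 2015 can be sketched as follows, assuming a c-chart for annual event counts (the abstract does not name the chart type) and using made-up yearly counts, since the raw series is not reported:

    import math

    def c_chart(counts):
        # Center line and 3-sigma upper control limit for event counts,
        # assuming the counts are approximately Poisson (a c-chart)
        center = sum(counts) / len(counts)
        ucl = center + 3 * math.sqrt(center)
        return center, ucl

    # Hypothetical annual transport-accident counts, for illustration only
    yearly = {2009: 2400, 2010: 2550, 2011: 2700, 2012: 2900,
              2013: 3000, 2014: 3100, 2015: 4200}
    center, ucl = c_chart(list(yearly.values()))
    flagged = [y for y, n in yearly.items() if n > ucl]
    print(round(center), round(ucl), flagged)  # only 2015 exceeds the limit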

Keywords: emergency care, emergency medicine, prehospital providers, South Africa

Procedia PDF Downloads 175