Search results for: clinical error
5051 Machine Learning for Feature Selection and Classification of Systemic Lupus Erythematosus
Authors: H. Zidoum, A. AlShareedah, S. Al Sawafi, A. Al-Ansari, B. Al Lawati
Abstract:
Systemic lupus erythematosus (SLE) is an autoimmune disease with genetic and environmental components. SLE is characterized by a wide variability of clinical manifestations and a course frequently subject to unpredictable flares. Despite recent progress in classification tools, the early diagnosis of SLE is still an unmet need for many patients. This study proposes an interpretable disease classification model that combines the efficient, high predictive performance of CatBoost with the model-agnostic interpretation tools of SHapley Additive exPlanations (SHAP). The CatBoost model was trained on a local cohort of 219 Omani patients with SLE as well as other control diseases. Furthermore, the SHAP library was used to generate individual explanations of the model's decisions and to rank clinical features by contribution. Overall, we achieved an AUC score of 0.945 and an F1-score of 0.92, and identified four clinical features (alopecia, renal disorders, cutaneous lupus, and hemolytic anemia), along with the patient's age, that were shown to have the greatest contribution to the prediction.
Keywords: feature selection, classification, systemic lupus erythematosus, model interpretation, SHAP, CatBoost
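The additive attribution idea behind SHAP can be illustrated with an exact Shapley-value computation over a tiny feature set. This is a conceptual sketch only: the `risk` model and its weights below are hypothetical illustrations built around the four features the abstract highlights, not the paper's trained CatBoost model, for which the SHAP library computes these attributions efficiently.

```python
from itertools import combinations
from math import factorial

def shapley_values(features, predict):
    """Exact Shapley attributions for a prediction function.

    `predict` maps a set of present feature names to a model output;
    absent features are implicitly 'at baseline' inside it.
    """
    n = len(features)
    phi = {}
    for f in features:
        others = [x for x in features if x != f]
        total = 0.0
        for k in range(n):  # all subset sizes of the other features
            for s in combinations(others, k):
                s = frozenset(s)
                # classic Shapley coalition weight |S|!(n-|S|-1)!/n!
                weight = factorial(len(s)) * factorial(n - len(s) - 1) / factorial(n)
                total += weight * (predict(s | {f}) - predict(s))
        phi[f] = total
    return phi

# Hypothetical additive risk model over the four features the paper
# highlights; the weights are illustrative only.
WEIGHTS = {"alopecia": 0.8, "renal": 1.5, "cutaneous": 1.1, "hemolytic_anemia": 1.3}

def risk(present):
    return sum(w for f, w in WEIGHTS.items() if f in present)

phi = shapley_values(list(WEIGHTS), risk)
```

For a purely additive model like this, each feature's Shapley value recovers its weight, and the attributions sum to the model output, which is the additivity property SHAP ranking relies on.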
Procedia PDF Downloads 83
5050 A Greedy Alignment Algorithm Supporting Medication Reconciliation
Authors: David Tresner-Kirsch
Abstract:
Reconciling patient medication lists from multiple sources is a critical task supporting the safe delivery of patient care. Manual reconciliation is a time-consuming and error-prone process, and recently attempts have been made to develop efficiency- and safety-oriented automated support for professionals performing the task. An important capability of any such support system is automated alignment – finding which medications from a list correspond to which medications from a different source, regardless of misspellings, naming differences (e.g. brand name vs. generic), or changes in treatment (e.g. switching a patient from one antidepressant class to another). This work describes a new algorithmic solution to this alignment task, using a greedy matching approach based on string similarity, edit distances, concept extraction and normalization, and synonym search derived from the RxNorm nomenclature. The accuracy of this algorithm was evaluated against a gold-standard corpus of 681 medication records; this evaluation found that the algorithm predicted alignments with 99% precision and 91% recall. This performance is sufficient to support decision support applications for medication reconciliation.
Keywords: clinical decision support, medication reconciliation, natural language processing, RxNorm
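A minimal sketch of the greedy matching step, using `difflib` string similarity as the only scoring signal. The paper's system additionally uses edit distances, concept extraction and normalization, and RxNorm synonym search, all omitted here; the medication strings are invented examples.

```python
from difflib import SequenceMatcher

def greedy_align(list_a, list_b, threshold=0.6):
    """Greedily pair medications from two lists by string similarity.

    Scores every cross-list pair, then repeatedly commits the highest-
    scoring unused pair above `threshold`. A real system would add
    RxNorm synonym lookup and concept normalization on top of this.
    """
    scored = [
        (SequenceMatcher(None, a.lower(), b.lower()).ratio(), i, j)
        for i, a in enumerate(list_a)
        for j, b in enumerate(list_b)
    ]
    scored.sort(reverse=True)  # best-scoring pairs first
    used_a, used_b, pairs = set(), set(), []
    for score, i, j in scored:
        if score < threshold:
            break  # remaining pairs are all below threshold
        if i in used_a or j in used_b:
            continue  # one of the medications is already aligned
        used_a.add(i)
        used_b.add(j)
        pairs.append((list_a[i], list_b[j], round(score, 2)))
    return pairs

pairs = greedy_align(
    ["metformin 500mg", "atorvastatin", "lisinopril 10mg"],
    ["Lisinopril 10 mg", "metforminn 500mg"],  # misspelling on purpose
)
```

The misspelled and re-cased entries still align, while the unmatched atorvastatin is left out, which is the behavior a reconciliation aid needs.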
Procedia PDF Downloads 285
5049 Effect of Non-Surgical Periodontal Therapy According to Periodontal Severity
Authors: Jungbin Lim, Bohee Kang, Heelim Lee, Sunjin Kim, GeumHee Choi, Jae-Suk Jung, Suk Ji
Abstract:
Nonsurgical periodontal therapies have, for several decades, been the basis of periodontal treatment concepts. The aim of this paper is to investigate the effectiveness of non-surgical periodontal therapy according to the severity of periodontitis. Methods: Retrospective data of patients who visited the Department of Periodontics at Ajou University Medical Center from 2016 to 2022 were collected. Among these patients, those who underwent full-mouth examination of clinical parameters and received non-surgical periodontal therapy were chosen for this study. Selected patients were divided into initial, moderate, and severe periodontitis based on severity and complexity of management (2018 World Workshop EFP/AAP consensus). Recall visits with clinical periodontal examination were scheduled for 1, 2, and 3 months or 1, 3, and 6 months after the treatment. The results were evaluated by recordings of mean probing pocket depth (mean PD), mean clinical attachment level (mean CAL), bleeding on probing (BOP%), mean gingival index (mean GI), mean recession, mean sulcus bleeding index (mean SBI), and mean plaque score (mean PI). All statistical analyses were performed with R software, version 4.3.0. P < 0.05 was considered statistically significant. Results: A total of 92 patients were included in this study. 15 patients were diagnosed with initial periodontitis, 14 with moderate periodontitis, and 63 with severe periodontitis. All parameters except mean recession decreased over time in all groups. The reduction in mean PD was greatest in the severe periodontitis group, followed by the moderate and initial groups, and this difference was statistically significant: the mean PD reductions were 0.15±0.05 mm, 0.37±0.06 mm, and 1.01±0.07 mm (initial, moderate, and severe, respectively, P<0.001). When comparing before and after treatment, the reductions in BOP (%), mean GI, mean SBI, and mean PI were statistically significant.
Conclusion: All patients who received non-surgical periodontal therapy showed periodontal healing in terms of improved clinical parameters, and the improvement was greatest in the severe group.
Keywords: periodontology, clinical periodontology, oral treatment, comprehensive preventive dentistry, non-surgical periodontal therapy
Procedia PDF Downloads 78
5048 The Use of Respiratory Index of Severity in Children (RISC) for Predicting Clinical Outcomes for 3 Months-59 Months Old Patients Hospitalized with Community-Acquired Pneumonia in Visayas Community Medical Center, Cebu City from January 2013 - June 2
Authors: Karl Owen L. Suan, Juliet Marie S. Lambayan, Floramay P. Salo-Curato
Abstract:
Objective: To predict the outcome among patients (ages 3 months to 59 months old) admitted with community-acquired pneumonia in Visayas Community Medical Center using the Respiratory Index of Severity in Children (RISC). Design: A cross-sectional study design was used. Setting: The study was done in Visayas Community Medical Center, a private tertiary-level hospital in Cebu City, from January to June 2013. Patients/Participants: A total of 72 patients were initially enrolled in the study. However, 1 patient transferred to another institution; thus, 71 patients were included in this study. Within 24 hours of admission, patients were assigned a RISC score. Statistical Analysis: Cohen's kappa coefficient was used for inter-rater agreement for categorical data. This study used frequency and percentage distribution for qualitative data. Mean, standard deviation, and range were used for quantitative data. To determine the relationship of each RISC score parameter and the total RISC score with the outcome, the Mann-Whitney U test and the 2x2 Fisher exact test were used for testing associations. A p-value of less than 0.05 was considered significant. Results: There was a statistically significant association between the RISC score and clinical outcome. A RISC score of greater than 4 was correlated with intubation and/or mortality. Conclusion: The RISC scoring system is a simple combination of clinical parameters and a reliable tool that helps stratify patients aged 3 months to 59 months in predicting clinical outcome.
Keywords: RISC, clinical outcome, community-acquired pneumonia, patients
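The 2x2 Fisher exact test used for testing associations can be sketched in a few lines via the hypergeometric distribution. The table counts below are hypothetical, not the study's data, and only a one-sided ("greater") p-value is computed.

```python
from math import comb

def fisher_exact_greater(a, b, c, d):
    """One-sided (greater) Fisher exact p-value for the 2x2 table
    [[a, b], [c, d]]: the probability of observing `a` or more
    exposed cases under the null, summed from the hypergeometric
    distribution."""
    r1 = a + b          # row 1 total (e.g. RISC > 4)
    c1 = a + c          # column 1 total (e.g. poor outcome)
    n = a + b + c + d   # grand total
    p = 0.0
    for k in range(a, min(r1, c1) + 1):
        p += comb(r1, k) * comb(n - r1, c1 - k) / comb(n, c1)
    return p

# Hypothetical counts: rows = RISC > 4 / RISC <= 4,
# columns = poor outcome (intubation or death) / good outcome.
p = fisher_exact_greater(8, 4, 3, 56)
```

A strongly imbalanced table like this one yields a very small p-value, matching the kind of association the abstract reports between a RISC score above 4 and intubation or mortality.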
Procedia PDF Downloads 302
5047 The Ecosystem of Food Allergy Clinical Trials: A Systematic Review
Authors: Eimar Yadir Quintero Tapias
Abstract:
Background: Science is not generally self-correcting; many clinical studies end with the same conclusion: "more research is needed." This study hypothesizes that we first need a better appraisal of the available (and unavailable) evidence instead of creating more of the same false inquiries. Methods: Systematic review of ClinicalTrials.gov study records using the following Boolean operators: (food OR nut OR milk OR egg OR shellfish OR wheat OR peanuts) AND (allergy OR allergies OR hypersensitivity OR hypersensitivities). Variables included the status of the study (e.g., active or completed), availability of results, sponsor type, and sample size, among others. To determine the rates of non-publication in journals indexed by PubMed, an advanced search query using the specific NCT identifiers of the clinical trials (e.g., NCT000001 OR NCT000002 OR...) was performed. As a prophylactic measure against P-hacking, data analyses included only descriptive statistics, not inferential approaches. Results: A total of 2092 study records matched the search query described above (date: September 13, 2019). Most studies were interventional (n = 1770; 84.6%) and the remainder observational (n = 322; 15.4%). Universities, hospitals, and research centers sponsored over half of these investigations (n = 1208; 57.7%), 308 studies (14.7%) were industry-funded, and 147 received NIH grants; the remaining studies had mixed sponsorship. Regarding completed studies (n = 1156; 55.2%), 248 (21.5%) have results available at the registry site, and 417 (36.1%) matched NCT numbers of journal papers indexed by PubMed. Conclusions: The internal and external validity of human research is critical for the appraisal of medical evidence. It is imperative to analyze the entire dataset of clinical studies, preferably as patient-level anonymized raw data, before rushing to conclusions with insufficient and inadequate information.
Publication bias and non-registration of clinical trials limit the evaluation of the evidence concerning therapeutic interventions for food allergy, such as oral and sublingual immunotherapy, as well as any other medical condition. Over half of food allergy human research remains unpublished.
Keywords: allergy, clinical trials, immunology, systematic reviews
Procedia PDF Downloads 137
5046 Interventions and Supervision in Mental Health Services: Experiences of a Working Group in Brazil
Authors: Sonia Alberti
Abstract:
The Regional Conference to Restructure Psychiatric Care in Latin America, convened by the Pan American Health Organization (PAHO) in 1990, oriented the Brazilian Federal Act of 2001 that stipulated the psychiatric reform requiring deinstitutionalization and community-based treatment. Since then, the 15 years' experience of different working teams in mental health has led an academic working group (supervisors drawn from personal practice, professors, and researchers) to discuss certain clinical issues, as well as supervisions, and to organize colloquia in different cities as a methodology. These colloquia count on the participation of different working teams from the cities in which they are held, with team members of different educational degrees and prior experiences, in order to increase dialogue right where it does not always appear to be possible. The principal aim of these colloquia is to foster interlocution between practitioners and academics. Working with the theory of case construction, this methodology revealed itself helpful in unfolding new solutions. The paper also observes that there is not always harmony between what the psychiatric reform demands and clinical ethics.
Keywords: mental health, supervision, clinical cases, Brazilian experience
Procedia PDF Downloads 273
5045 MMSE-Based Beamforming for Chip Interleaved CDMA in Aeronautical Mobile Radio Channel
Authors: Sherif K. El Dyasti, Esam A. Hagras, Adel E. El-Hennawy
Abstract:
This paper addresses the performance of antenna-array beamforming for a Chip-Interleaved Code Division Multiple Access (CI-CDMA) system based on a Minimum Mean Square Error (MMSE) detector in the aeronautical mobile radio channel. Multipath fading, Doppler shifts caused by the speed of the aircraft, and Multiple Access Interference (MAI) are the most important factors that degrade the performance of aeronautical systems. In this paper, we propose CI-CDMA with an antenna array to combat this fading and improve bit error rate (BER) performance. We further evaluate the performance of the proposed system in the four standard scenarios of the aeronautical mobile radio channel.
Keywords: aeronautical channel, CI-CDMA, beamforming, communication, information
Procedia PDF Downloads 416
5044 Machine Learning Models for the Prediction of Heating and Cooling Loads of a Residential Building
Authors: Aaditya U. Jhamb
Abstract:
Due to the current energy crisis that many countries are battling, energy-efficient buildings are the subject of extensive research in the modern technological era because of growing worries about energy consumption and its effects on the environment. The paper explores 8 factors that help determine energy efficiency for a building: relative compactness, surface area, wall area, roof area, overall height, orientation, glazing area, and glazing area distribution, with Tsanas and Xifara providing the dataset. The dataset comprises 768 different residential building models used to anticipate heating and cooling loads with a low mean squared error. By optimizing these characteristics, machine learning algorithms may assess and properly forecast a building's heating and cooling loads, lowering energy usage while increasing the quality of people's lives. The paper therefore studied the magnitude of the correlation between these input factors and the two output variables using various statistical methods of analysis, after determining which input variable was most closely associated with the output loads. The most conclusive model was the Decision Tree Regressor, which had a mean squared error of 0.258, whilst the least definitive model was the Isotonic Regressor, which had a mean squared error of 21.68. This paper also investigated the KNN Regressor and Linear Regression, which had mean squared errors of 3.349 and 18.141, respectively. In conclusion, the model, given the 8 input variables, was able to predict the heating and cooling loads of a residential building accurately and precisely.
Keywords: energy efficient buildings, heating load, cooling load, machine learning models
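The role of mean squared error in ranking the regressors can be illustrated with a depth-1 regression tree (a "stump") fitted by exhaustive threshold search, a minimal stand-in for a Decision Tree Regressor. The surface-area/heating-load numbers are invented toy values, not rows from the Tsanas and Xifara dataset.

```python
def mse(y_true, y_pred):
    """Mean squared error between observed and predicted values."""
    return sum((a - b) ** 2 for a, b in zip(y_true, y_pred)) / len(y_true)

def fit_stump(x, y):
    """Depth-1 regression tree: pick the threshold on a single feature
    that minimizes MSE, predicting the mean of each side."""
    best = None
    for t in sorted(set(x)):
        left = [yi for xi, yi in zip(x, y) if xi <= t]
        right = [yi for xi, yi in zip(x, y) if xi > t]
        if not left or not right:
            continue  # a split must leave samples on both sides
        ml, mr = sum(left) / len(left), sum(right) / len(right)
        err = mse(y, [ml if xi <= t else mr for xi in x])
        if best is None or err < best[0]:
            best = (err, t, ml, mr)
    return best

# Toy data: surface area (illustrative units) vs heating load.
x = [510, 520, 530, 660, 670, 680]
y = [10.1, 10.4, 10.2, 24.8, 25.1, 25.0]
err, t, ml, mr = fit_stump(x, y)
baseline = mse(y, [sum(y) / len(y)] * len(y))
```

Even this one-split tree drives the MSE far below the constant-mean baseline on separable data, which is the same criterion the paper uses to compare its models.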
Procedia PDF Downloads 95
5043 Proportional and Integral Controller-Based Direct Current Servo Motor Speed Characterization
Authors: Adel Salem Bahakeem, Ahmad Jamal, Mir Md. Maruf Morshed, Elwaleed Awad Khidir
Abstract:
Direct Current (DC) servo motors, or simply DC motors, play an important role in many industrial applications, such as manufacturing of plastics, precise positioning of equipment, and operating computer-controlled systems, where speed of feed control, maintaining the position, and ensuring a consistently desired output are critical. These parameters can be controlled with the help of control systems such as the Proportional Integral Derivative (PID) controller. The aim of the current work is to investigate the effects of Proportional (P) and Integral (I) controllers on the steady-state and transient response of the DC motor. The controller gains are varied to observe their effects on the error, damping, and stability of the steady and transient motor response. The current investigation is conducted experimentally on a servo trainer CE 110 using an analog PI controller CE 120, and theoretically using Simulink in MATLAB. Both experimental and theoretical work involves varying the integral controller gain to obtain the response to a steady-state input, varying, individually, the proportional and integral controller gains to obtain the response to a step input function at a certain frequency, and theoretically obtaining the proportional and integral controller gains for desired values of damping ratio and response frequency. Results reveal that a proportional controller helps reduce the steady-state and transient error between the input signal and output response and makes the system more stable. In addition, it speeds up the response of the system. On the other hand, the integral controller eliminates the error but tends to make the system unstable, with induced oscillations and a slow response while eliminating the error.
From the current work, it is desired to achieve a stable response of the servo motor in terms of its angular velocity subjected to steady-state and transient input signals by utilizing the strengths of both P and I controllers.
Keywords: DC servo motor, proportional controller, integral controller, controller gain optimization, Simulink
Procedia PDF Downloads 110
5042 Implementation of a Web-Based Clinical Outcomes Monitoring and Reporting Platform across the Fortis Network
Authors: Narottam Puri, Bishnu Panigrahi, Narayan Pendse
Abstract:
Background: Clinical outcomes are the globally agreed upon, evidence-based, measurable changes in health or quality of life resulting from patient care. Reporting of outcomes and their continuous monitoring provides an opportunity for both assessing and improving the quality of patient care. In 2012, the International Consortium for Health Outcomes Measurement (ICHOM) was founded, which has defined global Standard Sets for measuring the outcomes of various treatments. Method: Monitoring of clinical outcomes was identified as a pillar of Fortis' core value of Patient Centricity. The project was started as an in-house developed Clinical Outcomes Reporting Portal by the Fortis Medical IT team. Standard Sets of outcome measurement developed by ICHOM were used. A pilot was run at Fortis Escorts Heart Institute from Aug '13 to Dec '13. Starting Jan '14, it was implemented across 11 hospitals of the group. The scope was hospital-wide, and major clinical specialties were covered: Cardiac Sciences and Orthopedics & Joint Replacement. The internally developed portal had its limitations in report generation, and capturing of patient-reported outcomes was also restricted. A year later, the company provisioned an ICHOM-certified software product which could provide a platform for data capturing and reporting to ensure compliance with all ICHOM requirements. A year after the launch of the software, Fortis Healthcare became the first healthcare provider in Asia to publish clinical outcomes data for the Coronary Artery Disease Standard Set (comprising Coronary Artery Bypass Graft and Percutaneous Coronary Intervention) in the public domain (Jan 2016). Results: This project has helped in firmly establishing a culture of monitoring and reporting clinical outcomes across Fortis hospitals.
Given the diverse nature of the healthcare delivery model at the Fortis Network, which comprises hospitals of varying size and specialty mix and practically covers the entire span of the country, standardization of data collection and reporting methodology is a huge achievement in itself. 95% case reporting was achieved, with more than 90% data completion at the end of Phase 1 (March 2016). Post implementation, the group now has one year of data from its own hospitals. This has helped identify the gaps, plan ways to bridge them, and establish internal benchmarks for continual improvement. The value created for the group also includes: 1. The entire Fortis community has been sensitized to the importance of clinical outcomes monitoring for patient-centric care. Initial skepticism and cynicism were countered by effective stakeholder engagement and automation of processes. 2. Measuring quality is the first step in improving quality. Data analysis has helped compare clinical results with best-in-class hospitals and identify improvement opportunities. 3. The clinical fraternity is extremely pleased to be part of this initiative and has taken ownership of the project. Conclusion: Fortis Healthcare is a pioneer in the monitoring of clinical outcomes. Implementation of ICHOM standards has helped the Fortis Clinical Excellence Program in improving patient engagement and strengthening its commitment to its core value of Patient Centricity. Validation and certification of the clinical outcomes data by an ICHOM-certified supplier adds confidence to its claim of being a leader in this space.
Keywords: clinical outcomes, healthcare delivery, patient centricity, ICHOM
Procedia PDF Downloads 236
5041 Variable Tree Structure QR Decomposition-M Algorithm (QRD-M) in Multiple Input Multiple Output-Orthogonal Frequency Division Multiplexing (MIMO-OFDM) Systems
Authors: Jae-Hyun Ro, Jong-Kwang Kim, Chang-Hee Kang, Hyoung-Kyu Song
Abstract:
In multiple input multiple output-orthogonal frequency division multiplexing (MIMO-OFDM) systems, the QR decomposition-M algorithm (QRD-M) has suboptimal error performance. However, the QRD-M still has high complexity due to the many calculations at each layer of the tree structure. To reduce the complexity of the QRD-M, the proposed QRD-M modifies the existing tree structure by eliminating unnecessary candidates at almost all layers. The elimination method discards the candidates whose accumulated squared Euclidean distances are larger than a calculated threshold. The simulation results show that the proposed QRD-M achieves the same bit error rate (BER) performance as the conventional QRD-M with lower complexity.
Keywords: complexity, MIMO-OFDM, QRD-M, squared Euclidean distance
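A toy sketch of the layer-by-layer M-algorithm with the proposed threshold pruning, using BPSK-like real symbols and a per-layer squared Euclidean distance. A real MIMO-OFDM detector would compute these metrics from the QR-decomposed channel matrix; the received values and the threshold below are illustrative assumptions.

```python
ALPHABET = (-1.0, 1.0)  # BPSK-like candidate symbols

def qrdm_detect(r, M=4, threshold=2.0):
    """Layer-by-layer M-algorithm with threshold pruning.

    At each layer every surviving path is extended by each candidate
    symbol, the squared Euclidean distance is accumulated, paths whose
    metric exceeds `threshold` are discarded (the proposed pruning),
    and at most the M best survivors are kept (the classic QRD-M step).
    """
    paths = [((), 0.0)]
    for rl in r:
        extended = [
            (path + (s,), metric + (rl - s) ** 2)
            for path, metric in paths
            for s in ALPHABET
        ]
        survivors = [p for p in extended if p[1] <= threshold]
        survivors.sort(key=lambda p: p[1])
        # fallback: keep the single best path if pruning removed all
        paths = survivors[:M] or sorted(extended, key=lambda p: p[1])[:1]
    return min(paths, key=lambda p: p[1])

best_path, best_metric = qrdm_detect([0.9, -1.1, 0.8])
```

With a noisy observation of the symbol vector (+1, -1, +1), the pruning discards every high-metric branch early, so only one path per layer survives while the detected sequence is unchanged, which mirrors the paper's complexity-for-free claim on this toy case.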
Procedia PDF Downloads 332
5040 Mathematical Modeling for Diabetes Prediction: A Neuro-Fuzzy Approach
Authors: Vijay Kr. Yadav, Nilam Rathi
Abstract:
Accurate prediction of glucose levels is required in diabetes mellitus to avoid affecting the functioning of major organs of the human body. This study describes the fundamental assumptions and two different methodologies for blood glucose prediction. The first is based on the back-propagation algorithm of an Artificial Neural Network (ANN), and the second is based on a neuro-fuzzy technique, the Fuzzy Inference System (FIS). Errors of the proposed methods are further discussed through various statistical measures such as mean squared error (MSE) and normalised mean absolute error (NMAE). The main objective of the present study is to develop a mathematical model for blood glucose prediction 12 hours in advance, using a dataset of three patients over 60 days. Comparative studies of the accuracy against other existing models are also made with the same dataset.
Keywords: back-propagation, diabetes mellitus, fuzzy inference system, neuro-fuzzy
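The two error measures can be sketched directly. Note that the NMAE normalization convention is an assumption (the abstract does not specify one), and the glucose readings below are illustrative, not the patients' data.

```python
def mse(y_true, y_pred):
    """Mean squared error."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def nmae(y_true, y_pred):
    """Normalised mean absolute error; here the MAE is normalised by
    the mean observed value (conventions vary, this is one common
    choice and an assumption about the paper's definition)."""
    mae = sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)
    return mae / (sum(y_true) / len(y_true))

# Illustrative glucose readings (mg/dL) vs. 12-hour-ahead predictions.
observed = [110.0, 150.0, 95.0, 130.0]
predicted = [105.0, 160.0, 100.0, 125.0]
```

MSE penalizes large misses quadratically, while NMAE reports a scale-free average error, which is why papers often quote both.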
Procedia PDF Downloads 257
5039 A Comparative Study on a Tilt-Integral-Derivative Controller with Proportional-Integral-Derivative Controller for a Pacemaker
Authors: Aysan Esgandanian, Sabalan Daneshvar
Abstract:
This study compares the proportional-integral-derivative (PID) controller and the tilt-integral-derivative (TID) controller for cardiac pacemaker systems, which can automatically control the heart rate to accurately track a desired preset profile. The controller offers good adaptation of the heart to the physiological needs of the patient. The parameters of both controllers are tuned by the particle swarm optimization (PSO) algorithm, which uses the integral of time square error (ITSE) as the fitness function to be minimized. Simulations are performed on a developed model of the human cardiovascular system, and the results demonstrate that the TID controller produces superior control performance to the PID controller. All simulations were performed in MATLAB.
Keywords: integral of time square error, pacemaker systems, proportional-integral-derivative controller, PSO algorithm, tilt-integral-derivative controller
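A sketch of the ITSE fitness function such a PSO tuner would minimize, evaluated here on an illustrative first-order plant rather than the paper's cardiovascular model; the gains, setpoint (a target heart rate of 70), and plant time constant are all invented for the example.

```python
def itse_cost(kp, ki, kd, setpoint=70.0, dt=0.01, t_end=5.0):
    """Integral of time square error, ITSE = integral of t * e(t)^2 dt,
    for a discrete PID loop around an illustrative first-order plant
    (0.5 * dy/dt + y = u), not the paper's cardiovascular model."""
    y, integral, prev_err, cost = 0.0, 0.0, setpoint, 0.0
    for k in range(int(t_end / dt)):
        t = k * dt
        err = setpoint - y
        integral += err * dt
        deriv = (err - prev_err) / dt
        prev_err = err
        u = kp * err + ki * integral + kd * deriv  # PID control law
        y += dt * (u - y) / 0.5                    # plant update
        cost += t * err ** 2 * dt                  # ITSE accumulation
    return cost

good = itse_cost(2.0, 1.0, 0.1)   # reasonably tuned gains
bad = itse_cost(0.1, 0.0, 0.0)    # weak proportional-only control
```

A PSO tuner would simply treat `itse_cost` as the fitness of each particle's (kp, ki, kd) position; the time weighting penalizes errors that persist late in the response, which favors fast, non-oscillatory settling.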
Procedia PDF Downloads 462
5038 High-Resolution Spatiotemporal Retrievals of Aerosol Optical Depth from Geostationary Satellite Using SARA Algorithm
Authors: Muhammad Bilal, Zhongfeng Qiu
Abstract:
Aerosols, suspended particles in the atmosphere, play an important role in the earth's energy budget, climate change, degradation of atmospheric visibility, urban air quality, and human health. To fully understand aerosol effects, retrieval of aerosol optical properties such as aerosol optical depth (AOD) at high spatiotemporal resolution is required. Therefore, in the present study, hourly AOD observations at 500 m resolution were retrieved from the geostationary ocean color imager (GOCI) using the simplified aerosol retrieval algorithm (SARA) over the urban area of Beijing for the year 2016. The SARA requires top-of-the-atmosphere (TOA) reflectance, solar and sensor geometry information, and surface reflectance observations to retrieve an accurate AOD. For validation of the GOCI-retrieved AOD, AOD measurements were obtained from the aerosol robotic network (AERONET) version 3 level 2.0 (cloud-screened and quality-assured) data. The errors and uncertainties were reported using the root mean square error (RMSE), relative percent mean error (RPME), and the expected error (EE = ±(0.05 + 0.15·AOD)). Results showed that the high spatiotemporal GOCI AOD observations were well correlated with the AERONET AOD measurements, with a correlation coefficient (R) of 0.92, RMSE of 0.07, and RPME of 5%, and 90% of the observations were within the EE. The results suggest that the SARA is robust and has the ability to retrieve high-resolution spatiotemporal AOD observations over urban areas using a geostationary satellite.
Keywords: AERONET, AOD, SARA, GOCI, Beijing
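Checking retrievals against the expected error envelope EE = ±(0.05 + 0.15·AOD) reduces to a small calculation; the AOD pairs below are invented for illustration, not values from the study.

```python
def within_expected_error(aeronet, retrieved):
    """Fraction of retrievals falling inside the expected error
    envelope EE = +/-(0.05 + 0.15 * AOD), with AOD taken from the
    AERONET reference value, as in the abstract."""
    hits = sum(
        abs(r - a) <= 0.05 + 0.15 * a
        for a, r in zip(aeronet, retrieved)
    )
    return hits / len(aeronet)

# Illustrative AOD pairs (AERONET reference, GOCI-style retrieval).
reference = [0.10, 0.40, 0.80, 1.20]
retrieval = [0.12, 0.35, 1.05, 1.30]
frac = within_expected_error(reference, retrieval)
```

The envelope widens with the reference AOD, so the same absolute error of 0.10 passes at AOD 1.20 but a 0.25 miss fails at AOD 0.80; the abstract's "90% within the EE" is exactly this fraction on the full match-up set.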
Procedia PDF Downloads 171
5037 Major Factors That Enhance Economic Growth in South Africa: A Re-Examination Using a Vector Error Correction Mechanism
Authors: Temitope L. A. Leshoro
Abstract:
This study explored several variables that enhance economic growth in South Africa, based on different growth theories, while using the vector error correction model (VECM) technique. The impacts and contributions of each of these variables on GDP in South Africa were investigated. The motivation for this study was the weak economic growth that the country has been experiencing lately, as well as the continuous increase in the unemployment rate and the deteriorating health care system. Annual data spanning the period 1974 to 2013 was employed. The results showed that the major determinants of GDP are trade openness, government spending, and a health indicator, as these variables are not only economically significant but also statistically significant in explaining the changes in GDP in South Africa. Policy recommendations for enhancing economic growth are suggested based on the findings of this study.
Keywords: economic growth, GDP, investment, health indicator, VECM
Procedia PDF Downloads 276
5036 Clinical Characteristics of Children Presenting with History of Child Sexual Abuse to a Tertiary Care Centre in India
Authors: T. S. Sowmya Bhaskaran, Shekhar Seshadri
Abstract:
This study aims to examine the clinical features of children with a history of Child Sexual Abuse (CSA). A chart review of 40 children (<16 years) with a history of CSA evaluated at the Department of Child and Adolescent Psychiatry of NIMHANS during a two-year period was performed. Results: The most common form of abuse was contact penetrative abuse (65%), followed by non-contact penetrative abuse (32.5%). 75% (N=30) had a psychiatric diagnosis at baseline, and 50% of these children had one or more psychiatric comorbidities. Anxiety disorder was the most common diagnosis (27.5%), which included PTSD (11%), followed by depressive disorder (25.2%). Children abused by multiple perpetrators were found to be more likely to have depression, to have a comorbid psychiatric disorder, and to exhibit sexualized behaviour. Children who also experienced physical violence at home were more likely to develop psychiatric illness following child sexual abuse. Psychiatric morbidity is high in the clinic population of children with a history of CSA. It is important to increase awareness regarding the consequences of CSA in order to increase help-seeking.
Keywords: child sexual abuse, India, tertiary care centre, clinical characteristics
Procedia PDF Downloads 457
5035 Fractional Euler Method and Finite Difference Formula Using Conformable Fractional Derivative
Authors: Ramzi B. Albadarneh
Abstract:
In this paper, we use the new definition of fractional derivative, called the conformable fractional derivative, to derive some finite difference formulas and their error terms, which are used to solve fractional differential equations and fractional partial differential equations, and also to derive the fractional Euler method and its error terms, which can be applied to solve fractional differential equations. To illustrate the contribution of our work, some applications of the finite difference formulas and the Euler method are given.
Keywords: conformable fractional derivative, finite difference formula, fractional derivative
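Using the identity T_alpha f(t) = t^(1-alpha) f'(t), which holds for differentiable f under the conformable definition, a fractional Euler step for T_alpha y = f(t, y) can be written as y_{k+1} = y_k + h * t_k^(alpha-1) * f(t_k, y_k). This is one natural discretization consistent with the conformable derivative, not necessarily the exact scheme derived in the paper.

```python
from math import exp

def conformable_euler(f, t0, y0, t_end, h, alpha):
    """Euler method for the conformable fractional IVP
    T_alpha y = f(t, y), y(t0) = y0, t0 > 0, 0 < alpha <= 1,
    using T_alpha y(t) = t**(1 - alpha) * y'(t) to rewrite the
    problem as y' = t**(alpha - 1) * f(t, y) and stepping with
    plain forward Euler."""
    t, y = t0, y0
    while t < t_end - 1e-12:
        y += h * t ** (alpha - 1.0) * f(t, y)
        t += h
    return y

# Test problem: T_alpha y = y, y(1) = 1, whose exact solution is
# y(t) = exp((t**alpha - t0**alpha) / alpha).
alpha = 0.5
approx = conformable_euler(lambda t, y: y, 1.0, 1.0, 2.0, 1e-4, alpha)
exact = exp((2.0 ** alpha - 1.0) / alpha)
```

As with the classical Euler method, the global error is first order in h, so halving the step roughly halves the error against the exact solution.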
Procedia PDF Downloads 439
5034 Inpatient Drug Related Problems and Pharmacist Intervention at a Tertiary Care Teaching Hospital in South India: A Retrospective Study
Authors: Bollu Mounica
Abstract:
Background: Drug-related problems (DRPs) are now seen very commonly in health care practice. They can result in medication errors, adverse events, drug interactions, and harm to patients. The pharmacist has an identified role in minimizing and preventing such problems. Objectives: To determine the incidence of drug-related problems in hospitalized patients and to analyze the clinical pharmacist interventions performed during the review of prescription orders of the general medicine, psychiatry, surgery, pediatrics, and gynaecology units of a large tertiary care teaching hospital. Methods: This was a retrospective, observational, and interventional study. The analysis took place daily with the following parameters: dose, rate of administration, presentation and/or dosage form, presence of inappropriate/unnecessary drugs, necessity of additional medication, more appropriate alternative therapies, presence of relevant drug interactions, inconsistencies in prescription orders, and physical-chemical incompatibilities/solution stability. From this evaluation, the drug therapy problems were classified, as were the resulting clinical interventions. From November 2012 to December 2014, the inpatient medication charts and orders were identified and rectified by ward and practicing clinical pharmacists within the inpatient pharmacy services of a tertiary care teaching hospital during routine daily activities. Data were collected and evaluated, and the causes of these problems were identified. Results: A total of 360 patients were followed. Male predominance (71.66%) was noted over females (28.33%). Drug-related problems were more commonly seen in patients aged between 31 and 60. Most of the DRPs observed in the study resulted from dispensing errors (26.11%) and improper drug selection (17.22%), followed by untreated indications (14.4%). The majority of the clinical pharmacist recommendations concerned the need for proper dispensing (26.11%) and drug change (18.05%).
DRPs of minor significance were the most frequent (41.11%), whereas 35.27% were moderate and 23.61% were major. The acceptance rate of the intervening clinical pharmacist's recommendations and the resulting changes in drug therapy was found to be high (86.66%). Conclusion: Our study showed that the prescriptions reviewed had some drug therapy problems and that the pharmacist interventions promoted the positive changes needed in the prescriptions. In this context, routine participation of clinical pharmacists in clinical medical rounds facilitates the identification of DRPs and may prevent their occurrence.
Keywords: drug related problems, clinical pharmacist, drug prescriptions, intervention
Procedia PDF Downloads 304
5033 Corrective Feedback and Uptake Patterns in English Speaking Lessons at Hanoi Law University
Authors: Nhac Thanh Huong
Abstract:
New teaching methods have led to changes in teachers' roles in the English class, of which teachers' error correction is an integral part. Language errors and corrective feedback have been of interest to many researchers in foreign language teaching. However, the techniques and the effectiveness of teachers' feedback have been a matter of much controversy. The present case study was carried out with a view to finding out the patterns of teachers' corrective feedback and their impact on students' uptake in English speaking lessons of legal English major students at Hanoi Law University. To achieve those aims, the study makes use of classroom observations as the main method of data collection to seek answers to the two following questions: 1. What patterns of corrective feedback occur in English speaking lessons for second-year legal English major students at Hanoi Law University? 2. To what extent does that corrective feedback lead to students' uptake? The study provided some important findings, among which was a close relationship between corrective feedback and uptake. In particular, recast was the most commonly used feedback type, yet it was the least effective in terms of students' uptake and repair, while the most successful feedback types, namely meta-linguistic feedback, clarification requests, and elicitation, which led to student-generated repair, were used at a much lower rate by teachers. Furthermore, the study revealed that different types of errors needed different types of feedback, and that the use of feedback depended on the students' English proficiency level. In light of the findings, a number of pedagogical implications have been drawn in the hope of enhancing the effectiveness of teachers' corrective feedback on students' uptake in the foreign language acquisition process.
Keywords: corrective feedback, error, uptake, speaking English lesson
Procedia PDF Downloads 262
5032 Decision Making System for Clinical Datasets
Authors: P. Bharathiraja
Abstract:
Computer-aided decision making systems are used to enhance the diagnosis and prognosis of diseases and to assist clinicians and junior doctors in clinical decision making. Medical data used for decision making should be definite and consistent. Data mining and soft computing techniques are used for cleaning the data and for incorporating human reasoning into decision making systems. Fuzzy rule-based inference can be used for classification in order to incorporate human reasoning into the decision making process. In this work, missing values are imputed using the mean or mode of the attribute. The data are normalized using min-max normalization to improve the design and efficiency of the fuzzy inference system. The fuzzy inference system is used to handle the uncertainties that exist in medical data. Equal-width partitioning is used to partition the attribute values into appropriate fuzzy intervals. Fuzzy rules are generated using a class-based associative rule mining algorithm. The system is trained and tested using the heart disease data set from the University of California at Irvine (UCI) Machine Learning Repository, split into training and testing data using a hold-out approach. From the experimental results it can be inferred that classification using the fuzzy inference system performs better than trivial IF-THEN rule-based classification approaches. Furthermore, it is observed that the use of fuzzy logic and the fuzzy inference mechanism handles uncertainty and resembles human decision making. The system can be used in the absence of a clinical expert to assist junior doctors and clinicians in clinical decision making. Keywords: decision making, data mining, normalization, fuzzy rule, classification
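The preprocessing steps described above (mean imputation, min-max normalization, and equal-width partitioning into fuzzy intervals) can be sketched as follows. This is a minimal illustration with invented function names, not the paper's implementation:

```python
def impute_mean(values):
    """Replace missing entries (None) with the mean of the observed values."""
    observed = [v for v in values if v is not None]
    mean = sum(observed) / len(observed)
    return [mean if v is None else v for v in values]

def min_max_normalize(values):
    """Scale values into [0, 1] via min-max normalization."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

def equal_width_partition(value, n_bins):
    """Map a normalized value in [0, 1] to one of n_bins equal-width intervals."""
    return min(int(value * n_bins), n_bins - 1)
```

In practice the interval index would select a fuzzy membership function for the attribute; the sketch stops at crisp binning.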
Procedia PDF Downloads 517
5031 Multi Response Optimization in Drilling Al6063/SiC/15% Metal Matrix Composite
Authors: Hari Singh, Abhishek Kamboj, Sudhir Kumar
Abstract:
This investigation proposes a grey-based Taguchi method to solve multi-response problems. The method is based on the Taguchi design of experiments and adopts Grey Relational Analysis (GRA) to convert multi-response problems into single-response problems. An attempt has been made to optimize the drilling process parameters considering weighted output response characteristics using grey relational analysis. The output response characteristics considered are surface roughness, burr height and hole diameter error, under the experimental conditions of cutting speed, feed rate, step angle, and cutting environment. The drilling experiments were conducted using an L27 orthogonal array. A combination of orthogonal array, design of experiments and grey relational analysis was used to ascertain the drilling process parameters that give minimum surface roughness, burr height and hole diameter error. The results reveal that combining the Taguchi design of experiments with grey relational analysis improves the surface quality of the drilled hole. Keywords: metal matrix composite, drilling, optimization, step drill, surface roughness, burr height, hole diameter error
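The grey relational analysis step that collapses several smaller-the-better responses (here: surface roughness, burr height, hole diameter error) into a single grade per experimental run can be sketched as below. This is a minimal equal-weight illustration under the standard GRA formulation with distinguishing coefficient ζ = 0.5, not the paper's exact weighted computation:

```python
def gra_grades(responses, zeta=0.5):
    """Grey relational grades for smaller-the-better responses.

    responses: list of columns, one column per output characteristic,
    each column holding one value per experimental run.
    """
    # Normalize each response (smaller-the-better): the best value maps to 1.
    norm_cols = []
    for col in responses:
        lo, hi = min(col), max(col)
        norm_cols.append([(hi - v) / (hi - lo) for v in col])
    n_runs = len(responses[0])
    grades = []
    for i in range(n_runs):
        # Deviation from the ideal sequence (all ones); after normalization
        # the global min/max deviations are 0 and 1.
        deltas = [abs(1.0 - col[i]) for col in norm_cols]
        coeffs = [(0.0 + zeta * 1.0) / (d + zeta * 1.0) for d in deltas]
        grades.append(sum(coeffs) / len(coeffs))  # equal weights
    return grades
```

The run with the highest grade is taken as the best parameter combination.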
Procedia PDF Downloads 317
5030 GIS Application in Surface Runoff Estimation for Upper Klang River Basin, Malaysia
Authors: Suzana Ramli, Wardah Tahir
Abstract:
Estimation of surface runoff depth is a vital part of any rainfall-runoff modeling. It leads to stream flow calculation and later predicts flood occurrences. GIS (Geographic Information System) is an advanced and apposite tool for simulating hydrological models due to its realistic treatment of topography. The paper discusses the calculation of surface runoff depth for two selected events by using GIS with the Curve Number method for the Upper Klang River basin. GIS enables the intersection of soil type and land use maps, which produces the curve number map. The results show good correlation between simulated and observed values, with an R2 of more than 0.7. Acceptable performance on statistical measures, namely mean error, absolute mean error, RMSE, and bias, is also reported in the paper. Keywords: surface runoff, geographic information system, curve number method, environment
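For reference, the Curve Number method computes runoff depth from rainfall and the curve number in closed form. A minimal sketch using the standard SCS metric formula with the common initial abstraction ratio of 0.2 (the paper's exact configuration may differ):

```python
def scs_runoff_depth(p_mm, cn):
    """SCS Curve Number runoff depth (mm) for rainfall p_mm and curve number cn."""
    s = 25400.0 / cn - 254.0   # potential maximum retention S (mm)
    ia = 0.2 * s               # initial abstraction Ia = 0.2 * S
    if p_mm <= ia:
        return 0.0             # all rainfall absorbed before runoff begins
    return (p_mm - ia) ** 2 / (p_mm - ia + s)
```

In the GIS workflow, `cn` would come from the curve number map cell by cell, and runoff depth would be evaluated per grid cell for each rainfall event.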
Procedia PDF Downloads 281
5029 Perceptual Image Coding by Exploiting Internal Generative Mechanism
Authors: Kuo-Cheng Liu
Abstract:
In perceptual image coding, the objective is to shape the coding distortion such that its amplitude does not exceed the error visibility threshold, or to remove perceptually redundant signals from the image. While most research focuses on color image coding, perceptual-based quantizers developed for luminance signals are often directly applied to chrominance signals, so the resulting color image compression methods are inefficient. In this paper, the internal generative mechanism is integrated into the design of a color image compression method. The internal generative mechanism working model, based on structure-based spatial masking, is used to assess subjective distortion visibility thresholds that are more consistent with human visual perception. An estimation method for the structure-based distortion visibility thresholds of the color components is further presented in a locally adaptive way to design the quantization process in the wavelet color image compression scheme. Since the lowest subband coefficient matrix of an image in the wavelet domain preserves the local property of the image in the spatial domain, the error visibility threshold inherent in each coefficient of the lowest subband for each color component is estimated using the proposed spatial error visibility threshold assessment. The threshold inherent in each coefficient of the other subbands for each color component is then estimated in a locally adaptive fashion based on the distortion energy allocation. Because the error visibility thresholds are estimated from predicted and reconstructed signals of the color image, the coding scheme incorporating the locally adaptive perceptual color quantizer does not require side information.
Experimental results show that the entropies of the three color components obtained using the proposed IGM-based color image compression scheme are lower than those obtained using the existing color image compression method at perceptually lossless visual quality. Keywords: internal generative mechanism, structure-based spatial masking, visibility threshold, wavelet domain
Procedia PDF Downloads 248
5028 Hierarchical Operation Strategies for Grid Connected Building Microgrid with Energy Storage and Photovoltaic Source
Authors: Seon-Ho Yoon, Jin-Young Choi, Dong-Jun Won
Abstract:
This paper presents hierarchical operation strategies that minimize the operation error between a day-ahead operation plan and real-time operation. Operating power systems between centralized and decentralized approaches can be represented as a hierarchical control scheme, featuring primary control, secondary control and tertiary control. Primary control is known as local control and features a fast response. Secondary control is referred to as the microgrid Energy Management System (EMS). Tertiary control is responsible for coordinating the operation of multiple microgrids. In this paper, we formulate three-stage microgrid operation strategies analogous to the hierarchical control scheme. The first stage sets the day-ahead scheduled output power of the Battery Energy Storage System (BESS), which is the only controllable source in the microgrid; the schedule is optimized to minimize the cost of power exchanged with the main grid using the Particle Swarm Optimization (PSO) method. The second stage controls the active and reactive power of the BESS so that it follows the day-ahead schedule when a State of Charge (SOC) error occurs between real-time operation and the scheduled plan. The third stage reschedules the system when the predicted error exceeds a limit value. The first stage is comparable to secondary control in that it adjusts the active power; the second stage is comparable to primary control in that it corrects the error locally; the third stage is comparable to secondary control in that it manages power balancing. The proposed strategies will be applied to one of the buildings at the Electronics and Telecommunications Research Institute (ETRI). The building microgrid is composed of photovoltaic (PV) generation, a BESS and load, and it will be interconnected with the main grid. The main purpose is to minimize operation cost and to operate according to the scheduled plan.
Simulation results validate the proposed strategies. Keywords: Battery Energy Storage System (BESS), Energy Management System (EMS), Microgrid (MG), Particle Swarm Optimization (PSO)
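The day-ahead scheduling stage relies on Particle Swarm Optimization. A generic minimal PSO over a box-bounded search space can be sketched as below; the coefficients, bounds and the test cost function are illustrative placeholders, not the actual BESS scheduling problem:

```python
import random

def pso_minimize(cost, dim, bounds, n_particles=20, iters=100,
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimizer: minimize cost over [lo, hi]^dim."""
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                      # personal bests
    pbest_cost = [cost(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_cost[i])
    gbest, gbest_cost = pbest[g][:], pbest_cost[g]   # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                # Inertia + cognitive pull + social pull, then clamp to bounds.
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            c = cost(pos[i])
            if c < pbest_cost[i]:
                pbest[i], pbest_cost[i] = pos[i][:], c
                if c < gbest_cost:
                    gbest, gbest_cost = pos[i][:], c
    return gbest, gbest_cost
```

In the scheduling context, each particle would encode the BESS output power for every interval of the next day, and the cost function would price the resulting power exchange with the main grid.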
Procedia PDF Downloads 248
5027 Application of Double Side Approach Method on Super Elliptical Winkler Plate
Authors: Hsiang-Wen Tang, Cheng-Ying Lo
Abstract:
In this study, the static behavior of a super elliptical Winkler plate is analyzed by applying the double side approach method. The lack of information about super elliptical Winkler plates motivates this study, and we use the double side approach method because of its superior ability to treat problems with complex boundary shapes efficiently. The double side approach method has the advantages of high accuracy, an easy calculation procedure and a low computational load. Most important of all, it gives the error bound of the approximate solution. The numerical results not only show that the double side approach method works well on this problem but also provide knowledge of the static behavior of super elliptical Winkler plates in practical use. Keywords: super elliptical Winkler plate, double side approach method, error bound, mechanics
Procedia PDF Downloads 355
5026 Human Errors in IT Services, HFACS Model in Root Cause Categorization
Authors: Kari Saarelainen, Marko Jantti
Abstract:
Trending the root causes of service incidents and problems is an important part of proactive problem management and service improvement in IT services. Root causes related to human error are an important category in IT service management as well, although their proportion among root causes is smaller than in other industries. The research problem in this study is: how should root causes of incidents related to human errors be categorized in an ITSM organization to effectively support service improvement? Categorization based on IT service management processes and on the Human Factors Analysis and Classification System (HFACS) taxonomy was studied in a case study. HFACS is widely used for human error root cause categorization across many industries. Combining these two categorization models in a two-dimensional matrix was found effective, yet impractical for daily work. Keywords: IT service management, ITIL, incident, problem, HFACS, Swiss cheese model
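The two-dimensional matrix combining ITSM processes with HFACS levels amounts to a cross-tabulation of root-cause records. A minimal sketch, with illustrative placeholder labels rather than the paper's actual taxonomy:

```python
from collections import Counter

def hfacs_matrix(root_causes):
    """Tally human-error root causes by (ITSM process, HFACS level) pair."""
    return Counter((rc["itsm_process"], rc["hfacs_level"]) for rc in root_causes)
```

Trending then reduces to reading the highest counts in the matrix, e.g. which process accumulates the most unsafe acts.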
Procedia PDF Downloads 488
5025 Mapping Poverty in the Philippines: Insights from Satellite Data and Spatial Econometrics
Authors: Htet Khaing Lin
Abstract:
This study explores the relationship between a diverse set of variables, encompassing both environmental and socio-economic factors, and poverty levels in the Philippines for the years 2012, 2015, and 2018. Employing Ordinary Least Squares (OLS), Spatial Lag Models (SLM), and Spatial Error Models (SEM), the study delves into the dynamics of key indicators, including daytime and nighttime land surface temperature, cropland surface, urban land surface, rainfall, population size, and normalized difference water, vegetation, and drought indices. The findings reveal consistent patterns and unexpected correlations, highlighting the need for nuanced policies that address the multifaceted challenges arising from the interplay of environmental and socio-economic factors. Keywords: poverty analysis, OLS, spatial lag models, spatial error models, Philippines, Google Earth Engine, satellite data, environmental dynamics, socio-economic factors
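The OLS baseline in the comparison above has a simple closed form for a single predictor; SLM and SEM extend it with spatially lagged dependent-variable and error terms, respectively, and are typically estimated by maximum likelihood rather than least squares. A minimal one-predictor OLS sketch:

```python
def simple_ols(x, y):
    """Closed-form slope and intercept of a one-predictor OLS fit."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    intercept = my - slope * mx
    return slope, intercept
```

In the multivariate spatial setting of the study, x would be a matrix of environmental and socio-economic indicators per administrative unit, and the spatial models would add a neighborhood weights matrix.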
Procedia PDF Downloads 99
5024 Impact of Workers’ Remittances on Poverty in Pakistan: A Time Series Analysis by ARDL
Authors: Syed Aziz Rasool, Ayesha Zaman
Abstract:
Poverty is one of the most important problems for any developing nation. Workers’ remittances and investment play a crucial role in the development of a country by reducing its poverty level. This research studies the relationship between workers’ remittances and poverty alleviation in Pakistan, focusing on their effect on poverty reduction. The study uses time series data for the period 1972-2013. An Autoregressive Distributed Lag (ARDL) model and an Error Correction Model (ECM) have been used to find the long-run and short-run relationships, respectively, between workers’ remittances and the poverty level. The inflow of remittances shows a significant negative impact on the poverty level. Moreover, the coefficient of the error correction term is negative and highly significant, indicating adjustment towards convergence. Based on this research, policy makers should focus on positive and effective policies to attract more remittances. JEL Classification: J61
Procedia PDF Downloads 286
5023 A Comparative Analysis of the Performance of COSMO and WRF Models in Quantitative Rainfall Prediction
Authors: Isaac Mugume, Charles Basalirwa, Daniel Waiswa, Mary Nsabagwa, Triphonia Jacob Ngailo, Joachim Reuder, Schättler Ulrich, Musa Semujju
Abstract:
Numerical weather prediction (NWP) models are considered powerful tools for guiding quantitative rainfall prediction. Several NWP models exist and are used at many operational weather prediction centers. This study considers two of them, namely the Consortium for Small-scale Modeling (COSMO) model and the Weather Research and Forecasting (WRF) model. It compares the models’ ability to predict rainfall over Uganda for the period 21st April 2013 to 10th May 2013 using the root mean square error (RMSE) and the mean error (ME). In comparing the performance of the models, this study assesses their ability to predict light rainfall events and extreme rainfall events. All the experiments used the default parameterization configurations and the same horizontal resolution (7 km). The results show that the COSMO model had a tendency to predict no rain, which explains its under-prediction. The COSMO model (RMSE: 14.16; ME: -5.91) presented a significantly (p = 0.014) higher magnitude of error than the WRF model (RMSE: 11.86; ME: -1.09). However, the COSMO model (RMSE: 3.85; ME: 1.39) performed significantly (p = 0.003) better than the WRF model (RMSE: 8.14; ME: 5.30) in simulating light rainfall events. Both models under-predicted extreme rainfall events, with the COSMO model (RMSE: 43.63; ME: -39.58) presenting significantly higher error magnitudes than the WRF model (RMSE: 35.14; ME: -26.95). This study recommends additional diagnosis of the models’ treatment of deep convection over the tropics. Keywords: comparative performance, the COSMO model, the WRF model, light rainfall events, extreme rainfall events
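The two verification scores used above are straightforward to compute; a minimal sketch, where a negative mean error indicates under-prediction of rainfall:

```python
import math

def rmse(pred, obs):
    """Root mean square error between predicted and observed values."""
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs))

def mean_error(pred, obs):
    """Mean error (bias): negative values indicate under-prediction."""
    return sum(p - o for p, o in zip(pred, obs)) / len(obs)
```

Both scores would be evaluated per model against the same gauge observations, as in the COSMO/WRF comparison above.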
Procedia PDF Downloads 261
5022 Clinico-Microbiological Study of S. aureus from Various Clinical Samples with Reference to Methicillin Resistant S. aureus (MRSA)
Authors: T. G. Pathrikar, A. D. Urhekar, M. P. Bansal
Abstract:
The aim was to identify S. aureus from patient samples on the basis of the coagulase test. We evaluated the slide coagulase test (n=46 positive), the tube coagulase test (n=48 positive) and the DNase test (n=44 positive). We isolated and identified MRSA from various clinical samples and specimens by the disc diffusion method and determined the incidence of MRSA to be 50% in patients. We also determined the in vitro antimicrobial susceptibility pattern of the MRSA isolates, as well as the MIC of oxacillin for MRSA by E-test. Keywords: cefoxitin disc diffusion MRSA detection, e-test, S. aureus devastating pathogen, tube coagulase confirmation
Procedia PDF Downloads 491