Search results for: underestimation errors
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 981

831 Application of a Universal Distortion Correction Method in Stereo-Based Digital Image Correlation Measurement

Authors: Hu Zhenxing, Gao Jianxin

Abstract:

Stereo-based digital image correlation (also referred to as three-dimensional (3D) digital image correlation (DIC)) is a technique for measuring both the 3D shape and the surface deformation of a component, and it has found increasing application in academia and industry. The accuracy of the reconstructed coordinates depends on many factors, such as the configuration of the setup, stereo-matching, and distortion. Most of these factors have been investigated in the literature. For instance, the configuration of a binocular vision system determines the systematic errors, while stereo-matching errors depend on the speckle quality and the matching algorithm and can only be controlled within a limited range. Distortion, by contrast, is non-linear, particularly in a complex image acquisition system, so distortion correction must be considered carefully. Moreover, the distortion function is difficult to formulate with conventional models in complex image acquisition systems involving microscopes and other complex lenses. Errors in the distortion correction propagate to the reconstructed 3D coordinates. To address this problem, an accurate mapping method based on 2D B-spline functions is proposed in this study. The mapping functions convert the distorted coordinates onto an ideal, distortion-free plane. The approach is suitable for any image acquisition distortion model. It is applied as a prior step that converts distorted coordinates to their ideal positions, enabling the camera to conform to the pin-hole model. A procedure for applying this approach to stereo-based DIC is presented. Using 3D speckle image generation, numerical simulations were carried out to compare the accuracy of the conventional method and the proposed approach.
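As a minimal sketch of the idea (not the authors' implementation), one smooth 2D B-spline can be fitted per output coordinate to map distorted calibration points back to their ideal pin-hole positions; the distortion model and all values below are hypothetical:

```python
import numpy as np
from scipy.interpolate import SmoothBivariateSpline

# Hypothetical calibration data: grid corners detected in the distorted
# image (xd, yd) paired with their known ideal pin-hole positions (xi, yi).
xi, yi = np.meshgrid(np.linspace(0.0, 640.0, 17), np.linspace(0.0, 480.0, 13))
xi, yi = xi.ravel(), yi.ravel()

# Synthetic radial distortion standing in for a real lens or microscope.
k = 1e-7
r2 = (xi - 320.0) ** 2 + (yi - 240.0) ** 2
xd = xi + k * (xi - 320.0) * r2
yd = yi + k * (yi - 240.0) * r2

# One smooth 2D B-spline per output coordinate: (xd, yd) -> (xi, yi).
sx = SmoothBivariateSpline(xd, yd, xi, kx=3, ky=3)
sy = SmoothBivariateSpline(xd, yd, yi, kx=3, ky=3)

def undistort(x, y):
    """Map a distorted pixel position to its ideal pin-hole position."""
    return float(sx.ev(x, y)), float(sy.ev(x, y))
```

Because the splines are fitted directly to calibration data, no parametric lens model is assumed, which is the property the abstract exploits for complex optics.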

Keywords: distortion, stereo-based digital image correlation, B-spline, 3D, 2D

Procedia PDF Downloads 498
830 Design of a Pneumonia Ontology for Diagnosis Decision Support System

Authors: Sabrina Azzi, Michal Iglewski, Véronique Nabelsi

Abstract:

Diagnostic error is frequent and one of the most important patient safety problems today. One of the main objectives of our work is to propose an ontological representation that takes the diagnostic criteria into account in order to improve diagnosis. We chose pneumonia because it is one of the diseases most frequently affected by diagnostic errors, with harmful effects on patients. To achieve our aim, we use a semi-automated method to integrate diverse knowledge sources, including publicly available pneumonia guidelines from international repositories, biomedical ontologies, and electronic health records. We follow the principles of the Open Biomedical Ontologies (OBO) Foundry. The resulting ontology covers symptoms and signs, all types of pneumonia, antecedents, pathogens, and diagnostic testing. The first evaluation results show that most of the terms are covered by the ontology. This work is still in progress and represents a first and major step toward the development of a diagnosis decision support system for pneumonia.

Keywords: Clinical decision support system, Diagnostic errors, Ontology, Pneumonia

Procedia PDF Downloads 189
829 A Tutorial on Model Predictive Control for Spacecraft Maneuvering Problem with Theory, Experimentation and Applications

Authors: O. B. Iskender, K. V. Ling, V. Dubanchet, L. Simonini

Abstract:

This paper discusses the recent advances and future prospects of spacecraft position and attitude control using Model Predictive Control (MPC). First, the challenges of space missions are summarized, in particular the errors, uncertainties, and constraints imposed by the mission, the spacecraft, and the onboard processing capabilities. Space mission errors and uncertainties are summarized in categories: initial condition errors, unmodeled disturbances, and sensor and actuator errors. The constraints are classified into two categories: physical and geometric. Last, real-time implementation capability is discussed with regard to the required computation time and the impact of sensor and actuator errors, based on Hardware-In-The-Loop (HIL) experiments. The rationales behind the scenarios are also presented in the scope of space applications such as formation flying, attitude control, rendezvous and docking, rover steering, and precision landing. The objectives of these missions are explained, and the generic constrained MPC problem formulations are summarized. Three key elements of MPC design are discussed: the prediction model, the constraint formulation, and the objective cost function. The prediction models can be linear time-invariant or time-varying depending on the geometry of the orbit, whether circular or elliptic. The constraints can be given as linear inequalities on the inputs or outputs, which can be written in the same form. Moreover, recent convexification techniques for non-convex geometric constraints (e.g., plume impingement, Field-of-View (FOV)) are presented in detail. Next, different objectives are provided in a mathematical framework and explained accordingly. Thirdly, because MPC implementation relies on solving constrained optimization problems in real time, computational aspects are also examined.
In particular, high-speed implementation capabilities and HIL challenges are presented for representative space avionics. This covers an analysis of future space processors as well as the requirements that sensors and actuators impose on the outputs of HIL experiments. HIL tests are investigated for kinematic and dynamic cases, using robotic arms and floating robots, respectively. Finally, the proposed algorithms and experimental setups are introduced and compared with the authors' previous work and future plans. The paper concludes with the conjecture that the MPC paradigm is a promising framework at the crossroads of space applications and could be further advanced by addressing the challenges and the unaddressed gaps identified throughout the paper.
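To make the generic constrained MPC formulation concrete, here is a minimal, hypothetical sketch (not from the paper): a finite-horizon QP for double-integrator translational dynamics with a thruster box constraint, solved by projected gradient descent on the condensed problem:

```python
import numpy as np

# Double-integrator translational dynamics, discretized (hypothetical values).
dt, N = 1.0, 10
A = np.array([[1.0, dt], [0.0, 1.0]])
B = np.array([[0.5 * dt**2], [dt]])
x0 = np.array([10.0, 0.0])   # initial position [m] and velocity [m/s]
u_max = 0.5                  # thruster acceleration limit

# Condense the dynamics: stacked predicted states X = F x0 + G U.
F = np.vstack([np.linalg.matrix_power(A, k) for k in range(1, N + 1)])
G = np.zeros((2 * N, N))
for k in range(N):
    for j in range(k + 1):
        G[2 * k:2 * k + 2, j] = (np.linalg.matrix_power(A, k - j) @ B).ravel()

# Quadratic cost on states and inputs gives a box-constrained QP in U.
Q, R = 1.0, 0.1
H = Q * G.T @ G + R * np.eye(N)   # QP Hessian (up to a factor of 2)
g = Q * G.T @ (F @ x0)            # QP linear term

# Projected gradient descent: the thruster box constraint is enforced
# by clipping after every step.
U = np.zeros(N)
step = 1.0 / np.linalg.norm(H, 2)
for _ in range(50000):
    U = np.clip(U - step * (H @ U + g), -u_max, u_max)

X = F @ x0 + G @ U                # predicted trajectory under the plan
```

A flight implementation would use a tailored QP solver, but the structure (prediction model, constraints, cost) is exactly the three design elements discussed above.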

Keywords: convex optimization, model predictive control, rendezvous and docking, spacecraft autonomy

Procedia PDF Downloads 110
828 Design Optimization of a Compact Quadrupole Electromagnet for CLS 2.0

Authors: Md. Armin Islam, Les Dallin, Mark Boland, W. J. Zhang

Abstract:

This paper reports a study on the optimal magnetic design of a compact quadrupole electromagnet for the Canadian Light Source (CLS 2.0). The goal of the design is to obtain a quadrupole with low relative higher-order harmonics and good field quality. The design problem was formulated as an optimization model in which the objective function is the higher-order harmonics (multipole errors) and the variable to be optimized is the material distribution on the pole. The higher-order harmonics arise in the quadrupole from truncating the ideal hyperbola at a certain point to form the pole. In this project, the resulting harmonics have been optimized both transversely and longitudinally by adjusting material on the poles in a controlled way. The optimization was carried out using finite element analysis (FEA). Lower higher-order harmonic amplitudes and better field quality were achieved through the optimization. On the basis of the optimized magnetic design, electrical and cooling calculations were performed for the magnet.
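The multipole errors referred to above can be quantified by harmonic analysis of the field sampled on a reference circle; the following is a minimal sketch with hypothetical coefficients (not the authors' FEA workflow), where the FFT of B_y + i B_x on the circle recovers the multipole coefficients:

```python
import numpy as np

# Field on a reference circle of radius r0, written as the complex
# combination B_y + i*B_x = sum_n C_n (z/r0)^(n-1); n = 2 is the
# quadrupole term. Coefficients below are hypothetical.
r0, M = 0.01, 256
theta = 2.0 * np.pi * np.arange(M) / M
z = r0 * np.exp(1j * theta)

C2 = 1.0      # main quadrupole coefficient
C6 = 0.002    # small 12-pole error, the leading allowed harmonic
B = C2 * (z / r0) ** 1 + C6 * (z / r0) ** 5

# The FFT of the samples recovers the multipole spectrum: coeffs[n-1] ~ C_n.
coeffs = np.fft.fft(B) / M
```

In practice B would come from FEA field maps rather than an analytic expression, but the harmonic decomposition step is the same.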

Keywords: drift, electrical and cooling calculation, integrated field, magnetic field gradient, multipole errors, quadrupole

Procedia PDF Downloads 143
827 Effects of Manufacture and Assembly Errors on the Output Error of Globoidal Cam Mechanisms

Authors: Shuting Ji, Yueming Zhang, Jing Zhao

Abstract:

The output error of the globoidal cam mechanism can be considered a relevant indicator of mechanism performance because it determines the kinematic and dynamic behavior of the mechanical transmission. Based on differential geometry and rigid-body transformations, the mathematical model of the surface geometry of the globoidal cam is established. We then present the analytical expression of the output error (including the transmission error and the displacement error along the output axis) under different manufacture and assembly errors. The effects of the center distance error, the perpendicularity error between the input and output axes, and the rotational angle error of the globoidal cam on the output error are systematically analyzed. A globoidal cam mechanism widely used in the automatic tool changers of CNC machines is used for illustration. Our results show that the perpendicularity error and the rotational angle error have little effect on the transmission error but a great effect on the displacement error along the output axis. This study plays an important role in the design, manufacture, and assembly of globoidal cam mechanisms.

Keywords: globoidal cam mechanism, manufacture error, transmission error, automatic tool changer

Procedia PDF Downloads 574
826 Advanced Digital Manufacturing: Case Study

Authors: Abdelrahman Abdelazim

Abstract:

Most industries are looking for technologies that are easy to use, efficient, and fast. To implement these, factories tend to use advanced systems that can turn complexity into simplicity and the rudimentary into the advanced. Cloud manufacturing is a new movement that aims to mirror and integrate cloud computing into manufacturing. Among its various advantages, cloud manufacturing decreases human involvement and increases the dependency on automated machines, which in turn decreases human errors and increases efficiency. Reliable, high-performance processes with minimal errors are highly desired by today's manufacturers. At first glance it seems to be the best alternative; however, the implementation of a cloud system can be very challenging. This work investigates cloud manufacturing in detail and outlines its advantages and disadvantages by converting a local factory in Kuwait to a cloud-ready system. Initially, the flow of the factory's manufacturing process was analyzed, identifying the bottlenecks and illustrating how cloud manufacturing can eliminate them. Following this, an automation process was analyzed and implemented. A comparison between the process before and after the adoption was carried out, showing the effects on the cost, output, and efficiency of the process.

Keywords: cloud manufacturing, automation, Kuwait industrial sector, advanced digital manufacturing

Procedia PDF Downloads 771
825 Pathological Gambling and Impulsivity: Comparison of the Eight Laboratory Measures of Inhibition Capacities

Authors: Semion Kertzman, Pinhas Dannon

Abstract:

Impulsive behaviour and the underlying brain processes are hypothesized to be central in the development and maintenance of pathological gambling. Inhibition ability can be differentially impaired in pathological gamblers (PGs). Aims: This study aimed to compare the ability of eight widely used inhibition measures to discriminate between PGs and healthy controls (HCs). Methods: PGs (N=51) and demographically matched HCs (N=51) performed cognitive inhibition (the Stroop), motor inhibition (the Go/NoGo), and reflective inhibition (the Matching Familiar Figures Test (MFFT)) tasks. Results: An increased total interference response time in the Stroop task (η² = 0.054), a larger number of commission errors in the Go/NoGo task (η² = 0.053), and the total number of errors in the MFFT (η² = 0.05) can discriminate PGs from HCs. The other measures were unable to differentiate between PGs and HCs. No significant correlations were observed between the inhibition measures. Conclusion: The inhibition measures varied in their ability to discriminate PGs from HCs. Most inhibition measures were not relevant to gambling behaviour. PGs do not express rash, impulsive behaviour, such as quickly choosing an answer without thinking; rather, in PGs, inhibition impairment was related to slow, inaccurate performance.

Keywords: pathological gambling, impulsivity, neurocognition, addiction

Procedia PDF Downloads 302
824 Error Analysis of the Pronunciation of English Consonants and Arabic Consonants by Egyptian Learners

Authors: Marwa A. Nasser

Abstract:

This is an empirical study of the most significant errors made by Egyptian learners in producing English consonants and Arabic consonants, with advice on how these can be remedied. The study adopts a descriptive approach, and the analysis is based on audio recordings of two groups of people. The first group includes six Egyptian volunteers from the English Department at the Faculty of Women who learn English as a foreign language. The other group includes six Egyptian learners who are studying Tajweed (how to recite the Quran correctly). The audio recordings were examined and the sounds analyzed to highlight the most common errors made by the learners while reading English or reading (or reciting) the Quran. Results show that the two groups of learners have problems with certain phonemic contrasts. Both groups share common errors although the two languages are different and unrelated (e.g. pre-aspiration of fortis stops, incorrect articulation of consonants, and velarization of certain sounds).

Keywords: consonant articulations, Egyptian learners of English, Egyptian learners of Quran, empirical study, error analysis, pronunciation problems

Procedia PDF Downloads 269
823 6-Degree-Of-Freedom Spacecraft Motion Planning via Model Predictive Control and Dual Quaternions

Authors: Omer Burak Iskender, Keck Voon Ling, Vincent Dubanchet, Luca Simonini

Abstract:

This paper presents a Guidance and Control (G&C) strategy to approach and synchronize with potentially rotating targets. The proposed strategy generates and tracks a safe trajectory for space servicing missions, including tasks like approaching, inspecting, and capturing. The main objective of this paper is to validate the G&C laws using a Hardware-In-the-Loop (HIL) setup with realistic rendezvous and docking equipment. Throughout this work, the assumption of full relative state feedback is relaxed by using onboard sensors that introduce realistic errors and delays, and the proposed closed-loop approach demonstrates robustness to this challenge. Moreover, the G&C blocks are unified via the Model Predictive Control (MPC) paradigm, and the coupling between translational and rotational motion is addressed via a dual-quaternion-based kinematic description. In this work, G&C is formulated as a convex optimization problem in which constraints such as thruster limits and output constraints are explicitly handled. Furthermore, the Monte Carlo method is used to evaluate the robustness of the proposed method to initial condition errors, uncertainty in the target's motion and attitude, and actuator errors. A capture scenario is tested on a robotic test bench whose onboard sensors estimate the position and orientation of a drifting satellite through camera imagery. Finally, the approach is compared with currently used robust H-infinity controllers and a guidance profile provided by the industrial partner.
The HIL experiments demonstrate that the proposed strategy is a potential candidate for future space servicing missions because 1) the algorithm is real-time implementable, as convex programming offers deterministic convergence properties and guarantees a finite-time solution; 2) critical physical and output constraints are respected; 3) robustness to sensor errors and uncertainties in the system is proven; and 4) it couples translational motion with rotational motion.
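As a minimal, self-contained sketch of the dual-quaternion kinematic description mentioned above (illustrative only, with hypothetical values rather than the paper's formulation), a rigid pose can be encoded as a real part (rotation) plus a dual part (translation), and poses compose by dual-quaternion multiplication:

```python
import numpy as np

def qmul(a, b):
    """Hamilton product of quaternions [w, x, y, z]."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def dq_from_pose(q, t):
    """Dual quaternion (real, dual) encoding rotation q followed by translation t."""
    return q, 0.5 * qmul(np.array([0.0, *t]), q)

def dq_mul(a, b):
    """Compose two rigid transforms as a dual-quaternion product (apply b, then a)."""
    ar, ad = a
    br, bd = b
    return qmul(ar, br), qmul(ar, bd) + qmul(ad, br)

def dq_apply(dq, p):
    """Transform a 3D point by the pose encoded in the dual quaternion."""
    r, d = dq
    conj = r * np.array([1.0, -1.0, -1.0, -1.0])
    rotated = qmul(qmul(r, np.array([0.0, *p])), conj)[1:]
    t = 2.0 * qmul(d, conj)[1:]
    return rotated + t
```

The appeal for coupled translation/rotation control is that one algebraic object carries both motions, so composed poses never need separate bookkeeping for rotation matrices and translation vectors.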

Keywords: dual quaternion, model predictive control, real-time experimental test, rendezvous and docking, spacecraft autonomy, space servicing

Procedia PDF Downloads 146
822 Evaluating Forecasts Through Stochastic Loss Order

Authors: Wilmer Osvaldo Martinez, Manuel Dario Hernandez, Juan Manuel Julio

Abstract:

We propose to assess the performance of k forecast procedures by exploring the distributions of forecast errors and error losses. We argue that non-systematic forecast errors are minimized when their distributions are symmetric and unimodal, and that forecast accuracy should be assessed through stochastic loss order rather than expected loss order, which is the way it is customarily performed in previous work. Moreover, since forecast performance evaluation can be understood as a one-way analysis of variance, we propose to explore loss distributions under two circumstances: when a strict (but unknown) joint stochastic order exists among the losses of all forecast alternatives, and when such an order holds only among subsets of the alternative procedures. Although loss stochastic order is stronger than loss moment order, our proposals are at least as powerful as competing tests and are robust to the correlation, autocorrelation, and heteroskedasticity settings those tests consider. In addition, since our proposals do not require samples of the same size, their scope is wider, and because they test the whole loss distribution rather than just loss moments, they can also be used to study forecast distributions. We illustrate the usefulness of our proposals by evaluating a set of real-world forecasts.
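The notion of stochastic loss order can be illustrated with a simple empirical check (a sketch, not the authors' test): forecast A's losses are first-order stochastically smaller than B's if every quantile of A's loss distribution is no larger than the corresponding quantile of B's. The synthetic data below stands in for real forecast errors:

```python
import numpy as np

def stochastically_smaller(loss_a, loss_b, p_grid=None):
    """Empirical check of first-order stochastic loss order:
    loss_a is stochastically smaller than loss_b iff Q_a(p) <= Q_b(p)
    at every probability p. Extreme tail quantiles are excluded because
    they are noisy in finite samples."""
    if p_grid is None:
        p_grid = np.linspace(0.05, 0.95, 19)
    return bool(np.all(np.quantile(loss_a, p_grid) <= np.quantile(loss_b, p_grid)))

rng = np.random.default_rng(1)
losses_a = np.abs(rng.normal(0.0, 1.0, 500))  # procedure A: tighter errors
losses_b = np.abs(rng.normal(0.0, 2.0, 500))  # procedure B: wider errors
```

Note the samples need not be the same size, which mirrors the wider scope claimed for the distribution-based comparison.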

Keywords: forecast evaluation, stochastic order, multiple comparison, non-parametric test

Procedia PDF Downloads 89
821 The Influence of Cognitive Load in the Acquisition of Words through Sentence or Essay Writing

Authors: Breno Barreto Silva, Agnieszka Otwinowska, Katarzyna Kutylowska

Abstract:

Research comparing lexical learning following the writing of sentences and longer texts with keywords is limited and contradictory. One possibility is that the recursivity of writing may enhance processing and increase lexical learning; another is that the higher cognitive load of complex-text writing (e.g., essays), at least when timed, may hinder the learning of words. In our study, we selected two sets of 10 academic keywords matched for part of speech, length (number of characters), frequency (SUBTLEXus), and concreteness, and we asked 90 L1-Polish advanced-level English majors to use the keywords when writing sentences, timed essays (60 minutes), or untimed essays. First, all participants wrote a timed control essay (60 minutes) without keywords. Then different groups produced Timed essays (60 minutes; n=33), Untimed essays (n=24), or Sentences (n=33) using the two sets of glossed keywords (counterbalanced). The comparability of the participants in the three groups was ensured by matching them for proficiency in English (LexTALE) and for a few measures derived from the control essay: VocD (productive lexical diversity), normed errors (productive accuracy), words per minute (productive written fluency), and holistic scores (overall quality of production). We measured lexical learning (depth and breadth) via an adapted Vocabulary Knowledge Scale (VKS) and a free association test. Cognitive load was measured in the three essays (Control, Timed, Untimed) using the normed number of errors and holistic scores (TOEFL criteria). The numbers of errors and essay scores were obtained from two raters (interrater reliability Pearson's r = .78-.91). Generalized linear mixed models showed no difference in the breadth and depth of keyword knowledge after writing Sentences, Timed essays, and Untimed essays.
The task-based measurements showed that Control and Timed essays had similar holistic scores, but that Untimed essays were of better quality than Timed essays. Untimed essays were also the most accurate, and Timed essays the most error-prone. In conclusion, using keywords in Timed, but not Untimed, essays increased cognitive load, leading to more errors and lower quality. Still, writing sentences and essays yielded similar lexical learning, and the differences in cognitive load between Timed and Untimed essays did not affect lexical acquisition.

Keywords: learning academic words, writing essays, cognitive load, English as an L2

Procedia PDF Downloads 73
820 Using Real Truck Tours Feedback for Address Geocoding Correction

Authors: Dalicia Bouallouche, Jean-Baptiste Vioix, Stéphane Millot, Eric Busvelle

Abstract:

When researchers or logistics software developers deal with vehicle routing optimization, they mainly focus on minimizing the total distance travelled or the total time spent on tours by the trucks, and on maximizing the number of visited customers. They assume that the upstream data used to optimize a transporter's tours are free from errors: the customers' real constraints, their addresses, and their GPS coordinates. In real transport operations, however, upstream data are often of poor quality because of address geocoding errors and irrelevant addresses received from the EDI (Electronic Data Interchange). Geocoders are not exempt from errors and can return incorrect GPS coordinates, and even a good geocoder will produce a bad result from an inaccurate address. For instance, when a geocoder has trouble geocoding an address, it returns the coordinates of the city center. Another common issue is that the maps used by geocoders are not regularly updated, so new buildings may not appear on the maps until the next update. Trying to optimize tours with incorrect customer GPS coordinates, which are the most important and basic input data in a vehicle routing problem, is therefore of little use and leads to bad, incoherent solution tours, because the customer locations used for the optimization differ greatly from the real positions. Our work is supported by the logistics software editor Tedies and the transport company Upsilon, and we use Upsilon's truck route data for our experiments. The trucks are equipped with TomTom GPS units that continuously record tour data (positions, speeds, tachograph information, etc.), which we retrieve to extract the real truck routes.
The aim of this work is to use the driver's experience and the feedback from real truck tours to validate the GPS coordinates of well-geocoded addresses and to correct badly geocoded ones. Thereby, when a vehicle makes its tour, it should have trouble finding a given customer's address at most once; in other words, the vehicle would be wrong at most once per customer address. Our method significantly improves the quality of the geocoding: on average, 70% of the GPS coordinates of a tour's addresses are corrected automatically. The remaining coordinates are corrected manually, with the system giving the user indications to help with the correction. This study shows the importance of using truck feedback to gradually correct address geocoding errors. Indeed, the accuracy of a customer's address and GPS coordinates plays a major role in tour optimization, yet address writing errors are very frequent. This feedback is naturally and usually exploited by transporters (by asking drivers, calling customers…) to learn about their tours and improve upcoming ones; we develop a method to automate a large part of this process.
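A minimal sketch of the core correction step (illustrative, with hypothetical coordinates and threshold, not the authors' full pipeline): compare the geocoded position with the stop actually recorded by the truck's GPS, and replace the geocode when the discrepancy exceeds a threshold:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two WGS84 points."""
    R = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = p2 - p1
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2.0 * R * math.asin(math.sqrt(a))

def correct_geocode(geocoded, observed_stop, threshold_m=150.0):
    """If the truck's recorded stop is far from the geocoded position,
    trust the driver's feedback and replace the coordinates.
    Returns (coordinates, corrected_flag)."""
    d = haversine_m(*geocoded, *observed_stop)
    return (observed_stop, True) if d > threshold_m else (geocoded, False)
```

Coordinates that are replaced (or repeatedly flagged) would then be queued for the manual review step described above.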

Keywords: driver experience feedback, geocoding correction, real truck tours

Procedia PDF Downloads 674
819 A New Approach to the Boom Welding Technique by Determining Seam Profile Tracking

Authors: Muciz Özcan, Mustafa Sacid Endiz, Veysel Alver

Abstract:

In this paper, we present a new approach to boom welding for mobile crane manufacturing, implementing a method to achieve homogeneous welding quality and reduced energy usage during boom production. We aim to achieve the same welding quality in every region of the boom during the manufacturing process and to detect possible welding errors using laser sensors so that they can be eliminated. Our system determines the position of the welding region directly, and with the help of the welding oscillator we are able to perform a proper boom weld. Errors that occur in the welding process can be observed by monitoring and eliminated by an operator. The major modification in the production of crane booms is their form. Although conventionally more than one weld is required to perform this process, with the suggested concept only one weld is sufficient, which is more energy- and environment-friendly. Consequently, as only one weld is needed to manufacture the boom, the quality of that weld becomes all the more essential. To satisfy the welding quality requirement, a welding manipulator was designed and fabricated. By using this manipulator, the risks posed to the operator and the surroundings by the dangerous gases formed during welding are diminished as much as possible.

Keywords: boom welding, seam tracking, energy saving, global warming

Procedia PDF Downloads 346
818 Direct Phoenix Identification and Antimicrobial Susceptibility Testing from Positive Blood Culture Broths

Authors: Waad Al Saleemi, Badriya Al Adawi, Zaaima Al Jabri, Sahim Al Ghafri, Jalila Al Hadhramia

Abstract:

Objectives: Using standard laboratory methods, a positive blood culture requires a minimum of two days (two occasions of overnight incubation) to obtain a final identification (ID) and antimicrobial susceptibility testing (AST) report. In this study, we aimed to evaluate the accuracy and precision of an alternative method (the direct method) that reduces the turnaround time by 24 hours. This method involves the direct inoculation of positive blood culture broths into the Phoenix system using serum separation tubes (SST). Method: This prospective study included monomicrobial positive blood cultures obtained from January 2022 to May 2023 at SQUH. Blood cultures containing a mixture of organisms, fungi, or anaerobic organisms were excluded. The results of the new direct method were compared with those of the standard method currently used in the lab. Accuracy and precision were evaluated for ID and AST following Clinical and Laboratory Standards Institute (CLSI) recommendations. The categorical agreement, essential agreement, and the rates of very major errors (VME), major errors (ME), and minor errors (MIE) were calculated for both gram-negative and gram-positive bacteria. Passing criteria were set according to CLSI. Results: ID and AST results were available for a total of 158 isolates. Of 77 gram-negative isolates, 71 (92%) were correctly identified at the species level. Of 70 gram-positive isolates, 47 (67%) were correctly identified. For gram-negative bacteria, the essential agreement of the direct method was ≥92% compared to the standard method, while the categorical agreement was ≥91% for all tested antibiotics. The precision of ID and AST was 100% for all tested isolates. For gram-positive bacteria, the essential agreement was >93%, while the categorical agreement was >92% for all tested antibiotics except moxifloxacin.
Several antibiotics showed unacceptably high rates of very major errors, including penicillin, cotrimoxazole, clindamycin, ciprofloxacin, and moxifloxacin. However, no errors were observed in the results for vancomycin, linezolid, and daptomycin. Conclusion: The direct method of ID and AST for positive blood cultures using SST is reliable for gram-negative bacteria. It significantly decreases the turnaround time and will facilitate antimicrobial stewardship.
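The agreement and error-rate calculations described above can be sketched as follows (a simplified illustration with made-up calls, using the usual CLSI-style definitions: a very major error is reference-resistant reported susceptible, a major error is reference-susceptible reported resistant, and minor errors involve the intermediate category):

```python
def ast_verification(pairs):
    """pairs: list of (reference_call, direct_call), each call in {'S','I','R'}.
    Returns categorical agreement and VME/ME/MIE rates. Per CLSI convention,
    VME is computed over reference-resistant isolates and ME over
    reference-susceptible isolates; MIE over all isolates."""
    n = len(pairs)
    n_r = sum(ref == 'R' for ref, _ in pairs)
    n_s = sum(ref == 'S' for ref, _ in pairs)
    agree = sum(ref == new for ref, new in pairs)
    vme = sum(ref == 'R' and new == 'S' for ref, new in pairs)  # very major
    me = sum(ref == 'S' and new == 'R' for ref, new in pairs)   # major
    mie = sum(ref != new and 'I' in (ref, new) for ref, new in pairs)
    return {
        'categorical_agreement': agree / n,
        'vme_rate': vme / n_r if n_r else 0.0,
        'me_rate': me / n_s if n_s else 0.0,
        'mie_rate': mie / n,
    }
```

Essential agreement would additionally compare MIC values (within one doubling dilution), which requires the raw MICs rather than the S/I/R calls shown here.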

Keywords: bloodstream infection, Oman, direct AST, blood culture, rapid identification, antimicrobial susceptibility, Phoenix, direct inoculation

Procedia PDF Downloads 64
817 ESL Students’ Engagement with Written Corrective Feedback

Authors: Khaled Karim

Abstract:

Although a large number of studies have examined the effectiveness of written corrective feedback (WCF) in L2 writing, very few have investigated students' attitudes towards the feedback and their perspectives on the usefulness of its different types. Using prompted stimulated recall interviews, this study investigated ESL students' perceptions of and attitudes towards the CF they received, as well as their preferences and reactions to the corrections. Twenty-four ESL students first received direct CF (e.g., providing target forms after crossing out erroneous forms) and indirect CF (e.g., underlining and underlining plus metalinguistic information) on four written tasks and then participated in an interview with the researcher. The analysis revealed that both direct and indirect CF were judged to be useful strategies for correction, but in different ways. Underline-only CF helped students think about the nature and type of the errors they had made, while metalinguistic CF was useful because it provided clues about the nature and type of the errors. Most participants indicated that indirect correction needed sufficient prior knowledge of the form to be effective. The majority of the students found the combination of underlining with metalinguistic information the most effective method of providing feedback. Detailed findings will be presented, and the pedagogical implications of the study will be discussed.

Keywords: ESL writing, error correction, feedback, written corrective feedback

Procedia PDF Downloads 236
816 Differences in the Perception of Behavior Problems in Pre-school Children among the Teachers and Parents

Authors: Jana Kožárová

Abstract:

Although behavior problems in pre-school children might be considered a transitional issue that may disappear by the transition into elementary school, the topic needs considerable attention because behavioral patterns are adopted especially at this age. A common issue in the process of eliminating behavior problems in pre-school children is a difference in the perception of the importance and gravity of the symptoms. The underestimation of children's problems by parents often results in conflicts with kindergarten teachers. The child thus does not get the support that his or her problems require, which might result in school failure and can negatively influence future school performance and success. The research sample consisted of 4 children with behavior problems, their teachers, and their parents. To determine the most problematic areas in each child's behavior, the Child Behavior Checklist (CBCL), filled in by parents, and the Caregiver-Teacher Report Form (C-TRF), filled in by teachers, were used. Scores from the CBCL and the C-TRF were compared using the Pearson correlation coefficient in order to find the differences in the perception of behavior problems in pre-school children.

Keywords: behavior problems, Child Behavior Checklist, Caregiver-Teacher Report Form, Pearson correlation coefficient, pre-school age

Procedia PDF Downloads 434
815 Applying Simulation-Based Digital Teaching Plans and Designs in Operating Medical Equipment

Authors: Kuo-Kai Lin, Po-Lun Chang

Abstract:

Background: The Emergency Care Research Institute's list of the top 10 medical technology hazards for 2017 was headed by the following: 'infusion errors can be deadly if simple safety steps are overlooked.' In addition, hospitals use various assessment items to evaluate the safety of their medical equipment, confirming the importance of medical equipment safety. In recent years, the topic of patient safety has garnered increasing attention. Accordingly, various agencies have established patient safety committees to coordinate, collect, and analyze information regarding abnormal events associated with medical practice, and activities to promote and improve employee training have been introduced to diminish the recurrence of medical malpractice. Objective: To allow nursing personnel to acquire the skills needed to operate common medical equipment, and to update and review such skills whenever necessary, in order to elevate the quality of medical care and reduce patient injuries caused by equipment operation errors. Method: A quasi-experimental design was adopted, and nurses from a regional teaching hospital were selected as the study sample. Online videos demonstrating the operation of common medical equipment were produced, and quick response (QR) codes were designed so that nursing personnel could quickly access the videos when necessary. Senior nursing supervisors and equipment experts were invited to formulate a 'Scale-based Questionnaire for Assessing Nursing Personnel's Operational Knowledge of Common Medical Equipment' to evaluate the nursing personnel's literacy regarding the operation of the equipment. From March to October 2017, employee training on medical equipment operation and a practice course (simulation course) were implemented, after which their effectiveness was assessed.
Results: Prior to and after the training and practice course, the 66 participating nurses scored 58 and 87, respectively, on ‘operational knowledge of common medical equipment’ (a statistically significant difference; t = -9.407, p < .001); 53.5 and 86.3 on ‘operational knowledge of 12-lead electrocardiography’ (z = -2.087, p < .01); 40 and 79.5 on ‘operational knowledge of cardiac defibrillators’ (z = -3.849, p < .001); 90 and 98 on ‘operational knowledge of Abbott pumps’ (z = -1.841, p = 0.066); and 8.7 and 13.7 on ‘perceived competence’ (a statistically significant difference; t = -2.77, p < .05). In the participating hospital, medical equipment operation errors were observed in both 2016 and 2017. However, since the implementation of the intervention, no medical equipment operation errors had been observed up to October 2017, which can be regarded as the secondary outcome of this study. Conclusion: In this study, innovative teaching strategies were adopted to effectively enhance the professional literacy and skills of nursing personnel in operating medical equipment. The training and practice course also elevated the nursing personnel’s related literacy and perceived competence in operating medical equipment. The nursing personnel were thus able to accurately operate the medical equipment and avoid operational errors that might jeopardize patient safety.
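The pre/post comparisons above (paired t-tests for interval-scaled scores, Wilcoxon signed-rank z for the subscales) can be reproduced with standard tools; a minimal sketch with simulated scores, not the study's data, might look like:

```python
import numpy as np
from scipy import stats

# Hypothetical pre/post scores for 66 nurses (simulated, for illustration only)
rng = np.random.default_rng(0)
pre = rng.normal(58, 10, size=66)
post = pre + rng.normal(29, 8, size=66)   # simulated improvement after training

# Paired t-test for interval-scaled total scores
t_stat, t_p = stats.ttest_rel(pre, post)

# Wilcoxon signed-rank test for non-normal / ordinal subscale scores
w_stat, w_p = stats.wilcoxon(pre, post)

print(f"paired t = {t_stat:.3f}, p = {t_p:.4g}")
print(f"Wilcoxon W = {w_stat:.1f}, p = {w_p:.4g}")
```

A negative t statistic here, as in the abstract, simply reflects that the pre-course scores are subtracted from the higher post-course scores.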

Keywords: medical equipment, digital teaching plan, simulation-based teaching plan, operational knowledge, patient safety

Procedia PDF Downloads 138
814 An Alternative Stratified Cox Model for Correlated Variables in Infant Mortality

Authors: K. A. Adeleke

Abstract:

Often in epidemiological research, introducing a stratified Cox model can account for interactions between some inherent factors and the major, noticeable factors. This work aimed at modelling correlated variables in infant mortality in the presence of inherent factors affecting the infant survival function. An alternative semiparametric stratified Cox model is proposed with a view to accommodating multilevel factors that interact with others. The model was applied to infant mortality data from the Nigeria Demographic and Health Survey (NDHS), in which multilevel factors (tetanus, polio, and breastfeeding) are correlated with the main factors (sex, size, and mode of delivery). Asymptotic properties of the estimators are also studied via simulation. The fitted model showed good fit and performed differently depending on the levels of interaction of the strata variable Z*. Evidence that the baseline hazard functions and regression coefficients differ from stratum to stratum provides a gain in information over the standard Cox model. Simulation results showed that the present method produced better estimates in terms of bias, lower standard errors, and/or mean square errors.
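A stratified Cox model replaces the single partial likelihood with a sum of stratum-specific ones, each with its own baseline hazard. A minimal sketch of that stratified partial log-likelihood on toy data (one covariate, no tie handling; an illustration, not the paper's estimator) is:

```python
import math
from collections import defaultdict

def stratified_partial_loglik(beta, data):
    """Cox partial log-likelihood with a separate baseline hazard per stratum.
    data: list of (time, event, x, stratum) records; x is a single covariate.
    Sketch only: no tie handling (Breslow/Efron) and one covariate."""
    by_stratum = defaultdict(list)
    for rec in data:
        by_stratum[rec[3]].append(rec)
    ll = 0.0
    for recs in by_stratum.values():
        recs.sort(key=lambda r: r[0])              # ascending event time
        for t_i, d_i, x_i, _ in recs:
            if not d_i:                            # censored: no event term
                continue
            # risk set: subjects still under observation at t_i (same stratum)
            risk = [x for (t, d, x, s) in recs if t >= t_i]
            ll += beta * x_i - math.log(sum(math.exp(beta * x) for x in risk))
    return ll

# Hypothetical toy data: (time, event, covariate, stratum)
toy = [(2.0, 1, 1.0, "A"), (3.0, 0, 0.0, "A"), (5.0, 1, 1.0, "A"),
       (1.0, 1, 0.0, "B"), (4.0, 1, 1.0, "B"), (6.0, 0, 0.0, "B")]
print(stratified_partial_loglik(0.5, toy))
```

Maximizing this function over beta (e.g. with a Newton step) gives the stratified estimate; because the log-sum term is computed within each stratum, each stratum keeps its own baseline hazard.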

Keywords: stratified Cox, semiparametric model, infant mortality, multilevel factors, confounding variables

Procedia PDF Downloads 557
813 Smart Card Technology Adaption in a Hospital Setting

Authors: H. K. V. Narayan

Abstract:

This study was conducted at Tata Memorial Hospital (TMH), Mumbai, India. Its aim was to evaluate the impact of adopting smart cards (SC) for clinical and business transactions in order to reduce lead times and enforce the business rules of the hospital. The objective of implementing the smart card was to improve patient perception of quality in terms of structures, processes, and outcomes, and to improve the productivity of the institution. The smart card was implemented in phases from 2011 and integrated with the Hospital Information System (HIS/EMR). The implementation was a learning curve for all stakeholders, as the software obviated the need to use hard copies of transactions. Acceptability to the stakeholders was a challenge in change management. The study assessed the impact three years into the implementation, and the observed trends suggest that it has decreased lead times for services and increased the number of transactions, and thereby productivity. Patients who used to complain of multiple queues and cumbersome transactions now compliment the administration for effective use of information and communication technology.

Keywords: smart card, high availability of health care information, reduction in potential medical errors due to elimination of transcription errors, reduction in the number of queues, increased transactions, augmentation of revenue

Procedia PDF Downloads 285
812 A Real Time Ultra-Wideband Location System for Smart Healthcare

Authors: Mingyang Sun, Guozheng Yan, Dasheng Liu, Lei Yang

Abstract:

Driven by the demand for intelligent monitoring in rehabilitation centers and hospitals, a high-accuracy real-time location system based on UWB (ultra-wideband) technology is proposed. The system measures the precise location of a specific person, traces their movement, and visualizes their trajectory on screen for doctors or administrators. Doctors can therefore view the position of a patient at any time and find them immediately and exactly when something urgent happens. In the design process, different algorithms were compared and their errors analyzed. In addition, a simple but effective way of correcting the antenna delay error is described. By choosing the best algorithm and correcting errors with the corresponding methods, the system attained good accuracy. Experiments indicated that the ranging error of the system is lower than 7 cm, the locating error is lower than 20 cm, and the refresh rate exceeds 5 times per second. In future work, by embedding the system in wearable IoT (Internet of Things) devices, it could provide not only physical parameters but also the activity status of the patient, which would greatly help doctors in delivering healthcare.
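Once per-anchor ranges are available, the position fix typically reduces to multilateration. A sketch of a least-squares solver on simulated 2-D ranges (an illustration of the general technique, not the system's actual algorithm):

```python
import numpy as np

def trilaterate(anchors, ranges):
    """Least-squares 2-D position from ranges to >= 3 fixed anchors.
    Linearizes by subtracting the first anchor's range equation:
        2 (a_i - a_1) . p = r_1^2 - r_i^2 + |a_i|^2 - |a_1|^2"""
    anchors = np.asarray(anchors, float)
    ranges = np.asarray(ranges, float)
    A = 2.0 * (anchors[1:] - anchors[0])
    b = (ranges[0] ** 2 - ranges[1:] ** 2
         + np.sum(anchors[1:] ** 2, axis=1) - np.sum(anchors[0] ** 2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# Hypothetical room with four UWB anchors (metres) and a simulated tag
anchors = [(0, 0), (8, 0), (0, 6), (8, 6)]
true_pos = np.array([3.0, 2.0])
ranges = [np.linalg.norm(true_pos - a) for a in anchors]
print(trilaterate(anchors, ranges))   # recovers the tag position
```

With noisy ranges the overdetermined system is still solved in the least-squares sense, which is one reason more than three anchors improve the locating error.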

Keywords: intelligent monitoring, ultra-wideband technology, real-time location, IoT devices, smart healthcare

Procedia PDF Downloads 140
811 Peer Corrective Feedback on Written Errors in Computer-Mediated Communication

Authors: S. H. J. Liu

Abstract:

This paper aims to explore the role of peer corrective feedback (CF) in improving written productions by English-as-a-foreign-language (EFL) learners who work together via Wikispaces. It attempted to determine the effect of peer CF on form accuracy in English, such as grammar and lexis. Thirty-four EFL learners at the tertiary level were randomly assigned to the experimental (with peer feedback) or the control (without peer feedback) group; each group was subdivided into small groups of two or three. This resulted in six and seven small groups in the experimental and control groups, respectively. In the experimental group, each learner played a role as an assessor (providing feedback to others) as well as an assessee (receiving feedback from others). Each participant was asked to compose his/her written work and revise it based on the feedback. In the control group, on the other hand, learners neither provided nor received feedback but composed and revised their written work on their own. Data collected from learners’ compositions and post-task interviews were analyzed and reported in this study. Following the completion of three writing tasks, 10 participants were selected and interviewed individually regarding their perception of collaborative learning in the Computer-Mediated Communication (CMC) environment. Language aspects to be analyzed included lexis (e.g., appropriate use of words), verb tenses (e.g., present and past simple), prepositions (e.g., in, on, and between), nouns, and articles (e.g., a/an). Feedback types consisted of corrective, affective, suggestive, and didactic. Frequencies of feedback types and the accuracy of the language aspects were calculated. The results first suggested that accurate items were found more often in the experimental group than in the control group. Such results indicate that those who worked collaboratively outperformed those who worked non-collaboratively on the accuracy of linguistic aspects.
Furthermore, the first type of CF (e.g., corrections directly related to linguistic errors) was found to be the most frequently employed type, whereas affective and didactic were the least used by the experimental group. The results further indicated that most participants perceived that peer CF was helpful in improving the language accuracy, and they demonstrated a favorable attitude toward working with others in the CMC environment. Moreover, some participants stated that when they provided feedback to their peers, they tended to pay attention to linguistic errors in their peers’ work but overlook their own errors (e.g., past simple tense) when writing. Finally, L2 or FL teachers or practitioners are encouraged to employ CMC technologies to train their students to give each other feedback in writing to improve the accuracy of the language and to motivate them to attend to the language system.

Keywords: peer corrective feedback, computer-mediated communication (CMC), second or foreign language (L2 or FL) learning, Wikispaces

Procedia PDF Downloads 245
810 Market Illiquidity and Pricing Errors in the Term Structure of CDS

Authors: Lidia Sanchis-Marco, Antonio Rubia, Pedro Serrano

Abstract:

This paper studies the informational content of pricing errors in the term structure of sovereign CDS spreads. The residuals from a no-arbitrage model are employed to construct a price discrepancy estimate, or noise measure. The noise estimate is understood as an indicator of market distress and reflects frictions such as illiquidity. Empirically, the noise measure is computed for an extensive panel of CDS spreads. Our results reveal that an important fraction of systematic risk is not priced in default swap contracts. When projecting the noise measure onto a set of financial variables, the panel-data estimates show that greater price discrepancies are systematically related to a higher level of offsetting transactions in CDS contracts. This evidence suggests that arbitrage capital flows exit the marketplace during times of distress, which is consistent with market segmentation between investors and arbitrageurs in which professional arbitrageurs are particularly ineffective at bringing prices to their fundamental values during turbulent periods. Our empirical findings are robust across the most common CDS pricing models employed in the industry.
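As a rough illustration, a noise measure of this kind can be computed as the root-mean-squared deviation of observed spreads from model-implied spreads across the curve (the paper's exact estimator may differ):

```python
import numpy as np

def noise_measure(market_spreads, model_spreads):
    """Root-mean-squared pricing error of observed CDS spreads versus
    no-arbitrage model-implied spreads across the term structure.
    A simple proxy for a 'noise' illiquidity measure, for illustration."""
    resid = np.asarray(market_spreads, float) - np.asarray(model_spreads, float)
    return np.sqrt(np.mean(resid ** 2))

# Hypothetical 1y-10y sovereign CDS spreads in basis points (illustrative)
market = [120.0, 135.0, 150.0, 158.0, 165.0]
model = [118.0, 137.0, 147.0, 160.0, 163.0]
print(noise_measure(market, model))   # RMS pricing error in bp
```

Tracking this statistic through time for each sovereign gives a panel that can then be projected onto liquidity and distress variables, as the abstract describes.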

Keywords: credit default swaps, noise measure, illiquidity, capital arbitrage

Procedia PDF Downloads 569
809 Improvement of Bone Scintography Image Using Image Texture Analysis

Authors: Yousif Mohamed Y. Abdallah, Eltayeb Wagallah

Abstract:

Image enhancement allows the observer to see details in images that may not be immediately observable in the original image. Image enhancement is the transformation or mapping of one image to another. The enhancement of certain features in images can be accompanied by undesirable effects. To achieve maximum image quality after denoising, a new low-order, locally adaptive Gaussian scale mixture model and a median filter were applied to suppress nonlinearities introduced by scattering, followed by a new nonlinear approach for contrast enhancement of bones in bone scan images using both gamma correction and negative transform methods. The usual assumption of gamma- and Poisson-distributed statistics leads to overestimation of the noise variance in regions of low intensity but to underestimation in regions of high intensity, and therefore to non-optimal results. The contrast enhancement results were obtained and evaluated using MATLAB on nuclear medicine images of the bones. The optimal number of bins, in particular the number of gray levels, is chosen automatically using entropy and the average distance between the histogram of the original gray-level distribution and the contrast enhancement function’s curve.
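The two point transforms mentioned, gamma correction and the negative transform, are straightforward intensity mappings; a minimal NumPy sketch (illustrative values, not the paper's MATLAB pipeline):

```python
import numpy as np

def gamma_correct(img, gamma):
    """Power-law (gamma) intensity transform on a [0, 255] image:
    s = 255 * (r / 255) ** gamma. gamma < 1 brightens dark regions."""
    norm = np.asarray(img, float) / 255.0
    return 255.0 * norm ** gamma

def negative(img):
    """Negative transform: s = 255 - r, inverting the gray scale."""
    return 255.0 - np.asarray(img, float)

# Toy 2x2 'bone scan' patch, for illustration only
patch = np.array([[0.0, 64.0], [128.0, 255.0]])
print(gamma_correct(patch, 0.5))   # dark regions lifted
print(negative(patch))             # gray scale inverted
```

In a bone scan, the negative transform makes high-uptake (bright) regions dark, while gamma correction with gamma < 1 pulls detail out of low-count regions.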

Keywords: bone scan, nuclear medicine, Matlab, image processing technique

Procedia PDF Downloads 509
808 An Error Analysis of English Communication of Suan Sunandha Rajabhat University Students

Authors: Chantima Wangsomchok

Abstract:

The main purposes of this study are (1) to test the students’ communicative competence within six main functions: greeting, parting, thanking, offering, requesting, and suggesting; (2) to employ error analysis on the students’ communicative competence within those functions; and (3) to compare the characteristics of the errors found. The subjects of the study were 328 first-year undergraduates taking the Foundation English course in the first semester of the 2008 academic year at Suan Sunandha Rajabhat University. The study found that while the subjects showed high communicative competence in the use of three functions: greeting, thanking, and offering, they showed poor communicative competence in suggesting, requesting, and parting. In addition, grammatical errors were most frequently found in the parting function, whereas errors were less frequently found in the thanking and requesting functions. The students also tended to show high pragmatic failure in the use of the greeting and suggesting functions.

Keywords: error analysis, functions of English language, communicative competence, cognitive science

Procedia PDF Downloads 431
807 Data Integrity: Challenges in Health Information Systems in South Africa

Authors: T. Thulare, M. Herselman, A. Botha

Abstract:

Poor system use, including inappropriate design of health information systems, causes difficulties in communication with patients and increases the time healthcare professionals spend recording the necessary health information for medical records. System features like pop-up reminders, complex menus, and poor user interfaces can make electronic medical records far more time-consuming than paper cards and can affect decision-making processes. Although errors associated with health information, and their real and likely effects on the quality of care and patient safety, have been documented for many years, more research is needed to measure the occurrence of these errors and determine their causes in order to implement solutions. Therefore, the purpose of this paper is to identify data integrity challenges in hospital information systems through a scoping review and, based on the results, provide recommendations on how to manage them. Only 34 of the 297 publications initially identified in the field were found suitable. The results indicated that human and computerized systems are the most common challenges associated with data integrity, and that factors such as policy, environment, health workforce, and lack of awareness contribute to these challenges; if appropriate measures are taken, however, the data integrity challenges can be managed.

Keywords: data integrity, data integrity challenges, hospital information systems, South Africa

Procedia PDF Downloads 181
806 GGA-PBEsol+TB-MBJ Studies of SrxPb1-xS Ternary Semiconductor Alloys

Authors: Y. Benallou, K. Amara, O. Arbouche

Abstract:

In this paper, we report a density functional study of the structural, electronic, and elastic properties of the ordered phases of SrxPb1-xS ternary semiconductor alloys, namely the rocksalt compounds PbS and SrS and the rocksalt-based compounds SrPb3S4, SrPbS2, and Sr3PbS4. These first-principles calculations have been performed using the full-potential linearized augmented plane wave method (FP-LAPW) within the Generalized Gradient Approximation developed by Perdew–Burke–Ernzerhof for solids (PBEsol). The calculated structural parameters, such as the lattice parameters, the bulk modulus B, and its pressure derivative B', are in reasonable agreement with the available experimental and theoretical data. In addition, elastic properties such as the elastic constants (C11, C12, and C44), the shear modulus G, the Young modulus E, the Poisson’s ratio ν, and the B/G ratio are also given. For the electronic properties, the exchange and correlation effects were treated by the Tran-Blaha modified Becke-Johnson (TB-mBJ) potential to overcome the shortcoming of the underestimation of the energy gaps in both the LDA and GGA approximations. The obtained results are compared to available experimental data and to other theoretical calculations.
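Given cubic elastic constants C11, C12, and C44, the polycrystalline moduli listed above follow from standard averaging formulas; a sketch using the Voigt shear average (illustrative constants, not the paper's computed values):

```python
def cubic_moduli(c11, c12, c44):
    """Voigt-average polycrystalline moduli from cubic elastic constants (GPa).
    Returns bulk modulus B, shear modulus G, Young's modulus E,
    Poisson ratio nu, and the B/G (Pugh) ratio."""
    B = (c11 + 2.0 * c12) / 3.0
    G = (c11 - c12 + 3.0 * c44) / 5.0          # Voigt shear average
    E = 9.0 * B * G / (3.0 * B + G)
    nu = (3.0 * B - 2.0 * G) / (2.0 * (3.0 * B + G))
    return B, G, E, nu, B / G                   # B/G > 1.75 suggests ductility

# Hypothetical constants in GPa, for illustration only
print(cubic_moduli(120.0, 30.0, 25.0))
```

The Pugh B/G ratio is the usual quick criterion read off such a table: values above roughly 1.75 indicate ductile behaviour, below it brittle behaviour.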

Keywords: SrxPb1-xS, GGA-PBEsol+TB-MBJ, density functional, Perdew–Burke–Ernzerhof, FP-LAPW

Procedia PDF Downloads 398
805 The Search of Possibility of Running Six Sigma Process in IT Education Center

Authors: Mohammad Amini, Aliakbar Alijarahi

Abstract:

This research, titled ‘The Search of Possibility of Running Six Sigma Process in IT Education Center’, aims to test the feasibility of running the Six Sigma process in an IT education center system. Six Sigma is a well-established method for reducing process errors. To evaluate the feasibility of running Six Sigma in the IT education center, several variables relevant to the process were selected: the amount of support from the organization's top management for the process; the specialties currently available; the ability of the training system to compensate for shortfalls; the degree of match between the current culture and the Six Sigma culture; and the current quality compared with the quality gained from running Six Sigma. To evaluate these variables, questions were formulated for each variable, yielding a questionnaire form with 28 questions that was distributed in the target population. Since the working environment is highly competitive, the organization needs to reduce errors to a minimum; otherwise it loses its customers. The questionnaire form was given to 55 persons, and 50 completed forms were returned. After analyzing the forms, the following results were obtained: the IT education center needs to adopt and run Six Sigma to improve its process quality, and most of the factors needed to run Six Sigma exist in the IT education center, although more organizational support is needed.
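For context, the 'sigma' in Six Sigma is ordinarily computed from a defect rate; a sketch of the conventional DPMO-to-sigma-level arithmetic (with the customary 1.5-sigma shift; the figures are hypothetical, not from this study):

```python
from statistics import NormalDist

def sigma_level(defects, opportunities):
    """Short-term sigma level from a defect rate, using the conventional
    1.5-sigma long-term shift. Standard Six Sigma arithmetic, shown for
    illustration; returns (sigma level, defects per million opportunities)."""
    dpmo = 1_000_000 * defects / opportunities
    level = NormalDist().inv_cdf(1 - dpmo / 1_000_000) + 1.5
    return level, dpmo

# Hypothetical example: 7 grading errors in 2,000 graded assignments
level, dpmo = sigma_level(7, 2000)
print(f"DPMO = {dpmo:.0f}, sigma level ~ {level:.2f}")
```

A process operating at the full six-sigma level corresponds to about 3.4 defects per million opportunities under this convention.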

Keywords: education, customer, self-action, quality, continuous improvement process

Procedia PDF Downloads 340
804 A Longitudinal Case Study of Greek as a Second Language

Authors: M. Vassou, A. Karasimos

Abstract:

A primary concern in the field of Second Language Acquisition (SLA) research is to determine the innate mechanisms of second language learning and acquisition through the systematic study of a learner's interlanguage. Errors emerge while a learner attempts to communicate using the target language and can be seen either as the observable linguistic product of the latent cognitive and language processes of mental representations or as an indispensable learning mechanism. Therefore, the study of the learner’s erroneous forms may depict the various strategies and mechanisms that take place during the language acquisition process, resulting in deviations from the target-language norms and difficulties in communication. Mapping the erroneous utterances of a late adult learner in the process of acquiring Greek as a second language constitutes one of the main aims of this study. For our research purposes, we created an error-tagged learner corpus composed of the participant’s written texts produced throughout a 4-year period of instructed language acquisition. Error analysis and interlanguage theory constitute the methodological and theoretical framework, respectively. The research questions pertain to the learner's most frequent errors per linguistic category and per year, as well as his choices concerning the Greek article system. According to the quantitative analysis of the data, the most frequent errors are observed in the categories of the stress system and syntax, whereas a significant fluctuation and/or gradual reduction throughout the 4 years of instructed acquisition indicates the emergence of developmental stages. The findings with regard to article usage bespeak fossilization of erroneous structures in certain contexts.
In general, our results point towards the existence and further development of an established learner’s (inter-)language system governed not only by mother-tongue and target-language influences but also by the learner’s own assumptions and set of rules resulting from a complex cognitive process. It is expected that this study will contribute not only to knowledge in the field of Greek as a second language and SLA in general, but will also provide insight into the cognitive mechanisms and strategies developed by multilingual learners in late adulthood.

Keywords: Greek as a second language, error analysis, interlanguage, late adult learner

Procedia PDF Downloads 127
803 Machine Learning Approach for Mutation Testing

Authors: Michael Stewart

Abstract:

Mutation testing is a type of software testing, proposed in the 1970s, in which program statements are deliberately changed to introduce simple errors so that test cases can be validated by checking whether they detect those errors. Test cases are executed against the mutant code; if a test fails, it has detected the error, giving confidence the program is correct. One major issue with this type of testing is that generating and testing all possible mutations becomes computationally intensive for complex programs. This paper used reinforcement learning and parallel processing within the context of mutation testing for the selection of mutation operators and test cases, which reduced the computational cost of testing and improved test suite effectiveness. Experiments were conducted using sample programs to determine how well the reinforcement learning-based algorithm performed with one live mutation, multiple live mutations, and no live mutations. The experiments, measured by mutation score, were used to update the algorithm and improve the accuracy of its predictions. The performance was then evaluated on multiple-processor computers. With reinforcement learning, the number of mutation operators utilized was reduced by 50–100%.
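The core mutation-testing loop (generate mutants, run the tests, count kills) can be sketched in a few lines; the toy example below flips `+` to `-` via Python's `ast` module and is only an illustration, not the paper's reinforcement-learning system:

```python
import ast

SRC = "def add(a, b):\n    return a + b\n"

def mutants(src):
    """Yield one compiled mutant per Add node, each with that Add
    flipped to Sub (a classic arithmetic-operator mutation)."""
    n_targets = sum(isinstance(n, ast.BinOp) and isinstance(n.op, ast.Add)
                    for n in ast.walk(ast.parse(src)))
    for idx in range(n_targets):
        tree = ast.parse(src)
        adds = [n for n in ast.walk(tree)
                if isinstance(n, ast.BinOp) and isinstance(n.op, ast.Add)]
        adds[idx].op = ast.Sub()
        yield compile(ast.fix_missing_locations(tree), "<mutant>", "exec")

def mutation_score(src, test):
    """Fraction of mutants 'killed', i.e. detected by a failing test."""
    killed = total = 0
    for code in mutants(src):
        total += 1
        env = {}
        exec(code, env)
        if not test(env["add"]):   # test fails on the mutant -> killed
            killed += 1
    return killed / total if total else 1.0

weak_test = lambda f: f(0, 0) == 0     # 0-0 == 0+0, mutant survives
strong_test = lambda f: f(2, 3) == 5   # 2-3 != 5, mutant killed
print(mutation_score(SRC, weak_test))
print(mutation_score(SRC, strong_test))
```

The score exposes exactly the weakness mutation testing targets: `weak_test` passes on the original code yet kills no mutants, while `strong_test` kills all of them.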

Keywords: automated-testing, machine learning, mutation testing, parallel processing, reinforcement learning, software engineering, software testing

Procedia PDF Downloads 198
802 Development & Standardization of a Literacy Free Cognitive Rehabilitation Program for Patients Post Traumatic Brain Injury

Authors: Sakshi Chopra, Ashima Nehra, Sumit Sinha, Harsimarpreet Kaur, Ravindra Mohan Pandey

Abstract:

Background: Cognitive rehabilitation aims to retrain brain-injured individuals with cognitive deficits to restore or compensate for lost functions. As illiterates and people with low literacy levels represent a significant proportion of the world's population, specific rehabilitation modules for such populations are indispensable. Literacy is significantly associated with all neuropsychological measures, and retraining programs widely use written or spoken techniques which essentially require the patient to read or write. So, the aim of the study was to develop and standardize a literacy-free neuropsychological rehabilitation program for improving cognitive functioning in patients with mild and moderate Traumatic Brain Injury (TBI). Several studies have pointed to the impairments seen in memory, executive functioning, and attention and concentration post-TBI, so the rehabilitation program focussed on these domains. Visual item memorization, stick constructions, symbol cancellations, and colouring techniques were used to construct the retraining program. Methodology: The development of the program consisted of planning, preparing, analyzing, and revising the different modules. The construction focussed on retraining immediate and delayed visual memory, planning ability, focused and divided attention, concentration, and response inhibition (to control irritability and aggression). A total of 98 home-based retraining modules were prepared in the 4 domains (42 for memory, 42 for executive functioning, 7 for attention and concentration, and 7 for response inhibition). The standardization was done on 20 healthy controls to review, select, and edit items. For each module, the time taken, errors made, and errors per second were noted down to establish the difficulty level of each module, and the modules were arranged in increasing level of difficulty over a period of 6 weeks. The retraining tasks were then administered to 11 brain-injured individuals (5 after mild TBI and 6 after moderate TBI).
These patients were referred from the Trauma Centre to the Clinical Neuropsychology OPD, All India Institute of Medical Sciences, New Delhi, India. Results: The time taken, errors made, and errors per second were analysed for all domains. Education levels were divided into illiterate, up to 10 years, 10 years to graduation, and graduation and above. Means and standard deviations were calculated. Between-group and within-group analyses were done using the t-test. The performance of the 20 healthy controls was analyzed, and a significant difference was observed only in the time taken for the attention tasks; all other domains showed non-significant differences in performance between education levels. Comparing the errors and time taken between the patient and control groups, there was a significant difference in all domains at the 0.01 level except for the errors made on executive functioning, indicating that the tool can successfully differentiate between healthy controls and patient groups. Conclusions: Apart from the time taken for symbol cancellations, the entire cognitive rehabilitation program is literacy-free. As it taps the major areas of impairment post-TBI, it could be a useful tool to rehabilitate patient populations with low literacy levels across the world. The next step, already underway, is to test its efficacy in improving cognitive functioning in a randomized controlled clinical trial.
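The errors-per-second index used to order the modules by difficulty is simple to compute; a toy sketch with hypothetical module logs:

```python
# Hypothetical standardization logs: (module_id, time_taken_s, errors_made)
logs = [("mem-01", 45.0, 1), ("mem-02", 60.0, 4), ("exec-01", 30.0, 3)]

def difficulty(time_s, errors):
    """Errors per second: the per-module difficulty index described above."""
    return errors / time_s

# Arrange modules in increasing level of difficulty
ranked = sorted(logs, key=lambda m: difficulty(m[1], m[2]))
print([m[0] for m in ranked])   # easiest to hardest
```

Ranking modules this way lets a program schedule the easiest modules first and progress to harder ones, as the 6-week arrangement in the abstract does.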

Keywords: cognitive rehabilitation, illiterates, India, traumatic brain injury

Procedia PDF Downloads 333