Search results for: dispensing errors
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 980

830 Assessment of Time-variant Work Stress for Human Error Prevention

Authors: Hyeon-Kyo Lim, Tong-Il Jang, Yong-Hee Lee

Abstract:

For an operator in a nuclear power plant, human error is one of the most dreaded factors that may result in unexpected accidents. The probability of human error may be low, but its risk would be enormous. Thus, for accident prevention, it is indispensable to analyze the influence of any factor that may raise the possibility of human error. Over the past decades, many studies have shown that the performance of human operators varies over time due to numerous factors. Among them, stress is known to be an indirect factor that may cause human errors and result in mental illness. Many assessment tools have been developed to assess the stress level of workers. However, their use for anticipating human performance, which is related to the possibility of human error, remains questionable, because they were developed mainly from the viewpoint of mental health rather than industrial safety. A person's stress level may rise or fall over work time; if these tools are to be applicable to safety, they should at least be able to assess the variation resulting from work time. Therefore, this study compared their applicability for safety purposes. More than ten work stress tools were analyzed with reference to assessment items, assessment and analysis methods, and follow-up measures, which are known to be closely related to work stress. The results showed that most tools placed their weights mainly on common organizational factors such as demands, support, and relationships, in that order, and that their weights were broadly similar. However, they failed to recommend practical solutions; instead, they merely advised setting up overall countermeasures in a PDCA cycle or risk management activities, which fall far short of practical human error prevention. It was therefore concluded that stress assessment tools developed mainly for mental health are impractical for safety purposes with respect to anticipating human performance, and that a new assessment tool is needed if one wants to assess stress level from the standpoint of human performance variation and accident prevention. As a practical countermeasure, this study proposes a new scheme for assessing the work stress level of a human operator, which may vary over work time and is closely related to the possibility of human error.

Keywords: human error, human performance, work stress, assessment tool, time-variant, accident prevention

Procedia PDF Downloads 670
829 Precise Determination of the Residual Stress Gradient in Composite Laminates Using a Configurable Numerical-Experimental Coupling Based on the Incremental Hole Drilling Method

Authors: A. S. Ibrahim Mamane, S. Giljean, M.-J. Pac, G. L’Hostis

Abstract:

Fiber reinforced composite laminates are particularly subject to residual stresses due to their heterogeneity and the complex chemical, mechanical and thermal mechanisms that occur during their processing. Residual stresses are now well known to cause damage accumulation, shape instability, and behavior disturbance in composite parts. Many works exist in the literature on techniques for minimizing residual stresses, mainly in thermosetting and thermoplastic composites. To study in depth the influence of processing mechanisms on the formation of residual stresses and to minimize them by establishing a reliable correlation, it is essential to be able to measure the residual stress profile in the composite very precisely. Residual stresses are also important data to consider when sizing composite parts and predicting their behavior. The incremental hole drilling method is very effective for measuring the residual stress gradient in composite laminates. The method is semi-destructive and consists of incrementally drilling a hole through the thickness of the material and measuring the relaxation strains around the hole for each increment using three strain gauges. These strains are then converted into residual stresses using a matrix of coefficients. These coefficients, called calibration coefficients, depend on the diameter of the hole and the dimensions of the gauges used. The reliability of incremental hole drilling depends on the accuracy with which the calibration coefficients are determined. These coefficients are calculated using a finite element model, in which the samples’ features and the experimental conditions must be considered. Any mismatch can lead to inadequate calibration coefficients, thus introducing errors in the residual stresses. Several calibration coefficient correction methods exist for isotropic materials, but there is a lack of information on this subject for composite laminates. In this work, a Python program was developed to automatically generate the adequate finite element model. This model allowed us to perform a parametric study to assess the influence of experimental errors on the calibration coefficients. The results highlighted the sensitivity of the calibration coefficients to the considered errors and gave an order of magnitude of the precision required of the experimental device to obtain reliable measurements. On the basis of these results, improvements to the experimental device were proposed. Furthermore, a numerical method was proposed to correct the calibration coefficients for different types of materials, including thick composite parts for which the analytical approach is too complex. This method consists of taking the experimental errors into account in the simulation. Accurate measurement of the experimental errors (such as eccentricity of the hole, angular deviation of the gauges from their theoretical positions, or errors in increment depth) is therefore necessary. The aim is to determine the residual stresses more precisely and to expand the validity domain of the incremental hole drilling technique.
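
To make the strain-to-stress conversion concrete, the sketch below (not the authors' code) shows how measured relaxation strains could be turned into residual stresses once a calibration coefficient matrix is available from the finite element model; the matrix values and strain readings are purely illustrative.

```python
import numpy as np

def residual_stress_from_strains(eps, C):
    """Convert measured relaxation strains into residual stresses.

    eps : vector of strains from the three gauges of the rosette,
          stacked increment by increment.
    C   : calibration coefficient matrix (from the finite element model)
          mapping the unknown stresses per increment to the strains.
    Returns the least-squares estimate of the stress components.
    """
    # Solve C @ sigma = eps in the least-squares sense; with incremental
    # hole drilling, C is typically lower block-triangular because the
    # stress released in increment i only affects strains measured from
    # that depth onwards.
    sigma, *_ = np.linalg.lstsq(C, eps, rcond=None)
    return sigma

# Illustrative use with hypothetical numbers (2 increments, 3 gauges each):
C = np.array([[-0.12, -0.04, -0.02,  0.00,  0.00,  0.00],
              [-0.05, -0.11, -0.03,  0.00,  0.00,  0.00],
              [-0.03, -0.05, -0.10,  0.00,  0.00,  0.00],
              [-0.15, -0.06, -0.03, -0.11, -0.04, -0.02],
              [-0.07, -0.13, -0.04, -0.05, -0.10, -0.03],
              [-0.04, -0.06, -0.12, -0.03, -0.05, -0.09]])
eps = np.array([8e-6, 5e-6, 3e-6, 15e-6, 11e-6, 7e-6])
print(residual_stress_from_strains(eps, C))
```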

Keywords: fiber reinforced composites, finite element simulation, incremental hole drilling method, numerical correction of the calibration coefficients, residual stresses

Procedia PDF Downloads 132
828 Multilayer Neural Network and Fuzzy Logic Based Software Quality Prediction

Authors: Sadaf Sahar, Usman Qamar, Sadaf Ayaz

Abstract:

In the software development lifecycle, quality prediction techniques are of prime importance for minimizing future design errors and expensive maintenance. Many techniques have been proposed by various researchers, but with the increasing complexity of the software lifecycle model, it is crucial to develop a flexible system that can cater for the factors that ultimately affect the quality of the end product. These factors include properties of the software development process and of the product, along with its operating conditions. In this paper, a neural network (perceptron) based software quality prediction technique is proposed. Using this technique, stakeholders can predict the quality of the resulting software during the early phases of the lifecycle, saving the time and resources otherwise spent on eliminating design errors and on costly maintenance. The technique can be brought into practical use through successful training.
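
As a minimal illustration of the neural part of such a technique (the fuzzy-logic component is not shown), the hedged sketch below trains a small multilayer perceptron on hypothetical process and product metrics; the feature set, labels, and layer sizes are assumptions, not the authors' configuration.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

# Hypothetical per-module metrics: [lines of code, cyclomatic complexity,
# review hours, past defect count]; label 1 = low quality, 0 = acceptable.
X = np.array([[1200, 35, 2, 9], [300, 8, 5, 1], [900, 22, 3, 4],
              [150, 4, 6, 0], [2000, 50, 1, 12], [400, 10, 4, 2],
              [1100, 30, 2, 7], [250, 6, 5, 1]])
y = np.array([1, 0, 1, 0, 1, 0, 1, 0])

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25,
                                                    random_state=0)

# A small multilayer perceptron; layer sizes are illustrative only.
clf = MLPClassifier(hidden_layer_sizes=(8, 4), max_iter=2000, random_state=0)
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```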

Keywords: software quality, fuzzy logic, perceptron, prediction

Procedia PDF Downloads 317
827 Integrating Deterministic and Probabilistic Safety Assessment to Decrease Risk & Energy Consumption in a Typical PWR

Authors: Ebrahim Ghanbari, Mohammad Reza Nematollahi

Abstract:

Integrating deterministic and probabilistic safety assessment (IDPSA) is one of the most widely used approaches in the safety analysis of power plant accidents. It is also recognized today that the role of human error in creating these accidents is no smaller than that of systemic errors, so both human interventions and system errors must be included in the fault and event sequences. The integration of these analyses is reflected in the core damage frequency, as well as in the study of water resource usage during an accident such as the loss of all electrical power of the plant. In this regard, the station blackout (SBO) accident was simulated for a pressurized water reactor as the deterministic analysis, and by analyzing the operator's behavior in controlling the accident, the results of the combined deterministic and probabilistic assessment were identified. The results showed that the best performance of the plant operator would reduce the risk of an accident by 10% and decrease the consumption of the plant's water sources by 6.82 liters per second.

Keywords: IDPSA, human error, SBO, risk

Procedia PDF Downloads 129
826 Towards a Complete Automation Feature Recognition System for Sheet Metal Manufacturing

Authors: Bahaa Eltahawy, Mikko Ylihärsilä, Reino Virrankoski, Esko Petäjä

Abstract:

Sheet metal processing is automated, but the step from product models to production machine control still requires human intervention. This may cause time-consuming bottlenecks in the production process and increase the risk of human errors. In this paper we present a system which automatically recognizes features from the CAD model of the sheet metal product. Using these features, the system produces a complete model of the particular sheet metal product, which is then used as input for the sheet metal processing machine. The system is currently implemented, capable of recognizing more than 11 of the most common sheet metal structural features, and the procedure is fully automated. This provides remarkable savings in production time and protects against human errors. The paper presents the developed system architecture, the applied algorithms, and the system software implementation and testing.

Keywords: feature recognition, automation, sheet metal manufacturing, CAD, CAM

Procedia PDF Downloads 354
825 Enzymatic Repair Prior To DNA Barcoding, Aspirations, and Restraints

Authors: Maxime Merheb, Rachel Matar

Abstract:

Retrieving ancient DNA sequences, which in turn permits whole-genome sequencing from fossils, has improved extraordinarily in recent years thanks to sequencing technology and other methodological advances. Nevertheless, the quest for ancient DNA is still obstructed by the damage inflicted on DNA, which accumulates after the death of a living organism. This damage falls into three main categories: (i) physical abnormalities, such as strand breaks, which lead to the presence of short DNA fragments; (ii) modified bases (mainly cytosine deamination), which cause errors in the sequence due to the incorporation of a false nucleotide during DNA amplification; and (iii) DNA modifications referred to as blocking lesions, which halt PCR extension and in turn affect the amplification and sequencing process. The issues arising from breakage and coding errors have been significantly reduced in recent years: fast sequencing of short DNA fragments has been empowered by high-throughput sequencing platforms, and most coding errors were found to be the consequence of cytosine deamination, which can easily be removed from the DNA using enzymatic treatment. The methodology for repairing DNA sequences is still in development; it can basically be described as the process of reintroducing cytosine rather than uracil, and it is thus restricted to amplifiable DNA molecules. Eliminating every type of damage (particularly those that block PCR) still awaits complete repair methodologies, so DNA detection right after extraction is highly needed. Before investing resources in extensive, possibly unjustified, and uncertain repair techniques, it is vital to distinguish between two possible situations: (i) there is no DNA to be amplified to begin with, and the sample is therefore completely unrepairable; or (ii) the DNA is refractory to PCR and is worth repairing and amplifying. Hence, it is extremely important to develop a non-enzymatic technique to detect the most degraded DNA.

Keywords: ancient DNA, DNA barcoding, enzymatic repair, PCR

Procedia PDF Downloads 400
824 Improved Pitch Detection Using Fourier Approximation Method

Authors: Balachandra Kumaraswamy, P. G. Poonacha

Abstract:

Automatic music information retrieval has been one of the challenging topics of research for a few decades now, with several interesting approaches reported in the literature. In this paper we develop a pitch extraction method based on a finite Fourier series approximation to the given window of samples, estimating pitch as the fundamental period of that approximation. The method uses an analysis of the strength of the harmonics present in the signal to reduce octave as well as harmonic errors. Its performance is compared with three of the best-known methods for pitch extraction, namely the Yin, Windowed Special Normalization of the Auto-Correlation Function, and Harmonic Product Spectrum methods. Our study with artificially created signals as well as music files shows that the Fourier approximation method gives a much better estimate of pitch with fewer octave and harmonic errors.
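
A minimal sketch of the underlying idea, assuming a brute-force search over candidate fundamentals, is given below: for each candidate f0, a finite Fourier series built on its harmonics is fitted by least squares, and the candidate with the smallest residual is taken as the pitch. The frequency range, step, and number of harmonics are illustrative, not the authors' settings.

```python
import numpy as np

def estimate_pitch(x, fs, f_min=80.0, f_max=800.0, n_harmonics=5):
    """Pick the candidate f0 whose finite Fourier series best fits the window."""
    t = np.arange(len(x)) / fs
    best_f0, best_err = None, np.inf
    for f0 in np.arange(f_min, f_max, 1.0):
        # Design matrix of cosines/sines at harmonics of the candidate f0.
        cols = [np.ones_like(t)]
        for k in range(1, n_harmonics + 1):
            cols.append(np.cos(2 * np.pi * k * f0 * t))
            cols.append(np.sin(2 * np.pi * k * f0 * t))
        A = np.column_stack(cols)
        coef, *_ = np.linalg.lstsq(A, x, rcond=None)
        err = np.sum((x - A @ coef) ** 2)
        if err < best_err:
            best_f0, best_err = f0, err
    return best_f0

# Synthetic test: a 220 Hz tone with two overtones.
fs = 8000
t = np.arange(0, 0.05, 1 / fs)
x = (np.sin(2 * np.pi * 220 * t) + 0.4 * np.sin(2 * np.pi * 440 * t)
     + 0.2 * np.sin(2 * np.pi * 660 * t))
print(estimate_pitch(x, fs))   # expected near 220 Hz
```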

Keywords: pitch, Fourier series, Yin, normalization of the auto-correlation function, harmonic product, mean square error

Procedia PDF Downloads 412
823 Application of a Universal Distortion Correction Method in Stereo-Based Digital Image Correlation Measurement

Authors: Hu Zhenxing, Gao Jianxin

Abstract:

Stereo-based digital image correlation (also referred to as three-dimensional (3D) digital image correlation (DIC)) is a technique for measuring both the 3D shape and the surface deformation of a component, which has found increasing applications in academia and industry. The accuracy of the reconstructed coordinates depends on many factors such as the configuration of the setup, stereo-matching, distortion, etc. Most of these factors have been investigated in the literature. For instance, the configuration of a binocular vision system determines the systematic errors, and the stereo-matching errors depend on the speckle quality and the matching algorithm, which can only be controlled within a limited range. The distortion is non-linear, particularly in a complex imaging acquisition system, so distortion correction should be considered carefully. Moreover, the distortion function is difficult to formulate with conventional models in complex imaging acquisition systems where microscopes and other complex lenses are involved, and the errors of the distortion correction propagate to the reconstructed 3D coordinates. To address the problem, an accurate mapping method based on 2D B-spline functions is proposed in this study. The mapping functions are used to convert the distorted coordinates onto an ideal plane without distortions. This approach is suitable for any image acquisition distortion model. It is used as a prior step that converts a distorted coordinate to its ideal position, which enables the camera to conform to the pin-hole model. A procedure for applying this approach to stereo-based DIC is presented. Using 3D speckle image generation, numerical simulations were carried out to compare the accuracy of the conventional method and the proposed approach.
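
The following hedged sketch illustrates the general idea of a 2D B-spline mapping from distorted to ideal image coordinates, assuming a calibration target for which both sets of coordinates are known; the synthetic distortion model and the SciPy spline routine are assumptions for illustration, not the authors' implementation.

```python
import numpy as np
from scipy.interpolate import SmoothBivariateSpline

# Calibration data (hypothetical): detected grid points in the distorted
# image (xd, yd) and their known ideal positions (xi, yi) on a flat target.
rng = np.random.default_rng(0)
xi, yi = np.meshgrid(np.linspace(0, 100, 15), np.linspace(0, 80, 12))
xi, yi = xi.ravel(), yi.ravel()
r2 = (xi - 50) ** 2 + (yi - 40) ** 2
xd = xi + 1e-4 * r2 * (xi - 50) / 50 + rng.normal(0, 0.02, xi.size)
yd = yi + 1e-4 * r2 * (yi - 40) / 40 + rng.normal(0, 0.02, yi.size)

# Two cubic B-spline mapping functions: distorted (xd, yd) -> ideal (x, y).
map_x = SmoothBivariateSpline(xd, yd, xi, kx=3, ky=3)
map_y = SmoothBivariateSpline(xd, yd, yi, kx=3, ky=3)

# Any point measured in the distorted image can now be corrected before
# stereo triangulation, so that the camera conforms to the pin-hole model.
x_corr = map_x.ev(xd, yd)
y_corr = map_y.ev(xd, yd)
print("residual RMS:",
      np.sqrt(np.mean((x_corr - xi) ** 2 + (y_corr - yi) ** 2)))
```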

Keywords: distortion, stereo-based digital image correlation, B-spline, 3D, 2D

Procedia PDF Downloads 498
822 Design of a Pneumonia Ontology for Diagnosis Decision Support System

Authors: Sabrina Azzi, Michal Iglewski, Véronique Nabelsi

Abstract:

Diagnostic errors are frequent and constitute one of the most important patient safety problems today. One of the main objectives of our work is to propose an ontological representation that takes the diagnostic criteria into account in order to improve diagnosis. We chose pneumonia because it is one of the frequent diseases affected by diagnostic errors, with harmful effects on patients. To achieve our aim, we use a semi-automated method to integrate diverse knowledge sources that include publicly available pneumonia disease guidelines from international repositories, biomedical ontologies, and electronic health records, following the principles of the Open Biomedical Ontologies (OBO) Foundry. The resulting ontology covers symptoms and signs, all the types of pneumonia, antecedents, pathogens, and diagnostic testing. The first evaluation results show that most of the terms are covered by the ontology. This work is still in progress and represents a first and major step toward the development of a diagnosis decision support system for pneumonia.

Keywords: clinical decision support system, diagnostic errors, ontology, pneumonia

Procedia PDF Downloads 188
821 A Tutorial on Model Predictive Control for Spacecraft Maneuvering Problem with Theory, Experimentation and Applications

Authors: O. B. Iskender, K. V. Ling, V. Dubanchet, L. Simonini

Abstract:

This paper discusses the recent advances and future prospects of spacecraft position and attitude control using Model Predictive Control (MPC). First, the challenges of space missions are summarized, in particular the errors, uncertainties, and constraints imposed by the mission, the spacecraft, and the onboard processing capabilities. Space mission errors and uncertainties are grouped into categories: initial condition errors, unmodeled disturbances, and sensor and actuator errors. The constraints are classified into two categories: physical and geometric constraints. Real-time implementation capability is then discussed with regard to the required computation time and the impact of sensor and actuator errors, based on Hardware-In-The-Loop (HIL) experiments. The rationales behind the scenarios are presented in the scope of space applications such as formation flying, attitude control, rendezvous and docking, rover steering, and precision landing. The objectives of these missions are explained, and the generic constrained MPC problem formulations are summarized. Three key elements of MPC design are discussed: the prediction model, the constraint formulation, and the objective cost function. The prediction models can be linear time-invariant or time-varying depending on the geometry of the orbit, whether circular or elliptic. The constraints can be given as linear inequalities on inputs or outputs, which can be written in the same form. Moreover, recent convexification techniques for non-convex geometric constraints (e.g., plume impingement, Field-of-View (FOV)) are presented in detail. Next, different objectives are provided in a mathematical framework and explained accordingly. Because MPC implementation relies on finding, in real time, the solution to constrained optimization problems, computational aspects are also examined. In particular, high-speed implementation capabilities and HIL challenges are presented for representative space avionics, covering an analysis of future space processors as well as the sensor and actuator requirements derived from the HIL experiment outputs. The HIL tests are investigated for kinematic and dynamic cases, using robotic arms and floating robots respectively. Finally, the proposed algorithms and experimental setups are introduced and compared with the authors' previous work and future plans. The paper concludes with the conjecture that the MPC paradigm is a promising framework at the crossroads of space applications and could be further advanced based on the challenges and unaddressed gaps mentioned throughout the paper.
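
As a minimal illustration of the generic constrained MPC formulation described above (prediction model, input constraints, quadratic cost), the sketch below solves a single horizon for a one-axis double-integrator relative-motion model with CVXPY; the dynamics, weights, and bounds are illustrative assumptions rather than any mission-specific design.

```python
import numpy as np
import cvxpy as cp

# Discrete double-integrator model (one axis of relative motion), dt = 1 s.
dt = 1.0
A = np.array([[1.0, dt], [0.0, 1.0]])
B = np.array([[0.5 * dt ** 2], [dt]])
N = 20                       # prediction horizon
x0 = np.array([100.0, 0.0])  # 100 m away, at rest
u_max = 0.1                  # thrust acceleration bound (m/s^2)

x = cp.Variable((2, N + 1))
u = cp.Variable((1, N))
Q = np.diag([1.0, 10.0])     # state weights (position, velocity)
R = np.array([[100.0]])      # control effort weight

cost = 0
constraints = [x[:, 0] == x0]
for k in range(N):
    cost += cp.quad_form(x[:, k + 1], Q) + cp.quad_form(u[:, k], R)
    constraints += [x[:, k + 1] == A @ x[:, k] + B @ u[:, k],  # dynamics
                    cp.abs(u[:, k]) <= u_max]                  # input bound

prob = cp.Problem(cp.Minimize(cost), constraints)
prob.solve()
print("first control move:", u.value[:, 0])
```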

Keywords: convex optimization, model predictive control, rendezvous and docking, spacecraft autonomy

Procedia PDF Downloads 110
820 Design Optimization of a Compact Quadrupole Electromagnet for CLS 2.0

Authors: Md. Armin Islam, Les Dallin, Mark Boland, W. J. Zhang

Abstract:

This paper reports a study on the optimal magnetic design of a compact quadrupole electromagnet for the Canadian Light Source (CLS 2.0). The aim of the design is to obtain a quadrupole with low relative higher-order harmonics and good field quality. The design problem was formulated as an optimization model in which the objective function is the higher-order harmonics (multipole errors) and the variable to be optimized is the material distribution on the pole. The higher-order harmonics arise in the quadrupole because the ideal hyperbola is truncated at a certain point to make the pole. In this project, the resulting harmonics have been minimized both transversely and longitudinally by adjusting the material on the poles in a controlled way. The optimization was carried out using finite element analysis (FEA). Lower higher-order harmonic amplitudes and better field quality have been achieved through the optimization. On the basis of the optimized magnetic design, electrical and cooling calculations have been performed for the magnet.

Keywords: drift, electrical, and cooling calculation, integrated field, magnetic field gradient, multipole errors, quadrupole

Procedia PDF Downloads 143
819 Effects of Manufacture and Assembly Errors on the Output Error of Globoidal Cam Mechanisms

Authors: Shuting Ji, Yueming Zhang, Jing Zhao

Abstract:

The output error of the globoidal cam mechanism can be considered as a relevant indicator of mechanism performance, because it determines kinematic and dynamical behavior of mechanical transmission. Based on the differential geometry and the rigid body transformations, the mathematical model of surface geometry of the globoidal cam is established. Then we present the analytical expression of the output error (including the transmission error and the displacement error along the output axis) by considering different manufacture and assembly errors. The effects of the center distance error, the perpendicular error between input and output axes and the rotational angle error of the globoidal cam on the output error are systematically analyzed. A globoidal cam mechanism which is widely used in automatic tool changer of CNC machines is applied for illustration. Our results show that the perpendicular error and the rotational angle error have little effects on the transmission error but have great effects on the displacement error along the output axis. This study plays an important role in the design, manufacture and assembly of the globoidal cam mechanism.

Keywords: globoidal cam mechanism, manufacture error, transmission error, automatic tool changer

Procedia PDF Downloads 574
818 Advanced Digital Manufacturing: Case Study

Authors: Abdelrahman Abdelazim

Abstract:

Most industries are looking for technologies that are easy to use, efficient, and fast. To achieve this, factories tend to use advanced systems that can turn complexity into simplicity and the rudimentary into the advanced. Cloud manufacturing is a new movement that aims to mirror and integrate cloud computing into manufacturing. Among the various advantages of cloud manufacturing are decreased human involvement and increased reliance on automated machines, which in turn decreases human errors and increases efficiency. Reliable, high-performance processes with minimal errors are highly desired by today's manufacturers. At first glance cloud manufacturing seems to be the best alternative; however, the implementation of a cloud system can be very challenging. This work investigates cloud manufacturing in detail and outlines its advantages and disadvantages by converting a local factory in Kuwait to a cloud-ready system. Initially, the flow of the factory's manufacturing process was analyzed, identifying the bottlenecks and illustrating how cloud manufacturing can eliminate them. Following this, an automation process was analyzed and implemented. A comparison between the process before and after the adaptation was carried out, showing the effects on the cost, the output, and the efficiency of the process.

Keywords: cloud manufacturing, automation, Kuwait industrial sector, advanced digital manufacturing

Procedia PDF Downloads 771
817 Pathological Gambling and Impulsivity: Comparison of the Eight Laboratory Measures of Inhibition Capacities

Authors: Semion Kertzman, Pinhas Dannon

Abstract:

Impulsive behaviour and the underlying brain processes are hypothesized to be central in the development and maintenance of pathological gambling. Inhibition ability can be differentially impaired in pathological gamblers (PGs). Aims: This study aimed to compare the ability of eight widely used inhibition measures to discriminate between PGs and healthy controls (HCs). Methods: PGs (N=51) and demographically matched HCs (N=51) performed cognitive inhibition (the Stroop), motor inhibition (the Go/NoGo) and reflective inhibition (the Matching Familiar Figures (MFFT)) tasks. Results: An augmented total interference response time in the Stroop task (η² =0.054), a large number of commission errors (η² =0.053) in the Go/NoGo task, and the total number of errors in the MFFT (η² =0.05) can discriminate PGs from HCs. Other measures are unable to differentiate between PGs and HCs. No significant correlations were observed between inhibition measures. Conclusion: Inhibition measures varied in the ability to discriminate PGs from HCs. Most inhibition measures were not relevant to gambling behaviour. PGs do not express rash, impulsive behaviour, such as quickly choosing an answer without thinking. In contrast, in PGs, inhibition impairment was related to slow-inaccurate performance.

Keywords: pathological gambling, impulsivity, neurocognition, addiction

Procedia PDF Downloads 302
816 Error Analysis of the Pronunciation of English Consonants and Arabic Consonants by Egyptian Learners

Authors: Marwa A. Nasser

Abstract:

This empirical study investigates the most significant errors of Egyptian learners in producing English consonants and Arabic consonants, and offers advice on how these can be remedied. The study adopts a descriptive approach, and the analysis is based on audio recordings of two groups of people. The first group includes six Egyptian volunteers from the English Department at the Faculty of Women who learn English as a foreign language. The other group includes six Egyptian learners who are studying Tajweed (how to recite the Quran correctly). The audio recordings were examined and the sounds analyzed in an attempt to highlight the most common errors made by the learners while reading English or reading (or reciting) the Quran. Results show that the two groups of learners have problems with certain phonemic contrasts. Both groups share common errors even though the two languages are different and unrelated (e.g., pre-aspiration of fortis stops, incorrect articulation of consonants, and velarization of certain sounds).

Keywords: consonant articulations, Egyptian learners of English, Egyptian learners of Quran, empirical study, error analysis, pronunciation problems

Procedia PDF Downloads 269
815 6-Degree-Of-Freedom Spacecraft Motion Planning via Model Predictive Control and Dual Quaternions

Authors: Omer Burak Iskender, Keck Voon Ling, Vincent Dubanchet, Luca Simonini

Abstract:

This paper presents a Guidance and Control (G&C) strategy to approach and synchronize with potentially rotating targets. The proposed strategy generates and tracks a safe trajectory for space servicing missions, including tasks like approaching, inspecting, and capturing. The main objective of this paper is to validate the G&C laws using a Hardware-In-the-Loop (HIL) setup with realistic rendezvous and docking equipment. Throughout this work, the assumption of full relative state feedback is relaxed by using onboard sensors that introduce realistic errors and delays, and the proposed closed-loop approach demonstrates robustness to this challenge. Moreover, the G&C blocks are unified via the Model Predictive Control (MPC) paradigm, and the coupling between translational and rotational motion is addressed via a dual quaternion based kinematic description. G&C is formulated as a convex optimization problem in which constraints such as thruster limits and output constraints are explicitly handled. Furthermore, the Monte Carlo method is used to evaluate the robustness of the proposed method to initial condition errors, uncertainty in the target's motion and attitude, and actuator errors. A capture scenario is tested with a robotic test bench whose onboard sensors estimate the position and orientation of a drifting satellite through camera imagery. Finally, the approach is compared with currently used robust H-infinity controllers and with the guidance profile provided by the industrial partner. The HIL experiments demonstrate that the proposed strategy is a potential candidate for future space servicing missions because 1) the algorithm is real-time implementable, as convex programming offers deterministic convergence properties and guarantees a finite-time solution, 2) critical physical and output constraints are respected, 3) robustness to sensor errors and uncertainties in the system is proven, and 4) translational motion is coupled with rotational motion.

Keywords: dual quaternion, model predictive control, real-time experimental test, rendezvous and docking, spacecraft autonomy, space servicing

Procedia PDF Downloads 146
814 Evaluating Forecasts Through Stochastic Loss Order

Authors: Wilmer Osvaldo Martinez, Manuel Dario Hernandez, Juan Manuel Julio

Abstract:

We propose to assess the performance of k forecast procedures by exploring the distributions of forecast errors and error losses. We argue that non-systematic forecast errors are minimized when their distributions are symmetric and unimodal, and that forecast accuracy should be assessed through stochastic loss order rather than expected loss order, which is the way it is customarily done in previous work. Moreover, since forecast performance evaluation can be understood as a one-way analysis of variance, we propose to explore loss distributions under two circumstances: when a strict (but unknown) joint stochastic order exists among the losses of all forecast alternatives, and when such an order holds only among subsets of the alternative procedures. In spite of the fact that loss stochastic order is stronger than loss moment order, our proposals are at least as powerful as competing tests and are robust to the correlation, autocorrelation, and heteroskedasticity settings they consider. In addition, since our proposals do not require samples of the same size, their scope is wider, and because they test the whole loss distribution instead of just loss moments, they can also be used to study forecast distributions. We illustrate the usefulness of our proposals by evaluating a set of real-world forecasts.
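
The notion of comparing whole loss distributions rather than mean losses can be illustrated with the hedged sketch below, which checks on a grid of quantiles whether one procedure's losses are stochastically smaller than another's; this is a simple empirical check for illustration, not the authors' test, and the simulated errors are hypothetical.

```python
import numpy as np

def stochastically_smaller(loss_a, loss_b, probs=np.linspace(0.05, 0.95, 19)):
    """Check, on a grid of quantiles, whether loss_a is stochastically smaller
    than loss_b (every quantile of loss_a lies below the same quantile of
    loss_b). Sample sizes need not match."""
    return bool(np.all(np.quantile(loss_a, probs) <= np.quantile(loss_b, probs)))

rng = np.random.default_rng(1)
errors_a = rng.normal(0.0, 1.0, 5000)   # symmetric, unimodal forecast errors
errors_b = rng.normal(0.5, 1.5, 4000)   # biased, more dispersed errors

# Compare squared-error losses; True means procedure A is preferred at every
# inspected quantile, a stronger statement than comparing mean losses alone.
print(stochastically_smaller(errors_a ** 2, errors_b ** 2))
```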

Keywords: forecast evaluation, stochastic order, multiple comparison, non parametric test

Procedia PDF Downloads 89
813 The Influence of Cognitive Load in the Acquisition of Words through Sentence or Essay Writing

Authors: Breno Barreto Silva, Agnieszka Otwinowska, Katarzyna Kutylowska

Abstract:

Research comparing lexical learning following the writing of sentences and of longer texts with keywords is limited and contradictory. One possibility is that the recursivity of writing may enhance processing and increase lexical learning; another is that the higher cognitive load of complex-text writing (e.g., essays), at least when timed, may hinder the learning of words. In our study, we selected two sets of 10 academic keywords matched for part of speech, length (number of characters), frequency (SUBTLEXus), and concreteness, and we asked 90 L1-Polish advanced-level English majors to use the keywords when writing sentences, timed essays (60 minutes), or untimed essays. First, all participants wrote a timed control essay (60 minutes) without keywords. Then different groups produced Timed essays (60 minutes; n=33), Untimed essays (n=24), or Sentences (n=33) using the two sets of glossed keywords (counterbalanced). The comparability of the participants in the three groups was ensured by matching them for proficiency in English (LexTALE) and for several measures derived from the control essay: VocD (assessing productive lexical diversity), normed errors (assessing productive accuracy), words per minute (assessing productive written fluency), and holistic scores (assessing overall quality of production). We measured lexical learning (depth and breadth) via an adapted Vocabulary Knowledge Scale (VKS) and a free association test. Cognitive load was measured in the three essays (Control, Timed, Untimed) using the normed number of errors and holistic scores (TOEFL criteria). The number of errors and essay scores were obtained from two raters (interrater reliability Pearson's r=.78-.91). Generalized linear mixed models showed no difference in the breadth and depth of keyword knowledge after writing Sentences, Timed essays, and Untimed essays. The task-based measurements showed that Control and Timed essays had similar holistic scores, but that Untimed essays were of better quality than Timed essays; Untimed essays were also the most accurate, and Timed essays the most error-prone. In conclusion, using keywords in Timed, but not Untimed, essays increased cognitive load, leading to more errors and lower quality. Still, writing sentences and essays yielded similar lexical learning, and differences in cognitive load between Timed and Untimed essays did not affect lexical acquisition.

Keywords: learning academic words, writing essays, cognitive load, English as an L2

Procedia PDF Downloads 73
812 Using Real Truck Tours Feedback for Address Geocoding Correction

Authors: Dalicia Bouallouche, Jean-Baptiste Vioix, Stéphane Millot, Eric Busvelle

Abstract:

When researchers or logistics software developers deal with vehicle routing optimization, they mainly focus on minimizing the total distance travelled or the total time spent on the tours by the trucks, and on maximizing the number of visited customers. They assume that the upstream data used to optimize a transporter's tours are free from errors, including the customers' real constraints, the customers' addresses, and their GPS coordinates. However, in real transport operations, upstream data are often of poor quality because of address geocoding errors and the irrelevance of the addresses received through EDI (Electronic Data Interchange). Geocoders are not exempt from errors and can return incorrect GPS coordinates; and even with a good geocoder, an inaccurate address can lead to bad geocoding. For instance, when the geocoder has trouble geocoding an address, it returns the coordinates of the city center. Another obvious geocoding issue is that the maps used by geocoders are not regularly updated, so new buildings may not exist on the maps until the next update. Trying to optimize tours with incorrect customer GPS coordinates, which are the most important and basic input data for solving a vehicle routing problem, is therefore of little use and leads to bad, incoherent solution tours, because the customer locations used for the optimization are very different from their real positions. Our work is supported by a logistics software vendor, Tedies, and a transport company, Upsilon, and we work with Upsilon's truck route data to carry out our experiments. These trucks are equipped with TOMTOM GPS units that continuously record their tour data (positions, speeds, tachograph information, etc.), which we retrieve to extract the real truck routes. The aim of this work is to use the driver's experience and the feedback from real truck tours to validate the GPS coordinates of well-geocoded addresses and to correct the badly geocoded ones. In this way, when a vehicle makes its tour, it should have trouble finding a given customer's address at most once; in other words, the vehicle would be wrong at most once for each customer's address. Our method significantly improves the quality of the geocoding: we are able to automatically correct an average of 70% of the GPS coordinates of a tour's addresses, and the remaining coordinates are corrected manually, with the system giving the user indications to help with the correction. This study shows the importance of taking the trucks' feedback into account to gradually correct address geocoding errors. Indeed, the accuracy of a customer's address and its GPS coordinates plays a major role in tour optimization, yet address writing errors are very frequent. This feedback is naturally and usually exploited by transporters (by asking drivers, calling customers, etc.) to learn about their tours and correct the upcoming ones; we develop a method to do a large part of that automatically.
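
A hedged sketch of the core correction step is shown below: each customer's geocoded position is compared with the position where the truck actually stopped for that customer, and the geocode is replaced (or flagged for manual review) when the distance exceeds a threshold. The threshold, data, and function names are illustrative assumptions, not the production system.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 points."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def correct_geocoding(customers, observed_stops, threshold_m=150.0):
    """customers: {id: (lat, lon)} geocoded; observed_stops: {id: (lat, lon)}
    positions where the truck actually stopped (from the on-board GPS).
    Returns corrected coordinates plus the ids left for manual review."""
    corrected, to_review = {}, []
    for cid, geo in customers.items():
        stop = observed_stops.get(cid)
        if stop is None:
            to_review.append(cid)            # never visited: cannot validate
            corrected[cid] = geo
        elif haversine_m(*geo, *stop) > threshold_m:
            corrected[cid] = stop            # trust the driver's actual stop
        else:
            corrected[cid] = geo             # geocoding confirmed by the tour
    return corrected, to_review

# Hypothetical example: customer B was geocoded far from the real stop.
customers = {"A": (47.322, 5.041), "B": (47.340, 5.030)}
observed_stops = {"A": (47.3222, 5.0412), "B": (47.3121, 5.0689)}
print(correct_geocoding(customers, observed_stops))
```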

Keywords: driver experience feedback, geocoding correction, real truck tours

Procedia PDF Downloads 674
811 A New Approach to the Boom Welding Technique by Determining Seam Profile Tracking

Authors: Muciz Özcan, Mustafa Sacid Endiz, Veysel Alver

Abstract:

In this paper we present a new approach to boom welding for mobile crane manufacturing, implementing a new method to obtain homogeneous welding quality and reduced energy usage during boom production. We aim to achieve the same welding quality in every region of the boom during the manufacturing process and to detect possible welding errors so that they can be eliminated, using laser sensors. Our system determines the position of the welding region directly, and with the help of the welding oscillator we are able to perform a proper boom weld. Errors that may occur in the welding process can be observed through monitoring and eliminated by an operator. The major modification in the production of the crane booms will be the form of the booms. Although conventionally more than one weld is required for this process, with the suggested concept a single weld is sufficient, which is more energy- and environment-friendly. Consequently, as only one weld is needed to manufacture the boom, the quality of that weld becomes all the more essential. To ensure the welding quality, a welding manipulator was designed and fabricated. By using this welding manipulator, the risks posed to the operator and the surroundings by dangerous gases formed during the welding process are diminished as much as possible.

Keywords: boom welding, seam tracking, energy saving, global warming

Procedia PDF Downloads 346
810 Direct Phoenix Identification and Antimicrobial Susceptibility Testing from Positive Blood Culture Broths

Authors: Waad Al Saleemi, Badriya Al Adawi, Zaaima Al Jabri, Sahim Al Ghafri, Jalila Al Hadhramia

Abstract:

Objectives: Using standard laboratory methods, a positive blood culture requires a minimum of two days (two overnight incubations) to obtain a final identification (ID) and antimicrobial susceptibility testing (AST) report. In this study, we aimed to evaluate the accuracy and precision of identification and antimicrobial susceptibility testing of an alternative method (direct method) that reduces the turnaround time by 24 hours. This method involves the direct inoculation of positive blood culture broths into the Phoenix system using serum separation tubes (SST). Method: This prospective study included monomicrobial positive blood cultures obtained from January 2022 to May 2023 in SQUH. Blood cultures containing a mixture of organisms, fungi, or anaerobic organisms were excluded. The results of the new “direct method” under study were compared with those of the current “standard method” used in the lab. Accuracy and precision were evaluated for ID and AST using Clinical and Laboratory Standards Institute (CLSI) recommendations. The categorical agreement, essential agreement, and the rates of very major errors (VME), major errors (ME), and minor errors (MIE) for both gram-negative and gram-positive bacteria were calculated. Passing criteria were set according to CLSI. Results: ID and AST results were available for a total of 158 isolates. Of 77 isolates of gram-negative bacteria, 71 (92%) were correctly identified at the species level; of 70 isolates of gram-positive bacteria, 47 (67%) were correctly identified. For gram-negative bacteria, the essential agreement of the direct method was ≥92% compared with the standard method, while the categorical agreement was ≥91% for all tested antibiotics. The precision of ID and AST was 100% for all tested isolates. For gram-positive bacteria, the essential agreement was >93%, while the categorical agreement was >92% for all tested antibiotics except moxifloxacin. Several antibiotics were noted to have an unacceptably high rate of very major errors, including penicillin, cotrimoxazole, clindamycin, ciprofloxacin, and moxifloxacin; however, no errors were observed in the results of vancomycin, linezolid, and daptomycin. Conclusion: The direct method of ID and AST for positive blood cultures using SST is reliable for gram-negative bacteria. It will significantly decrease the turnaround time and will facilitate antimicrobial stewardship.
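
For illustration, the sketch below shows how categorical agreement and error rates could be tallied from paired interpretation calls (reference method vs. direct method); the paired calls are hypothetical, and for simplicity the error rates are computed over all isolates, whereas CLSI computes VME over resistant and ME over susceptible isolates.

```python
def ast_agreement(pairs):
    """pairs: list of (reference_call, direct_call) with calls in {'S','I','R'}.
    Returns categorical agreement and simplified error rates."""
    n = len(pairs)
    ca = sum(ref == test for ref, test in pairs)
    vme = sum(ref == 'R' and test == 'S' for ref, test in pairs)  # very major
    me = sum(ref == 'S' and test == 'R' for ref, test in pairs)   # major
    mie = sum((ref == 'I') != (test == 'I') and ref != test
              for ref, test in pairs)                             # minor
    # Note: rates below use all isolates as the denominator for simplicity.
    return {'categorical_agreement': ca / n,
            'VME_rate': vme / n, 'ME_rate': me / n, 'minor_rate': mie / n}

# Hypothetical paired results for one antibiotic:
pairs = [('S', 'S'), ('S', 'S'), ('R', 'R'), ('R', 'S'),
         ('I', 'S'), ('S', 'I'), ('S', 'S'), ('R', 'R')]
print(ast_agreement(pairs))
```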

Keywords: bloodstream infection, Oman, direct AST, blood culture, rapid identification, antimicrobial susceptibility, Phoenix, direct inoculation

Procedia PDF Downloads 62
809 ESL Students’ Engagement with Written Corrective Feedback

Authors: Khaled Karim

Abstract:

Although a large number of studies have examined the effectiveness of written corrective feedback (WCF) in L2 writing, very few studies have investigated students’ attitudes towards the feedback and their perspectives regarding the usefulness of different types of feedback. Using prompted stimulated recall interviews, this study investigated ESL students’ perceptions and attitudes towards the CF they received as well as their preferences and reactions to the corrections. 24 ESL students first received direct (e.g., providing target forms after crossing out erroneous forms) and indirect (e.g., underlining and underline+metalinguistic) CF on four written tasks and then participated in an interview with the researcher. The analysis revealed that both direct and indirect CF were judged to be useful strategies for correction but in different ways. Underline only CF helped them think about the nature and type of the errors they made while metalinguistic CF was useful as it provided clues about the nature and type of the errors. Most participants indicated that indirect correction needed sufficient prior knowledge of the form to be effective. The majority of the students found the combination of underlining with metalinguistic information as the most effective method of providing feedback. Detailed findings will be presented, and pedagogical implications of the study will be discussed.

Keywords: ESL writing, error correction, feedback, written corrective feedback

Procedia PDF Downloads 236
808 Applying Simulation-Based Digital Teaching Plans and Designs in Operating Medical Equipment

Authors: Kuo-Kai Lin, Po-Lun Chang

Abstract:

Background: The Emergency Care Research Institute released a list of the top 10 medical technology hazards for 2017, with the following hazard topping the list: ‘infusion errors can be deadly if simple safety steps are overlooked.’ In addition, hospitals use various assessment items to evaluate the safety of their medical equipment, confirming the importance of medical equipment safety. In recent years, the topic of patient safety has garnered increasing attention. Accordingly, various agencies have established patient safety-related committees to coordinate, collect, and analyze information regarding abnormal events associated with medical practice, and activities to promote and improve employee training have been introduced to diminish the recurrence of medical malpractice. Objective: To allow nursing personnel to acquire the skills needed to operate common medical equipment, and to update and review such skills whenever necessary, in order to elevate the quality of medical care and reduce patient injuries caused by medical equipment operation errors. Method: A quasi-experimental design was adopted, and nurses from a regional teaching hospital were selected as the study sample. Online videos demonstrating the operation of common medical equipment were produced, and quick response codes were designed so that nursing personnel could quickly access the videos when necessary. Senior nursing supervisors and equipment experts were invited to formulate a ‘Scale-based Questionnaire for Assessing Nursing Personnel’s Operational Knowledge of Common Medical Equipment’ to evaluate the nursing personnel’s literacy regarding the operation of the medical equipment. From March to October 2017, an employee training on medical equipment operation and a practice course (simulation course) were implemented, after which the effectiveness of the training and practice course was assessed. Results: Prior to and after the training and practice course, the 66 participating nurses scored 58 and 87 on ‘operational knowledge of common medical equipment,’ respectively (a statistically significant difference; t = -9.407, p < .001); 53.5 and 86.3 on ‘operational knowledge of 12-lead electrocardiography’ (z = -2.087, p < .01), respectively; 40 and 79.5 on ‘operational knowledge of cardiac defibrillators’ (z = -3.849, p < .001), respectively; 90 and 98 on ‘operational knowledge of Abbott pumps’ (z = -1.841, p = 0.066), respectively; and 8.7 and 13.7 on ‘perceived competence’ (a statistically significant difference; t = -2.77, p < .05). Medical equipment operation errors were observed in the participating hospital in both 2016 and 2017; however, since the implementation of the intervention, no medical equipment operation errors had been observed up to October 2017, which can be regarded as a secondary outcome of this study. Conclusion: In this study, innovative teaching strategies were adopted to effectively enhance the professional literacy and skills of nursing personnel in operating medical equipment. The training and practice course also elevated the nursing personnel’s related literacy and perceived competence in operating medical equipment. The nursing personnel were thus able to operate the medical equipment accurately and avoid operational errors that might jeopardize patient safety.

Keywords: medical equipment, digital teaching plan, simulation-based teaching plan, operational knowledge, patient safety

Procedia PDF Downloads 138
807 An Alternative Stratified Cox Model for Correlated Variables in Infant Mortality

Authors: K. A. Adeleke

Abstract:

In epidemiological research, introducing a stratified Cox model can often account for interactions of some inherent factors with major, observable factors. This work aimed at modelling correlated variables in infant mortality in the presence of inherent factors affecting the infant survival function. An alternative semiparametric stratified Cox model is proposed with a view to accommodating multilevel factors that interact with others. The model was used to analyse infant mortality data from the Nigeria Demographic and Health Survey (NDHS), with multilevel factors (tetanus, polio, and breastfeeding) correlated with the main factors (sex, size, and mode of delivery). Asymptotic properties of the estimators are also studied via simulation. When fitted to the data, the model showed good fit and performed differently depending on the levels of the interaction with the strata variable Z*. Evidence that the baseline hazard functions and regression coefficients are not the same from stratum to stratum provides a gain in information compared with the ordinary Cox model. Simulation results showed that the proposed method produced better estimates in terms of bias, lower standard errors, and mean square errors.
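
For context, the hedged sketch below fits a standard stratified Cox model with the lifelines package, letting the stratum variable carry its own baseline hazard; the data and column names are hypothetical, and this illustrates the conventional stratified Cox rather than the alternative estimator proposed in the paper.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical infant-survival records: time in months, death indicator,
# main covariates, and a stratification factor (e.g. breastfeeding group).
df = pd.DataFrame({
    "time":      [12, 3, 24, 7, 36, 1, 18, 30, 5, 27, 9, 15],
    "death":     [1, 1, 0, 1, 0, 1, 0, 0, 1, 0, 1, 0],
    "sex":       [0, 1, 0, 1, 1, 0, 1, 0, 0, 1, 1, 0],
    "caesarean": [0, 0, 1, 0, 0, 1, 0, 1, 0, 0, 1, 0],
    "breastfed": [1, 0, 1, 0, 1, 0, 1, 1, 0, 1, 0, 1],  # stratum variable
})

cph = CoxPHFitter()
# Each level of 'breastfed' receives its own baseline hazard, so its
# interaction with time does not have to be modelled explicitly.
cph.fit(df, duration_col="time", event_col="death", strata=["breastfed"])
cph.print_summary()
```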

Keywords: stratified Cox, semiparametric model, infant mortality, multilevel factors, confounding variables

Procedia PDF Downloads 557
806 Smart Card Technology Adaption in a Hospital Setting

Authors: H. K. V. Narayan

Abstract:

This study was conducted at Tata Memorial Hospital (TMH), Mumbai, India, to evaluate the impact of adopting a Smart Card (SC) for clinical and business transactions in order to reduce lead times and to enforce the business rules of the hospital. The objective of implementing the Smart Card was to improve patients' perception of quality in terms of structure, process, and outcomes, and also to improve the productivity of the institution. The Smart Card was implemented in phases from 2011 and integrated with the Hospital Information System (HIS/EMR). The implementation was a learning curve for all the stakeholders, as the software obviated the need for hard copies of transactions, and acceptance by the stakeholders was a challenge in change management. The study assessed the impact three years into the implementation, and the observed trends suggest that the Smart Card has decreased the lead times for services and increased the number of transactions, and thereby the productivity. Patients who used to complain of multiple queues and cumbersome transactions now compliment the administration for its effective use of information and communication technology.

Keywords: smart card, high availability of health care information, reduction in potential medical errors due to elimination of transcription errors, reduction in number of queues, increased transactions, augmentation of revenue

Procedia PDF Downloads 285
805 A Real Time Ultra-Wideband Location System for Smart Healthcare

Authors: Mingyang Sun, Guozheng Yan, Dasheng Liu, Lei Yang

Abstract:

Driven by the demand for intelligent monitoring in rehabilitation centers and hospitals, a high-accuracy real-time location system based on UWB (ultra-wideband) technology is proposed. The system measures the precise location of a specific person, traces his movement, and visualizes his trajectory on screen for doctors or administrators. Doctors can therefore view the position of a patient at any time and find them immediately and exactly when an emergency happens. During the design process, different algorithms were discussed and their errors analyzed. In addition, we discuss a simple but effective way of correcting the antenna delay error. By choosing the best algorithm and correcting errors with the corresponding methods, the system attained good accuracy. Experiments indicated that the ranging error of the system is lower than 7 cm, the locating error is lower than 20 cm, and the refresh rate exceeds 5 times per second. In future work, by embedding the system in wearable IoT (Internet of Things) devices, it could provide not only physical parameters but also the activity status of the patient, which would greatly help doctors in providing healthcare.
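
A minimal sketch of the basic multilateration step behind such a system is given below: given range measurements from fixed UWB anchors, the tag position is obtained by linearized least squares. The anchor layout, noise level, and 2D simplification are illustrative assumptions; a real system would additionally handle antenna delay calibration and non-line-of-sight errors as discussed above.

```python
import numpy as np

def locate(anchors, ranges):
    """Least-squares 2D position from ranges to >= 3 fixed anchors.

    Linearize by subtracting the first anchor's range equation from the
    others, which removes the quadratic unknown terms.
    """
    anchors = np.asarray(anchors, dtype=float)
    d = np.asarray(ranges, dtype=float)
    A = 2 * (anchors[1:] - anchors[0])
    b = (d[0] ** 2 - d[1:] ** 2
         + np.sum(anchors[1:] ** 2, axis=1) - np.sum(anchors[0] ** 2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# Hypothetical room with four anchors (metres) and noisy range measurements.
anchors = [(0.0, 0.0), (8.0, 0.0), (8.0, 6.0), (0.0, 6.0)]
true_pos = np.array([3.0, 2.5])
rng = np.random.default_rng(0)
ranges = [np.linalg.norm(true_pos - np.array(a)) + rng.normal(0, 0.05)
          for a in anchors]
print(locate(anchors, ranges))   # should be close to (3.0, 2.5)
```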

Keywords: intelligent monitoring, ultra-wideband technology, real-time location, IoT devices, smart healthcare

Procedia PDF Downloads 140
804 Peer Corrective Feedback on Written Errors in Computer-Mediated Communication

Authors: S. H. J. Liu

Abstract:

This paper explores the role of peer corrective feedback (CF) in improving the written production of English-as-a-foreign-language (EFL) learners who work together via Wikispaces. It attempts to determine the effect of peer CF on formal accuracy in English, such as grammar and lexis. Thirty-four EFL learners at the tertiary level were randomly assigned to the experimental (with peer feedback) or the control (without peer feedback) group; each group was subdivided into small groups of two or three, resulting in six and seven small groups in the experimental and control conditions, respectively. In the experimental group, each learner acted both as an assessor (providing feedback to others) and as an assessee (receiving feedback from others), and each participant was asked to compose a written text and revise it based on the feedback. In the control group, on the other hand, learners neither provided nor received feedback but composed and revised their written work on their own. Data collected from learners' compositions and post-task interviews were analyzed and reported in this study. Following the completion of three writing tasks, 10 participants were selected and interviewed individually regarding their perception of collaborative learning in the computer-mediated communication (CMC) environment. The language aspects analyzed included lexis (e.g., appropriate use of words), verb tenses (e.g., present and past simple), prepositions (e.g., in, on, and between), nouns, and articles (e.g., a/an). Feedback types consisted of corrective, affective, suggestive, and didactic feedback. Frequencies of feedback types and the accuracy of the language aspects were calculated. The results first suggested that more accurate items were found in the experimental group than in the control group, indicating that those who worked collaboratively outperformed those who worked non-collaboratively on the accuracy of linguistic aspects. Furthermore, corrective feedback (corrections directly related to linguistic errors) was the most frequently employed type, whereas affective and didactic feedback were the least used by the experimental group. The results further indicated that most participants perceived peer CF as helpful in improving language accuracy and demonstrated a favorable attitude toward working with others in the CMC environment. Moreover, some participants stated that when they provided feedback to their peers, they tended to pay attention to linguistic errors in their peers' work but overlooked their own errors (e.g., past simple tense) when writing. Finally, L2 or FL teachers and practitioners are encouraged to employ CMC technologies to train their students to give each other feedback in writing, in order to improve language accuracy and to motivate students to attend to the language system.

Keywords: peer corrective feedback, computer-mediated communication (CMC), second or foreign language (L2 or FL) learning, Wikispaces

Procedia PDF Downloads 245
803 Market Illiquidity and Pricing Errors in the Term Structure of CDS

Authors: Lidia Sanchis-Marco, Antonio Rubia, Pedro Serrano

Abstract:

This paper studies the informational content of pricing errors in the term structure of sovereign CDS spreads. The residuals from a no-arbitrage model are employed to construct a price discrepancy estimate, or noise measure. The noise estimate is understood as an indicator of market distress and reflects frictions such as illiquidity. Empirically, the noise measure is computed for an extensive panel of CDS spreads. Our results reveal that an important fraction of systematic risk is not priced in default swap contracts. When projecting the noise measure onto a set of financial variables, the panel-data estimates show that greater price discrepancies are systematically related to a higher level of offsetting transactions in CDS contracts. This evidence suggests that arbitrage capital flows exit the marketplace during times of distress, which is consistent with a market segmentation between investors and arbitrageurs in which professional arbitrageurs are particularly ineffective at bringing prices back to their fundamental values during turbulent periods. Our empirical findings are robust across the most common CDS pricing models employed in the industry.

Keywords: credit default swaps, noise measure, illiquidity, capital arbitrage

Procedia PDF Downloads 569
802 An Error Analysis of English Communication of Suan Sunandha Rajabhat University Students

Authors: Chantima Wangsomchok

Abstract:

The main purposes of this study are (1) to test students' communicative competence in six main functions: greeting, parting, thanking, offering, requesting, and suggesting; (2) to carry out an error analysis of the students' communicative competence in those functions; and (3) to compare the characteristics of the errors found in the investigation. The subjects of the study are 328 first-year undergraduates taking the Foundation English course in the first semester of the 2008 academic year at Suan Sunandha Rajabhat University. The study found that while the subjects showed high communicative competence in the use of three functions (greeting, thanking, and offering), they showed poor communicative competence in suggesting, requesting, and parting. In addition, grammatical errors were most frequently found in the parting function and least frequently in the thanking and requesting functions, respectively. The students instead tended to show high rates of pragmatic failure in the use of the greeting and suggesting functions.

Keywords: error analysis, functions of English language, communicative competence, cognitive science

Procedia PDF Downloads 431
801 Data Integrity: Challenges in Health Information Systems in South Africa

Authors: T. Thulare, M. Herselman, A. Botha

Abstract:

Poor system use, including inappropriate design of health information systems, causes difficulties in communication with patients and increases the time spent by healthcare professionals in recording the necessary health information for medical records. System features like pop-up reminders, complex menus, and poor user interfaces can make medical records far more time consuming than paper cards, as well as affect decision-making processes. Although errors associated with health information, and their real and likely effects on the quality of care and patient safety, have been documented for many years, more research is needed to measure the occurrence of these errors and determine their causes so that solutions can be implemented. Therefore, the purpose of this paper is to identify data integrity challenges in hospital information systems through a scoping review and, based on the results, to provide recommendations on how to manage them. Only 34 papers out of the 297 publications initially identified in the field were found suitable. The results indicated that human and computerized systems are the most common challenges associated with data integrity, and that factors such as policy, environment, health workforce, and lack of awareness contribute to these challenges; if measures are taken, however, the data integrity challenges can be managed.

Keywords: data integrity, data integrity challenges, hospital information systems, South Africa

Procedia PDF Downloads 181