Search results for: recurrent errors
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1271

1091 Outcomes of Pregnancy in Women with TPO Positive Status after Appropriate Dose Adjustments of Thyroxin: A Prospective Cohort Study

Authors: Revathi S. Rajan, Pratibha Malik, Nupur Garg, Smitha Avula, Kamini A. Rao

Abstract:

This study aimed to analyse pregnancy outcomes in patients with TPO positivity after appropriate L-thyroxin supplementation with close surveillance. All pregnant women attending the antenatal clinic at Milann - The Fertility Center, Bangalore, India, from Aug 2013 to Oct 2014 whose booking TSH was more than 2.5 mIU/L were included, along with pregnant women with prior hypothyroidism who were TPO positive. Those with TPO positive status were vigorously managed with appropriate thyroxin supplementation, and the doses were readjusted every 3 to 4 weeks until delivery. Women with recurrent pregnancy loss were also tested for TPO positivity and, if positive, were monitored serially with TSH and fT4 levels every 3 to 4 weeks and appropriately supplemented with thyroxin when the levels fluctuated. Testing was done after informed consent in all these women. The statistical software packages SAS 9.2, SPSS 15.0, Stata 10.1, MedCalc 9.0.1, Systat 12.0 and the R environment ver. 2.11.1 were used for the analysis of the data. 460 pregnant women were screened for thyroid dysfunction at booking, of which 52% were hypothyroid. The majority (31.08%) were subclinically hypothyroid and the remaining were overt. 25% of the total number of patients screened were TPO positive. The pregnancy complications observed in the TPO positive women were gestational glucose intolerance [60%], threatened abortion [21%], midtrimester abortion [4.3%], premature rupture of membranes [4.3%], cervical funneling [4.3%] and fetal growth restriction [3.5%]. 95.6% of the patients who followed up till the end delivered beyond 30 weeks. 42.6% of these patients had a previous history of recurrent abortions or adverse obstetric outcomes, and 21.7% of the delivered babies required NICU admission.
Obstetric outcomes in our study in terms of midtrimester abortions, placental abruption, and preterm delivery improved after close monitoring of the thyroid hormone (TSH and fT4) levels every 3 to 4 weeks with appropriate dose adjustment throughout pregnancy. Euthyroid women with TPO positive status enrolled in the study incidentally were those with recurrent abortions/infertility and required thyroxin supplements due to elevated thyroid hormone (TSH, fT4) levels during the course of their pregnancy. Significant associations were found with age >30 years and hyperhomocysteinemia [p=0.017], recurrent pregnancy loss or previous adverse obstetric outcomes [p=0.067] and APLA [p=0.029]. TPO antibody levels >600 IU/ml were significantly associated with the development of gestational hypertension [p=0.041] and fetal growth restriction [p=0.082]. Euthyroid women with TPO positivity were also screened periodically to counter fluctuations of the thyroid hormone levels with appropriate thyroxin supplementation. Thus, early identification along with aggressive management of thyroid dysfunction, and stratification of these patients based on their TPO status with appropriate thyroxin supplementation beginning in the first trimester, will aid risk modulation and help avert complications.

Keywords: TPO antibody, subclinical hypothyroidism, antinuclear antibody, thyroxin

Procedia PDF Downloads 309
1090 A Secure Proxy Signature Scheme with Fault Tolerance Based on RSA System

Authors: H. El-Kamchouchi, Heba Gaber, Fatma Ahmed, Dalia H. El-Kamchouchi

Abstract:

Due to the rapid growth of modern communication systems, fault tolerance and data security are two important issues in a secure transaction. During the transmission of data between the sender and receiver, errors may occur frequently. The sender must then re-transmit the data to the receiver in order to correct these errors, which weakens the system. To improve the scalability of such schemes, we present a secure proxy signature scheme with fault tolerance over an efficient and secure authenticated key agreement protocol based on the RSA cryptosystem. Authenticated key agreement protocols play an important role in building a secure communications network between two parties.
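The paper's scheme itself is not reproduced here, but the underlying RSA signing primitive it builds on can be sketched with textbook toy parameters; real deployments use padded hashes and moduli of at least 2048 bits, and the parameters below are purely illustrative:

```python
import hashlib

# Toy RSA parameters, for illustration only (real systems use >= 2048-bit moduli).
p, q = 61, 53
n = p * q                  # modulus: 3233
phi = (p - 1) * (q - 1)    # 3120
e = 17                     # public exponent
d = pow(e, -1, phi)        # private exponent (2753); modular inverse needs Python 3.8+

def sign(message: bytes) -> int:
    """Hash the message, reduce mod n, and apply the private exponent (textbook RSA)."""
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(h, d, n)

def verify(message: bytes, signature: int) -> bool:
    """Recover the hash with the public exponent and compare."""
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(signature, e, n) == h

sig = sign(b"transfer 100")
assert verify(b"transfer 100", sig)
```

In a proxy signature scheme, an additional delegation step lets the proxy derive its own signing key from the original signer's warrant; that machinery is omitted here.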

Keywords: proxy signature, fault tolerance, RSA, key agreement protocol

Procedia PDF Downloads 270
1089 Performance Assessment of GSO Satellites before and after Enhancing the Pointing Effect

Authors: Amr Emam, Joseph Victor, Mohamed Abd Elghany

Abstract:

The paper presents the effect of orbit inclination on the pointing error of the satellite antenna, and consequently on its footprint on Earth, for a typical Ku-band payload system. The performance assessment is examined both theoretically and by means of practical measurements, taking into account all additional sources of pointing error, such as East-West station keeping, orbit eccentricity, and actual attitude control performance. An implementation and computation of the sinusoidal biases in satellite roll and pitch used to compensate the pointing error of the satellite antenna coverage is studied and evaluated before and after the pointing corrections are performed. A method for evaluating the performance of the implemented biases is introduced, based on measuring the satellite received level from an 11 m tracking antenna and a fixed 4.8 m transmitting antenna before and after the implementation of the pointing corrections.

Keywords: satellite, inclined orbit, pointing errors, coverage optimization

Procedia PDF Downloads 375
1088 Deep Learning Framework for Predicting Bus Travel Times with Multiple Bus Routes: A Single-Step Multi-Station Forecasting Approach

Authors: Muhammad Ahnaf Zahin, Yaw Adu-Gyamfi

Abstract:

Bus transit is a crucial component of transportation networks, especially in urban areas. Any intelligent transportation system must have accurate real-time information on bus travel times, since it minimizes waiting times for passengers at different stations along a route, improves service reliability, and significantly optimizes travel patterns. Bus agencies must enhance the quality of their information service to serve their passengers better and draw in more travelers, since people waiting at bus stops are frequently anxious about when the bus will arrive at their starting point and when it will reach their destination. To address this issue, different models have been developed recently for predicting bus travel times, but most of them are focused on smaller road networks due to their relatively subpar performance in high-density urban areas on a vast network. This paper develops a deep learning-based architecture using a single-step multi-station forecasting approach to predict average bus travel times for numerous routes, stops, and trips on a large-scale network using heterogeneous bus transit data collected from the GTFS database. Over one week, data was gathered from multiple bus routes in Saint Louis, Missouri. In this study, a Gated Recurrent Unit (GRU) neural network was used to predict the mean vehicle travel times for different hours of the day for multiple stations along multiple routes. The historical time steps and prediction horizon were set to 5 and 1, respectively, meaning that five hours of historical average travel time data were used to predict the average travel time for the following hour. The spatial and temporal information and the historical average travel times were captured from the dataset as model input parameters. As adjacency matrices for the spatial input parameters, the station distances and sequence numbers were used, and the time of day (hour) was considered for the temporal inputs.
Other inputs, including volatility information such as standard deviation and variance of journey durations, were also included in the model to make it more robust. The model's performance was evaluated based on a metric called mean absolute percentage error (MAPE). The observed prediction errors for various routes, trips, and stations remained consistent throughout the day. The results showed that the developed model could predict travel times more accurately during peak traffic hours, having a MAPE of around 14%, and performed less accurately during the latter part of the day. In the context of a complicated transportation network in high-density urban areas, the model showed its applicability for real-time travel time prediction of public transportation and ensured the high quality of the predictions generated by the model.
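As a minimal sketch of the single-step framing described above (the travel times below are hypothetical; the actual study trains a GRU network on GTFS-derived features), the sliding-window construction and the MAPE metric can be written as:

```python
def make_windows(series, history=5, horizon=1):
    """Slice a series into (input window, target) pairs: `history` past
    values predict the value `horizon` steps after the window."""
    pairs = []
    for i in range(len(series) - history - horizon + 1):
        x = series[i:i + history]
        y = series[i + history + horizon - 1]
        pairs.append((x, y))
    return pairs

def mape(actual, predicted):
    """Mean absolute percentage error, the evaluation metric used in the study."""
    return 100.0 * sum(abs(a - p) / abs(a) for a, p in zip(actual, predicted)) / len(actual)

# Hypothetical hourly average travel times (minutes) for one station.
travel_times = [12.0, 13.5, 15.0, 18.2, 17.1, 14.8, 13.0, 12.4]
pairs = make_windows(travel_times)  # five past hours -> next hour
```

Each `(x, y)` pair would be one training example; the study feeds such windows, alongside spatial adjacency and time-of-day inputs, into the GRU.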

Keywords: gated recurrent unit, mean absolute percentage error, single-step forecasting, travel time prediction

Procedia PDF Downloads 57
1087 Response of Buildings with Soil-Structure Interaction with Varying Soil Types

Authors: Shreya Thusoo, Karan Modi, Rajesh Kumar, Hitesh Madahar

Abstract:

Over the years, it has been extensively established that the practice of assuming a structure to be fixed at its base leads to gross errors in the evaluation of its overall response to dynamic loading and to overestimations in design. The extent of these errors depends on a number of variables, soil type being one of the major factors. This paper studies the effect of soil-structure interaction (SSI) on multi-storey buildings with varying underlying soil types, after proper validation of the effect of SSI. Analyses for soft, stiff, and very stiff base soils were carried out using the finite element method (FEM) software package ANSYS v14.5. The results lead to some very important conclusions regarding time period, deflection, and acceleration responses.

Keywords: dynamic response, multi-storey building, soil-structure interaction, varying soil types

Procedia PDF Downloads 464
1086 Capacity Estimation of Hybrid Automated Repeat Request Protocol for Low Earth Orbit Mega-Constellations

Authors: Arif Armagan Gozutok, Alper Kule, Burak Tos, Selman Demirel

Abstract:

A wireless communication chain requires effective ways to keep throughput efficiency high while suffering location-dependent, time-varying burst errors. Several techniques have been developed to ensure that the receiver recovers the transmitted information without errors. The most fundamental approaches are error checking and correction, alongside re-transmission of non-acknowledged packets. In this paper, stop-and-wait (SAW) and chase combining (CC) hybrid automated repeat request (HARQ) protocols are compared and analyzed in terms of throughput and average delay for the low Earth orbit (LEO) mega-constellation use case. Several assumptions and technological implementations are considered, as well as the usage of low-density parity check (LDPC) codes together with several constellation orbit configurations.
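The basic stop-and-wait throughput trade-off can be sketched under simplifying assumptions (independent packet errors, fixed round-trip time, coding and processing overheads ignored); the link numbers below are hypothetical, not taken from the paper:

```python
def expected_transmissions(p_err: float) -> float:
    """Expected number of (re)transmissions per packet under stop-and-wait
    with independent packet error probability p_err (geometric distribution)."""
    return 1.0 / (1.0 - p_err)

def saw_throughput(rate_bps: float, payload_bits: int, rtt_s: float, p_err: float) -> float:
    """Idealized SAW throughput: one payload delivered per successful cycle,
    where each cycle lasts (transmission time + RTT) on average."""
    cycle = payload_bits / rate_bps + rtt_s
    return payload_bits / (cycle * expected_transmissions(p_err))

# Hypothetical LEO-like numbers: 10 Mbps link, 10 kbit packets, 30 ms RTT, 10% packet errors.
tput = saw_throughput(10e6, 10_000, 0.030, 0.1)  # roughly 290 kbps
```

The sketch makes the SAW weakness on long-delay LEO links visible: the idle RTT in every cycle dominates, which is one motivation for comparing against chase combining, where retransmissions are soft-combined at the receiver.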

Keywords: HARQ, LEO, satellite constellation, throughput

Procedia PDF Downloads 132
1085 Pitfalls and Drawbacks in Visual Modelling of Learning Knowledge by Students

Authors: Tatyana Gavrilova, Vadim Onufriev

Abstract:

The design of knowledge-based systems requires the developer to possess advanced analytical skills. The efficient development of those skills within university courses needs a deep understanding of the main pitfalls and drawbacks that students typically exhibit during their analytical work in the form of visual modeling. It was therefore necessary to analyze fifth-year students' learning exercises within the courses 'Intelligent Systems' and 'Knowledge Engineering' at Saint Petersburg Polytechnic University. The analysis shows that both a lack of system thinking skills and methodological mistakes in course design cause the errors discussed in the paper. The conclusion explores the issues and topics necessary and sufficient for implementing improved practices in educational design for future curricula of teaching programs.

Keywords: knowledge based systems, knowledge engineering, students’ errors, visual modeling

Procedia PDF Downloads 297
1084 Human Factors Interventions for Risk and Reliability Management of Defence Systems

Authors: Chitra Rajagopal, Indra Deo Kumar, Ila Chauhan, Ruchi Joshi, Binoy Bhargavan

Abstract:

Reliability and safety are essential for the success of mission-critical and safety-critical defense systems. Humans are part of the entire life cycle of defense systems development and deployment, and the majority of industrial accidents or disasters are attributed to human errors. Therefore, considerations of human performance and human reliability are critical in all complex systems, including defense systems. Defense systems operating from ground, naval, and aerial platforms in diverse conditions impose unique physical and psychological challenges on human operators. Some of the safety- and mission-critical defense systems with human-machine interactions are fighter planes, submarines, warships, combat vehicles, and missiles launched from aerial and naval platforms. Human roles and responsibilities are also going through a transition due to the infusion of artificial intelligence and cyber technologies. Human operators not accustomed to such challenges are more likely to commit errors, which may lead to accidents or loss events. In such a scenario, it is imperative to understand the human factors in defense systems for better system performance, safety, and cost-effectiveness. A case study using a Task Analysis (TA) based methodology for the assessment and reduction of human errors in an air and missile defense system, in the context of emerging technologies, is presented. Action-oriented task analysis techniques such as Hierarchical Task Analysis (HTA) and the Operator Action Event Tree (OAET), along with the Critical Action and Decision Event Tree (CADET) for cognitive task analysis, were used. Human factors assessment based on task analysis helps in realizing safe and reliable defense systems. These techniques helped in the identification of human errors during different phases of air and missile defense operations, meeting the requirement of a safe, reliable, and cost-effective mission.

Keywords: defence systems, reliability, risk, safety

Procedia PDF Downloads 118
1083 Problems of ICT Adoption in Nigerian Small and Medium Scale Enterprises

Authors: Ajayi Adeola

Abstract:

The study examined the sources of revenue in Osun State. It determined the impact of revenue consultants on the internally generated revenue of the Osun State Government, with a view to surveying the expenditure pattern of the state. In the course of carrying out the study, data were collected primarily through the interview method: four principal officers in the financial sector were interviewed. Secondary data were collected from the audited reports and financial statements of Osun State of Nigeria for the years ended 31st December, 1997 to 2006. The data generated were analyzed using percentages and pie charts for illustration. The findings revealed that the sources of revenue for the Osun State Government included internally generated revenue (IGR), statutory allocation, value added tax (VAT), and capital receipts. Statutory allocation was the dominant source of government revenue during the period of study, accounting for 63.69%, while IGR was 19.7%, VAT 8.07%, and capital receipts 8.48%. The study also found that recurrent expenditure exceeded capital expenditure during the period of study in a ratio of roughly 7:3, and that the state recorded surplus budgets seven times and deficit budgets in 2003 and 2004. The study concluded that the Osun State government was over-dependent on external sources to finance recurrent and capital expenditure during the period of study.

Keywords: information communication technology, ICT adoption, ICT solution, small and medium scale enterprises

Procedia PDF Downloads 380
1082 Leveraging the Power of Dual Spatial-Temporal Data Scheme for Traffic Prediction

Authors: Yang Zhou, Heli Sun, Jianbin Huang, Jizhong Zhao, Shaojie Qiao

Abstract:

Traffic prediction is a fundamental problem in urban environments, facilitating the smart management of various businesses, such as taxi dispatching, bike relocation, and stampede alerts. Most earlier methods rely on identifying an intrinsic spatial-temporal correlation to forecast. However, the complex nature of this problem calls for a more sophisticated solution that can simultaneously capture the mutual influence of both adjacent and far-flung areas, with time-dimension information also incorporated seamlessly. To tackle this difficulty, we propose a new multi-phase architecture, DSTDS (Dual Spatial-Temporal Data Scheme for traffic prediction), that aims to reveal the underlying relationships that determine future traffic trends. First, a graph-based neural network with an attention mechanism is devised to obtain the static features of the road network. Then, a multi-granularity recurrent neural network is built in conjunction with the knowledge from a grid-based model. Subsequently, the preceding output is fed into a spatial-temporal super-resolution module. With this three-phase structure, we carry out extensive experiments on several real-world datasets to demonstrate the effectiveness of our approach, which surpasses several state-of-the-art methods.

Keywords: traffic prediction, spatial-temporal, recurrent neural network, dual data scheme

Procedia PDF Downloads 101
1081 Phenotypic and Genotypic Diagnosis of Gaucher Disease in Algeria

Authors: S. Hallal, Z. Chami, A. Hadji-Lehtihet, S. Sokhal-Boudella, A. Berhoune, L. Yargui

Abstract:

Gaucher disease is the most common lysosomal storage disease in our population. It is due to a deficiency of acid β-glucosidase; the enzyme deficiency causes a pathological accumulation of undegraded substrate in lysosomes. This metabolic overload is responsible for a multisystemic disease with hepatosplenomegaly, anemia, thrombocytopenia, and bone involvement; neurological involvement is rare. The laboratory diagnosis of Gaucher disease consists of phenotypic diagnosis, by determining the enzymatic activity of β-glucosidase with a fluorimetric method, and genotypic diagnosis, by studying the GBA gene with the search limited to the recurrent mutations (N370S, L444P, 84GG) using PCR followed by enzymatic digestion. Abnormal profiles were verified by sequencing. Monitoring of treated patients is provided by the determination of chitotriosidase. Our experience, spanning a period of 6 years (2007-2014), has enabled us to diagnose 78 patients out of a total of 328 requests from the departments of pediatrics, internal medicine, and neurology. Genotypic diagnosis covered the entire families of 9 children treated at the pediatric unit of CHU Mustapha, which helped define the clinical form; 5 of them had type III disease, carrying the L444P mutation in the homozygous state. Three others were compound heterozygotes (N370S/L444P, or N370S with a mutation not covered in our study), and in only one family was no recurrent mutation found. This molecular study permits the screening of heterozygous carriers, which is essential for genetic counseling.

Keywords: Gaucher disease, mutations, N370S, L444P

Procedia PDF Downloads 390
1080 Endometrioma Ethanol Sclerotherapy

Authors: Lamia Bensissaid

Abstract:

Goals: Endometriosis affects 6 to 10% of women of childbearing age; 17 to 44% of them have ovarian endometriomas. Medical and surgical treatments represent the two therapeutic axes, with which assisted reproduction can be combined. Laparoscopic intraperitoneal ovarian cystectomy is described as the reference technique in the management of endometriomas by learned societies (CNGOF, ESHRE, NICE). However, it leads to a significant short-term reduction in the AMH level and the number of antral follicles, especially in cases of bilateral cystectomy, large cyst size, or cystectomy after recurrence. Often, the disease is at an advanced stage, and many patients have already undergone several surgeries. Most have adhesions, which increase the risk of surgical complications and suboptimal resection and, therefore, recurrence of the cyst. These results led to a change of opinion towards a conservative approach. Sclerotherapy is an old technique which acts by fibrinoid necrosis; it consists of injecting a sclerosing agent into the cyst cavity. Results: Recurrence was less than 15% over a 12-month follow-up; these rates are comparable to those of surgery. It does not seem to have a negative impact on ovarian reserve, but this has not been sufficiently evaluated. It offers an advantage in IVF pregnancy rates compared to cystectomy, particularly in cases of recurrent endometriomas. Its advantages are that it can be done on an outpatient basis; it is inexpensive; it avoids sometimes difficult and iterative surgery; it allows an increase in pregnancy rates and the preservation of the ovarian reserve compared to iterative surgery; and it is of great interest in cases of bilateral endometriomas (kissing ovaries) or recurrent endometriomas. Conclusions: Ethanol sclerotherapy could be a good alternative to surgery.

Keywords: endometrioma, sclerotherapy, infertility, ethanol

Procedia PDF Downloads 44
1079 Study of Error Analysis and Sources of Uncertainty in the Measurement of Residual Stresses by X-Ray Diffraction

Authors: E. T. Carvalho Filho, J. T. N. Medeiros, L. G. Martinez

Abstract:

Residual stresses are self-equilibrating stresses in a rigid body that act on the microstructure of the material without the application of an external load. They are elastic stresses and can be induced by mechanical, thermal, and chemical processes, causing a deformation gradient in the crystal lattice and favoring premature failure in mechanical components. The search for measurements with good reliability has been of great importance for the manufacturing industries. Several methods are able to quantify these stresses according to physical principles and the response of the mechanical behavior of the material. The X-ray diffraction technique is one of the most sensitive techniques for small variations of the crystalline lattice, since the X-ray beam interacts with the interplanar distance. Being very sensitive, the technique is also susceptible to variations in measurements, requiring a study of the factors that influence the final result. Instrumental and operational factors, form deviations of the samples, and the geometry of the analyses are some of the variables that need to be considered and analyzed in order to obtain a true measurement. The aim of this work is to analyze the sources of error inherent in the residual stress measurement process by the X-ray diffraction technique, making an interlaboratory comparison to verify the reproducibility of the measurements. In this work, two specimens were machined, differing from each other by the surface finish: grinding and polishing. Additionally, iron powder with a particle size of less than 45 µm was selected to serve as a reference (as recommended by the ASTM E915 standard) for the tests. To verify the deviations caused by the equipment, the specimens were positioned and, under the same analysis conditions, seven measurements were carried out at 11 Ψ tilts. To verify sample positioning errors, seven measurements were performed, repositioning the sample for each measurement.
To check geometry errors, measurements were repeated for both the Bragg-Brentano and parallel-beam geometries. To verify the reproducibility of the method, the measurements were performed in two different laboratories with different equipment. The results were statistically analyzed and the errors quantified.
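Although the abstract does not give the analysis equations, residual stress measured at multiple Ψ tilts is conventionally evaluated with the sin²ψ method: the interplanar spacing d is fitted against sin²ψ and the slope converted to stress. A simplified sketch, assuming a uniaxial stress state and hypothetical steel-like elastic constants, might look like:

```python
import math

# Hypothetical elastic constants, for illustration only.
E, nu, d0 = 210e9, 0.30, 1.1702e-10  # Young's modulus (Pa), Poisson ratio, strain-free spacing (m)

def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

def sin2psi_stress(psis_deg, d_values):
    """Fit d against sin^2(psi); the slope scaled by E / ((1 + nu) * d0) gives the stress."""
    xs = [math.sin(math.radians(p)) ** 2 for p in psis_deg]
    slope, _ = linear_fit(xs, d_values)
    return (E / (1.0 + nu)) * slope / d0

# Synthetic spacings generated from a known 200 MPa tensile stress.
psis = [0, 15, 25, 35, 45]
target = 200e6
ds = [d0 * (1 + target * (1 + nu) / E * math.sin(math.radians(p)) ** 2) for p in psis]
```

The sources of uncertainty the paper studies (positioning, geometry, equipment) all enter through scatter in the measured `d_values`, which propagates directly into the fitted slope and hence the reported stress.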

Keywords: residual stress, x-ray diffraction, repeatability, reproducibility, error analysis

Procedia PDF Downloads 166
1078 A Dynamic Equation for Downscaling Surface Air Temperature

Authors: Ch. Surawut, D. Sukawat

Abstract:

In order to utilize results from global climate models, dynamical and statistical downscaling techniques have been developed. For dynamical downscaling, usually a limited-area numerical model is used, with an associated high computational cost. This research proposes a dynamic equation for specific space-time regional climate downscaling of surface air temperature from the Educational Global Climate Model (EdGCM) for Southeast Asia. The equation provides downscaled values of surface air temperature at any specific location and time without running a regional climate model. In the proposed equation, surface air temperature is approximated from ground temperature, sensible heat flux, and 2 m wind speed. Results from the application of the equation show that its errors are smaller than the errors of direct interpolation from EdGCM.
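The keywords name inverse distance weight interpolation, the baseline against which the proposed equation is compared. A minimal sketch of IDW (the grid-point temperatures below are hypothetical) is:

```python
def idw(target, stations, power=2.0):
    """Inverse distance weighting: estimate the value at `target` (x, y)
    from (x, y, value) stations, weighting each by 1 / distance**power."""
    num = den = 0.0
    for x, y, v in stations:
        d2 = (x - target[0]) ** 2 + (y - target[1]) ** 2
        if d2 == 0.0:
            return v  # target coincides with a station
        w = d2 ** (-power / 2.0)
        num += w * v
        den += w
    return num / den

# Hypothetical coarse-grid temperatures (deg C) around a target location.
stations = [(0.0, 0.0, 28.0), (1.0, 0.0, 30.0), (0.0, 1.0, 29.0), (1.0, 1.0, 31.0)]
t = idw((0.5, 0.5), stations)  # equidistant from all four grid points
```

IDW uses only the coarse-grid temperature field; the paper's dynamic equation instead brings in ground temperature, sensible heat flux, and 2 m wind speed, which is where its error reduction comes from.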

Keywords: dynamic equation, downscaling, inverse distance weight interpolation

Procedia PDF Downloads 287
1077 Forecasting Thermal Energy Demand in District Heating and Cooling Systems Using Long Short-Term Memory Neural Networks

Authors: Kostas Kouvaris, Anastasia Eleftheriou, Georgios A. Sarantitis, Apostolos Chondronasios

Abstract:

To achieve the objective of almost zero carbon energy solutions by 2050, the EU needs to accelerate the development of integrated, highly efficient and environmentally friendly solutions. In this direction, district heating and cooling (DHC) emerges as a viable and more efficient alternative to conventional, decentralized heating and cooling systems, enabling a combination of more efficient renewable and competitive energy supplies. In this paper, we develop a forecasting tool for near real-time local weather and thermal energy demand predictions for an entire DHC network. In this fashion, we are able to extend the functionality and improve the energy efficiency of the DHC network by predicting and adjusting the heat load that is distributed from the heat generation plant to the connected buildings through the heat pipe network. Two case studies are considered: one for Vransko, Slovenia and one for Montpellier, France. The data consists of i) local weather data, such as humidity, temperature, and precipitation, ii) weather forecast data, such as the outdoor temperature, and iii) DHC operational parameters, such as the mass flow rate and supply and return temperatures. The external temperature is found to be the most important energy-related variable for space conditioning, and thus it is used as an external parameter for the energy demand models. For the development of the forecasting tool, we use state-of-the-art deep neural networks and, more specifically, recurrent networks with long short-term memory cells, which are able to capture complex non-linear relations among temporal variables. Firstly, we develop models to forecast outdoor temperatures for the next 24 hours using local weather data for each case study. Subsequently, we develop models to forecast thermal demand for the same period, taking into consideration past energy demand values as well as the predicted temperature values from the weather forecasting models.
The contributions to the scientific and industrial community are three-fold, and the empirical results are highly encouraging. First, we are able to predict future thermal demand levels for the two locations under consideration with minimal errors. Second, we examine the impact of the outdoor temperature on the predictive ability of the models and how the accuracy of the energy demand forecasts decreases with the forecast horizon. Third, we extend the relevant literature with a new dataset of thermal demand and examine the performance and applicability of machine learning techniques to solve real-world problems. Overall, the solution proposed in this paper is in accordance with EU targets, providing an automated smart energy management system, decreasing human errors and reducing excessive energy production.
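The observation that forecast accuracy decreases with the horizon can be illustrated on synthetic data with a naive persistence baseline; this is an illustration only, not the LSTM model of the paper, and the temperature series is artificial:

```python
import math

# Synthetic hourly outdoor temperature with a 24 h daily cycle (deg C).
temps = [10 + 8 * math.sin(2 * math.pi * h / 24) for h in range(240)]

def persistence_mae(series, horizon):
    """Mean absolute error of the naive 'last observed value' forecast
    evaluated `horizon` steps ahead."""
    errs = [abs(series[i + horizon] - series[i]) for i in range(len(series) - horizon)]
    return sum(errs) / len(errs)

mae_1h = persistence_mae(temps, 1)   # short horizon: small error
mae_6h = persistence_mae(temps, 6)   # longer horizon: larger error
```

Any model that beats this persistence baseline at a given horizon is adding value; the gap between model and baseline typically narrows as the horizon grows, which matches the degradation the paper reports.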

Keywords: machine learning, LSTMs, district heating and cooling system, thermal demand

Procedia PDF Downloads 127
1076 Human Errors in IT Services, HFACS Model in Root Cause Categorization

Authors: Kari Saarelainen, Marko Jantti

Abstract:

IT service trending of the root causes of service incidents and problems is an important part of proactive problem management and service improvement. Human error related root causes are an important root cause category in IT service management as well, although their proportion among root causes is smaller than in other industries. The research problem in this study is: how should the root causes of incidents related to human errors be categorized in an ITSM organization to effectively support service improvement? Categorization based on IT service management processes and based on the Human Factors Analysis and Classification System (HFACS) taxonomy was studied in a case study. HFACS is widely used for human error root cause categorization across many industries. Combining these two categorization models in a two-dimensional matrix was found effective, yet impractical for daily work.
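A two-dimensional categorization matrix of the kind described can be sketched as a tally over (ITSM process, HFACS level) pairs. The HFACS level names below are the taxonomy's four standard levels, but the ITSM process list and the sample root causes are hypothetical:

```python
from collections import Counter

# One axis: ITSM processes (hypothetical subset). Other axis: the four HFACS levels.
ITSM_PROCESSES = {"incident management", "change management", "release management"}
HFACS_LEVELS = {"unsafe acts", "preconditions", "unsafe supervision", "organizational influences"}

def categorize(root_causes):
    """Tally root causes into (ITSM process, HFACS level) matrix cells."""
    matrix = Counter()
    for process, level in root_causes:
        if process not in ITSM_PROCESSES or level not in HFACS_LEVELS:
            raise ValueError(f"unknown category: {(process, level)}")
        matrix[(process, level)] += 1
    return matrix

# Hypothetical categorized incidents.
causes = [
    ("change management", "unsafe acts"),
    ("change management", "unsafe acts"),
    ("incident management", "preconditions"),
]
matrix = categorize(causes)
```

The paper's finding is visible even in this sketch: the full matrix has |processes| × |levels| cells, and classifying every incident into both dimensions is more bookkeeping than daily operations comfortably supports.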

Keywords: IT service management, ITIL, incident, problem, HFACS, Swiss cheese model

Procedia PDF Downloads 469
1075 Assessment of Students' Skills in Error Detection in SQL Classes Using a Rubric Framework: An Empirical Study

Authors: Dirson Santos De Campos, Deller James Ferreira, Anderson Cavalcante Gonçalves, Uyara Ferreira Silva

Abstract:

Rubrics for learning provide evaluation criteria and expected performance standards linked to defined student activities, for learning and pedagogical objectives. Despite rubrics being used in education at all levels, academic literature on rubrics as a tool to support research in SQL education is quite rare. A large class of SQL queries is syntactically correct, but certainly not all are semantically correct. Detecting and correcting errors is a recurring problem in SQL education. In this paper, we use the Rubric Abstract Framework (RAF), which consists of steps that allow us to map information to measure student performance, guided by the didactic objectives defined by the teacher, as long as domain modeling is contextualized by the rubric. An empirical study was conducted that demonstrates how rubrics can mitigate students' difficulties in finding logical errors and ease the teacher's workload in SQL education. Detecting and correcting logical errors is an important skill for students, and researchers have proposed several ways to improve SQL education because these skills are crucial in software engineering and computer science. The RAF was instantiated in an empirical study developed during the COVID-19 pandemic in a database course. The pandemic transformed face-to-face education into remote education, without in-person classes. The lab activities were conducted remotely, which hinders the teaching-learning process, in particular, for this research, in verifying the evidence or statements of knowledge, skills, and abilities (KSAs) of students. Much research in academia and industry involves databases. The innovation proposed in this paper is the approach used, in which the results obtained when using rubrics to map logical errors in query formulation were analyzed, with the gains obtained by students empirically verified. The research approach can be used in the post-pandemic period in both classroom and distance learning.
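A typical example of the kind of logical error such a rubric targets is a query that parses and runs but returns the wrong result. A minimal sketch with a hypothetical schema, using Python's built-in sqlite3 module:

```python
import sqlite3

# Hypothetical schema: students and their submitted answers.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE student (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE answer  (student_id INTEGER, score INTEGER);
    INSERT INTO student VALUES (1, 'Ana'), (2, 'Bruno');
    INSERT INTO answer  VALUES (1, 80), (1, 90), (2, 70);
""")

# Syntactically valid but semantically wrong: the join condition is missing,
# so every student is paired with every answer (a Cartesian product).
wrong = con.execute("SELECT s.name, a.score FROM student s, answer a").fetchall()

# The logically correct version joins on the key.
right = con.execute(
    "SELECT s.name, a.score FROM student s JOIN answer a ON a.student_id = s.id"
).fetchall()
```

Here `wrong` returns 2 × 3 = 6 rows instead of 3; no error message is raised, which is exactly why students need explicit criteria, such as a rubric item on join correctness, to learn to detect this class of mistake.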

Keywords: rubric, logical error, structured query language (SQL), empirical study, SQL education

Procedia PDF Downloads 171
1074 Annual Water Level Simulation Using Support Vector Machine

Authors: Maryam Khalilzadeh Poshtegal, Seyed Ahmad Mirbagheri, Mojtaba Noury

Abstract:

In this paper, using yearly input data of rainfall, temperature, and inflow to Lake Urmia, the simulation of water level fluctuation was carried out by means of three models. In light of climate change investigations, the fluctuations of lake water levels are of high interest. This study investigates data-driven models: the support vector machine (SVM), a new regression procedure in water resources, is applied to the yearly level data of Lake Urmia, the largest lake, and a hypersaline one, in Iran. The evaluated lake levels are found to be in good correlation with the observed values, and the results of the SVM simulation show better accuracy and implementation. The mean square error, mean absolute relative error, and determination coefficient statistics are used as comparison criteria.
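The three comparison criteria named above can be sketched in a few lines; the observed and simulated levels below are hypothetical, for illustration only:

```python
def mse(obs, pred):
    """Mean square error."""
    return sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs)

def mare(obs, pred):
    """Mean absolute relative error."""
    return sum(abs(o - p) / abs(o) for o, p in zip(obs, pred)) / len(obs)

def r2(obs, pred):
    """Determination coefficient (R^2): 1 minus residual over total variance."""
    mean_o = sum(obs) / len(obs)
    ss_res = sum((o - p) ** 2 for o, p in zip(obs, pred))
    ss_tot = sum((o - mean_o) ** 2 for o in obs)
    return 1.0 - ss_res / ss_tot

# Hypothetical observed vs simulated yearly lake levels (m above a datum).
observed  = [1274.1, 1273.6, 1273.0, 1272.2, 1271.5]
simulated = [1274.0, 1273.8, 1272.9, 1272.4, 1271.4]
```

A model is preferred when its MSE and MARE are lower and its R² is closer to 1 than the competing models' on the same held-out years.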

Keywords: simulation, water level fluctuation, Urmia Lake, support vector machine

Procedia PDF Downloads 347
1073 Enhancing Nursing Students’ Communication Using TeamSTEPPS to Improve Patient Safety

Authors: Stefanie Santorsola, Natasha Frank

Abstract:

Improving healthcare safety necessitates examining current trends and beliefs about safety and devising strategies for improvement. Errors in healthcare continue to increase and to be experienced by patients; many are preventable and directly correlated with breakdowns in healthcare communication. TeamSTEPPS is an evidence-based process designed to improve the quality and safety of healthcare by improving communication and team processes. Communication is at the core of effective team collaboration and is vital for patient safety. TeamSTEPPS offers insights and strategies for improving communication and teamwork and reducing preventable errors to create a safer healthcare environment for patients. The academic, clinical, and educational environment is vital in preparing nursing students for professional practice by providing them with foundational knowledge and abilities. It also provides a prime opportunity to learn about errors and the importance of effective communication in enhancing patient safety, as nursing students are often unprepared to deal with errors. Proactively introducing and discussing errors through a supportive culture at the beginning of a nursing student's academic career has the potential to carry key concepts into practice to improve and enhance patient safety. TeamSTEPPS has been used globally and has had a collectively positive impact on patient safety and teamwork. A workshop study was conducted in winter 2023 with registered practical nurse (RPN) students bridging to the baccalaureate nursing program; the majority of the RPNs in the bridging program were actively employed in a variety of healthcare facilities during the semester. The workshop study received academic institution ethics board approval, and participants signed a consent form prior to participating in the study.
The premise of the workshop was to introduce TeamSTEPPS and a variety of its strategies to these students and have them incorporate the presented communication strategies in their practicum settings, keeping a reflective journal on the effects and outcomes of the strategies in the healthcare setting. Findings from the workshop study supported the objective of the project: students verbalized notable improvements in team functioning in the healthcare environment resulting from the enhanced communication strategies they were introduced to in the workshop. The implication for educational institutions is the potential to further advance the safety literacy and abilities of nursing students, preparing them for entering the workforce and improving safety for patients.

Keywords: teamstepps, education, patient safety, communication

Procedia PDF Downloads 44
1072 Influence and Dissemination of Solecism among Moroccan High School and University Students

Authors: Rachid Ed-Dali, Khalid Elasri

Abstract:

Mass media seem to provide rich content for language acquisition. Exposure to television, the Internet, mobile phones, and other technological gadgets and devices enriches the student's lexicon both positively and negatively. The difficulties encountered by students while learning and acquiring second languages, in addition to their eagerness to comprehend the content of a particular program, prompt them to diversify their methods so as to achieve their targets. The present study highlights the significance of certain media channels and their involvement in language acquisition, employing the Natural Approach to further grasp whether students, especially secondary and high school students, learn and acquire errors through watching subtitled television programs. The chief objective is investigating the deductive and inductive relevance of certain programs, besides the involvement of peripheral learning, in the acquisition of mistakes.

Keywords: errors, mistakes, Natural Approach, peripheral learning, solecism

Procedia PDF Downloads 105
1071 The Use of Surveys to Combat Fake News in Media Literacy Education

Authors: Jaejun Jong

Abstract:

Fake news has recently become a serious international problem. Therefore, researchers and policymakers worldwide have sought to understand fake news and develop strategies to combat it. This study consists of two primary parts: (1) a literature review of how surveys were used to understand fake news and identify problems caused by fake news, and (2) a discussion of how surveys were used to fight back against fake news in educational settings. This second section specifically analyzes surveys used to evaluate a South Korean elementary school program designed to improve students’ metacognition and critical thinking. This section seeks to identify potential problems that may occur in the elementary school setting. The literature review shows that surveys can help people to understand fake news based on its traits rather than its definition due to the lack of agreement on the definition of fake news. The literature review also shows that people are not good at identifying fake news or evaluating their own ability to identify fake news; indeed, they are more likely to share information that aligns with their previous beliefs. In addition, the elementary school survey data shows that there may be substantial errors in the program evaluation process, likely caused by processing errors or the survey procedure, though the exact cause is not specified. Such a significant error in evaluating the effects of the educational program prevents teachers from making proper decisions and accurately evaluating the program. Therefore, identifying the source of such errors would improve the overall quality of education, which would benefit both teachers and students.

Keywords: critical thinking, elementary education, program evaluation, survey

Procedia PDF Downloads 88
1070 Comparative Study on the Evaluation of Patient Safety in Malaysian Retail Pharmacy Setup

Authors: Palanisamy Sivanandy, Tan Tyng Wei, Tan Wee Loon, Lim Chong Yee

Abstract:

Background: Patient safety has become a major concern over recent years with elevated medication errors, particularly prescribing and dispensing errors. Meticulous prescription screening and diligent drug dispensing are therefore important to prevent drug-related adverse events from inflicting harm on patients; hence, pharmacists play a significant role in this scenario. The evaluation of patient safety in a pharmacy setup is crucial to contemplate the current practices, attitudes, and perceptions of pharmacists towards patient safety. Method: The questionnaire for the Pharmacy Survey on Patient Safety Culture developed by the Agency for Healthcare Research and Quality (AHRQ) was used to assess patient safety. The main objective of the study was to evaluate the attitudes and perceptions of pharmacists towards patient safety in retail pharmacy setups in Malaysia. Results: 417 questionnaires were distributed via convenience sampling in three different states of Malaysia; 390 participants responded, a response rate of 93.52%. The overall positive response rate (PRR) ranged from 31.20% to 87.43%, with an average PRR of 67%. The overall patient safety grade for the pharmacies was appreciable, ranging from good to very good. The study found a significant difference between the perceptions of senior and junior pharmacists towards patient safety. The internal consistency of the questionnaire contents/dimensions was satisfactory (Cronbach's alpha = 0.92). Conclusion: The results reflect a positive attitude and perception of retail pharmacists towards patient safety. Despite this, various efforts can be implemented in the future to amplify patient safety in retail pharmacy setups.

Keywords: patient safety, attitude, perception, positive response rate, medication errors

Procedia PDF Downloads 308
1069 Error Detection and Correction for Onboard Satellite Computers Using Hamming Code

Authors: Rafsan Al Mamun, Md. Motaharul Islam, Rabana Tajrin, Nabiha Noor, Shafinaz Qader

Abstract:

In an attempt to enrich the lives of billions of people by providing proper information, security, and a way of communicating with others, the need for efficient and improved satellites is constantly growing. Thus, there is an increasing demand for better error detection and correction (EDAC) schemes capable of protecting the data onboard satellites. This paper is aimed at detecting and correcting such errors using the Hamming code, an algorithm which uses the concept of parity and parity bits to protect against single-bit errors onboard a satellite in Low Earth Orbit. The paper focuses on the study of Low Earth Orbit satellites and on generating the Hamming code matrix to be used for EDAC using computer programs. The most effective version generated was the Hamming (16, 11, 4) code, implemented in MATLAB, and the paper compares this particular scheme with other EDAC mechanisms, including other versions of Hamming codes and the Cyclic Redundancy Check (CRC), along with the limitations of this scheme. This version of the Hamming code guarantees single-bit error correction as well as double-bit error detection. Furthermore, it has proved to be fast, with a checking time of 5.669 nanoseconds, has a relatively high code rate and low bit overhead compared to the other versions, and can detect a greater percentage of errors per length of code than other EDAC schemes with similar capabilities. In conclusion, with proper implementation of the system, it is quite possible to ensure a relatively uncorrupted satellite storage system.
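The parity-bit mechanism behind such a scheme can be sketched with the smaller Hamming(7,4) code, which is structurally analogous to the (16, 11, 4) version the paper uses; this toy example is not the authors' MATLAB implementation.

```python
# Minimal sketch of single-error correction with Hamming(7,4):
# codeword positions 1..7, parity bits at the powers of two (1, 2, 4).
# Illustrative toy, not the paper's Hamming(16, 11, 4) code.

def encode(d):
    """d: list of 4 data bits -> 7-bit codeword (positions 1..7)."""
    c = [0] * 8                      # index 0 unused for 1-based positions
    c[3], c[5], c[6], c[7] = d       # data bits in non-power-of-two slots
    c[1] = c[3] ^ c[5] ^ c[7]        # parity over positions with bit 1 set
    c[2] = c[3] ^ c[6] ^ c[7]        # parity over positions with bit 2 set
    c[4] = c[5] ^ c[6] ^ c[7]        # parity over positions with bit 4 set
    return c[1:]

def correct(word):
    """word: 7-bit list -> corrected word; fixes any single bit flip."""
    c = [0] + list(word)
    syndrome = 0
    for p in (1, 2, 4):
        parity = 0
        for i in range(1, 8):
            if i & p:
                parity ^= c[i]
        if parity:
            syndrome += p            # syndrome accumulates the flipped position
    if syndrome:
        c[syndrome] ^= 1             # flip the erroneous bit back
    return c[1:]

data = [1, 0, 1, 1]
cw = encode(data)
corrupted = cw.copy()
corrupted[2] ^= 1                    # simulate a single event upset at position 3
assert correct(corrupted) == cw
```

Adding one overall parity bit to such a codeword raises the minimum distance to 4, which is what gives the (16, 11, 4) scheme its double-bit error detection on top of single-bit correction.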

Keywords: bit-flips, Hamming code, low earth orbit, parity bits, satellite, single error upset

Procedia PDF Downloads 115
1068 Epicardial Fat Necrosis in a Young Female: A Case Report

Authors: Tayyibah Shah Alam, Joe Thomas, Nayantara Shenoy

Abstract:

We present a case in which the answer is straightforward, but the path taken to reach the diagnosis is where it gets interesting. A 31-year-old lady presented to the rheumatology outpatient department with left-sided chest pain associated with left elbow joint pain intensifying over the last 2 days. She had a prolonged history of chest pain of minimal intensity since 2016. The pain was intermittent in nature, aggravated by exertion, lifting heavy weights, and lying down, and relieved by sitting. Her physical examination and laboratory tests were within normal limits. An electrocardiogram (ECG) showed normal sinus rhythm, and a chest X-ray showed no significant abnormality. The primary suspicion was recurrent costochondritis. Cardiac inflammatory blood markers and echocardiography were normal, ruling out acute coronary syndrome. CT of the chest and contrast MRI of the thorax showed a small, ill-defined STIR hyperintensity with thin peripheral enhancement in the left anterior mediastinum, posterior to the 5th costal cartilage and anterior to the pericardium, suggestive of focal fat panniculitis, confirming the diagnosis of epicardial fat necrosis. She was started on colchicine and nonsteroidal anti-inflammatory drugs for 2-3 weeks, following which a repeat CT showed resolution of the lesion and improvement in her symptoms. Epicardial fat necrosis is often under-recognized or misdiagnosed; here, the CT scan was used to establish the diagnosis. Making the correct diagnosis prospectively alleviates unnecessary testing in favor of conservative management.

Keywords: EFN, panniculitis, unknown etiology, recurrent chest pain

Procedia PDF Downloads 86
1067 Malposition of Femoral Component in Total Hip Arthroplasty

Authors: Renate Krassnig, Gloria M. Hohenberger, Uldis Berzins, Stefen Fischerauer

Abstract:

Background: Only a few reports discuss the effectiveness of intraoperative radiographs for placing femoral components; therefore, there is no international standard for using intraoperative imaging during total hip replacement. Method: Case report; an 84-year-old female patient underwent exchange of the components of a total hip arthroplasty (THA) because of aseptic loosening. Due to the circumstances, the surgeon decided to implant a cemented femoral component. The procedure proceeded without any significant abnormalities. The first postoperative radiograph was planned after recovery, as usual. The X-ray imaging showed a misplaced femoral component, so a CT scan was additionally performed, and the malposition of the cemented femoral component was confirmed. The patient had to undergo another surgery: removal of the cemented femoral component and implantation of a new, well-placed one. Conclusion: Intraoperative imaging of the femoral component is not a common standard, but this case shows that it is a useful method for detecting errors and gives the surgeon the opportunity to correct them intraoperatively.

Keywords: femoral component, intraoperative imaging, malposition, revision

Procedia PDF Downloads 188
1066 Assessment of Time-variant Work Stress for Human Error Prevention

Authors: Hyeon-Kyo Lim, Tong-Il Jang, Yong-Hee Lee

Abstract:

For an operator in a nuclear power plant, human error is one of the most dreaded factors that may result in unexpected accidents. The possibility of human errors may be low, but their risk would be unimaginably enormous. Thus, for accident prevention, it is indispensable to analyze the influence of any factors which may raise the possibility of human errors. Over the past decades, many research results have shown that the performance of human operators may vary over time due to a variety of factors. Among them, stress is known to be an indirect factor that may cause human errors and result in mental illness. To date, many assessment tools have been developed to assess the stress level of human workers. However, it is still questionable whether they can be utilized to anticipate human performance, which is related to human error possibility, because they were mainly developed from the viewpoint of mental health rather than industrial safety. The stress level of a person may go up or down with work time. In that sense, if these tools are to be applicable to safety, they should at least be able to assess the variation resulting from work time. Therefore, this study aimed to compare their applicability for safety purposes. More than 10 kinds of work stress tools were analyzed with reference to assessment items, assessment and analysis methods, and follow-up measures, which are known to be factors closely related to work stress. The results showed that most tools mainly placed their weights on some common organizational factors such as demands, supports, and relationships, in that sequence, and their weights were broadly similar. However, they failed to recommend practical solutions; instead, they merely advised setting up overall counterplans in a PDCA cycle or risk management activities, which would be far from practical human error prevention.
Thus, it was concluded that the application of stress assessment tools mainly developed for mental health seems impractical for safety purposes with respect to anticipating human performance, and that the development of new assessment tools is inevitable if one wants to assess stress level in the aspect of human performance variation and accident prevention. As a consequence, as a practical counterplan, this study proposed a new scheme for assessing the work stress level of a human operator that may vary over work time, which is closely related to the possibility of human errors.

Keywords: human error, human performance, work stress, assessment tool, time-variant, accident prevention

Procedia PDF Downloads 657
1065 Precise Determination of the Residual Stress Gradient in Composite Laminates Using a Configurable Numerical-Experimental Coupling Based on the Incremental Hole Drilling Method

Authors: A. S. Ibrahim Mamane, S. Giljean, M.-J. Pac, G. L’Hostis

Abstract:

Fiber reinforced composite laminates are particularly subject to residual stresses due to their heterogeneity and the complex chemical, mechanical, and thermal mechanisms that occur during their processing. Residual stresses are now well known to cause damage accumulation, shape instability, and behavior disturbance in composite parts. Many works exist in the literature on techniques for minimizing residual stresses, mainly in thermosetting and thermoplastic composites. To study in depth the influence of processing mechanisms on the formation of residual stresses and to minimize them by establishing a reliable correlation, it is essential to be able to measure the profile of residual stresses in the composite very precisely. Residual stresses are important data to consider when sizing composite parts and predicting their behavior. Incremental hole drilling is very effective for measuring the gradient of residual stresses in composite laminates. This semi-destructive method consists of incrementally drilling a hole through the thickness of the material and measuring the relaxation strains around the hole for each increment using three strain gauges. These strains are then converted into residual stresses using a matrix of coefficients. These coefficients, called calibration coefficients, depend on the diameter of the hole and the dimensions of the gauges used. The reliability of incremental hole drilling depends on the accuracy with which the calibration coefficients are determined. These coefficients are calculated using a finite element model, and the samples' features and the experimental conditions must be considered in the simulation. Any mismatch can lead to inadequate calibration coefficients, thus introducing errors in the residual stresses. Several calibration coefficient correction methods exist for isotropic materials, but there is a lack of information on this subject concerning composite laminates.
In this work, a Python program was developed to automatically generate an adequate finite element model. This model allowed us to perform a parametric study assessing the influence of experimental errors on the calibration coefficients. The results highlighted the sensitivity of the calibration coefficients to the considered errors and gave an order of magnitude of the precision required of the experimental device for reliable measurements. On the basis of these results, improvements to the experimental device were proposed. Furthermore, a numerical method was proposed to correct the calibration coefficients for different types of materials, including thick composite parts for which the analytical approach is too complex. This method consists of taking the experimental errors into account in the simulation. Accurate measurement of the experimental errors (such as eccentricity of the hole, angular deviation of the gauges from their theoretical positions, or errors in increment depth) is therefore necessary. The aim is to determine the residual stresses more precisely and to expand the validity domain of the incremental hole drilling technique.
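The strain-to-stress conversion step described above can be sketched as a small linear solve per depth increment: three gauge strains map to the in-plane stress components through a calibration matrix. The 3x3 matrix and strain values below are illustrative placeholders, not FEM-derived coefficients from the paper.

```python
# Hedged sketch of one increment of the incremental hole drilling method:
# measured relaxation strains = C @ (sigma_x, sigma_y, tau_xy), so the
# residual stresses follow from solving the 3x3 system. The calibration
# matrix C and the strains are invented for illustration only.

def solve3(A, b):
    """Solve A x = b for a 3x3 system by Gauss-Jordan elimination."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(3):
        pivot = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]        # partial pivoting
        for r in range(3):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [a - f * p for a, p in zip(M[r], M[col])]
    return [M[i][3] / M[i][i] for i in range(3)]

# Illustrative calibration matrix (strain per unit stress) and measured
# relaxation strains for one increment, one row per strain gauge.
C = [[-1.2e-6, -0.3e-6,  0.0],
     [-0.45e-6, -0.45e-6, -0.9e-6],
     [-0.3e-6, -1.2e-6,  0.0]]
strains = [-60e-6, -45e-6, -30e-6]

# Residual stresses (sigma_x, sigma_y, tau_xy) for this increment.
sigma = solve3(C, strains)
```

The sensitivity discussed in the abstract enters here: small errors in the entries of C (from hole eccentricity, gauge misplacement, or increment depth errors) propagate directly into the recovered stresses, which is why accurate calibration coefficients are essential.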

Keywords: fiber reinforced composites, finite element simulation, incremental hole drilling method, numerical correction of the calibration coefficients, residual stresses

Procedia PDF Downloads 119
1064 Multilayer Neural Network and Fuzzy Logic Based Software Quality Prediction

Authors: Sadaf Sahar, Usman Qamar, Sadaf Ayaz

Abstract:

In the software development lifecycle, quality prediction techniques hold prime importance in order to minimize future design errors and expensive maintenance. Many techniques have been proposed by various researchers, but with the increasing complexity of the software lifecycle model, it is crucial to develop a flexible system which can cater for the factors that ultimately have an impact on the quality of the end product. These factors include properties of the software development process and of the product, along with its operating conditions. In this paper, a neural network (perceptron) based software quality prediction technique is proposed. Using this technique, stakeholders can predict the quality of the resulting software during the early phases of the lifecycle, saving time and resources on the future elimination of design errors and costly maintenance. This technique can be brought into practical use through successful training.
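A minimal sketch of the perceptron idea applied to quality prediction might look as follows; the features, training data, and labels are invented for demonstration and do not reproduce the paper's model or factors.

```python
# Hypothetical perceptron-style quality predictor: inputs are illustrative
# process/product metrics normalized to [0, 1]; output 1 = "acceptable
# quality", 0 = "at risk". Data and features are made up for this sketch.

def predict(weights, bias, x):
    s = bias + sum(w * xi for w, xi in zip(weights, x))
    return 1 if s > 0 else 0

def train(samples, labels, epochs=50, lr=0.1):
    """Classic perceptron learning rule on separable data."""
    n = len(samples[0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            err = y - predict(w, b, x)     # -1, 0, or +1
            if err:
                w = [wi + lr * err * xi for wi, xi in zip(w, x)]
                b += lr * err
    return w, b

# [inverted defect density, test coverage, review coverage] - illustrative
X = [[0.9, 0.8, 0.9], [0.8, 0.9, 0.7], [0.2, 0.3, 0.4], [0.3, 0.1, 0.2]]
y = [1, 1, 0, 0]

w, b = train(X, y)
assert [predict(w, b, x) for x in X] == y   # the separable toy data is learned
```

In practice a multilayer network (as in the paper's title) would replace this single unit, but the training loop above conveys the core idea of fitting weights to historical project metrics.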

Keywords: software quality, fuzzy logic, perceptron, prediction

Procedia PDF Downloads 304
1063 Integrating Deterministic and Probabilistic Safety Assessment to Decrease Risk & Energy Consumption in a Typical PWR

Authors: Ebrahim Ghanbari, Mohammad Reza Nematollahi

Abstract:

Integrating deterministic and probabilistic safety assessment (IDPSA) is one of the most commonly used approaches in the safety analysis of power plant accidents. It is also recognized today that the role of human error in creating these accidents is no less than that of systemic errors, so considering both human interference and system errors in fault and event sequences is necessary. The integration of these analytical topics is reflected in the frequency of core damage and also in the study of the use of water resources during an accident such as the loss of all electrical power to the plant. In this regard, the station blackout (SBO) accident was simulated for a pressurized water reactor in the deterministic analysis, and by analyzing the operator's behavior in controlling the accident, the results of the combined deterministic and probabilistic assessment were identified. The results showed that the best performance of the plant operator would reduce the risk of an accident by 10% and decrease consumption of the plant's water sources by 6.82 liters/second.

Keywords: IDPSA, human error, SBO, risk

Procedia PDF Downloads 115
1062 Towards a Complete Automation Feature Recognition System for Sheet Metal Manufacturing

Authors: Bahaa Eltahawy, Mikko Ylihärsilä, Reino Virrankoski, Esko Petäjä

Abstract:

Sheet metal processing is automated, but the step from product models to production machine control still requires human intervention. This may cause time-consuming bottlenecks in the production process and increase the risk of human errors. In this paper, we present a system which automatically recognizes features from the CAD model of the sheet metal product. Using these features, the system produces a complete model of the particular sheet metal product, which is then used as input for the sheet metal processing machine. The system is currently implemented, capable of recognizing more than 11 of the most common sheet metal structural features, and the procedure is fully automated. This provides remarkable savings in production time and protects against human errors. This paper presents the developed system architecture, the applied algorithms, and the system software implementation and testing.

Keywords: feature recognition, automation, sheet metal manufacturing, CAD, CAM

Procedia PDF Downloads 337