Search results for: explanations for the probable causes of the errors

1208 Response of Buildings with Soil-Structure Interaction with Varying Soil Types

Authors: Shreya Thusoo, Karan Modi, Rajesh Kumar, Hitesh Madahar

Abstract:

Over the years, it has been well established that the practice of assuming a structure to be fixed at its base leads to gross errors in the evaluation of its overall response to dynamic loading and to overestimation in design. The extent of these errors depends on a number of variables, soil type being one of the major factors. This paper studies the effect of soil-structure interaction (SSI) on multi-storey buildings with varying underlying soil types, after proper validation of the SSI effect. Analyses for soft, stiff, and very stiff base soils were carried out using the finite element method (FEM) software package ANSYS v14.5. The results lead to some very important conclusions regarding the time period, deflection, and acceleration responses.
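
A minimal sketch of why the fixed-base assumption errs, using a single-storey idealization on translational and rocking soil springs (a classical substitute model with illustrative stiffness values, not the ANSYS models analyzed in the paper): the softer the soil, the longer the flexible-base period.

```python
import numpy as np

def flexible_base_period(t_fixed, k_struct, h, k_x, k_theta):
    """Fundamental period of a single-storey structure on a compliant base:
    translational spring k_x and rocking spring k_theta lengthen the
    fixed-base period t_fixed (classical substitute-SDOF expression)."""
    return t_fixed * np.sqrt(1.0 + k_struct / k_x + k_struct * h**2 / k_theta)

# Hypothetical storey: T = 0.5 s, lateral stiffness 2e7 N/m, height 3 m
for soil, k_x, k_theta in [("soft", 5e7, 1e9),
                           ("stiff", 2e8, 4e9),
                           ("very stiff", 1e9, 2e10)]:
    t = flexible_base_period(0.5, 2.0e7, 3.0, k_x, k_theta)
    print(f"{soil:>10}: T = {t:.3f} s")   # period lengthens as the soil softens
```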

Keywords: dynamic response, multi-storey building, soil-structure interaction, varying soil types

Procedia PDF Downloads 453
1207 Capacity Estimation of Hybrid Automated Repeat Request Protocol for Low Earth Orbit Mega-Constellations

Authors: Arif Armagan Gozutok, Alper Kule, Burak Tos, Selman Demirel

Abstract:

A wireless communication chain requires effective ways to keep throughput efficiency high while it suffers location-dependent, time-varying burst errors. Several techniques have been developed to ensure that the receiver recovers the transmitted information without errors; the most fundamental approaches are error checking and correction, together with re-transmission of non-acknowledged packets. In this paper, stop-and-wait (SAW) and chase-combining (CC) hybrid automatic repeat request (HARQ) protocols are compared and analyzed in terms of throughput and average delay for the case of low earth orbit (LEO) mega-constellations. Several assumptions and technological implementations are considered, as well as the use of low-density parity-check (LDPC) codes together with several constellation orbit configurations.
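
A back-of-the-envelope sketch of the throughput comparison (a textbook stop-and-wait model with assumed per-attempt and post-combining failure probabilities; the paper's LDPC-coded analysis is more involved): combining lowers the conditional failure probability of each retransmission, which raises throughput over a long LEO round trip.

```python
def expected_attempts(fail_probs):
    """fail_probs[k] = P(attempt k+1 fails | all earlier attempts failed).
    Plain SAW ARQ: a constant p. Chase combining: decreasing values, since
    every retransmission is soft-combined with the stored copies."""
    exp_att, p_reach = 0.0, 1.0          # p_reach = P(attempt k is needed)
    for k, pf in enumerate(fail_probs, start=1):
        exp_att += k * p_reach * (1.0 - pf)
        p_reach *= pf
    return exp_att + len(fail_probs) * p_reach  # residual failures use all tries

def saw_throughput(fail_probs, t_packet, rtt):
    """Stop-and-wait keeps a single packet in flight per round trip."""
    return t_packet / (expected_attempts(fail_probs) * (t_packet + rtt))

p = 0.2                                   # per-attempt decoding failure prob.
plain = [p] * 4                           # no combining
cc = [p, 0.05, 0.01, 0.002]               # assumed post-combining failure probs
print(saw_throughput(plain, 1e-3, 20e-3)) # LEO-ish 20 ms round trip
print(saw_throughput(cc, 1e-3, 20e-3))    # combining raises the throughput
```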

Keywords: HARQ, LEO, satellite constellation, throughput

Procedia PDF Downloads 117
1206 Pitfalls and Drawbacks in Visual Modelling of Learning Knowledge by Students

Authors: Tatyana Gavrilova, Vadim Onufriev

Abstract:

The design of knowledge-based systems requires the developer to possess advanced analytical skills. The efficient development of those skills within university courses needs a deep understanding of the main pitfalls and drawbacks that students typically exhibit during their analytical work in the form of visual modeling. Thus, it was necessary to analyze fifth-year students’ learning exercises within the courses 'Intelligent Systems' and 'Knowledge Engineering' at Saint Petersburg Polytechnic University. The analysis shows that both a lack of system-thinking skills and methodological mistakes in course design cause the errors discussed in the paper. The conclusion explores the issues and topics necessary and sufficient for implementing the improved practices in the educational design of future curricula.

Keywords: knowledge based systems, knowledge engineering, students’ errors, visual modeling

Procedia PDF Downloads 287
1205 Human Factors Interventions for Risk and Reliability Management of Defence Systems

Authors: Chitra Rajagopal, Indra Deo Kumar, Ila Chauhan, Ruchi Joshi, Binoy Bhargavan

Abstract:

Reliability and safety are essential for the success of mission-critical and safety-critical defence systems. Humans are part of the entire life cycle of defence systems development and deployment, and the majority of industrial accidents or disasters are attributed to human error. Considerations of human performance and human reliability are therefore critical in all complex systems, including defence systems. Defence systems operating from ground, naval, and aerial platforms in diverse conditions impose unique physical and psychological challenges on human operators. Safety- and mission-critical defence systems with human-machine interaction include fighter planes, submarines, warships, combat vehicles, and missiles launched from aerial and naval platforms. Human roles and responsibilities are also going through a transition due to the infusion of artificial intelligence and cyber technologies. Human operators not accustomed to such challenges are more likely to commit errors, which may lead to accidents or loss events. In such a scenario, it is imperative to understand the human factors in defence systems for better system performance, safety, and cost-effectiveness. A case study using a task analysis (TA) based methodology for the assessment and reduction of human errors in an air and missile defence system, in the context of emerging technologies, is presented. Action-oriented task analysis techniques such as hierarchical task analysis (HTA) and the operator action event tree (OAET) were used, along with the critical action and decision event tree (CADET) for cognitive task analysis. Human factors assessment based on task analysis helps in realizing safe and reliable defence systems. These techniques helped in identifying human errors during different phases of air and missile defence operations, meeting the requirement of a safe, reliable, and cost-effective mission.

Keywords: defence systems, reliability, risk, safety

Procedia PDF Downloads 111
1204 Study of Error Analysis and Sources of Uncertainty in the Measurement of Residual Stresses by X-Ray Diffraction

Authors: E. T. Carvalho Filho, J. T. N. Medeiros, L. G. Martinez

Abstract:

Residual stresses are self-equilibrating elastic stresses that act on the microstructure of a material without the application of an external load. They can be induced by mechanical, thermal, and chemical processes, causing a deformation gradient in the crystal lattice and favoring premature failure in mechanical components. The search for measurements with good reliability has been of great importance for the manufacturing industries. Several methods are able to quantify these stresses according to physical principles and the mechanical behavior of the material. The X-ray diffraction technique is one of the most sensitive to small variations of the crystal lattice, since the X-ray beam interacts with the interplanar distance. Being very sensitive, the technique is also susceptible to variations in the measurements, requiring a study of the factors that influence the final result. Instrumental and operational factors, form deviations of the samples, and the geometry of the analyses are some of the variables that need to be considered and analyzed in order to obtain the true measurement. The aim of this work is to analyze the sources of error inherent to the residual stress measurement process by the X-ray diffraction technique, making an interlaboratory comparison to verify the reproducibility of the measurements. In this work, two specimens were machined, differing from each other by the surface finish: grinding and polishing. Additionally, iron powder with a particle size of less than 45 µm was selected as a reference (as recommended by the ASTM E915 standard) for the tests. To verify the deviations caused by the equipment, the specimens were positioned once and, under the same analysis conditions, seven measurements were carried out at 11 ψ tilts. To verify sample-positioning errors, seven measurements were performed, repositioning the sample for each measurement. To check geometry errors, the measurements were repeated for the Bragg-Brentano and parallel-beam geometries. To verify the reproducibility of the method, the measurements were performed in two different laboratories on different equipment. The results were statistically analyzed and the errors quantified.
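
The conversion from lattice strain at the ψ tilts mentioned above to stress is typically done with the standard sin²ψ method; a schematic sketch (illustrative elastic constants and d-spacings, not the authors' data) is:

```python
import numpy as np

# sin^2(psi) method: the lattice strain eps = (d_psi - d0)/d0 measured at each
# tilt varies linearly with sin^2(psi); the slope m gives the in-plane stress
# via sigma = m * E / (1 + nu).
E, nu, d0 = 210e9, 0.30, 1.1702e-10        # illustrative steel values [Pa, -, m]
psi = np.deg2rad([0, 18, 27, 33, 39, 45])  # tilt angles
d_psi = np.array([1.17020, 1.17026, 1.17032,
                  1.17037, 1.17042, 1.17048]) * 1e-10

eps = (d_psi - d0) / d0
m, _ = np.polyfit(np.sin(psi)**2, eps, 1)  # slope of eps vs sin^2(psi)
sigma = m * E / (1 + nu)                   # residual stress [Pa]
print(f"residual stress ~ {sigma/1e6:.0f} MPa")   # tensile here, ~77 MPa
```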

Keywords: residual stress, x-ray diffraction, repeatability, reproducibility, error analysis

Procedia PDF Downloads 155
1203 A Dynamic Equation for Downscaling Surface Air Temperature

Authors: Ch. Surawut, D. Sukawat

Abstract:

In order to utilize results from global climate models, dynamical and statistical downscaling techniques have been developed. For dynamical downscaling, a limited-area numerical model is usually used, with an associated high computational cost. This research proposes a dynamic equation for specific space-time regional climate downscaling from the Educational Global Climate Model (EdGCM) for Southeast Asia. The equation is for surface air temperature and provides downscaled values at any specific location and time without running a regional climate model. In the proposed equation, surface air temperature is approximated from the ground temperature, the sensible heat flux, and the 2 m wind speed. Results from the application of the equation show that its errors are smaller than the errors of direct interpolation from EdGCM.
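
The abstract does not reproduce the equation itself; a plausible bulk-transfer form consistent with the listed inputs (an assumption, not the authors' published equation) inverts the standard sensible-heat-flux formula:

```python
# Constants: air density, specific heat of air, bulk heat-transfer coefficient
RHO, CP, C_H = 1.2, 1004.0, 1.3e-3

def surface_air_temperature(t_ground, h_flux, u_2m):
    """Downscaled surface air temperature [K] from ground temperature [K],
    sensible heat flux [W/m^2], and 2 m wind speed [m/s], by inverting
    H = rho * cp * C_H * U * (T_ground - T_air)."""
    return t_ground - h_flux / (RHO * CP * C_H * max(u_2m, 0.1))

print(surface_air_temperature(303.15, 40.0, 3.0))   # ~294.6 K
```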

Keywords: dynamic equation, downscaling, inverse distance weight interpolation

Procedia PDF Downloads 280
1202 Developing HRCT Criterion to Predict the Risk of Pulmonary Tuberculosis

Authors: Vandna Raghuvanshi, Vikrant Thakur, Anupam Jhobta

Abstract:

Objective: To design an HRCT criterion to forecast the threat of pulmonary tuberculosis (PTB). Materials and methods: This was a prospective study of 69 patients with clinical suspicion of pulmonary tuberculosis. We studied their medical characteristics, numerous separate HRCT findings, and a combination of HRCT findings to foresee the risk of PTB, using univariate and multivariate analysis. Provisional HRCT diagnostic criteria were designed in view of these outcomes to determine the risk of PTB, and the criteria were tested on our patients. Results: The HRCT chest results were analyzed and ranked from 1 to 4 according to the findings: Rank 1, highly suspected PTB; Rank 2, probable PTB; Rank 3, nonspecific or difficult to differentiate from other diseases; Rank 4, other suspected diseases. Sensitivity, specificity, positive predictive value, and negative predictive value were calculated. Rank 1 (highly suspected TB) was present in 22 (31.9%) patients, all of whom were finally diagnosed with pulmonary tuberculosis; the sensitivity, specificity, and negative likelihood ratio for Rank 1 on HRCT chest were 53.6%, 100%, and 0.43, respectively. Rank 2 (probable TB) was present in 13 patients, of whom 12 were tubercular and 1 was non-tubercular. The sensitivity, specificity, positive likelihood ratio, and negative likelihood ratio for the combination of Rank 1 and Rank 2 were 82.9%, 96.4%, 23.22, and 0.18, respectively. Rank 3 (nonspecific) was present in 25 patients, of whom 7 were tubercular and 18 non-tubercular. When all three ranks were considered together, the sensitivity approached 100%; however, the specificity fell to 35.7%, with positive and negative likelihood ratios of 1.56 and 0, respectively. Rank 4 (other specific findings) was assigned to 9 patients, all of them non-tubercular. Conclusion: HRCT is useful in selecting individuals with greater chances of pulmonary tuberculosis.
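
A quick sketch of how the rank-wise counts translate into sensitivity/specificity-type indices, treating cumulative rank thresholds as test-positive (counts taken from the abstract; no claim is made of reproducing the reported rounding exactly):

```python
# Rank-wise counts from the abstract: rank -> (tubercular, non-tubercular)
ranks = {1: (22, 0), 2: (12, 1), 3: (7, 18), 4: (0, 9)}
total_tb = sum(tb for tb, _ in ranks.values())        # 41
total_non = sum(non for _, non in ranks.values())     # 28

def metrics(threshold):
    """Treat ranks <= threshold as test-positive."""
    tp = sum(ranks[r][0] for r in ranks if r <= threshold)
    fp = sum(ranks[r][1] for r in ranks if r <= threshold)
    sens = tp / total_tb
    spec = (total_non - fp) / total_non
    plr = sens / (1 - spec) if spec < 1 else float('inf')
    nlr = (1 - sens) / spec
    return sens, spec, plr, nlr

for t in (1, 2, 3):
    sens, spec, plr, nlr = metrics(t)
    print(f"ranks<={t}: sens {sens:.1%} spec {spec:.1%} "
          f"+LR {plr:.2f} -LR {nlr:.2f}")
```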

Keywords: pulmonary, tuberculosis, multivariate, HRCT

Procedia PDF Downloads 145
1201 Human Errors in IT Services, HFACS Model in Root Cause Categorization

Authors: Kari Saarelainen, Marko Jantti

Abstract:

Trending the root causes of IT service incidents and problems is an important part of proactive problem management and service improvement. Root causes related to human error are an important category in IT service management as well, although their proportion among root causes is smaller than in other industries. The research problem in this study is how root causes of incidents related to human errors should be categorized in an ITSM organization to effectively support service improvement. Categorization based on IT service management processes and on the Human Factors Analysis and Classification System (HFACS) taxonomy was studied in a case study; HFACS is widely used for human error root cause categorization across many industries. Combining these two categorization models in a two-dimensional matrix was found effective, yet impractical for daily work.
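
A toy illustration of the two-dimensional categorization the study evaluated, with an ITSM process on one axis and an HFACS level on the other (the level names are the standard HFACS tiers; the process labels and counts are hypothetical):

```python
from collections import Counter

HFACS_LEVELS = ["unsafe acts", "preconditions for unsafe acts",
                "unsafe supervision", "organizational influences"]

# Each human-error root cause is filed in one (ITSM process, HFACS level) cell.
matrix = Counter()
root_causes = [                              # hypothetical trend data
    ("change management", "unsafe acts"),
    ("change management", "unsafe supervision"),
    ("incident management", "preconditions for unsafe acts"),
]
for process, level in root_causes:
    matrix[(process, level)] += 1

for (process, level), n in matrix.items():   # the 2D trend report
    print(f"{process:>20} x {level}: {n}")
```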

Keywords: IT service management, ITIL, incident, problem, HFACS, swiss cheese model

Procedia PDF Downloads 458
1200 Assessment of Students' Skills in Error Detection in SQL Classes Using a Rubric Framework - An Empirical Study

Authors: Dirson Santos De Campos, Deller James Ferreira, Anderson Cavalcante Gonçalves, Uyara Ferreira Silva

Abstract:

Rubrics in learning research provide evaluation criteria and expected performance standards linked to defined student activities, serving learning and pedagogical objectives. Although rubrics are used in education at all levels, academic literature on rubrics as a tool to support research in SQL education is quite rare. There is a large class of SQL queries that are syntactically correct, but certainly not all of them are semantically correct; detecting and correcting such errors is a recurring problem in SQL education. In this paper, we use the Rubric Abstract Framework (RAF), which consists of steps that allow us to map the information needed to measure student performance, guided by the didactic objectives defined by the teacher, as long as the domain modeling is contextualized by the rubric. An empirical study was done that demonstrates how rubrics can mitigate student difficulties in finding logical errors and ease the teacher's workload in SQL education. Detecting and correcting logical errors is an important skill for students, and researchers have proposed several ways to improve SQL education because these skills are crucial in software engineering and computer science. The RAF was instantiated in an empirical study conducted during the COVID-19 pandemic in a database course. The pandemic transformed face-to-face education into remote education without in-person classes; the lab activities were conducted remotely, which hinders the teaching-learning process and, in particular for this research, the verification of evidence or statements of the knowledge, skills, and abilities (KSAs) of students. Much research in academia and industry involves databases. The innovation proposed in this paper is the approach used, in which the results obtained when using rubrics to map logical errors in query formulation are analyzed, with the gains obtained by students empirically verified. The approach can be used in the post-pandemic period in both classroom and distance learning.
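
An example of the kind of logical error such a rubric targets (illustrative schema and queries, not taken from the study): both statements below are syntactically valid, but only one matches the intent "total order value per customer, including customers with no orders".

```python
# Syntactically correct but semantically wrong: the INNER JOIN silently drops
# customers that have no orders, so they never appear with a zero total.
wrong = """
SELECT c.name, SUM(o.amount) AS total
FROM customer c JOIN orders o ON o.customer_id = c.id
GROUP BY c.name;
"""

# Logically correct for the stated intent: LEFT JOIN keeps order-less
# customers, and COALESCE maps their NULL sum to 0.
right = """
SELECT c.name, COALESCE(SUM(o.amount), 0) AS total
FROM customer c LEFT JOIN orders o ON o.customer_id = c.id
GROUP BY c.name;
"""
```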

Keywords: rubric, logical error, structured query language (SQL), empirical study, SQL education

Procedia PDF Downloads 161
1199 Annual Water Level Simulation Using Support Vector Machine

Authors: Maryam Khalilzadeh Poshtegal, Seyed Ahmad Mirbagheri, Mojtaba Noury

Abstract:

In this paper, yearly input data on rainfall, temperature, and inflow to Lake Urmia were used to simulate water level fluctuation by means of three models. In the context of climate change investigations, the fluctuation of lake water levels is of high interest. This study investigates data-driven models: the support vector machine (SVM), a new regression procedure in water resources, is applied to the yearly level data of Lake Urmia, the largest lake in Iran and a hypersaline one. The evaluated lake levels are found to be in good correlation with the observed values, and the results of the SVM simulation show better accuracy and implementation. The mean square error, mean absolute relative error, and coefficient of determination statistics are used as comparison criteria.
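
A minimal sketch of this kind of data-driven model, using scikit-learn's SVR on made-up yearly predictors (the paper's data, kernel settings, and error statistics are not reproduced here):

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.metrics import mean_squared_error, r2_score

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 3))            # yearly [rainfall, temperature, inflow]
level = (1270 + 0.8*X[:, 0] - 0.5*X[:, 1] + 1.1*X[:, 2]
         + rng.normal(0, 0.1, 40))      # synthetic water levels [m]

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.05))
model.fit(X[:30], level[:30])           # train on the first 30 years
pred = model.predict(X[30:])            # predict the held-out years
print(mean_squared_error(level[30:], pred), r2_score(level[30:], pred))
```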

Keywords: simulation, water level fluctuation, urmia lake, support vector machine

Procedia PDF Downloads 339
1198 Enhancing Nursing Students’ Communication Using TeamSTEPPS to Improve Patient Safety

Authors: Stefanie Santorsola, Natasha Frank

Abstract:

Improving healthcare safety necessitates examining current trends and beliefs about safety and devising strategies for improvement. Errors in healthcare continue to increase and to be experienced by patients; they are preventable and directly correlated with breakdowns in healthcare communication. TeamSTEPPS is an evidence-based process designed to improve the quality and safety of healthcare by improving communication and team processes. Communication is at the core of effective team collaboration and is vital for patient safety. TeamSTEPPS offers insights and strategies for improving communication and teamwork and reducing preventable errors to create a safer healthcare environment for patients. The academic, clinical, and educational environment is vital in preparing nursing students for professional practice by providing them with foundational knowledge and abilities. It also provides a prime opportunity to learn about errors and the importance of effective communication for patient safety, as nursing students are often unprepared to deal with errors. Proactively introducing and discussing errors through a supportive culture early in a nursing student's academic career has the potential to carry key concepts into practice and thereby enhance patient safety. TeamSTEPPS has been used globally and has had a collectively positive impact on patient safety and teamwork. A workshop study was introduced in winter 2023 with registered practical nurse (RPN) students bridging to the baccalaureate nursing program; the majority of the RPNs in the bridging program were actively employed in a variety of healthcare facilities during the semester. The workshop study received academic institution ethics board approval, and participants signed a consent form prior to participating in the study. The premise of the workshop was to introduce TeamSTEPPS and a variety of its strategies to these students, to have them incorporate the presented communication strategies in their practicum setting, and to have them keep a reflective journal on the effects and outcomes of the strategies in the healthcare setting. Findings from the workshop study supported the objective of the project: students verbalized notable improvements in team functioning in the healthcare environment resulting from the enhanced communication strategies of TeamSTEPPS introduced in the workshop study. The implication for educational institutions is the potential to further advance the safety literacy and abilities of nursing students, preparing them to enter the workforce and improving safety for patients.

Keywords: teamstepps, education, patient safety, communication

Procedia PDF Downloads 33
1197 Influence and Dissemination of Solecism among Moroccan High School and University Students

Authors: Rachid Ed-Dali, Khalid Elasri

Abstract:

Mass media seem to provide rich content for language acquisition. Exposure to television, the Internet, the mobile phone, and other technological gadgets and devices helps enrich the student's lexicon both positively and negatively. The difficulties encountered by students while learning and acquiring second languages, in addition to their eagerness to comprehend the content of a particular program, prompt them to diversify their methods so as to achieve their targets. The present study highlights the significance of certain media channels and their involvement in language acquisition, employing the Natural Approach to further grasp whether students, especially secondary and high school students, learn and acquire errors through watching subtitled television programs. The chief objective is to investigate the deductive and inductive relevance of certain programs, besides the involvement of peripheral learning, in the acquisition of mistakes.

Keywords: errors, mistakes, Natural Approach, peripheral learning, solecism

Procedia PDF Downloads 95
1196 The Use of Surveys to Combat Fake News in Media Literacy Education

Authors: Jaejun Jong

Abstract:

Fake news has recently become a serious international problem, and researchers and policymakers worldwide have sought to understand it and develop strategies to combat it. This study consists of two primary parts: (1) a literature review of how surveys have been used to understand fake news and identify the problems it causes, and (2) a discussion of how surveys have been used to fight back against fake news in educational settings. The second part specifically analyzes surveys used to evaluate a South Korean elementary school program designed to improve students' metacognition and critical thinking, and seeks to identify potential problems that may occur in the elementary school setting. The literature review shows that surveys can help people understand fake news by its traits rather than by its definition, given the lack of agreement on a definition of fake news. It also shows that people are not good at identifying fake news or at evaluating their own ability to identify it; indeed, they are more likely to share information that aligns with their prior beliefs. In addition, the elementary school survey data show that there may be substantial errors in the program evaluation process, likely caused by processing errors or the survey procedure, though the exact cause is not specified. Such a significant error in evaluating the effects of an educational program prevents teachers from making proper decisions and from accurately evaluating the program. Identifying the source of such errors would therefore improve the overall quality of education, benefiting both teachers and students.

Keywords: critical thinking, elementary education, program evaluation, survey

Procedia PDF Downloads 77
1195 Comparative Study on the Evaluation of Patient Safety in Malaysian Retail Pharmacy Setup

Authors: Palanisamy Sivanandy, Tan Tyng Wei, Tan Wee Loon, Lim Chong Yee

Abstract:

Background: Patient safety has become a major concern in recent years owing to elevated medication errors, particularly prescribing and dispensing errors. Meticulous prescription screening and diligent drug dispensing are therefore important to prevent drug-related adverse events from inflicting harm on patients; hence, pharmacists play a significant role in this scenario. The evaluation of patient safety in a pharmacy setup is crucial for understanding the current practices, attitudes, and perceptions of pharmacists towards patient safety. Method: The questionnaire of the Pharmacy Survey on Patient Safety Culture developed by the Agency for Healthcare Research and Quality (AHRQ) was used to assess patient safety. The main objective of the study was to evaluate the attitude and perception of pharmacists towards patient safety in retail pharmacy setups in Malaysia. Results: 417 questionnaires were distributed via convenience sampling in three different states of Malaysia; 390 participants responded, giving a response rate of 93.52%. The overall positive response rate (PRR) ranged from 31.20% to 87.43%, and the average PRR was found to be 67%. The overall patient safety grade for the pharmacies was appreciable, ranging from good to very good. The study found a significant difference between the perceptions of senior and junior pharmacists towards patient safety. The internal consistency of the questionnaire contents/dimensions was satisfactory (Cronbach's alpha = 0.92). Conclusion: The results reflect a positive attitude and perception of retail pharmacists towards patient safety. Despite this, various efforts can be implemented in the future to amplify patient safety in retail pharmacy setups.

Keywords: patient safety, attitude, perception, positive response rate, medication errors

Procedia PDF Downloads 302
1194 Error Detection and Correction for Onboard Satellite Computers Using Hamming Code

Authors: Rafsan Al Mamun, Md. Motaharul Islam, Rabana Tajrin, Nabiha Noor, Shafinaz Qader

Abstract:

In an attempt to enrich the lives of billions of people by providing proper information, security, and a way of communicating with others, the need for efficient and improved satellites is constantly growing. Thus, there is an increasing demand for better error detection and correction (EDAC) schemes capable of protecting the data onboard satellites. This paper is aimed at detecting and correcting such errors using the Hamming code, which uses the concept of parity and parity bits to correct single-bit errors onboard a satellite in low earth orbit. The paper focuses on the study of low earth orbit satellites and on generating the Hamming code matrix to be used for EDAC using computer programs. The most effective version generated was Hamming (16, 11, 4), implemented in MATLAB, and the paper compares this particular scheme with other EDAC mechanisms, including other versions of Hamming codes and the cyclic redundancy check (CRC), along with the scheme's limitations. This version of the Hamming code guarantees single-bit error correction as well as double-bit error detection. Furthermore, it has proved to be fast, with a checking time of 5.669 nanoseconds, has a relatively higher code rate and lower bit overhead than the other versions, and can detect a greater percentage of errors per length of code than other EDAC schemes with similar capabilities. In conclusion, with proper implementation of the system, it is quite possible to ensure a relatively uncorrupted satellite storage system.
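
A compact sketch of the SECDED behaviour of Hamming (16, 11, 4): parity bits at the power-of-two positions plus one overall parity bit give single-error correction and double-error detection (the bit layout and helper names are illustrative, not the authors' MATLAB implementation).

```python
def encode(data_bits):
    """Encode 11 data bits into a 16-bit SECDED Hamming codeword.
    Index 0 holds the overall parity; positions 1..15 form Hamming(15, 11)."""
    assert len(data_bits) == 11
    code = [0] * 16
    parity_pos = {1, 2, 4, 8}
    it = iter(data_bits)
    for pos in range(1, 16):
        if pos not in parity_pos:
            code[pos] = next(it)
    for p in parity_pos:                 # parity p covers positions with bit p set
        code[p] = sum(code[i] for i in range(1, 16) if i & p) % 2
    code[0] = sum(code) % 2              # overall parity -> minimum distance 4
    return code

def decode(code):
    """Return (data_bits, status): 'ok', 'corrected', 'corrected_parity',
    or (None, 'double_error') when two bit-flips are detected."""
    syndrome = 0
    for p in (1, 2, 4, 8):
        if sum(code[i] for i in range(1, 16) if i & p) % 2:
            syndrome |= p
    overall = sum(code) % 2
    if syndrome and overall:             # single error: syndrome = its position
        code = code[:]
        code[syndrome] ^= 1
        status = 'corrected'
    elif syndrome:                       # syndrome set, overall parity OK
        return None, 'double_error'      # -> two errors: detectable only
    else:
        status = 'ok' if not overall else 'corrected_parity'
    data = [code[i] for i in range(1, 16) if i not in (1, 2, 4, 8)]
    return data, status

data = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0]
cw = encode(data)
cw[9] ^= 1                               # inject a single bit-flip
assert decode(cw) == (data, 'corrected')
```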

Keywords: bit-flips, Hamming code, low earth orbit, parity bits, satellite, single event upset

Procedia PDF Downloads 106
1193 Malposition of Femoral Component in Total Hip Arthroplasty

Authors: Renate Krassnig, Gloria M. Hohenberger, Uldis Berzins, Stefen Fischerauer

Abstract:

Background: Only a few reports discuss the effectiveness of intraoperative radiographs for placing femoral components; therefore, there is no international standard for using intraoperative imaging in total hip replacement procedures. Method: Case report. An 84-year-old female patient underwent exchange of the components of her total hip arthroplasty (THA) because of aseptic loosening. Due to the circumstances, the surgeon decided to implant a cemented femoral component. The procedure was without any significant abnormalities. The first postoperative radiograph was planned after recovery, as usual. The X-ray imaging showed a misplaced femoral component, so a CT scan was performed additionally, and the malposition of the cemented femoral component was confirmed. The patient had to undergo another surgery: removal of the cemented femoral component and implantation of a new, well-placed one. Conclusion: Intraoperative imaging of the femoral component is not a common standard, but this case shows that it is a useful method for detecting errors and gives the surgeon the opportunity to correct them intraoperatively.

Keywords: femoral component, intraoperative imaging, malposition, revision

Procedia PDF Downloads 181
1192 Assessment of Time-variant Work Stress for Human Error Prevention

Authors: Hyeon-Kyo Lim, Tong-Il Jang, Yong-Hee Lee

Abstract:

For an operator in a nuclear power plant, human error is one of the most dreaded factors that may result in unexpected accidents. The probability of human errors may be low, but their risk would be unimaginably enormous. Thus, for accident prevention, it is indispensable to analyze the influence of any factors that may raise the probability of human errors. Over the past decades, many research results have shown that the performance of human operators may vary over time due to a variety of factors. Among them, stress is known to be an indirect factor that may cause human errors and result in mental illness. Many assessment tools have been developed to assess the stress level of human workers. However, it is still questionable to utilize them for anticipating human performance, which is related to human error probability, because they were developed mainly from the viewpoint of mental health rather than industrial safety. The stress level of a person may go up or down with work time. In that sense, for such tools to be applicable to the safety domain, they should at least be able to assess the variation resulting from work time. This study therefore compared their applicability for safety purposes. More than ten kinds of work stress tools were analyzed with reference to their assessment items, assessment and analysis methods, and follow-up measures, which are known to be factors closely related to work stress. The results showed that most tools concentrate their weights on a few common organizational factors such as demands, support, and relationships, in that order, and their weights were broadly similar. However, they failed to recommend practical solutions; instead, they merely advised setting up overall counterplans in a PDCA cycle or in risk management activities, which is far from practical human error prevention. Thus, it was concluded that applying stress assessment tools developed mainly for mental health is impractical for safety purposes with respect to anticipating human performance, and that the development of a new assessment tool is inevitable if one wants to assess stress levels in terms of human performance variation and accident prevention. As a consequence, and as a practical counterplan, this study proposes a new scheme for assessing the work stress level of a human operator as it varies over work time, which is closely related to the probability of human errors.

Keywords: human error, human performance, work stress, assessment tool, time-variant, accident prevention

Procedia PDF Downloads 647
1191 Precise Determination of the Residual Stress Gradient in Composite Laminates Using a Configurable Numerical-Experimental Coupling Based on the Incremental Hole Drilling Method

Authors: A. S. Ibrahim Mamane, S. Giljean, M.-J. Pac, G. L’Hostis

Abstract:

Fiber-reinforced composite laminates are particularly subject to residual stresses due to their heterogeneity and the complex chemical, mechanical, and thermal mechanisms that occur during their processing. Residual stresses are now well known to cause damage accumulation, shape instability, and behavior disturbance in composite parts. Many works exist in the literature on techniques for minimizing residual stresses, mainly in thermosetting and thermoplastic composites. To study in depth the influence of processing mechanisms on the formation of residual stresses, and to minimize them by establishing a reliable correlation, it is essential to be able to measure the residual stress profile in the composite very precisely. Residual stresses are important data to consider when sizing composite parts and predicting their behavior. Incremental hole drilling is very effective for measuring the residual stress gradient in composite laminates. This semi-destructive method consists of drilling a hole incrementally through the thickness of the material and measuring the relaxation strains around the hole for each increment using three strain gauges. These strains are then converted into residual stresses using a matrix of coefficients. These coefficients, called calibration coefficients, depend on the diameter of the hole and the dimensions of the gauges used. The reliability of incremental hole drilling depends on the accuracy with which the calibration coefficients are determined. These coefficients are calculated using a finite element model, and the samples' features and the experimental conditions must be reflected in the simulation; any mismatch can lead to inadequate calibration coefficients, thus introducing errors in the residual stresses. Several calibration-coefficient correction methods exist for isotropic materials, but there is a lack of information on this subject for composite laminates. In this work, a Python program was developed to automatically generate the adequate finite element model. This model allowed us to perform a parametric study assessing the influence of experimental errors on the calibration coefficients. The results highlighted the sensitivity of the calibration coefficients to the considered errors and gave an order of magnitude of the precision required of the experimental device for reliable measurements. On the basis of these results, improvements to the experimental device were proposed. Furthermore, a numerical method was proposed to correct the calibration coefficients for different types of materials, including thick composite parts for which the analytical approach is too complex. This method consists of taking the experimental errors into account in the simulation. Accurate measurement of the experimental errors (such as the eccentricity of the hole, the angular deviation of the gauges from their theoretical position, or errors in increment depth) is therefore necessary. The aim is to determine the residual stresses more precisely and to expand the validity domain of the incremental hole drilling technique.
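
In the integral formulation underlying the method, the measured relaxation strains relate linearly to the stress in each depth increment through the calibration-coefficient matrix, so the inversion is a small linear solve; a schematic sketch with an illustrative matrix and strain readings (not the paper's FE-computed coefficients):

```python
import numpy as np

# a[i, j]: strain registered after drilling increment i caused by a unit
# stress in increment j (from the finite element calibration model); lower
# triangular because later increments cannot influence earlier readings.
a = np.array([[1.0, 0.0, 0.0],
              [1.3, 0.9, 0.0],
              [1.5, 1.2, 0.8]]) * 1e-6        # illustrative values [1/MPa]
eps = np.array([-120.0, -250.0, -360.0]) * 1e-6  # measured relaxation strains

sigma = np.linalg.solve(a, eps)               # residual stress per increment [MPa]
print(sigma)                                  # the through-thickness gradient
```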

Keywords: fiber reinforced composites, finite element simulation, incremental hole drilling method, numerical correction of the calibration coefficients, residual stresses

Procedia PDF Downloads 111
1190 Multilayer Neural Network and Fuzzy Logic Based Software Quality Prediction

Authors: Sadaf Sahar, Usman Qamar, Sadaf Ayaz

Abstract:

In the software development lifecycle, quality prediction techniques hold prime importance for minimizing future design errors and expensive maintenance. Many techniques have been proposed by various researchers, but with the increasing complexity of the software lifecycle model, it is crucial to develop a flexible system that can cater for the factors that impact the quality of the end product. These factors include properties of the software development process and of the product, along with its operating conditions. In this paper, a neural network (perceptron) based software quality prediction technique is proposed. Using this technique, stakeholders can predict the quality of the resulting software during the early phases of the lifecycle, saving time and resources otherwise spent on the elimination of design errors and costly maintenance. This technique can be brought into practical use through successful training.
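
A minimal sketch of a perceptron-based quality predictor of this kind, using scikit-learn's multilayer perceptron on hypothetical process/product metrics (the paper's feature set and its fuzzy-logic component are not reproduced):

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
# Hypothetical per-module features:
# [complexity, code churn, review coverage, defect density]
X = rng.random((200, 4))
y = (0.5*X[:, 0] + 0.4*X[:, 1] - 0.6*X[:, 2] + 0.3*X[:, 3] > 0.35).astype(int)

clf = MLPClassifier(hidden_layer_sizes=(8, 4), max_iter=2000, random_state=1)
clf.fit(X[:150], y[:150])                       # train on early-phase data
print("held-out accuracy:", clf.score(X[150:], y[150:]))
```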

Keywords: software quality, fuzzy logic, perceptron, prediction

Procedia PDF Downloads 290
1189 Integrating Deterministic and Probabilistic Safety Assessment to Decrease Risk & Energy Consumption in a Typical PWR

Authors: Ebrahim Ghanbari, Mohammad Reza Nematollahi

Abstract:

Integrating deterministic and probabilistic safety assessment (IDPSA) is one of the most common approaches in the safety analysis of power plant accidents. It is also recognized today that the role of human error in creating these accidents is no less than that of systemic errors, so including human interventions and system errors in fault and event sequences is necessary. The integration of these analytical topics is reflected in the core damage frequency and also in the study of the use of water resources in an accident such as the loss of all electrical power of the plant. In this regard, a station blackout (SBO) accident was simulated for a pressurized water reactor in the deterministic analysis, and by analyzing the operator's behavior in controlling the accident, the results of the combined deterministic and probabilistic assessment were identified. The results showed that the best performance of the plant operator would reduce the risk of the accident by 10%, as well as decrease the use of the plant's water sources by 6.82 liters per second.

Keywords: IDPSA, human error, SBO, risk

Procedia PDF Downloads 105
1188 Evaluating Probable Bending of Frames for Near-Field and Far-Field Records

Authors: Majid Saaly, Shahriar Tavousi Tafreshi, Mehdi Nazari Afshar

Abstract:

Most reinforced concrete structures designed only for heavy loads have large transverse reinforcement spacings and therefore suffer severe failure after intense ground movements. The main goal of this paper is to compare the shear and axial failure of concrete bending frames typical of Tehran using incremental dynamic analysis (IDA) under near-field and far-field records. For this purpose, IDA of 5-, 10-, and 15-story concrete structures was carried out under seven far-fault records and five near-fault records. The results show that in two-dimensional models of short-rise, mid-rise, and high-rise reinforced concrete frames located on Type-3 soil, increasing the transverse reinforcement spacing can increase the maximum inter-story drift ratio by up to 37%. According to the results for the 5-, 10-, and 15-story reinforced concrete models on Type-3 soil, records with characteristics such as fling step and directivity create larger maximum inter-story drift values than far-fault earthquakes. The results also indicated that under seismic excitation involving directivity or fling step, the failure probability values and the rate at which the failure probability increases are much smaller than the corresponding values for far-fault earthquakes; however, for near-fault records, the probability of exceedance occurs at lower seismic intensities than for far-fault records.

Keywords: IDA, failure curve, directivity, maximum floor drift, fling step, evaluating probable bending of frames, near-field and far-field earthquake records

Procedia PDF Downloads 78
1187 Towards a Complete Automation Feature Recognition System for Sheet Metal Manufacturing

Authors: Bahaa Eltahawy, Mikko Ylihärsilä, Reino Virrankoski, Esko Petäjä

Abstract:

Sheet metal processing is automated, but the step from product models to production machine control still requires human intervention. This may cause time-consuming bottlenecks in the production process and increase the risk of human errors. In this paper we present a system which automatically recognizes features from the CAD model of a sheet metal product. Using these features, the system produces a complete model of the particular sheet metal product, which is then used as input for the sheet metal processing machine. Currently the system is implemented, is capable of recognizing more than 11 of the most common sheet metal structural features, and the procedure is fully automated. This provides remarkable savings in production time and protects against human errors. This paper presents the developed system architecture, the applied algorithms, and the system software implementation and testing.

Keywords: feature recognition, automation, sheet metal manufacturing, CAD, CAM

Procedia PDF Downloads 328
1186 Enzymatic Repair Prior To DNA Barcoding, Aspirations, and Restraints

Authors: Maxime Merheb, Rachel Matar

Abstract:

Retrieving ancient DNA sequences, which in turn permit entire-genome sequencing from fossils, has improved extraordinarily in recent years thanks to sequencing technology and other methodological advances. Even so, the quest for ancient DNA is still obstructed by the damage inflicted on DNA, which accumulates after the death of a living organism. This damage can be characterized in three main categories: (i) physical abnormalities, such as strand breaks, which lead to the presence of short DNA fragments; (ii) modified bases (mainly cytosine deamination), which cause errors in the sequence due to the incorporation of a false nucleotide during DNA amplification; and (iii) DNA modifications referred to as blocking lesions, which halt PCR extension and in turn affect the amplification and sequencing process. The issues arising from breakage and coding errors have clearly been reduced in recent years: fast sequencing of short DNA fragments has been empowered by high-throughput sequencing platforms, and most coding errors have been found to be consequences of cytosine deamination, which can easily be removed from the DNA by enzymatic treatment. The methodology for repairing DNA sequences is still in development and can basically be explained as the process of reintroducing cytosine rather than uracil; this technique is thus restricted to amplified DNA molecules. Eliminating every type of damage (particularly the types that block PCR) still awaits complete repair methodologies, so DNA detection right after extraction is highly needed. Before investing resources in extensive, unreasonable, and uncertain repair techniques, it is vital to distinguish between two possible hypotheses: (i) the DNA is nonexistent to begin with and therefore completely unrepairable, or (ii) the DNA is refractory to PCR and is worth repairing and amplifying. Hence, it is extremely important to develop a non-enzymatic technique to detect the most degraded DNA.

Keywords: ancient DNA, DNA barcoding, enzymatic repair, PCR

Procedia PDF Downloads 382
1185 Improved Pitch Detection Using Fourier Approximation Method

Authors: Balachandra Kumaraswamy, P. G. Poonacha

Abstract:

Automatic music information retrieval has been one of the challenging topics of research for a few decades now, with several interesting approaches reported in the literature. In this paper we develop a pitch extraction method based on a finite Fourier series approximation to a given window of samples, estimating pitch as the fundamental period of that approximation. The method analyzes the strength of the harmonics present in the signal to reduce octave as well as harmonic errors. Its performance is compared with three of the best-known pitch extraction methods, namely YIN, windowed special normalization of the auto-correlation function, and the harmonic product spectrum. Our study with artificially created signals as well as music files shows that the Fourier approximation method gives a much better estimate of pitch, with fewer octave and harmonic errors.
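
A rough sketch of harmonic-strength pitch scoring in the same spirit: an FFT-based harmonic summation with 1/k weighting to discourage octave errors (the paper's finite-Fourier-series estimator differs in detail; this is an illustrative stand-in).

```python
import numpy as np

def pitch_by_harmonic_strength(x, fs, fmin=50.0, fmax=1000.0, n_harm=5):
    """Score each candidate fundamental by the 1/k-weighted spectral magnitude
    at its first n_harm harmonics and return the best-scoring frequency."""
    spec = np.abs(np.fft.rfft(x * np.hanning(len(x))))
    freqs = np.fft.rfftfreq(len(x), 1.0 / fs)
    cands = freqs[(freqs >= fmin) & (freqs <= fmax)]
    def score(f0):
        bins = [int(round(k * f0 * len(x) / fs)) for k in range(1, n_harm + 1)]
        return sum(spec[b] / k for k, b in enumerate(bins, 1) if b < len(spec))
    return max(cands, key=score)

fs = 8000
t = np.arange(2048) / fs
x = np.sin(2*np.pi*220*t) + 0.5*np.sin(2*np.pi*440*t)  # 220 Hz tone + octave
print(pitch_by_harmonic_strength(x, fs))                # ~220 Hz, not 440
```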

Keywords: pitch, fourier series, yin, normalization of the auto-correlation function, harmonic product, mean square error

Procedia PDF Downloads 389
1184 A Cost-Benefit Analysis of Routinely Performed Transthoracic Echocardiography in the Setting of Acute Ischemic Stroke

Authors: John Rothrock

Abstract:

Background: The role of transthoracic echocardiography (TTE) in the diagnosis and management of patients with acute ischemic stroke remains controversial. While many stroke subspecialists reserve TTE for selected patients, others consider the procedure obligatory for most or all acute stroke patients. This study was undertaken to assess the cost vs. benefit of 'routine' TTE. Methods: We examined a consecutive series of patients who were admitted to a single institution in 2019 for acute ischemic stroke and underwent TTE. We sought to determine the frequency with which the results of TTE led to a new diagnosis of cardioembolism, redirected therapeutic cerebrovascular management, and at least potentially influenced the short- or long-term clinical outcome. We recorded the direct cost associated with TTE. Results: There were 1076 patients in the study group, all of whom underwent TTE. TTE identified an unsuspected source of possible/probable cardioembolism in 62 patients (6%), confirmed an initially suspected source (primarily endocarditis) in an additional 13 (1%), and produced findings that stimulated subsequent testing diagnostic of possible/probable cardioembolism in 7 patients (<1%). TTE results potentially influenced the clinical outcome in a total of 48 patients (4%). With a total direct cost of $1.51 million, the mean cost per case wherein TTE results potentially influenced the clinical outcome in a positive manner was $31,375. Diagnostically and therapeutically, TTE was most beneficial in the 67 patients under the age of 55 who presented with 'cryptogenic' stroke, identifying patent foramen ovale (PFO) in 21 (31%); closure was performed in 19. Conclusions: The utility of TTE in the setting of acute ischemic stroke is modest, with its yield greatest in younger patients with cryptogenic stroke. Given the greater sensitivity of transesophageal echocardiography in detecting PFO and evaluating the aortic arch, TTE's role in stroke diagnosis would appear to be limited.

Keywords: cardioembolic, cost-benefit, stroke, TTE

Procedia PDF Downloads 99
1183 Use of a Symptom Scale Based on Degree of Functional Impairment for Acute Concussion

Authors: Matthew T. McCarthy, Sarah Janse, Natalie M. Pizzimenti, Anthony K. Savino, Brian Crosser, Sean C. Rose

Abstract:

Concussion is diagnosed clinically using a comprehensive history and exam, supported by ancillary testing. Symptom checklists are frequently used as part of the evaluation of concussion, but existing symptom scales are based on a subjective Likert scale, without relating symptoms to clinical or functional impairment. This is a retrospective review of 133 patients under age 30 seen in an outpatient neurology practice within 30 days of a probable or definite concussion. Each patient completed two symptom checklists at the initial visit: the SCAT-3 symptom evaluation (22 symptoms, 0-6 scale) and a scale based on the degree of clinical impairment for each symptom (22 symptoms, 0-3 scale related to the functional impact of the symptom). The final clearance date was determined by the treating physician. 60.9% of patients were male, with a mean age of 15.7 years (SD 2.3). The mean time from concussion to first visit was 6.9 days (SD 6.2); 101 patients had definite concussions (75.9%) and 32 were diagnosed as probable (24.1%). 94 patients had a known clearance date (70.7%), with a mean clearance time of 20.6 days (SD 18.6) and a median clearance time of 19 days (95% CI 16-21). The mean total symptom score was 27.2 (SD 22.9) on the SCAT-3 and 14.7 (SD 11.9) on the functional impairment scale. Pearson's correlation between the two scales was 0.98 (p < 0.001). After adjusting for patient and injury characteristics, an equivalent increase in score on each scale was associated with a longer time to clearance (SCAT-3 hazard ratio 0.885, 95% CI 0.835-0.938, p < 0.001; functional impairment scale hazard ratio 0.851, 95% CI 0.802-0.902, p < 0.001). A concussion symptom scale based on the degree of functional impairment correlates strongly with the SCAT-3 scale and demonstrates a similar association with time to clearance. By assessing the degree of impact on clinical functioning, this symptom scale reflects a more intuitive approach to rating symptoms and can be used in the management of concussion.

Keywords: checklist, concussion, neurology, scale, sports, symptoms

Procedia PDF Downloads 128
1182 Application of a Universal Distortion Correction Method in Stereo-Based Digital Image Correlation Measurement

Authors: Hu Zhenxing, Gao Jianxin

Abstract:

Stereo-based digital image correlation (also referred to as three-dimensional (3D) digital image correlation (DIC)) is a technique for measuring both the 3D shape and the surface deformation of a component, and it has found increasing application in academia and industry. The accuracy of the reconstructed coordinates depends on many factors, such as the configuration of the setup, stereo-matching, distortion, etc. Most of these factors have been investigated in the literature. For instance, the configuration of a binocular vision system determines the systematic errors, while the stereo-matching errors depend on the speckle quality and the matching algorithm and can only be controlled within a limited range. The distortion, however, is non-linear, particularly in a complex imaging acquisition system, so the distortion correction should be carefully considered. Moreover, the distortion function is difficult to formulate with conventional models in a complex imaging acquisition system involving microscopes and other complex lenses, and the errors of the distortion correction propagate to the reconstructed 3D coordinates. To address the problem, an accurate mapping method based on 2D B-spline functions is proposed in this study. The mapping functions are used to convert the distorted coordinates onto an ideal plane free of distortion. This approach is suitable for any image acquisition distortion model; it is used as a prior process that converts a distorted coordinate to its ideal position, enabling the camera to conform to the pin-hole model. A procedure for applying this approach to stereo-based DIC is presented. Using 3D speckle image generation, numerical simulations were carried out to compare the accuracy of the conventional method and the proposed approach.
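
A minimal sketch of such a 2D B-spline mapping using SciPy (synthetic radial distortion stands in for a real calibration grid; the paper's spline construction may differ):

```python
import numpy as np
from scipy.interpolate import SmoothBivariateSpline

# Synthetic calibration data: ideal grid points (xi, yi) and their radially
# distorted image positions (xd, yd) -- an assumption replacing a real target.
xi, yi = [g.ravel() for g in np.meshgrid(np.linspace(-1, 1, 12),
                                         np.linspace(-1, 1, 12))]
r2 = xi**2 + yi**2
xd, yd = xi * (1 + 0.05 * r2), yi * (1 + 0.05 * r2)

# Fit one bivariate B-spline per coordinate: distorted plane -> ideal plane.
fx = SmoothBivariateSpline(xd, yd, xi, kx=3, ky=3)
fy = SmoothBivariateSpline(xd, yd, yi, kx=3, ky=3)

# Undistort a measured point before stereo triangulation:
u, v = 0.63, -0.41
print(fx.ev(u, v), fy.ev(u, v))   # close to the ideal, distortion-free position
```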

Keywords: distortion, stereo-based digital image correlation, b-spline, 3D, 2D

Procedia PDF Downloads 476
1181 Design of a Pneumonia Ontology for Diagnosis Decision Support System

Authors: Sabrina Azzi, Michal Iglewski, Véronique Nabelsi

Abstract:

The diagnosis error problem is frequent and is one of the most important safety problems today. One of the main objectives of our work is to propose an ontological representation that takes the diagnostic criteria into account in order to improve diagnosis. We chose pneumonia since it is one of the diseases frequently affected by diagnosis errors, with harmful effects on patients. To achieve our aim, we use a semi-automated method to integrate diverse knowledge sources, including publicly available pneumonia disease guidelines from international repositories, biomedical ontologies, and electronic health records. We follow the principles of the Open Biomedical Ontologies (OBO) Foundry. The resulting ontology covers symptoms and signs, all types of pneumonia, antecedents, pathogens, and diagnostic testing. The first evaluation results show that most of the terms are covered by the ontology. This work is still in progress and represents a first and major step toward the development of a diagnosis decision support system for pneumonia.

Keywords: Clinical decision support system, Diagnostic errors, Ontology, Pneumonia

Procedia PDF Downloads 163
1180 A Tutorial on Model Predictive Control for Spacecraft Maneuvering Problem with Theory, Experimentation and Applications

Authors: O. B. Iskender, K. V. Ling, V. Dubanchet, L. Simonini

Abstract:

This paper discusses the recent advances and future prospects of spacecraft position and attitude control using model predictive control (MPC). First, the challenges of space missions are summarized, in particular the errors, uncertainties, and constraints imposed by the mission, the spacecraft, and the onboard processing capabilities. Space mission errors and uncertainties are summarized in categories: initial condition errors, unmodeled disturbances, and sensor and actuator errors; the constraints are classified into two categories, physical and geometric. Real-time implementation capability is then discussed with regard to the required computation time and the impact of sensor and actuator errors, based on hardware-in-the-loop (HIL) experiments. The rationales behind the scenarios are also presented in the scope of space applications such as formation flying, attitude control, rendezvous and docking, rover steering, and precision landing. The objectives of these missions are explained, and the generic constrained MPC problem formulations are summarized. Three key elements of MPC design are discussed: the prediction model, the constraint formulation, and the objective cost function. The prediction models can be linear time-invariant or time-varying depending on the geometry of the orbit, whether circular or elliptic. The constraints can be given as linear inequalities on inputs or outputs, which can be written in the same form. Moreover, recent convexification techniques for non-convex geometric constraints (e.g., plume impingement, field of view (FOV)) are presented in detail. Next, different objectives are provided in a mathematical framework and explained accordingly. Because MPC implementation relies on solving constrained optimization problems in real time, computational aspects are also examined; in particular, high-speed implementation capabilities and HIL challenges are presented with respect to representative space avionics. This covers an analysis of future space processors as well as the requirements that the HIL experiment outputs place on sensors and actuators. The HIL tests are investigated for kinematic and dynamic cases, where robotic arms and floating robots are used, respectively. Eventually, the proposed algorithms and experimental setups are introduced and compared with the authors' previous work and future plans. The paper concludes with a conjecture that the MPC paradigm is a promising framework at the crossroads of space applications, and that it could be further advanced based on the challenges mentioned throughout the paper and the unaddressed gaps.
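
A minimal single-step sketch of a constrained MPC formulation of this kind, with a discretized double-integrator prediction model and a box input bound standing in for the orbital dynamics and mission constraints of the paper (cvxpy is assumed for the QP):

```python
import numpy as np
import cvxpy as cp

dt, N = 1.0, 20                           # step [s], prediction horizon
A = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]])              # state: [x, y, vx, vy]
B = np.array([[0.5*dt**2, 0],
              [0, 0.5*dt**2],
              [dt, 0],
              [0, dt]])                   # input: thrust acceleration [ax, ay]

x0 = np.array([-100.0, 40.0, 0.0, 0.0])   # hypothetical initial relative state
u_max = 0.1                               # input bound [m/s^2]

X = cp.Variable((4, N + 1))
U = cp.Variable((2, N))
cost = cp.sum_squares(X) + 10 * cp.sum_squares(U)  # approach origin, save fuel
constr = [X[:, 0] == x0]
for k in range(N):
    constr += [X[:, k + 1] == A @ X[:, k] + B @ U[:, k],   # prediction model
               cp.norm(U[:, k], 'inf') <= u_max]           # physical constraint
cp.Problem(cp.Minimize(cost), constr).solve()
print("first control move:", U.value[:, 0])  # applied, then the horizon recedes
```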

Keywords: convex optimization, model predictive control, rendezvous and docking, spacecraft autonomy

Procedia PDF Downloads 88
1179 Design Optimization of a Compact Quadrupole Electromagnet for CLS 2.0

Authors: Md. Armin Islam, Les Dallin, Mark Boland, W. J. Zhang

Abstract:

This paper reports a study on the optimal magnetic design of a compact quadrupole electromagnet for the Canadian Light Source (CLS 2.0). The aim of the design is to obtain a quadrupole with low relative higher-order harmonics and better field quality. The design problem was formulated as an optimization model in which the objective function is the higher-order harmonic content (multipole errors) and the variable to be optimized is the material distribution on the pole. The higher-order harmonics arise in the quadrupole because the ideal hyperbola is truncated at a certain point to make the pole. In this project, the resulting harmonics were optimized both transversely and longitudinally by adjusting the material on the poles in a controlled way. Finite element analysis (FEA) was conducted for the optimization. Better higher-order harmonic amplitudes and field quality were achieved through the optimization. On the basis of the optimized magnetic design, electrical and cooling calculations were performed for the magnet.

Keywords: drift, electrical, and cooling calculation, integrated field, magnetic field gradient, multipole errors, quadrupole

Procedia PDF Downloads 118