Search results for: code error correction
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3483

2763 Examining the Missing Feedback Link in Environmental Kuznets Curve Hypothesis

Authors: Apra Sinha

Abstract:

The inverted U-shaped Environmental Kuznets Curve (EKC) describes the pollution-income relationship: initially, pollution and environmental degradation increase with income per capita; however, this trend reverses at higher income levels, where economic growth initiates environmental improvement. However, the effect that increased environmental degradation has on growth is the missing feedback link that the EKC hypothesis does not address. This paper examines the missing feedback link in the EKC hypothesis in the Indian context by examining the causal association between fossil fuel consumption, carbon dioxide emissions and economic growth for India. Fossil fuel consumption is taken here as a proxy for the driver of economic growth. The causal association between the aforementioned variables has been analyzed using five interventions, namely 1) urban development, for which urbanization is taken as a proxy; 2) industrial development, for which industrial value added is taken as a proxy; 3) trade liberalization, for which the sum of exports and imports as a share of GDP is taken as a proxy; and 4) financial development, for which a) domestic credit to the private sector and b) net foreign assets are taken as proxies. The choice of interventions has been made keeping in view the economic liberalization perspective of India. The main aim of the paper is to investigate the missing feedback link for the Environmental Kuznets Curve hypothesis before and after incorporating the intervening variables. The period of study is from 1971 to 2011, as it covers the pre- and post-liberalization eras in India. All the data have been taken from World Bank country-level indicators. The Johansen and Juselius cointegration testing methodology and error-correction-based Granger causality have been applied to all the variables. The results clearly show that the missing feedback link is addressed in only two of the five interventions. This paper can put forward significant policy implications for environmental protection and sustainable development.
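The testing sequence described above can be sketched as follows with statsmodels; the file name and column names are hypothetical placeholders, not the authors' actual data.

```python
# Minimal sketch of the testing sequence (Johansen cointegration followed by an
# error-correction-based Granger causality check) using statsmodels.
# The CSV file and variable names are hypothetical placeholders.
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import coint_johansen, VECM

df = pd.read_csv("india_1971_2011.csv")            # columns: co2, fossil_fuel, gdp_pc, urban
data = df[["co2", "fossil_fuel", "gdp_pc", "urban"]]

# Johansen trace test for the number of cointegrating relations (constant, 1 lagged difference)
jres = coint_johansen(data, det_order=0, k_ar_diff=1)
print("trace statistics:", jres.lr1)
print("95% critical values:", jres.cvt[:, 1])

# If cointegration is found, fit a VECM; short-run Granger causality can then be read
# from the lagged-difference coefficients, and the long-run feedback from the
# error-correction (loading) terms.
vecm = VECM(data, k_ar_diff=1, coint_rank=1, deterministic="co").fit()
print(vecm.summary())
```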

Keywords: environmental Kuznets curve hypothesis, fossil fuel consumption, industrialization, trade liberalization, urbanization

Procedia PDF Downloads 233
2762 Mapping Poverty in the Philippines: Insights from Satellite Data and Spatial Econometrics

Authors: Htet Khaing Lin

Abstract:

This study explores the relationship between a diverse set of variables, encompassing both environmental and socio-economic factors, and poverty levels in the Philippines for the years 2012, 2015, and 2018. Employing Ordinary Least Squares (OLS), Spatial Lag Models (SLM), and Spatial Error Models (SEM), this study delves into the dynamics of key indicators, including daytime and nighttime land surface temperature, cropland surface, urban land surface, rainfall, population size, normalized difference water, vegetation, and drought indices. The findings reveal consistent patterns and unexpected correlations, highlighting the need for nuanced policies that address the multifaceted challenges arising from the interplay of environmental and socio-economic factors.
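The three model families named above can be illustrated with PySAL's spreg as in the sketch below; the shapefile, variable names and contiguity weights are hypothetical placeholders rather than the study's Google Earth Engine pipeline.

```python
# Illustrative sketch of OLS, spatial lag and spatial error models with PySAL's spreg.
# File name, variable names and the weights construction are placeholders.
import geopandas as gpd
from libpysal.weights import Queen
from spreg import OLS, ML_Lag, ML_Error

gdf = gpd.read_file("ph_municipalities.shp")
y = gdf[["poverty_rate"]].values
X = gdf[["lst_day", "lst_night", "cropland", "urban", "rain", "pop", "ndwi", "ndvi"]].values

w = Queen.from_dataframe(gdf)        # contiguity-based spatial weights
w.transform = "r"

ols = OLS(y, X, w=w, spat_diag=True, name_y="poverty")
slm = ML_Lag(y, X, w=w, name_y="poverty")    # spatial lag model
sem = ML_Error(y, X, w=w, name_y="poverty")  # spatial error model
print(ols.summary)
print(slm.summary)
print(sem.summary)
```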

Keywords: poverty analysis, OLS, spatial lag models, spatial error models, Philippines, google earth engine, satellite data, environmental dynamics, socio-economic factors

Procedia PDF Downloads 78
2761 Crossover Memories and Code-Switching in the Narratives of Arabic-Hebrew and Hebrew-English Bilingual Adults in Israel

Authors: Amani Jaber-Awida

Abstract:

This study examines two bilingual phenomena in the narratives of Arabic-Hebrew and Hebrew-English bilingual adults in Israel: crossover (CO) memories and code-switching (CS). The study examined these phenomena in the context of autobiographical memory, using a cue-word technique. Student experimenters held two sessions in the homes of the participants. In separate language sessions, the participant was asked to look first at each of 16 cue words and then to state a concrete memory. After stating the memory, participants reported whether their memories were in the same language as the experiment session or in a different one. Memories were classified as 'Crossovers' (CO) or 'Same Language' (SL) according to participants' self-reports. Participants were also required to elaborate on the setting, interlocutors and other languages involved in the specific memory. Beyond replicating the cuing-technique procedure, one memory from a specific lifespan period was chosen per participant, and the participant was required to provide further details about it. For these more detailed memories, a CS count was conducted. Both bilingual groups confirmed the Reminiscence Bump phenomenon, retrieving more memories from the 10-30 age period. CO memories prevailed in second-language (L2) sessions. Same-language memories were more abundant in first-language (L1) sessions. Higher CS frequency was found in L2 sessions. Finally, as predicted, 'individual' CS was prevalent in L2 sessions, but 'community-based' CS was not higher in L1 sessions. The two bilingual measures in this study, crossovers and CS, come from different research traditions: the former from an experimental paradigm in the psychology of autobiographical memory based on self-reported judgments, the latter a behavioral measure from linguistics. This merger of approaches offers new insight into the field of bilingual autobiographical memory. In addition, the study attempted to shed light on the investigation of motivations for CS, beginning with Walters' SPPL model and concluding with a distinction between 'community-based' and 'individual' motivations.

Keywords: bilinguals, code-switching, crossover memories, narratives

Procedia PDF Downloads 151
2760 A Comparative Analysis of the Performance of COSMO and WRF Models in Quantitative Rainfall Prediction

Authors: Isaac Mugume, Charles Basalirwa, Daniel Waiswa, Mary Nsabagwa, Triphonia Jacob Ngailo, Joachim Reuder, Schättler Ulrich, Musa Semujju

Abstract:

Numerical weather prediction (NWP) models are considered powerful tools for guiding quantitative rainfall prediction. Several NWP models exist and are used at many operational weather prediction centers. This study considers two models, namely the Consortium for Small-scale Modeling (COSMO) model and the Weather Research and Forecasting (WRF) model. It compares the models' ability to predict rainfall over Uganda for the period 21st April 2013 to 10th May 2013 using the root mean square error (RMSE) and the mean error (ME). In comparing the performance of the models, this study assesses their ability to predict light rainfall events and extreme rainfall events. All the experiments used the default parameterization configurations and the same horizontal resolution (7 km). The results show that the COSMO model had a tendency to largely predict no rain, which explained its under-prediction. The COSMO model (RMSE: 14.16; ME: -5.91) presented a significantly (p = 0.014) higher magnitude of error compared to the WRF model (RMSE: 11.86; ME: -1.09). However, the COSMO model (RMSE: 3.85; ME: 1.39) performed significantly (p = 0.003) better than the WRF model (RMSE: 8.14; ME: 5.30) in simulating light rainfall events. Both models under-predicted extreme rainfall events, with the COSMO model (RMSE: 43.63; ME: -39.58) presenting significantly higher error magnitudes than the WRF model (RMSE: 35.14; ME: -26.95). This study recommends additional diagnosis of the models' treatment of deep convection over the tropics.
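The two verification scores used above (RMSE and ME), together with one way of obtaining a p-value for the comparison, can be sketched as follows; the forecast and observation arrays are hypothetical stand-ins.

```python
# Sketch of the verification metrics (RMSE and mean error, ME) and a paired test
# of error magnitudes between two models. The arrays are illustrative only.
import numpy as np
from scipy import stats

def rmse(forecast, observed):
    return float(np.sqrt(np.mean((forecast - observed) ** 2)))

def mean_error(forecast, observed):
    # negative values indicate under-prediction
    return float(np.mean(forecast - observed))

obs = np.array([0.0, 2.5, 10.0, 35.0, 1.0])
cosmo = np.array([0.0, 0.0, 4.0, 12.0, 0.0])
wrf = np.array([0.5, 3.0, 7.0, 20.0, 2.0])

print("COSMO:", rmse(cosmo, obs), mean_error(cosmo, obs))
print("WRF:  ", rmse(wrf, obs), mean_error(wrf, obs))

# paired t-test on absolute errors (one way to obtain a p-value for the comparison)
t, p = stats.ttest_rel(np.abs(cosmo - obs), np.abs(wrf - obs))
print("p-value:", p)
```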

Keywords: comparative performance, the COSMO model, the WRF model, light rainfall events, extreme rainfall events

Procedia PDF Downloads 245
2759 Calculating of the Heat Exchange in a Rotating Pipe: Application to the Cooling of Turbine Blades

Authors: A. Miloud

Abstract:

In this work, the results of numerical simulations of turbulent flow with 3D heat transfer are presented for the case of two rotating U-shaped channels of rectangular cross-section. The purpose of this investigation was to study the effect of corrugated walls in the heated portion on the improvement of cooling, in particular the influence of the wavelength. The calculations were performed for Reynolds numbers ranging from 10,000 to 100,000, two values of the rotation number (Ro = 0.0 and 0.14) and a density ratio restricted to 0.13. In these simulations, the ANSYS FLUENT code was used to solve the Reynolds-averaged equations relating the different time-averaged fields. The performance of the k-omega SST and RSM turbulence models is evaluated through a comparison of the numerical results of each model with the available experimental and numerical data. Detailed mean-temperature predictions, the extent of the secondary flow and local Nusselt number distributions are presented. It turns out that the corrugated configuration further enhances the heat exchange, provided the velocity of the coolant inside the channel is reduced.

Keywords: cooling blades, corrugated walls, model k-omega SST and RSM, fluent code, rotation effect

Procedia PDF Downloads 237
2758 An Adaptive Back-Propagation Network and Kalman Filter Based Multi-Sensor Fusion Method for Train Location System

Authors: Yu-ding Du, Qi-lian Bao, Nassim Bessaad, Lin Liu

Abstract:

The Global Navigation Satellite System (GNSS) is regarded as an effective approach for replacing the large number of track-side balises used in modern train localization systems. This paper describes a method based on the data fusion of a GNSS receiver sensor and an odometer sensor that can significantly improve positioning accuracy. A digital track map is needed as another sensor to project the two-dimensional GNSS position onto a one-dimensional along-track distance, since the train's position can only be constrained to the track. A model trained by a BP neural network is used to estimate the trend of the positioning error, which is related to the specific location and the approximate processing of the digital track map. Considering that in some conditions satellite signal failure will increase the GNSS positioning error, a detection step for the GNSS signal is applied. An adaptive weighted fusion algorithm is presented to reduce the standard deviation of the train speed measurement. Finally, an Extended Kalman Filter (EKF) is used to fuse the projected 1-D GNSS positioning data and the 1-D train speed data to obtain the position estimate. Experimental results suggest that the proposed method performs well and can reduce the positioning error notably.
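A minimal sketch of the final fusion stage is given below, assuming a linear along-track motion model (in which case the EKF update reduces to the standard Kalman form); the noise levels, the GNSS-failure handling and the measurement stream are illustrative assumptions, not the paper's parameters.

```python
# Minimal sketch: Kalman-type fusion over a 1-D along-track state (position, speed),
# updated with the projected GNSS distance and the fused speed measurement.
import numpy as np

dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity transition
Q = np.diag([0.5, 0.1])                 # process noise (illustrative)
H = np.eye(2)                           # we observe position and speed directly
x = np.array([0.0, 0.0])                # state: [along-track position (m), speed (m/s)]
P = np.eye(2) * 10.0

def fusion_step(x, P, z, gnss_ok):
    # predict
    x = F @ x
    P = F @ P @ F.T + Q
    # inflate the position measurement noise when the GNSS signal check fails
    R = np.diag([5.0 if gnss_ok else 500.0, 0.2])
    # update
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(2) - K @ H) @ P
    return x, P

for z, ok in [((10.2, 10.1), True), ((20.5, 9.8), True), ((300.0, 10.0), False)]:
    x, P = fusion_step(x, P, np.array(z), ok)
    print("estimated position/speed:", x)
```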

Keywords: multi-sensor data fusion, train positioning, GNSS, odometer, digital track map, map matching, BP neural network, adaptive weighted fusion, Kalman filter

Procedia PDF Downloads 233
2755 A High Compression Ratio for a Lossless Image Compression Based on the Arithmetic Coding with the Sorted Run Length Coding: Meteosat Second Generation Image Compression

Authors: Cherifi Mehdi, Lahdir Mourad, Ameur Soltane

Abstract:

Image compression is at the heart of several multimedia techniques. It is used to reduce the number of bits required to represent an image. The Meteosat Second Generation (MSG) satellite allows the acquisition of 12 image files every 15 minutes, which results in large database sizes. In this paper, a novel image compression method based on arithmetic coding with Sorted Run Length Coding (SRLC) for MSG images is proposed. The SRLC finds the occurrences of consecutive pixels of the original image to create sorted runs. The arithmetic coding then encodes the sorted data of the previous stage to produce a unique code word representing the binary code stream in the sorted order, boosting the compression ratio. In this article, we show that our method achieves better results in terms of compression ratio and bit rate than the method based on Run Length Coding (RLC) with arithmetic coding. Evaluation criteria such as the compression ratio and the bit rate confirm the efficiency of our image compression method.
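The sorted run-length stage can be pictured as below; since the exact SRLC definition is not reproduced in this abstract, the sketch shows one plausible reading in which (value, run-length) pairs are built from consecutive pixels and then sorted before being passed to the arithmetic coder.

```python
# Sketch of a run-length stage followed by sorting of the runs. This is one plausible
# reading of SRLC, not necessarily the authors' exact construction.
from itertools import groupby

def run_length_encode(pixels):
    """Return (value, run_length) pairs for consecutive identical pixels."""
    return [(value, len(list(group))) for value, group in groupby(pixels)]

def sorted_runs(pixels):
    runs = run_length_encode(pixels)
    # sorting the runs groups similar symbols together, which tends to sharpen
    # the symbol statistics seen by the arithmetic coder
    return sorted(runs)

row = [12, 12, 12, 40, 40, 12, 12, 200, 200, 200, 200]
print(run_length_encode(row))   # [(12, 3), (40, 2), (12, 2), (200, 4)]
print(sorted_runs(row))         # [(12, 2), (12, 3), (40, 2), (200, 4)]
```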

Keywords: image compression, arithmetic coding, Run Length Coding, RLC, Sorted Run Length Coding, SRLC, Meteosat Second Generation, MSG

Procedia PDF Downloads 337
2756 Support Vector Regression for Retrieval of Soil Moisture Using Bistatic Scatterometer Data at X-Band

Authors: Dileep Kumar Gupta, Rajendra Prasad, Pradeep Kumar, Varun Narayan Mishra, Ajeet Kumar Vishwakarma, Prashant K. Srivastava

Abstract:

An approach was evaluated for the retrieval of the soil moisture of a bare soil surface using bistatic scatterometer data in the angular range of 20° to 70° at VV- and HH-polarization. The microwave data were acquired by a specially designed X-band (10 GHz) bistatic scatterometer. A linear regression analysis was performed between the scattering coefficients and the soil moisture content to select the incidence angle most suitable for the retrieval of soil moisture content; the 25° incidence angle was found to be the most suitable. Support vector regression analysis was used to approximate the function described by the input-output relationship between the scattering coefficient and the corresponding measured values of soil moisture content. The performance of the support vector regression algorithm was evaluated by comparing the observed and estimated soil moisture content using the statistical performance indices %Bias, root mean squared error (RMSE) and Nash-Sutcliffe Efficiency (NSE). The values of %Bias, RMSE and NSE were found to be 2.9451, 1.0986, and 0.9214, respectively, at HH-polarization. At VV-polarization, the values of %Bias, RMSE and NSE were found to be 3.6186, 0.9373, and 0.9428, respectively.
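A minimal sketch of the retrieval and scoring steps with scikit-learn's SVR is given below; the backscatter/soil-moisture arrays and hyperparameters are hypothetical, and the %Bias definition used here (100 * sum(pred - obs) / sum(obs)) is one common convention.

```python
# Sketch: fit SVR between scattering coefficient and soil moisture, then score with
# %Bias, RMSE and NSE. Data and hyperparameters are illustrative only.
import numpy as np
from sklearn.svm import SVR

sigma0 = np.array([-18.0, -16.5, -15.2, -13.8, -12.1, -10.9]).reshape(-1, 1)  # dB
soil_moisture = np.array([8.0, 11.0, 15.0, 19.0, 24.0, 28.0])                 # percent

model = SVR(kernel="rbf", C=10.0, epsilon=0.5).fit(sigma0, soil_moisture)
pred = model.predict(sigma0)

bias_pct = 100.0 * np.sum(pred - soil_moisture) / np.sum(soil_moisture)
rmse = np.sqrt(np.mean((pred - soil_moisture) ** 2))
nse = 1.0 - np.sum((soil_moisture - pred) ** 2) / np.sum((soil_moisture - soil_moisture.mean()) ** 2)
print(f"%Bias={bias_pct:.3f}  RMSE={rmse:.3f}  NSE={nse:.3f}")
```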

Keywords: bistatic scatterometer, soil moisture, support vector regression, RMSE, %Bias, NSE

Procedia PDF Downloads 410
2755 Management of Fitness-For-Duty for Human Error Prevention in Nuclear Power Plants

Authors: Hyeon-Kyo Lim, Tong-Il Jang, Yong-Hee Lee

Abstract:

For the past several decades, more than a few researchers have warned that even a trivial human error may result in unexpected accidents, especially in nuclear power plants. To prevent accidents in nuclear power plants, it is indispensable to bring under effective control any factors that may raise the possibility of human error. This study aimed to develop a risk management program, especially in the sense of guaranteeing the Fitness-for-Duty (FFD) of people working in nuclear power plants. Through a literature survey, it was found that work stress and fatigue are major psychophysical factors requiring sophisticated management. A set of major management factors related to work stress and fatigue was identified through repeated literature surveys and classified into several categories. To maintain the fitness of human workers, a four-level approach - individual worker, team, staff within plants, and external professionals - was adopted for the FFD management program. Moreover, the program was arranged to cover the whole employment cycle, from selection and screening of workers to job allocation and job rotation. Also, a managerial care program was introduced for employee assistance, based on the concept of an Employee Assistance Program (EAP). The developed program was reviewed repeatedly by former operators of nuclear power plants and assessed positively. On the whole, the responses implied that additional treatment is needed to guarantee high performance of human workers not only in normal operations but also in emergency situations. Consequently, the program is under administrative modification for practical application.

Keywords: fitness-for-duty (FFD), human error, work stress, fatigue, Employee-Assistance-Program (EAP)

Procedia PDF Downloads 290
2754 A Comprehensive Model of Professional Ethics Based on the Teachings of the Holy Quran

Authors: Zahra Mohagheghian, Fatema Agharebparast

Abstract:

Professional ethics is a subject of growing concern today, so most professions, including teaching, understand its need and importance and therefore need to develop a code of professional ethics of their own. In this regard, this study seeks to answer the following questions: with respect to the comprehensiveness of the Qur'an (Nahl/89), is it possible to contemplate the conduct of the divine teachers in order to extract a divine pattern for teaching and training? And in a code of conduct for divine teachers, what are the most important moral obligations and duties of the teaching profession? The results of this study show that the teaching of Khidr, according to the Qur'anic verses, offers abundant and subtle hints and can be used as a comprehensive, divine pattern for teaching and for drafting a charter of professional ethics for teachers. The results also show that there are many ethical principles in the prophet Khidr's teaching pattern. The most important ones include student assessment, using objective rather than subjective examples, assessment during teaching, flexibility, and others. Attention to each of these principles can help teachers achieve their educational goals and lead human beings on their path toward spiritual elevation.

Keywords: professional ethics, teaching-learning process, teacher, student, Quran

Procedia PDF Downloads 280
2753 Study on Shape Coefficient of Large Statue Building Based on CFD

Authors: Wang Guangda, Ma Jun, Zhao Caiqi, Pan Rui

Abstract:

Wind load is the main governing load for large statue structures. Because of their irregular plans and elevations and uneven outer contours, the shape coefficients of statues cannot be taken from the current code. Current common practice is based on wind tunnel tests, but this method is time-consuming and costly. In this paper, based on the fundamental theory of CFD and using the fluid dynamics software Fluent 15.0, several large statue structures 40 to 70 m high located in China, including large fairy statues and large Buddha statues, are analyzed in a numerical wind tunnel. The results are compared with the recommended values in the load code and with wind tunnel test results, respectively. The results show that the shape coefficients obtained by the numerical wind tunnel method are reliable for this kind of building. This provides a useful reference for the wind load values of large statue structures.

Keywords: large statue structure, shape coefficient, irregular structure, wind tunnel test, numerical wind tunnel simulation

Procedia PDF Downloads 358
2752 Numerical Modeling of Determination of in situ Rock Mass Deformation Modulus Using the Plate Load Test

Authors: A. Khodabakhshi, A. Mortazavi

Abstract:

Accurate determination of the rock mass deformation modulus, an important design parameter, is one of the most controversial issues in most engineering projects. A 3D numerical model of the standard plate load test (PLT) using the FLAC3D code was built to investigate the mechanism governing the test process. Five objectives were the focus of this study. The first goal was to employ 3D modeling in the interpretation of the PLTs conducted at the Bazoft dam site, Iran. The second objective was to investigate the effect of the depth at which displacements are measured below the loading plates on the calculated moduli. The magnitude of the rock mass deformation modulus calculated from a PLT depends on the anchor depth, and in practice this may be a source of error in the selection of a realistic deformation modulus for the rock mass. The third goal of the study was to investigate the effect of the testing plate diameter on the calculated modulus. Moreover, a comparison of the modulus calculated from the ISRM formula, from the numerical modeling, and from the actual PLT carried out at the right abutment of the Bazoft dam site was another objective of the study. Finally, the effect of plastic strains on the calculated moduli in each of the loading-unloading cycles for three loading plates was investigated. The geometry, material properties, and boundary conditions of the constructed 3D model were selected based on the in-situ conditions of the PLT at the Bazoft dam site. A good agreement was achieved between the numerical model results and the field test results.

Keywords: deformation modulus, numerical model, plate loading test, rock mass

Procedia PDF Downloads 152
2751 Lyapunov-Based Tracking Control for Nonholonomic Wheeled Mobile Robot

Authors: Raouf Fareh, Maarouf Saad, Sofiane Khadraoui, Tamer Rabie

Abstract:

This paper presents a tracking control strategy based on the Lyapunov approach for a nonholonomic wheeled mobile robot. The control strategy consists of two levels. First, a kinematic controller is developed to adjust the right and left wheel velocities. Using this velocity control law, the stability of the tracking error is guaranteed using the Lyapunov approach. These kinematic commands cannot be generated directly by the motors. To overcome this problem, the second level of control, the dynamic control, is designed. This dynamic control law is developed based on Lyapunov theory in order to track the desired trajectories of the mobile robot. The stability of the tracking error is proved using the Lyapunov and Barbalat approaches. Simulation results on a nonholonomic wheeled mobile robot are given to demonstrate the feasibility and effectiveness of the presented approach.
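As background on how such a Lyapunov-based kinematic level is typically built, a widely used (Kanayama-type) formulation is sketched below; the gains and exact form chosen by the authors may differ.

```latex
% Tracking errors expressed in the robot frame, for a reference pose (x_r, y_r, \theta_r):
\begin{pmatrix} e_x \\ e_y \\ e_\theta \end{pmatrix} =
\begin{pmatrix} \cos\theta & \sin\theta & 0 \\ -\sin\theta & \cos\theta & 0 \\ 0 & 0 & 1 \end{pmatrix}
\begin{pmatrix} x_r - x \\ y_r - y \\ \theta_r - \theta \end{pmatrix}
% A standard Lyapunov-based kinematic law, with gains k_x, k_y, k_\theta > 0:
v = v_r \cos e_\theta + k_x e_x, \qquad
\omega = \omega_r + v_r \left( k_y\, e_y + k_\theta \sin e_\theta \right)
% With the candidate V = \tfrac{1}{2}(e_x^2 + e_y^2) + (1 - \cos e_\theta)/k_y, one obtains
\dot{V} = -k_x e_x^2 - \frac{v_r k_\theta}{k_y} \sin^2 e_\theta \le 0
% so the tracking error is stable for v_r \ge 0; Barbalat's lemma then gives convergence.
```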

Keywords: mobile robot, trajectory tracking, Lyapunov, stability

Procedia PDF Downloads 360
2750 Reducing Diagnostic Error in Australian Emergency Departments Using a Behavioural Approach

Authors: Breanna Wright, Peter Bragge

Abstract:

Diagnostic error rates in healthcare are approximately 10% of cases. Diagnostic errors can cause patient harm due to inappropriate, inadequate or delayed treatment, and such errors contribute heavily to medical liability claims globally. Therefore, addressing diagnostic error is a high priority. In most cases, diagnostic errors are the result of faulty information synthesis rather than lack of knowledge. Specifically, the majority of diagnostic errors involve cognitive factors, and in particular, cognitive biases. Emergency Departments are an environment with heightened risk of diagnostic error due to time and resource pressures, a frequently chaotic environment, and patients arriving undifferentiated and with minimal context. This project aimed to develop a behavioural, evidence-informed intervention to reduce diagnostic error in Emergency Departments through co-design with emergency physicians, insurers, researchers, hospital managers, citizens and consumer representatives. The Forum Process was utilised to address this aim. This involves convening a small (4 – 6 member) expert panel to guide a focused literature and practice review; convening of a 10 – 12 person citizens panel to gather perspectives of laypeople, including those affected by misdiagnoses; and a 18 – 22 person structured stakeholder dialogue bringing together representatives of the aforementioned stakeholder groups. The process not only provides in-depth analysis of the problem and associated behaviours, but brings together expertise and insight to facilitate identification of a behaviour change intervention. Informed by the literature and practice review, the Citizens Panel focused on eliciting the values and concerns of those affected or potentially affected by diagnostic error. Citizens were comfortable with diagnostic uncertainty if doctors were honest with them. They also emphasised the importance of open communication between doctors and patients and their families. Citizens expect more consistent standards across the state and better access for both patients and their doctors to patient health information to avoid time-consuming re-taking of long patient histories and medication regimes when re-presenting at Emergency Departments and to reduce the risk of unintentional omissions. The structured Stakeholder Dialogue focused on identifying a feasible behavioural intervention to review diagnoses in Emergency Departments. This needed to consider the role of cognitive bias in medical decision-making; contextual factors (in Victoria, there is a legislated 4-hour maximum time between ED triage and discharge / hospital admission); resource availability; and the need to ensure the intervention could work in large metropolitan as well as small rural and regional ED settings across Victoria. The identified behavioural intervention will be piloted in approximately ten hospital EDs across Victoria, Australia. This presentation will detail the findings of all review and consultation activities, describe the behavioural intervention developed and present results of the pilot trial.

Keywords: behavioural intervention, cognitive bias, decision-making, diagnostic error

Procedia PDF Downloads 112
2749 End-to-End Performance of MPPM in Multihop MIMO-FSO System Over Dependent GG Atmospheric Turbulence Channels

Authors: Hechmi Saidi, Noureddine Hamdi

Abstract:

The performance of a decode-and-forward (DF) multihop free-space optical (FSO) scheme deploying a multiple-input multiple-output (MIMO) configuration under the gamma-gamma (GG) statistical distribution, and adopting M-ary pulse position modulation (MPPM), is investigated. Exact and approximate values of the symbol error rate (SER) are derived. A closed-form formula for the probability density function (PDF) is expressed for the designed system. Thanks to the use of the DF multihop MIMO FSO configuration and MPPM signaling, atmospheric turbulence is combatted and the transmitted signal quality is thus improved.
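For reference, the single-link gamma-gamma irradiance PDF underlying the channel model has the standard form below; the paper's closed-form PDF for the multihop MIMO combination builds on it (shown here as background, not the authors' derived expression).

```latex
f_I(I) \;=\; \frac{2\,(\alpha\beta)^{\frac{\alpha+\beta}{2}}}{\Gamma(\alpha)\,\Gamma(\beta)}\;
I^{\frac{\alpha+\beta}{2}-1}\,
K_{\alpha-\beta}\!\left(2\sqrt{\alpha\beta\, I}\right), \qquad I > 0
% \Gamma(\cdot) is the gamma function, K_\nu(\cdot) the modified Bessel function of the
% second kind, and \alpha, \beta the effective numbers of large- and small-scale eddies.
```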

Keywords: free space optical, gamma gamma channel, radio frequency, decode and forward, multiple-input multiple-output, M-ary pulse position modulation, symbol error rate

Procedia PDF Downloads 235
2748 Human Error Analysis in the USA Marine Accidents Reports

Authors: J. Sánchez-Beaskoetxea

Abstract:

The analysis of accidents, such as marine accidents, is one of the most useful instruments for avoiding future accidents. In the case of marine accidents, from a simple collision of a small boat in a port to the wreck of a gigantic tanker ship, the study of the causes of accidents is the basis of a great part of international maritime legislation. Some countries have official institutions that investigate all accidents in which a ship flying their flag is involved. In the case of the USA, the National Transportation Safety Board (NTSB) is responsible for these investigations. The NTSB, after a deep investigation into each accident, publishes a Marine Accident Report with the possible cause of the accident. This paper analyses all the Marine Accident Reports published by the NTSB and focuses its attention especially on the human errors that led to the reported accidents. In this research, the different human errors made by crew members are catalogued into 10 different groups. After a complete analysis of all the reports, a statistical analysis of the typology of human errors in marine accidents is presented, in order to use it as a tool to avoid the same errors in the future.

Keywords: human error, marine accidents, ship crew, USA

Procedia PDF Downloads 403
2747 Integrating Computational Thinking into Classroom Practice – A Case Study

Authors: Diane Vassallo, Leonard Busuttil

Abstract:

Recent educational developments have seen increasing attention attributed to Computational Thinking (CT) and its integration into primary and secondary school curricula. CT is more than simply being able to use technology but encompasses fundamental Computer Science concepts which are deemed to be very important in developing the correct mindset for our future digital citizens. The case study presented in this article explores the journey of a Maltese secondary school teacher in his efforts to plan, develop and integrate CT within the context of a local classroom. The teacher participant was recruited from the Malta EU Code week summer school, a pilot initiative that stemmed from the EU Code week Team’s Train the Trainer program. The qualitative methodology involved interviews with the participant teacher as well as an analysis of the artefacts created by the students during the lessons. The results shed light on the numerous challenges and obstacles that the teacher encountered in his integration of CT, as well as portray some brilliant examples of good practices which can substantially inform further research and practice around the integration of CT in classroom practice.

Keywords: computational thinking, digital citizens, digital literacy, technology integration

Procedia PDF Downloads 142
2746 Artificial Neural Networks Based Calibration Approach for Six-Port Receiver

Authors: Nadia Chagtmi, Nejla Rejab, Noureddine Boulejfen

Abstract:

This paper presents a calibration approach based on artificial neural networks (ANN) to determine the envelope signal (I+jQ) of a six-port based receiver (SPR). The memory effects (also called dynamic behavior) and the nonlinearity introduced by the diode-based power detector are taken into consideration by the ANN. An experimental set-up was built to validate the efficiency of this method, and the obtained waveforms confirm its effectiveness. Moreover, the error vector magnitude (EVM) and the mean absolute error (MAE) were calculated in order to confirm and test the ANN's ability to achieve I/Q recovery from the output voltage detected by the power detector. The baseband signal was recovered using the ANN with EVMs no higher than 1% and an MAE no higher than 17.26 when the SPR was excited with different types of signals, such as QAM (quadrature amplitude modulation) and LTE (Long Term Evolution) signals.

Keywords: six-port based receiver, calibration, nonlinearity, memory effect, artificial neural network

Procedia PDF Downloads 54
2745 Handling Missing Data by Using Expectation-Maximization and Expectation-Maximization with Bootstrapping for Linear Functional Relationship Model

Authors: Adilah Abdul Ghapor, Yong Zulina Zubairi, A. H. M. R. Imon

Abstract:

The missing value problem is common in statistics and has been of interest for years. This article considers two modern techniques for handling missing data in the linear functional relationship model (LFRM), namely the Expectation-Maximization (EM) algorithm and the Expectation-Maximization with Bootstrapping (EMB) algorithm, using three performance indicators: the mean absolute error (MAE), the root mean square error (RMSE) and the estimated bias (EB). In this study, we applied the methods of imputing missing values to two types of LFRM, namely the full LFRM and the LFRM in which the slope is estimated using a nonparametric method. Results of the simulation study suggest that the EMB algorithm performs much better than the EM algorithm in both models. We also illustrate the applicability of the approach on a real data set.

Keywords: expectation-maximization, expectation-maximization with bootstrapping, linear functional relationship model, performance indicators

Procedia PDF Downloads 440
2744 On-Site Coaching on Freshly-Graduated Nurses to Improves Quality of Clinical Handover and to Avoid Clinical Error

Authors: Sau Kam Adeline Chan

Abstract:

World Health Organization had listed ‘Communication during Patient Care Handovers’ as one of its highest 5 patient safety initiatives. Clinical handover means transfer of accountability and responsibility of clinical information from one health professional to another. The main goal of clinical handover is to convey patient’s current condition and treatment plan accurately. Ineffective communication at point of care is globally regarded as the main cause of the sentinel event. Situation, Background, Assessment and Recommendation (SBAR), a communication tool, is extensively regarded as an effective communication tool in healthcare setting. Nonetheless, just by scenario-based program in nursing school or attending workshops on SBAR would not be enough for freshly graduated nurses to apply it competently in a complex clinical practice. To what extend and in-depth of information should be conveyed during handover process is not easy to learn. As such, on-site coaching is essential to upgrade their expertise on the usage of SBAR and ultimately to avoid any clinical error. On-site coaching for all freshly graduated nurses on the usage of SBAR in clinical handover was commenced in August 2014. During the preceptorship period, freshly graduated nurses were coached by the preceptor. After that, they were gradually assigned to take care of a group of patients independently. Nurse leaders would join in their shift handover process at patient’s bedside. Feedback and support were given to them accordingly. Discrepancies on their clinical handover process were shared with them and documented for further improvement work. Owing to the constraint of manpower in nurse leader, about coaching for 30 times were provided to a nurse in a year. Staff satisfaction survey was conducted to gauge their feelings about the coaching and look into areas for further improvement. Number of clinical error avoided was documented as well. The nurses reported that there was a significant improvement particularly in their confidence and knowledge in clinical handover process. In addition, the sense of empowerment was developed when liaising with senior and experienced nurses. Their proficiency in applying SBAR was enhanced and they become more alert to the critical criteria of an effective clinical handover. Most importantly, accuracy of transferring patient’s condition was improved and repetition of information was avoided. Clinical errors were prevented and quality patient care was ensured. Using SBAR as a communication tool looks simple. The tool only provides a framework to guide the handover process. Nevertheless, without on-site training, loophole on clinical handover still exists, patient’s safety will be affected and clinical error still happens.

Keywords: freshly graduated nurse, competency of clinical handover, quality, clinical error

Procedia PDF Downloads 134
2743 Recognition and Counting Algorithm for Sub-Regional Objects in a Handwritten Image through Image Sets

Authors: Kothuri Sriraman, Mattupalli Komal Teja

Abstract:

In this paper, a novel algorithm is proposed for the recognition of hulls in handwritten images that may be of irregular, digit or character shape. Objects and internal objects are quite difficult to identify and extract when the structure of the image contains a bulk of clusters. Estimation results are easily obtained by identifying the sub-regional objects with the SASK algorithm. The main focus is to recognize the number of internal objects that exist in a given image in a shadow-free and error-free manner. Hard clustering and density clustering of the obtained image rough set are used to recognize the differentiated internal objects, if any. Finding the internal hull regions involves three steps: pre-processing, boundary extraction and, finally, applying the hull detection system. Detecting the sub-regional hulls can increase the machine learning capability in the detection of characters, and the approach can also be extended to hull recognition of irregularly shaped objects, such as black holes in space exploration together with their intensities. Layered hulls are those having structured layers inside, which is useful in military services and traffic applications to identify the number of vehicles or persons. The proposed SASK algorithm is helpful in identifying such regions and can be useful in the subsequent decision process (to clear traffic, or to identify the number of persons on the opponent's side in a war).

Keywords: chain code, Hull regions, Hough transform, Hull recognition, Layered Outline Extraction, SASK algorithm

Procedia PDF Downloads 327
2742 Using Real Truck Tours Feedback for Address Geocoding Correction

Authors: Dalicia Bouallouche, Jean-Baptiste Vioix, Stéphane Millot, Eric Busvelle

Abstract:

When researchers or logistics software developers deal with vehicle routing optimization, they mainly focus on minimizing the total travelled distance or the total time spent on the tours by the trucks, and on maximizing the number of visited customers. They assume that the upstream real data given to carry out the optimization of a transporter's tours are free from errors, such as the customers' real constraints, the customers' addresses and their GPS coordinates. However, in real transport situations, upstream data are often of bad quality because of address geocoding errors and the irrelevance of addresses received from EDI (Electronic Data Interchange). In fact, geocoders are not exempt from errors and can return irrelevant GPS coordinates. Also, even with good geocoding, an inaccurate address can lead to bad geocoding. For instance, when the geocoder has trouble geocoding an address, it returns the coordinates of the center of the city. Likewise, an obvious geocoding issue is that the maps used by the geocoders are not regularly updated; thus, new buildings may not exist on the maps until the next update. Even so, trying to optimize tours with irrelevant customer GPS coordinates - the most important and basic input data for solving a vehicle routing problem - is not really useful and will lead to a bad and incoherent set of tours, because the customer locations used for the optimization are very different from their real positions. Our work is supported by a logistics software editor, Tedies, and a transport company, Upsilon. We work with Upsilon's truck route data to carry out our experiments. These trucks are equipped with TOMTOM GPS units that continuously save their tour data (positions, speeds, tachograph information, etc.). We then retrieve these data to extract the real truck routes to work with. The aim of this work is to use the experience of the driver and the feedback from the real truck tours to validate the GPS coordinates of well-geocoded addresses and to correct the badly geocoded ones. Thereby, when a vehicle makes its tour, it should have trouble finding a given customer's address at most once; in other words, the vehicle would be wrong at most once for each customer's address. Our method significantly improves the quality of the geocoding: we are able to automatically correct an average of 70% of the GPS coordinates of a tour's addresses. The remaining GPS coordinates are corrected manually, with the user given indications to help correct them. This study shows the importance of taking the feedback of the trucks into account to gradually correct address geocoding errors. Indeed, the accuracy of a customer's address and its GPS coordinates plays a major role in tour optimization. Unfortunately, address writing errors are very frequent. This feedback is naturally and usually taken into account by transporters (by asking drivers, calling customers, etc.) to learn about their tours and bring corrections to upcoming tours. Hence, we develop a method to do a large part of this automatically.
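One way to picture the core idea is sketched below: compare each customer's geocoded point with the stop position actually recorded by the truck, and auto-correct geocodings that lie too far away. Function names, field names and the threshold are hypothetical; this is not Tedies'/Upsilon's actual code.

```python
# Illustrative sketch: validate or auto-correct geocoded customer coordinates using
# the stop positions observed in real truck tours. All names and values are placeholders.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 points."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

THRESHOLD_M = 150.0   # assumed tolerance between geocoded point and observed stop

def review_geocoding(customers, observed_stops):
    """customers: {id: (lat, lon)} from the geocoder; observed_stops: {id: (lat, lon)} from tours."""
    corrected, validated = {}, []
    for cid, geo in customers.items():
        stop = observed_stops.get(cid)
        if stop is None:
            continue
        if haversine_m(*geo, *stop) > THRESHOLD_M:
            corrected[cid] = stop      # replace the geocoded point with the observed stop
        else:
            validated.append(cid)      # tour feedback confirms the geocoding
    return corrected, validated
```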

Keywords: driver experience feedback, geocoding correction, real truck tours

Procedia PDF Downloads 661
2741 A Framework for Blockchain Vulnerability Detection and Cybersecurity Education

Authors: Hongmei Chi

Abstract:

Blockchain has become a necessity for many societal industries and for ordinary life, including cryptocurrency technology, supply chains, health care, public safety, education, etc. Therefore, training our future blockchain developers to recognize blockchain programming vulnerabilities, and training I.T. students in cyber security, is in high demand. In this work, we propose a framework including learning modules and hands-on labs to guide future I.T. professionals towards developing secure blockchain programming habits and mitigating source code vulnerabilities at the early stages of the software development lifecycle, following the concept of the Secure Software Development Life Cycle (SSDLC). In this research, our goal is to make blockchain programmers and I.T. students aware of the vulnerabilities of blockchains. In summary, we develop a framework that will (1) improve students' skills and awareness of blockchain source code vulnerabilities, detection tools, and mitigation techniques, (2) integrate concepts of blockchain vulnerabilities for I.T. students, and (3) improve future I.T. workers' ability to master the concepts of blockchain attacks.

Keywords: software vulnerability detection, hands-on lab, static analysis tools, vulnerabilities, blockchain, active learning

Procedia PDF Downloads 72
2740 A Sociolinguistic Study of the Outcomes of Arabic-French Contact in the Algerian Dialect Tlemcen Speech Community as a Case Study

Authors: R. Rahmoun-Mrabet

Abstract:

It is acknowledged that our style of speaking changes according to a wide range of variables such as gender, setting, the age of both the addresser and the addressee, the conversation topic, and the aim of the interaction. These differences in style are noticeable in monolingual and multilingual speech communities. Yet, they are more observable in speech communities where two or more codes coexist. The linguistic situation in Algeria reflects a state of bilingualism because of the coexistence of Arabic and French. Nevertheless, like all Arab countries, it is characterized by diglossia i.e. the concomitance of Modern Standard Arabic (MSA) and Algerian Arabic (AA), the former standing for the ‘high variety’ and the latter for the ‘low variety’. The two varieties are derived from the same source but are used to fulfil distinct functions that is, MSA is used in the domains of religion, literature, education and formal settings. AA, on the other hand, is used in informal settings, in everyday speech. French has strongly affected the Algerian language and culture because of the historical background of Algeria, thus, what can easily be noticed in Algeria is that everyday speech is characterized by code-switching from dialectal Arabic and French or by the use of borrowings. Tamazight is also very present in many regions of Algeria and is the mother tongue of many Algerians. Yet, it is not used in the west of Algeria, where the study has been conducted. The present work, which was directed in the speech community of Tlemcen-Algeria, aims at depicting some of the outcomes of the contact of Arabic with French such as code-switching, borrowing and interference. The question that has been asked is whether Algerians are aware of their use of borrowings or not. Three steps are followed in this research; the first one is to depict the sociolinguistic situation in Algeria and to describe the linguistic characteristics of the dialect of Tlemcen, which are specific to this city. The second one is concerned with data collection. Data have been collected from 57 informants who were given questionnaires and who have then been classified according to their age, gender and level of education. Information has also been collected through observation, and note taking. The third step is devoted to analysis. The results obtained reveal that most Algerians are aware of their use of borrowings. The present work clarifies how words are borrowed from French, and then adapted to Arabic. It also illustrates the way in which singular words inflect into plural. The results expose the main characteristics of borrowing as opposed to code-switching. The study also clarifies how interference occurs at the level of nouns, verbs and adjectives.

Keywords: bilingualism, borrowing, code-switching, interference, language contact

Procedia PDF Downloads 259
2739 Honour Killing in Iraqi Statutory Law

Authors: Hersh Azeez

Abstract:

Honour killing, also known as "honor killing," is a deeply rooted and complex social issue that persists in many parts of the world, including Iraq. This paper seeks to examine the legal framework surrounding honour killing in Iraqi statutory law. The paper begins with an introduction to honour killing as a phenomenon and its cultural and societal context in Iraq. It then delves into the methodology used in this research, including a comprehensive review of relevant legal texts, case studies, and scholarly articles. The paper analyzes the existing legal framework in Iraq, including relevant penal code provisions and other relevant legislation, as well as the challenges and shortcomings in addressing honour killing in the country. The research findings reveal that despite some legal provisions aimed at addressing honour killing, the practice continues to persist due to a lack of effective implementation, societal norms, and cultural attitudes. The paper concludes with recommendations for improving the legal framework to combat honour killing in Iraq, including legal reforms, education and awareness campaigns, and cultural change initiatives.

Keywords: honour killing, Iraq, statutory law, legal framework, penal code, cultural norms

Procedia PDF Downloads 50
2738 The Classification Performance in Parametric and Nonparametric Discriminant Analysis for a Class- Unbalanced Data of Diabetes Risk Groups

Authors: Lily Ingsrisawang, Tasanee Nacharoen

Abstract:

Introduction: The problem of unbalanced data sets commonly appears in real-world applications. Due to unequal class distributions, many research papers have found that the performance of existing classifiers tends to be biased towards the majority class. The k-nearest neighbors nonparametric discriminant analysis is one method that has been proposed for classifying unbalanced classes with good performance. Hence, the methods of discriminant analysis are of interest to us in investigating misclassification error rates for class-imbalanced data of three diabetes risk groups. Objective: The purpose of this study was to compare the classification performance of parametric discriminant analysis and nonparametric discriminant analysis in a three-class classification application to class-imbalanced data of diabetes risk groups. Methods: Data from a health project for 599 staff members of a government hospital in Bangkok were obtained for the classification problem. The staff members were diagnosed into one of three diabetes risk groups: non-risk (90%), risk (5%), and diabetic (5%). The original data, comprising the variables diabetes risk group, age, gender, cholesterol, and BMI, were analyzed and bootstrapped up to 50 and 100 samples, with 599 observations per sample, for additional estimation of the misclassification error rate. Each data set was explored for departure from multivariate normality and for equality of the covariance matrices of the three risk groups. Both the original data and the bootstrap samples show non-normality and unequal covariance matrices. The parametric linear discriminant function, the quadratic discriminant function, and the nonparametric k-nearest neighbors discriminant function were applied over the 50 and 100 bootstrap samples and to the original data. In finding the optimal classification rule, the prior probabilities were set up both as equal proportions (0.33:0.33:0.33) and as unequal proportions with three choices, (0.90:0.05:0.05), (0.80:0.10:0.10) and (0.70:0.15:0.15). Results: The results from the 50 and 100 bootstrap samples indicated that the k-nearest neighbors approach with k = 3 or k = 4 and prior probabilities of {non-risk:risk:diabetic} equal to {0.90:0.05:0.05} or {0.80:0.10:0.10} gave the smallest misclassification error rate. Conclusion: The k-nearest neighbors approach is suggested for classifying three-class-imbalanced data of diabetes risk groups.
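A sketch of this comparison in scikit-learn terms is given below, with hypothetical stand-in data; note that unlike LDA/QDA, the k-NN classifier takes no prior-probability argument, so the priors enter only the parametric rules here.

```python
# Sketch: LDA and QDA with user-supplied priors versus k-NN, with bootstrap resampling
# to estimate misclassification error. X and y are hypothetical stand-ins for the data.
import numpy as np
from sklearn.discriminant_analysis import (LinearDiscriminantAnalysis,
                                            QuadraticDiscriminantAnalysis)
from sklearn.neighbors import KNeighborsClassifier
from sklearn.utils import resample

rng = np.random.default_rng(0)
X = rng.normal(size=(599, 4))
y = rng.choice([0, 1, 2], size=599, p=[0.90, 0.05, 0.05])   # non-risk / risk / diabetic

priors = [0.90, 0.05, 0.05]
models = {
    "LDA": LinearDiscriminantAnalysis(priors=priors),
    "QDA": QuadraticDiscriminantAnalysis(priors=priors),
    "3-NN": KNeighborsClassifier(n_neighbors=3),
}

for name, model in models.items():
    errors = []
    for b in range(50):                                      # 50 bootstrap samples
        Xb, yb = resample(X, y, n_samples=599, random_state=b)
        errors.append(1.0 - model.fit(Xb, yb).score(X, y))   # error on the original data
    print(name, "mean misclassification error:", np.mean(errors))
```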

Keywords: error rate, bootstrap, diabetes risk groups, k-nearest neighbors

Procedia PDF Downloads 421
2737 Optimal Design of Reference Node Placement for Wireless Indoor Positioning Systems in Multi-Floor Building

Authors: Kittipob Kondee, Chutima Prommak

Abstract:

In this paper, we propose an optimization technique that can be used to optimize the placements of reference nodes and improve the location determination performance for the multi-floor building. The proposed technique is based on Simulated Annealing algorithm (SA) and is called MSMR-M. The performance study in this work is based on simulation. We compare other node-placement techniques found in the literature with the optimal node-placement solutions obtained from our optimization. The results show that using the optimal node-placement obtained by our proposed technique can improve the positioning error distances up to 20% better than those of the other techniques. The proposed technique can provide an average error distance within 1.42 meters.
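A generic sketch of the simulated-annealing loop behind such a placement search is shown below; the cost function (mean distance to the nearest reference node across floors) is only a stand-in for the real positioning-error model, and all parameters are illustrative.

```python
# Generic simulated-annealing sketch for reference-node placement in a multi-floor grid.
# The cost model and all parameters are illustrative stand-ins, not the MSMR-M settings.
import math, random

GRID = [(x, y, f) for x in range(0, 30, 3) for y in range(0, 30, 3) for f in range(3)]

def cost(nodes):
    # mean distance from every grid point to its nearest reference node (floor changes penalized)
    return sum(min(math.dist(p[:2], n[:2]) + 5 * abs(p[2] - n[2]) for n in nodes)
               for p in GRID) / len(GRID)

def neighbour(nodes):
    new = list(nodes)
    i = random.randrange(len(new))
    x, y, f = new[i]
    new[i] = (x + random.uniform(-2, 2), y + random.uniform(-2, 2), f)
    return new

def anneal(n_nodes=8, t0=10.0, cooling=0.999, steps=2000):
    current = [(random.uniform(0, 30), random.uniform(0, 30), random.randrange(3))
               for _ in range(n_nodes)]
    best, t = current, t0
    for _ in range(steps):
        cand = neighbour(current)
        delta = cost(cand) - cost(current)
        # accept improvements always, and worse placements with temperature-dependent probability
        if delta < 0 or random.random() < math.exp(-delta / t):
            current = cand
            if cost(current) < cost(best):
                best = current
        t *= cooling
    return best, cost(best)

print(anneal())
```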

Keywords: indoor positioning system, optimization system design, multi-floor building, wireless sensor networks

Procedia PDF Downloads 227
2736 Use of Oral Communication Strategies: A Study of Bangladeshi EFL Learners at the Graduate Level

Authors: Afroza Akhter Tina

Abstract:

This paper reports on an investigation into the use of specific types of oral communication strategies, namely ‘topic avoidance’, ‘message abandonment’, ‘code-switching’, ‘paraphrasing’, ‘restructuring’, and ‘stalling’ by Bangladeshi EFL learners at the graduate level. It chiefly considers the frequency of using these strategies as well as the students and teachers attitudes toward such uses. The participants of this study are 66 EFL students and 12 EFL teachers of Jahangirnagar University. Data was collected through questionnaire, oral interview, and classroom observation form. The findings reveal that the EFL students tried to employ all the strategies to various extents due to the language difficulties they encountered in their oral English performance. Among them, the mostly used strategy was ‘stalling’ or the use of fillers, followed by ‘code-switching’. The least used strategies were ‘topic avoidance’, ‘restructuring’, and ‘paraphrasing’. The findings indicate that the use of such strategies was related to the contexts of situation and data-elicitation tasks. It also reveals that the students were not formally trained to use the strategies though the majority of the teachers and students acknowledge them as helpful in communication. Finally the study suggests that an awareness of the nature and functions of these strategies can contribute to the overall improvement of the learners’ communicative competence in spoken English.

Keywords: communicative strategies, competency, attitude, frequency

Procedia PDF Downloads 395
2735 Design and Simulation of Unified Power Quality Conditioner based on Adaptive Fuzzy PI Controller

Authors: Brahim Ferdi, Samira Dib

Abstract:

The unified power quality conditioner (UPQC), a combination of shunt and series active power filters, is one of the best solutions for mitigating voltage and current harmonics problems in the distribution power system. The PI controller is very common in the control of the UPQC. However, one disadvantage of this conventional controller is the difficulty in tuning its gains (Kp and Ki). To overcome this problem, an adaptive fuzzy logic PI controller is proposed. The controller is composed of a fuzzy controller and a PI controller. According to the error and error rate of the control system and the fuzzy control rules, the fuzzy controller can adjust the two gains of the PI controller online to obtain better UPQC performance. Simulations using MATLAB/SIMULINK are carried out to verify the performance of the proposed controller. The results show that the proposed controller has a fast dynamic response and high accuracy in tracking the current and voltage references.
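A compact sketch of the adaptive-gain idea is shown below, with a coarse rule table standing in for the full fuzzy inference system described above; all gains, ranges and the plant stand-in are illustrative assumptions.

```python
# Sketch of online PI-gain adjustment from the error e and its rate de; a coarse
# "small/large" rule stands in for the fuzzy inference. Values are illustrative only.
def fuzzy_gain_adjust(e, de, kp0=0.5, ki0=50.0):
    """Return (Kp, Ki) scaled according to rough memberships of |e| and |de|."""
    big_e = min(abs(e) / 10.0, 1.0)      # membership of "error is large"
    big_de = min(abs(de) / 100.0, 1.0)   # membership of "error rate is large"
    kp = kp0 * (1.0 + 0.8 * big_e)       # large error -> raise Kp for faster correction
    ki = ki0 * (1.0 - 0.5 * big_de)      # fast change -> lower Ki to limit overshoot
    return kp, ki

# PI loop driving a measured quantity (e.g. a DC-link voltage) towards its reference
ref, meas, integ, prev_e, dt = 700.0, 650.0, 0.0, 0.0, 1e-4
for _ in range(5):
    e = ref - meas
    de = (e - prev_e) / dt
    kp, ki = fuzzy_gain_adjust(e, de)
    integ += e * dt
    u = kp * e + ki * integ
    prev_e = e
    meas += 0.001 * u                    # crude first-order plant stand-in
    print(f"Kp={kp:.3f} Ki={ki:.1f} u={u:.2f} meas={meas:.2f}")
```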

Keywords: adaptive fuzzy PI controller, current harmonics, PI controller, voltage harmonics, UPQC

Procedia PDF Downloads 534
2734 Estimating Lost Digital Video Frames Using Unidirectional and Bidirectional Estimation Based on Autoregressive Time Model

Authors: Navid Daryasafar, Nima Farshidfar

Abstract:

In this article, we attempt to conceal errors in video with an emphasis on the temporal use of autoregressive (AR) models. To address this problem, we assume that all the information in one or more video frames is lost. The lost frames are then estimated using the temporal information of corresponding pixels in the neighboring frames. Accordingly, after presenting autoregressive models and how they are applied to estimate lost frames, two general methods for using these models are presented. The first method, which is the standard use of autoregressive models, estimates the lost frame in a unidirectional manner; usually, in this case, information from previous frames is used for estimating the lost frame. In the second method, information from both the previous and the next frames is used for estimating the lost frame; as a result, this method is known as bidirectional estimation. A series of tests is then carried out to assess the performance of each method under different conditions, and the results are compared.
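A minimal sketch of the two estimation modes for a single pixel's time series is given below, using statsmodels' AutoReg; the AR order, the data and the simple averaging used for the bidirectional case are illustrative assumptions.

```python
# Sketch: unidirectional AR prediction from past frames versus a bidirectional estimate
# that also uses future frames (here by averaging forward and backward AR predictions).
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

pixel = np.array([100, 102, 104, 107, 110, 113, 115, 118, 121, 124, 126, 129, 131, 134], dtype=float)
lost_index = 7                                   # frame assumed lost

past = pixel[:lost_index]
future = pixel[lost_index + 1:]

# unidirectional: fit AR(2) on past frames and predict one step ahead
fwd = AutoReg(past, lags=2).fit().predict(start=len(past), end=len(past))[0]

# bidirectional: also fit AR(2) on the time-reversed future frames and average
bwd = AutoReg(future[::-1], lags=2).fit().predict(start=len(future), end=len(future))[0]
bidir = 0.5 * (fwd + bwd)

print("true:", pixel[lost_index], "unidirectional:", round(fwd, 2), "bidirectional:", round(bidir, 2))
```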

Keywords: error steganography, unidirectional estimation, bidirectional estimation, AR linear estimation

Procedia PDF Downloads 516