Search results for: types of spelling errors
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6202

6082 Estimation of Normalized Glandular Doses Using a Three-Layer Mammographic Phantom

Authors: Kuan-Jen Lai, Fang-Yi Lin, Shang-Rong Huang, Yun-Zheng Zeng, Po-Chieh Hsu, Jay Wu

Abstract:

The normalized glandular dose (DgN) estimates the energy deposition of mammography in clinical practice. Monte Carlo simulations frequently use a uniformly mixed phantom for calculating the conversion factor. However, breast tissues are not uniformly distributed, leading to errors in the estimated conversion factor. This study constructed a three-layer phantom to estimate the normalized glandular dose more accurately. The MCNP code (Monte Carlo N-Particle code) was used to create the geometric structure. We simulated three types of target/filter combinations (Mo/Mo, Mo/Rh, Rh/Rh), six voltages (25 ~ 35 kVp), six HVL parameters, and nine breast phantom thicknesses (2 ~ 10 cm) for the three-layer mammographic phantom. The conversion factor for 25%, 50%, and 75% glandularity was calculated. The error of the conversion factors compared with the results of the American College of Radiology (ACR) was within 6%; for Rh/Rh, the difference was within 9%. The difference between the 50% average-glandularity phantom and the uniform phantom ranged from -6.7% to 7.1% for the Mo/Mo combination at a voltage of 27 kVp, a half-value layer of 0.34 mmAl, and a breast thickness of 4 cm. According to the simulation results, regression analysis found that the three-layer mammographic phantom can be used to accurately calculate the conversion factors at 0% ~ 100% glandularity. Differences in glandular tissue distribution lead to errors in the conversion factor calculation. The three-layer mammographic phantom can provide accurate estimates of glandular dose in clinical practice.
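
A minimal sketch of the regression step described above (fitting a smooth curve of conversion factor versus glandularity so that a factor can be computed for any composition between 0% and 100%) is shown below; the glandularity grid and DgN values are hypothetical placeholders, not results from the study:

```python
import numpy as np

# Hypothetical DgN conversion factors (mGy/mGy) simulated at discrete
# glandularities for one beam quality (e.g., Mo/Mo, 27 kVp, HVL 0.34 mmAl).
glandularity = np.array([0.0, 0.25, 0.50, 0.75, 1.0])        # glandular fraction
dgn = np.array([0.210, 0.195, 0.182, 0.171, 0.162])          # placeholder values

# Fit a low-order polynomial, as in the abstract's regression analysis.
coeffs = np.polyfit(glandularity, dgn, deg=2)

def dgn_at(g):
    """Interpolated conversion factor for an arbitrary glandularity g in [0, 1]."""
    return np.polyval(coeffs, g)

print(f"DgN at 40% glandularity: {dgn_at(0.40):.4f}")
```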

Keywords: Monte Carlo simulation, mammography, normalized glandular dose, glandularity

Procedia PDF Downloads 169
6081 Development of Visual Working Memory Precision: A Cross-Sectional Study of Simultaneously Delayed Responses Paradigm

Authors: Yao Fu, Xingli Zhang, Jiannong Shi

Abstract:

Visual working memory (VWM) capacity is the ability to maintain and manipulate short-term information that is no longer perceptually available. It is well known both for forming the basis of numerous cognitive abilities and for its limited capacity to hold information. VWM span, the most popular measurable indicator, has been found to reach the adult level (3-4 items) around 12-13 years of age, while less is known about the development of the precision of VWM capacity. Using the simultaneous delayed-response paradigm, the present study investigates the development of VWM precision among 6-18-year-old children and young adults, as well as its possible relationships with fluid intelligence and span. Results showed that precision and span both increased with age, and that precision reached its maximum in the 16-17 age range. Moreover, when remembering 3 simultaneously presented items, the probability of remembering the target item correlated with fluid intelligence, and the probability of wrap errors (misbinding target and non-target items) correlated with age. When remembering more items, children performed worse than adults because of their wrap errors. Compared to span, VWM precision was an effective predictor of intelligence even after controlling for age. These results suggest that, unlike VWM span, precision develops in a slower, longer-lasting fashion. Moreover, a decreasing probability of wrap errors might be the main reason for the development of precision. Finally, precision correlated more closely with intelligence than span did in childhood and adolescence, which might be explained by the probability of remembering the target item.

Keywords: fluid intelligence, precision, visual working memory, wrap errors

Procedia PDF Downloads 255
6080 Performance of Nine Different Types of PV Modules in the Tropical Region

Authors: Jiang Fan

Abstract:

With the growth of the PV market in the tropical region, it is necessary to investigate the performance of different types of PV technology under tropical weather conditions. Singapore Polytechnic was funded by the Economic Development Board (EDB) to set up a solar PV test-bed for research on the performance of different types of PV modules in the country. The PV test-bed installed nine different types of PV systems that are integrated into the power utility grid for monitoring and analysis of their operating performance. This paper presents 12 months of operational data from the nine PV systems and analyses their performance using energy yield and performance ratio. The nine types of PV systems under test showed energy yields ranging from 2.67 to 3.36 kWh/kWp and performance ratios (PRs) ranging from 70% to 88%.
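
The two metrics used in this analysis can be computed directly from monitored data. The sketch below uses made-up monthly numbers to show the standard definitions: specific energy yield (energy per installed kWp) and performance ratio PR = (E / P_rated) / (H_POA / G_STC); all input values are assumptions, not test-bed data:

```python
# Performance metrics for a grid-connected PV system (illustrative values).
energy_kwh = 400.0    # AC energy delivered in the month (kWh)       -- assumed
p_rated_kwp = 4.0     # installed capacity (kWp)                     -- assumed
h_poa = 140.0         # in-plane irradiation for the month (kWh/m^2) -- assumed
G_STC = 1.0           # STC irradiance expressed in kW/m^2

daily_yield = energy_kwh / p_rated_kwp / 30.0        # kWh/kWp per day
reference_yield = h_poa / G_STC                      # equivalent full-sun hours
performance_ratio = (energy_kwh / p_rated_kwp) / reference_yield

print(f"Energy yield: {daily_yield:.2f} kWh/kWp/day")
print(f"Performance ratio: {performance_ratio:.0%}")
```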

Keywords: monocrystalline, multicrystalline, amorphous silicon, cadmium telluride, thin film PV

Procedia PDF Downloads 484
6079 The Use of Artificial Intelligence for Harmonization in the Lawmaking Process

Authors: Supriyadi, Andi Intan Purnamasari, Aminuddin Kasim, Sulbadana, Mohammad Reza

Abstract:

The Industrial Revolution 4.0 era has significantly influenced the administration of countries in all parts of the world, including Indonesia, not only in the administrative and economic sectors; the ways and methods of forming laws should also be adjusted. Until now, the process of making laws carried out by the Parliament with the Government still uses the classical method. The law-making process still uses manual methods, such as typing out the harmonization of regulations, so errors such as writing mistakes and mis-copied articles are not uncommon; tasks that require a high level of accuracy still rely on inventory and harmonization carried out manually by humans. This method often creates problems: errors and inaccuracies by the officers who harmonize laws after discussion and approval have a very serious impact on the system of law formation in Indonesia. The use of artificial intelligence in the process of forming laws appears justified and may be the answer for minimizing the disharmony of various laws and regulations. This research is normative research using the legislative approach and the conceptual approach, and it focuses on the question of how to use artificial intelligence for harmonization in the lawmaking process.

Keywords: artificial intelligence, harmonization, laws, intelligence

Procedia PDF Downloads 123
6078 Energy Complementary in Colombia: Imputation of Dataset

Authors: Felipe Villegas-Velasquez, Harold Pantoja-Villota, Sergio Holguin-Cardona, Alejandro Osorio-Botero, Brayan Candamil-Arango

Abstract:

Colombian electricity comes mainly from hydric resources, which are affected by environmental variations such as the El Niño phenomenon. That is why incorporating other types of resources is necessary to provide electricity constantly. This research seeks to fill gaps in two years of wind speed and global solar irradiance data so that the dataset retains the highest possible amount of information. A further result is a characterization of the data by region, which made it possible to infer which errors occurred and which produced the incomplete dataset.
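
A hedged sketch of such a gap-filling step, using pandas time interpolation on an hourly series; the file name, column names, gap limit, and fallback strategy are assumptions for illustration, since the abstract does not specify the imputation method:

```python
import pandas as pd

# Hypothetical hourly measurements with gaps (file and column names assumed).
df = pd.read_csv("station_region1.csv", parse_dates=["timestamp"],
                 index_col="timestamp")

# Short gaps: interpolate in time. Longer gaps: fall back to the mean
# for that hour of day, computed from the available observations.
for col in ["wind_speed", "global_irradiance"]:
    df[col] = df[col].interpolate(method="time", limit=6)   # gaps up to 6 h
    hourly_mean = df[col].groupby(df.index.hour).transform("mean")
    df[col] = df[col].fillna(hourly_mean)

print(df.isna().sum())   # remaining missing values per column
```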

Keywords: energy, wind speed, global solar irradiance, Colombia, imputation

Procedia PDF Downloads 123
6077 Evaluation of Vehicle Classification Categories: Florida Case Study

Authors: Ren Moses, Jaqueline Masaki

Abstract:

This paper addresses the need for an accurate and updated vehicle classification system through a thorough evaluation of vehicle class categories, identifying errors arising from the existing system and proposing modifications. Data collected from two permanent traffic monitoring sites in Florida were used to evaluate the performance of the existing vehicle classification table. The vehicle data were collected and classified by an automatic vehicle classifier (AVC), and a video camera was used to obtain ground truth data. The Federal Highway Administration (FHWA) vehicle classification definitions were used to define vehicle classes from the video and compare them to the data generated by the AVC in order to identify the sources of misclassification. Six types of errors were identified. Modifications were made to the classification table to improve the classification accuracy. The results of this study include an updated vehicle classification table that reduces the total error by 5.1%, a step-by-step procedure for evaluating vehicle classification studies, and recommendations for improving the FHWA 13-category rule set. The recommendations indicate that the vehicle classification definitions in this scheme need to be updated to reflect the distribution of current traffic. The presented results will be of interest to state transportation departments, consultants, researchers, engineers, designers, and planners who require accurate vehicle classification information for the planning, design, and maintenance of transportation infrastructure.
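
The core of such an evaluation is a class-by-class comparison of AVC output against video ground truth. A minimal sketch follows; the FHWA class labels and counts are invented for illustration, not Florida data:

```python
import numpy as np

# FHWA class recorded by the AVC vs. video ground truth (toy sample).
ground_truth = np.array([2, 2, 3, 5, 9, 9, 5, 2, 3, 9])
avc_output   = np.array([2, 3, 3, 5, 9, 8, 5, 2, 2, 9])

classes = np.unique(np.concatenate([ground_truth, avc_output]))
conf = np.zeros((classes.size, classes.size), dtype=int)
for t, p in zip(ground_truth, avc_output):
    conf[np.searchsorted(classes, t), np.searchsorted(classes, p)] += 1

total_error = 1.0 - np.trace(conf) / conf.sum()   # off-diagonal = misclassified
print(f"FHWA classes observed: {classes}")
print(conf)
print(f"Total misclassification rate: {total_error:.1%}")
```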

Keywords: vehicle classification, traffic monitoring, pavement design, highway traffic

Procedia PDF Downloads 164
6076 Measurement Errors and Misclassifications in Covariates in Logistic Regression: Bayesian Adjustment of Main and Interaction Effects and the Sample Size Implications

Authors: Shahadut Hossain

Abstract:

Measurement errors in continuous covariates and/or misclassifications in categorical covariates are common in epidemiological studies. Regression analysis that ignores such mismeasurements seriously biases the estimated main and interaction effects of covariates on the outcome of interest. Thus, adjustments for such mismeasurements are necessary. In this research, we propose a Bayesian parametric framework for eliminating the deleterious impacts of covariate mismeasurements in logistic regression. The proposed adjustment method is unified and thus can be applied to any generalized linear and non-linear regression model. Furthermore, adjustment for covariate mismeasurements requires validation data, usually in the form of either gold-standard measurements or replicates of the mismeasured covariates on a subset of the study population. Initial investigation shows that the adequacy of such adjustment depends on the sizes of the main and validation samples, especially when the prevalences of the categorical covariates are low. Thus, we investigate the impact of the main and validation sample sizes on the adjusted estimates and provide general guidelines about these sample sizes based on simulation studies.
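
The bias that the abstract warns about is easy to reproduce by simulation. The sketch below is not the authors' Bayesian adjustment; it is a naive-analysis illustration that misclassifies a low-prevalence binary covariate with assumed sensitivity and specificity and shows the attenuation of the estimated log-odds ratio:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n, beta0, beta1 = 20000, -1.0, 0.8      # true intercept and log-odds ratio
x = rng.binomial(1, 0.10, n)            # low-prevalence binary covariate
y = rng.binomial(1, 1 / (1 + np.exp(-(beta0 + beta1 * x))))

# Misclassify x with sensitivity 0.85 and specificity 0.95 (assumed values).
x_obs = np.where(x == 1, rng.binomial(1, 0.85, n), rng.binomial(1, 0.05, n))

for name, cov in [("true x", x), ("misclassified x", x_obs)]:
    fit = sm.Logit(y, sm.add_constant(cov)).fit(disp=0)
    print(f"{name:16s} beta1_hat = {fit.params[1]:.3f}")
# The naive estimate from the misclassified covariate is attenuated toward 0,
# which is the bias a validation-sample adjustment is meant to remove.
```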

Keywords: measurement errors, misclassification, mismeasurement, validation sample, Bayesian adjustment

Procedia PDF Downloads 392
6075 Basic Calibration and Normalization Techniques for Time Domain Reflectometry Measurements

Authors: Shagufta Tabassum

Abstract:

The study of the dielectric properties of binary liquid mixtures is very useful for understanding the liquid structure, molecular interactions, and the dynamics and kinematics of the mixture. Time-domain reflectometry (TDR) is a powerful tool for studying the cooperativity and molecular dynamics of H-bonded systems. In this paper, we discuss the basic calibration and normalization procedures for time-domain reflectometry measurements. Our approach is to explain the different types of errors that occur during TDR measurements and how these errors can be eliminated or minimized.

Keywords: time domain reflectometry measurement technique, cable and connector loss, oscilloscope loss, normalization technique

Procedia PDF Downloads 183
6074 Short Arc Technique for Baseline Determination

Authors: Gamal F. Attia

Abstract:

The baselines are the distances and lengths of the chords between the projections of the positions of the laser stations on the reference ellipsoid. For satellite geodesy, it is very important to determine the optimal length of the orbital arc along which laser measurements are to be carried out. For dynamical methods, long arcs (one month or more) are used. These accumulate more errors from the modeling of different physical forces, such as the Earth's gravitational field, air drag, and solar radiation pressure, that may influence the accuracy of the estimated satellite positions; at the same time, the measurement errors can be almost completely excluded and high stability in the determination of the relative coordinate system can be achieved. It is possible to diminish the influence of modeling errors by using short arcs of the satellite orbit (several revolutions or days), but the station coordinates estimated from different arcs can then differ from each other by more than statistical zero. Under the semi-dynamical 'short arc' method, one or several passes of the satellite within the zone of simultaneous visibility from both ends of the chord are used, and the estimated parameter in this case is the length of the chord. A comparison of the same baselines calculated with the long-arc and short-arc methods shows good agreement and even speaks in favor of the latter. In this paper, the short arc technique is explained and three baselines are determined using the 'short arc' method.
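
The quantity being estimated, the chord between the projections of two stations on the reference ellipsoid, can be written out directly. A sketch under assumed GRS80 ellipsoid parameters and invented station coordinates (the paper's actual stations are not given in the abstract):

```python
import numpy as np

A, F = 6378137.0, 1 / 298.257222101        # GRS80 semi-major axis and flattening
E2 = F * (2 - F)                           # first eccentricity squared

def ecef(lat_deg, lon_deg, h=0.0):
    """Geodetic coordinates -> Earth-centered Cartesian coordinates (meters)."""
    lat, lon = np.radians(lat_deg), np.radians(lon_deg)
    n = A / np.sqrt(1 - E2 * np.sin(lat) ** 2)   # prime-vertical radius
    return np.array([(n + h) * np.cos(lat) * np.cos(lon),
                     (n + h) * np.cos(lat) * np.sin(lon),
                     (n * (1 - E2) + h) * np.sin(lat)])

# Hypothetical laser-station positions projected onto the ellipsoid (h = 0).
p1 = ecef(30.0, 31.2)
p2 = ecef(29.9, 32.5)
print(f"Baseline chord: {np.linalg.norm(p1 - p2):.3f} m")
```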

Keywords: baselines, short arc, dynamical, gravitational field

Procedia PDF Downloads 449
6073 Preparation of Control Information and Analysis of a Gas Metering System Based on an Orifice Plate

Authors: A. Harrouz, A. Benatiallah, O. Harrouz

Abstract:

This paper presents the search for errors in the measurement instruments of a dynamic liquid or gas metering system, with reference to the tolerances defined by international standards and recommendations. We implement a program in MATLAB/Simulink whose calculations are based on ISO 5167. This program takes the system parameters into consideration, such as the arrangement of the plates, the size of the orifice, the given design conditions, and the reference conditions, and finds the pressure drop for a given flow, or the flow for a given pressure loss. The results are considered very good and satisfactory because the errors identified in the measuring instrument system are within the margin of error allowed by the regulations.
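
For orientation, the ISO 5167 mass-flow relation that such a program solves is q_m = C / sqrt(1 - beta^4) * eps * (pi/4) * d^2 * sqrt(2 * dp * rho). The paper's program is in MATLAB/Simulink; the Python sketch below fixes the discharge coefficient and expansibility as constants, whereas ISO 5167 iterates them from the Reynolds number:

```python
import math

def orifice_mass_flow(d, D, dp, rho, C=0.61, eps=1.0):
    """ISO 5167 orifice-plate mass flow (kg/s).

    d, D : orifice and pipe diameters (m); dp : differential pressure (Pa);
    rho  : upstream density (kg/m^3); C, eps : discharge coefficient and
    expansibility factor, fixed here instead of iterated per ISO 5167.
    """
    beta = d / D
    return (C / math.sqrt(1 - beta ** 4)) * eps * (math.pi / 4) * d ** 2 \
        * math.sqrt(2 * dp * rho)

# Example: 50 mm orifice in a 100 mm line, 25 kPa drop, gas at 50 kg/m^3.
print(f"{orifice_mass_flow(0.05, 0.10, 25e3, 50.0):.3f} kg/s")
```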

Keywords: analysis, control, gas, metering system

Procedia PDF Downloads 380
6072 A Comparative Study of Cognitive Factors Affecting Social Distancing among Vaccinated and Unvaccinated Filipinos

Authors: Emmanuel Carlo Belara, Albert John Dela Merced, Mark Anthony Dominguez, Diomari Erasga, Jerome Ferrer, Bernard Ombrog

Abstract:

Social distancing errors are commonly prevalent among both the vaccinated and the unvaccinated in the Filipino community. This study aims to identify the cognitive factors behind these errors and to relate how they affect daily life. The observed factors include memory, attention, anxiety, decision-making, and stress. Upon applying ergonomic tools and statistical treatments such as the t-test and multiple linear regression, stress and attention turned out to have the greatest impact on social distancing errors.
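
A minimal sketch of the statistical treatment named above: an independent-samples t-test between vaccinated and unvaccinated groups, followed by a multiple linear regression of distancing errors on the five cognitive factors. All data here are invented placeholders:

```python
import numpy as np
import statsmodels.api as sm
from scipy import stats

rng = np.random.default_rng(1)
n = 120
cols = ["memory", "attention", "anxiety", "decision_making", "stress"]
factors = rng.normal(size=(n, len(cols)))        # standardized factor scores
# Simulated outcome in which stress and attention dominate, as in the findings.
errors = 2 + 0.8 * factors[:, 4] + 0.6 * factors[:, 1] + rng.normal(size=n)
vaccinated = rng.integers(0, 2, n).astype(bool)

t, p = stats.ttest_ind(errors[vaccinated], errors[~vaccinated])
print(f"t = {t:.2f}, p = {p:.3f}")

model = sm.OLS(errors, sm.add_constant(factors)).fit()
print(model.summary(xname=["const"] + cols))
```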

Keywords: vaccinated, unvaccinated, social distancing, Filipinos

Procedia PDF Downloads 180
6071 Reasons for the Selection of the Information-Processing Framework and the Philosophy of Mind as a General Account for Error Analysis and Explanation in Mathematics

Authors: Michael Lousis

Abstract:

This research study is concerned with learners' errors in Arithmetic and Algebra. The data resulted from a broader international comparative research program called the Kassel Project. However, its conceptualisation differed from and contrasted with that of the main program, which was mostly based on socio-demographic data. The way in which the research study was conducted was not dependent on the researcher's discretion but was dictated by the nature of the problem under investigation. This is because the phenomenon of learners' mathematical errors is due neither to the intentions of learners, nor to institutional processes, rules and norms, nor to the educators' intentions and goals, but rather to the way certain information is presented to learners and how their cognitive apparatus processes this information. Several approaches to the study of learners' errors have been developed since the beginning of the 20th century, encompassing different belief systems. These approaches were based on behaviourist theory, on the Piagetian-constructivist research framework, on the perspective that followed the philosophy of science, and on the information-processing paradigm. The researcher of the present study needed to disclose the course of thinking that led learners to specific observable actions resulting in particular errors on specific problems, rather than analysing scripts in which the students' thoughts were presented in written form. This, in turn, entailed that the choice of methods had to be appropriate and conducive to seeing and realising the learners' errors from the perspective of the participants in the investigation. This particular fact determined important decisions concerning the selection of an appropriate framework for analysing the mathematical errors and giving explanations. Thus the belief systems of behaviourism, Piagetian constructivism, and the philosophy-of-science perspective were rejected, and the information-processing paradigm, in conjunction with the philosophy of mind, was adopted as a general account for the elaboration of the data. This paper explains why these decisions were appropriate and beneficial for conducting the present study and for establishing the ensuing thesis. Additionally, it explains why the reasons for adopting the information-processing paradigm in conjunction with the philosophy of mind provide sound and legitimate bases for the development of future studies concerning mathematical error analysis.

Keywords: advantages-disadvantages of theoretical prospects, behavioral prospect, critical evaluation of theoretical prospects, error analysis, information-processing paradigm, opting for the appropriate approach, philosophy of science prospect, Piagetian-constructivist research frameworks, review of research in mathematical errors

Procedia PDF Downloads 171
6070 Creation and Implementation of a New Palliative Care Drug Chart via a Closed-Loop Audit

Authors: Asfa Hussain, Chee Tang, Mien Nguyen

Abstract:

Introduction: The safe usage of medications depends on clear, well-documented prescribing. Medical drug charts should be checked regularly to ensure that they remain fit for purpose. Aims: The purpose of this study was to evaluate whether the Isabel Hospice drug charts were effective or prone to medical errors. The aim was to create a comprehensive palliative care drug chart in line with medico-legal guidelines and to minimise drug administration and prescription errors. Methodology: 50 medical drug charts were audited from March to April 2020 to assess whether they complied with medico-legal guidelines, in a hospice in the East of England. Meetings were held with the larger multi-disciplinary team (MDT), including the pharmacists, nursing staff, and doctors, to raise awareness of the issue. A preliminary drug chart was created using input from the wider MDT. The chart was revised and trialled over 15 times, and each time feedback from the MDT was incorporated into the subsequent template. In the midst of the COVID-19 pandemic, in September 2020, the finalised drug chart was trialled, and 50 new palliative drug charts were re-audited to evaluate the changes made. Results: Prescribing and administration errors were frequent prior to the implementation of the new chart. This improved significantly after the new drug charts were introduced, improving patient safety and care. The percentage of inadequately documented allergies went down from 66% to 20%, and incorrect oxygen prescriptions from 40% to 16%. Prescribed drug-drug interactions decreased by 30%. Conclusion: It is vital to have clear, standardised drug charts in line with medico-legal standards, to allow ease of prescription and administration of medications and to ensure optimum patient-centred care. This closed-loop audit demonstrated a significant improvement in documentation and in the prevention of potentially fatal drug errors and interactions.

Keywords: palliative care, drug chart, medication errors, drug-drug interactions, COVID-19, patient safety

Procedia PDF Downloads 150
6069 Evaluation of Correct Usage, Comfort and Fit of Personal Protective Equipment in Construction Work

Authors: Anna-Lisa Osvalder, Jonas Borell

Abstract:

There are several reasons behind the use, non-use, or inadequate use of personal protective equipment (PPE) in the construction industry. Comfort and accurate size support proper use, while discomfort, misfit, and difficulties in understanding how the PPE should be handled inhibit correct usage. The need for several pieces of protective equipment simultaneously can also create problems. The purpose of this study was to analyse the correct usage, comfort, and fit of different types of PPE used in construction work. Correct usage was analysed as guessability, i.e., users' perceptions of how to don, adjust, use, and doff the equipment, and whether it was used as intended. The PPE tested, individually or in combination, comprised a helmet, ear protectors, goggles, respiratory masks, gloves, protective clothing, and safety harnesses. First, an analytical evaluation was performed with ECW (enhanced cognitive walkthrough) and PUEA (predictive use error analysis) to search for usability problems and use errors during handling and use. Then usability tests were conducted to evaluate guessability, comfort, and fit, with 10 test subjects of different heights and body constitutions. The tests included observations during donning, five different outdoor work tasks, and doffing. The think-aloud method, short interviews, and subjective ratings were used. The analytical evaluation showed that some usability problems and use errors arise during donning and doffing, but with minor severity, mostly causing discomfort. A few use errors and usability problems arose for the safety harness, especially for novices, some of which could entail a high risk of severe incidents. The usability tests showed that discomfort arose for all test subjects when using a combination of PPE, increasing over time. For instance, goggles together with the face mask caused pressure, chafing at the nose, and heat rash on the face; this combination also limited the field of vision. The helmet in combination with the goggles and ear protectors did not fit well and caused uncomfortable pressure at the temples. No major problems were found with the individual fit of the PPE. The ear protectors, goggles, and face masks could be adjusted for different head sizes. The guessability of how to don and wear the combination of PPE was moderate, but it took some time to adjust the items for a good fit. The guessability was poor for the safety harness: few cues in the design showed how it should be donned, adjusted, or worn on the skeletal bones. Discomfort occurred when the straps were tightened too much, and the straps could not be adjusted to every body constitution, leading to non-optimal safety. To conclude, if several types of PPE are used together, discomfort leading to pain is likely to occur over time, which can result in misuse, non-use, or reduced performance. If people who are not regular users are to wear a safety harness correctly, the design needs to be improved for easier interpretation, correct positioning of the straps, and greater possibilities for individual adjustment. The results of this study can serve as a basis for re-design ideas for PPE, especially for PPE that must be used in combination.

Keywords: construction work, PPE, personal protective equipment, misuse, guessability, usability

Procedia PDF Downloads 68
6068 Interlingual Interference in Students’ Writing

Authors: Zakaria Khatraoui

Abstract:

Interlanguage has come to occupy a central role across a considerable research landscape. Whether academically driven or pedagogically oriented, interlanguage has become more important than ever: it probes theoretical and linguistic issues in the field and moves flexibly from idea to practice, vindicating a philosophy that bridges theory and educational practice. Characteristically, the present research offers a carefully developed theoretical framework that is in turn sustained by empirical teaching practices. The focus of this interlingual study is placed squarely on the syntactic errors projected in students' written performance. To attain this end, the paper adopts a rich set of qualitative methodological choices supported by a solid design. The central finding to be examined is the creative nature of syntactic errors, unequivocally evidenced by the tangible dominance of cognitively intralingual errors over linguistically interlingual ones. Subsequently, this paper attempts to highlight transferable implications worth noting for both theoretical and professional pedagogical principles. In particular, the results are relevant to the scholarly community in a multidimensional sense and recommend actions of educational value.

Keywords: interlanguage, interference, error, writing

Procedia PDF Downloads 44
6067 Improving Health Care and Patient Safety at the ICU by Using Innovative Medical Devices and ICT Tools: Examples from Bangladesh

Authors: Mannan Mridha, Mohammad S. Islam

Abstract:

Innovative medical technologies offer more effective medical care, with less risk to patients and healthcare personnel. Medical technology and devices, when properly used, provide better data, precise monitoring, and less invasive treatment, and can be more targeted and often less costly. The Intensive Care Unit (ICU), equipped with patient monitoring, respiratory and cardiac support, pain management, emergency resuscitation, and life support devices, is particularly prone to medical errors, for various reasons. Many people in developing countries now wonder whether a visit to the hospital might harm rather than help them. This is because clinicians in developing countries are required to manage an increasing workload with limited resources and in the absence of a well-functioning safety system. A team of experts in medicine and in biomedical and clinical engineering from Sweden and Bangladesh worked together to study incidents and adverse events in ICUs in Bangladesh. The study included both public and private hospitals, to provide a better understanding of the physical structures, organization, and practice of the operating processes of care and of the occurrence of adverse outcomes (the errors, risks, and accidents related to medical devices in the ICU), and to develop an ICT-based support system to reduce hazards and errors and thus improve the quality of performance, care, and cost-effectiveness in the ICU. Concrete recommendations and guidelines have been made for preparing appropriate ICT-related tools and methods for improving the routines for the use of medical devices and for the reporting and analysis of incidents in the ICU, in order to reduce the number of undetected and unsolved incidents and thus improve patient safety.

Keywords: intensive care units, medical errors, medical devices, patient care and safety

Procedia PDF Downloads 127
6066 Measuring the Effectiveness of Response Inhibition with Regard to Motor Complexity: Evidence from the Stroop Effect

Authors: Germán Gálvez-García, Marta Lavin, Javiera Peña, Javier Albayay, Claudio Bascour, Jesus Fernandez-Gomez, Alicia Pérez-Gálvez

Abstract:

We studied the effectiveness of response inhibition in movements with different degrees of motor complexity when they were executed in isolation and alternately. Sixteen participants performed the Stroop task, which was used as a measure of response inhibition. Participants responded by lifting the index finger and by reaching to the screen with the same finger. Both actions were performed separately and alternately in different experimental blocks. Repeated-measures ANOVAs were used to compare reaction time, movement time, kinematic errors, and movement errors across conditions (experimental block, movement, and congruency). Delta plots were constructed to perform distributional analyses of response inhibition and accuracy rate. The effectiveness of response inhibition did not differ when the movements were performed in separate blocks. Nevertheless, it differed when they were performed alternately in the same experimental block, being more effective for the lifting action. This could be due to competition for the available resources in a more complex scenario, which also demands adopting some strategy to avoid errors.

Keywords: response inhibition, motor complexity, Stroop task, delta plots

Procedia PDF Downloads 375
6065 Simo-syl: A Meta-Phonological Intervention to Support Italian Pre-Schoolers’ Emergent Literacy Skills

Authors: Tamara Bastianello, Rachele Ferrari, Marinella Majorano

Abstract:

The adoption of the syllabic approach in preschool programmes could support and reinforce meta-phonological awareness and literacy skills in children. The introduction of a meta-phonological intervention in preschool could facilitate the transition to primary school, especially for children with learning fragilities. In the present contribution, we investigate the efficacy of the Simo-syl intervention in enhancing children's emergent literacy skills (especially reading). Simo-syl is a 12-week multimedia programme developed to improve children's language and communication skills and later literacy development in preschool. During the intervention, Simo-syl, an invented character, leads children through a series of meta-phonological games. Forty-six Italian preschool children (the Simo-syl group) participated in the programme; seventeen preschool children (the control group) did not. Children in the two groups were between 4;10 and 5;9 years old. They were assessed on their vocabulary, morpho-syntactic, meta-phonological, phonological, and phono-articulatory skills twice: (1) at the beginning of the last year of preschool, through standardised paper-based assessment tools, and (2) one week after the intervention. All children in the Simo-syl group took part in the meta-phonological programme based on the syllabic approach. The intervention lasted 12 weeks (three activities per week; week 1: activities focused on syllable blending and spelling and a first approach to the written code; weeks 2-11: activities focused on syllable recognition; week 12: activities focused on vowel recognition). Only a subset of the children (Simo-syl group = 21, control group = 9) was tested again (post-test) one week after the intervention. Before the intervention programme started, the Simo-syl and control groups had similar meta-phonological, phonological, and lexical skills (all ps > .05). One week after the intervention, a significant difference emerged between the two groups in their meta-phonological skills (syllable blending, p = .029; syllable spelling, p = .032), in their vowel recognition ability (p = .032), and in their word reading skills (p = .05). An ANOVA confirmed the effect of group membership on the developmental growth for the word reading task (F(1,28) = 6.83, p = .014, ηp² = .196). Taking part in the Simo-syl intervention has a positive effect on preschool children's reading ability.

Keywords: intervention programme, literacy skills, meta-phonological skills, syllabic approach

Procedia PDF Downloads 141
6064 Forecasting Age-Specific Mortality Rates and Life Expectancy at Birth for Malaysian Sub-Populations

Authors: Syazreen N. Shair, Saiful A. Ishak, Aida Y. Yusof, Azizah Murad

Abstract:

In this paper, we forecast age-specific Malaysian mortality rates and life expectancy at birth by gender and ethnic group, including Malay, Chinese, and Indian. Two mortality forecasting models are adopted: the original Lee-Carter model and its recent modified version, the product-ratio coherent model. While the first forecasts the mortality rates for each subpopulation independently, the latter accounts for the relationships between sub-populations. The evaluation of both models is performed using out-of-sample forecast errors: mean absolute percentage errors (MAPE) for mortality rates and mean forecast errors (MFE) for life expectancy at birth. The best model is then used to perform long-term forecasts up to the year 2030, the year when Malaysia is expected to become an aged nation. Results suggest that, in terms of overall accuracy, the product-ratio model performs better than the original Lee-Carter model. The association with the lower-mortality group (Chinese) in the subpopulation model can improve the forecasts for the higher-mortality groups (Malay and Indian).
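
The original Lee-Carter step can be sketched compactly: center the log-mortality matrix by age to get a_x, take a rank-1 SVD to obtain b_x and k_t, then extrapolate k_t with a random walk with drift. The matrix below is synthetic, not Malaysian data:

```python
import numpy as np

rng = np.random.default_rng(0)
ages, years = 20, 30
# Synthetic log-mortality surface: rises with age, improves over time.
log_m = (-8 + 0.09 * np.arange(ages)[:, None]
         - 0.02 * np.arange(years)[None, :]
         + 0.05 * rng.normal(size=(ages, years)))

a_x = log_m.mean(axis=1)                        # average age pattern
U, s, Vt = np.linalg.svd(log_m - a_x[:, None], full_matrices=False)
b_x = U[:, 0] / U[:, 0].sum()                   # age response, sums to 1
k_t = s[0] * Vt[0] * U[:, 0].sum()              # mortality period index

drift = (k_t[-1] - k_t[0]) / (years - 1)        # random walk with drift
k_future = k_t[-1] + drift * np.arange(1, 11)   # 10-year forecast of k_t
m_future = np.exp(a_x[:, None] + b_x[:, None] * k_future[None, :])
print(m_future[:3, :3])                         # forecast mortality rates
```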

Keywords: coherent forecasts, life expectancy at birth, Lee-Carter model, product-ratio model, mortality rates

Procedia PDF Downloads 200
6063 Mathematical Competence as It Is Defined through Learners' Errors in Arithmetic and Algebra

Authors: Michael Lousis

Abstract:

Mathematical competence is the great aim of every mathematical teaching and learning endeavour. It can be defined as an idealised conceptualisation of the quality of cognition and of the ability to implement in practice the mathematical subject matter included in the curriculum, displayed only through the performance of doing mathematics. The present study gives a clear definition of mathematical competence in the domains of Arithmetic and Algebra that stems from the explanation of learners' errors in these domains. The learners whose errors are explained were Greek and English participants in a large, international, longitudinal, comparative research program entitled the Kassel Project. The participants' errors emerged from their work on the mathematical questions and problems of the tests presented to them. The tests were constructed so that only the outcomes of the participants' work were captured, not the course of thinking that led to these outcomes. The intention was for the tests to provide directly comparable results while avoiding any probable bias. Such bias could stem from involving so many markers from different countries and cultures, with so many different belief systems concerning the assessment of learners' course of thinking. In this way, the validity of the research was protected. This fact forced specific research methods and theoretical prospects to be employed in order for the participants' erroneous ways of thinking to be disclosed. These were methodological pragmatism, symbolic interactionism, the philosophy of mind, and the ideas of computationalism, which were used for deciding and establishing the grounds of the adequacy and legitimacy of the kinds of knowledge obtained through the explanations given by the error analysis. The employment of this methodology and of these theoretical prospects resulted in the definition of the learners' mathematical competence, which is the thesis of the present study. Thus, learners' mathematical competence depends upon three key elements that should be developed in their minds: appropriate representations, appropriate meaning, and appropriately developed schemata. This definition then determined the development of appropriate teaching practices and interventions conducive to the achievement, and finally the entailment, of mathematical competence.

Keywords: representations, meaning, appropriate developed schemata, computationalism, error analysis, explanations for the probable causes of the errors, Kassel Project, mathematical competence

Procedia PDF Downloads 248
6062 FT-NIR Method to Determine Moisture in Gluten-Free Rice-Based Pasta during Drying

Authors: Navneet Singh Deora, Aastha Deswal, H. N. Mishra

Abstract:

Pasta is one of the most widely consumed food products around the world. Rapid determination of the moisture content in pasta will assist food processors in providing online quality control of pasta during large-scale production. A rapid Fourier-transform near-infrared (FT-NIR) method was developed for determining the moisture content in pasta. A calibration set of 150 samples, a validation set of 30 samples, and a prediction set of 25 samples of pasta were used. The diffuse reflection spectra of different types of pasta were measured by an FT-NIR analyzer in the 4,000-12,000 cm⁻¹ spectral range. Calibration and validation sets were designed for the conception and evaluation of the method's adequacy in the moisture content range of 10 to 15 percent (w.b.) of the pasta. Prediction models based on partial least squares (PLS) regression were developed in the near-infrared. Conventional criteria such as R², the root mean square error of cross-validation (RMSECV), the root mean square error of estimation (RMSEE), and the number of PLS factors were considered for the selection among three pre-processing methods (vector normalization, minimum-maximum normalization, and multiplicative scatter correction). Spectra of the pasta samples were treated with different mathematical pre-treatments before being used to build models between the spectral information and moisture content. The moisture content in pasta predicted by the FT-NIR method correlated very well with the values determined via traditional methods (R² = 0.983), which clearly indicates that FT-NIR methods could be used as an effective tool for rapid determination of moisture content in pasta. The best calibration model was developed with min-max normalization (MMN) spectral pre-processing (R² = 0.9775), and a maximum coefficient of determination (R²) of 0.9875 was obtained for the calibration model finally developed.
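
A hedged sketch of the modeling pipeline described above: min-max normalization of each spectrum followed by PLS regression, with RMSECV used to pick the number of factors. It assumes scikit-learn, and synthetic spectra stand in for the FT-NIR data:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
X = rng.normal(size=(150, 500))                  # synthetic absorbance spectra
y = 10 + 5 * rng.random(150)                     # moisture, 10-15% (w.b.)
X += y[:, None] * np.linspace(0, 0.02, 500)      # embed a moisture signal

# Min-max normalization (MMN) of each spectrum, as in the best model above.
Xn = (X - X.min(axis=1, keepdims=True)) / np.ptp(X, axis=1, keepdims=True)

def rmsecv(k):
    """10-fold cross-validated RMSE for a PLS model with k factors."""
    pred = cross_val_predict(PLSRegression(n_components=k), Xn, y, cv=10)
    return float(np.sqrt(np.mean((pred.ravel() - y) ** 2)))

best = min(range(1, 11), key=rmsecv)             # factor count minimizing RMSECV
print(f"Best number of PLS factors: {best}, RMSECV = {rmsecv(best):.3f}")
```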

Keywords: FT-NIR, pasta, moisture determination, food engineering

Procedia PDF Downloads 239
6061 Identification of Architectural Design Error Risk Factors in Construction Projects Using IDEF0 Technique

Authors: Sahar Tabarroki, Ahad Nazari

Abstract:

The design process is one of the key project processes in the construction industry. Although architects have the responsibility to produce complete, accurate, and coordinated documents, architectural design is accompanied by many errors. A design error occurs when the constraints and requirements of the design are not satisfied. Errors are potentially costly and time-consuming to correct if not caught early during the design phase, and they become expensive in either the construction documents or the construction phase. The aim of this research is to identify the risk factors of architectural design errors, so the identification of risks is necessary. First, a literature review of the design process was conducted, and then a questionnaire was designed to identify the risks and risk factors. The questions in the questionnaire were based on the "similar services description of study and supervision of architectural works" published by the "Vice Presidency of Strategic Planning & Supervision of I.R. Iran" as the basis of architects' tasks. Second, the top 10 risks of architectural activities were identified. To determine the positions of possible causes of risks with respect to architectural activities, these activities were located in a design process modelled with the IDEF0 technique. The research was carried out by choosing a case study, checking the design drawings, interviewing its architect and client, and preparing a checklist in order to identify concrete examples of architectural design errors. The results revealed that activities such as "defining the current and future requirements of the project", "studies and space planning", and "time and cost estimation of the suggested solution" have a higher error risk than others. Moreover, the most important causes include "unclear goals of the client", "time pressure from the client", and "architects' lack of knowledge about the requirements of end-users". For error detection in the case study, the lack of criteria, standards, and design criteria, and the lack of coordination among them, was a barrier; "lack of coordination between the architectural design and electrical and mechanical facilities", "violation of standard dimensions and sizes in space design", and "design omissions" were identified as the most important design errors.

Keywords: architectural design, design error, risk management, risk factor

Procedia PDF Downloads 112
6060 The Acquisition of Spanish L4 by Learners with Croatian L1, English L2 and Italian L3

Authors: Barbara Peric

Abstract:

The study of acquiring a third and additional language has garnered significant focus within second language acquisition (SLA) research. Initially, it was commonly viewed as merely an extension of second language acquisition. However, in the last two decades, numerous researchers have emphasized the need to recognize the unique characteristics of third language acquisition (TLA). This recognition is crucial for understanding the intricate cognitive processes that arise from the interaction of more than two linguistic systems in the learner's mind. This study investigates cross-linguistic influences in the acquisition of Spanish as a fourth language by students who have Croatian as a first language (L1), English as a second language (L2), and Italian as a third language (L3). Observational data suggest that the influence or transfer of linguistic elements can arise not only from one's native language (L1) but also from non-native languages. This implies that, for individuals proficient in multiple languages, the native language does not consistently hold a superior position; instead, it should be examined alongside other potential sources of linguistic transfer. Earlier studies have demonstrated that high proficiency in a second language can significantly impact cross-linguistic influences when acquiring a third and additional language. Among the extensively examined factors, the typological relationship stands out as one of the most scrutinized variables. The goal of the present study was to explore whether language typology and formal similarity or proficiency in the second language had a more significant impact on L4 acquisition. Participants in this study were third-year undergraduate students at Rochester Institute of Technology's subsidiary in Croatia (RIT Croatia). All the participants had exclusively Croatian as L1, English as L2, and Italian as L3, and were learning Spanish as L4 at the time of the study. All the participants had a high level of proficiency in English and a low level of proficiency in Italian. Based on the error analysis, the findings indicate that for some types of lexical errors, such as coinage, language typology had a more significant impact, and Italian was the preferred source of transfer despite the low proficiency in that language. For some other types of lexical errors, such as calques, second language proficiency had a more significant impact, and English was the preferred source of transfer. On the other hand, Croatian, Italian, and Spanish are more similar in the area of morphology due to their higher degree of inflection compared to English, and the strongest influence of the Croatian language was precisely in the area of morphology. The results emphasize the need to consider linguistic resemblances between the native language (L1) and the third and additional language, as well as the learners' proficiency in the second language, when developing successful teaching strategies for acquiring a third and additional language. These conclusions add to the expanding knowledge in the realm of SLA and offer practical insights for language educators aiming to enhance the effectiveness of learning experiences in acquiring a third and additional language.

Keywords: third and additional language acquisition, cross-linguistic influences, language proficiency, language typology

Procedia PDF Downloads 26
6059 Implementing Fault Tolerance with Proxy Signature on the Improvement of RSA System

Authors: H. El-Kamchouchi, Heba Gaber, Fatma Ahmed, Dalia H. El-Kamchouchi

Abstract:

Fault tolerance and data security are two important issues in modern communication systems. During the transmission of data between the sender and receiver, errors may occur frequently. Therefore, the sender must re-transmit the data to the receiver in order to correct these errors, which makes the system very feeble. To improve the scalability of the scheme, we present a proxy signature scheme with fault tolerance over an efficient and secure authenticated key agreement protocol based on the improved RSA system. Authenticated key agreement protocols play an important role in building secure communications between the two parties.
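
To make the role of the signature in fault detection concrete, here is a toy textbook-RSA sketch (tiny insecure key, no padding; this is not the paper's improved RSA or its proxy construction): the receiver verifies the signature over the message digest and requests re-transmission on mismatch:

```python
import hashlib

# Toy RSA key (demo sizes only): n = p*q, e*d = 1 mod (p-1)*(q-1).
p, q, e = 1009, 1013, 65537
n = p * q
d = pow(e, -1, (p - 1) * (q - 1))

def sign(msg: bytes) -> int:
    h = int.from_bytes(hashlib.sha256(msg).digest(), "big") % n
    return pow(h, d, n)

def verify(msg: bytes, sig: int) -> bool:
    h = int.from_bytes(hashlib.sha256(msg).digest(), "big") % n
    return pow(sig, e, n) == h

msg = b"meter reading: 42.7"
sig = sign(msg)
corrupted = b"meter reading: 43.7"   # a transmission error flips one digit
print(verify(msg, sig))              # True  -> accept the data
print(verify(corrupted, sig))        # False -> request re-transmission
```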

Keywords: fault tolerance, improved RSA, key agreement, proxy signature

Procedia PDF Downloads 395
6058 The Effect of Gender on the Three Types of Aggression among Kuwaiti Children

Authors: Hend Almaseb

Abstract:

Child aggression is a serious social problem that affects children's lives. This study examines the relationships among three types of aggressive behavior (physical, verbal, and indirect aggression) from sociocultural and social work perspectives. It also investigates the effect of gender on the three types of aggressive behavior and identifies the most frequently used aggressive behaviors in a sample of 329 Kuwaiti children. The results show a positive correlation between the three types of aggression and gender.

Keywords: child aggression, indirect aggression, physical aggression, verbal aggression

Procedia PDF Downloads 351
6057 Feedback in the Language Class: An Action Research Process

Authors: Arash Golzari Koloor

Abstract:

Feedback seems to be an inseparable part of teaching a second/foreign language. One type of feedback is corrective feedback, which is one form of error treatment in second language classrooms. This study reports on the types of corrective feedback employed in an IELTS preparation course. The types of feedback, their frequencies, and their effectiveness are listed, enumerated, and interpreted. The results showed that explicit correction and recast were the most frequent types of feedback, while repetition and elicitation were the least frequent. The results also revealed that metalinguistic feedback, elicitation, and explicit correction were the most effective types of feedback and greatly affected learners' performance.

Keywords: classroom interaction, corrective feedback, error treatment, oral performance

Procedia PDF Downloads 309
6056 Orthogonal Regression for Nonparametric Estimation of Errors-In-Variables Models

Authors: Anastasiia Yu. Timofeeva

Abstract:

Two new algorithms for nonparametric estimation of errors-in-variables models are proposed. The first algorithm is based on a penalized regression spline. The spline is represented as a piecewise-linear function, and for each linear portion an orthogonal regression is estimated. This algorithm is iterative. The second algorithm involves locally weighted regression estimation. When the independent variable is measured with error, such estimation is a complex nonlinear optimization problem. The simulation results have shown the advantage of the second algorithm under the assumption that the true smoothing parameter values are known. Nevertheless, the use of some goodness-of-fit indexes for smoothing parameter selection gives similar results but has an oversmoothing effect.
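
The building block of the first algorithm, an orthogonal (total least squares) regression on one linear portion, has a closed form via the SVD of the centered data. A sketch with synthetic noisy data:

```python
import numpy as np

rng = np.random.default_rng(0)
x_true = np.linspace(0, 10, 200)
x = x_true + rng.normal(scale=0.3, size=200)     # error in the regressor too
y = 2.0 + 0.7 * x_true + rng.normal(scale=0.3, size=200)

# Orthogonal regression: the right-singular vector for the smallest singular
# value of the centered data is the normal to the line that minimizes the
# sum of squared perpendicular (not vertical) distances.
Z = np.column_stack([x - x.mean(), y - y.mean()])
_, _, Vt = np.linalg.svd(Z, full_matrices=False)
nx, ny = Vt[-1]                                  # normal vector (nx, ny)
slope = -nx / ny
intercept = y.mean() - slope * x.mean()
print(f"orthogonal fit: y = {intercept:.3f} + {slope:.3f} x")
```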

Keywords: grade point average, orthogonal regression, penalized regression spline, locally weighted regression

Procedia PDF Downloads 388
6055 Block Matching Based Stereo Correspondence for Depth Calculation

Authors: G. Balakrishnan

Abstract:

Stereo correspondence plays a major role in estimating the distance of an object from a stereo camera pair for various applications. In this paper, a stereo correspondence algorithm based on the block-matching technique is presented. Initially, an energy matrix is calculated for every disparity using a modified sum of absolute differences (SAD). High energy-matrix errors are removed using a threshold value in order to reduce mismatch errors. A smoothing filter is applied to eliminate unreliable disparity estimates across object boundaries. The purpose is to improve the reliability of the disparity map calculation. The experimental results show that the final depth map produces better results and can be used in all applications that use stereo cameras.
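
A minimal sketch of the SAD block-matching step on rectified grayscale images; the abstract's modified SAD, thresholding, and smoothing filter are omitted here, so this is plain SAD over a fixed disparity range:

```python
import numpy as np

def sad_disparity(left, right, max_disp=32, block=7):
    """Disparity map via plain SAD block matching on rectified images."""
    h, w = left.shape
    half = block // 2
    disp = np.zeros((h, w), dtype=np.int32)
    for y in range(half, h - half):
        for x in range(half + max_disp, w - half):
            patch = left[y - half:y + half + 1, x - half:x + half + 1]
            costs = [np.abs(patch - right[y - half:y + half + 1,
                                          x - d - half:x - d + half + 1]).sum()
                     for d in range(max_disp)]      # SAD energy per disparity
            disp[y, x] = int(np.argmin(costs))      # lowest-energy disparity
    return disp

left = np.random.rand(64, 96).astype(np.float32)    # stand-in rectified pair
right = np.roll(left, -4, axis=1)                   # known shift of 4 pixels
print(np.median(sad_disparity(left, right)[10:-10, 40:-10]))   # approx 4.0
```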

Keywords: stereo matching, filters, energy matrix, disparity

Procedia PDF Downloads 198
6054 An Approach to Specify Software Requirements in Semantic Form

Authors: Deepa Vijay, Chellammal Surianarayanan, Gopinath Ganapathy

Abstract:

The requirements of a software project serve as a guideline for the entire project team, steering the team towards producing the right outcome. As requirements are key in deciding the success of the project, they should be specified in an unambiguous manner. Requirements should also be complete and consistent, and should be interpreted by the entire software project team in the same way the customer intends. Specifying requirements textually is common in software development. This leads to poor understanding of the requirements, which results in more errors and degraded quality. Some literature focuses on semantic ways of specifying functional requirements that ensure the consistency and completeness of requirements. Alternatively, in this work, a method is proposed to map syntactic requirements to corresponding semantics in the form of ontologies. This improves the understanding of requirements, prevents errors, and improves quality.
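
A hedged illustration of mapping a textual requirement to semantic form using rdflib; the namespace, class names, and properties below are invented for the example, since the paper's actual ontology is not given in the abstract:

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, RDFS

REQ = Namespace("http://example.org/requirements#")   # hypothetical namespace
g = Graph()
g.bind("req", REQ)

# "The system shall lock the account after three failed login attempts."
g.add((REQ.R42, RDF.type, REQ.FunctionalRequirement))
g.add((REQ.R42, REQ.actor, REQ.System))
g.add((REQ.R42, REQ.action, REQ.LockAccount))
g.add((REQ.R42, REQ.trigger, Literal("3 failed login attempts")))
g.add((REQ.R42, RDFS.comment,
       Literal("The system shall lock the account after three failed logins.")))

# Consistency and completeness checks can then be run as graph queries.
for s, p, o in g.triples((REQ.R42, None, None)):
    print(p, "->", o)
```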

Keywords: functional requirement, ontology, requirements management, semantics

Procedia PDF Downloads 350
6053 A Corpus Output Error Analysis of Chinese L2 Learners From America, Myanmar, and Singapore

Authors: Qiao-Yu Warren Cai

Abstract:

Due to the rise of big data, building corpora and using them to analyze Chinese L2 learners' language output has become a trend. Various empirical research has been conducted using Chinese corpora built by different academic institutes. However, most of this research analyzed the data in the Chinese corpora using corpus-based qualitative content analysis with descriptive statistics. Descriptive statistics can summarize the numerical data from the subjects or samples that the research has actually measured, but the collected data cannot be generalized to the population. Comte, a French positivist, argued as early as the 19th century that human knowledge, whether in the humanities and social sciences or in the natural sciences, should be verified in a scientific way in order to construct universal theories that explain the truth and human behavior. Inferential statistics, which is able to judge the probability that a difference observed between groups is dependable rather than caused by chance (Free Geography Notes, 2015) and to infer from the subjects or samples what the population might think or how it might behave, is just the right method to support Comte's argument in the field of TCSOL. Inferential statistics is also a core of quantitative research, but little research has been conducted combining corpora with inferential statistics. Little research analyzes the differences in Chinese L2 learners' corpus output errors using one-way ANOVA, so the findings of previous research are limited to inferring the population's Chinese errors from the given samples' corpora. To fill this knowledge gap in the professional development of Taiwanese TCSOL, the present study uses one-way ANOVA to analyze the corpus output errors of Chinese L2 learners from America, Myanmar, and Singapore. The results show that no significant differences exist in 'shì (是) sentence' and word-order errors, but, compared with American and Singaporean learners, learners from Myanmar are significantly more prone to 'sentence blends.' Based on the above results, the present study provides an instructional approach and contributes to further exploration of how Chinese L2 learners can have (and use) learning strategies to reduce errors.
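
A minimal sketch of the one-way ANOVA used in the study, applied to per-learner error counts from the three L1 groups; the counts are invented placeholders, not corpus data:

```python
from scipy import stats

# Hypothetical "sentence blend" error counts per learner, by background.
america   = [2, 1, 3, 2, 0, 1, 2, 1]
myanmar   = [5, 4, 6, 3, 5, 7, 4, 5]
singapore = [1, 2, 2, 1, 3, 0, 2, 1]

f_stat, p_value = stats.f_oneway(america, myanmar, singapore)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
# A significant p warrants post-hoc pairwise comparisons (e.g., Tukey HSD)
# to locate which groups actually differ.
```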

Keywords: Chinese corpus, error analysis, one-way analysis of variance, Chinese L2 learners, Americans, Myanmar, Singaporeans

Procedia PDF Downloads 85