Search results for: target error
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4433

3953 Comparing the SALT and START Triage System in Disaster and Mass Casualty Incidents: A Systematic Review

Authors: Hendri Purwadi, Christine McCloud

Abstract:

Triage is a complex decision-making process that aims to categorize a victim's level of acuity and the need for medical assistance. Two triage systems widely used in Mass Casualty Incidents (MCIs) and disaster situations are START (Simple Triage and Rapid Treatment) and SALT (Sort, Assess, Lifesaving Interventions, Treatment/Transport). There is currently controversy regarding the effectiveness of SALT over the START triage system. This systematic review aims to investigate and compare the effectiveness of the SALT and START triage systems in disaster and MCI settings. The literature from 2009 to 2019 was searched systematically in PubMed, the Cochrane Library, CINAHL, Scopus, ScienceDirect, Medlib, and ProQuest. The review included simulation-based and medical-record-based studies investigating the accuracy and applicability of the SALT and START triage systems in adult and paediatric populations during MCIs and disasters. All study types were included. The Joanna Briggs Institute critical appraisal tools were used to assess the quality of the reviewed studies. Of the 1,450 articles identified in the search, 10 were included. Four themes were identified: accuracy, under-triage, over-triage, and time to triage per individual victim. The START triage system showed a wider and less consistent range of accuracy than the SALT triage system (44% to 94.2% for START compared to 70% to 83% for SALT). The under-triage error of the START triage system ranged from 2.73% to 20%, slightly lower than that of the SALT triage system (7.6% to 23.3%). The over-triage error of the START triage system was slightly greater than that of the SALT triage system (2% to 53% for START compared to 2% to 22% for SALT). The time to apply the START triage system was shorter than that of the SALT triage system (70-72.18 seconds for START compared to 78 seconds for SALT).
Consequently, the START triage system has a lower under-triage error and is faster than the SALT triage system in classifying victims of MCIs and disasters, whereas the SALT triage system is slightly more accurate and has a lower over-triage error. However, the magnitude of these differences is relatively small, so the effect on patient outcomes is not significant. Hence, regardless of triage error, either the START or the SALT triage system is equally effective for triaging victims of disasters and MCIs.
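The three error metrics compared above can be sketched as follows. This is a minimal illustration of how accuracy, under-triage, and over-triage rates are computed from paired true/assigned categories; the category ordering and the sample victim pairs are illustrative assumptions, not data from the reviewed studies.

```python
# Urgency ordering assumed for illustration (least to most urgent).
ORDER = {"expectant": 0, "minor": 1, "delayed": 2, "immediate": 3}

def triage_metrics(pairs):
    """pairs: list of (true_category, assigned_category) per victim.
    Under-triage: assigned category less urgent than the true one.
    Over-triage: assigned category more urgent than the true one."""
    n = len(pairs)
    correct = sum(1 for t, a in pairs if t == a)
    under = sum(1 for t, a in pairs if ORDER[a] < ORDER[t])
    over = sum(1 for t, a in pairs if ORDER[a] > ORDER[t])
    return {"accuracy": correct / n,
            "under_triage": under / n,
            "over_triage": over / n}

# Hypothetical results for five victims.
pairs = [("immediate", "immediate"), ("immediate", "delayed"),
         ("minor", "minor"), ("minor", "delayed"), ("delayed", "delayed")]
print(triage_metrics(pairs))  # accuracy 0.6, under- and over-triage 0.2 each
```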

Keywords: disaster, effectiveness, mass casualty incidents, START triage system, SALT triage system

Procedia PDF Downloads 112
3952 Adaptation in Translation of 'Christmas Every Day' Short Story by William Dean Howells

Authors: Mohsine Khazrouni

Abstract:

The present study is an attempt to highlight the importance of adaptation in translation. To convey the message, the translator needs to take into account not only the text but also extra-linguistic factors such as the target audience. The present paper claims that adaptation is an unavoidable translation strategy when dealing with texts that are heavy with religious and cultural themes. The translation task becomes even more challenging when dealing with children's literature, as the audience is children, whose comprehension, experience, and world knowledge are limited. The study uses the Arabic translation of the short story 'Christmas Every Day' as a case study. The short story will be translated, and the pragmatic problems involved will be discussed. The focus will be on the issue of adaptation, i.e., how the source text should be adapted to the target-language audience's social and cultural environment.

Keywords: pragmatic adaptation, Arabic translation, children's literature, equivalence

Procedia PDF Downloads 184
3951 Integrating Deterministic and Probabilistic Safety Assessment to Decrease Risk & Energy Consumption in a Typical PWR

Authors: Ebrahim Ghanbari, Mohammad Reza Nematollahi

Abstract:

Integrated deterministic and probabilistic safety assessment (IDPSA) is one of the most commonly used approaches in the safety analysis of power plant accidents. It is now recognized that human error plays no smaller a role in creating these accidents than systemic errors, so both human interventions and system errors must be represented in fault and event sequences. Integrating these analytical topics is reflected in the core damage frequency and in the study of the plant's use of water resources during an accident such as the loss of all electrical power. In this regard, the station blackout (SBO) accident was simulated for a pressurized water reactor in the deterministic analysis, and by analyzing the operator's behavior in controlling the accident, the results of the combined deterministic and probabilistic assessment were identified. The results showed that the best performance of the plant operator would reduce the risk of the accident by 10% and decrease the consumption of the plant's water sources by 6.82 liters/second.

Keywords: IDPSA, human error, SBO, risk

Procedia PDF Downloads 105
3950 Heat Transfer Enhancement by Turbulent Impinging Jet with Jet's Velocity Field Excitations Using OpenFOAM

Authors: Naseem Uddin

Abstract:

Impinging jets are used in a variety of engineering and industrial applications. This paper presents numerical simulations of heat transfer by a turbulent impinging jet with velocity field excitations, using different Reynolds-Averaged Navier-Stokes (RANS) models. Detached Eddy Simulations are also conducted to investigate the differences in the prediction capabilities of the two simulation approaches. The excited jet is simulated in the open-source CFD code OpenFOAM with the goal of understanding the influence of the dynamics of the impinging jet on heat transfer. The jet's excitation frequencies are varied with the jet's preferred mode in view. The Reynolds number based on mean velocity and diameter is 23,000, and the jet outlet-to-target wall distance is two diameters. It is found that heat transfer at the target wall can be influenced by judicious selection of excitation amplitude and frequency.

Keywords: excitation, impinging jet, natural frequency, turbulence models

Procedia PDF Downloads 253
3949 A Hybrid Data-Handler Module Based Approach for Prioritization in Quality Function Deployment

Authors: P. Venu, Joeju M. Issac

Abstract:

Quality Function Deployment (QFD) is a systematic technique that creates a platform where customer responses can be positively converted into design attributes. The accuracy of a QFD process depends heavily on the data it handles, which are captured from customers or QFD team members. Customized computer programs that perform Quality Function Deployment within a stipulated time have been used by various companies across the globe. These programs rely heavily on the storage and retrieval of data in a common database. This database must act as a reliable source with minimal missing or erroneous values in order to perform accurate prioritization. This paper introduces a missing/erroneous-data handler module that uses a genetic algorithm and fuzzy numbers. The prioritization of customer requirements for sesame oil is illustrated, and a comparison is made between the proposed data-handler module-based deployment and manual deployment.
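One building block the hybrid module relies on can be sketched as follows. This is not the paper's genetic-algorithm-based handler; it only illustrates, under simplified assumptions, how a missing customer rating can be represented as a triangular fuzzy number built from the known ratings and defuzzified by the centroid method.

```python
def centroid(tfn):
    """Centroid defuzzification of a triangular fuzzy number (low, mode, high)."""
    a, b, c = tfn
    return (a + b + c) / 3.0

# Hypothetical QFD ratings table: rows = customers, cols = requirements.
ratings = [[4, 5, None, 3],
           [4, 4, 3, 3],
           [5, 4, 3, 2]]

col = 2
known = [row[col] for row in ratings if row[col] is not None]
# Fuzzify the column as (min, mean, max) and impute the missing entry.
tfn = (min(known), sum(known) / len(known), max(known))
ratings[0][col] = centroid(tfn)
```

In the paper's module, a genetic algorithm would instead search for the imputed values that best preserve consistency of the ratings; the centroid step above is only the fuzzy-number part of that pipeline.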

Keywords: hybrid data handler, QFD, prioritization, module-based deployment

Procedia PDF Downloads 271
3948 Satellite Image Classification Using Firefly Algorithm

Authors: Paramjit Kaur, Harish Kundra

Abstract:

In recent years, the swarm-intelligence-based firefly algorithm has attracted great interest from researchers for solving real-time optimization problems. Here, the firefly algorithm is applied to satellite image classification. For experimentation, the Alwar area is considered; it contains multiple land features such as vegetation, barren land, hilly terrain, residential areas, and water surfaces. The Alwar dataset consists of seven-band satellite images. The firefly algorithm is based on the attraction of less bright fireflies toward brighter ones. To evaluate the proposed concept, accuracy assessment parameters are calculated from the error matrix: the kappa coefficient, overall accuracy, and the feature-wise user's and producer's accuracies. Overall results are compared with BBO, PSO, hybrid FPAB/BBO, hybrid ACO/SOFM, and hybrid ACO/BBO on the basis of the kappa coefficient and overall accuracy.
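The accuracy assessment described above can be sketched directly from the error matrix. The confusion-matrix counts below are hypothetical, not the Alwar results; the formulas (overall accuracy, Cohen's kappa from the marginals, user's and producer's accuracies) are the standard ones.

```python
def accuracy_assessment(cm):
    """cm: square error (confusion) matrix; rows = classified, cols = reference."""
    k = len(cm)
    n = sum(sum(row) for row in cm)
    diag = sum(cm[i][i] for i in range(k))
    row_tot = [sum(row) for row in cm]
    col_tot = [sum(cm[i][j] for i in range(k)) for j in range(k)]
    overall = diag / n
    # Chance agreement from the row/column marginals (kappa's expected accuracy).
    chance = sum(r * c for r, c in zip(row_tot, col_tot)) / (n * n)
    kappa = (overall - chance) / (1.0 - chance)
    users = [cm[i][i] / row_tot[i] for i in range(k)]       # user's accuracy per class
    producers = [cm[j][j] / col_tot[j] for j in range(k)]   # producer's accuracy per class
    return overall, kappa, users, producers

cm = [[50, 5], [10, 35]]   # two-class example with hypothetical counts
overall, kappa, users, producers = accuracy_assessment(cm)
```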

Keywords: image classification, firefly algorithm, satellite image classification, terrain classification

Procedia PDF Downloads 372
3947 Lexical-Semantic Processing by Chinese as a Second Language Learners

Authors: Yi-Hsiu Lai

Abstract:

The present study aimed to elucidate lexical-semantic processing in learners of Chinese as a second language (CSL). Twenty L1 speakers of Chinese and twenty CSL learners in Taiwan participated in a picture naming task and a category fluency task. Based on their Chinese proficiency levels, the CSL learners were further divided into two sub-groups: ten learners of elementary Chinese proficiency and ten learners of intermediate Chinese proficiency. The instruments for the naming task were sixty black-and-white pictures: thirty-five object pictures and twenty-five action pictures. Object pictures were divided into two categories, living and non-living objects; action pictures comprised two categories, action verbs and process verbs. As in the naming task, the category fluency task consisted of two semantic categories: objects (i.e., living and non-living objects) and actions (i.e., action and process verbs). Participants were asked to report as many items within a category as possible in one minute. Oral productions were tape-recorded and transcribed for further analysis. Both error types and error frequencies were calculated, and statistical analysis was conducted to examine the error types and frequencies produced by the CSL learners. Additionally, category effects, pictorial effects, and L2 proficiency are discussed. The findings of the present study help characterize the lexical-semantic processing of Chinese naming in CSL learners of different proficiency levels and contribute to future Chinese vocabulary teaching and learning.

Keywords: lexical-semantic processing, Mandarin Chinese, naming, category effects

Procedia PDF Downloads 438
3946 Position and Speed Tracking of DC Motor Based on Experimental Analysis in LabVIEW

Authors: Muhammad Ilyas, Awais Khan, Syed Ali Raza Shah

Abstract:

DC motors are widely used in industry to provide mechanical power in the form of speed and torque. The position and speed control of DC motors is attracting the interest of the scientific community in robotics, especially for the robotic arm, a flexible-joint manipulator. The current research work addresses position control of DC motors using experimental investigations in LabVIEW. A linear control strategy is applied to track the position and speed of the DC motor, with comparative analysis on the LabVIEW platform and simulation analysis in MATLAB. The tracking error in the LabVIEW-based hardware setup is slightly greater than in the MATLAB simulation due to the inertial load of the motor during steady-state conditions. The controller output shows that the input voltage applied to the DC motor varies between 0 and 8 V to ensure minimal steady-state error while tracking the position and speed of the DC motor.
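The linear position loop described above can be sketched as a discrete PID controller driving a crude motor model. Everything numeric here is an illustrative assumption (gains, the first-order voltage-to-acceleration model, sample time); the drive voltage magnitude is limited to the 8 V reported in the abstract, with sign giving direction.

```python
def simulate_pid(setpoint=1.0, kp=8.0, ki=2.0, kd=0.1, dt=0.01, steps=3000):
    """Discrete PID position loop on a toy DC motor model (assumed dynamics)."""
    pos, vel, integral, prev_err = 0.0, 0.0, 0.0, setpoint
    for _ in range(steps):
        err = setpoint - pos
        integral += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integral + kd * deriv
        u = max(-8.0, min(8.0, u))        # saturate drive voltage magnitude at 8 V
        prev_err = err
        # Crude motor model: voltage -> acceleration, with viscous friction.
        acc = 5.0 * u - 2.0 * vel
        vel += acc * dt
        pos += vel * dt
    return pos

final_position = simulate_pid()           # settles near the 1.0 rad setpoint
```

The integral term drives the steady-state error toward zero, which is the behavior the abstract reports for the hardware loop.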

Keywords: DC motor, LabVIEW, proportional integral derivative control, position tracking, speed tracking

Procedia PDF Downloads 77
3945 Signal Processing Techniques for Adaptive Beamforming with Robustness

Authors: Ju-Hong Lee, Ching-Wei Liao

Abstract:

Adaptive beamforming using an antenna array of sensors is useful for adaptively detecting and preserving the presence of the desired signal while suppressing the interference and the background noise. Conventional adaptive array beamforming requires prior information on either the impinging direction or the waveform of the desired signal to adapt the weights. The adaptive weights of an antenna array beamformer under a steered-beam constraint are calculated by minimizing the output power of the beamformer subject to the constraint that forces the beamformer to make a constant response in the steering direction. Hence, the performance of the beamformer is very sensitive to the accuracy of the steering operation. It is well known in the literature that the performance of an adaptive beamformer deteriorates under any steering angle error encountered in many practical applications, e.g., wireless communication systems with massive antennas deployed at the base station and user equipment. Developing effective signal processing techniques to deal with the steering angle error in array beamforming systems has therefore become an important research topic. In this paper, we present an effective signal processing technique for constructing an adaptive beamformer that is robust against steering angle error. The proposed array beamformer adaptively estimates the actual direction of the desired signal by using the presumed steering vector and the received array data snapshots. Based on the presumed steering vector and a preset angle range for steering mismatch tolerance, we first create a matrix related to the direction vectors of the signal sources. Two projection matrices are generated from this matrix. The projection matrix associated with the desired signal information and the received array data are utilized to iteratively estimate the actual direction vector of the desired signal.
The estimated direction vector of the desired signal is then used to find an appropriate quiescent weight vector. The other projection matrix is set as the signal-blocking matrix required for performing adaptive beamforming. Accordingly, the proposed beamformer consists of adaptive quiescent weights and partially adaptive weights. Several computer simulation examples are provided for evaluating and comparing the proposed technique with existing robust techniques.
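The steered-beam constrained beamformer that the paper takes as its baseline (minimize output power w^H R w subject to w^H a = 1, the Capon/MVDR solution) can be sketched with synthetic data. This is not the paper's projection-based estimator; array geometry, source angles, and powers below are illustrative assumptions, and the presumed steering vector is taken to be exact (no mismatch).

```python
import numpy as np

rng = np.random.default_rng(0)
M, d = 8, 0.5                       # sensors, spacing in wavelengths (assumed)

def steering(theta_deg):
    """Uniform linear array steering vector for a plane wave from theta."""
    k = 2 * np.pi * d * np.sin(np.deg2rad(theta_deg))
    return np.exp(1j * k * np.arange(M))

theta_sig, theta_int = 0.0, 30.0    # desired signal and interferer directions
snaps = 500
s = rng.standard_normal(snaps) + 1j * rng.standard_normal(snaps)
i = 3 * (rng.standard_normal(snaps) + 1j * rng.standard_normal(snaps))
noise = 0.1 * (rng.standard_normal((M, snaps)) + 1j * rng.standard_normal((M, snaps)))
x = np.outer(steering(theta_sig), s) + np.outer(steering(theta_int), i) + noise

R = x @ x.conj().T / snaps          # sample covariance from the data snapshots
a = steering(theta_sig)             # presumed steering vector
w = np.linalg.solve(R, a)
w /= a.conj() @ w                   # enforce the unity-response constraint

resp_sig = abs(w.conj() @ steering(theta_sig))   # constrained to 1
resp_int = abs(w.conj() @ steering(theta_int))   # deeply suppressed
```

When `a` is replaced by a mismatched steering vector, the constrained response protects the wrong direction and the desired signal is partially cancelled, which is exactly the sensitivity the paper's projection-based direction estimation is designed to remove.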

Keywords: adaptive beamforming, robustness, signal blocking, steering angle error

Procedia PDF Downloads 101
3944 Permeability Prediction Based on Hydraulic Flow Unit Identification and Artificial Neural Networks

Authors: Emad A. Mohammed

Abstract:

The concept of hydraulic flow units (HFU) has been used for decades in the petroleum industry to improve the prediction of permeability. This concept is strongly related to the flow zone indicator (FZI), which is a function of the reservoir quality index (RQI). Both indices are based on the porosity and permeability of core samples. It is assumed that core samples with similar FZI values belong to the same HFU. Thus, after dividing the porosity-permeability data by HFU, transformations can be made to estimate permeability from porosity. The conventional practice is to use a power-law transformation with conventional HFUs, where the percentage error is considerably high. In this paper, a neural network technique is employed as a soft-computing transformation method to predict permeability instead of the power-law method, in order to reduce the percentage error. The technique is based on HFU identification using the method of Amaefule et al. (1993). In this regard, the Kozeny-Carman (K-C) model and the modified K-C model of Hasan and Hossain (2011) are employed. A comparison is made between the two transformation techniques for the two porosity-permeability models. Results show that the modified K-C model yields better results, with a lower percentage error in predicting permeability. The results also show that artificial intelligence techniques give more accurate predictions than the power-law method. This study was conducted on a heterogeneous, complex carbonate reservoir in Oman. Data were collected from seven wells to obtain the permeability correlations for the whole field. The findings of this study will help in obtaining better permeability estimates for a complex reservoir.
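The RQI/FZI relations underlying the HFU identification of Amaefule et al. (1993) can be sketched as follows, with permeability in mD and porosity as a fraction: RQI = 0.0314·sqrt(k/φ), φz = φ/(1-φ), FZI = RQI/φz. Inverting the definitions gives the per-HFU permeability transform; the sample values are illustrative, not the Oman core data.

```python
import math

def fzi(k_md, phi):
    """Flow zone indicator from core permeability (mD) and porosity (fraction)."""
    rqi = 0.0314 * math.sqrt(k_md / phi)   # reservoir quality index, microns
    phi_z = phi / (1.0 - phi)              # normalized porosity index
    return rqi / phi_z

def permeability(phi, fzi_val):
    """Inverse transform: permeability (mD) from porosity within one HFU."""
    phi_z = phi / (1.0 - phi)
    return phi * (fzi_val * phi_z / 0.0314) ** 2

# A core sample of 100 mD at 20% porosity round-trips through FZI.
k_back = permeability(0.20, fzi(100.0, 0.20))
```

Samples whose FZI values cluster together are assigned to one HFU, and the neural network in the paper then replaces the power-law `permeability` transform within each unit.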

Keywords: permeability, hydraulic flow units, artificial intelligence, correlation

Procedia PDF Downloads 106
3943 The Colombian Linguistic Landscape: A Study of Commercial Signs

Authors: Francia Martinez

Abstract:

This study documents and demonstrates the profound impact of the high status of American English and culture on the Colombian commercial landscape, due to the globalization and commodification of English. It also documents and describes how Colombian advertisers use various linguistic and visual mechanisms in the commercial linguistic landscape to convey messages, create an image with which the target audience can identify, and build a relationship with that target audience. The data (in the form of pictures) were collected in different cities in Colombia and were classified and organized into different categories for the reliability and validity of the analysis. The research questions were: Do the ubiquity and high status of American English and culture play a major role in the Colombian commercial linguistic landscape, and if so, how? What roles do national and local culture and language (Spanish) play in the commercial linguistic landscape? And what linguistic and visual strategies do Colombian advertisers employ to reach their target audience? Based on the data analysis and results, American and local culture and icons play a major role when Colombian advertisers create and design their commercial logos and ads to get consumers' attention and establish a rapport with them successfully. To achieve their objectives, Colombian advertisers rely on creative linguistic and visual techniques in their ads, such as puns, humor, irony, comparisons, metaphors, mocking, exaggeration, parody, personification, sarcasm, satire, allusion, onomatopoeia, and imitation (copycatting or cloning).

Keywords: Colombian ads, linguistic landscape, rhetorical devices, sociolinguistics

Procedia PDF Downloads 282
3942 Artificial Intelligence Based Predictive Models for Short Term Global Horizontal Irradiation Prediction

Authors: Kudzanayi Chiteka, Wellington Makondo

Abstract:

The whole world is driving to go green owing to the negative effects of burning fossil fuels. There is therefore an immediate need to identify and utilise alternative renewable energy sources. Among these energy sources, solar energy is one of the most dominant in Zimbabwe. Solar power plants used to generate electricity are entirely dependent on solar radiation. For planning purposes, solar radiation values should be known in advance so that the necessary arrangements can be made to minimise the negative effects of the absence of solar radiation due to cloud cover and other naturally occurring phenomena. This research focused on the prediction of global horizontal irradiation (GHI) values for the sixth day, given values for the past five days. Artificial intelligence techniques were used. Three models were developed, based on Support Vector Machines, a Radial Basis Function network, and a Feed-Forward Back-Propagation artificial neural network. Results revealed that Support Vector Machines give the best results of the three, with a mean absolute percentage error (MAPE) of 2%, a mean absolute error (MAE) of 0.05 kWh/m²/day, a root mean square error (RMSE) of 0.15 kWh/m²/day, and a coefficient of determination of 0.990. The other predictive models had MAPEs of 4.5% and 6% for the Radial Basis Function and Feed-Forward Back-Propagation networks, respectively, with coefficients of determination of 0.975 and 0.970. It was found that the prediction of GHI values for future days is possible using artificial-intelligence-based predictive models.
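The four figures of merit used to rank the models above can be sketched as follows. The GHI values below are made up for illustration; the formulas for MAPE, MAE, RMSE, and the coefficient of determination are the standard ones.

```python
import math

def metrics(actual, predicted):
    """MAPE (%), MAE, RMSE and coefficient of determination R^2."""
    n = len(actual)
    errs = [a - p for a, p in zip(actual, predicted)]
    mae = sum(abs(e) for e in errs) / n
    rmse = math.sqrt(sum(e * e for e in errs) / n)
    mape = 100.0 * sum(abs(e) / a for e, a in zip(errs, actual)) / n
    mean_a = sum(actual) / n
    ss_res = sum(e * e for e in errs)
    ss_tot = sum((a - mean_a) ** 2 for a in actual)
    r2 = 1.0 - ss_res / ss_tot
    return mape, mae, rmse, r2

actual = [5.1, 6.0, 4.8, 5.5]       # GHI in kWh/m2/day (hypothetical)
predicted = [5.0, 6.1, 4.9, 5.4]
mape, mae, rmse, r2 = metrics(actual, predicted)
```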

Keywords: solar energy, global horizontal irradiation, artificial intelligence, predictive models

Procedia PDF Downloads 253
3941 Religious Tourism the Core Strategy of Shaping Life Style: Evidences from Iran

Authors: Mostafa Jafari

Abstract:

Religious tourism is the core strategy shaping Iranians' lifestyle. Why and how? This paper answers these questions. Theoretical base: From a strategic marketing point of view, lifestyle is a pattern of beliefs, values, interests, and acts, and strategy can be defined as a set of continuous important decisions. Here, strategy means making decisions about the destination and vehicle of touristic travel in order to reform and redefine self-identity and shape lifestyle. Methodology: The target society of this research is selected residents of three provinces in the northwest of Iran. The data collection instruments are interviews and a questionnaire, and the collected data were analyzed by structural equation modeling (SEM) using LISREL software. Results: The primary results show that the variety of touristic travel plays an important role in shaping the new lifestyle of the Iranian people. The destinations of touristic travel (Europe, the USA, Japan, etc.) are the second priority, the number of foreign friends is the third, and the number of travels is the fourth criterion. Among all kinds of touristic travel, religious tourism plays the main role from a competitive point of view. Findings: The geometry of the Iranian lifestyle is shaped and reshaped through domestic and international tourism strategies, particularly the religious strategy. During this dynamic trend of identity redefinition, many Iranians give first priority to the quantity and quality of their touristic travel.

Keywords: religious tourism, core strategy, lifestyle shaping

Procedia PDF Downloads 388
3940 Comparison between Some of Robust Regression Methods with OLS Method with Application

Authors: Sizar Abed Mohammed, Zahraa Ghazi Sadeeq

Abstract:

The classic least squares (OLS) method is used to estimate linear regression parameters; when its assumptions hold, the estimators have good properties such as unbiasedness, minimum variance, and consistency. Alternative statistical techniques, known as robust (or resistant) methods, have been developed to estimate the parameters when the data are contaminated with outliers. In this paper, three robust methods are studied: the maximum-likelihood-type estimator (M-estimator), the modified maximum-likelihood-type estimator (MM-estimator), and the least trimmed squares estimator (LTS-estimator); their results are compared with the OLS method. The methods are applied to real data taken from the Duhok company for manufacturing furniture, and the results are compared using the criteria of mean squared error (MSE), mean absolute percentage error (MAPE), and mean sum of absolute error (MSAE). Important conclusions of this study are the following. In the furniture line, the outlying values detected by the four methods are few and very close to the data, indicating that the distribution of the errors is close to normal; in the doors line, however, OLS detects fewer outliers than the robust methods, indicating that the error distribution departs far from normality. Another important conclusion is that, for the doors line, the parameter estimates obtained by OLS are very far from those obtained by the robust methods; the LTS-estimator gave better results under the MSE criterion, the M-estimator under the MAPE criterion, and the MM-estimator under the MSAE criterion. The programs S-PLUS (version 8.0, Professional 2007), Minitab (version 13.2), and SPSS (version 17) were used to analyze the data.
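The contrast between OLS and an M-estimator can be sketched on synthetic data with one gross outlier; the data and the Huber tuning constant are illustrative, not the company data used in the paper. The M-estimator is fitted by iteratively reweighted least squares (IRLS) with a MAD-based robust scale.

```python
def wls(x, y, w):
    """Weighted least squares fit of y = a + b*x; returns (a, b)."""
    sw = sum(w)
    sx = sum(wi * xi for wi, xi in zip(w, x))
    sy = sum(wi * yi for wi, yi in zip(w, y))
    sxx = sum(wi * xi * xi for wi, xi in zip(w, x))
    sxy = sum(wi * xi * yi for wi, xi, yi in zip(w, x, y))
    b = (sw * sxy - sx * sy) / (sw * sxx - sx * sx)
    return (sy - b * sx) / sw, b

def huber_fit(x, y, c=1.345, iters=100):
    """Huber-type M-estimator via iteratively reweighted least squares."""
    w = [1.0] * len(x)
    for _ in range(iters):
        a, b = wls(x, y, w)
        resid = [yi - (a + b * xi) for xi, yi in zip(x, y)]
        mad = sorted(abs(r) for r in resid)[len(resid) // 2]
        s = mad / 0.6745 if mad else 1.0     # robust scale estimate
        # Huber weights: 1 inside the c*s band, downweighted outside it.
        w = [1.0 if abs(r) <= c * s else c * s / abs(r) for r in resid]
    return a, b

x = [1, 2, 3, 4, 5, 6]
y = [2.1, 4.0, 30.0, 7.9, 10.1, 12.2]    # y is roughly 2x, with one gross outlier
a_ols, b_ols = wls(x, y, [1.0] * len(x)) # OLS slope is dragged off by the outlier
a_rob, b_rob = huber_fit(x, y)           # robust slope recovers roughly 2
```

The LTS- and MM-estimators compared in the paper pursue the same goal by different routes (trimming the largest squared residuals, and a high-breakdown initial fit refined by an M-step, respectively).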

Keywords: robust regression, LTS-estimator, M-estimator, MSE

Procedia PDF Downloads 212
3939 Developing Islamic Module Project for Preschool Teachers Using Modified Delphi Technique

Authors: Mazeni Ismail, Nurul Aliah, Hasmadi Hassan

Abstract:

The purpose of this study is to gather the consensus of experts regarding the use of moral guidance amongst preschool teachers vis-à-vis the Islamic Project module (I-Project Module). This I-Project Module seeks to provide pertinent data on the assimilation of noble values in subject-matter teaching. To obtain consensus on the various components of the module, the Modified Delphi technique was used in its development. Twelve subject experts from the educational fields of Islamic education, early childhood education, counselling, and language fully participated in the development of this module. The Modified Delphi technique was administered in two main cycles. The standard deviation values derived from the questionnaires completed by the participating panel of experts provided the measure of expert consensus reached, and were analyzed using SPSS version 22. Findings revealed that the panel of experts reached a discernible degree of agreement on five topics outlined in the module, viz. content (mean value 3.36), teaching strategy (mean value 3.28), programme duration (mean value 3.0), staff involved and the attention-grabbing strategy for the target group participating in the values programme (mean value 3.5), and the strategy to attract the attention of the target group to utilize the I-Project (mean value 3.0). With regard to the strategy to attract the attention of the target group, the experts proposed that creative activities be added in order to enhance teachers' creativity.
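The consensus measure described above can be sketched with the standard library: each expert rates a module component, the mean summarizes agreement, and a low standard deviation indicates consensus. The ratings and the consensus threshold below are hypothetical, not the study's questionnaire data.

```python
import statistics

# Hypothetical 4-point-scale ratings of one module component by 12 experts.
content_ratings = [3, 4, 3, 3, 4, 3, 4, 3, 3, 4, 3, 3]

mean = statistics.mean(content_ratings)   # level of agreement
sd = statistics.stdev(content_ratings)    # spread across the panel
consensus = sd < 1.0                      # assumed cut-off for "consensus reached"
```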

Keywords: Modified Delphi Technique, Islamic project, noble values, teacher moral guidance

Procedia PDF Downloads 159
3938 Cognitive Translation and Conceptual Wine Tasting Metaphors: A Corpus-Based Research

Authors: Christine Demaecker

Abstract:

Many researchers have underlined the importance of metaphors in specialised language. Their use in specific domains helps us understand the conceptualisations used to communicate new ideas or difficult topics. Within the wide area of specialised discourse, wine tasting is a very specific example because its language is almost exclusively metaphoric. Wine tasting metaphors express various conceptualisations. They are not linguistic but rather conceptual, as defined by Lakoff & Johnson. They correspond to the linguistic expression of a mental projection from a well-known or more concrete source domain onto the target domain, which is the taste of wine. But unlike most specialised terminologies, the vocabulary is never clearly defined. When metaphorical terms are listed in dictionaries, their definitions remain vague, unclear, and circular, and they cannot be replaced by literal linguistic expressions. This makes it impossible to transfer them into another language with traditional linguistic translation methods. This qualitative research investigates whether wine tasting metaphors could instead be translated with the cognitive translation process, as described by Nili Mandelblit (1995). The research is based on a corpus compiled from two high-profile wine guides: Parker's Wine Buyer's Guide and its translation into French, and the Guide Hachette des Vins and its translation into English. In this small corpus, with a total of 68,826 words, 170 metaphoric expressions have been identified in the original English text and 180 in the original French text. They have been selected with the MIPVU metaphor identification procedure developed at the Vrije Universiteit Amsterdam. The selection demonstrates that both languages use the same set of conceptualisations, which are often combined in wine tasting notes, creating conceptual integrations or blends. The comparison of expressions in the source and target texts also demonstrates the use of the cognitive translation approach.
In accordance with the principle of relevance, the translation always uses target-language conceptualisations, but compared to the original, the highlighting of the projection is often different. Also, when original metaphors are complex, with a combination of conceptualisations, at least one element of the original metaphor underlies the target expression. This approach integrates perfectly into Lederer's interpretative model of translation (2006). In this triangular model, the transfer of conceptualisation could be included at the level of 'deverbalisation/reverbalisation', the crucial stage of the model, where the extraction of meaning combines with the encyclopaedic background to generate the target text.

Keywords: cognitive translation, conceptual integration, conceptual metaphor, interpretative model of translation, wine tasting metaphor

Procedia PDF Downloads 107
3937 Accuracy/Precision Evaluation of Excalibur I: A Neurosurgery-Specific Haptic Hand Controller

Authors: Hamidreza Hoshyarmanesh, Benjamin Durante, Alex Irwin, Sanju Lama, Kourosh Zareinia, Garnette R. Sutherland

Abstract:

This study reports on a proposed method to evaluate the accuracy and precision of Excalibur I, a neurosurgery-specific haptic hand controller designed and developed at Project neuroArm. Efficient and successful robot-assisted telesurgery is considerably contingent on how accurately and precisely a haptic hand controller (master/local robot) can convey the kinematic indices of motion, i.e., position and orientation, from the surgeon's upper limb to the slave/remote robot. A test rig was designed and manufactured according to standard ASTM F2554-10 to determine the accuracy and precision range of Excalibur I at four different locations within its workspace: the central workspace, extreme forward, far left, and far right. The test rig was metrologically characterized by a coordinate measuring machine (accuracy and repeatability < ±5 µm). Only the serial linkage of the haptic device was examined, owing to the use of the Structural Length Index (SLI). The results indicate that accuracy decreases when moving from the central area of the workspace towards its borders. In a comparative study, Excalibur I performs on par with the PHANToM Premium 3.0 and is more accurate/precise than the PHANToM Premium 1.5. The error in the Cartesian coordinate system shows a dominant component in one direction (δx, δy, or δz) for movements on horizontal, vertical, and inclined surfaces. The average error magnitude over three attempts was recorded, considering all three error components. This research is a first promising step towards quantifying the kinematic performance of Excalibur I.
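The error reporting described above can be sketched as follows: for each of three attempts the Cartesian error components (δx, δy, δz) are measured, the error magnitude is their Euclidean norm, and the three magnitudes are averaged; the dominant component is the axis with the largest absolute error. The numbers below are made up for illustration, not Excalibur I measurements.

```python
import math

# Hypothetical Cartesian error components (dx, dy, dz) in mm for three attempts.
attempts = [(0.12, 0.03, 0.02), (0.10, 0.04, 0.01), (0.14, 0.02, 0.03)]

magnitudes = [math.sqrt(dx * dx + dy * dy + dz * dz) for dx, dy, dz in attempts]
avg_error = sum(magnitudes) / len(magnitudes)          # reported average error magnitude
dominant = [max(range(3), key=lambda i: abs(a[i]))     # dominant axis per attempt
            for a in attempts]                         # (0 = x, 1 = y, 2 = z)
```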

Keywords: accuracy, advanced metrology, hand controller, precision, robot-assisted surgery, tele-operation, workspace

Procedia PDF Downloads 315
3936 Translation as a Cultural Medium: Understanding the Mauritian Culture and History through an English Translation

Authors: Pooja Booluck

Abstract:

This project seeks to translate a chapter of Le Silence des Chagos by Shenaz Patel, a Mauritian author whose work has never been translated before. The chapter discusses the protagonist's attempt to return to her home country, Diego Garcia, after her deportation. The English translation will offer the target audience a historical account of the deportation of Chagossians to Mauritius during the 1970s. The target audience comprises English-speaking translation scholars, translation students, and African literature scholars. To make the elements of Mauritian culture accessible, the translation will keep cultural items such as food names and oral discourses in Creole, so as to preserve the authenticity of the source culture. To help the target reader better comprehend the cultural elements mentioned, detailed footnotes will explain the cultural and historical references. The translation will also address the importance of folkloric songs in Mauritius and their intergenerational function in Mauritian communities; these will also remain in Creole. While such an approach will help to preserve the meaning of the source text, the borrowing technique and the foreignizing method will be employed, which will in turn help the reader become more familiar with the Mauritian community. Translating a text from French to English while maintaining certain words or discourses in a minority language such as Creole poses certain challenges: How does the translator ensure the comprehensibility of the reader? Are there any translation losses? What are the choices of the translator?

Keywords: Chagos archipelagos in Exile, English translation, Le Silence des Chagos, Mauritian culture and history

Procedia PDF Downloads 296
3935 Prediction of California Bearing Ratio of a Black Cotton Soil Stabilized with Waste Glass and Eggshell Powder using Artificial Neural Network

Authors: Biruhi Tesfaye, Avinash M. Potdar

Abstract:

The laboratory test process to determine the California bearing ratio (CBR) of black cotton soils is not only expensive but also time-consuming; hence, advance prediction of CBR plays a significant role in pavement design. The CBR of the treated soil was predicted using Artificial Neural Networks (ANNs), a computational tool inspired by the biological neural system. To observe CBR values, combined eggshell and waste glass powder was added to the soil at 4, 8, 12, and 16% of the weight of the soil samples, and the corresponding laboratory tests were conducted to obtain the data required for modeling. The maximum CBR value of 5.8 was found at 8% eggshell and waste glass powder addition. The model was developed with CBR as the output variable, considered a function of the joint effect of the liquid limit, plastic limit, plasticity index, optimum moisture content and maximum dry density. The best model found was an ANN with 5, 6 and 1 neurons in the input, hidden and output layers, respectively. The performance of the selected ANN was 0.99996, 4.44E-05, 0.00353 and 0.0067 for the correlation coefficient (R), mean square error (MSE), mean absolute error (MAE) and root mean square error (RMSE), respectively. The research summarized above points to future work on stabilization with waste glass combined with different percentages of eggshell, leading to an economical design with a CBR acceptable for pavement sub-base or base.
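The performance figures quoted (R, MSE, MAE, RMSE) follow standard definitions; a self-contained sketch of how they might be computed from held-out predictions (the data values below are made up for illustration):

```python
import math

def regression_metrics(y_true, y_pred):
    """Standard goodness-of-fit metrics used to evaluate the ANN."""
    n = len(y_true)
    mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n
    mae = sum(abs(t - p) for t, p in zip(y_true, y_pred)) / n
    mt = sum(y_true) / n
    mp = sum(y_pred) / n
    cov = sum((t - mt) * (p - mp) for t, p in zip(y_true, y_pred))
    st = math.sqrt(sum((t - mt) ** 2 for t in y_true))
    sp = math.sqrt(sum((p - mp) ** 2 for p in y_pred))
    r = cov / (st * sp)  # Pearson correlation coefficient
    return {"R": r, "MSE": mse, "MAE": mae, "RMSE": math.sqrt(mse)}

# e.g. measured vs ANN-predicted CBR values (illustrative numbers)
m = regression_metrics([4.1, 5.8, 5.2, 4.6], [4.0, 5.7, 5.3, 4.6])
```

A near-unity R together with small MSE/MAE/RMSE, as reported, indicates predictions tracking the laboratory values closely.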

Keywords: CBR, artificial neural network, liquid limit, plastic limit, maximum dry density, OMC

Procedia PDF Downloads 161
3934 The Study of Formal and Semantic Errors of Lexis by Persian EFL Learners

Authors: Mohammad J. Rezai, Fereshteh Davarpanah

Abstract:

Producing a text in a language which is not one’s mother tongue can be a demanding task for language learners. Examining lexical errors committed by EFL learners is a challenging area of investigation which can shed light on the process of second language acquisition. Despite the considerable number of investigations into grammatical errors, few studies have tackled formal and semantic errors of lexis committed by EFL learners. The current study aimed at examining Persian learners’ formal and semantic errors of lexis in English. To this end, 60 students at three different proficiency levels were asked to write on 10 different topics in 10 separate sessions; the 600 essays collected served as the corpus of the study. An error taxonomy comprising formal and semantic errors was used to analyze the corpus. The formal category covered misselection and misformation errors, while the semantic errors were classified into lexical, collocational and lexicogrammatical categories; each category was further divided into subcategories depending on the identified errors. The results showed 2583 errors in the corpus of 9600 words: 2030 formal errors and 553 semantic errors. Formal errors were the most frequent (78.6%) and were more prevalent at the advanced level (42.4%), while semantic errors (21.4%) were more frequent at the low intermediate level (40.5%). Among formal errors of lexis, the highest number was devoted to misformation errors (98%), while misselection errors constituted 2%. Additionally, no significant differences were observed among the three semantic error subcategories, namely collocational, lexical choice and lexicogrammatical. The results of the study can shed light on the challenges faced by EFL learners in the second language acquisition process.
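The reported percentage shares can be reproduced directly from the raw error counts; a quick sketch:

```python
# error counts reported for the 9600-word corpus
counts = {"formal": 2030, "semantic": 553}
total = sum(counts.values())  # 2583 errors in total
shares = {k: round(100 * v / total, 1) for k, v in counts.items()}
# shares == {"formal": 78.6, "semantic": 21.4}, matching the abstract
```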

Keywords: collocational errors, lexical errors, Persian EFL learners, semantic errors

Procedia PDF Downloads 117
3933 Continuous Wave Interference Effects on Global Positioning System Signal Quality

Authors: Fang Ye, Han Yu, Yibing Li

Abstract:

Radio interference is one of the major concerns in using the global positioning system (GPS) for civilian and military applications. Interference signals are produced not only by electronic systems but also by illegal jammers. Among the different types of interference, continuous wave (CW) interference has a strong adverse impact on the quality of the received signal. In this paper, we present a more detailed analysis of CW interference effects on GPS signal quality. Based on the C/A code spectrum lines, the influence of CW interference on the acquisition performance of GPS receivers is further analysed; this influence is supported by simulation results obtained with a GPS software receiver. As the most important user parameter of GPS receivers, the mathematical expression of the bit error probability is also derived in the presence of CW interference, and the expression is consistent with Monte Carlo simulation results. This research on CW interference provides a theoretical basis and new insights for monitoring the radio noise environment and improving the anti-jamming ability of GPS receivers.
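The bit-error-probability derivation and its Monte Carlo check can be sketched for the simplest case, BPSK in AWGN with an added CW tone of random phase. This is an illustrative baseband model, not the paper's exact receiver chain:

```python
import math
import random

def q(x):
    """Gaussian tail probability Q(x)."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def ber_bpsk_awgn(ebno_db):
    """Theoretical BPSK bit error probability, Q(sqrt(2 Eb/N0))."""
    return q(math.sqrt(2 * 10 ** (ebno_db / 10)))

def ber_monte_carlo(ebno_db, jsr_db=None, n_bits=200_000, seed=1):
    """Monte Carlo BER; jsr_db adds a CW interferer at the given
    jammer-to-signal power ratio, with a random phase per bit."""
    rng = random.Random(seed)
    sigma = math.sqrt(0.5 / 10 ** (ebno_db / 10))
    amp = 0.0 if jsr_db is None else math.sqrt(10 ** (jsr_db / 10))
    errors = 0
    for _ in range(n_bits):
        bit = rng.choice((-1.0, 1.0))
        cw = amp * math.cos(rng.uniform(0.0, 2.0 * math.pi))
        errors += ((bit + cw + rng.gauss(0.0, sigma)) > 0.0) != (bit > 0.0)
    return errors / n_bits

clean = ber_monte_carlo(6.0)                # close to ber_bpsk_awgn(6.0)
jammed = ber_monte_carlo(6.0, jsr_db=0.0)   # equal-power CW tone: BER rises
```

With no jammer the simulated BER matches the closed-form Q-function expression; switching on an equal-power CW tone raises the error rate markedly, consistent with CW interference being among the most damaging interference types.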

Keywords: GPS, CW interference, acquisition performance, bit error probability, Monte Carlo

Procedia PDF Downloads 238
3932 [Keynote Speech]: Feature Selection and Predictive Modeling of Housing Data Using Random Forest

Authors: Bharatendra Rai

Abstract:

Predictive data analysis and modeling involving machine learning techniques becomes challenging in the presence of too many explanatory variables or features. Too many features are known not only to slow algorithms down but also to decrease model prediction accuracy. This study uses a housing dataset with 79 quantitative and qualitative features that describe various aspects people consider while buying a new house. The Boruta algorithm, which supports feature selection using a wrapper approach built around random forest, is used in this study. This feature selection process leads to 49 confirmed features, which are then used to develop predictive random forest models. The study also explores five different data partitioning ratios; their impact on model accuracy is captured using the coefficient of determination (R²) and root mean square error (RMSE).
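The shadow-feature idea behind Boruta can be illustrated without the R package: duplicate each feature, shuffle the copies, and confirm a real feature only when it consistently beats the best shadow. The sketch below substitutes absolute correlation for the random-forest importance that Boruta actually uses, so it is a conceptual stand-in, not the algorithm itself:

```python
import math
import random

def shadow_feature_select(X, y, trials=50, seed=0):
    """Toy Boruta-style selection: confirm feature j when its importance
    (|correlation| with y here, standing in for RF importance) beats the
    best shuffled 'shadow' column in more than half the trials."""
    rng = random.Random(seed)
    n, p = len(X), len(X[0])

    def abs_corr(col):
        mx, my = sum(col) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(col, y))
        vx = sum((a - mx) ** 2 for a in col)
        vy = sum((b - my) ** 2 for b in y)
        return abs(cov) / math.sqrt(vx * vy) if vx and vy else 0.0

    cols = [[row[j] for row in X] for j in range(p)]
    real = [abs_corr(c) for c in cols]
    hits = [0] * p
    for _ in range(trials):
        shadows = []
        for c in cols:
            s = c[:]
            rng.shuffle(s)
            shadows.append(abs_corr(s))
        best_shadow = max(shadows)
        for j in range(p):
            hits[j] += real[j] > best_shadow
    return [j for j in range(p) if hits[j] > trials // 2]

# toy data: feature 0 drives the response, the rest are noise
rng = random.Random(42)
X = [[float(i), rng.random(), rng.random(), rng.random()] for i in range(60)]
y = [row[0] for row in X]
confirmed = shadow_feature_select(X, y)   # feature 0 is confirmed
```

Shuffling destroys any real association with the response, so the shadow importances give an empirical threshold for what "important by chance" looks like, which is the core of the wrapper approach described above.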

Keywords: housing data, feature selection, random forest, Boruta algorithm, root mean square error

Procedia PDF Downloads 292
3931 Culture of Writing and Writing of Culture: Organizational Connections and Pedagogical Implications of ESL Writing in Multilingual Philippine Setting

Authors: Randy S. Magdaluyo, Lea M. Cabar, Jefferson Q. Correa

Abstract:

One recurring issue in ESL writing is the confusing differences between the writing conventions of the first language and those of the target language. Culture may play an intriguing role in specifying the writing features and structures that ESL writers have to follow. Although writing is typically organized in a three-part structure with introduction, body, and conclusion, it is important to analyze the complex nature of ESL writing. This study investigated the organizational features and structures of argumentative essays written in English by thirty college ESL students from three linguistic backgrounds (Cebuano, Chavacano, and Tausug) in a Philippine university. The nature of word order and sentence construction in the students’ essays and the specific components of the introduction, body, and conclusion were quantitatively and qualitatively analyzed based on ESL writing models. Focus group discussions were also conducted to help clarify the possible influence of the students’ first language on the ways their essays were conceptualized and organized. Results indicate that while there was no significant difference in the overall introduction, body, and conclusion across the essays, sentence length differed interestingly for each linguistic group of ESL students, and word order was notably inconsistent with the S-V-O pattern of the target language. The first language was also revealed to have a facilitative role in the cognitive translation process of these ESL students. As such, implications for a multicultural writing pedagogy were discussed, with recommendations that consider both the students’ native resources in their first language and the ESL writing models in their target language.

Keywords: community funds of knowledge, contrastive rhetoric, ESL writing, multicultural writing pedagogy

Procedia PDF Downloads 109
3930 MiRNA Regulation of CXCL12β during Inflammation

Authors: Raju Ranjha, Surbhi Aggarwal

Abstract:

Background: Inflammation plays an important role in infectious and non-infectious diseases. MiRNAs are also reported to play a role in inflammation and associated cancers. The chemokine CXCL12 is likewise known to play a role in inflammation and various cancers. The CXCL12/CXCR4 chemokine axis is involved in the pathogenesis of IBD, especially ulcerative colitis (UC). Supplementation of CXCL12 induces homing of dendritic cells to the spleen and enhances control of Plasmodium parasites in BALB/c mice. Prolonged inflammation of the colon in UC patients increases the risk of developing colorectal cancer. We therefore examined the regulation of CXCL12β by miRNA in UC, including expression differences of CXCL12β and its targeting miRNA in the cancer-susceptible area of the colon of UC patients. Aim: The aim of this study was to determine how CXCL12β expression is regulated by miRNA in inflammation. Materials and Methods: Biopsy and blood samples were collected from UC patients and non-IBD controls. mRNA expression was analyzed using microarray and real-time PCR. miRNAs targeting CXCL12β were identified using online target prediction tools. Expression of CXCL12β in blood samples and cell line supernatant was analyzed using ELISA. The miRNA target was validated using a dual luciferase assay. Results and conclusion: We found that miR-200a regulates the expression of CXCL12β in UC. Expression of CXCL12β was increased in the cancer-susceptible part of the colon, while expression of its targeting miRNA was decreased in the same part. miR-200a regulates CXCL12β expression in inflammation and may be an important therapeutic target in inflammation-associated cancer.

Keywords: inflammation, miRNA, regulation, CXCL12

Procedia PDF Downloads 247
3929 The Link between Money Market and Economic Growth in Nigeria: Vector Error Correction Model Approach

Authors: Uyi Kizito Ehigiamusoe

Abstract:

The paper examines the impact of the money market on economic growth in Nigeria using data for the period 1980-2012. Econometric techniques such as the Ordinary Least Squares method, Johansen’s co-integration test and a Vector Error Correction Model were used to examine both the long-run and short-run relationships. Evidence from the study suggests that although a long-run relationship exists between the money market and economic growth, the present state of the Nigerian money market is significantly and negatively related to economic growth. The link between the money market and the real sector of the economy remains very weak, implying that the market is not yet developed enough to produce the growth needed to propel the Nigerian economy, because of several challenges. It was therefore recommended that government create appropriate macroeconomic policies and a legal framework, and sustain the present reforms with a view to developing the market so as to promote productive activities, investments, and ultimately economic growth.
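The error-correction logic behind the model can be sketched in two OLS steps: estimate the long-run relation, then regress the growth differences on the lagged equilibrium error. This is an illustrative Engle-Granger-style sketch on synthetic data, not the paper's Johansen/VECM estimation:

```python
import random

def ols(x, y):
    """Simple-regression OLS slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

def error_correction(y, x):
    """Step 1: long-run OLS y_t = a + b*x_t. Step 2: regress the first
    difference of y on the lagged residual (error correction term); a
    negative coefficient means deviations from the long-run equilibrium
    are corrected over time."""
    b, a = ols(x, y)
    ect = [yi - (a + b * xi) for yi, xi in zip(y, x)]
    dy = [y[t] - y[t - 1] for t in range(1, len(y))]
    gamma, _ = ols(ect[:-1], dy)
    return b, gamma

# synthetic cointegrated pair: y tracks 2x plus a mean-reverting error
rng = random.Random(0)
x = [float(t) for t in range(200)]
y, u = [], 0.0
for t in range(200):
    u = 0.5 * u + rng.gauss(0.0, 1.0)
    y.append(2.0 * x[t] + u)
long_run_b, gamma = error_correction(y, x)
# long_run_b ≈ 2 (long-run coefficient), gamma < 0 (speed of adjustment)
```

The sign and size of the adjustment coefficient are what such studies read as the strength of the long-run link between the money market and growth.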

Keywords: economic growth, investments, money market, money market challenges, money market instruments

Procedia PDF Downloads 324
3928 Modernization of the Economic Price Adjustment Software

Authors: Roger L. Goodwin

Abstract:

The US Consumer Price Indices (CPIs) measure hundreds of items in the US economy, and many social programs and government benefits are indexed to them. In the mid-to-late 1990s, a Congressional Advisory Committee conducted extensive research into changes to the CPI. One conclusion of that research is that, aside from the existence of alternative estimators for the CPI, any fundamental change to the CPI will affect many government programs. The purpose of this project is to modernize an existing process. This paper shows the development of a small, visual software product that documents the Economic Price Adjustment (EPA) for long-term contracts. The existing workbook provides neither the flexibility to calculate EPAs where the base month and the option month differ, nor automated error checking; the new software product provides both. The paper also presents feedback on the project.
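The flexibility being added, separate base and option months, amounts to scaling the contract price by the ratio of two CPI observations, with error checking on the inputs. A minimal sketch (index values and the cap parameter are illustrative, not taken from the actual workbook):

```python
def economic_price_adjustment(base_price, cpi_base, cpi_option, cap=None):
    """Scale a contract price by CPI(option month) / CPI(base month).
    An optional cap limits the upward adjustment, as many contracts do."""
    if cpi_base <= 0 or cpi_option <= 0:
        raise ValueError("CPI index values must be positive")  # error check
    factor = cpi_option / cpi_base
    if cap is not None:
        factor = min(factor, 1.0 + cap)
    return round(base_price * factor, 2)

# a $100.00 line item, base-month CPI 250.0, option-month CPI 255.0
adjusted = economic_price_adjustment(100.00, 250.0, 255.0)  # 102.0
```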

Keywords: Consumer Price Index, Economic Price Adjustment, contracts, visualization tools, database, reports, forms, event procedures

Procedia PDF Downloads 293
3927 Soil Stress State under Tractive Tire and Compaction Model

Authors: Prathuang Usaborisut, Dithaporn Thungsotanon

Abstract:

Soil compaction induced by a tractor towing a trailer has become a major problem associated with sugarcane productivity. Soil beneath the tractor’s tire is under not only compressive stress but also shearing stress. Therefore, to help understand such effects on soil, this research aimed to determine the stress state in soil and predict soil compaction under a tractive tire. The octahedral stress ratios under the tires were higher than one, and much higher under higher draft forces; moreover, the ratio increased with the number of tire passages. A soil compaction model was developed using data acquired from triaxial tests and then used to predict soil bulk density under a tractive tire. The maximum error was about 4% at 15 cm depth under the lower draft force and tended to increase with depth and draft force; at a depth of 30 cm and under the higher draft force, the maximum error was about 16%.
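The octahedral stress ratio mentioned above can be computed from the principal stresses; a short sketch using one common definition (the ratio of octahedral shear stress to octahedral normal stress; the paper may use a different normalization):

```python
import math

def octahedral_stresses(s1, s2, s3):
    """Octahedral normal and shear stress from principal stresses (kPa)."""
    s_oct = (s1 + s2 + s3) / 3.0
    t_oct = math.sqrt((s1 - s2) ** 2 + (s2 - s3) ** 2 + (s3 - s1) ** 2) / 3.0
    return s_oct, t_oct

def octahedral_stress_ratio(s1, s2, s3):
    s_oct, t_oct = octahedral_stresses(s1, s2, s3)
    return t_oct / s_oct

hydro = octahedral_stresses(100.0, 100.0, 100.0)      # (100.0, 0.0): no shear
ratio = octahedral_stress_ratio(300.0, 100.0, 100.0)  # deviatoric loading
```

Under pure hydrostatic compression the shear component vanishes; it is the deviatoric (shearing) part contributed by traction that drives the ratio up, consistent with the higher ratios observed under higher draft forces.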

Keywords: draft force, soil compaction model, stress state, tractive tire

Procedia PDF Downloads 322
3926 A Multilayer Perceptron Neural Network Model Optimized by Genetic Algorithm for Significant Wave Height Prediction

Authors: Luis C. Parra

Abstract:

Significant wave height prediction is an issue of great interest in the field of coastal activities because of the non-linear behavior of wave height and the complexity of its prediction. This study presents a machine learning model to forecast the significant wave height at the oceanographic wave-measuring buoys anchored at Mooloolaba, using Queensland Government data. Modeling was performed with a multilayer perceptron neural network optimized by a genetic algorithm (GA-MLP), with ReLU as the activation function of the MLP. The GA is in charge of optimizing the MLP hyperparameters (learning rate, hidden layers, neurons, and activation functions) and performing wrapper feature selection for the window width size. Results are assessed using the mean square error (MSE), root mean square error (RMSE), mean absolute error (MAE), and mean absolute percentage error (MAPE). The GA-MLP algorithm was run with a population size of thirty individuals over eight generations for the optimization of the 5-step-ahead prediction, obtaining a performance of 0.00104 MSE, 0.03222 RMSE, 0.02338 MAE, and 0.71163% MAPE. The results of the analysis suggest that the GA-MLP model is effective in predicting significant wave height in a one-step forecast with distant time windows, presenting 0.00014 MSE, 0.01180 RMSE, 0.00912 MAE, and 0.52500% MAPE with a correlation factor of 0.99940. The GA-MLP algorithm was compared with an ARIMA forecasting model and performed better on all performance criteria, validating the potential of this algorithm.
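The GA wrapper can be sketched generically: evolve real-valued genomes (e.g. a learning rate and a neuron count) against a fitness function that would, in practice, train and validate an MLP. Here a cheap surrogate loss stands in for that expensive evaluation, and all parameter names and bounds are illustrative:

```python
import random

def genetic_search(fitness, bounds, pop_size=30, generations=8, seed=0):
    """Minimal GA: truncation selection, uniform crossover, Gaussian
    mutation clipped to bounds. 'fitness' is minimized."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds]
           for _ in range(pop_size)]
    for _ in range(generations):
        elite = sorted(pop, key=fitness)[: pop_size // 2]  # keep best half
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            child = [ai if rng.random() < 0.5 else bi       # crossover
                     for ai, bi in zip(a, b)]
            child = [min(max(g + rng.gauss(0.0, 0.1 * (hi - lo)), lo), hi)
                     for g, (lo, hi) in zip(child, bounds)]  # mutate + clip
            children.append(child)
        pop = elite + children
    return min(pop, key=fitness)

# surrogate for "validation RMSE" with optimum at lr=0.01, 16 neurons
best = genetic_search(lambda g: (g[0] - 0.01) ** 2 + ((g[1] - 16) / 48) ** 2,
                      bounds=[(1e-4, 0.1), (4.0, 64.0)])
```

The population size (30) and generation count (8) mirror the settings reported in the abstract; with a real MLP in the loop, each fitness call would be one train/validate cycle.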

Keywords: significant wave height, machine learning optimization, multilayer perceptron neural networks, evolutionary algorithms

Procedia PDF Downloads 78
3925 Effects of a Cooler on the Sampling Process in a Continuous Emission Monitoring System

Authors: J. W. Ahn, I. Y. Choi, T. V. Dinh, J. C. Kim

Abstract:

A cooler has been widely employed in the extractive system of the continuous emission monitoring system (CEMS) to remove water vapor from the gas stream. The effect of the cooler on the analytical target gases was investigated in this research. A commercial cooler for the CEMS, operated at 4 °C, was used. Several gases emitted from a coal power plant (i.e., CO2, SO2, NO, NO2 and CO) were mixed with humid air and then introduced into the cooler to observe its effect. Concentrations of SO2, NO, NO2 and CO were set at 200 ppm, and the CO2 concentration at 8%. The inlet absolute humidity was produced as 12.5% at 100 °C using a bubbling method. It was found that the reduction rate of SO2 was the highest (~21%), followed by NO2 (~17%), CO2 (~11%) and CO (~10%); in contrast, NO was not affected by the cooler. The results indicate that the cooler had a significant effect on the water-soluble gases due to the condensate water it collects. To overcome this problem, a correction factor may be applied; however, water vapor content varies, as do the emissions of the target gases, so a correction factor alone is not a complete solution and a better sampling method should also be sought.
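A correction factor of the kind suggested can be applied by back-calculating the stack concentration from the measured value and a gas-specific loss fraction. The loss fractions below are taken from the reductions observed in this study and are purely illustrative; in practice they would vary with humidity and gas composition, which is exactly the limitation the abstract notes:

```python
# fraction of each gas lost in the cooler (illustrative values based on
# the reductions reported in this study)
LOSS_FRACTION = {"SO2": 0.21, "NO2": 0.17, "CO2": 0.11, "CO": 0.10, "NO": 0.0}

def corrected_concentration(gas, measured_ppm):
    """Back-calculate the pre-cooler concentration from the measured value."""
    return measured_ppm / (1.0 - LOSS_FRACTION[gas])

# an SO2 reading of 158 ppm after the cooler implies ~200 ppm at the stack
stack_so2 = corrected_concentration("SO2", 158.0)
```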

Keywords: cooler, CEMS, monitoring, reproductive, sampling

Procedia PDF Downloads 338
3924 Estimation of Normalized Glandular Doses Using a Three-Layer Mammographic Phantom

Authors: Kuan-Jen Lai, Fang-Yi Lin, Shang-Rong Huang, Yun-Zheng Zeng, Po-Chieh Hsu, Jay Wu

Abstract:

The normalized glandular dose (DgN) estimates the energy deposition of mammography in clinical practice. Monte Carlo simulations frequently use a uniformly mixed phantom for calculating the conversion factor. However, breast tissues are not uniformly distributed, leading to errors in conversion factor estimation. This study constructed a three-layer phantom to estimate the normalized glandular dose more accurately. The MCNP code (Monte Carlo N-Particle code) was used to create the geometric structure. We simulated three target/filter combinations (Mo/Mo, Mo/Rh, Rh/Rh), six voltages (25-35 kVp), six HVL parameters and nine breast phantom thicknesses (2-10 cm) for the three-layer mammographic phantom. The conversion factor for 25%, 50% and 75% glandularity was calculated. The error of the conversion factors compared with the results of the American College of Radiology (ACR) was within 6%; for Rh/Rh, the difference was within 9%. The difference between 50% average glandularity and the uniform phantom was 7.1% to -6.7% for the Mo/Mo combination at a voltage of 27 kVp, a half-value layer of 0.34 mmAl, and a breast thickness of 4 cm. According to the simulation results, regression analysis showed that the three-layer mammographic phantom at 0% to 100% glandularity can be used to accurately calculate the conversion factors. Differences in glandular tissue distribution lead to errors in conversion factor calculation; the three-layer mammographic phantom can provide accurate estimates of glandular dose in clinical practice.
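Why layering matters can be illustrated with a toy Monte Carlo over a slab phantom: even when total attenuation is fixed, where the energy is deposited depends on the layer ordering. The attenuation coefficients below are illustrative placeholders, not the measured mammographic values used with MCNP:

```python
import math
import random

def layer_absorbed_fractions(layers, n=100_000, seed=0):
    """Fraction of normally incident photons absorbed in each layer.
    'layers' is a list of (linear attenuation coefficient per cm,
    thickness in cm); a photon interacts in a layer with probability
    1 - exp(-mu * t) and is assumed locally absorbed there."""
    rng = random.Random(seed)
    absorbed = [0] * len(layers)
    for _ in range(n):
        for i, (mu, t) in enumerate(layers):
            if rng.random() > math.exp(-mu * t):
                absorbed[i] += 1
                break
    return [a / n for a in absorbed]

# adipose / glandular / adipose sandwich (mu values illustrative)
fractions = layer_absorbed_fractions([(0.5, 1.0), (0.8, 2.0), (0.5, 1.0)])
# the middle (glandular) layer receives the largest share
```

A uniform-mixture phantom spreads this deposition evenly, so its glandular dose differs from the layered case, which is the discrepancy the three-layer phantom is designed to capture.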

Keywords: Monte Carlo simulation, mammography, normalized glandular dose, glandularity

Procedia PDF Downloads 165