Search results for: definite article error
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5080

5080 Definite Article Errors and Effect of L1 Transfer

Authors: Bimrisha Mali

Abstract:

The present study investigates the types of errors that English as a second language (ESL) learners produce when using the definite article ‘the’. The participants were given a questionnaire designed as a learner ability test, consisting of three cloze tests and two free composition tests. Each participant's responses were collected as written data. A total of 78 participants from three government schools took part in the study. The participants were high-school students from rural Assam, a north-eastern state of India, aged between 14 and 15. The medium of instruction and of communication among the students is the local language, Assamese. Pit Corder’s steps for conducting error analysis were followed in the analysis procedure. Four types of errors were found: (1) deletion of the definite article, (2) use of the definite article with modifiers such as adjectives, (3) incorrect use of the definite article with singular proper nouns, and (4) substitution of the definite article by the indefinite article ‘a’. Classifiers that express definiteness in Assamese are used with nouns, adjectives, and numerals. It was found that native language (L1) transfer plays a pivotal role in the learners’ errors. The analysis reveals the learners' inability to acquire the semantic connotation of definiteness in English due to native language (L1) interference.

Keywords: definite article error, L1 transfer, error analysis, ESL

Procedia PDF Downloads 102
5079 The Noun-Phrase Elements on the Usage of the Zero Article

Authors: Wen Zhen

Abstract:

Compared to content words, function words, especially articles, have been relatively overlooked by English learners. The article system, driven by several factors, becomes to a certain extent an obstacle to knowing English better. Three principal factors concerning the nature of articles can be summarized when referring to the difficulty of the English article system. Difficulties in the second-language acquisition process make the article system even more complex, for [-ART] learners have to create a new category, causing even most non-native speakers at proficiency level to make errors. According to the sequence of acquisition of the English articles, the zero article is acquired first and with high inaccuracy. The zero article is often overused in the early stages of L2 acquisition. Although learners at the intermediate level move to underusing the zero article once they realize that it does not cover every case, overproduction of the zero article occurs even among advanced L2 learners. The aim of the study is to investigate the noun-phrase factors which give rise to incorrect usage or overuse of the zero article, thus providing suggestions for L2 English acquisition. Moreover, it enables teachers to carry out effective instruction that activates the conscious learning of students. The research question is answered through a corpus-based, data-driven approach that analyzes noun-phrase elements in terms of the semantic context and the countability of noun phrases. Based on the analysis of the International Thurber Thesis corpus, the results show that: (1) although the [-definite, -specific] context favored the zero article, both [-definite, +specific] and [+definite, -specific] contexts showed less influence; when we reflect on the frequency order of the zero article, prototypicality plays a vital role in it; (2) the EFL learners in this study have trouble classifying abstract nouns as countable, and overuse of the zero article arises when learners cannot make clear judgements on countability as a noun shifts from [+definite] to [-definite]; once a noun is perceived as uncountable by learners, the choice falls back on the zero article. These findings suggest that learners should be engaged in recognizing the countability of new vocabulary by explaining nouns in lexical phrases, and that they should explore more complex aspects such as discourse-dependent analysis.

Keywords: noun phrase, zero article, corpus, second language acquisition

Procedia PDF Downloads 225
5078 The Role of Specificity in Mastering the English Article System

Authors: Sugene Kim

Abstract:

The English articles are taught as a binary system based on nominal countability and definiteness. Despite the detailed rules of prescriptive grammar, it has been consistently reported in the literature that their correct usage is extremely difficult to master even for advanced learners of English as a second language (ESL) or a foreign language (EFL). Given that an English sentence (except for an imperative) cannot be constructed without a noun, which is always paired with one of the indefinite, definite, and zero articles, it is essential to understand specifically what causes ESL/EFL learners to misuse them. To that end, this study examined EFL learners’ article use employing a one-group pre-post-test design. Forty-three Korean college students received instruction on correct English article usage over two 75-minute classes employing the binary schema set up for the study. They also practiced in class how to apply the rules as instructed. Then, the participants were assigned a forced-choice elicitation task, which had also been used as a pre-test administered three months prior to the instruction. Unlike the pre-test, on which they only chose the correct article for each of the 40 items, the post-instruction task additionally asked them to give written accounts of the decision-making procedure behind each article choice. The participants’ performance was scored manually by checking whether each answer was correct or incorrect, and their written comments were first categorized using thematic analysis and then ranked by frequency. The analyses of the performance on the two tasks and the written think-aloud data suggested that EFL learners fluctuate between specificity and definiteness, overgeneralizing the use of the definite article for almost all cataphoric references. It was apparent that they have trouble distinguishing between the two concepts, possibly because the former is almost never introduced in the grammar books or classes designed for ESL/EFL learners. In particular, most participants were found to be ignorant of the possibility of using nouns as [+specific, -definite]. Not surprisingly, the correct answer rates for such nouns averaged out at 33% and 46% on the pre- and post-tests respectively, barely reaching half the overall mean correct answer rates of 65% on the pre-test and 81% on the post-test. In addition, correct article use for specific indefinites was the most impermeable to instruction when compared with nouns used as [-specific, -definite] or [±specific, +definite]. Such findings underline the necessity of expanding the binary schema into a ternary form that incorporates the specificity feature, albeit not morphologically marked in the English language.

Keywords: countability, definiteness, English articles, specificity, ternary system

Procedia PDF Downloads 104
5077 Solution of S3 Problem of Deformation Mechanics for a Definite Condition and Resulting Modifications of Important Failure Theories

Authors: Ranajay Bhowmick

Abstract:

Analysis of stresses for an infinitesimal tetrahedron leads to a situation where we obtain a cubic equation consisting of three stress invariants. This cubic equation, when solved for a definite condition, gives the principal stresses directly without requiring any cumbersome and time-consuming trial-and-error methods or iterative numerical procedures. Since the failure criteria of different materials are generally expressed as functions of principal stresses, an attempt has been made in this study to incorporate the solutions of the cubic equation, in the form of principal stresses obtained for a definite condition, into some of the established failure theories to determine their modified descriptions. It has been observed that the failure theories can be represented using the quadratic stress invariant and the orientation of the principal plane.
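
For illustration, a minimal numeric sketch (not the author's derivation) of the trigonometric, explicit solution the keywords refer to: the principal stresses are recovered directly from the three invariants of the characteristic cubic, with the tensor values below chosen arbitrarily.

```python
import numpy as np

def principal_stresses(sigma):
    """Explicit (trigonometric) roots of the stress cubic
    s^3 - I1*s^2 + I2*s - I3 = 0 for a symmetric 3x3 stress tensor."""
    I1 = np.trace(sigma)
    I2 = 0.5 * (I1**2 - np.trace(sigma @ sigma))
    I3 = np.linalg.det(sigma)

    p = I1**2 / 3.0 - I2                 # proportional to the deviatoric invariant J2
    q = (2.0 * I1**3 - 9.0 * I1 * I2 + 27.0 * I3) / 27.0
    if p < 1e-12:                        # hydrostatic state: three equal roots
        return np.full(3, I1 / 3.0)

    # Lode-angle argument, clipped to guard against round-off outside [-1, 1]
    arg = np.clip(1.5 * np.sqrt(3.0) * q / p**1.5, -1.0, 1.0)
    theta = np.arccos(arg) / 3.0
    r = 2.0 * np.sqrt(p / 3.0)
    s = np.array([r * np.cos(theta - 2.0 * np.pi * k / 3.0) for k in range(3)])
    return np.sort(s + I1 / 3.0)[::-1]   # principal stresses, descending

# Cross-check against an eigenvalue solver on an illustrative stress tensor (MPa)
S = np.array([[50.0, 30.0, 20.0],
              [30.0, -20.0, -10.0],
              [20.0, -10.0, 10.0]])
print(principal_stresses(S))
print(np.sort(np.linalg.eigvalsh(S))[::-1])
```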

Keywords: cubic equation, stress invariant, trigonometric, explicit solution, principal stress, failure criterion

Procedia PDF Downloads 107
5076 Calculate Consumer Surplus and Producer Surplus Using Integration

Authors: Bojan Radisic, Katarina Stavlic

Abstract:

The paper describes two economic terms, consumer surplus and producer surplus, using definite integrals (the Riemann integral). The consumer surplus is the difference between what consumers are willing to pay and the actual price. The producer surplus is the difference between what producers receive by selling at the current price and the lower price they would have been willing to accept. Using definite integrals, the terms and the mathematical formulas of the consumer surplus and the producer surplus are described and then applied to numerical examples.
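
A minimal numerical sketch of the construction, using made-up linear demand and supply curves (the paper's own numerical examples are not reproduced here): each surplus is the definite integral of the gap between a curve and the equilibrium price.

```python
from scipy.integrate import quad
from scipy.optimize import brentq

# Illustrative (hypothetical) demand and supply curves, price as a function of quantity q
demand = lambda q: 100.0 - 2.0 * q      # willingness to pay
supply = lambda q: 10.0 + 1.0 * q       # minimum acceptable price

# Market equilibrium: demand(q) = supply(q)
q_eq = brentq(lambda q: demand(q) - supply(q), 0.0, 100.0)
p_eq = demand(q_eq)

# Consumer surplus: area between the demand curve and the price line
cs, _ = quad(lambda q: demand(q) - p_eq, 0.0, q_eq)
# Producer surplus: area between the price line and the supply curve
ps, _ = quad(lambda q: p_eq - supply(q), 0.0, q_eq)

print(f"equilibrium: Q* = {q_eq:.1f}, P* = {p_eq:.1f}")
print(f"consumer surplus = {cs:.2f}, producer surplus = {ps:.2f}")
```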

Keywords: consumer surplus, producer surplus, definite integral, integration

Procedia PDF Downloads 537
5075 Effects of Manufacture and Assembly Errors on the Output Error of Globoidal Cam Mechanisms

Authors: Shuting Ji, Yueming Zhang, Jing Zhao

Abstract:

The output error of the globoidal cam mechanism can be considered a relevant indicator of mechanism performance, because it determines the kinematic and dynamic behavior of the mechanical transmission. Based on differential geometry and rigid body transformations, the mathematical model of the surface geometry of the globoidal cam is established. We then present the analytical expression of the output error (including the transmission error and the displacement error along the output axis) by considering different manufacture and assembly errors. The effects of the center distance error, the perpendicularity error between the input and output axes, and the rotational angle error of the globoidal cam on the output error are systematically analyzed. A globoidal cam mechanism which is widely used in the automatic tool changers of CNC machines is used for illustration. Our results show that the perpendicularity error and the rotational angle error have little effect on the transmission error but have great effects on the displacement error along the output axis. This study plays an important role in the design, manufacture, and assembly of the globoidal cam mechanism.

Keywords: globoidal cam mechanism, manufacture error, transmission error, automatic tool changer

Procedia PDF Downloads 538
5074 On the Cluster of the Families of Hybrid Polynomial Kernels in Kernel Density Estimation

Authors: Benson Ade Eniola Afere

Abstract:

Over the years, kernel density estimation has been extensively studied within the context of nonparametric density estimation. The fundamental components of kernel density estimation are the kernel function and the bandwidth. While the mathematical exploration of the kernel component has been relatively limited, its selection and development remain crucial. The mean integrated squared error (MISE), serving as a measure of discrepancy, provides a robust framework for assessing the effectiveness of any kernel function. A kernel function with a lower MISE is generally considered to perform better than one with a higher MISE. Hence, the primary aim of this article is to create kernels that exhibit significantly reduced MISE when compared to existing classical kernels. Consequently, this article introduces a cluster of hybrid polynomial kernel families. The construction of the proposed kernel functions is carried out heuristically by combining two kernels from the classical polynomial kernel family using probability axioms. We delve into the analysis of error propagation within these kernels. To assess their performance, simulation experiments and real-life datasets are employed. The obtained results demonstrate that the proposed hybrid kernels surpass their classical kernel counterparts in terms of performance.
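
The abstract does not spell out the hybridization rule; one simple construction consistent with the stated probability axioms is a convex mixture of two classical polynomial kernels, sketched below with the Epanechnikov and biweight kernels as an assumed example pair.

```python
import numpy as np
from scipy.stats import norm

def epanechnikov(u):
    return np.where(np.abs(u) <= 1, 0.75 * (1 - u**2), 0.0)

def biweight(u):
    return np.where(np.abs(u) <= 1, 15.0 / 16.0 * (1 - u**2)**2, 0.0)

def hybrid_kernel(u, lam=0.5):
    """Convex mixture of two classical polynomial kernels; non-negativity and
    a unit integral are preserved, so the mixture is itself a valid kernel."""
    return lam * epanechnikov(u) + (1.0 - lam) * biweight(u)

def kde(x_grid, data, h, kernel):
    """Plain kernel density estimate f_hat(x) = (1/(n*h)) * sum_i K((x - x_i)/h)."""
    u = (x_grid[:, None] - data[None, :]) / h
    return kernel(u).sum(axis=1) / (len(data) * h)

rng = np.random.default_rng(0)
data = rng.normal(size=500)
grid = np.linspace(-4.0, 4.0, 201)
f_hat = kde(grid, data, h=0.4, kernel=hybrid_kernel)

# Rough check of estimation quality against the true standard normal density
ise = float(np.sum((f_hat - norm.pdf(grid))**2) * (grid[1] - grid[0]))
print(f"approximate integrated squared error: {ise:.5f}")
```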

Keywords: classical polynomial kernels, cluster of families, global error, hybrid kernels, kernel density estimation, Monte Carlo simulation

Procedia PDF Downloads 62
5073 Flexible Capacitive Sensors Based on Paper Sheets

Authors: Mojtaba Farzaneh, Majid Baghaei Nejad

Abstract:

This article proposes new flexible capacitive tactile sensors based on paper sheets. The method combines the parameters of the sensor material and the dielectric, and forms a new model of flexible capacitive sensor. The article presents a practical explanation of the method's application and advantages. With this new method, it is possible to make a more flexible and accurate sensor than the current models. To assess its performance, a common capacitive sensor is simulated, and the proposed model and one of the existing models are evaluated. The results indicate that the proposed model can enhance the speed and accuracy of the tactile sensor and has less error than the current models. Based on these results, it can be claimed that, in comparison with current models, the proposed model offers more flexibility and more accurate output parameters when the sensor is touched, especially in abnormal situations and on uneven surfaces, and increases accuracy and practicality.

Keywords: capacitive sensor, paper sheets, flexible, tactile, uneven

Procedia PDF Downloads 326
5072 The Foundation Binary-Signals Mechanics and Actual-Information Model of Universe

Authors: Elsadig Naseraddeen Ahmed Mohamed

Abstract:

In contrast to the uncertainty and complementarity principles, it will be shown in the present paper that the probability of the simultaneous occupation of any definite values of coordinates by any definite values of momentum and energy at any definite instant of time can be described by a binary definite function. This function is equivalent to the difference between the numbers of occupation and evacuation epochs up to that time, and also to the number of exchanges between those occupation and evacuation epochs up to that time, modulo two. These binary definite quantities can be defined at every point of the time real-line, so they form a binary signal that represents a complete mechanical description of physical reality. The times of these exchanges represent the boundaries of the occupation and evacuation epochs, from which the binary signals can be calculated, using the fact that the time of the universe's events actually extends along the positive and negative parts of the time real-line in one direction of extension as the number of exchanges increases. There therefore exists a noninvertible transformation matrix, defined as the product of an invertible rotation matrix and a noninvertible scaling matrix, which change the direction and magnitude of the exchange-event vector, respectively. These noninvertible transformations will be called actual transformations, in contrast to information transformations, by which we can navigate the universe's events transformed by actual transformations backward and forward along the time real-line; these information transformations will be derived as elements of a group that can be associated with their corresponding actual transformations. The actual and information model of the universe will be derived by assuming the existence of a time instant zero, before and at which no coordinate is occupied by any definite values of momentum and energy, after which the universe begins its expansion in spacetime. This assumption makes the existence of Laplace's demon, who at one moment could measure the positions and momenta of all constituent particles of the universe and then use the laws of classical mechanics to predict the entire future and past of the universe's events, superfluous. We only need to establish analog-to-digital converters to sense the binary signals that determine the boundaries of the occupation and evacuation epochs of the definite values of coordinates, relative to their origin, by the definite values of momentum and energy as present events of the universe; from these we can predict its past and future events approximately and with high precision.

Keywords: binary-signal mechanics, actual-information model of the universe, actual-transformation, information-transformation, uncertainty principle, Laplace's demon

Procedia PDF Downloads 141
5071 Relevancy Measures of Errors in Displacements of Finite Elements Analysis Results

Authors: A. B. Bolkhir, A. Elshafie, T. K. Yousif

Abstract:

This paper highlights methods of error estimation for finite element analysis (FEA) results. It indicates that the discretization error can be reduced by performing finite element analysis with successively finer meshes, or estimated by extrapolating response predictions from an orderly sequence of analyses with relatively low degrees of freedom. In addition, the paper reduces the round-off error by running the code at a higher precision. The paper provides applications to finite element analysis results and draws conclusions based on the results of applying the error estimation methods.
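
A minimal sketch of the extrapolation idea (Richardson extrapolation from three successively refined meshes); the displacement values are invented for illustration and the function names are not from the paper.

```python
import math

def richardson(u_h, u_h2, u_h4, r=2.0):
    """Estimate the observed convergence rate and the extrapolated (mesh-independent)
    value from results on meshes of size h, h/r, h/r^2, assuming monotonic convergence."""
    p = math.log(abs((u_h - u_h2) / (u_h2 - u_h4))) / math.log(r)   # observed order
    u_star = u_h4 + (u_h4 - u_h2) / (r**p - 1.0)                    # extrapolated value
    err_h4 = abs(u_star - u_h4)                                     # discretization error estimate
    return p, u_star, err_h4

# Hypothetical tip displacements (mm) from three successive mesh refinements
p, u_star, err = richardson(1.842, 1.896, 1.910)
print(f"observed order ~ {p:.2f}, extrapolated value ~ {u_star:.4f} mm, "
      f"estimated error on the finest mesh ~ {err:.4f} mm")
```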

Keywords: finite element analysis (FEA), discretization error, round-off error, mesh refinement, Richardson extrapolation, monotonic convergence

Procedia PDF Downloads 457
5070 Using the Bootstrap for Problems in Statistics

Authors: Brahim Boukabcha, Amar Rebbouh

Abstract:

The bootstrap method, based on the idea of exploiting all the information provided by the initial sample, allows us to study the properties of estimators. In this article we present a theoretical study of the different bootstrap methods and use the resampling technique in statistical inference to calculate the standard error of an estimator of the mean and to determine a confidence interval for an estimated parameter. We apply these methods to regression models and to the Pareto model, and they give good approximations.
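
As a hedged illustration of the resampling idea (not the authors' exact procedure), the following sketch computes a bootstrap standard error, bias estimate, and percentile confidence interval for the mean of a Pareto-type sample.

```python
import numpy as np

def bootstrap(data, statistic, n_boot=2000, alpha=0.05, seed=0):
    """Bootstrap standard error, bias, and percentile confidence interval for a statistic."""
    rng = np.random.default_rng(seed)
    n = len(data)
    reps = np.array([statistic(rng.choice(data, size=n, replace=True))
                     for _ in range(n_boot)])
    se = reps.std(ddof=1)                                   # bootstrap standard error
    bias = reps.mean() - statistic(data)                    # bootstrap estimate of bias
    lo, hi = np.percentile(reps, [100 * alpha / 2, 100 * (1 - alpha / 2)])
    return se, bias, (lo, hi)

sample = np.random.default_rng(1).pareto(3.0, size=200)     # illustrative Pareto-type data
se, bias, ci = bootstrap(sample, np.mean)
print(f"SE(mean) ~ {se:.4f}, bias ~ {bias:.4f}, 95% CI ~ ({ci[0]:.3f}, {ci[1]:.3f})")
```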

Keywords: bootstrap, standard error, bias, jackknife, mean, median, variance, confidence interval, regression models

Procedia PDF Downloads 355
5069 Calibration of the Radial Installation Limit Error of the Accelerometer in the Gravity Gradient Instrument

Authors: Danni Cong, Meiping Wu, Xiaofeng He, Junxiang Lian, Juliang Cao, Shaokun Cai, Hao Qin

Abstract:

The gravity gradient instrument (GGI) is the core of the gravity gradiometer, so the structural error of the sensor has a great impact on the measurement results. In order not to compromise the target measurement accuracy, a limit error is required in the installation of the accelerometer. In this paper, based on the established measuring principle model, the radial installation limit error is calibrated; it is taken as an example to provide a method for calculating the other installation limit errors under the premise of ensuring the accuracy of the measurement result. This method provides an approach for deriving the limit errors of the geometric structure of the sensor, laying the foundation for mechanical precision design and physical design.

Keywords: gravity gradient sensor, radial installation limit error, accelerometer, uniaxial rotational modulation

Procedia PDF Downloads 397
5068 High Capacity Reversible Watermarking through Interpolated Error Shifting

Authors: Hae-Yeoun Lee

Abstract:

Reversible watermarking, which not only protects the copyright but also preserves the original quality of the digital content, has been intensively studied, and the demand for it has increased. In this paper, we propose a reversible watermarking scheme based on interpolation-error shifting and error precompensation. The intensity of a pixel is interpolated from the intensities of neighbouring pixels, and the difference histogram between the interpolated and the original intensities is obtained and modified to embed the watermark message. By restoring the difference histogram, the embedded watermark is extracted and the original image is recovered by compensating for the interpolation error. Overflow and underflow are prevented by error precompensation. To show the performance of the method, the proposed algorithm is compared with other methods using various test images.
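
A toy one-dimensional sketch of interpolation-error histogram shifting, assuming a peak at error zero; the paper's scheme works on two-dimensional images and adds error precompensation for overflow/underflow, which is omitted here, and in practice the payload length would be signalled as side information.

```python
import numpy as np

def embed(signal, bits):
    """Embed bits by shifting the interpolation-error histogram around the peak bin 0."""
    x = signal.astype(np.int64).copy()
    bits = list(bits)
    for i in range(1, len(x) - 1, 2):               # odd samples are predicted
        pred = (x[i - 1] + x[i + 1]) // 2           # interpolation from even neighbours
        e = x[i] - pred
        if e > 0:                                   # shift to make room next to the peak
            e += 1
        elif e == 0 and bits:                       # embed one bit at the peak bin
            e += bits.pop(0)
        x[i] = pred + e
    return x

def extract(marked):
    """Recover the embedded bits and restore the original samples exactly."""
    x = marked.astype(np.int64).copy()
    bits = []
    for i in range(1, len(x) - 1, 2):
        pred = (x[i - 1] + x[i + 1]) // 2
        e = x[i] - pred
        if e == 1:
            bits.append(1)
            e = 0
        elif e == 0:
            bits.append(0)
        elif e > 1:
            e -= 1
        x[i] = pred + e
    return bits, x

orig = np.array([100, 103, 101, 99, 102, 104, 103, 101, 100, 98], dtype=np.int64)
marked = embed(orig, [1, 0, 1])
payload, restored = extract(marked)
assert np.array_equal(restored, orig)               # perfect reversibility
```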

Keywords: reversible watermarking, high capacity, high quality, interpolated error shifting, error precompensation

Procedia PDF Downloads 291
5067 On the Algorithmic Iterative Solutions of Conjugate Gradient, Gauss-Seidel and Jacobi Methods for Solving Systems of Linear Equations

Authors: Hussaini Doko Ibrahim, Hamilton Cyprian Chinwenyi, Henrietta Nkem Ude

Abstract:

In this paper, efforts were made to examine and compare the algorithmic iterative solutions of the conjugate gradient method against other methods such as the Gauss-Seidel and Jacobi approaches for solving systems of linear equations of the form Ax=b, where A is a real n×n symmetric and positive definite matrix. We performed algorithmic iterative steps and obtained analytical solutions of a typical 3×3 symmetric and positive definite matrix using the three methods described in this paper (Gauss-Seidel, Jacobi, and conjugate gradient methods), respectively. From the results obtained, we discovered that the conjugate gradient method converges to the exact solutions in fewer iterative steps than the other two methods, which required many more iterations and much more time while only tending toward the exact solutions.
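
A compact sketch of the three solvers on an illustrative 3×3 symmetric positive definite system (the matrix below is not the one used in the paper); in exact arithmetic the conjugate gradient method terminates in at most three steps for a 3×3 SPD matrix.

```python
import numpy as np

def jacobi(A, b, x0, tol=1e-10, max_iter=500):
    D = np.diag(A)
    R = A - np.diagflat(D)
    x = x0.copy()
    for k in range(1, max_iter + 1):
        x_new = (b - R @ x) / D
        if np.linalg.norm(x_new - x) < tol:
            return x_new, k
        x = x_new
    return x, max_iter

def gauss_seidel(A, b, x0, tol=1e-10, max_iter=500):
    x = x0.copy()
    n = len(b)
    for k in range(1, max_iter + 1):
        x_old = x.copy()
        for i in range(n):
            x[i] = (b[i] - A[i, :i] @ x[:i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
        if np.linalg.norm(x - x_old) < tol:
            return x, k
    return x, max_iter

def conjugate_gradient(A, b, x0, tol=1e-10, max_iter=500):
    x = x0.copy()
    r = b - A @ x
    p = r.copy()
    rs_old = r @ r
    for k in range(1, max_iter + 1):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            return x, k
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return x, max_iter

# Illustrative symmetric positive definite system (diagonally dominant, so all three converge)
A = np.array([[4.0, 1.0, 1.0],
              [1.0, 3.0, 0.0],
              [1.0, 0.0, 2.0]])
b = np.array([6.0, 5.0, 3.0])
x0 = np.zeros(3)
for name, solver in [("Jacobi", jacobi), ("Gauss-Seidel", gauss_seidel),
                     ("Conjugate gradient", conjugate_gradient)]:
    x, iters = solver(A, b, x0)
    print(f"{name:>18}: x = {x}, iterations = {iters}")
```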

Keywords: conjugate gradient, linear equations, symmetric and positive definite matrix, Gauss-Seidel, Jacobi, algorithm

Procedia PDF Downloads 119
5066 Investigating the Causes of Human Error-Induced Incidents in the Maintenance Operations of Petrochemical Industry by Using Human Factors Analysis and Classification System

Authors: Omid Kalatpour, Mohammadreza Ajdari

Abstract:

This article studies the possible causes of human error-induced incidents in petrochemical industry maintenance activities using the Human Factors Analysis and Classification System (HFACS). The purpose of the study was to anticipate and identify these causes and to propose corrective and preventive actions. The maintenance department in a petrochemical company was selected for the research. A checklist of human error-induced incidents was developed based on the four HFACS main levels and nineteen sub-groups. The hierarchical task analysis (HTA) technique was used to identify maintenance activities and tasks. The main causes of possible incidents were identified by the checklist and recorded. Corrective and preventive actions were defined depending on priority. Analysis of the worksheets of 444 activities across the four HFACS levels showed that 37.6% of the causes were at the level of unsafe acts, 27.5% at the level of unsafe supervision, 20.9% at the level of preconditions for unsafe acts, and 14% at the level of organizational effects. Among the HFACS sub-groups, errors (24.36%), inadequate supervision (14.89%), and violations (13.26%) showed the highest frequency. According to the findings of this study, increasing the effectiveness of operator training and improving supervision are, respectively, the most important measures for decreasing human error-induced incidents in petrochemical industry maintenance.

Keywords: human error, petrochemical industry, maintenance, HFACS

Procedia PDF Downloads 206
5065 Grating Scale Thermal Expansion Error Compensation for Large Machine Tools Based on Multiple Temperature Detection

Authors: Wenlong Feng, Zhenchun Du, Jianguo Yang

Abstract:

To decrease the grating scale thermal expansion error, a novel method based on multiple temperature detection is proposed. Several temperature sensors are installed on the grating scale and their temperatures are recorded. The temperature at every point on the grating scale is calculated by interpolating between adjacent sensors. According to the thermal expansion principle, the grating scale thermal expansion error model can be established by integrating over the variations of position and temperature. A novel compensation method is proposed in this paper. By applying the established error model, the grating scale thermal expansion error is decreased by 90% compared with no compensation. The residual positioning error of the grating scale is less than 15 μm over 10 m, and the accuracy of the machine tool is significantly improved.
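
A minimal sketch of the compensation idea: the sensor temperatures are interpolated along the scale and the expansion error at a reading position is the integral of the expansion coefficient times the temperature deviation; the coefficient, sensor layout, and readings below are assumptions for illustration only.

```python
import numpy as np

ALPHA = 11.5e-6          # illustrative thermal expansion coefficient of the scale (1/K)
T_REF = 20.0             # reference temperature (deg C)

def expansion_error(position_mm, sensor_pos_mm, sensor_temp, n=2000):
    """Thermal expansion error (mm) at a reading-head position, from a piecewise-linear
    temperature field interpolated between sensors and integrated along the scale."""
    s = np.linspace(0.0, position_mm, n)
    T = np.interp(s, sensor_pos_mm, sensor_temp)
    integrand = ALPHA * (T - T_REF)
    return float(np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(s)))

# Hypothetical sensors along a 10 m scale (positions in mm, temperatures in deg C)
sensor_pos = np.array([0.0, 2500.0, 5000.0, 7500.0, 10000.0])
sensor_temp = np.array([20.8, 21.5, 22.3, 21.9, 21.1])

raw_reading = 8000.0                                  # mm
err = expansion_error(raw_reading, sensor_pos, sensor_temp)
print(f"estimated expansion error = {err * 1000:.1f} um, "
      f"compensated reading = {raw_reading - err:.4f} mm")
```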

Keywords: thermal expansion error of grating scale, error compensation, machine tools, integral method

Procedia PDF Downloads 334
5064 Student Attendance System Applying Reed Solomon ECC

Authors: Mohd Noah A. Rahman, Armandurni Abd Rahman, Afzaal H. Seyal, Md Rizal Md Hendry

Abstract:

The article reports an automated student attendance system modeled and developed for use at a vocational school. The project focuses on developing an application using QR codes with Reed-Solomon error correction, displayed on a smartphone and scanned through a webcam. This system enables us to speed up the process of taking attendance and saves valuable teaching time. It is planned to help students avoid the consequences of poor attendance, which would eventually bar them from sitting their final examination as required.

Keywords: QR code, Reed-Solomon, error correction, system design

Procedia PDF Downloads 356
5063 Constructions of Linear and Robust Codes Based on Wavelet Decompositions

Authors: Alla Levina, Sergey Taranov

Abstract:

The classical approach to providing noise immunity and integrity for information processed in computing devices and communication channels is to use linear codes. Linear codes have fast and efficient algorithms for encoding and decoding information, but these codes concentrate their detection and correction abilities on certain error configurations. Robust codes, in contrast, can protect against any configuration of errors with a predetermined probability. This is accomplished by the use of perfect nonlinear and almost perfect nonlinear functions to calculate the code redundancy. The paper presents an error-correcting coding scheme using the biorthogonal wavelet transform. The wavelet transform is applied in various fields of science; some of its applications are denoising of signals, data compression, and spectral analysis of signal components. The article suggests methods for constructing linear codes based on wavelet decomposition. For the developed constructions we build generator and check matrices that contain the scaling function coefficients of the wavelet. Based on the linear wavelet codes, we develop robust codes that provide uniform protection against all errors. We propose two constructions of robust codes: the first class of robust codes is based on the multiplicative inverse in a finite field, and in the second construction the redundancy part is the cube of the information part. This paper also investigates the characteristics of the proposed robust and linear codes.

Keywords: robust code, linear code, wavelet decomposition, scaling function, error masking probability

Procedia PDF Downloads 463
5062 Traverse Surveying Table Simple and Sure

Authors: Hamid Fallah

Abstract:

Creating surveying stations is the first thing that a surveyor learns; stations are used for control and implementation in projects such as buildings, roads, tunnels, monitoring, and anything else related to the preparation of maps. In this article, we present the method of calculation through the traverse table, check several examples of errors made by several publishers of surveying books in the calculations of this table, and also verify the results of several software packages in a simple way. Surveyors measure angles and lengths when creating surveying stations, so the most important task of a surveyor is to be able to correctly remove the errors in angles and lengths from the calculations and to determine whether the amount of error is within the permissible limit before removing it.
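
As a hedged illustration of the kind of closure check and correction a traverse table supports (not necessarily the author's tabular layout), the sketch below computes the linear misclosure of a closed traverse and distributes it by the compass (Bowditch) rule; the traverse data are invented.

```python
import math

def bowditch_adjustment(lengths, azimuths_deg):
    """Closed-traverse closure check and compass-rule (Bowditch) adjustment.
    lengths: leg lengths; azimuths_deg: whole-circle bearings of each leg."""
    dE = [L * math.sin(math.radians(a)) for L, a in zip(lengths, azimuths_deg)]
    dN = [L * math.cos(math.radians(a)) for L, a in zip(lengths, azimuths_deg)]
    mis_E, mis_N = sum(dE), sum(dN)                 # misclosure in easting / northing
    perimeter = sum(lengths)
    linear_mis = math.hypot(mis_E, mis_N)
    relative_precision = perimeter / linear_mis if linear_mis else float("inf")

    # Compass rule: distribute the misclosure in proportion to each leg length
    adj_dE = [d - mis_E * L / perimeter for d, L in zip(dE, lengths)]
    adj_dN = [d - mis_N * L / perimeter for d, L in zip(dN, lengths)]
    return linear_mis, relative_precision, adj_dE, adj_dN

# Hypothetical closed traverse (lengths in m, azimuths in degrees)
lengths = [120.35, 98.20, 145.60, 133.75]
azimuths = [45.00, 150.30, 230.10, 320.40]
mis, rel, adj_dE, adj_dN = bowditch_adjustment(lengths, azimuths)
print(f"linear misclosure = {mis:.3f} m, relative precision = 1/{rel:.0f}")
```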

Keywords: UTM, localization, scale factor, cartesian, traverse

Procedia PDF Downloads 52
5061 Using Derivative Free Method to Improve the Error Estimation of Numerical Quadrature

Authors: Chin-Yun Chen

Abstract:

Numerical integration is an essential tool for deriving different physical quantities in engineering and science. The effectiveness of a numerical integrator depends on different factors, of which the crucial one is the error estimation. This work presents an error estimator that incorporates a derivative-free method to improve the performance of verified numerical quadrature.
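
The abstract does not specify the estimator; one common derivative-free strategy, shown here only as an illustration, is to compare a quadrature rule with itself at half the step size and scale the difference according to the order of the rule.

```python
import math

def composite_simpson(f, a, b, n):
    """Composite Simpson's rule with n (even) subintervals."""
    if n % 2:
        n += 1
    h = (b - a) / n
    s = f(a) + f(b)
    s += 4.0 * sum(f(a + (2 * i - 1) * h) for i in range(1, n // 2 + 1))
    s += 2.0 * sum(f(a + 2 * i * h) for i in range(1, n // 2))
    return s * h / 3.0

def simpson_with_error(f, a, b, n=16):
    """Derivative-free error estimate: the difference between the rule at step h and h/2,
    scaled by 1/15 for a fourth-order rule, estimates the error of the finer result."""
    coarse = composite_simpson(f, a, b, n)
    fine = composite_simpson(f, a, b, 2 * n)
    return fine, abs(fine - coarse) / 15.0

value, err = simpson_with_error(math.sin, 0.0, math.pi)
print(f"integral ~ {value:.10f}, estimated error ~ {err:.2e}, true error = {abs(value - 2.0):.2e}")
```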

Keywords: numerical quadrature, error estimation, derivative free method, interval computation

Procedia PDF Downloads 431
5060 Medical Error: Concept and Description According to Brazilian Physicians

Authors: Vitor S. Mendonca, Maria Luisa S. Schmidt

Abstract:

The Brazilian medical profession is viewed as one that should be error-free, so healthcare professionals who commit an error are condemned there. Medical errors occur frequently in the Brazilian healthcare system, so identifying better options for handling this issue has become of interest, primarily for physicians. The purpose of this study is to better understand the tensions involved in the fear of making an error, given the harm and risk this would represent for those involved. A qualitative study was performed by means of narratives of the lived experiences of ten practicing physicians in the State of Sao Paulo. The concept and characterization of errors were discussed, together with the fear of making an error, near misses or the error itself, how to deal with errors, and what to do to avoid them. The analysis indicates excessive pressure in the medical profession for error-free practice, with a well-established physician-patient relationship needed to facilitate the management of medical errors. Errors occur, but a lack of information and discussion often leads to their concealment due to fear or possible judgment by society or peers. The establishment of programs that encourage appropriate medical conduct in the event of an error requires coherent answers for humanization in Brazilian medical science. It is necessary to improve the discussion about medical errors and to disseminate models of communication and notification of errors in Brazil.

Keywords: medical error, narrative, physician-patient relationship, qualitative research

Procedia PDF Downloads 144
5059 Articles, Delimitation of Speech and Perception

Authors: Nataliya L. Ogurechnikova

Abstract:

The paper aims to clarify the function of articles in English speech and to specify their place and role in the English language, taking into account the use of articles for the delimitation of speech. A focus of the paper is the use of the definite and the indefinite articles with different types of noun phrases, which comprise either one noun with or without attributes, such as the King, the Queen, the Lion, the Unicorn, a dimple, a smile, a new language, an unknown dialect, or several nouns with or without attributes, such as the King and Queen of Hearts, the Lion and Unicorn, a dimple or smile, a completely isolated language or dialect. It is stated that the function of delimitation is related to perception: the number of speech units in a text correlates with the way the speaker perceives and segments the denotation. The two combinations of words 'the house and garden' and 'the house and the garden' contain different numbers of speech units, one and two respectively, and reveal two different modes of perception which correspond to the use of the definite article in the examples given. Thus, the function of delimitation is twofold: it is related to perception and cognition, on the one hand, and, on the other hand, to grammar, if the subject of grammar is the structure of speech. The analysis of speech units in the paper is not limited to noun phrases and is amplified by a discussion of peripheral phenomena which are nevertheless important because they make it possible to qualify articles as a syntactic phenomenon, whereas they are not infrequently described in terms of noun morphology. In this regard, attention is given to the history of linguistic studies, specifically to the description of English articles by Niels Haislund, a disciple of Otto Jespersen. A discrepancy is noted between the initial plan of Jespersen, who intended to describe articles as a syntactic phenomenon in 'A Modern English Grammar on Historical Principles', and the interpretation of articles in terms of noun morphology finally given by Haislund. Another issue of the paper is the correlation between description and denotation, a traditional aspect of linguistic studies focused on articles. An overview of relevant studies, given in the paper, goes back to the works of G. Frege, which gave rise to a series of scientific works in which the meaning of articles was described within the scope of logical semantics. The correlation between denotation and description is treated in the paper as the meaning of the article, i.e. a component of its semantic structure which differs from the function of delimitation and is similar to the meaning of other quantifiers. The paper further explains why the relation between description and denotation, i.e. the meaning of the English article, is irrelevant for noun morphology and has nothing to do with the nominal categories of the English language.

Keywords: delimitation of speech, denotation, description, perception, speech units, syntax

Procedia PDF Downloads 217
5058 Variation of Refractive Errors among Right and Left Eyes in Jos, Plateau State, Nigeria

Authors: F. B. Masok, S. S Songdeg, R. R. Dawam

Abstract:

Vision is an important process for learning and communication, as man depends greatly on vision to sense his environment. A study of the prevalence and variation of refractive errors conducted between December 2010 and May 2011 in Jos revealed that 735 (77.50%) out of 950 subjects examined for refractive error had various refractive errors. Myopia was observed in 373 (49.79%) of the subjects; the error in the right eyes was 263 (55.60%), while the error in the left eyes was 210 (44.39%). The mean myopic error was found to be -1.54 ± 3.32. Hyperopia was observed in 385 (40.53%) of the sampled population, comprising 203 (52.73%) right eyes and 182 (47.27%) left eyes. The mean hyperopic error was found to be +1.74 ± 3.13. Astigmatism accounted for 359 (38.84%) of the subjects, of which 193 (53.76%) were in the right eyes while 168 (46.79%) were in the left eyes. Presbyopia was found in 404 (42.53%) of the subjects; of this figure, 164 (40.59%) were in the right eyes while 240 (59.41%) were in the left eyes. The number of right and left eyes with refractive errors was observed in some age groups to increase with age, peaking in the 60-69 age group. This pattern of refractive errors could be attributed to exposure to various forms of light, particularly ultraviolet rays (e.g., rays from television and computer screens). There were no remarkable differences between the mean myopic error and the mean hyperopic error in the right eyes and in the left eyes, which suggests that the right eye and the left eye are similar.

Keywords: left eye, refractive errors, right eye, variation

Procedia PDF Downloads 404
5057 Error Correction Method for 2D Ultra-Wideband Indoor Wireless Positioning System Using Logarithmic Error Model

Authors: Phornpat Chewasoonthorn, Surat Kwanmuang

Abstract:

Indoor positioning technologies have evolved rapidly. They augment the Global Positioning System (GPS), which requires line-of-sight to the sky, in tracking the location of people or objects. This study developed an error correction method for an indoor real-time location system (RTLS) based on an ultra-wideband (UWB) sensor from Decawave. Multiple stationary nodes (anchors) were installed throughout the workspace. The distance between stationary and moving nodes (tags) can be measured using a two-way-ranging (TWR) scheme. The results show that the uncorrected ranging error of the sensor system can be as large as 1 m. To reduce the ranging error and thus increase positioning accuracy, this study proposes an online correction algorithm using the Kalman filter. The experimental results show that the system can reduce the ranging error down to 5 cm.
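
A minimal sketch of Kalman filtering applied to noisy two-way-ranging measurements, using a scalar random-walk model with illustrative noise variances; the paper's logarithmic error model and anchor geometry are not reproduced here.

```python
import numpy as np

def kalman_range_filter(measurements, q=1e-3, r=0.04, x0=None, p0=1.0):
    """Scalar Kalman filter smoothing two-way-ranging (TWR) distances.
    State: the true range modelled as a random walk (process variance q);
    r is the measurement noise variance of the UWB ranging (values are illustrative)."""
    x = measurements[0] if x0 is None else x0
    p = p0
    out = []
    for z in measurements:
        p = p + q                       # predict
        k = p / (p + r)                 # Kalman gain
        x = x + k * (z - x)             # update with the new ranging measurement
        p = (1.0 - k) * p
        out.append(x)
    return np.array(out)

# Hypothetical noisy TWR readings around a true range of 3.20 m
rng = np.random.default_rng(2)
true_range = 3.20
raw = true_range + rng.normal(0.0, 0.20, size=200)      # roughly 20 cm ranging noise
filtered = kalman_range_filter(raw)
print(f"raw RMSE      = {np.sqrt(np.mean((raw - true_range)**2)):.3f} m")
print(f"filtered RMSE = {np.sqrt(np.mean((filtered - true_range)**2)):.3f} m")
```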

Keywords: indoor positioning, ultra-wideband, error correction, Kalman filter

Procedia PDF Downloads 130
5056 Generalization of Tau Approximant and Error Estimate of Integral Form of Tau Methods for Some Class of Ordinary Differential Equations

Authors: A. I. Ma’ali, R. B. Adeniyi, A. Y. Badeggi, U. Mohammed

Abstract:

An error estimation of the integrated formulation of the Lanczos tau method for some classes of ordinary differential equations was reported. This paper is concerned with the generalization of tau approximants and their corresponding error estimates for some classes of ordinary differential equations (ODEs) characterized by m + s = 3 (i.e., for m = 1, s = 2; m = 2, s = 1; and m = 3, s = 0), where m and s are the order of the differential equation and the number of overdetermination, respectively. The general results obtained were validated with some numerical examples.

Keywords: approximant, error estimate, tau method, overdetermination

Procedia PDF Downloads 574
5055 A Study on the Influence of Planet Pin Parallelism Error to Load Sharing Factor

Authors: Kyung Min Kang, Peng Mou, Dong Xiang, Yong Yang, Gang Shen

Abstract:

In this paper, planet pin parallelism error, which is one of the manufacturing errors of the planet carrier, is employed as the main variable influencing the planet load sharing factor. This error is categorized into two groups: (i) pin parallelism error with rotation about the axis perpendicular to the tangent of the gear base circle (x-axis rotation in this paper), and (ii) pin parallelism error with rotation about the tangent axis of the gear base circle (y-axis rotation in this paper). For this study, the planetary gear system of a 1.5 MW wind turbine is used, and a purely torsional rigid body model of this planetary gear is built using SolidWorks and MSC.ADAMS. Based on the quantified parallelism error and the simulation model, a dynamic simulation of the planetary gear is carried out to obtain dynamic mesh load results for each type of error, and the load sharing factor is calculated from the mesh load results. A load sharing factor formula and a suggestion for planetary reliability design are proposed in the conclusion of this study.

Keywords: planetary gears, planet load sharing, MSC. ADAMS, parallelism error

Procedia PDF Downloads 371
5054 Turing Pattern in the Oregonator Revisited

Authors: Elragig Aiman, Dreiwi Hanan, Townley Stuart, Elmabrook Idriss

Abstract:

In this paper, we reconsider the analysis of the Oregonator model. We highlight an error in this analysis which leads to an incorrect depiction of the parameter region in which diffusion-driven instability is possible. We believe that the cause of the oversight is the complexity of stability analyses based on eigenvalues and the dependence on parameters of the matrix minors appearing in stability calculations. We regenerate the parameter space where Turing patterns can be seen, and we use the common Lyapunov function (CLF) approach, which is numerically reliable, to further confirm the dependence of the results on the diffusion coefficient intensities.
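
For context, the standard eigenvalue-based dispersion check that the paper argues is easy to get wrong can be sketched as follows; the Jacobian and diffusion coefficients below are placeholders with an activator-inhibitor structure, not the Oregonator's actual values.

```python
import numpy as np

def turing_unstable(J, D, k_values):
    """Diffusion-driven (Turing) instability check for a two-species reaction-diffusion system.
    J: Jacobian of the kinetics at the homogeneous steady state; D: diagonal diffusion matrix.
    Reports True if the steady state is stable without diffusion but some wavenumber k
    makes J - k^2 D unstable."""
    stable_without_diffusion = np.all(np.linalg.eigvals(J).real < 0)
    growth = np.array([np.max(np.linalg.eigvals(J - (k**2) * D).real) for k in k_values])
    return stable_without_diffusion and growth.max() > 0, growth

# Placeholder kinetics Jacobian and diffusion coefficients (inhibitor diffuses faster)
J = np.array([[0.5, -1.0],
              [1.0, -1.5]])
D = np.diag([1.0, 10.0])
ks = np.linspace(0.0, 3.0, 301)
unstable, growth = turing_unstable(J, D, ks)
print("Turing instability:", unstable, "| max growth rate:", growth.max())
```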

Keywords: diffusion-driven instability, common Lyapunov function (CLF), Turing pattern, positive-definite matrix

Procedia PDF Downloads 332
5053 An Observational Study of Vitamin B12 Levels and Peripheral Neuropathy Profile in Patients of Diabetes Mellitus on Metformin Therapy

Authors: Kamesh Gupta, Nitin Jain, Anurag Rohatgi

Abstract:

Objective: To study vitamin B12 levels and the presence of peripheral neuropathy among diabetes mellitus patients on metformin therapy. Method: The observational study was conducted from November 2014 to March 2015. Patients were selected from Lady Hardinge Medical College, Delhi, India. An exhaustive history regarding dietary habits and metformin usage was taken. Laboratory tests, including HbA1c levels and vitamin B12 assays, were done, on the basis of which patients were classified into subgroups. Peripheral neuropathy was detected by both clinical scoring and electrophysiological studies. Appropriate statistical analysis for observational studies was done to evaluate the data. Results: The average duration of metformin usage was higher in patients with definite B12 deficiency (9.4 y) than in patients with normal B12 levels (5.6 y). Patients in the definite B12 deficiency group had a much higher incidence of neuropathy (89%) than patients with no deficiency (27%). The incidence of neuropathy was higher in cases with longer metformin usage (100% with 18-22 y of use and 83% with 14-17 y of use) than with shorter periods (29% with 2-5 y of use and 75% with 6-9 y of use). Conclusion: Patients on long-term metformin therapy are thus at a high risk of vitamin B12 deficiency. The subgroups with definite and possible vitamin B12 deficiency on metformin had an earlier onset of neuropathy than the subgroup with normal vitamin B12 levels.

Keywords: diabetic neuropathy, cobalamin deficiency, metformin, nerve conduction studies

Procedia PDF Downloads 339
5052 Unequal Error Protection of VQ Image Transmission System

Authors: Khelifi Mustapha, A. Moulay lakhdar, I. Elawady

Abstract:

We study unequal error protection for VQ image transmission. We use Reed-Solomon (RS) codes for channel coding because they offer better performance in terms of channel error correction over a binary output channel. Such a channel (binary input and output) should be considered when working at the application layer, because it includes all the features of the layers located below it, in which it is usually not feasible to make changes.
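
A hedged sketch of the unequal-protection idea, using the third-party 'reedsolo' Python package as a stand-in RS coder (the package, the stream split, and the parity sizes are assumptions, not the paper's configuration): the more important part of the VQ bit stream receives more parity symbols than the less important part.

```python
from reedsolo import RSCodec

important = bytes(range(32))          # stand-in for the most significant VQ index bits
less_important = bytes(range(64))     # stand-in for the least significant bits

rs_strong = RSCodec(16)               # 16 parity bytes -> corrects up to 8 byte errors
rs_weak = RSCodec(4)                  # 4 parity bytes  -> corrects up to 2 byte errors

tx_important = rs_strong.encode(important)
tx_less = rs_weak.encode(less_important)

def corrupt(block, positions):
    """Flip whole bytes at the given positions to simulate channel errors."""
    block = bytearray(block)
    for p in positions:
        block[p] ^= 0xFF
    return bytes(block)

rx_important = rs_strong.decode(corrupt(tx_important, [0, 5, 9]))
rx_less = rs_weak.decode(corrupt(tx_less, [3]))

# Newer reedsolo versions return (message, message+ecc, errata positions); older ones
# return only the message, so handle both forms.
msg_important = rx_important[0] if isinstance(rx_important, tuple) else rx_important
msg_less = rx_less[0] if isinstance(rx_less, tuple) else rx_less
assert bytes(msg_important) == important and bytes(msg_less) == less_important
```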

Keywords: vector quantization, channel error correction, Reed-Solomon channel coding, application

Procedia PDF Downloads 332
5051 Analysis of Aspergillus fumigatus IgG Serologic Cut-Off Values to Increase Diagnostic Specificity of Allergic Bronchopulmonary Aspergillosis

Authors: Sushmita Roy Chowdhury, Steve Holding, Sujoy Khan

Abstract:

The immunogenic responses of the lung towards the fungus Aspergillus fumigatus may range from invasive aspergillosis in the immunocompromised, to fungal ball or infection within a cavity in the lung in those with structural lung lesions, to allergic bronchopulmonary aspergillosis (ABPA). Patients with asthma or cystic fibrosis are particularly predisposed to ABPA. There are consensus guidelines that have established criteria for the diagnosis of ABPA, but uncertainty remains on the serologic cut-off values that would increase the diagnostic specificity of ABPA. We retrospectively analyzed 80 patients with severe asthma and evidence of peripheral blood eosinophilia ( > 500) over the last 3 years who underwent all serologic tests to exclude ABPA. Total IgE, specific IgE, and specific IgG levels against Aspergillus fumigatus were measured using ImmunoCAP Phadia-100 (Thermo Fisher Scientific, Sweden). The Modified ISHAM working group 2013 criteria (obligate criteria: asthma or cystic fibrosis, total IgE > 1000 IU/ml or > 417 kU/L and positive specific IgE Aspergillus fumigatus or skin test positivity; with ≥ 2 of peripheral eosinophilia, positive specific IgG Aspergillus fumigatus and consistent radiographic opacities) were used in the clinical workup for the final diagnosis of ABPA. Patients were divided into three groups: definite, possible, and no evidence of ABPA. Specific IgG Aspergillus fumigatus levels were not used to assign the patients into any of the groups. Of the 80 patients (males 48, females 32; mean age 53.9 years ± SD 15.8) selected for the analysis, 30 had positive specific IgE against Aspergillus fumigatus (37.5%). Thirteen patients fulfilled the Modified ISHAM working group 2013 criteria of ABPA ('definite'), while 15 patients were 'possible' ABPA and 52 did not fulfill the criteria (not ABPA). As IgE levels were not normally distributed, median levels were used in the analysis. Median total IgE levels of patients with definite and possible ABPA were 2144 kU/L and 2597 kU/L respectively (non-significant), while median specific IgE Aspergillus fumigatus levels, at 4.35 kUA/L and 1.47 kUA/L respectively, were significantly different (comparison of standard deviations F-statistic 3.2267, significance level p=0.040). Mean levels of IgG anti-Aspergillus fumigatus in the three groups (definite, possible, and no evidence of ABPA) were compared using ANOVA (Statgraphics Centurion Professional XV, Statpoint Inc). Mean levels of IgG anti-Aspergillus fumigatus (Gm3) in definite ABPA were 125.17 mgA/L ( ± SD 54.84, with 95%CI 92.03-158.32), while mean Gm3 levels in possible and no ABPA were 18.61 mgA/L and 30.05 mgA/L respectively. ANOVA showed a significant difference between the definite group and the other groups (p < 0.001). This was confirmed using multiple range tests (Fisher's least significant difference procedure). There was no significant difference between the possible ABPA and not ABPA groups (p > 0.05). The study showed that a sizeable proportion of patients with asthma are sensitized to Aspergillus fumigatus in this part of India. A higher cut-off value of Gm3 ≥ 80 mgA/L provides higher serologic specificity for definite ABPA. Long-term studies would provide more information on whether those patients with 'possible' ABPA and positive Gm3 later develop clear ABPA and differ from the Gm3-negative group in this respect. Serologic testing with clearly defined cut-offs is a valuable adjunct in the diagnosis of ABPA.

Keywords: allergic bronchopulmonary aspergillosis, Aspergillus fumigatus, asthma, IgE level

Procedia PDF Downloads 175