Search results for: lexical errors

470 Design and Implementation of Image Super-Resolution for Myocardial Image

Authors: M. V. Chidananda Murthy, M. Z. Kurian, H. S. Guruprasad

Abstract:

Super-resolution is the technique of intelligently upscaling images while avoiding artifacts and blurring; it deals with the recovery of a high-resolution image from one or more low-resolution images. Single-image super-resolution obtains a high-resolution image from a set of low-resolution observations by signal processing. While super-resolution has been demonstrated to improve image quality in scaled-down images in the image domain, its effects on Fourier-based techniques remain unknown. Super-resolution substantially improves the spatial resolution of patient LGE images by sharpening the edges of the heart and the scar. This paper investigates the effects of single-image super-resolution on Fourier-based and image-based methods of scale-up. First, a training phase pairs low-resolution and high-resolution images to obtain a dictionary. In the test phase, patches are extracted from the low-resolution image, and the difference between the high-resolution image and an interpolation of the low-resolution image is computed. A simulated image is then obtained by applying a convolution between the dictionary and the extracted patches. Finally, the super-resolution image is obtained by combining the fused image with the difference between the high-resolution and interpolated images. Super-resolution reduces image errors and improves image quality.
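
A minimal sketch of the dictionary-learning and patch-reconstruction pipeline described above, assuming scikit-learn and scikit-image are available; function names and parameters (patch_size, n_atoms) are illustrative, not taken from the paper:

```python
import numpy as np
from skimage.transform import resize
from skimage.util import view_as_windows
from sklearn.decomposition import MiniBatchDictionaryLearning

def train_dictionary(hr_image, scale=2, patch_size=8, n_atoms=128):
    # Low-resolution version and its interpolation back to HR size
    lr = resize(hr_image, (hr_image.shape[0] // scale, hr_image.shape[1] // scale))
    interp = resize(lr, hr_image.shape)
    residual = hr_image - interp                       # high-frequency detail to learn
    patches = view_as_windows(residual, (patch_size, patch_size), step=patch_size)
    patches = patches.reshape(-1, patch_size * patch_size)
    dico = MiniBatchDictionaryLearning(n_components=n_atoms, alpha=1.0)
    dico.fit(patches)
    return dico

def super_resolve(lr_image, dico, scale=2, patch_size=8):
    interp = resize(lr_image, (lr_image.shape[0] * scale, lr_image.shape[1] * scale))
    windows = view_as_windows(interp, (patch_size, patch_size), step=patch_size)
    ni, nj = windows.shape[:2]
    codes = dico.transform(windows.reshape(-1, patch_size * patch_size))
    detail = (codes @ dico.components_).reshape(ni, nj, patch_size, patch_size)
    out = interp.copy()
    for i in range(ni):                                # add reconstructed detail patch-wise
        for j in range(nj):
            out[i*patch_size:(i+1)*patch_size,
                j*patch_size:(j+1)*patch_size] += detail[i, j]
    return out
```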

Keywords: image dictionary creation, image super-resolution, LGE images, patch extraction

Procedia PDF Downloads 371
469 Assertion-Driven Test Repair Based on Priority Criteria

Authors: Ruilian Zhao, Shukai Zhang, Yan Wang, Weiwei Wang

Abstract:

Repairing broken test cases is an expensive and challenging task in evolving software systems. Although an automated repair technique with intent preservation has been proposed, it does not take into account the association between test repairs and assertions, leading to a large number of irrelevant candidates and decreasing repair capability. This paper proposes an assertion-driven test repair approach. Furthermore, an intent-oriented priority criterion is introduced to guide repair candidate generation, making the repairs closer to the intent of the test. In more detail, repair targets are determined through post-dominance relations between assertions and the methods that directly cause compilation errors. Then, test repairs are generated from the targets in a bottom-up way, guided by the intent-oriented priority criteria. Finally, the generated repair candidates are prioritized to match the original test intent. The approach is implemented and evaluated on a benchmark of 4 open-source programs and 91 broken test cases. The results show that the approach can fix 89% (81/91) of broken test cases, which is more effective than the existing intent-preserved test repair approach, and that our intent-oriented priority criteria work well.
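
A minimal sketch of the candidate-prioritization idea: rank repair candidates by a similarity score against the original test. The token-overlap heuristic below is illustrative only, not the paper's intent-oriented criteria:

```python
# Rank candidate repairs so that the one closest to the original test intent
# is tried first; Jaccard token overlap stands in for a real intent metric.
def intent_similarity(candidate, original):
    a, b = set(candidate.split()), set(original.split())
    return len(a & b) / len(a | b)

def prioritize(candidates, original_test):
    return sorted(candidates,
                  key=lambda c: intent_similarity(c, original_test),
                  reverse=True)

repairs = ['assertEquals(user.getName(), "bob")',
           'assertNotNull(user)',
           'assertEquals(account.getName(), "bob")']
print(prioritize(repairs, 'assertEquals(user.getName(), "alice")'))
```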

Keywords: test repair, test intent, software test, test case evolution

Procedia PDF Downloads 125
468 The Profit Trend of Cosmetics Products Using Bootstrap Edgeworth Approximation

Authors: Edlira Donefski, Lorenc Ekonomi, Tina Donefski

Abstract:

Edgeworth approximation is one of the most important statistical methods and has made a considerable contribution to reducing the sum of the standard deviations of the independent variables' coefficients in a quantile regression model, which estimates the conditional median or other quantiles. In this paper, we apply approximating statistical methods to an economic problem. We created and generated a quantile regression model to see how the profit gained is connected with the realized sales of cosmetic products, using real data taken from a local business. The linear regression of the generated profit on the realized sales was not free of autocorrelation and heteroscedasticity, which is why we used this model instead of linear regression. Our aim is to analyze in more detail the relation between the variables under study, the profit and the realized sales, and how to minimize the standard errors of the independent variable involved in this study, the level of realized sales. The statistical methods that we apply in our work are the Edgeworth approximation for independent and identically distributed (IID) cases, the bootstrap version of the model, and the Edgeworth approximation for the bootstrap quantile regression model. The graphics and results presented here identify the best approximating model of our study.
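
A minimal sketch of the bootstrap step for a quantile regression coefficient, assuming statsmodels and a pandas DataFrame `df` with columns 'profit' and 'sales' (illustrative names; the study's data are not public):

```python
import numpy as np
import statsmodels.formula.api as smf

def bootstrap_quantreg(df, q=0.5, n_boot=1000, seed=0):
    rng = np.random.default_rng(seed)
    coefs = []
    for _ in range(n_boot):
        # Resample rows with replacement and refit the median regression
        sample = df.sample(n=len(df), replace=True,
                           random_state=int(rng.integers(10**9)))
        fit = smf.quantreg("profit ~ sales", sample).fit(q=q)
        coefs.append(fit.params["sales"])
    coefs = np.asarray(coefs)
    return coefs.mean(), coefs.std(ddof=1)   # bootstrap estimate and standard error
```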

Keywords: bootstrap, edgeworth approximation, IID, quantile

Procedia PDF Downloads 155
467 Unsupervised Part-of-Speech Tagging for Amharic Using K-Means Clustering

Authors: Zelalem Fantahun

Abstract:

Part-of-speech tagging is the process of assigning a part-of-speech or other lexical class marker to each word in naturally occurring text. It is one of the most fundamental tasks in natural language processing. In natural language processing, the problem of providing large amounts of manually annotated data is a knowledge acquisition bottleneck. Since Amharic is an under-resourced language, the availability of a tagged corpus is the bottleneck for natural language processing, especially for POS tagging. A promising direction to tackle this problem is to provide a system that does not require manually tagged data. In unsupervised learning, the learner is not provided with classifications; unsupervised algorithms seek out similarity between pieces of data in order to determine whether they can be characterized as forming a group. This paper describes the development of an unsupervised part-of-speech tagger using K-means clustering for the Amharic language, since a large amount of raw data is produced in day-to-day activities. In the development of the tagger, the following procedures are followed. First, the unlabeled data (raw text) is divided into 10 folds, and tokenization takes place; at this level, the raw text is chunked at the sentence level and then into words. The second phase is feature extraction, which includes the word frequency and the syntactic and morphological features of a word. The third phase is clustering; among different clustering algorithms, K-means is selected and implemented in this study to bring groups of similar words together. The fourth phase is mapping, which deals with looking at each cluster carefully and assigning the most common tag to the group. This study finds two features capable of distinguishing one part-of-speech from another, namely morphological features and positional information, and shows that it is possible to use unsupervised learning for Amharic POS tagging. In order to increase the performance of the unsupervised part-of-speech tagger, there is a need to incorporate other features not included in this study, such as semantically related information. Finally, based on the experimental results, the system achieves a maximum of 81% accuracy.
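
A minimal sketch of the clustering phase, assuming scikit-learn; the suffix, position, and frequency features are simplified stand-ins for the word frequency, syntactic, and morphological features used in the study:

```python
from collections import Counter
from sklearn.feature_extraction import DictVectorizer
from sklearn.cluster import KMeans

def induce_tags(sentences, n_tags=10):
    words, feats = [], []
    freq = Counter(w for s in sentences for w in s)
    for s in sentences:
        for i, w in enumerate(s):
            words.append(w)
            feats.append({
                "suffix": w[-2:],                     # crude morphological feature
                "position": i / max(len(s) - 1, 1),   # positional information
                "logfreq": float(freq[w]),            # frequency feature
            })
    X = DictVectorizer().fit_transform(feats)
    labels = KMeans(n_clusters=n_tags, n_init=10).fit_predict(X)
    # Cluster id stands in for a POS tag (one tag per word type, last wins)
    return dict(zip(words, labels))
```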

Keywords: POS tagging, Amharic, unsupervised learning, k-means

Procedia PDF Downloads 444
466 Assessing the Quality of Clinical Photographs Taken for Orthodontic Patients at Queen’s Hospital, Romford

Authors: Maya Agarwala

Abstract:

Objectives: To audit the quality of clinical photographs taken for orthodontic patients at Queen's Hospital, Romford. Design and setting: All orthodontic photographs are taken in the Medical Photography Department at Queen's Hospital. This was a retrospective audit with data collected between January and March 2023. Gold standard: the Institute of Medical Illustrators (IMI) standard of 12 photographs: 6 extraoral and 6 intraoral; 100% of patients should have the standard 12 photographs, of satisfactory diagnostic quality. Materials and methods: 30 patients were randomly selected, and all photographs were analysed against the IMI gold standard. Results: A total of 360 photographs were analysed. 100% of the photographs had the 12 photographic views, of which 93.1% met the gold standard. Of the extraoral photographs, 99.4% met the gold standard; 0.6% had incorrect head positioning. Of the intraoral photographs, 87.2% met the gold standard. The most common intraoral errors were the presence of saliva pooling (7.2%), insufficient soft tissue retraction (3.3%), incomplete occlusal surface visibility (2.2%), and mirror fogging (1.1%). Conclusion: The gold standard was not met; however, the overall standard of orthodontic photographs is high. Further training of the medical photography team is needed to improve the quality of photographs, after which the audit will be repeated. High-quality clinical photographs are an important part of clinical record keeping.

Keywords: orthodontics, paediatric, photography, audit

Procedia PDF Downloads 85
465 The Effect of Institutions on Economic Growth: An Analysis Based on Bayesian Panel Data Estimation

Authors: Mohammad Anwar, Shah Waliullah

Abstract:

This study investigated panel data regression models, using Bayesian and classical methods to study the impact of institutions on economic growth with data from 1990-2014, especially in developing countries. Under both the classical and the Bayesian methodology, two panel data models were estimated: common effects and fixed effects. For the Bayesian approach, prior information is used, with a normal-gamma prior for the panel data models. The analysis was done with the WinBUGS14 software. The estimated results of the study showed that panel data models are valid models in the Bayesian methodology. In the Bayesian approach, all independent variables had positive and significant effects on the dependent variable. Based on the standard errors of all models, the fixed effect model is the best model in the Bayesian estimation of panel data models; it also proved to have the lowest standard error compared to the other models.
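
A minimal sketch of a Bayesian fixed-effects panel model with a normal-gamma prior, written with PyMC for illustration (the study itself used WinBUGS14, a different tool); variable names are illustrative:

```python
import pymc as pm

def fit_fixed_effects(growth, inst, country_idx, n_countries):
    with pm.Model():
        tau = pm.Gamma("tau", alpha=0.01, beta=0.01)          # precision (gamma prior)
        beta = pm.Normal("beta", mu=0, sigma=10)              # institutions effect
        alpha = pm.Normal("alpha", mu=0, sigma=10,
                          shape=n_countries)                  # country fixed effects
        mu = alpha[country_idx] + beta * inst
        pm.Normal("y", mu=mu, sigma=1 / pm.math.sqrt(tau), observed=growth)
        trace = pm.sample(1000, tune=1000)
    return trace
```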

Keywords: Bayesian approach, common effect, fixed effect, random effect, dynamic random effect model

Procedia PDF Downloads 65
464 Combining Corpus Linguistics and Critical Discourse Analysis to Study Power Relations in Hindi Newspapers

Authors: Vandana Mishra, Niladri Sekhar Dash, Jayshree Charkraborty

Abstract:

This paper focuses on the application of corpus linguistics techniques to the critical discourse analysis (CDA) of Hindi newspapers. While corpus linguistics is the study of language as expressed in corpora (samples) of 'real world' text, CDA is an interdisciplinary approach to the study of discourse that views language as a form of social practice. CDA has mainly been studied from a qualitative perspective; however, recent studies have begun combining corpus linguistics with CDA in analyzing large volumes of text for the study of existing power relations in society. The corpus under study is also of a sizable amount (1 million words of Hindi newspaper text), and its analysis requires an alternative analytical procedure. We have therefore combined the quantitative approach, i.e., the use of corpus techniques, with CDA's traditional qualitative analysis. In this context, we have focused on keyword analysis, sorting concordance lines of the selected keywords, and calculating collocates of the keywords, using the WordSmith Tools software for all these analyses. The analysis starts with identifying the keywords in the political news corpus when compared with the main news corpus. The keywords are extracted from the corpus based on their keyness, calculated through statistical tests such as the chi-squared test and the log-likelihood test on the frequent words of the corpus. Some of the top-occurring keywords are मोदी (Modi), भाजपा (BJP), कांग्रेस (Congress), सरकार (Government), and पार्टी (Political party). This is followed by concordance analysis of these keywords, which generates thousands of lines, of which we select a few and examine them according to our objective. We have also calculated the collocates of the keywords based on their mutual information (MI) score. Both concordance and collocation help to identify lexical patterns in the political texts. Finally, all the quantitative results derived from the corpus techniques are interpreted qualitatively, in accordance with CDA theory, to examine the ways in which political news discourse produces social and political inequality, power abuse, or domination.
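
A minimal sketch of the keyness and collocation statistics mentioned above (log-likelihood and mutual information); the counts in the example are illustrative:

```python
import math

def log_likelihood(a, b, c, d):
    """a: word freq in study corpus, b: freq in reference corpus,
    c, d: the two corpus sizes. Returns the G2 keyness statistic."""
    e1 = c * (a + b) / (c + d)     # expected frequency in the study corpus
    e2 = d * (a + b) / (c + d)     # expected frequency in the reference corpus
    ll = 0.0
    if a: ll += a * math.log(a / e1)
    if b: ll += b * math.log(b / e2)
    return 2 * ll

def mutual_information(f_node, f_collocate, f_pair, corpus_size, span=4):
    """Pointwise MI of a node word and a collocate within a +/- span window."""
    expected = f_node * f_collocate * span / corpus_size
    return math.log2(f_pair / expected)

# e.g. a word occurring 500 times in a 1M-word political subcorpus vs
# 200 times in a 1M-word reference corpus:
print(log_likelihood(500, 200, 1_000_000, 1_000_000))
```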

Keywords: critical discourse analysis, corpus linguistics, Hindi newspapers, power relations

Procedia PDF Downloads 219
463 Modeling the Saltatory Conduction in Myelinated Axons by Order Reduction

Authors: Ruxandra Barbulescu, Daniel Ioan, Gabriela Ciuprina

Abstract:

Saltatory conduction is the way the action potential is transmitted along a myelinated axon. The potential diffuses along the myelinated compartments and is regenerated in the Ranvier nodes, where ion channels allow ionic flow across the membrane. For an efficient simulation of populations of neurons, it is important to use reduced order models both for myelinated compartments and for Ranvier nodes, and to have control over their accuracy and inner parameters. The paper presents a reduced order model of this neural system which allows an efficient simulation method for the saltatory conduction in myelinated axons. This model is obtained by concatenating reduced order linear models of 1D myelinated compartments and nonlinear 0D models of Ranvier nodes. The models for the myelinated compartments are selected from a series of spatially distributed models developed and ranked according to their modeling errors. The extracted model, described by a nonlinear PDE of hyperbolic type, is able to reproduce the saltatory conduction with acceptable accuracy and takes into account the finite propagation speed of the potential. Finally, this model is reduced again in order to make it suitable for inclusion in large-scale neural circuits.
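
For context, the linear 1D compartments in such models are classically governed by the cable equation; a hyperbolic (telegraph-type) variant, sketched below under standard textbook assumptions, is what accounts for a finite propagation speed of the potential:

```latex
% Standard (parabolic) cable equation for a passive compartment,
% with time constant \tau and space constant \lambda:
\tau \frac{\partial V}{\partial t} = \lambda^2 \frac{\partial^2 V}{\partial x^2} - V
% A hyperbolic (telegraph-type) correction adds a second time derivative,
% giving a finite propagation speed:
\tau_c \frac{\partial^2 V}{\partial t^2} + \tau \frac{\partial V}{\partial t}
  = \lambda^2 \frac{\partial^2 V}{\partial x^2} - V
```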

Keywords: action potential, myelinated segments, nonlinear models, Ranvier nodes, reduced order models, saltatory conduction

Procedia PDF Downloads 155
462 Design of an Instrumentation Setup and Data Acquisition System for a Gas Turbine Engine Using Suitable DAQ Software

Authors: Syed Nauman Bin Asghar Bukhari, Mohtashim Mansoor, Mohammad Nouman

Abstract:

An engine test-bed system is a fundamental tool to measure the dynamic parameters, economic performance, and reliability of an aircraft engine, and its automation and accuracy directly influence the precision of the acquired and analysed data. In this paper, we present the design of a digital data acquisition (DAQ) system for a vintage aircraft engine test bed that lacks the capability of displaying all the analysed parameters at one convenient location (one panel, one screen). Recording such measurements in the vintage test bed is not only time-consuming but also prone to human errors. Digitizing such a measurement system requires a DAQ system capable of recording these parameters and displaying them on a one-screen, one-panel monitor. The challenge in designing an upgrade to the vintage system arises from the need to build and integrate a digital measurement system from scratch with a minimal budget and minimal modifications to the existing vintage system. The proposed design not only displays all the key performance and maintenance parameters of the gas turbine engine for the operator as well as the quality inspector on separate screens, but also records the data for further processing and archiving.

Keywords: gas turbine engine, engine test cell, data acquisition, instrumentation

Procedia PDF Downloads 120
461 PhD Research Design and Descriptive Theory: Theoretical Framework for Development of Integrated Management System

Authors: Samuel Quashie

Abstract:

The importance of theory for PhD construction management research cannot be underestimated, as such research requires a sound theoretical basis. A sound theory reduces errors in framing the research problem and helps solve it by building upon current theory; it provides a structure for examination, enables the efficient development of the construction management field, and ties it to practical real-world problems. The aim is to develop a theoretical framework for the application of descriptive theory within PhD research design, and to apply the proposed framework to the topic of 'integrated management systems', classifying the phenomena into categories and exploring the association between the category-defining attributes and the outcomes observed. This involves forming categorizations based upon the attributes of phenomena (frameworks and typologies) and statements of association (models), predicting (a deductive process) and confirming (an inductive process). In conclusion, descriptive theory presents fertile ground for research and theory development in construction management.

Keywords: descriptive theory, PhD research design, theoretical framework, construction management

Procedia PDF Downloads 421
460 Spontaneous and Posed Smile Detection: Deep Learning, Traditional Machine Learning, and Human Performance

Authors: Liang Wang, Beste F. Yuksel, David Guy Brizan

Abstract:

A computational model of affect that can distinguish between spontaneous and posed smiles with no errors on a large, popular data set using deep learning techniques is presented in this paper. A Long Short-Term Memory (LSTM) classifier, a type of recurrent neural network, is utilized and compared to human classification. Results showed that while human classification (mean of 0.7133) was above chance, the LSTM model was more accurate than human classification and other comparable state-of-the-art systems. Additionally, a high accuracy rate was maintained with small amounts of training videos (70 instances). Important features were derived and analyzed to further understand the success of the computational model, and it was inferred that thousands of pairs of points within the eyes and mouth are important throughout all time segments in a smile. This suggests that distinguishing between a posed and a spontaneous smile is a complex task, one which may account for the difficulty and lower accuracy of human classification compared to machine learning models.
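
A minimal sketch of an LSTM classifier of this kind, assuming Keras and that each video is represented as a fixed-length sequence of facial feature vectors; the architecture shown is illustrative, as the paper's exact configuration is not reproduced here:

```python
from tensorflow.keras import Sequential
from tensorflow.keras.layers import LSTM, Dense, Masking

def build_model(timesteps, n_features):
    model = Sequential([
        # Zero-padded frames are masked so shorter videos can share one shape
        Masking(mask_value=0.0, input_shape=(timesteps, n_features)),
        LSTM(64),                        # summarizes the temporal dynamics
        Dense(1, activation="sigmoid"),  # posed (0) vs spontaneous (1)
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model

# X: (n_videos, timesteps, n_features), y: (n_videos,)
# model = build_model(X.shape[1], X.shape[2]); model.fit(X, y, epochs=20)
```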

Keywords: affective computing, affect detection, computer vision, deep learning, human-computer interaction, machine learning, posed smile detection, spontaneous smile detection

Procedia PDF Downloads 122
459 A Study of Adaptive Fault Detection Method for GNSS Applications

Authors: Je Young Lee, Hee Sung Kim, Kwang Ho Choi, Joonhoo Lim, Sebum Chun, Hyung Keun Lee

Abstract:

The purpose of this study is to develop an efficient fault detection method for Global Navigation Satellite System (GNSS) applications based on adaptive estimation. Because they depend on radio frequency signals, GNSS measurements are dominated by systematic errors in the receiver's operating environment. Thus, to utilize GNSS for aerospace or ground vehicles requiring a high level of safety, unhealthy measurements should be considered seriously. For this reason, this paper proposes an adaptive fault detection method to deal with unhealthy measurements in various harsh environments. In the proposed method, the test statistic for fault detection is generated from the estimated measurement noise. Pseudorange and carrier-phase measurement noise are obtained at the time propagations and measurement updates, respectively, in the process of Carrier-Smoothed Code (CSC) filtering. The performance of the proposed method was evaluated with field-collected GNSS measurements. To evaluate the fault detection capability, intentional faults were added to the measurements. The experimental results show that the proposed detection method is efficient in detecting unhealthy measurements and improves the accuracy of GNSS positioning under fault occurrence.
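
A minimal sketch of a residual-based detection test driven by an adaptively estimated noise variance; the chi-square threshold and names are illustrative:

```python
import numpy as np
from scipy.stats import chi2

def detect_fault(residuals, sigma2, alpha=0.001):
    """residuals: innovation vector for one epoch;
    sigma2: adaptively estimated noise variances (same length)."""
    t = np.sum(residuals**2 / sigma2)                 # normalized sum of squares
    threshold = chi2.ppf(1 - alpha, df=len(residuals))
    return t > threshold, t, threshold

# e.g. flag an epoch whose pseudorange residuals exceed the 99.9% bound:
# faulty, t, th = detect_fault(res, sigma2_hat)
```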

Keywords: adaptive estimation, fault detection, GNSS, residual

Procedia PDF Downloads 568
458 Corpus-Based Neural Machine Translation: An Empirical Study of a Multilingual Corpus for Machine Translation of Opaque Idioms on the Cloud AutoML Platform

Authors: Khadija Refouh

Abstract:

Culture-bound expressions have been a bottleneck for natural language processing (NLP) and comprehension, especially in the case of machine translation (MT). In the last decade, the field of machine translation has greatly advanced. Neural machine translation (NMT), which applies artificial intelligence (AI) and deep neural networks to language processing, has recently achieved considerable improvements in translation quality, outperforming previous traditional translation systems in many language pairs. Despite this development, there remain serious challenges for NMT when translating culture-bound expressions, especially for low-resource language pairs such as Arabic-English and Arabic-French, which is not the case with well-established language pairs such as English-French. Machine translation of opaque idioms from English into French is likely to be more accurate than translating them from English into Arabic. For example, the Google Translate application translated the sentence "What a bad weather! It rains cats and dogs." into Arabic as "يا له من طقس سيء! تمطر القطط والكلاب", an inaccurate literal translation. The translation of the same sentence into French was "Quel mauvais temps! Il pleut des cordes.", where Google Translate used the accurate corresponding French idiom. This paper aims to perform NMT experiments towards better translation of opaque idioms using a high-quality, clean multilingual corpus, collected analytically from human-generated idiom translations. AutoML Translation, a Google neural machine translation platform, is used as a custom translation model to improve the translation of opaque idioms. The automatic evaluation of the custom model is compared to Google NMT using the Bilingual Evaluation Understudy (BLEU) score, an algorithm for evaluating the quality of text that has been machine-translated from one natural language to another. Human evaluation is integrated to test the reliability of the BLEU score. The researcher examines syntactic, lexical, and semantic features using Halliday's functional theory.
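
A minimal sketch of the BLEU comparison between a custom model and a baseline, assuming the sacrebleu package; the sentences are illustrative:

```python
import sacrebleu

# One reference stream; each inner list is aligned with the hypotheses
references = [["Quel mauvais temps ! Il pleut des cordes."]]
custom_model_output = ["Quel mauvais temps ! Il pleut des cordes."]
baseline_output = ["Quel mauvais temps ! Il pleut des chats et des chiens."]

print("custom:  ", sacrebleu.corpus_bleu(custom_model_output, references).score)
print("baseline:", sacrebleu.corpus_bleu(baseline_output, references).score)
```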

Keywords: multilingual corpora, natural language processing (NLP), neural machine translation (NMT), opaque idioms

Procedia PDF Downloads 141
457 Simo-syl: A Meta-Phonological Intervention to Support Italian Pre-Schoolers’ Emergent Literacy Skills

Authors: Tamara Bastianello, Rachele Ferrari, Marinella Majorano

Abstract:

The adoption of the syllabic approach in preschool programmes could support and reinforce meta-phonological awareness and literacy skills in children. The introduction of a meta-phonological intervention in preschool could facilitate the transition to primary school, especially for children with learning fragilities. In the present contribution, we investigate the efficacy of the Simo-syl intervention in enhancing emergent literacy skills in children (especially reading). Simo-syl is a 12-week multimedia programme developed to improve children's language and communication skills and later literacy development in preschool. During the intervention, Simo-syl, an invented character, leads children through a series of meta-phonological games. Forty-six Italian preschool children (the Simo-syl group) participated in the programme; seventeen preschool children (the control group) did not. Children in the two groups were between 4;10 and 5;9 years old. They were assessed on their vocabulary, morpho-syntactic, meta-phonological, phonological, and phono-articulatory skills twice: (1) at the beginning of the last year of preschool, through standardised paper-based assessment tools, and (2) one week after the intervention. All children in the Simo-syl group took part in the meta-phonological programme based on the syllabic approach. The intervention lasted 12 weeks (three activities per week; week 1: activities focused on syllable blending and spelling and a first approach to the written code; weeks 2-11: activities focused on syllable recognition; week 12: activities focused on vowel recognition). Only some children (Simo-syl group = 21, control group = 9) were tested again (post-test) one week after the intervention. Before the intervention programme started, the Simo-syl and control groups had similar meta-phonological, phonological, and lexical skills (all ps > .05). One week after the intervention, a significant difference emerged between the two groups in their meta-phonological skills (syllable blending, p = .029; syllable spelling, p = .032), their vowel recognition ability (p = .032), and their word reading skills (p = .05). An ANOVA confirmed the effect of group membership on developmental growth for the word reading task (F(1,28) = 6.83, p = .014, ηp² = .196). Taking part in the Simo-syl intervention has a positive effect on the ability to read in preschool children.

Keywords: intervention programme, literacy skills, meta-phonological skills, syllabic approach

Procedia PDF Downloads 158
456 Prediction of Saturated Hydraulic Conductivity Dynamics in an Iowan Agriculture Watershed

Authors: Mohamed Elhakeem, A. N. Thanos Papanicolaou, Christopher Wilson, Yi-Jia Chang

Abstract:

In this study, a physically based modelling framework was developed to predict saturated hydraulic conductivity (KSAT) dynamics in the Clear Creek Watershed (CCW), Iowa. The modelling framework integrated selected pedotransfer functions and watershed models with geospatial tools. A number of pedotransfer functions and agricultural watershed models were examined to select the models that best represent the study site conditions. Model selection was based on statistical measures of the models' errors compared to KSAT field measurements conducted in the CCW under different soil, climate, and land use conditions. The study has shown that the predictions of the combined pedotransfer function of Rosetta and the Water Erosion Prediction Project (WEPP) model provided the best agreement with the measured KSAT values in the CCW compared to the other tested models. Therefore, Rosetta and WEPP were integrated with Geographic Information System (GIS) tools for visualization of the data in the form of geospatial maps and prediction of KSAT variability in the CCW due to seasonal changes in climate and land use activities.
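
A minimal sketch of the error-based model selection step: compare each candidate model's KSAT predictions against the field measurements and keep the one with the lowest RMSE (names and data are illustrative):

```python
import numpy as np

def rmse(predicted, measured):
    predicted, measured = np.asarray(predicted), np.asarray(measured)
    return np.sqrt(np.mean((predicted - measured) ** 2))

def select_model(candidates, measured):
    """candidates: dict mapping model name -> predicted KSAT values
    at the same field locations as `measured`."""
    scores = {name: rmse(pred, measured) for name, pred in candidates.items()}
    best = min(scores, key=scores.get)
    return best, scores

# best, scores = select_model({"Rosetta+WEPP": k1, "ModelB": k2}, k_field)
```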

Keywords: saturated hydraulic conductivity, pedotransfer functions, watershed models, geospatial tools

Procedia PDF Downloads 257
455 Developing High-Definition Flood Inundation Maps (HD-FIMs) Using Raster Adjustment with Scenario Profiles (RASP™)

Authors: Robert Jacobsen

Abstract:

Flood inundation maps (FIMs) are an essential tool in communicating flood threat scenarios to the public as well as in floodplain governance. With an increasing demand for online raster FIMs, the FIM state of the practice (SOP) is rapidly advancing to meet the dual requirements of high resolution and high accuracy, i.e., high definition. Importantly, today's technology also enables the resolution of problems of local (neighborhood-scale) bias errors that often occur in FIMs, even with the use of SOP two-dimensional flood modeling. To facilitate the development of HD-FIMs, a new GIS method, Raster Adjustment with Scenario Profiles (RASP™), is described for adjusting kernel raster FIMs to match refined scenario profiles. With RASP™, flood professionals can prepare HD-FIMs for a wide range of scenarios from available kernel rasters, including kernel rasters prepared from vector FIMs. The paper provides detailed procedures for RASP™, along with an example of applying RASP™ to prepare an HD-FIM for the August 2016 flood in Louisiana using both an SOP kernel raster and a kernel raster derived from an older vector-based flood insurance rate map. The accuracy of the HD-FIMs achieved with the application of RASP™ to the two kernel rasters is evaluated.
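
A minimal sketch of the general raster-adjustment idea (not the published RASP™ procedure): shift a kernel water-surface raster by the difference between a refined scenario profile and the kernel profile, interpolated along the stream coordinate, then recompute inundation depth:

```python
import numpy as np

def adjust_fim(kernel_wse, terrain, station, kernel_profile, refined_profile,
               station_raster):
    """kernel_wse, terrain, station_raster: 2D rasters (station_raster maps
    each cell to a stream station); profiles: 1D arrays of water-surface
    elevation sampled at the sorted `station` points along the stream."""
    delta = np.interp(station_raster.ravel(), station,
                      refined_profile - kernel_profile)
    adjusted_wse = kernel_wse + delta.reshape(station_raster.shape)
    depth = np.maximum(adjusted_wse - terrain, 0.0)   # wet where depth > 0
    return adjusted_wse, depth
```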

Keywords: hydrology, mapping, high-definition, inundation

Procedia PDF Downloads 73
454 Tibyan Automated Arabic Correction Using Machine-Learning in Detecting Syntactical Mistakes

Authors: Ashwag O. Maghraby, Nida N. Khan, Hosnia A. Ahmed, Ghufran N. Brohi, Hind F. Assouli, Jawaher S. Melibari

Abstract:

Arabic is one of the most important languages. Learning it matters to many people around the world because of its religious and economic importance, and the real challenge lies in using it without grammatical or syntactic mistakes. This research focuses on detecting and correcting syntactic mistakes in Arabic according to their position in the sentence, concentrating on two of the main syntactic rules in Arabic: the dual and the plural. The system analyzes each sentence in the text, using the Stanford CoreNLP morphological analyzer and a machine-learning approach, in order to detect syntactic mistakes and then correct them. A prototype of the proposed system was implemented and evaluated. It uses the support vector machine (SVM) algorithm to detect Arabic grammatical errors and corrects them using a rule-based approach. The prototype system achieved an accuracy of 81%. In general, it shows a set of useful grammatical suggestions that the user may forget about while writing, due to a lack of familiarity with grammar or as a result of writing speed, such as alerting the user when a plural term is used to indicate one person.
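
A minimal sketch of the detect-then-correct pipeline, assuming scikit-learn; the morphological features and the rule table are simplified placeholders for the Stanford CoreNLP analysis used in the paper:

```python
from sklearn.feature_extraction import DictVectorizer
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline

# Each token is described by morphological features plus sentence position
train_feats = [
    {"pos": "NN", "number": "dual", "prev_number": "dual", "position": 2},
    {"pos": "NN", "number": "plural", "prev_number": "dual", "position": 2},
]
train_labels = ["correct", "error"]   # agreement is violated in the 2nd example

clf = make_pipeline(DictVectorizer(), SVC(kernel="linear"))
clf.fit(train_feats, train_labels)

def check(token_feats, rules):
    """Flag a token with the SVM, then apply a rule-based correction."""
    if clf.predict([token_feats])[0] == "error":
        return rules.get((token_feats["pos"], token_feats["prev_number"]))
    return None
```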

Keywords: Arabic language acquisition and learning, natural language processing, morphological analyzer, part-of-speech

Procedia PDF Downloads 147
453 High-Value Health System for All: Technologies for Promoting Health Education and Awareness

Authors: M. P. Sebastian

Abstract:

Health for all is considered a sign of well-being and inclusive growth. New healthcare technologies are contributing to the quality of human lives by promoting health education and awareness, leading to the prevention, early diagnosis, and treatment of the symptoms of diseases. Healthcare technologies have now migrated from medical and institutionalized settings to the home and everyday life. This paper explores these new technologies and investigates how they contribute to health education and awareness, promoting the objective of a high-value health system for all. The methodology used for the research is a literature review. The paper also discusses the opportunities and challenges of futuristic healthcare technologies. The combined advances in genomic medicine, wearables, and the IoT, together with enhanced data collection in electronic health record (EHR) systems, environmental sensors, and mobile device applications, can contribute in a big way to a high-value health system for all. The promise of these technologies includes a reduced total cost of healthcare, a reduced incidence of medical diagnosis errors, and reduced treatment variability. The major barriers to adoption include concerns with the security, privacy, and integrity of healthcare data; regulation and compliance issues; service reliability; interoperability and portability of data; and the user friendliness and convenience of these technologies.

Keywords: big data, education, healthcare, information communication technologies (ICT), patients, technologies

Procedia PDF Downloads 204
452 Orthodontic Treatment Using CAD/CAM System

Authors: Cristiane C. B. Alves, Livia Eisler, Gustavo Mota, Kurt Faltin Jr., Cristina L. F. Ortolani

Abstract:

The correct positioning of brackets is essential for the success of orthodontic treatment. The indirect bonding technique's main objective is to eliminate the positioning errors that commonly occur with direct bracket placement. The objective of this study is to demonstrate that the exact positioning of brackets is extremely relevant to the success of the treatment. The present work reports the case of an adult female patient who attended the clinic with the complaint of having been in orthodontic treatment for more than 5 years without noticing any progress. The intra-oral clinical examination and documentation analysis revealed a Class III malocclusion, an anterior open bite, and the absence of all third molars and of the first upper and lower bilateral premolars. For the treatment, the indirect bonding technique with self-ligating ceramic brackets was applied. The trays were prepared after intraoral digital scanning and the printing of models with a 3D printer. Brackets were positioned virtually, using specialized software. After twelve months of treatment, correction of the malocclusion was observed, as well as closure of the anterior open bite. It is concluded that adequate and precise positioning of brackets is necessary for a successful treatment.

Keywords: anterior open-bite, CAD/CAM, orthodontics, malocclusion, angle class III

Procedia PDF Downloads 185
451 An Experience of Translating an Excerpt from Sophie Adonon’s Echos de Femmes from French to English, Using Reverso

Authors: Michael Ngongeh Mombe

Abstract:

This paper seeks to investigate an assertion made by some colleagues that there is no need to pay a human translator to translate their literary texts, since software such as Reverso can be used to do the translation. The main objective of this study is to examine the veracity of this assertion, using Reverso to translate a literary text without any post-editing by a human translator. The work is based on two theories of translation: the Skopos and communicative theories. It is a documentary study in which data were collected from published documents in libraries, on the internet, and from the translation produced by Reverso. We made a comparative analysis of both the source and target texts in a bid to highlight the weaknesses and strengths of the software. The findings revealed that those who advocate the use of machine translation alone do so in ignorance of the translation mistakes usually made by the software. From the review of all 268 translation segments, we found that the translation produced by Reverso is fraught with errors. We therefore recommend the use of human translators either to translate literary texts or to revise machine-produced translations so that they conform to the skopos of the work. This paper is based on Reverso; similar work in the near future will be based on other translation software to determine their weaknesses and strengths.

Keywords: machine translation, human translator, Reverso, literary text

Procedia PDF Downloads 92
450 How to Teach Italian Intransitive Verbs: Focusing on Unaccusatives and Unergatives

Authors: Joung Hyoun Lee

Abstract:

Intransitive verbs consist of two subclasses called unergatives and unaccusatives. Traditionally, however, Italian intransitive verbs have been taught without regard to their semantic distinctions or any mention of grammatical terms such as unaccusative and unergative, even though there is a huge gap between the two classes. This paper explores the teaching of Italian intransitive verbs by categorizing them into unaccusatives and unergatives, in comparison with research on the teaching of English unaccusative and unergative verbs. For this purpose, the study first analyses various aspects of English vs. Italian unergatives and unaccusatives and the properties of their constructions. Next, it highlights the research trend in Korean students' learning errors, which leans toward causal analyses of the overpassivization of English unaccusative verbs. To investigate these issues, 53 students of the Busan University of Foreign Studies who are studying Italian as a second language were surveyed through a grammaticality judgment test divided into 9 sections. As expected, the findings confirmed that the test results for Italian unaccusatives and unergatives showed both similarities to and differences from those for English. Moreover, there was a clear demand for a more careful way of teaching that considers grammatical items both syntactically and semantically. The research provides a framework for a more effective and systematic teaching method of Italian intransitive verbs for further research.

Keywords: unaccusative verbs, unergative verbs, agent, patient, theme, overpassivization

Procedia PDF Downloads 255
449 A Study on Multidimensional Locus of Control and the Procrastinating Behavior in Employees

Authors: Richa Mishra, Sonia Munjal

Abstract:

In this increasingly hectic and competitive climate, employees are expected to manage the resources available to them to perform their work. However, many are wasting the most precious and scarce resource at their disposal, time, by procrastinating on tasks and thereby costing themselves and their organizations. As timely performance is a requirement of most jobs, procrastination is particularly problematic in the workplace. Evidence suggests that procrastination and poor performance go hand in hand: procrastinators miss more deadlines than non-procrastinators and make more errors and work more slowly when performing timed tasks. This research is hence an effort to add to the sparse knowledge base by throwing light on the relationship between Levenson's multiple dimensions of locus of control and employee procrastination, a possible cause that has not been explored earlier. The study also explores the effect and relationship of multidimensional locus of control and various levels of stress on procrastination. The results of the research have ascertained that there is a significant impact of LOC dimensions on the procrastinating behavior of employees. One of the major findings to emerge from the current research is that managers with 'powerful others' as their dominant LOC dimension procrastinated least, which contradicts previous research results that externals procrastinate more than internals.

Keywords: multidimensional locus of control, workplace procrastination, employee behaviour, manufacturing industry

Procedia PDF Downloads 237
448 Optical Signal-To-Noise Ratio Monitoring Based on Delay Tap Sampling Using Artificial Neural Network

Authors: Feng Wang, Shencheng Ni, Shuying Han, Shanhong You

Abstract:

With the development of optical communication, optical performance monitoring (OPM) has received more and more attention. Since the optical signal-to-noise ratio (OSNR) is directly related to the bit error rate (BER), it is one of the important parameters in optical networks. Recently, artificial neural networks (ANNs) have developed greatly and possess strong learning and generalization abilities. In this paper, a method of OSNR monitoring based on delay-tap sampling (DTS) and an ANN is proposed. The DTS technique is used to extract the eigenvalues of the signal, which are then input into the ANN to realize OSNR monitoring. Experiments on 10 Gb/s non-return-to-zero (NRZ) on-off keying (OOK), 20 Gb/s pulse amplitude modulation (PAM4), and 20 Gb/s return-to-zero (RZ) differential phase-shift keying (DPSK) systems demonstrate OSNR monitoring based on the proposed method. The experimental results show that the range of OSNR monitoring is from 15 to 30 dB, and the root-mean-square errors (RMSEs) for the 10 Gb/s NRZ-OOK, 20 Gb/s PAM4, and 20 Gb/s RZ-DPSK systems are 0.36 dB, 0.45 dB, and 0.48 dB, respectively. The impact of chromatic dispersion (CD) on the accuracy of OSNR monitoring is also investigated in the three experimental systems mentioned above.
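
A minimal sketch of the regression stage: DTS-derived eigenvalues are mapped to OSNR with a small neural network, and accuracy is reported as RMSE in dB, assuming scikit-learn; the data here are random placeholders:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

# X: (n_samples, n_eigenvalues) DTS features, y: true OSNR in dB (15-30 dB)
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8))
y = 15 + 15 * rng.random(500)          # placeholder labels for the sketch

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
ann = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
ann.fit(X_tr, y_tr)
rmse = np.sqrt(np.mean((ann.predict(X_te) - y_te) ** 2))
print(f"OSNR monitoring RMSE: {rmse:.2f} dB")
```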

Keywords: artificial neural network (ANN), chromatic dispersion (CD), delay-tap sampling (DTS), optical signal-to-noise ratio (OSNR)

Procedia PDF Downloads 109
447 Inverse Heat Conduction Analysis of Cooling on Run-Out Tables

Authors: M. S. Gadala, Khaled Ahmed, Elasadig Mahdi

Abstract:

In this paper, we introduce a gradient-based inverse solver to obtain the missing boundary conditions based on the readings of internal thermocouples. The results show that the method is very sensitive to measurement errors and becomes unstable when small time steps are used. Artificial neural networks are shown to be capable of capturing the whole thermal history on the run-out table but are not very effective in restoring the detailed behavior of the boundary conditions; they also behave poorly in nonlinear cases and where the boundary condition profile is different. Genetic algorithms (GA) and particle swarm optimization (PSO) are more effective in finding a detailed representation of the time-varying boundary conditions, as well as in nonlinear cases; however, their convergence takes longer. A variation of the basic PSO, called CRPSO, showed the best performance among the three versions. PSO also proved to be effective in handling noisy data, especially when its performance parameters were tuned. An increase in the self-confidence parameter was found to be effective as well, as it increased the global search capabilities of the algorithm. RPSO was the most effective variation in dealing with noise, closely followed by CRPSO. The latter variation is recommended for inverse heat conduction problems, as it combines the efficiency and effectiveness required by these problems.
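
A minimal sketch of a basic PSO loop for such an inverse problem: each particle encodes a candidate time-varying boundary condition, scored by the mismatch between simulated and measured thermocouple readings; `simulate` is a stand-in for the direct heat conduction solver:

```python
import numpy as np

def pso(objective, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    rng = np.random.default_rng(0)
    x = rng.uniform(-1, 1, (n_particles, dim))     # candidate boundary profiles
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.array([objective(p) for p in x])
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        # Velocity blends inertia, personal best, and global best attraction
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        f = np.array([objective(p) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest

# objective = lambda q: np.sum((simulate(q) - measured_temps) ** 2)
# best_profile = pso(objective, dim=n_time_steps)
```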

Keywords: inverse analysis, function specification, neural networks, particle swarm, run-out table

Procedia PDF Downloads 237
446 A Script for Presentation to the Management of a Teaching Hospital on DXplain Clinical Decision Support System

Authors: Jacob Nortey

Abstract:

Introduction: In recent years, there have been enormous successes in the discovery of scientific knowledge in medicine, coupled with advances in technology. Despite all these successes, the diagnosis and treatment of diseases have become complex. According to the Ibero-American Study of Adverse Effects (IBEAS), about 10% of hospital patients suffer secondary damage during the care process, and approximately 2% die from it. Many clinical decision support systems have been developed to help mitigate such healthcare errors. Method: Relevant databases were searched, including ones peculiar to clinical decision support systems (that is, using Google Scholar, PubMed, and general Google searches). The articles were then screened for a comprehensive overview of the functionality, consultative style, and statistical usage of the DXplain clinical decision support system. Results: Inferences drawn from the articles showed high usage of DXplain for problem-based learning among students in developed countries, as against little or no usage among students in low- and middle-income countries. The results also indicated high usage among general practitioners. Conclusion: Despite the challenges DXplain presents, the benefits of its usage to clinicians and students are enormous.

Keywords: DXplain, clinical decision support system, diagnosis, support systems

Procedia PDF Downloads 76
445 Affective Transparency in Compound Word Processing

Authors: Jordan Gallant

Abstract:

In the compound word processing literature, much attention has been paid to the relationship between a compound’s denotational meaning and that of its morphological whole-word constituents, which is referred to as ‘semantic transparency’. However, the parallel relationship between a compound’s connotation and that of its constituents has not been addressed at all. For instance, while a compound like ‘painkiller’ might be semantically transparent, it is not ‘affectively transparent’. That is, both constituents have primarily negative connotations, while the whole compound has a positive one. This paper investigates the role of affective transparency on compound processing using two methodologies commonly employed in this field: a lexical decision task and a typing task. The critical stimuli used were 112 English bi-constituent compounds that differed in terms of the affective transparency of their constituents. Of these, 36 stimuli contained constituents with similar connotations to the compound (e.g., ‘dreamland’), 36 contained constituents with more positive connotations (e.g. ‘bedpan’), and 36 contained constituents with more negative connotations (e.g. ‘painkiller’). Connotations of the whole-word constituents and compounds were operationalized via valence ratings taken from an off-line ratings database. In Experiment 1, compound stimuli and matched non-word controls were presented visually to participants, who were then asked to indicate whether each was a real word in English. Response times and accuracy were recorded. In Experiment 2, participants typed compound stimuli presented to them visually. Individual keystroke response times and typing accuracy were recorded. The results of both experiments provided positive evidence that compound processing is influenced by affective transparency. In Experiment 1, compounds in which both constituents had more negative connotations than the compound itself were responded to significantly more slowly than compounds in which the constituents had similar or more positive connotations. Typed responses from Experiment 2 showed that inter-keystroke intervals at the morphological constituent boundary were significantly longer when the connotation of the head constituent was either more positive or more negative than that of the compound. The interpretation of this finding is discussed in the context of previous compound typing research. Taken together, these findings suggest that affective transparency plays a role in the recognition, storage, and production of English compound words. This study provides a promising first step in a new direction for research on compound words.
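
A minimal sketch of how affective transparency can be operationalized from a valence-ratings database (e.g., on a 1-9 scale); the words and ratings are illustrative, not the study's stimuli:

```python
# Illustrative valence ratings (higher = more positive connotation)
valence = {"pain": 2.1, "killer": 2.3, "painkiller": 6.5,
           "dream": 7.5, "land": 5.9, "dreamland": 7.2}

def affective_transparency(compound, c1, c2, ratings):
    """Signed difference between compound valence and mean constituent valence:
    near 0 = similar connotations, > 0 = constituents more negative,
    < 0 = constituents more positive than the compound."""
    return ratings[compound] - (ratings[c1] + ratings[c2]) / 2

print(affective_transparency("painkiller", "pain", "killer", valence))  # +4.3
print(affective_transparency("dreamland", "dream", "land", valence))    # ~+0.5
```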

Keywords: compound processing, semantic transparency, typed production, valence

Procedia PDF Downloads 121
444 A Fuzzy TOPSIS Based Model for Safety Risk Assessment of Operational Flight Data

Authors: N. Borjalilu, P. Rabiei, A. Enjoo

Abstract:

A Flight Data Monitoring (FDM) programme assists an operator in the aviation industry to identify, quantify, assess, and address operational safety risks in order to improve the safety of flight operations. FDM is a powerful tool for an aircraft operator when integrated into the operator’s Safety Management System (SMS), allowing it to detect, confirm, and assess safety issues and to check the effectiveness of corrective actions associated with human errors. This article proposes a model for assessing the safety risk level of flight data, focusing on different event categories, based on fuzzy set values. It permits evaluation of the operational safety level from the point of view of flight activities. The main advantage of this method is the proposed qualitative safety analysis of flight data. This research applies the opinions of aviation experts, gathered through a number of questionnaires related to flight data, in four categories of occurrence that can take place during an accident or an incident: runway excursions (RE), controlled flight into terrain (CFIT), mid-air collision (MAC), and loss of control in flight (LOC-I). By weighting each one (by F-TOPSIS) and applying it to the number of risks of the event, the safety risk of each related event can be obtained.
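
A minimal sketch of the TOPSIS ranking step applied to defuzzified expert scores for the four occurrence categories; the scores and weights are illustrative:

```python
import numpy as np

def topsis(matrix, weights):
    """matrix: alternatives x criteria (risk scores, higher = riskier);
    returns the closeness coefficient of each alternative."""
    norm = matrix / np.linalg.norm(matrix, axis=0)    # vector normalization
    v = norm * weights
    ideal, anti = v.max(axis=0), v.min(axis=0)
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)                    # closeness coefficient

events = ["RE", "CFIT", "MAC", "LOC-I"]
scores = np.array([[0.6, 0.7, 0.5],
                   [0.8, 0.6, 0.9],
                   [0.4, 0.9, 0.7],
                   [0.9, 0.8, 0.8]])
weights = np.array([0.5, 0.3, 0.2])
for event, closeness in zip(events, topsis(scores, weights)):
    print(event, round(closeness, 3))
```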

Keywords: F-TOPSIS, fuzzy set, flight data monitoring (FDM), flight safety

Procedia PDF Downloads 163
443 Study and Calibration of Autonomous UAV Systems with Thermal Sensing Allowing Screening of Environmental Concerns

Authors: Raahil Sheikh, Abhishek Maurya, Priya Gujjar, Himanshu Dwivedi, Prathamesh Minde

Abstract:

UAVs have been part of our environment since their first use by Austrian forces against Venice, when they were just pilotless balloons equipped with bombs to be dropped on enemy territory. Over time, technological advancements allowed UAVs to be controlled remotely or autonomously. This study mainly focuses on the enhancement of pre-existing manually piloted drones: equipping them with a variety of sensors, making them autonomous and capable, and purposing them for a variety of roles, including thermal sensing, data collection, wildlife tracking, forest fire and volcano detection, hydrothermal studies, urban heat island measurement, and other environmental research. The system can also be used for reconnaissance, research, 3D mapping, and search and rescue missions. It also aims to automate tedious tasks and reduce human errors as much as possible, reducing deployment time and increasing the overall efficiency, efficacy, and reliability of the UAVs. A comprehensive ground control system (GCS) UI was created, enabling less-trained professionals to use the UAV to its full potential. With the inclusion of such an autonomous system, flight paths can be planned intelligently, and environmental gusts and other concerns can be avoided.

Keywords: UAV, drone, autonomous system, thermal imaging

Procedia PDF Downloads 69
442 Integrated Navigation System Using Simplified Kalman Filter Algorithm

Authors: Othman Maklouf, Abdunnaser Tresh

Abstract:

GPS and inertial navigation systems (INS) have complementary qualities that make them ideal for sensor fusion. The limitations of GPS include occasional high noise content, outages when satellite signals are blocked, interference, and low bandwidth. The strengths of GPS include its long-term stability and its capacity to function as a stand-alone navigation system. In contrast, INS is not subject to interference or outages and has high bandwidth and good short-term noise characteristics, but it has long-term drift errors and requires external information for initialization. A combined system of GPS and INS subsystems can exhibit the robustness, higher bandwidth, and better noise characteristics of the inertial system with the long-term stability of GPS. The most common estimation algorithm used in integrated INS/GPS is the Kalman filter (KF), which is able to take advantage of these characteristics to provide a common integrated navigation implementation with performance superior to that of either subsystem (GPS or INS). This paper presents a simplified KF algorithm for a land vehicle navigation application. In this integration scheme, the GPS-derived positions and velocities are used as the update measurements for the INS-derived position, velocity, and attitude (PVA). The KF error state vector in this case includes the navigation parameters as well as the accelerometer and gyroscope error states.
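
A minimal sketch of the loosely coupled measurement update: GPS-minus-INS position/velocity residuals drive a standard KF over an error state holding navigation errors plus accelerometer and gyroscope biases; the matrices are illustrative placeholders, not a tuned navigation filter:

```python
import numpy as np

def kf_update(x, P, z, H, R):
    """Standard Kalman measurement update.
    x: error-state estimate, P: covariance, z: GPS-minus-INS residual."""
    S = H @ P @ H.T + R                   # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

n = 15                                    # 9 navigation errors + 6 sensor biases
x, P = np.zeros(n), np.eye(n) * 0.1
H = np.hstack([np.eye(6), np.zeros((6, n - 6))])   # GPS observes pos/vel errors
R = np.eye(6) * 2.0
z = np.array([1.2, -0.8, 0.5, 0.05, -0.02, 0.01])  # example residual
x, P = kf_update(x, P, z, H, R)
```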

Keywords: GPS, INS, Kalman filter, inertial navigation system

Procedia PDF Downloads 466
441 The Prediction of California Bearing Ratio (CBR) Values Using Measurements from the DC Soil Resistivity Method

Authors: Fathi Ali Swaid

Abstract:

The CBR test is widely used in the assessment of granular materials in the base, subbase, and subgrade layers of road and airfield pavements. Despite its success, the method depends on a limited number of soil samples, a limitation that does not adequately account for the spatial variability of soil properties. Thus, assessments derived from such cursory soil data are likely to contain errors, making interpretation and soil characterization difficult. On the other hand, quantitative methods of soil inventory at the field scale involve the design and adoption of sampling regimes and laboratory analyses that are time-consuming and costly. In the latter case, new technologies are required to efficiently sample and observe the soil in the field. This is particularly the case where soil bearing capacity problems are prevalent and detailed quantitative information for determining their cause is required. In this paper, a DC electrical resistivity method is described along with its application on the Elg'deem dirt road, located in Gasser Ahmad, Misurata, Libya. Results from the DC instrument were found to be correlated with the CBR values (r² = 0.89). Finally, it is noted that, with experience, the correlation can be used to determine CBR values from basic soil electrical resistivity measurements, checked by a few CBR tests representing a similar range of CBR.
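
A minimal sketch of fitting and applying such a resistivity-CBR correlation, assuming SciPy; the data pairs are illustrative, not the paper's measurements:

```python
import numpy as np
from scipy import stats

resistivity = np.array([120, 180, 250, 310, 400, 520])   # ohm-m (example)
cbr = np.array([8, 12, 17, 21, 28, 35])                  # % (example)

fit = stats.linregress(resistivity, cbr)
print(f"CBR = {fit.slope:.3f} * rho + {fit.intercept:.2f}, "
      f"r^2 = {fit.rvalue**2:.2f}")

def predict_cbr(rho):
    # Check predictions against a few CBR tests in a similar range
    return fit.slope * rho + fit.intercept

print(predict_cbr(300))
```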

Keywords: California bearing ratio, basic soil electrical resistivity, CBR, soil, subgrade, new technologies

Procedia PDF Downloads 444