Search results for: grammatical errors
343 The Development of Explicit Pragmatic Knowledge: An Exploratory Study
Authors: Aisha Siddiqa
Abstract:
The knowledge of pragmatic practices in a particular language is considered key to effective communication. Unlike one’s native language, where this knowledge is acquired spontaneously, learning second language pragmatics requires more conscious attention. Traditional foreign language (FL) classrooms generally focus on the acquisition of vocabulary and lexico-grammatical structures, neglecting pragmatic functions that are essential for effective communication in the multilingual networks of the modern world. Of particular importance is knowledge of what is perceived as polite or impolite in a given language, an aspect of pragmatics that is not perceived as obligatory but is nonetheless indispensable for successful intercultural communication and integration. While learning a second language, the acquisition of politeness assumes more prominence, as politeness norms and practices vary across languages and cultures. Therefore, along with focusing on the ‘use’ of politeness strategies, it is crucial to examine the ‘acquisition’ and the ‘acquisitional development’ of politeness strategies by second language learners, particularly lower-proficiency learners, since politeness norms are usually introduced at the lower levels. Hence, there is an obvious need for a study that not only investigates the acquisition of pragmatics by young FL learners using multiple innovative methods but also identifies the potential causes of the gaps in their development. The present research employs a cross-sectional design to explore the acquisition of politeness by young learners of English as a foreign language (EFL) in France at three levels of secondary school. The methodology involves two phases. In the first phase, a cartoon oral production task (COPT) is used to elicit samples of requests from young EFL learners in French schools.
These data are then supplemented by a) role plays, b) an analysis of textbooks, and c) video recordings of classroom activities. This mixed-methods approach allows us to explore the repertoire of politeness strategies the learners possess and to delve deeper into the opportunities available to learners in classrooms to learn politeness strategies in requests. The paper will provide the results of the analysis of COPT data for 250 learners at three different stages of English as a foreign language development. Data analysis is based on the categorization of requests developed in the CCSARP project. The preliminary analysis of the COPT data shows substantial evidence of pragmalinguistic development across all levels, but the developmental process seems to gain momentum in the second half of the secondary school period compared with the early school years. However, there is very little evidence of sociopragmatic development. The study aims to document current classroom practices in France by looking at the development of young EFL learners’ politeness strategies across three levels of secondary school.
Keywords: acquisition, English, France, interlanguage pragmatics, politeness
Procedia PDF Downloads 424
342 An Efficient Traceability Mechanism in the Audited Cloud Data Storage
Authors: Ramya P, Lino Abraham Varghese, S. Bose
Abstract:
With cloud storage services, data can be stored in the cloud and shared across multiple users. However, unexpected hardware/software failures and human errors can easily cause the stored data to be lost or corrupted, compromising the integrity of data in the cloud. Mechanisms have been designed to allow both data owners and public verifiers to efficiently audit cloud data integrity without retrieving the entire data set from the cloud server. However, public auditing of the integrity of shared data with the existing mechanisms unavoidably reveals confidential information, such as the identity of the signer, to public verifiers. Here, a privacy-preserving mechanism is proposed to support public auditing of shared data stored in the cloud. It uses group signatures to compute the verification metadata needed to audit the correctness of shared data. The identity of the signer of each block in the shared data is kept confidential from public verifiers, who can verify shared data integrity without retrieving the entire file; on demand, the signer of each block is revealed to the owner alone. In a static group, the group private key is generated once by the owner, whereas in a dynamic group the group private key changes whenever users are revoked from the group. When users leave the group, the blocks they have already signed are re-signed by the cloud service provider instead of the owner, which is handled by an efficient proxy re-signature scheme.
Keywords: data integrity, dynamic group, group signature, public auditing
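The per-block structure of such auditing can be sketched in toy form. The snippet below is illustrative only: a keyed hash stands in for the group signature (a real group signature scheme would additionally hide which group member produced each tag), and all names and keys are hypothetical.

```python
import hashlib
import hmac

# Toy stand-in for the group's signing material; a real group signature
# lets any member sign while concealing which member did so.
GROUP_KEY = b"shared-group-key"

def sign_block(block: bytes) -> bytes:
    # Per-block verification metadata (the "tag" a public verifier checks).
    return hmac.new(GROUP_KEY, block, hashlib.sha256).digest()

def audit(blocks, tags) -> bool:
    # Public auditing: verify every block's tag without needing the signer's
    # identity or the file as a single whole.
    return all(hmac.compare_digest(sign_block(b), t)
               for b, t in zip(blocks, tags))

blocks = [b"block-0", b"block-1", b"block-2"]
tags = [sign_block(b) for b in blocks]
print(audit(blocks, tags))   # True: shared data intact
blocks[1] = b"tampered"
print(audit(blocks, tags))   # False: corruption detected
```

Per-block tags are what allow auditing without downloading the whole file; the proxy re-signature step in the abstract corresponds to recomputing tags for a departed user's blocks.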
Procedia PDF Downloads 392
341 Accuracy of Autonomy Navigation of Unmanned Aircraft Systems through Imagery
Authors: Sidney A. Lima, Hermann J. H. Kux, Elcio H. Shiguemori
Abstract:
Unmanned Aircraft Systems (UAS) usually navigate using the Global Navigation Satellite System (GNSS) combined with an Inertial Navigation System (INS). However, GNSS accuracy can be degraded at any time, and the signal can even be lost entirely. In addition, there is the possibility of malicious interference, known as jamming. An image navigation system can solve this autonomy problem: if GNSS is disabled or degraded, the image navigation system continues to provide coordinate information to the INS, preserving the autonomy of the system. This work aims to evaluate the accuracy of positioning through photogrammetry concepts. The methodology uses orthophotos and Digital Surface Models (DSM) as a reference to represent the object space and photographs obtained during the flight to represent the image space. To calculate the coordinates of the perspective center and the camera attitudes, it is necessary to know the coordinates of homologous points in the object space (orthophoto coordinates and DSM altitude) and in the image space (column and line of the photograph). Thus, if the homologous points can be identified automatically in real time, the coordinates and attitudes can be calculated with their respective accuracies. With the methodology applied in this work, maximum errors on the order of 0.5 m in positioning and 0.6º in camera attitude were verified, so image-based navigation can reach an accuracy equal to or better than that of GNSS receivers without differential correction. Therefore, navigating through imagery is a good alternative for enabling autonomous navigation.
Keywords: autonomy, navigation, security, photogrammetry, remote sensing, spatial resection, UAS
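The orthophoto side of the homologous-point lookup amounts to mapping a pixel (column, line) to map coordinates. A minimal sketch, assuming a GDAL-style affine geotransform; the coordinate values are hypothetical, not from the study:

```python
def pixel_to_ground(gt, col, row):
    # GDAL-style affine geotransform:
    # gt = (x_origin, pixel_width, row_rotation, y_origin, col_rotation, pixel_height)
    x = gt[0] + col * gt[1] + row * gt[2]
    y = gt[3] + col * gt[4] + row * gt[5]
    return x, y

# Hypothetical orthophoto: 0.25 m ground sample distance, north-up (no rotation).
gt = (500000.0, 0.25, 0.0, 4300000.0, 0.0, -0.25)
print(pixel_to_ground(gt, 400, 200))  # (500100.0, 4299950.0)
```

The DSM supplies the altitude at the same location, completing the object-space coordinate used in the spatial resection.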
Procedia PDF Downloads 191
340 Blockchain Technology for Secure and Transparent Oil and Gas Supply Chain Management
Authors: Gaurav Kumar Sinha
Abstract:
The oil and gas industry, characterized by its complex and global supply chains, faces significant challenges in ensuring security, transparency, and efficiency. Blockchain technology, with its decentralized and immutable ledger, offers a transformative solution to these issues. This paper explores the application of blockchain technology in the oil and gas supply chain, highlighting its potential to enhance data security, improve transparency, and streamline operations. By leveraging smart contracts, blockchain can automate and secure transactions, reducing the risk of fraud and errors. Additionally, the integration of blockchain with IoT devices enables real-time tracking and monitoring of assets, ensuring data accuracy and integrity throughout the supply chain. Case studies and pilot projects within the industry demonstrate the practical benefits and challenges of implementing blockchain solutions. The findings suggest that blockchain technology can significantly improve trust and collaboration among supply chain participants, ultimately leading to more efficient and resilient operations. This study provides valuable insights for industry stakeholders considering the adoption of blockchain technology to address their supply chain management challenges.
Keywords: blockchain technology, oil and gas supply chain, data security, transparency, smart contracts, IoT integration, real-time tracking, asset monitoring, fraud reduction, supply chain efficiency, data integrity, case studies, industry implementation, trust, collaboration
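The tamper-evidence property of an immutable ledger rests on hash-chaining entries, so altering any past record invalidates every later hash. A minimal, hypothetical sketch (a toy, with no consensus layer or smart contracts):

```python
import hashlib
import json

def make_block(data, prev_hash):
    # Each ledger entry is bound to its predecessor through prev_hash.
    body = {"data": data, "prev": prev_hash}
    block = dict(body)
    block["hash"] = hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()).hexdigest()
    return block

def chain_valid(chain):
    for i, b in enumerate(chain):
        body = {"data": b["data"], "prev": b["prev"]}
        if b["hash"] != hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest():
            return False          # entry was altered after signing
        if i > 0 and b["prev"] != chain[i - 1]["hash"]:
            return False          # link to predecessor is broken
    return True

chain = [make_block("crude batch #1 loaded at terminal", "0" * 64)]
chain.append(make_block("custody transferred to refinery", chain[-1]["hash"]))
print(chain_valid(chain))        # True
chain[0]["data"] = "tampered"    # any edit to history breaks the chain
print(chain_valid(chain))        # False
```

Real deployments add distributed consensus and smart-contract logic on top of this basic structure.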
Procedia PDF Downloads 37
339 The Renewed Constitutional Roots of Agricultural Law in Hungary in Line with Sustainability
Authors: Gergely Horvath
Abstract:
The study analyzes the special provisions on agriculture at the highest level of national legislation, in the Fundamental Law of Hungary (25 April 2011), with descriptive, analytic, and comparative methods. The agriculturally relevant articles of the constitution are very important because, in spite of their high level of abstraction, they can determine and serve practice comprehensively and effectively. That is why the objective of the research is to interpret the concrete sentences and phrases connected with agriculture and to compare them with those of other relevant constitutions (historical-grammatical interpretation). The major findings of the study focus on searching for the provisions and approach capable of solving the problems of sustainable food production. The real challenge agricultural law must face in the future is protecting or conserving its background and subjects: the environment, ecosystem services, and all the 'roots' of food production. In effect, agricultural law is the legal aspect of the production of 'our daily bread' from farm to table. However, it must also guarantee safe daily food for our children and for all our descendants. In connection with sustainability, this unique, value-oriented constitution of an agrarian country even deals with questions uncustomary at this level of legislation, such as GMOs (by banning the production of genetically modified crops). The starting point is that the principle of the public good (principium boni communis) must be the leading notion of the norm, an idea partly outside the law. The public interest is reflected in agricultural law mainly in the concept of public health (in connection with food security) and the security of supply of healthy food. Article P, as construed, establishes the general protection of our natural resources as a requirement.
The enumeration of the specific natural resources 'which all form part of the common national heritage' also entails the conservation of the foundations of sustainable agriculture. The reference to arable land represents the subfield of land protection (and soil conservation); that to water resources, the subfield of water protection; and the references to forests and biological diversity reflect the specialty of nature conservation, which is an essential support for agrobiodiversity. These protected objects constituting the nation's common heritage metonymically merge with their protective regimes, strengthening them and forming constitutional anchors of law. These regimes also protect the natural foundations of life for the living and for future generations, in the name of intra- and intergenerational equity.
Keywords: agricultural law, constitutional values, natural resources, sustainability
Procedia PDF Downloads 166
338 Evaluation of Turbulence Prediction over Washington, D.C.: Comparison of DCNet Observations and North American Mesoscale Model Outputs
Authors: Nebila Lichiheb, LaToya Myles, William Pendergrass, Bruce Hicks, Dawson Cagle
Abstract:
Atmospheric transport of hazardous materials in urban areas is increasingly under investigation due to the potential impact on human health and the environment. In response to health and safety concerns, several dispersion models have been developed to analyze and predict the dispersion of hazardous contaminants. The models of interest usually rely on meteorological information obtained from the meteorological models of NOAA’s National Weather Service (NWS). However, due to the complexity of the urban environment, NWS forecasts provide an inadequate basis for dispersion computation in urban areas. A dense meteorological network in Washington, DC, called DCNet, has been operated by NOAA since 2003 to support the development of urban monitoring methodologies and to provide the driving meteorological observations for atmospheric transport and dispersion models. This study focuses on the comparison of wind observations from the DCNet station on the U.S. Department of Commerce Herbert C. Hoover Building against North American Mesoscale (NAM) model outputs for the period 2017-2019. The goal is to develop a simple methodology for modifying NAM outputs so that the dispersion requirements of the city and its urban area can be satisfied. This methodology will allow us to quantify the prediction errors of the NAM model and propose adjustments to the key variables controlling dispersion model calculations.
Keywords: meteorological data, Washington D.C., DCNet data, NAM model
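Quantifying model prediction errors against tower observations typically starts with paired difference statistics such as bias and root-mean-square error. A minimal sketch with hypothetical wind-speed values (not the actual 2017-2019 series):

```python
import math

def bias_and_rmse(model, obs):
    # Model-minus-observation statistics for paired forecast/measurement series.
    diffs = [m - o for m, o in zip(model, obs)]
    bias = sum(diffs) / len(diffs)
    rmse = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return bias, rmse

# Hypothetical hourly wind speeds (m/s), for illustration only.
nam = [4.2, 5.1, 3.8, 6.0]      # model values
dcnet = [3.9, 4.6, 4.1, 5.2]    # tower observations
bias, rmse = bias_and_rmse(nam, dcnet)
print(round(bias, 3), round(rmse, 3))
```

A systematic (nonzero) bias is what a simple NAM-adjustment methodology would remove first; the residual RMSE then measures the remaining scatter.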
Procedia PDF Downloads 234
337 Design and Implementation of PD-NN Controller Optimized Neural Networks for a Quad-Rotor
Authors: Chiraz Ben Jabeur, Hassene Seddik
Abstract:
In this paper, a full approach to the modeling and control of a four-rotor unmanned aerial vehicle (UAV), known as a quad-rotor, is presented. A PD controller and a PD controller optimized by neural networks (PD-NN) are developed and applied to control the quad-rotor. The goal of this work is to design a smart self-tuning PD controller, based on neural networks, able to supervise the quad-rotor for optimized behavior while tracking the desired trajectory. Many challenges can arise when the quad-rotor navigates in hostile environments presenting irregular disturbances in the form of wind, added to the model on each axis. Thus, the quad-rotor is subject to three-dimensional unknown static/varying wind disturbances. The quad-rotor has to perform tasks quickly while ensuring stability and accuracy, and must react rapidly in its decision-making when facing disturbances. This technique offers some advantages over conventional control methods such as the plain PD controller. Simulation results are obtained in the MATLAB/Simulink environment and are based on a comparative study between the PD and PD-NN controllers under wind disturbances, which are applied with several degrees of strength to test the quad-rotor's behavior. The simulation results are satisfactory and demonstrate the effectiveness of the proposed PD-NN approach: this controller yields smaller errors than the PD controller and has a better capability to reject disturbances. In addition, it has proven to be highly robust and efficient when facing turbulence in the form of wind disturbances.
Keywords: hostile environment, PD and PD-NN controllers, quad-rotor control, robustness against disturbance
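The baseline PD law that the neural network would tune can be sketched as follows; the gains and the altitude-hold scenario are hypothetical illustrations, not the paper's tuned values:

```python
def pd_control(error, prev_error, dt, kp, kd):
    # PD law: u = Kp*e + Kd*(de/dt).  In the PD-NN variant, a neural network
    # would adjust Kp and Kd online instead of keeping them fixed.
    return kp * error + kd * (error - prev_error) / dt

# Hypothetical altitude-hold step: error shrinks from 2.5 m to 2.0 m over 10 ms.
u = pd_control(error=2.0, prev_error=2.5, dt=0.01, kp=1.5, kd=0.05)
print(round(u, 6))  # 1.5*2.0 + 0.05*(-0.5)/0.01 = 0.5
```

The derivative term damps the response; a self-tuning scheme raises the gains when disturbances grow and lowers them near the setpoint.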
Procedia PDF Downloads 137
336 A Spatial Approach to Model Mortality Rates
Authors: Yin-Yee Leong, Jack C. Yue, Hsin-Chung Wang
Abstract:
Human longevity has been experiencing its largest increase since the end of World War II, and modeling mortality rates is therefore often the focus of many studies. Among all mortality models, the Lee–Carter model is the most popular approach since it is fairly easy to use and has good accuracy in predicting mortality rates (e.g., for Japan and the USA). However, empirical studies from several countries have shown that the age parameters of the Lee–Carter model are not constant in time. Many modifications of the Lee–Carter model have been proposed to deal with this problem, including adding an extra cohort effect or another period effect. In this study, we propose a spatial modification and use clusters to explain why the age parameters of the Lee–Carter model are not constant. In spatial analysis, clusters are areas with unusually high or low mortality rates compared with their neighbors, where the “location” of mortality rates is measured by age and time, that is, a two-dimensional coordinate. We use a popular cluster detection method, the spatial scan statistic, a local test based on the likelihood ratio, to evaluate whether there are locations with mortality rates that cannot be described well by the Lee–Carter model. We first use computer simulation to demonstrate that the cluster effect is a possible source of the problem of non-constant age parameters. Next, we show that adding the cluster effect can solve this non-constant problem. We also apply the proposed approach to mortality data from Japan, France, the USA, and Taiwan. The empirical results show that our approach achieves better-fitting results and smaller mean absolute percentage errors than the Lee–Carter model.
Keywords: mortality improvement, Lee–Carter model, spatial statistics, cluster detection
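For reference, the classical Lee–Carter model discussed above is conventionally written, with its usual identifiability constraints, as:

```latex
\ln m_{x,t} = a_x + b_x\,k_t + \varepsilon_{x,t},
\qquad \sum_x b_x = 1, \qquad \sum_t k_t = 0,
```

where $m_{x,t}$ is the central death rate at age $x$ in year $t$, $a_x$ the average log mortality profile, $b_x$ the age-specific sensitivity to the period index $k_t$, and $\varepsilon_{x,t}$ the error term. One plausible formalization of the proposed spatial modification (an illustration, not necessarily the authors' exact specification) adds a cluster effect $\delta\,\mathbf{1}\{(x,t)\in C\}$ for the (age, time) cells $C$ flagged by the scan statistic.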
Procedia PDF Downloads 171
335 Improving Second Language Speaking Skills via Video Exchange
Authors: Nami Takase
Abstract:
Computer-mediated communication allows people to connect and interact with each other as if they were sharing the same space. The current study examined the effects of using video letters (VLs) on the development of second language speaking skills of Common European Framework of Reference for Languages (CEFR) A1 and CEFR B2 level learners of English as a foreign language. Two groups were formed to measure the impact of VLs. The experimental and control groups were given the same topic, and both groups worked with a native English-speaking university student from the United States of America. Students in the experimental group exchanged VLs, and students in the control group used video conferencing. Pre- and post-tests were conducted to examine the effects of each practice mode. The transcribed speech-text data showed that the VL group improved in speech accuracy scores, while the video conferencing group increased in sentence complexity scores. The use of VLs may be more effective for beginner-level learners because they are able to notice their own errors and replay videos to better understand the native speaker’s speech at their own pace. Both the VL and video conferencing groups provided positive feedback regarding their interactions with native speakers. The results showed how different types of computer-mediated communication impact different areas of language learning and speaking practice, and how each type of online communication tool is suited to different teaching objectives.
Keywords: computer-assisted-language-learning, computer-mediated-communication, english as a foreign language, speaking
Procedia PDF Downloads 99
334 Compensatory Articulation of Pressure Consonants in Telugu Cleft Palate Speech: A Spectrographic Analysis
Authors: Indira Kothalanka
Abstract:
For individuals born with a cleft palate (CP), there is no separation between the nasal cavity and the oral cavity, due to which they cannot build up enough air pressure in the mouth for speech. Therefore, it is common for them to have speech problems. Common cleft-type speech errors include abnormal articulation (compensatory or obligatory) and abnormal resonance (hyper-, hypo-, and mixed nasality). These are generally resolved after palate repair. However, in some individuals, articulation problems persist even after palate repair. Such individuals develop variant articulations in an attempt to compensate for the inability to produce the target phonemes. A spectrographic analysis is used to investigate the compensatory articulatory behaviors of pressure consonants in the speech of 10 Telugu-speaking individuals aged between 7 and 17 years with a history of cleft palate. Telugu is a Dravidian language spoken in the Andhra Pradesh and Telangana states of India; it has the third largest number of native speakers in India and is the most widely spoken Dravidian language. The speech of the informants is analyzed using a single-word list, sentences, a passage, and conversation. Spectrographic analysis is carried out using PRAAT, a speech analysis software package. The place and manner of articulation of consonant sounds are studied through spectrograms with the help of various acoustic cues. The types of compensatory articulation identified are glottal stops, palatal stops, uvular and velar stops, and nasal fricatives, which are non-native in Telugu.
Keywords: cleft palate, compensatory articulation, spectrographic analysis, PRAAT
Procedia PDF Downloads 443
333 Efficient Estimation for the Cox Proportional Hazards Cure Model
Authors: Khandoker Akib Mohammad
Abstract:
While analyzing time-to-event data, it is possible that a certain fraction of subjects will never experience the event of interest; these subjects are said to be cured. When this feature of survival models is taken into account, the models are commonly referred to as cure models. In the presence of covariates, the conditional survival function of the population can be modeled using the cure model, which depends on the probability of being uncured (incidence) and the conditional survival function of the uncured subjects (latency); a combination of logistic regression and Cox proportional hazards (PH) regression is used to model the incidence and latency, respectively. In this paper, we show the asymptotic normality of the profile likelihood estimator via an asymptotic expansion of the profile likelihood and obtain the explicit form of the variance estimator with an implicit function in the profile likelihood. We also show that the efficient score function based on projection theory and the profile likelihood score function are equal. Our contribution in this paper is that we express the efficient information matrix as the variance of the profile likelihood score function. A simulation study suggests that the estimated standard errors from bootstrap samples (SMCURE package) and from the profile likelihood score function (our approach) provide similar, comparable results. The numerical performance of our proposed method is also shown using the melanoma data from the SMCURE R package, and we compare the results with the output obtained from the SMCURE package.
Keywords: Cox PH model, cure model, efficient score function, EM algorithm, implicit function, profile likelihood
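The mixture cure model described above (logistic incidence combined with Cox PH latency) is conventionally written as:

```latex
S_{\mathrm{pop}}(t \mid \mathbf{x}, \mathbf{z})
  = 1 - \pi(\mathbf{z}) + \pi(\mathbf{z})\, S_u(t \mid \mathbf{x}),
\qquad
\pi(\mathbf{z}) = \frac{e^{\boldsymbol{\gamma}'\mathbf{z}}}{1 + e^{\boldsymbol{\gamma}'\mathbf{z}}},
\qquad
S_u(t \mid \mathbf{x}) = S_0(t)^{\exp(\boldsymbol{\beta}'\mathbf{x})},
```

where $\pi(\mathbf{z})$ is the probability of being uncured (the incidence, via logistic regression on covariates $\mathbf{z}$), $S_u$ is the latency survival function of uncured subjects under the Cox PH model with baseline $S_0(t)$, and the leading $1 - \pi(\mathbf{z})$ term is the cured fraction, which never experiences the event.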
Procedia PDF Downloads 144
332 Merits and Demerits of Participation of Fellow Examinee as Subjects in Observed Structured Practical Examination in Physiology
Authors: Mohammad U. A. Khan, Md. D. Hossain
Abstract:
Background: The Department of Physiology finds it difficult to arrange 'subjects' for practical procedures. To avoid this difficulty, fellow examinees from another group may be used as subjects. Objective: To find out the merits and demerits of using fellow examinees as subjects in practical procedures. Method: This cross-sectional descriptive study was conducted in the Department of Physiology, Noakhali Medical College, Bangladesh, during May-June 2014. Forty-two first-year undergraduate medical students from a selected public medical college of Bangladesh were enrolled purposively. Consent was obtained from the students and the authorities. Eighteen of them were selected as subjects and designated subject-examinees. The other fellow examinees (non-subjects) examined their blood pressure and pulse as part of an 'observed structured practical examination' (OSPE). The opinions of all examinees regarding the merits and demerits of using fellow examinees as subjects in the practical procedure were recorded. Result: Examinees stated that they could perform their practical procedure without nervousness (24/42, 57.14%) and accurately and comfortably (14/42, 33.33%), and that subjects were made available without wasting time (2/42, 4.76%). Nineteen students (45.24%) found no disadvantage, and 2 (4.76%) felt embarrassed when the subject was of the opposite sex. The subject-examinees narrated that they could learn from the errors made by their fellow examinees (11/18, 61.1%). Seventy-five percent of non-subject examinees expressed their willingness to be subjects so that they could learn from their fellows' errors. Conclusion: Using fellow examinees as subjects is beneficial for both the non-subject and subject examinees. Funding sources: Navana, Beximco, Unihealth, Square & Acme Pharma, Bangladesh Ltd.
Keywords: physiology, teaching, practical, OSPE
Procedia PDF Downloads 151
331 Comparative Evaluation of Pharmacologically Guided Approaches (PGA) to Determine Maximum Recommended Starting Dose (MRSD) of Monoclonal Antibodies for First Clinical Trial
Authors: Ibraheem Husain, Abul Kalam Najmi, Karishma Chester
Abstract:
First-in-human (FIH) studies are a critical step in the clinical development of any molecule that has shown therapeutic promise in preclinical evaluations; the transition from preclinical research and safety studies into clinical development is crucial for the successful development of monoclonal antibodies for the treatment of human diseases. Therefore, a comparison between the USFDA approach and nine pharmacologically guided approaches (PGA) (simple allometry, maximum life span potential, brain weight, rule of exponent (ROE), two-species methods, and one-species methods) was made to determine the maximum recommended starting dose (MRSD) for first-in-human clinical trials using four drugs, namely Denosumab, Bevacizumab, Anakinra, and Omalizumab. In our study, the predicted pharmacokinetic (PK) parameters and the estimated first-in-human doses of the antibodies were compared with the observed human values. The study indicated that the clearance and volume of distribution of antibodies can be predicted with reasonable accuracy in humans, and a good estimate of the first human dose can be obtained from the predicted human clearance and volume of distribution. A pictorial method evaluation chart, based on fold errors, was also developed for the simultaneous evaluation of the various methods.
Keywords: clinical pharmacology (CPH), clinical research (CRE), clinical trials (CTR), maximum recommended starting dose (MRSD), clearance and volume of distribution
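Simple allometry, the first of the PGA methods listed, scales a pharmacokinetic parameter by a power of body weight. A minimal sketch with hypothetical values; the exponent 0.75 is the conventional choice for clearance, and the animal figures are illustrative, not from the study:

```python
def allometric_value(v_animal, w_animal_kg, w_human_kg=70.0, exponent=0.75):
    # Simple allometry: V_human = V_animal * (W_human / W_animal) ** b,
    # with b ~ 0.75 for clearance and b ~ 1.0 for volume of distribution.
    return v_animal * (w_human_kg / w_animal_kg) ** exponent

# Hypothetical rat clearance of 10 mL/min scaled to a 70 kg human.
cl_human = allometric_value(10.0, w_animal_kg=0.25)
print(round(cl_human, 1))  # about 684.5 mL/min
```

The predicted human clearance and volume of distribution then feed the MRSD estimate, and the fold error (predicted over observed) is what the pictorial evaluation chart compares across methods.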
Procedia PDF Downloads 374
330 A Gradient Orientation Based Efficient Linear Interpolation Method
Authors: S. Khan, A. Khan, Abdul R. Soomrani, Raja F. Zafar, A. Waqas, G. Akbar
Abstract:
This paper proposes a low-complexity image interpolation method. Image interpolation is used to convert a low-resolution video/image into a high-resolution video/image. The objective of a good interpolation method is to upscale an image in such a way that it provides better edge preservation at very low complexity, so that real-time processing of video frames is possible. However, low-complexity methods tend to provide real-time interpolation at the cost of blurring, jagging, and other artifacts due to errors in slope calculation. Non-linear methods, on the other hand, provide better edge preservation, but at the cost of high complexity, and hence are far from real-time interpolation. The proposed method is a linear method that uses gradient orientation for slope calculation, unlike conventional linear methods, which use the contrast of nearby pixels. Prewitt edge detection is applied to separate uniform regions from edges. Simple line averaging is applied to unknown uniform regions, whereas unknown edge pixels are interpolated after calculating slopes from the gradient orientations of neighboring known edge pixels. As a post-processing step, a bilateral filter is applied to interpolated edge regions in order to enhance the interpolated edges.
Keywords: edge detection, gradient orientation, image upscaling, linear interpolation, slope tracing
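The simple line averaging applied to uniform regions can be illustrated in one dimension; this is a toy sketch of that single step, not the paper's full 2-D pipeline with edge detection and slope tracing:

```python
def upscale_row(row):
    # 2x upscaling of one scanline: insert one interpolated sample between
    # each pair of known pixels (simple line averaging, used in uniform regions).
    out = []
    for a, b in zip(row, row[1:]):
        out += [a, (a + b) / 2]
    out.append(row[-1])
    return out

print(upscale_row([10, 20, 40]))  # [10, 15.0, 20, 30.0, 40]
```

Along edges, the proposed method instead averages along the direction given by the gradient orientation, which is what avoids the jagging that plain averaging produces across an edge.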
Procedia PDF Downloads 260
329 Code-Switching as a Bilingual Phenomenon among Students in Prishtina International Schools
Authors: Festa Shabani
Abstract:
This paper aims at investigating bilingual speech in the international schools of Prishtina. More particularly, it seeks to analyze bilingual phenomena among adolescent students highly exposed to English, the language of instruction at school, in naturally occurring conversations within the school environment. Adolescence was deliberately chosen since it is regarded as an age when peer influence on language choice is the greatest. Driven by daily unsystematic observation and prior research, the hypothesis stated is that Albanian continues to be the dominant language among Prishtina international schools’ students, with many code-switched items from English. Furthermore, the students also use lexical borrowings (words already adapted in the receiving language from the language they have been in contact with) in their speech, often in the absence of existing Albanian equivalents or for other reasons. This happens because the language of instruction at school is English, and any topic related to the language they have been exposed to will trigger them to use English. Therefore, this needs special attention in an attempt to identify patterns in their speech; in this way, linguistic and socio-pragmatic factors will be considered when analyzing the motivations behind their language choice. The methodology for collecting data includes systematic participant observation and tape-recording. While observing the students in their natural conversations, the fieldworker also took notes, which helped transcribe details better. The paper starts by raising the question of whether code-switching occurs among Prishtina international schools’ students highly exposed to English. The data gathered from students in informal settings suggest that there are well-founded grounds for an affirmative answer. The participants in this study were observed to be code-switching, although showing differences in degree.
However, a generalization cannot be made on the basis of the findings, except insofar as it appears that English has, in turn, become a language to which the students turn when identifying with the group and when discussing particular school topics. In particular, participants seemed to use intra-sentential CS in cases where they found an English expression easier than an Albanian one when repeating or emphasizing a point, especially when urged to talk about educational issues with English being their language of instruction, and inter-sentential code-switching particularly when quoting others. Concerning the grammatical aspect of code-switching, intra-sentential CS is used more than inter-sentential CS. Regarding gender, the results show no significant differences in quantity between male and female participants, although a slight tendency for men to code-switch intra-sententially more than women was manifested. Similarly, a slight difference emerged in inter-sentential switching, which contributes 21% of the total number of switches for women but 11% of the total number of switches for men.
Keywords: Albanian, code-switching, contact linguistics, bilingual phenomena, lexical borrowing, English
Procedia PDF Downloads 127
328 Automated User Story Driven Approach for Web-Based Functional Testing
Authors: Mahawish Masud, Muhammad Iqbal, M. U. Khan, Farooque Azam
Abstract:
Manual writing of test cases from functional requirements is a time-consuming task. Such test cases are not only difficult to write but also challenging to maintain. Test cases can be drawn from functional requirements that are expressed in natural language. However, manual test case generation is inefficient and subject to errors. In this paper, we present a systematic procedure that can automatically derive test cases from user stories. The user stories are specified in a restricted natural language using a well-defined template. We also present a detailed methodology for writing our test-ready user stories. Our tool “Test-o-Matic” automatically generates the test cases by processing the restricted user stories. The generated test cases are executed using the open-source Selenium IDE. We evaluate our approach on a case study, an open-source web-based application. The effectiveness of our approach is evaluated by seeding faults in the case study using known mutation operators. Results show that test case generation from restricted user stories is a viable approach for automated testing of web applications.
Keywords: automated testing, natural language, restricted user story modeling, software engineering, software testing, test case specification, transformation and automation, user story, web application testing
Procedia PDF Downloads 387
327 Automatic Reporting System for Transcriptome Indel Identification and Annotation Based on Snapshot of Next-Generation Sequencing Reads Alignment
Authors: Shuo Mu, Guangzhi Jiang, Jinsa Chen
Abstract:
The analysis of Indels in RNA sequencing of clinical samples is easily affected by sequencing errors and software selection. In order to improve the efficiency and accuracy of the analysis, we developed an automatic reporting system for Indel recognition and annotation based on image snapshots of transcriptome read alignments. This system includes sequence local assembly and realignment, target-point snapshot, and image-based recognition processes. We integrated high-confidence Indel datasets from several known databases as a training set to improve the accuracy of image processing and added a bioinformatic processing module to annotate and filter Indel artifacts. Subsequently, the system automatically generates a report, including data quality levels and image results. Sanger sequencing verification of the reference Indel mutations of cell line NA12878 showed that the process can achieve 83% sensitivity and 96% specificity. Analysis of the collected clinical samples showed that the interpretation accuracy of the process was equivalent to that of manual inspection, and the processing efficiency showed a significant improvement. This work shows the feasibility of accurate Indel analysis of clinical next-generation sequencing (NGS) transcriptomes. The result may be useful for RNA studies of clinical samples with microsatellite instability in immunotherapy in the future.
Keywords: automatic reporting, indel, next-generation sequencing, NGS, transcriptome
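The reported 83% sensitivity and 96% specificity follow directly from the confusion counts of the Sanger verification. As a small illustration (the counts below are hypothetical, chosen only to reproduce those rates):

```python
def sensitivity(tp, fn):
    # True-positive rate: verified Indels the pipeline recovered.
    return tp / (tp + fn)

def specificity(tn, fp):
    # True-negative rate: non-Indel sites correctly left uncalled.
    return tn / (tn + fp)

# Hypothetical counts that reproduce the reported rates (83% / 96%).
sens = sensitivity(tp=83, fn=17)
spec = specificity(tn=96, fp=4)
```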
Procedia PDF Downloads 191
326 Easymodel: Web-based Bioinformatics Software for Protein Modeling Based on Modeller
Authors: Alireza Dantism
Abstract:
Presently, describing the function of a protein sequence is one of the most common problems in biology. Usually, this problem can be approached by studying the three-dimensional structure of the protein. In the absence of an experimental protein structure, comparative modeling often provides a useful three-dimensional model of the protein, based on at least one known protein structure. Comparative modeling predicts the three-dimensional structure of a given protein sequence (target) mainly based on its alignment with one or more proteins of known structure (templates). Comparative modeling consists of five main steps: (1) identification of similarity between the target sequence and at least one known template structure; (2) alignment of the target sequence and the template(s); (3) building a model based on the alignment with the selected template(s); (4) prediction of model errors; and (5) optimization of the built model. There are many computer programs and web servers that automate the comparative modeling process. One of the most important advantages of these servers is that they make comparative modeling available to both experts and non-experts, who can easily do their own modeling without programming knowledge; some experts, however, prefer to model manually with programming, because this allows them to maximize the accuracy of the modeling. In this study, a web-based tool has been designed to predict the tertiary structure of proteins using the PHP and Python programming languages. This tool is called EasyModel.
According to the user's inputs, EasyModel receives the unknown sequence of interest (the target), a structure file for a protein (the template) that shares a degree of sequence similarity with the target, and related parameters; it then predicts the tertiary structure of the target and presents the results as graphs and constructed protein structure files.
Keywords: structural bioinformatics, protein tertiary structure prediction, modeling, comparative modeling, modeller
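EasyModel's own code is not shown in the abstract; as a toy sketch of the first two steps of comparative modeling (template identification and alignment-based scoring), template selection by percent sequence identity over a pre-aligned, gap-free pair might look like this (sequences and template names are invented):

```python
def percent_identity(target, template):
    """Percent identity over an (assumed) gap-free, equal-length alignment."""
    matches = sum(a == b for a, b in zip(target, template))
    return 100.0 * matches / max(len(target), len(template))

def pick_template(target, templates):
    """Steps 1-2 of comparative modeling: choose the known structure whose
    sequence aligns best with the target."""
    return max(templates, key=lambda entry: percent_identity(target, entry[1]))

# Invented target and (PDB-id, sequence) template candidates.
target = "MKTAYIAKQR"
templates = [("1abc", "MKTAYIGKQR"), ("2xyz", "MLSAYIAKQA")]
best = pick_template(target, templates)
```

Steps 3-5 (model building, error prediction, optimization) are what Modeller itself automates on top of a selection like this.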
Procedia PDF Downloads 97
325 Machine Learning Models for the Prediction of Heating and Cooling Loads of a Residential Building
Authors: Aaditya U. Jhamb
Abstract:
Due to the current energy crisis that many countries are battling, energy-efficient buildings are the subject of extensive research in the modern technological era because of growing worries about energy consumption and its effects on the environment. The paper explores eight factors that help determine energy efficiency for a building: relative compactness, surface area, wall area, roof area, overall height, orientation, glazing area, and glazing area distribution, with Tsanas and Xifara providing the dataset. The dataset comprises 768 different residential building models used to anticipate heating and cooling loads with a low mean squared error. By optimizing these characteristics, machine learning algorithms may assess and properly forecast a building's heating and cooling loads, lowering energy usage while increasing quality of life. The paper therefore studied the magnitude of the correlation between these input factors and the two output variables using various statistical methods of analysis, after determining which input variable was most closely associated with the output loads. The most conclusive model was the Decision Tree Regressor, with a mean squared error of 0.258, whilst the least conclusive was the Isotonic Regressor, with a mean squared error of 21.68. The paper also investigated the KNN Regressor and Linear Regression, which had mean squared errors of 3.349 and 18.141, respectively. In conclusion, given the eight input variables, the model was able to predict the heating and cooling loads of a residential building accurately and precisely.
Keywords: energy efficient buildings, heating load, cooling load, machine learning models
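The regressors above come from standard machine learning libraries; as a self-contained toy illustration of how a tree-style regressor drives the mean squared error below a mean-prediction baseline, a one-split "decision stump" on synthetic data (not the Tsanas-Xifara set) can be written as:

```python
def mse(y_true, y_pred):
    """Mean squared error between targets and predictions."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def fit_stump(x, y):
    """One-split decision tree ('stump'): pick the threshold on x that
    minimizes training MSE, predicting the mean of each side."""
    best = None
    for thr in sorted(set(x)):
        left = [yi for xi, yi in zip(x, y) if xi <= thr]
        right = [yi for xi, yi in zip(x, y) if xi > thr]
        if not left or not right:
            continue  # a split must leave samples on both sides
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        pred = [lm if xi <= thr else rm for xi in x]
        err = mse(y, pred)
        if best is None or err < best[0]:
            best = (err, thr, lm, rm)
    return best  # (training MSE, threshold, left mean, right mean)

# Toy 'surface area -> heating load' data: small areas load ~10, large ~30.
x = [1.0, 1.1, 1.2, 3.0, 3.1, 3.2]
y = [10.0, 11.0, 9.0, 30.0, 31.0, 29.0]
err, thr, lm, rm = fit_stump(x, y)
baseline = mse(y, [sum(y) / len(y)] * len(y))
```

A full decision tree applies this split recursively; libraries such as scikit-learn implement the regressors compared in the paper.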
Procedia PDF Downloads 96
324 Challenges in Translating Malay Idiomatic Expressions: A Study
Authors: Nor Ruba’Yah Binti Abd Rahim, Norsyahidah Binti Jaafar
Abstract:
Translating Malay idiomatic expressions into other languages presents unique challenges due to the deep cultural nuances and linguistic intricacies embedded within these expressions. This study examined these challenges through a two-pronged methodology: a comparative analysis using survey questionnaires and a quiz administered to 50 semester-6 students taking the Translation 1 course, and in-depth interviews with their lecturers. The survey aimed to capture students’ experiences and difficulties in translating selected Malay idioms into English, highlighting common errors and misunderstandings. Complementing this, interviews with lecturers provided expert insights into the nuances of these expressions and effective translation strategies. The findings revealed that literal translations often fail to convey the intended meanings, underscoring the importance of cultural competence and contextual awareness. The study also identified key factors that contribute to successful translations, such as the translator’s familiarity with both source and target cultures and their ability to adapt expressions creatively. This research contributes to the field of translation studies by offering practical recommendations for improving the translation of idiomatic expressions, thereby enhancing cross-cultural communication. The insights gained from this study are valuable for translators, educators, and students, emphasizing the need for a nuanced approach that respects the cultural richness of the source language while ensuring clarity in the target language.
Keywords: idiomatic expressions, cultural competence, translation strategies, cross-cultural communication, students’ difficulties
Procedia PDF Downloads 14
323 Artificial Intelligence in the Design of a Retaining Structure
Authors: Kelvin Lo
Abstract:
Nowadays, numerical modelling in geotechnical engineering is very common but sophisticated. Many advanced input settings and considerable computational effort are required to optimize a design and reduce construction cost. Optimizing a design usually requires huge numerical models; if the optimization is conducted manually, there is a potentially dangerous consequence from human errors, and the time spent on input and on data extraction from output is significant. This paper presents an automation process for the numerical modelling (Plaxis 2D) of a trench excavation supported by a secant-pile retaining structure for a top-down tunnel project. Python code is adopted to control the process, and numerical modelling is conducted automatically at every 20 m chainage along the 200 m tunnel, with the maximum retained height occurring at the middle chainage. The Python code changes the geological stratum and excavation depth under groundwater flow conditions in each 20 m section. It automatically conducts trial and error to determine the required pile length and the use of props needed to achieve the required factor of safety and target displacement. Once the bending moment of the pile exceeds its capacity, the pile size is increased; when the pile embedment reaches the default maximum length, the prop system is turned on. Results showed that the process saves time, increases efficiency, lowers design costs, and replaces manual labor, minimizing error.
Keywords: automation, numerical modelling, Python, retaining structures
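The Plaxis 2D scripting calls are not reproduced in the abstract; the trial-and-error loop itself can be sketched with a stand-in analysis function and assumed design targets (all names and numbers below are illustrative, not the project's values):

```python
# Sketch of the automated trial-and-error loop described above. The real
# workflow drives Plaxis 2D through its scripting API; here `analyse` is a
# hypothetical stand-in returning (factor of safety, wall displacement in m).

TARGET_FOS = 1.4        # assumed design targets, for illustration only
MAX_DISP = 0.025        # m
MAX_PILE_LENGTH = 30.0  # m, default maximum embedment

def analyse(pile_length, props_on):
    """Mock response: longer piles and props improve FoS and displacement."""
    fos = 0.9 + 0.02 * pile_length + (0.3 if props_on else 0.0)
    disp = 0.06 - 0.0015 * pile_length - (0.01 if props_on else 0.0)
    return fos, disp

def design_section(step=2.0):
    """Lengthen the pile first; at maximum embedment, turn on the props."""
    pile, props = 10.0, False
    while True:
        fos, disp = analyse(pile, props)
        if fos >= TARGET_FOS and disp <= MAX_DISP:
            return pile, props
        if pile + step <= MAX_PILE_LENGTH:
            pile += step
        elif not props:
            props = True
        else:
            raise RuntimeError("no admissible design in this sketch")

pile_len, props_used = design_section()
```

The bending-moment check that triggers a pile size increase would slot into the same loop as an extra branch before the embedment check.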
Procedia PDF Downloads 51
322 Analytical Development of a Failure Limit and Iso-Uplift Curves for Eccentrically Loaded Shallow Foundations
Authors: N. Abbas, S. Lagomarsino, S. Cattari
Abstract:
Examining existing experimental results for shallow rigid foundations subjected to a vertical centric load (N), accompanied or not by a bending moment (M), two main non-linear mechanisms governing the cyclic response of the soil-foundation system can be distinguished: foundation uplift and soil yielding. A soil-foundation failure limit is defined as a domain of resistance in the two-dimensional (2D) load space (N, M) inside which lie all the admissible combinations of loads; the latter correspond to a purely elastic, non-linear elastic, or plastic behavior of the soil-foundation system, while the points lying on the failure limit correspond to combinations of loads leading to failure of the soil-foundation system. In this study, the proposed resistance domain is constructed analytically on the basis of mechanics. Original elastic-limit, uplift-initiation, and iso-uplift limits are constructed inside this domain. These limits predict the mechanisms activated for each combination of loads applied to the foundation. A comparison of the proposed failure limit with experimental tests in the literature shows interesting results. The developed uplift-initiation limit and iso-uplift curves are also compared with others already proposed in the literature and widely used in the absence of alternatives; remarkable differences are noted, revealing errors in the earlier proposals and confirming the accuracy of the limits given in the present work.
Keywords: foundation uplift, iso-uplift curves, resistance domain, soil yield
Procedia PDF Downloads 383
321 High-Resolution Spatiotemporal Retrievals of Aerosol Optical Depth from Geostationary Satellite Using Sara Algorithm
Authors: Muhammad Bilal, Zhongfeng Qiu
Abstract:
Aerosols, suspended particles in the atmosphere, play an important role in the Earth's energy budget, climate change, degradation of atmospheric visibility, urban air quality, and human health. To fully understand aerosol effects, retrieval of aerosol optical properties such as aerosol optical depth (AOD) at high spatiotemporal resolution is required. Therefore, in the present study, hourly AOD observations at 500 m resolution were retrieved from the Geostationary Ocean Color Imager (GOCI) using the Simplified Aerosol Retrieval Algorithm (SARA) over the urban area of Beijing for the year 2016. The SARA requires top-of-atmosphere (TOA) reflectance, solar and sensor geometry information, and surface reflectance observations to retrieve an accurate AOD. For validation of the GOCI-retrieved AOD, AOD measurements were obtained from the Aerosol Robotic Network (AERONET) version 3 level 2.0 (cloud-screened and quality-assured) data. The errors and uncertainties were reported using the root mean square error (RMSE), relative percent mean error (RPME), and the expected error (EE = ±(0.05 + 0.15·AOD)). Results showed that the high spatiotemporal resolution GOCI AOD observations were well correlated with the AERONET AOD measurements, with a correlation coefficient (R) of 0.92, an RMSE of 0.07, and an RPME of 5%, and 90% of the observations were within the EE. The results suggest that the SARA is robust and able to retrieve high-resolution spatiotemporal AOD observations over urban areas using a geostationary satellite.
Keywords: AERONET, AOD, SARA, GOCI, Beijing
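The expected-error check is a simple envelope test: a retrieval counts as "good" when it falls within ±(0.05 + 0.15·AOD) of the AERONET reference. A minimal sketch, with hypothetical matched GOCI/AERONET pairs:

```python
def within_expected_error(aod_retrieved, aod_reference):
    """Expected-error envelope EE = ±(0.05 + 0.15 * AOD), evaluated against
    the ground-truth (AERONET) value."""
    ee = 0.05 + 0.15 * aod_reference
    return abs(aod_retrieved - aod_reference) <= ee

# Hypothetical matched (GOCI retrieval, AERONET reference) pairs.
pairs = [(0.32, 0.30), (0.85, 0.80), (0.10, 0.45), (0.55, 0.50)]
fraction = sum(within_expected_error(r, a) for r, a in pairs) / len(pairs)
```

The paper's "90% within the EE" statistic is exactly this fraction computed over all collocated observations.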
Procedia PDF Downloads 171
320 Ambivalence as Ethical Practice: Methodologies to Address Noise, Bias in Care, and Contact Evaluations
Authors: Anthony Townsend, Robyn Fasser
Abstract:
While complete objectivity is a desirable scientific position from which to conduct a care and contact evaluation (CCE), it is precisely the recognition that we are inherently incapable of operating objectively that is the foundation of ethical practice and skilled assessment. Drawing upon recent research by Daniel Kahneman (2021) on the differences between noise and bias, as well as the inherent biases collectively termed “The Elephant in the Brain” by Kevin Simler and Robin Hanson (2019, Oxford University Press), this presentation addresses the various ways in which our judgments, perceptions, and even procedures can be distorted and contaminated while conducting a CCE, and also considers the value of second-order cybernetics and the psychodynamic concept of ‘ambivalence’ as a conceptual basis for assessment methodologies that limit such errors, or at least identify them more readily. Both a conceptual framework for ambivalence, our higher-order capacity to allow for the convergence and consideration of multiple emotional experiences and cognitive perceptions to inform our reasoning, and a practical methodology for assessment relying on data triangulation, Bayesian inference, and hypothesis testing are presented as a means of promoting ethical practice for health care professionals conducting CCEs. An emphasis on widening awareness and perspective, limiting ‘splitting’, is demonstrated both in how this form of emotional processing plays out in alienating family dynamics and in the assessment thereof. In addressing this concept, this presentation aims to illuminate the value of ambivalence as foundational to ethical practice for assessors.
Keywords: ambivalence, forensic, psychology, noise, bias, ethics
Procedia PDF Downloads 87
319 Computer Countenanced Diagnosis of Skin Nodule Detection and Histogram Augmentation: Extracting System for Skin Cancer
Authors: S. Zith Dey Babu, S. Kour, S. Verma, C. Verma, V. Pathania, A. Agrawal, V. Chaudhary, A. Manoj Puthur, R. Goyal, A. Pal, T. Danti Dey, A. Kumar, K. Wadhwa, O. Ved
Abstract:
Background: Skin cancer has become a pressing issue in medical science, and its spread is seriously affecting health and well-being worldwide. Methods: The extracted image of the skin tumor cannot be used directly for diagnosis, as the stored image contains artifacts around the lesion center. This approach locates the region of interest in the extracted skin image, and image segmentation models are applied to remove such disturbances from the picture. Results: After segmentation, feature extraction is performed using a genetic algorithm (GA), and finally, classification is performed between the trained and test data to evaluate the images, helping doctors make the right prediction. To improve on the existing system, we set our objectives with an analysis: the efficiency of the selection process and histogram enrichment are essential in this respect, and the GA is applied to reduce the false-positive rate while maintaining accuracy. Conclusions: The objective of this work is to improve effectiveness, and the GA accomplishes this task by bringing down the false-positive rate. The paper combines deep learning with medical image processing, which provides superior accuracy, and the proportional handling of image types supports reusability without errors.
Keywords: computer-aided system, detection, image segmentation, morphology
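The authors' GA implementation is not given in the abstract; to illustrate the general idea of evolving a segmentation parameter, a toy genetic algorithm that searches for an intensity threshold separating lesion from skin pixels (synthetic data, simple selection-plus-mutation scheme) might look like:

```python
import random

# Toy 'image': darker skin pixels (~60) plus a brighter lesion (~200). A tiny
# genetic algorithm evolves an intensity threshold separating the two classes.
# Illustrative sketch only, not the authors' implementation.
random.seed(0)
pixels = [random.gauss(60, 10) for _ in range(200)] + \
         [random.gauss(200, 10) for _ in range(50)]

def fitness(thr):
    """Between-class separation (Otsu-like): weighted squared distance of means."""
    low = [p for p in pixels if p < thr]
    high = [p for p in pixels if p >= thr]
    if not low or not high:
        return 0.0  # a threshold outside the data separates nothing
    mean_low = sum(low) / len(low)
    mean_high = sum(high) / len(high)
    return len(low) * len(high) * (mean_high - mean_low) ** 2

def evolve(generations=30, pop_size=20):
    """Selection + mutation; the fittest half survives each generation."""
    pop = [random.uniform(0.0, 255.0) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]                            # selection
        children = [p + random.gauss(0.0, 5.0) for p in parents]  # mutation
        pop = parents + children
    return max(pop, key=fitness)

threshold = evolve()
```

Because the fittest candidates are always carried over, the best threshold found never degrades across generations; a crossover step could be added to the same loop.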
Procedia PDF Downloads 150
318 Robust Heart Rate Estimation from Multiple Cardiovascular and Non-Cardiovascular Physiological Signals Using Signal Quality Indices and Kalman Filter
Authors: Shalini Rankawat, Mansi Rankawat, Rahul Dubey, Mazad Zaveri
Abstract:
Physiological signals such as the electrocardiogram (ECG) and arterial blood pressure (ABP) in the intensive care unit (ICU) are often seriously corrupted by noise, artifacts, and missing data, which lead to errors in the estimation of heart rate (HR) and incidences of false alarms from ICU monitors. Clinical support in the ICU requires the most reliable heart rate estimation possible. Cardiac activity, because of its relatively high electrical energy, may introduce artifacts into electroencephalogram (EEG), electrooculogram (EOG), and electromyogram (EMG) recordings. This paper presents a robust heart rate estimation method based on detection of the R-peaks of ECG artifacts in EEG, EMG, and EOG signals, using an energy-based function and a novel signal quality index (SQI) assessment technique. SQIs of the physiological signals (EEG, EMG, and EOG) were obtained by correlating the nonlinear energy operator (Teager energy) of these signals with either the ECG or the ABP signal. HR is estimated from the ECG, ABP, EEG, EMG, and EOG signals by separate Kalman filters based upon the individual SQIs. Data fusion of the HR estimates is then performed by weighting each estimate by the corresponding Kalman filter's SQI-modified innovations. The fused HR estimate is more accurate and robust than any of the individual estimates. The method was evaluated on the MIMIC II database from PhysioNet, recorded from bedside monitors of ICU patients, and provides an accurate HR estimate even in the presence of noise and artifacts.
Keywords: ECG, ABP, EEG, EMG, EOG, ECG artifacts, Teager-Kaiser energy, heart rate, signal quality index, Kalman filter, data fusion
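As a sketch of the SQI idea (not the authors' exact formulation), the Teager-Kaiser energy of two signals can be correlated to score how strongly a channel carries the cardiac component; a channel containing a scaled copy of the ECG artifact scores near 1, while an unrelated noise channel scores much lower (toy signals below):

```python
import math
import random

def teager(x):
    """Teager-Kaiser energy operator: psi[n] = x[n]^2 - x[n-1]*x[n+1]."""
    return [x[n] ** 2 - x[n - 1] * x[n + 1] for n in range(1, len(x) - 1)]

def correlation(a, b):
    """Pearson correlation, used here as a signal quality index (SQI)."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

# Toy traces: an ECG-like waveform, an 'EEG' lead carrying a scaled copy of
# the cardiac artifact, and a channel of unrelated noise (illustration only).
random.seed(1)
n = 200
ecg = [math.sin(2 * math.pi * 1.2 * t / 100) ** 3 for t in range(n)]
eeg_with_artifact = [0.5 * v for v in ecg]
noise = [random.gauss(0.0, 1.0) for _ in range(n)]

sqi_good = correlation(teager(ecg), teager(eeg_with_artifact))
sqi_bad = correlation(teager(ecg), teager(noise))
```

In the fusion step, a channel's Kalman-filter HR estimate would then be weighted by an SQI like `sqi_good`, so clean artifact-bearing channels dominate the fused estimate.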
Procedia PDF Downloads 696
317 Applying Image Schemas and Cognitive Metaphors to Teaching/Learning Italian Preposition a in Foreign/Second Language Context
Authors: Andrea Fiorista
Abstract:
The learning of prepositions is a quite problematic aspect of foreign language instruction, and Italian is certainly not an exception. In their prototypical function, prepositions express schematic relations between two entities in a highly abstract, typically image-schematic way. In other words, prepositions encode concepts such as directionality, the collocation of objects in space and time and, in Cognitive Linguistics’ terms, the position of a trajector with respect to a landmark. Learners of different native languages may conceptualize them differently, implying that they are supposed to operate a recategorization (or create new categories) fitting the target language. However, most current Italian Foreign/Second Language handbooks and didactic grammars do not help learners carry out this task, as they tend to provide partial and idiosyncratic descriptions, with the consequent effort by learners to memorize them, most of the time without success. In their prototypical meaning, prepositions are used to specify precise topographical positions in the physical environment, which become less and less accurate as they radiate out from what might be termed a concrete prototype. Accordingly, the present study aims to elaborate a cognitive and conceptually well-grounded analysis of some extensive uses of the Italian preposition a, in order to propose effective pedagogical solutions for the teaching/learning process. Image schemas, cognitive metaphors, and embodiment represent efficient cognitive tools for a task like this. While learning the merely spatial use of the preposition a (e.g. Sono a Roma = I am in Rome; vado a Roma = I am going to Rome, …) is quite straightforward, it is more complex when a appears in constructions such as verb of motion + a + infinitive (e.g. Vado a studiare = I am going to study), the inchoative periphrasis (e.g. Tra poco mi metto a leggere = In a moment I will start reading), or the causative construction (e.g.
Lui mi ha mandato a lavorare = He sent me to work). The study reports data from a Focus-on-Form teaching intervention in which a basic cognitive schema is used to help teachers and students respectively explain and understand the extensive uses of a. The educational material employed translates Cognitive Linguistics’ theoretical assumptions, such as image schemas and cognitive metaphors, into simple images or proto-scenes easily comprehensible to learners. Illustrative material, indeed, is supposed to make metalinguistic content more accessible. Moreover, the concept of embodiment is applied pedagogically through activities that involve motion and the learners’ bodies. It is expected that replacing rote learning with a methodology that gives grammatical elements a proper meaning makes the learning process more effective in both the short and the long term.
Keywords: cognitive approaches to language teaching, image schemas, embodiment, Italian as FL/SL
Procedia PDF Downloads 87
316 Internal Audit and the Effectiveness and Efficiency of Operations in Hospitals
Authors: Naziru Suleiman
Abstract:
The ever-increasing cases of financial fraud and corporate accounting scandals in recent years have raised concern about the operation of internal control mechanisms and the performance of internal audit departments in organizations. In many cases, the seeming presence of both an internal control system and internal audit in an organization proves of little use, as frauds, errors, and irregularities are still perpetrated. The aim of this study, therefore, is to assess the role of internal audit in achieving the objectives of the internal control system of federal hospitals in Kano State from the perception of the respondents. The study used a survey research design and generated data from a primary source by means of a questionnaire. A total of 100 copies of the questionnaire were administered, out of which 68 were duly completed and returned. Cronbach’s alpha was used to test the internal validity of the various items in the constructs. Descriptive statistics, the chi-square test, the Mann-Whitney U test, and the Kruskal-Wallis ANOVA were employed for the analysis of data. The study finds that, from the perception of the respondents, internal audit departments in federal hospitals in Kano State are effective and contribute positively to the overall attainment of the objectives of the internal control system of these hospitals. No significant difference was found in the views of the respondents from the three hospitals. Hence, the study concludes that a strong and functional internal audit department is a basic requirement for the effectiveness of the internal control system. In light of the findings, it is recommended that internal audit continue to ensure that the objectives of the internal control system of these hospitals are achieved through proper and adequate evaluation and review of the system.
Keywords: internal audit, internal control, federal hospitals, financial frauds
Procedia PDF Downloads 353
315 Design of an Improved Distributed Framework for Intrusion Detection System Based on Artificial Immune System and Neural Network
Authors: Yulin Rao, Zhixuan Li, Burra Venkata Durga Kumar
Abstract:
Intrusion detection refers to monitoring the actions of internal and external intruders on a system and detecting behaviours that violate security policies in real time. In intrusion detection, there has been much discussion about the application of neural network technology and artificial immune systems (AIS). However, many solutions use static methods (signature-based and stateful protocol analysis) or centralized intrusion detection systems (CIDS), which are unsuitable for real-time intrusion detection systems that need to process large amounts of data and detect unknown intrusions. This article proposes a framework for a multi-agent distributed intrusion detection system (DIDS) based on the concepts of AIS and neural network technology to detect anomalies and intrusions. In this framework, multiple agents are assigned to each host and work together, improving the system's detection efficiency and robustness. The trainer agent in the central server of the framework uses an artificial neural network (ANN) rather than the negative selection algorithm of AIS to generate mature detectors. Mature detectors can distinguish between self-files and non-self-files after learning. Our analyzer agents use genetic algorithms to generate memory-cell detectors. This kind of detector effectively reduces false-positive and false-negative errors and acts quickly on known intrusions.
Keywords: artificial immune system, distributed artificial intelligence, multi-agent, intrusion detection system, neural network
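The trainer agent's ANN is not specified in detail in the abstract; as a minimal stand-in, a single perceptron trained on toy two-dimensional activity vectors (hypothetical features such as normalized connection rate and failed-login rate) can play the role of a "mature detector" separating self from non-self:

```python
# Minimal stand-in for the trainer agent: a perceptron learns to separate
# 'self' activity vectors (label 0) from 'non-self' ones (label 1).
# The feature values are invented, for illustration only.

self_samples = [([0.1, 0.2], 0), ([0.2, 0.1], 0), ([0.15, 0.25], 0)]
nonself_samples = [([0.9, 0.8], 1), ([0.8, 0.9], 1), ([0.85, 0.7], 1)]
training_set = self_samples + nonself_samples

def train(data, lr=0.5, epochs=50):
    """Classic perceptron rule: nudge the weights on every misclassification."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, label in data:
            pred = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
            err = label - pred
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def detect(w, b, x):
    """A 'mature detector': flags non-self (intrusion-like) vectors."""
    return w[0] * x[0] + w[1] * x[1] + b > 0

w, b = train(training_set)
```

An AIS negative-selection trainer would instead generate random detectors and discard any that match self samples; the framework above replaces that step with learned weights.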
Procedia PDF Downloads 109
314 Working Memory and Phonological Short-Term Memory in the Acquisition of Academic Formulaic Language
Authors: Zhicheng Han
Abstract:
This study examines the correlation between knowledge of formulaic language, working memory (WM), and phonological short-term memory (PSTM) in Chinese L2 learners of English. It investigates whether WM and PSTM correlate differently with the acquisition of formulaic language, which may be relevant to the discourse around the conceptualization of formulas. Connectionist approaches have led scholars to argue that formulas are form-meaning connections stored whole, making PSTM significant in the acquisitional process as it pertains to the storage and retrieval of chunk information. Generativist scholars, on the other hand, have argued for active participation of interlanguage grammar in the acquisition and use of formulaic language, where formulas are represented in the mind but retain an internal structure built around a lexical core. This would make WM, especially the processing component of WM, an important cognitive factor, since it plays a role in processing and holding information for further analysis and manipulation. The current study asked L1 Chinese learners of English enrolled in graduate programs in China to complete a preference ranking task in which they ranked their preference for formulas, grammatical non-formulaic expressions, and ungrammatical phrases with and without the lexical core in academic contexts. Participants were asked to rank the options by how likely they would be to encounter these phrases in the test sentences within academic contexts. Participants’ syntactic proficiency was controlled with a cloze test and a grammar test. Regression analysis found a significant relationship between the processing component of WM and preference for formulaic expressions in the preference ranking task, while no significant correlation was found for PSTM or syntactic proficiency. The correlational analysis found that WM, PSTM, and the two proficiency test scores significantly covary.
However, WM and PSTM have different predictive values for participants’ preference for formulaic language. Both the storage and processing components of WM are significantly correlated with the preference for formulaic expressions, while PSTM is not. These findings favor a role for interlanguage grammar and syntactic knowledge in the acquisition of formulaic expressions. The differing effects of WM and PSTM suggest that selective attention to, and processing of, the input beyond simple retention play a key role in successfully acquiring formulaic language. Similar correlational patterns were found for preferring the ungrammatical phrase with the lexical core of the formula over the ones without the lexical core, attesting to learners’ awareness of the lexical core around which formulas are constructed. These findings support the view that formulaic phrases retain internal syntactic structures that are recognized and processed by the learners.
Keywords: formulaic language, working memory, phonological short-term memory, academic language
Procedia PDF Downloads 63