Search results for: large language models
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 15821

13331 Supervised Machine Learning Approach for Studying the Effect of Different Joint Sets on Stability of Mine Pit Slopes Under the Presence of Different External Factors

Authors: Sudhir Kumar Singh, Debashish Chakravarty

Abstract:

Slope stability analysis is an important aspect of geotechnical engineering. It is also important from a safety and economic point of view, as any slope failure leads to the loss of valuable lives and damage to property worth millions. This paper aims at mitigating the risk of slope failure by studying the effect of different joint sets on the stability of mine pit slopes under the influence of various external factors, namely degree of saturation, rainfall intensity, and seismic coefficients. A supervised machine learning approach has been utilized for making accurate and reliable predictions regarding the stability of slopes based on the value of the Factor of Safety. Numerous cases have been studied for analyzing the stability of slopes using the popular Finite Element Method, and the data thus obtained have been used as training data for the supervised machine learning models. The input data have been trained on different supervised machine learning models, namely Random Forest, Decision Tree, Support Vector Machine, and XGBoost. Distinct test data not present in the training data have been used for measuring the performance and accuracy of the different models. Although all models performed well on the test dataset, Random Forest stands out due to its accuracy of greater than 95%, providing a valuable tool at our disposal that is neither computationally expensive nor time-consuming and is in good accordance with the numerical analysis results.
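
The prediction step can be sketched as follows (Python, scikit-learn); the synthetic Factor-of-Safety relation, the feature ranges, and the stability threshold of 1.0 are illustrative assumptions standing in for the FEM-generated training data:

```python
# Sketch: classify slope stability from FEM-style features with a Random Forest.
# The FoS formula and threshold below are invented stand-ins for FEM output.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000
# Hypothetical inputs: degree of saturation, rainfall intensity, seismic coefficient.
saturation = rng.uniform(0, 1, n)
rainfall = rng.uniform(0, 50, n)
seismic = rng.uniform(0, 0.3, n)
# Toy Factor of Safety: decreases with each destabilizing factor (illustrative only).
fos = 1.8 - 0.5 * saturation - 0.01 * rainfall - 2.0 * seismic + rng.normal(0, 0.05, n)
stable = (fos >= 1.0).astype(int)  # label: 1 = stable, 0 = unstable

X = np.column_stack([saturation, rainfall, seismic])
X_tr, X_te, y_tr, y_te = train_test_split(X, stable, test_size=0.3, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
accuracy = model.score(X_te, y_te)  # held-out accuracy, analogous to the paper's test set
```

Once trained, such a model replaces a full FEM run for quick stability screening of new parameter combinations.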

Keywords: finite element method, geotechnical engineering, machine learning, slope stability

Procedia PDF Downloads 88
13330 The Translation of Original Metaphor in Literature

Authors: Esther Matthews

Abstract:

This paper looks at ways of translating new metaphors: those conceived and created by authors, which are often called ‘original’ metaphors in the world of Translation Studies. An original metaphor is the most extreme form of figurative language, often dramatic and shocking in effect. It displays unexpected juxtapositions of language, suggesting there could be as many different translations as there are translators. However, some theorists say original metaphors should be translated ‘literally’ or ‘word for word’ as far as possible, suggesting a similarity between translators’ solutions. How do literary translators approach this challenge? This study focuses on Spanish-English translations of a novel full of original metaphors: Nada by Carmen Laforet (1921 – 2004). Original metaphors from the text were compared to the four published English translations by Inez Muñoz, Charles Franklin Payne, Glafyra Ennis, and Edith Grossman. These four translators employed a variety of translation methods, but they translated ‘literally’ in well over half of the original metaphors studied. In a two-part translation exercise and questionnaire, professional literary translators were asked to translate a number of these metaphors. Many different methods were employed, but again, over half of the original metaphors were translated literally. Although this investigation was limited to one author and language pair, it gives a clear indication that, although literary translators’ solutions vary, on the whole, they prefer to translate original metaphors as literally as possible within the confines of English grammar and syntax. It also reveals literary translators’ desire to reproduce the distinctive character of an author’s work as accurately as possible for the target reader.

Keywords: translation, original metaphor, literature, translator training

Procedia PDF Downloads 256
13329 Predicting the Impact of Scope Changes on Project Cost and Schedule Using Machine Learning Techniques

Authors: Soheila Sadeghi

Abstract:

In the dynamic landscape of project management, scope changes are an inevitable reality that can significantly impact project performance. These changes, whether initiated by stakeholders, external factors, or internal project dynamics, can lead to cost overruns and schedule delays. Accurately predicting the consequences of these changes is crucial for effective project control and informed decision-making. This study aims to develop predictive models to estimate the impact of scope changes on project cost and schedule using machine learning techniques. The research utilizes a comprehensive dataset containing detailed information on project tasks, including the Work Breakdown Structure (WBS), task type, productivity rate, estimated cost, actual cost, duration, task dependencies, scope change magnitude, and scope change timing. Multiple machine learning models are developed and evaluated to predict the impact of scope changes on project cost and schedule. These models include Linear Regression, Decision Tree, Ridge Regression, Random Forest, Gradient Boosting, and XGBoost. The dataset is split into training and testing sets, and the models are trained using the preprocessed data. Cross-validation techniques are employed to assess the robustness and generalization ability of the models. The performance of the models is evaluated using metrics such as Mean Squared Error (MSE) and R-squared. Residual plots are generated to assess the goodness of fit and identify any patterns or outliers. Hyperparameter tuning is performed to optimize the XGBoost model and improve its predictive accuracy. The feature importance analysis reveals the relative significance of different project attributes in predicting the impact on cost and schedule. Key factors such as productivity rate, scope change magnitude, task dependencies, estimated cost, actual cost, duration, and specific WBS elements are identified as influential predictors. 
The study highlights the importance of considering both cost and schedule implications when managing scope changes. The developed predictive models provide project managers with a data-driven tool to proactively assess the potential impact of scope changes on project cost and schedule. By leveraging these insights, project managers can make informed decisions, optimize resource allocation, and develop effective mitigation strategies. The findings of this research contribute to improved project planning, risk management, and overall project success.
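
A minimal sketch of this modelling pipeline in Python; the synthetic cost-overrun relation below is an invented stand-in for the project dataset, and only two of the listed model families are shown:

```python
# Sketch: predict cost overrun from scope-change attributes with cross-validated
# regression models. The data-generating relationship is an illustrative assumption.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 500
magnitude = rng.uniform(0, 0.5, n)    # scope change as a fraction of baseline scope
timing = rng.uniform(0, 1, n)         # 0 = project start, 1 = project end
productivity = rng.uniform(0.5, 1.5, n)
# Toy ground truth: late, large changes on low-productivity tasks cost more.
cost_overrun = 100 * magnitude * (1 + timing) / productivity + rng.normal(0, 2, n)

X = np.column_stack([magnitude, timing, productivity])
# 5-fold cross-validated R^2, as in the paper's evaluation protocol.
r2_linear = cross_val_score(LinearRegression(), X, cost_overrun,
                            cv=5, scoring="r2").mean()
r2_gbm = cross_val_score(GradientBoostingRegressor(random_state=0), X, cost_overrun,
                         cv=5, scoring="r2").mean()
```

The boosted model captures the nonlinear interaction between change magnitude and timing that a plain linear fit smooths over.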

Keywords: cost impact, machine learning, predictive modeling, schedule impact, scope changes

Procedia PDF Downloads 17
13328 Functional and Efficient Query Interpreters: Principle, Application and Performance Comparison

Authors: Laurent Thiry, Michel Hassenforder

Abstract:

This paper presents a general approach to implementing efficient query interpreters in a functional programming language. Most of the standard tools currently available use an imperative and/or object-oriented language for the implementation (e.g., Java for Jena-Fuseki), but other paradigms are possible, with possibly better performance. To proceed, the paper first explains how to model data structures and queries from a functional point of view. It then proposes a general methodology for measuring performance (i.e., the number of computation steps needed to answer a query) and explains how to integrate some optimization techniques (short-cut fusion and, more importantly, data transformations). Finally, it compares the proposed functional server to a standard tool (Fuseki), demonstrating that the former can be two to ten times faster at answering queries.
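
To illustrate the functional style, here is a toy query interpreter in Python; lazy generator pipelines play the role that short-cut fusion plays in a functional language (no intermediate result list is ever materialized), and the triple store and query are invented for the example:

```python
# Sketch of a functional query interpreter over RDF-like triples.
# Each stage is a function from stream to stream; composing them gives a
# single fused pass over the data, a Python analogue of short-cut fusion.
triples = [
    ("alice", "knows", "bob"),
    ("bob", "knows", "carol"),
    ("alice", "age", "30"),
]

def scan(store):
    yield from store

def match(pattern):
    s, p, o = pattern
    def step(stream):
        for t in stream:
            # None acts as a wildcard in the pattern
            if all(q is None or q == v for q, v in zip((s, p, o), t)):
                yield t
    return step

def project(index):
    def step(stream):
        for t in stream:
            yield t[index]
    return step

def run(store, *stages):
    stream = scan(store)
    for stage in stages:
        stream = stage(stream)
    return list(stream)

# "Whom does alice know?" -- filtering and projection fuse into one traversal.
result = run(triples, match(("alice", "knows", None)), project(2))
```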

Keywords: data transformation, functional programming, information server, optimization

Procedia PDF Downloads 145
13327 Churn Prediction for Savings Bank Customers: A Machine Learning Approach

Authors: Prashant Verma

Abstract:

Commercial banks are facing immense pressure, including financial disintermediation, interest rate volatility, and digital modes of finance. Retaining an existing customer is 5 to 25 times less expensive than acquiring a new one. This paper explores customer churn prediction based on various statistical and machine learning models and uses under-sampling to improve the predictive power of these models. The results show that, among the various machine learning models, Random Forest, which predicts churn with 78% accuracy, is the most powerful model for this scenario. Customer vintage, customer age, average balance, occupation code, population code, average withdrawal amount, and average number of transactions were found to be the variables with high predictive power for the churn prediction model. The model can be deployed by commercial banks to counter customer churn so that they may retain the funds kept by savings bank (SB) customers. The article suggests a customized campaign to be initiated by commercial banks to avoid SB customer churn. By providing better customer satisfaction and experience, commercial banks can limit customer churn and maintain their deposits.
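
Under-sampling, the class-balancing step used here, can be sketched as follows; the 950/50 retained-vs-churned split is an illustrative imbalance, not the paper's bank data:

```python
# Sketch: random under-sampling of the majority class before churn modelling.
# Churn datasets are heavily imbalanced, so the majority (retained) class is
# downsampled to the size of the minority (churned) class before training.
import random

def undersample(rows, label_of, seed=0):
    """Downsample every class to the size of the rarest class."""
    by_class = {}
    for row in rows:
        by_class.setdefault(label_of(row), []).append(row)
    n_min = min(len(members) for members in by_class.values())
    rng = random.Random(seed)
    balanced = []
    for members in by_class.values():
        balanced.extend(rng.sample(members, n_min))
    return balanced

# 950 retained customers vs 50 churners -- a typical churn imbalance.
data = [("cust", 0)] * 950 + [("cust", 1)] * 50
balanced = undersample(data, label_of=lambda row: row[1])
```

The balanced sample then feeds the usual model-fitting step (Random Forest in the paper's case).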

Keywords: savings bank, customer churn, customer retention, random forests, machine learning, under-sampling

Procedia PDF Downloads 125
13326 EFL Learners’ Perceptions in Using Online Tools in Developing Writing Skills

Authors: Zhikal Qadir Salih, Hanife Bensen

Abstract:

As modern technology continues to make a towering impact on every sphere of life, language learning in general, and writing skills in particular, are no exception. This study aimed at finding out how EFL learners perceive online tools for improving their writing skills. The study was carried out at Tishk University. Copies of a questionnaire were distributed to the participants in order to elicit their perceptions. The collected data were subjected to descriptive and inferential statistics. The outcome revealed that the participants have positive perceptions about using online tools to enhance their writing skills. The study found, however, that neither the gender nor the class level of the participants makes any significant difference in their perceptions about the use of online tools, as far as writing skills are concerned. Based on these outcomes, relevant recommendations were made.

Keywords: online tools, writing skills, EFL learners, language learning

Procedia PDF Downloads 87
13325 Educational Practices and Brain Based Language Learning

Authors: Dur-E- Shahwar

Abstract:

Much attention has been given to 'bridging the gap' between neuroscience and educational practice. In order to gain a better understanding of the nature of this gap and of possibilities to enable the linking process, we have taken a boundary perspective on these two fields and the brain-based learning approach, focusing on boundary-spanning actors, boundary objects, and boundary work. In 26 semi-structured interviews, neuroscientists and education professionals were asked about their perceptions regarding the gap between science and practice and the role they play in creating, managing, and disrupting this boundary. Neuroscientists and education professionals often hold conflicting views and expectations of both brain-based learning and of each other. This leads us to argue that there are increased prospects for a neuro-scientifically informed learning practice if science and practice work together as equal stakeholders in developing and implementing neuroscience research.

Keywords: language learning, explore, educational practices, mentalist, practice

Procedia PDF Downloads 321
13324 A Traditional Settlement in a Modernized City: Yanbu, Saudi Arabia

Authors: Hisham Mortada

Abstract:

Transition in the urban configuration of Arab cities has never been as radical and visible as it has been since the turn of the last century. The emergence of new cities near the historical settlements of Arabia has spawned a series of developments in and around the old city precincts. New developments are based on advanced technology and conform to globally prevalent standards of city planning, superseding the vernacular arrangements based on traditional norms that guided so-called 'city planning'. Evidence of this fact are the extant Arab buildings present at the urban core of modern cities, which inform us about intricate spatial organization: organization that subscribed to multiple norms, such as satisfying gender segregation and socialization, economic sustainability, and ensuring security and environmental coherence within settlement compounds. Several participating factors achieved harmony in such an inclusive city, an organization that was challenged and apparently replaced by the new planning order in the face of the growing needs of globalized, economy-centric and high-tech models of development. Communities found it difficult to acclimatize to the new Western planning models that were implemented at a very large scale throughout the Kingdom, which later underwent spatial restructuring to suit users' needs. A closer look at the ancient city of Yanbu, now flanked by such new developments, allows us to differentiate and track the beginnings of this unprecedented transition in settlement formations. This paper aims to elaborate the Arabian context offered to both the 'traditional' and 'modern' planning approaches, in order to understand the challenges and solutions offered by each at different times. In the process, it will also establish the inconsistencies and conflicts that arose with the shift in planning paradigm, from traditional 'cultural norms' to modern 'physical planning', in the Arabian context.
Thus, by distinguishing the two divergent planning philosophies, their impact on Arabian morphology, their relevance to lifestyle, and their suitability to the biophysical environment, it concludes with a perspective on sustainability, particularly in the case of Yanbu.

Keywords: Yanbu, traditional architecture, Hijaz, coral building, Saudi Arabia

Procedia PDF Downloads 298
13323 Adsorption of Cerium as One of the Rare Earth Elements Using Multiwall Carbon Nanotubes from Aqueous Solution: Modeling, Equilibrium and Kinetics

Authors: Saeb Ahmadi, Mohsen Vafaie Sefti, Mohammad Mahdi Shadman, Ebrahim Tangestani

Abstract:

Carbon nanotubes have shown great potential for the removal of various inorganic and organic components due to properties such as a large surface area and high adsorption capacity. Central composite design is a widely used method for determining optimal conditions. Rare earth elements are also important components for economic reasons and their wide application. The adsorption of cerium (Ce(III)), one of the rare earth elements (REEs), on multiwall carbon nanotubes (MWCNTs) was studied. The optimization process was performed using Response Surface Methodology (RSM). The optimum conditions were a pH of 4.5, an initial Ce(III) concentration of 90 mg/l, and an MWCNTs dosage of 80 mg. Under these conditions, the optimum adsorption percentage of Ce(III) was about 96%. Kinetic and isotherm studies at the optimum conditions showed that the pseudo-second-order kinetic model and the Langmuir isotherm fit the experimental data better than the other models.
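
Fitting the Langmuir isotherm to equilibrium data can be sketched as follows; the data points and "true" parameters below are synthetic stand-ins for the experimental measurements:

```python
# Sketch: fit the Langmuir isotherm q = q_max * K * C / (1 + K * C) to
# equilibrium adsorption data by nonlinear least squares.
import numpy as np
from scipy.optimize import curve_fit

def langmuir(C, q_max, K):
    """q_max: monolayer capacity (mg/g); K: Langmuir constant (L/mg)."""
    return q_max * K * C / (1.0 + K * C)

C_eq = np.array([5.0, 10, 20, 40, 60, 90])   # equilibrium concentration, mg/L
true_qmax, true_K = 50.0, 0.05               # illustrative "true" parameters
rng = np.random.default_rng(2)
q_obs = langmuir(C_eq, true_qmax, true_K) * (1 + rng.normal(0, 0.01, C_eq.size))

(q_max_fit, K_fit), _ = curve_fit(langmuir, C_eq, q_obs, p0=(30.0, 0.01))
```

Comparing the residuals of this fit against, e.g., a Freundlich fit is how one concludes that the Langmuir model describes the data better.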

Keywords: cerium, rare earth element, MWCNTs, adsorption, optimization

Procedia PDF Downloads 149
13322 Spanish Language Violence Corpus: An Analysis of Offensive Language in Twitter

Authors: Beatriz Botella-Gil, Patricio Martínez-Barco, Lea Canales

Abstract:

The Internet and ICTs are an integral, omnipresent element of our daily lives. Technologies have changed the way we see the world and relate to it. The number of companies in the ICT sector increases every year, and there has also been an increase in the work that occurs online, from sending e-mails to the way companies promote themselves. In social life, ICTs have gained momentum. Social networks are useful for keeping in contact with family or friends who live far away. This change in how we manage our relationships using electronic devices and social media has been experienced differently depending on the age of the person. According to currently available data, people are increasingly connected through social media and other forms of online communication. It is therefore no surprise that violent content has also made its way to digital media. One important reason for this is the anonymity provided by social media, which creates a sense of impunity in the aggressor. Moreover, it is not uncommon to find derogatory comments attacking a person's physical appearance, hobbies, or beliefs. This is why it is necessary to develop artificial intelligence tools that allow us to keep track of violent comments relating to violent events, so that this type of violent online behavior can be deterred. The objective of our research is to create a guide for detecting and recording violent messages. Our annotation guide begins with a study of the problem of violent messages: first, the characteristics that a message should contain for it to be categorized as violent; second, the possibility of establishing different levels of aggressiveness. To build the corpus, we chose the social network Twitter for its ease of obtaining freely available messages. We chose two recent, highly visible violent cases that occurred in Spain, both of which received a high degree of social media coverage and user comments.
Our corpus has a total of 633 messages, manually tagged according to the characteristics we considered important, such as the verbs used and the presence of exclamations, insults, or negations. We consider it necessary to create wordlists of indicators of violence that are present in violent messages, such as lists of negative verbs, insults, and negative phrases. As a final step, we will use machine learning systems to check the data obtained and the effectiveness of our guide.
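
The wordlist-based indicators can be sketched as a simple scoring function; the tiny Spanish wordlists below are illustrative samples, not the guide's actual lists:

```python
# Sketch: score messages against indicator wordlists, as in the annotation guide.
# Hypothetical indicator lists -- the real guide's lists are far larger.
INSULTS = {"idiota", "imbécil"}
NEGATIVE_VERBS = {"matar", "golpear", "odiar"}

def violence_score(message):
    """Count indicator hits: insults, negative verbs, and exclamation marks."""
    tokens = [w.strip("¡!¿?.,").lower() for w in message.split()]
    hits = sum(1 for w in tokens if w in INSULTS or w in NEGATIVE_VERBS)
    hits += message.count("!")  # exclamations as an aggressiveness cue
    return hits

violent = violence_score("¡Te voy a golpear, idiota!")
neutral = violence_score("Hoy hace un día estupendo")
```

Such scores can serve as features for the downstream machine learning check, or as a first pass to rank messages for manual tagging.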

Keywords: human language technologies, language modelling, offensive language detection, violent online content

Procedia PDF Downloads 111
13321 The Audio-Visual and Syntactic Priming Effect on Specific Language Impairment and Gender in Modern Standard Arabic

Authors: Mohammad Al-Dawoody

Abstract:

This study aims at exploring whether priming is affected by gender in Modern Standard Arabic and whether it is restricted solely to subjects with no specific language impairment (SLI). The sample in this study consists of 74 subjects, between the ages of 11;1 and 11;10, distributed into (a) two SLI experimental groups of 38 subjects divided into two gender groups of 18 females and 20 males and (b) two non-SLI control groups of 36 subjects divided into two gender groups of 17 females and 19 males. Employing a mixed research design, the researcher conducted this study within the framework of relevance theory (RT), whose main assumption is that human beings are endowed with a biological ability to magnify the relevance of incoming stimuli. Each of the four groups was given two different priming stimuli: audio-visual priming (T1) and syntactic priming (T2). The results showed that the priming effect was clearly distinct among SLI participants, especially when retrieving typical responses (TR) in T1 and T2, with a slight superiority of males over females. The results also revealed that non-SLI females showed stronger original response (OR) priming in T1 than males, and that non-SLI males excelled in OR priming in T2. Furthermore, the results suggested that audio-visual priming has a stronger effect on SLI females than on non-SLI females, and that syntactic priming seems to have the same effect on the two groups (non-SLI and SLI females). The conclusion is that the priming effect varies according to gender and is not confined merely to non-SLI subjects.

Keywords: specific language impairment, relevance theory, audio-visual priming, syntactic priming, modern standard Arabic

Procedia PDF Downloads 162
13320 Factors Influencing Milk Yield, Quality, and Revenue of Dairy Farms in Southern Vietnam

Authors: Ngoc-Hieu Vu

Abstract:

Dairy production in Vietnam is a relatively new agricultural activity, and milk production has increased remarkably in recent years. Smallholders are still the main drivers of this development, especially in the southern part of the country. However, information on farming practices is very limited. Therefore, this study aimed to determine the factors influencing milk yield, milk quality (milk fat, total solids, solids-not-fat, total number of bacteria, and somatic cell count), and revenue of dairy farms in Southern Vietnam. Data were collected at the farm level; individual animal records were unavailable. The 539 studied farms were located in the provinces of Lam Dong (N=111 farms), Binh Duong (N=69 farms), Long An (N=174 farms), and Ho Chi Minh City (N=185 farms). The dataset included 9221 monthly test-day records of the farms from January 2013 to May 2015. Seasons were defined as rainy and dry. Farm sizes were classified as small (< 10 milking cows), medium (10 to 19 milking cows), and large (≥ 20 milking cows). The model for each trait contained year-season and farm region-farm size as fixed subclass effects, and individual farm and residual as random effects. Results showed that year-season, region, and farm size were determining sources of variation affecting all studied traits. Milk yield was higher in dry than in rainy seasons (P < 0.05), and it tended to increase from 2013 to 2015. Large farms had higher yields (445.6 kg/cow) than small (396.7 kg/cow) and medium (428.0 kg/cow) farms (P < 0.05). Small farms, in contrast, were superior to large farms in terms of milk fat, total solids, solids-not-fat, total number of bacteria, and somatic cell count (P < 0.05). Revenue per cow was higher in large farms compared with medium and small farms. In conclusion, large farms achieved higher milk yields and revenues per cow, while small farms were superior in milk quality.
Overall, milk yields were low, and better training, financial support, and marketing opportunities for farmers are needed to improve dairy production and increase farm revenues in Southern Vietnam.

Keywords: farm size, milk yield and quality, season, Southern Vietnam

Procedia PDF Downloads 342
13319 Comparing Performance of Neural Network and Decision Tree in Prediction of Myocardial Infarction

Authors: Reza Safdari, Goli Arji, Robab Abdolkhani, Maryam Zahmatkeshan

Abstract:

Background and purpose: Cardiovascular diseases are among the most common diseases in all societies. The most important step in minimizing myocardial infarction and its complications is to minimize its risk factors. The amount of medical data is growing rapidly, and medical data mining has great potential for transforming these data into information. Using data mining techniques to generate predictive models for identifying those at risk is very helpful in reducing the effects of the disease. The present study aimed to collect data related to risk factors of myocardial infarction from patients' medical records and to develop predictive models using data mining algorithms. Methods: The present work was an analytical study conducted on a database containing 350 records. Data were related to patients admitted to the Shahid Rajaei specialized cardiovascular hospital, Iran, in 2011. Data were collected using a four-section data collection form. Data analysis was performed using SPSS and Clementine version 12. Seven predictive algorithms and one algorithm-based model for predicting association rules were applied to the data. Accuracy, precision, sensitivity, specificity, as well as positive and negative predictive values, were determined, and the final model was obtained. Results: Five parameters, including hypertension, DLP, tobacco smoking, diabetes, and A+ blood group, were the most critical risk factors of myocardial infarction. Among the models, the neural network model was found to have the highest sensitivity, indicating its ability to successfully diagnose the disease. Conclusion: Risk prediction models have great potential to facilitate the management of patients with a specific disease. Health interventions or lifestyle changes can therefore be guided by these models to improve the health of individuals at risk.
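
The evaluation metrics listed in the Methods can be computed directly from a confusion matrix; the counts below are invented for illustration, not the study's 350-record data:

```python
# Sketch: diagnostic evaluation metrics from confusion-matrix counts.
def diagnostic_metrics(tp, fp, fn, tn):
    return {
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
        "sensitivity": tp / (tp + fn),   # recall on diseased patients
        "specificity": tn / (tn + fp),   # recall on healthy patients
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }

# Hypothetical counts for one model on a 200-patient test split.
m = diagnostic_metrics(tp=80, fp=10, fn=20, tn=90)
```

Comparing models by sensitivity, as the study does, prioritizes not missing true infarction cases over avoiding false alarms.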

Keywords: decision trees, neural network, myocardial infarction, data mining

Procedia PDF Downloads 414
13318 Signs-Only Compressed Row Storage Format for Exact Diagonalization Study of Quantum Fermionic Models

Authors: Michael Danilov, Sergei Iskakov, Vladimir Mazurenko

Abstract:

The present paper describes a high-performance parallel implementation of an exact diagonalization solver for quantum-electron models on a shared-memory computing system. The proposed algorithm includes a storage format for efficiently computing the eigenvalues and eigenvectors of a quantum-electron Hamiltonian matrix. The results of test calculations carried out for a 15-site Hubbard model demonstrate a reduction in the required memory and good multiprocessor scalability, while maintaining performance of the same order as compressed row storage.
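
The storage idea can be sketched in a few lines (a simplified Python illustration; the real solver packs the signs into bits and works on far larger Hamiltonians, and the single hopping amplitude `t` is an assumption of the sketch): for matrices whose nonzeros are all plus or minus one common amplitude, only the sign of each nonzero needs to be stored.

```python
# Sketch: signs-only CSR for a matrix whose nonzeros are all +t or -t.
# Instead of one float per nonzero, store one sign each; values are
# recovered by multiplying with the single amplitude t at matvec time.
def to_signs_csr(dense):
    row_ptr, col_idx, signs = [0], [], []
    for row in dense:
        for j, v in enumerate(row):
            if v != 0:
                col_idx.append(j)
                signs.append(1 if v > 0 else -1)
        row_ptr.append(len(col_idx))
    return row_ptr, col_idx, signs

def signs_csr_matvec(row_ptr, col_idx, signs, t, x):
    y = [0.0] * (len(row_ptr) - 1)
    for i in range(len(y)):
        for k in range(row_ptr[i], row_ptr[i + 1]):
            y[i] += signs[k] * t * x[col_idx[k]]
    return y

t = 0.5
dense = [[0, t, 0], [t, 0, -t], [0, -t, 0]]  # toy hopping matrix
csr = to_signs_csr(dense)
y = signs_csr_matvec(*csr, t, [1.0, 2.0, 3.0])
```

Replacing an 8-byte value array with sign bits is where the memory reduction reported above comes from.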

Keywords: sparse matrix, compressed format, Hubbard model, Anderson model

Procedia PDF Downloads 383
13317 An Analysis of Privacy and Security for Internet of Things Applications

Authors: Dhananjay Singh, M. Abdullah-Al-Wadud

Abstract:

The Internet of Things (IoT) is a concept of a large-scale ecosystem of wireless actuators. The actuators are defined as the 'things' in the IoT: those which contribute or produce data for the ecosystem. However, ubiquitous data collection, data security, privacy preservation, large-volume data processing, and intelligent analytics are some of the key challenges in IoT technologies. In order to address the security requirements, challenges, and threats in the IoT, we discuss a message authentication mechanism for IoT applications. Finally, we discuss a data encryption mechanism for authenticating messages before they propagate into IoT networks.
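
A minimal sketch of such a message authentication mechanism using a standard HMAC; the shared key and JSON payload format are illustrative assumptions, not details from the paper:

```python
# Sketch: HMAC-based message authentication for IoT device messages.
# The device and gateway share KEY; a tag authenticates each payload.
import hmac
import hashlib

KEY = b"per-device-shared-secret"  # hypothetical pre-shared key

def sign(payload: bytes) -> bytes:
    return hmac.new(KEY, payload, hashlib.sha256).digest()

def verify(payload: bytes, tag: bytes) -> bool:
    # compare_digest avoids timing side channels during verification
    return hmac.compare_digest(sign(payload), tag)

msg = b'{"sensor": "temp-42", "value": 21.5}'
tag = sign(msg)
ok = verify(msg, tag)                                          # authentic
tampered_ok = verify(b'{"sensor": "temp-42", "value": 99.9}', tag)  # forged
```

Any modification of the payload in transit invalidates the tag, which is the property a message authentication mechanism provides.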

Keywords: Internet of Things (IoT), message authentication, privacy, security

Procedia PDF Downloads 363
13316 Speech Rhythm Variation in Languages and Dialects: F0, Natural and Inverted Speech

Authors: Imen Ben Abda

Abstract:

Languages have been classified into different rhythm classes: 'stress-timed' languages are exemplified by English, 'syllable-timed' languages by French, and 'mora-timed' languages by Japanese. However, to the best of our knowledge, acoustic studies have not been unanimous in strictly establishing which rhythm category a given language belongs to, and they have failed to show empirical evidence for isochrony. Perception seems to be a good approach for categorizing languages into different rhythm classes. This study, within the scope of experimental phonetics, includes an account of different perceptual experiments using cues from natural and inverted speech, as well as pitch extracted from speech data. It is an attempt to categorize speech rhythm over a large set of Arabic (Tunisian, Algerian, Lebanese and Moroccan) and English (Welsh, Irish, Scottish and Texan) dialects, as well as other languages such as Chinese, Japanese, French, and German. Listeners managed to classify the different languages and dialects into different rhythm classes using suprasegmental cues, mainly rhythm and pitch (F0). They also perceived rhythmic differences even among languages and dialects belonging to the same rhythm class. This may show that there are different subclasses within very broad rhythmic typologies.

Keywords: F0, inverted speech, mora-timing, rhythm variation, stress-timing, syllable-timing

Procedia PDF Downloads 504
13315 Beliefs, Practices and Identity about Bilingualism: Korean-Australian Immigrant Parents and Family Language Policies

Authors: Eun Kyong Park

Abstract:

This study explores the relationships between immigrant parents' beliefs about bilingualism, family literacy practices, and their children's identity development in Sydney, Australia. The project examines how these parents' ideological beliefs and knowledge are related to their provision of family literacy practices and management of the environment for their bilingual children, based on family language policy (FLP). It is a follow-up to the author's prior thesis, which presented Korean immigrant mothers' beliefs and decision-making in support of their children's bilingualism. It includes fathers' perspectives within the participating families as a whole by foregrounding their perceptions of bilingual development and identity. It adopts a qualitative approach with twelve immigrant mothers and fathers living in a Korean-Australian community whose children attend one of the community's Korean language programs. This time, it includes introspective and self-evocative auto-ethnographic data. The initial data set collected in the first part of this study demonstrated that the mothers provided rich, diverse, and specific family literacy activities for their children, selecting specific practices to facilitate their child's bilingual development at home. The second part of the data has been collected over a three-month period: 1) a focus group interview with mothers; 2) a brief self-report by fathers; 3) the researcher's reflective diary. To analyze these multiple data sources, thematic analysis and coding were used to reveal the parents' ideologies surrounding bilingualism and bilingual identities. The study highlights the complexity of language and literacy practices in the family domain as interrelated with sociocultural factors. This project makes an original contribution to the field of bilingualism and FLP, and a methodological contribution by introducing auto-ethnographic input on this community's lived practices.
This project will empower Korean-Australian immigrant families and other multilingual communities to reflect on their beliefs and practices for their emerging bilingual children. It will also enable educators and policymakers to access authentic information about how bilingualism is practiced within these immigrant families in multiple ways, and help build a culturally appropriate partnership between home and the school community.

Keywords: bilingualism, beliefs, identity, family language policy, Korean immigrant parents in Australia

Procedia PDF Downloads 123
13314 Optimizing the Passenger Throughput at an Airport Security Checkpoint

Authors: Kun Li, Yuzheng Liu, Xiuqi Fan

Abstract:

A high security standard and high screening efficiency seem to be contradictory goals in the airport security check process. Improving efficiency as far as possible while maintaining the same security standard is therefore highly meaningful. This paper utilizes knowledge of operations research and stochastic processes to establish mathematical models to explore this problem. We analyze the current airport security check process and use the M/G/1 and M/G/k models from queuing theory to describe it. We find that the least efficient part, and the bottleneck of the queuing system, is the Pre-Check lane. To improve passenger throughput and reduce the variance of passengers' waiting time, we adjust our models, apply the Monte Carlo method, and put forward three modifications: adjust the ratio of Pre-Check lanes to regular lanes flexibly, determine the optimal number of security screening lines based on cost analysis, and adjust the distribution of arrival and service times based on Monte Carlo simulation results. We also analyze the impact of cultural differences as a sensitivity analysis. Finally, we give recommendations for the current airport security check process.
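
This kind of queuing analysis can be checked with a small Monte Carlo simulation; below, a single M/D/1 screening lane (Poisson arrivals, deterministic screening time, a special case of M/G/1) is simulated with the Lindley recursion and compared against the Pollaczek-Khinchine mean waiting time. The arrival and service rates are illustrative, not measured checkpoint data:

```python
# Sketch: Monte Carlo check of an M/D/1 screening lane against the
# Pollaczek-Khinchine formula E[W] = lam * E[S^2] / (2 * (1 - rho)).
import random

lam, service = 0.7, 1.0     # arrival rate; fixed screening time per passenger
rho = lam * service         # utilization, must be < 1 for a stable queue
# For deterministic service, E[S^2] = service^2.
w_pk = lam * service ** 2 / (2 * (1 - rho))

rng = random.Random(3)
wait, total, n = 0.0, 0.0, 200_000
for _ in range(n):
    # Lindley recursion: W_{k+1} = max(0, W_k + S_k - A_{k+1})
    inter_arrival = rng.expovariate(lam)
    wait = max(0.0, wait + service - inter_arrival)
    total += wait
w_sim = total / n  # simulated mean waiting time in queue
```

The same simulation, rerun with different lane counts or service-time distributions, is how the three modifications above can be evaluated.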

Keywords: queuing theory, security check, stochastic process, Monte Carlo simulation

Procedia PDF Downloads 191
13313 Examining the Development of Complexity, Accuracy and Fluency in L2 Learners' Writing after L2 Instruction

Authors: Khaled Barkaoui

Abstract:

Research on second-language (L2) learning tends to focus on comparing students at different levels of proficiency at one point in time. However, to understand L2 development, we need more longitudinal research. In this study, we adopt a longitudinal approach to examine changes in three indicators of L2 ability, complexity, accuracy, and fluency (CAF), as reflected in the writing of L2 learners on different tasks before and after a period of L2 instruction. Each of 85 Chinese learners of English at three levels of English language proficiency responded to two writing tasks (independent and integrated) before and after nine months of English-language study in China. Each essay (N = 276) was analyzed in terms of numerous CAF indices using both computer coding and human rating: number of words written, number of errors per 100 words, ratings of error severity, global syntactic complexity (MLS), complexity by coordination (T/S), complexity by subordination (C/T), clausal complexity (MLC), phrasal complexity (NP density), syntactic variety, lexical density, lexical variation, lexical sophistication, and lexical bundles. Results were then compared statistically across tasks, L2 proficiency levels, and time. Overall, task type had significant effects on fluency, some syntactic complexity indices (complexity by coordination, structural variety, clausal complexity, phrasal complexity), and lexical density, sophistication, and bundles, but not on accuracy. L2 proficiency had significant effects on fluency, accuracy, and lexical variation, but not on syntactic complexity. Finally, fluency, frequency of errors, syntactic complexity indices (clausal complexity, global complexity, complexity by subordination, phrasal complexity, structural variety), and lexical complexity (lexical density, variation, and sophistication), but not accuracy ratings, exhibited significant changes after instruction, particularly for the independent task.
We discuss the findings and their implications for assessment, instruction, and research on CAF in the context of L2 writing.
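The study's CAF indices were computed with dedicated coding tools and human rating; as a toy sketch of two of the simpler measures only (fluency as number of words written, and global syntactic complexity as mean length of sentence, MLS), one might write:

```python
import re

def caf_indices(text):
    """Toy versions of two CAF measures: fluency (number of words
    written) and global syntactic complexity (MLS, words per sentence)."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    return {
        "fluency_words": len(words),
        "mls": len(words) / len(sentences) if sentences else 0.0,
    }

essay = "I like writing. Writing helps me think clearly about hard problems."
print(caf_indices(essay))  # {'fluency_words': 11, 'mls': 5.5}
```

Real analyses would additionally need syntactic parsers to identify clauses, T-units, and noun phrases; this sketch only shows the general shape of such index computation.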

Keywords: second language writing, fluency, accuracy, complexity, longitudinal

Procedia PDF Downloads 137
13312 Comparison between Two Software Packages GSTARS4 and HEC-6 about Prediction of the Sedimentation Amount in Dam Reservoirs and to Estimate Its Efficient Life Time in the South of Iran

Authors: Fatemeh Faramarzi, Hosein Mahjoob

Abstract:

Building dams on rivers to utilize water resources disturbs the hydrodynamic equilibrium and results in all or part of the sediments carried by the water being deposited in the dam reservoir. This phenomenon also has significant impacts on the water and sediment flow regime and, in the long term, can cause morphological changes in the environment surrounding the river, reducing the useful life of the reservoir and threatening sustainable development through inefficient management of water resources. In the past, empirical methods were used to predict the sedimentation amount in dam reservoirs and to estimate their efficient lifetime, but recently mathematical and computational models have become widely used as a suitable tool in reservoir sedimentation studies. These models usually solve the governing equations using the finite element method. This study compares the results from two software packages, GSTARS4 and HEC-6, in predicting the sedimentation amount in the Dez Dam, southern Iran. Both packages provide a one-dimensional, steady-state simulation of sediment deposition and erosion by solving the equations of momentum, flow continuity, sediment continuity, and sediment transport. GSTARS4 (Generalized Sediment Transport Model for Alluvial River Simulation), which is based on a one-dimensional mathematical model that simulates bed changes in both longitudinal and transverse directions by using flow tubes in a quasi-two-dimensional scheme, was used to calibrate against a 47-year record and forecast the next 47 years of sedimentation in the Dez Dam. This dam is among the highest dams in the world (203 m), irrigates more than 125,000 hectares of downstream land, and plays a major role in flood control in the region. The input data, including geometric, hydraulic, and sedimentary data, cover 1955 to 2003 on a daily basis. To predict future river discharge, the time series data were assumed to repeat after 47 years.
Finally, the obtained results were very satisfactory in the delta region, where the output from GSTARS4 was almost identical to the hydrographic profile surveyed in 2003. In the Dez reservoir, owing to its length (65 km) and large storage volume, vertical currents are dominant, causing the calculations by the above-mentioned method to be inaccurate. To solve this problem, we used the empirical reduction method to calculate the sedimentation in the downstream area, which led to very good agreement. Thus, we demonstrated that by combining these two methods, a very suitable model of sedimentation in the Dez Dam for the study period can be obtained. The present study also showed that the outputs of both methods are consistent with each other.
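The idea of an efficient (useful) lifetime estimated from deposited sediment can be sketched with a back-of-the-envelope calculation. The capacity, annual sediment inflow, trap efficiency, and dead-storage fraction below are illustrative assumptions, not values from the study:

```python
def useful_life_years(capacity_m3, annual_sediment_m3,
                      trap_efficiency=0.9, dead_storage_fraction=0.25):
    """Rough reservoir useful-life estimate: years until deposited
    sediment fills the dead-storage allowance (all inputs illustrative)."""
    deposited_per_year = annual_sediment_m3 * trap_efficiency
    return (capacity_m3 * dead_storage_fraction) / deposited_per_year

# Assumed figures: 3.3 billion m^3 capacity, 15 million m^3/yr sediment inflow
print(round(useful_life_years(3.3e9, 1.5e7), 1))  # 61.1
```

Numerical models such as GSTARS4 and HEC-6 refine this kind of estimate by distributing the deposition spatially along the reservoir rather than treating it as a single lumped volume.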

Keywords: Dez Dam, prediction, sedimentation, water resources, computational models, finite element method, GSTARS4, HEC-6

Procedia PDF Downloads 301
13311 Application of Signature Verification Models for Document Recognition

Authors: Boris M. Fedorov, Liudmila P. Goncharenko, Sergey A. Sybachin, Natalia A. Mamedova, Ekaterina V. Makarenkova, Saule Rakhimova

Abstract:

In modern economic conditions, the question of whether a signature on digital documents can be correctly recognized, in order to verify an expression of will or confirm a certain operation, is highly relevant. Additional processing complexity lies in the dynamic variability of each individual’s signature, as well as in how the information is processed, because a signature constitutes biometric data. The article discusses the use of artificial intelligence models to improve the quality of signature confirmation in document recognition. Several possible options for applying such models are analyzed. The results of the study show that the authenticity of a signature can be correctly determined even from small samples.

Keywords: signature recognition, biometric data, artificial intelligence, neural networks

Procedia PDF Downloads 132
13310 Translation and Legal Terminology: Techniques for Coping with the Untranslatability of Legal Terms between Arabic and English

Authors: Rafat Alwazna

Abstract:

Technical lexicon is witnessing a large upsurge in the use of new terminologies whose emergence is an inevitable result of the spread of high-quality technology, the existence of scientific paradigms and the fast growth of research in different disciplines. One important subfield of terminology is legal terminology, which forms a crucial part of legal studies, and whose translation from one legal system into another is deemed a formidable and arduous task that needs to be properly performed by legal translators. Indeed, the issue of untranslatability of legal terms, particularly between originally unrelated languages, like legal Arabic and legal English, has long been a real challenge in legal translation. It stems from the conceptual incongruency between legal terms of different legal languages, which are derived from different legal cultures and legal systems. Such conceptual asymmetry is owing to the fact that law has no universal reference and that legal language is what determines the degree of difference in conceptual correspondence. The present paper argues that although conceptual asymmetry, which is the main reason for the issue of untranslatability of legal terms, cannot be denied in legal translation, there exist certain translation techniques which, if properly adopted, would resolve the issue of untranslatability of legal terms and therefore achieve acceptable legal translation. Hence, the question of untranslatability of legal terms should no longer exist within the context of legal translation.

Keywords: conceptual incongruency, legal terms, translation techniques, untranslatability

Procedia PDF Downloads 170
13309 A Multi-criteria Decision Support System for Migrating Legacies into Open Systems

Authors: Nasser Almonawer

Abstract:

Timely reaction to an evolving global business environment and volatile market conditions necessitates system and process flexibility, which in turn demands agile and adaptable architectures and a steady infusion of affordable new technologies. In contrast, a large number of organizations utilize systems characterized by inflexible and obsolete legacy architectures. To respond effectively to dynamic contemporary business environments, such architectures must be migrated to robust and modular open architectures. To this end, this paper proposes an integrated decision support system for a seamless migration to open systems. The proposed decision support system (DSS) integrates three well-established quantitative and qualitative decision-making models, namely the Delphi method, the Analytic Hierarchy Process (AHP), and Goal Programming (GP), to (1) assess risks and establish evaluation criteria; (2) formulate a migration strategy and rank candidate systems; and (3) allocate resources among the selected systems.
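The AHP stage of such a DSS turns pairwise comparisons of evaluation criteria into priority weights. A minimal sketch using the row geometric-mean approximation follows; the three criteria and the comparison matrix are hypothetical, and production AHP implementations typically compute the principal eigenvector and a consistency ratio as well:

```python
import math

def ahp_weights(pairwise):
    """Approximate AHP priority weights from a pairwise comparison
    matrix via the row geometric-mean method."""
    n = len(pairwise)
    gm = [math.prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(gm)
    return [g / total for g in gm]

# Hypothetical criteria: migration risk vs. cost vs. architectural flexibility
matrix = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 3.0],
    [1 / 5, 1 / 3, 1.0],
]
weights = ahp_weights(matrix)
print([round(w, 3) for w in weights])  # [0.637, 0.258, 0.105]
```

In the integrated DSS described above, such weights could then feed the goal-programming stage as relative priorities when allocating resources among candidate systems.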

Keywords: decision support systems, open systems architecture, analytic hierarchy process (AHP), goal programming (GP), Delphi method

Procedia PDF Downloads 16
13308 [Keynote Talk]: A Blueprint for an Educational Trajectory: The Power of Discourse in Constructing “Naughty” and “Adorable” Kindergarten Students

Authors: Fernanda T. Orsati, Julie Causton

Abstract:

Discursive practices enacted by educators in kindergarten create a blueprint for how the educational trajectories of students with disabilities are constructed. This two-year ethnographic case study critically examines educators’ relationships with students considered to present challenging behaviors in one kindergarten classroom located in a predominantly White middle-class school district in the Northeast of the United States. Focusing on the language and practices used by one special education teacher and three teaching assistants, this paper analyzes how teachers’ responses to students’ behaviors construct and position students over one year of kindergarten education. Using critical discourse analysis, it shows that educators understand students’ behaviors as deficits needing consequences. This study highlights how educators’ responses reflect students’ individual characteristics, including family background, socioeconomic status, and ability status. This paper offers an in-depth analysis of two students’ stories, which evidenced that the language used by educators amplifies the social positioning of students within the classroom and creates a foundation for who they are constructed to be. Through exploring routine language and practices, this paper demonstrates that educators outlined a blueprint of kindergartners, which positioned students as learners in ways that became the ground for either a limited or a promising educational pathway for them.

Keywords: behavior, early education, special education, critical discourse analysis

Procedia PDF Downloads 289
13307 Analog Input Output Buffer Information Specification Modelling Techniques for Single Ended Inter-Integrated Circuit and Differential Low Voltage Differential Signaling I/O Interfaces

Authors: Monika Rawat, Rahul Kumar

Abstract:

Input/Output Buffer Information Specification (IBIS) models are used to describe the analog behavior of the input/output (I/O) buffers of a digital device. They are widely used to perform signal integrity analysis. Advantages of using IBIS models include their simple structure, IP protection, and fast simulation times with reasonable accuracy. As the design complexity of drivers and receivers increases, capturing the exact behavior of the transistor-level model in the IBIS model becomes an essential task for achieving better accuracy. In this paper, an improvement to the existing methodology for generating IBIS models for complex I/O interfaces such as the Inter-Integrated Circuit (I2C) and Low Voltage Differential Signaling (LVDS) interfaces is proposed. Furthermore, the accuracy and computational performance of the standard method and the proposed approach with respect to SPICE are presented. The investigations will be useful for further improving the accuracy of IBIS models and enhancing their wider acceptance.

Keywords: IBIS, signal integrity, open-drain buffer, low voltage differential signaling, behavior modelling, transient simulation

Procedia PDF Downloads 180
13306 An As-Is Analysis and Approach for Updating Building Information Models and Laser Scans

Authors: Rene Hellmuth

Abstract:

Factory planning has the task of designing products, plants, processes, organization, areas, and the construction of a factory. The requirements for factory planning and the building of a factory have changed in recent years. Regular restructuring of the factory building is becoming more important in order to maintain the competitiveness of a factory. Restrictions in new areas, shorter life cycles of product and production technology, as well as a VUCA world (Volatility, Uncertainty, Complexity & Ambiguity) lead to more frequent restructuring measures within a factory. A building information model (BIM) is the planning basis for rebuilding measures and becomes an indispensable data repository for reacting quickly to changes. Use as a planning basis for restructuring measures in factories only succeeds if the BIM model has adequate data quality. Under this aspect and the industrial requirement, three data quality factors are particularly important for this paper regarding the BIM model: up-to-dateness, completeness, and correctness. The research question is: how can a BIM model be kept up to date with the required data quality, and which visualization techniques can be applied in a short period of time on the construction site during conversion measures? An as-is analysis is made of how BIM models and digital factory models (including laser scans) are currently kept up to date. Industrial companies are interviewed, and expert interviews are conducted. Subsequently, the results are evaluated, and a procedure is conceived for carrying out cost-effective and time-saving updating processes. The availability of low-cost hardware and the simplicity of the process are important for enabling service personnel from facility management to keep digital factory models (BIM models and laser scans) up to date. The approach includes the detection of changes to the building, the recording of the changed area, and the insertion into the overall digital twin.
Finally, an overview of the possibilities for visualizations suitable for construction sites is compiled. An augmented reality application is created based on an updated BIM model of a factory and installed on a tablet. Conversion scenarios with costs and time expenditure are displayed. A user interface is designed in such a way that all relevant conversion information is available at a glance for the respective conversion scenario. A total of three essential research results are achieved: As-is analysis of current update processes for BIM models and laser scans, development of a time-saving and cost-effective update process and the conception and implementation of an augmented reality solution for BIM models suitable for construction sites.

Keywords: building information modeling, digital factory model, factory planning, restructuring

Procedia PDF Downloads 96
13305 Neologisms and Word-Formation Processes in Board Game Rulebook Corpus: Preliminary Results

Authors: Athanasios Karasimos, Vasiliki Makri

Abstract:

This research focuses on the design and development of the first text corpus based on board game rulebooks (BGRC), with direct application to the morphological analysis of neologisms and tendencies in word-formation processes. Corpus linguistics is a dynamic field that examines language through the lens of vast collections of texts. These corpora consist of diverse written and spoken materials, ranging from literature and newspapers to transcripts of everyday conversations. By morphologically analyzing such extensive datasets, morphologists can gain valuable insights into how language functions and evolves, as the data reflect the byproducts of inflection, derivation, blending, clipping, compounding, and neology. This entails scrutinizing how words are created, modified, and combined to convey meaning in a corpus of challenging, creative, and straightforward texts that include rules, examples, tutorials, and tips. Board games teach players how to strategize, consider alternatives, and think flexibly, which are critical elements in language learning. Their rulebooks reflect not only their weight (complexity) but also the language properties of each genre and subgenre of these games. Board games are a captivating realm where strategy, competition, and creativity converge. Beyond the excitement of gameplay, board games also spark the art of word creation. Word games like Scrabble, Codenames, Bananagrams, Wordcraft, Alice in Wordland, and Once Upon a Time challenge players to construct words from a pool of letters, thus encouraging linguistic ingenuity and vocabulary expansion. These games foster a love for language, motivating players to unearth obscure words and devise clever combinations.
The designers and creators, for their part, produce rulebooks in which they convey their joy of discovering the hidden potential of language, igniting the imagination and playing with the beauty of words, making these games a delightful fusion of linguistic exploration and leisurely amusement. In this research, more than 150 rulebooks in English from all types of modern board games, whether language-independent or language-dependent, are used to create the BGRC. A representative sample of each genre (family, party, worker placement, deckbuilding, dice and chance games, strategy, eurogames, thematic, and role-playing, among others) was selected based on the score from BoardGameGeek, the size of the texts, and the level of complexity (weight) of the game. A morphological model with morphological networks, multi-word expressions, and word-creation mechanics based on the complexity of the textual structure, difficulty, and board game category will be presented. By enabling the identification of patterns, trends, and variations in word formation and other morphological processes, this research aspires to draw on this creative yet strict text genre so as to (a) give invaluable insight into the morphological creativity and innovation that (re)shape the lexicon of the English language and (b) test morphological theories. Overall, it is shown that corpus linguistics empowers us to explore the intricate tapestry of language, and of morphology in particular, revealing its richness, flexibility, and adaptability in the ever-evolving landscape of human expression.

Keywords: board game rulebooks, corpus design, morphological innovations, neologisms, word-formation processes

Procedia PDF Downloads 73
13304 Communicative Competence Is About Speaking a Lot: Teacher’s Voice on the Art of Developing Communicative Competence

Authors: Bernice Badal

Abstract:

The South African English curriculum emphasizes the adoption of the Communicative Approach (CA), using Communicative Language Teaching (CLT) methodologies to develop English as a second language (ESL) learners’ communicative competence in contexts such as township schools in South Africa. However, studies indicate that the adoption of the approach largely remains rhetoric. Poor student performance, which continues from the secondary to the tertiary phase, is widely attributed to a lack of English language proficiency in South Africa. Consequently, this qualitative study, using a mix of classroom observations and interviews, sought to investigate teacher knowledge of communicative competence and the methods and strategies ESL teachers used to develop their learners’ communicative competence. The success of learners in developing communicative competence in contexts such as township schools is inseparable from materials, tasks, teacher knowledge, and how teachers implement the approach in their classrooms. Accordingly, teacher knowledge of the theory and practical implications of the CLT approach is imperative for the negotiation of meaning and appropriate use of language in context in resource-impoverished areas like the township. The study examined teachers’ definitions and knowledge of communicative competence, with a focus on how these influenced their classroom practices. The findings revealed that teachers were not familiar with the notion of communicative competence, the communication process, or the underpinnings of CLT. Teachers’ narratives indicated an awareness that there should be interaction and communication in the classroom, but a lack of theoretical understanding of the types of communication necessary scuttled their initiatives.
Thus, conceptual deficiency influences teachers’ practices, as they engage in classroom activities in a superficial manner or focus on the learner activities stipulated by the CAPS document. This study therefore concluded that partial or limited conceptual understanding, combined with ‘teacher-proof’ stipulations for classroom practice, does not inspire teacher efficacy and mastery of prescribed approaches; thus, more effort should be made by the Department of Basic Education to strengthen the existing professional development workshops to support teachers in improving their understanding and application of CLT for the development of communicative competence in their learners. The findings of the study contribute to the fields of teacher knowledge acquisition, teacher beliefs and practices, and professional development in the context of second language teaching and learning, with a recommendation that frameworks for the development of communicative competence with wider applicability in resource-poor environments be developed to support teacher understanding and application in classrooms.

Keywords: communicative competence, CLT, conceptual understanding of reforms, professional development

Procedia PDF Downloads 42
13303 Surface-Enhanced Raman Spectroscopy on Gold Nanoparticles in the Kidney Disease

Authors: Leonardo C. Pacheco-Londoño, Nataly J Galan-Freyle, Lisandro Pacheco-Lugo, Antonio Acosta-Hoyos, Elkin Navarro, Gustavo Aroca-Martinez, Karin Rondón-Payares, Alberto C. Espinosa-Garavito, Samuel P. Hernández-Rivera

Abstract:

At the Life Science Research Center at Simon Bolivar University, a primary focus is the diagnosis of various diseases, and the use of gold nanoparticles (Au-NPs) in diverse biomedical applications is continually expanding. In the present study, Au-NPs were employed as substrates for Surface-Enhanced Raman Spectroscopy (SERS) aimed at diagnosing kidney diseases arising from lupus nephritis (LN), preeclampsia (PC), and hypertension (H). Discrimination models for distinguishing patients with and without kidney disease were developed from the SERS signals of urine samples by partial least squares-discriminant analysis (PLS-DA). A comparative study of the Raman signals across the three conditions was conducted, leading to the identification of potential metabolite signals. A secondary analysis was also performed using machine learning (ML) models, in which different ML algorithms were evaluated for their efficiency. Model validation was carried out using cross-validation and external validation, and performance parameters such as sensitivity and specificity were determined; the models showed average values of 0.9 for both parameters. Finally, it is worth highlighting that this collaborative effort involved two university research centers and two healthcare institutions, ensuring ethical treatment and informed consent for patient samples.
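The sensitivity and specificity figures reported above are standard confusion-matrix quantities; a minimal sketch of how they are computed from binary predictions follows (the labels are made up for illustration, not patient data):

```python
def sensitivity_specificity(y_true, y_pred):
    """Sensitivity (true-positive rate) and specificity (true-negative
    rate) for binary labels, with 1 = kidney disease present."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    return tp / (tp + fn), tn / (tn + fp)

# Made-up example labels (1 = disease, 0 = healthy)
y_true = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
y_pred = [1, 1, 1, 0, 0, 0, 0, 0, 0, 1]
sens, spec = sensitivity_specificity(y_true, y_pred)
print(round(sens, 2), round(spec, 2))  # 0.75 0.83
```

In a PLS-DA workflow, these metrics would be computed on the held-out folds of the cross-validation and on the external validation set, as in the study.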

Keywords: SERS, Raman, PLS-DA, kidney diseases

Procedia PDF Downloads 30
13302 Integration of Immigrant Students into Local Education System

Authors: Suheyla Demi̇rkol Orak

Abstract:

The requirement for inclusive education is one of the most important consequences of both regular and irregular immigration. The situation of Syrian immigrants is even more severe than that of other immigrant groups in world history, since a massive immigration wave has affected the socio-economic profiles of countries across the world. When Syrians fled Syria for countries around the world, they aimed to survive and leave the war behind, but survival is not possible without overcoming language-related problems: humans exist and preserve their existence through their language. This is a central concern for the integration of Syrians into host countries. Many countries run various programs to integrate Syrians into majority groups through either assimilation or adaptation policies. Turkey has received the lion’s share of Syrian immigration, and accordingly its language education system should be analyzed rigorously in order to arrive at a well-matched program for the integration of Syrians. This study aims to generate an inclusive education model to catalyze the integration of immigrant Syrian students into the majority socio-economic group by overcoming the language barrier, while prioritizing the identity of the immigrants. The study follows a narrative literature review, which reviews and critiques the relevant literature and offers a new conceptualization derived from it, and derives a critical localized bilingual education model. As the outcome of the review, a bilingual education model that prioritizes the identity of the target community was designed. The main bilingual education programs and the bilingual education policies of most countries were reviewed critically, and suggestions were listed for Syrian immigrants, predominantly in Turkey, that other countries may also benefit from by localizing the practices.

Keywords: bi/multilingual education, sheltered education, immigrants, glocalization, submersion program, immersion program

Procedia PDF Downloads 66