Search results for: the creative learning process
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 20896


16906 Predicting Low Birth Weight Using Machine Learning: A Study on 53,637 Ethiopian Birth Data

Authors: Kehabtimer Shiferaw Kotiso, Getachew Hailemariam, Abiy Seifu Estifanos

Abstract:

Introduction: Although low birth weight (LBW) accounts for the largest share of neonatal mortality and morbidity, predicting LBW births early enough to prepare interventions remains challenging. This study aims to predict LBW using a dataset of 53,637 birth records collected from 36 primary hospitals across seven regions in Ethiopia between February 2022 and June 2024. Methods: We identified ten explanatory variables related to maternal and neonatal characteristics, namely maternal education, age, residence, history of miscarriage or abortion, history of preterm birth, type of pregnancy, number of live births, number of stillbirths, antenatal care frequency, and sex of the fetus, to predict LBW. Using WEKA 3.8.2, we developed and compared seven machine learning algorithms. Data preprocessing included handling missing values, detecting outliers, and verifying the integrity of birth weight records. Model performance was evaluated with accuracy, precision, recall, F1-score, and area under the receiver operating characteristic curve (ROC AUC) using 10-fold cross-validation. Results: The decision tree, J48, logistic regression, and gradient-boosted trees models achieved the highest accuracy (94.5% to 94.6%), with a precision of 93.1% to 93.3%, an F1-score of 92.7% to 93.1%, and a ROC AUC of 71.8% to 76.6%. Conclusion: This study demonstrates the effectiveness of machine learning models in predicting LBW. The high accuracy and recall achieved indicate that these models can serve as valuable tools for healthcare policymakers and providers in identifying at-risk newborns and implementing timely interventions toward the Sustainable Development Goal (SDG) target on neonatal mortality.
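The WEKA workflow described above is not reproduced here; as a rough analogue, the following hypothetical sketch in Python with scikit-learn on synthetic data shows the shape of a 10-fold cross-validated decision-tree evaluation with the same metrics. The feature count mirrors the paper's ten explanatory variables; the data, labels, and model settings are illustrative assumptions only.

```python
# Hypothetical analogue of the paper's evaluation: a decision-tree
# classifier scored with 10-fold cross-validation on synthetic data.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_validate

rng = np.random.default_rng(0)
n = 1000
# Ten toy features standing in for the maternal/neonatal variables
X = rng.normal(size=(n, 10))
# Synthetic binary label: 1 = low birth weight (invented relationship)
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(size=n) > 1.2).astype(int)

clf = DecisionTreeClassifier(max_depth=5, random_state=0)
scores = cross_validate(
    clf, X, y, cv=10,
    scoring=["accuracy", "precision", "recall", "f1", "roc_auc"],
)
for metric in ["accuracy", "precision", "recall", "f1", "roc_auc"]:
    print(metric, round(scores[f"test_{metric}"].mean(), 3))
```

In a real replication, `X` and `y` would come from the hospital records, and the seven WEKA algorithms would each be evaluated this way for comparison.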

Keywords: low birth weight, machine learning, classification, neonatal mortality, Ethiopia

Procedia PDF Downloads 12
16905 The Implementation of the Lean Six Sigma Production Process in a Telecommunications Company in Brazil

Authors: Carlos Fontanillas

Abstract:

The lean six sigma methodology aims to improve processes systematically by eliminating defects and reducing cost. Projects under the methodology are divided into five phases: define, measure, analyze, improve, and control. Implementing the methodology is understood to benefit the organizations that adopt it through the improvement of their processes. In the case of a telecommunications company, the implementation of a lean six sigma project contributed to the improvement of the process studied, generating a financial return through avoided cost. The study has limitations, however: it covers a specific segment and procedure, so the same return cannot be assumed under other circumstances. It is also concluded that lean six sigma projects tend to improve the processes evaluated, since the methodology is based on statistical analysis and quality management tools, and can generate a financial return. It is hoped that the present study provides a clearer view of the methodology for entrepreneurs who wish to implement process improvement actions in their companies, as well as a foundation for professionals working on lean six sigma projects. After the review of the processes, the completion of the project stages, and three months of monitoring in partnership with the process owner to ensure the effectiveness of the actions, the project was completed with its objective reached. There was an average 60% reduction in undue invoices issued after deactivation, and it was possible to extend the project to other companies, which allowed a reduction well above the initially stipulated target.

Keywords: quality, process, lean six sigma, organization

Procedia PDF Downloads 126
16904 Automation of Finite Element Simulations for the Design Space Exploration and Optimization of Type IV Pressure Vessel

Authors: Weili Jiang, Simon Cadavid Lopera, Klaus Drechsler

Abstract:

Fuel cell vehicles have become the most competitive solution for the transportation sector in the hydrogen economy. The Type IV pressure vessel is currently the most popular and widely developed on-board storage technology, owing to its high reliability and relatively low cost. Because of the stringent requirements on mechanical performance, the pressure vessel requires a large amount of composite material, a major cost driver for hydrogen tanks. Optimizing the composite layup design therefore shows great potential for reducing overall material usage, yet it requires a comprehensive understanding of the underlying mechanisms and of the influence of different design parameters on mechanical performance. Given the materials and manufacturing processes by which Type IV pressure vessels are produced, design and optimization are a nuanced subject. The manifold of possible stacking sequences and fiber orientations has an outstanding effect on vessel strength, due to the anisotropic properties of carbon fiber composites, which makes the design space high dimensional. Each variation of the design parameters requires computational resources. Evaluating different designs with finite element analysis is the most common method; however, the modeling, setup, and simulation process can be very time consuming and computationally costly. For this reason, it is necessary to build a reliable automation scheme to set up and analyze the diverse composite layups. In this research, the simulation of different tank designs over various parameters is conducted and automated in the commercial finite element analysis framework Abaqus. Notably, the model of the composite overwrap is generated automatically through the Abaqus-Python scripting interface.
The prediction of the winding angle of each layer and the corresponding thickness variation in the dome region is the most crucial step of the modeling; it is calculated and implemented using analytical methods. The different composite layups are then simulated as axisymmetric models to reduce computational complexity and calculation time. Finally, the results are evaluated and compared with respect to ultimate tank strength. By automatically modeling, evaluating, and comparing various composite layups, the system is applicable to the optimization of tank structures. As mentioned above, the mechanical performance of the pressure vessel depends strongly on the composite layup, which requires a large number of simulations. Automating the simulation process therefore provides a rapid way to compare designs and indicate the optimum one. Moreover, with few preliminary configuration steps, the automation can also be used to create a data bank of layups and their corresponding mechanical properties for further case analysis; machine learning could then, for example, be used to obtain the optimum directly from this data pool without running new simulations.
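As a rough illustration of the automation idea described above, the sketch below loops over hypothetical layup parameters and dispatches each design to a placeholder simulation function. The actual Abaqus-Python scripting (model generation, job submission, result extraction) is deliberately replaced by a dummy objective, since it depends on the Abaqus environment; all names and parameter values here are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical driver sketch for a layup parameter sweep.
# run_simulation() stands in for the Abaqus-Python scripting step.
from itertools import product

def run_simulation(layup):
    """Placeholder: in the real workflow this would build the
    axisymmetric model for this layup in Abaqus, run the job, and
    return an ultimate-strength estimate. Here it returns a dummy
    score so the sweep logic is runnable."""
    angles = [angle for angle, _ in layup]
    return 1000.0 - sum(abs(a - 54.7) for a in angles)  # toy objective

winding_angles = [15, 30, 54.7, 70, 88]   # degrees, illustrative only
thicknesses = [0.25, 0.35]                # mm per layer, illustrative

results = {}
for angle, t in product(winding_angles, thicknesses):
    layup = [(angle, t)] * 8              # eight identical layers (toy case)
    results[(angle, t)] = run_simulation(layup)

best = max(results, key=results.get)
print("best design (angle, thickness):", best)
```

The `results` dictionary is the kind of layup-to-strength data bank the abstract mentions as input for later machine-learning-based optimization.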

Keywords: type IV pressure vessels, carbon composites, finite element analysis, automation of simulation process

Procedia PDF Downloads 132
16903 Discussion about Frequent Adjustment of Urban Master Planning in China: A Case Study of Changshou District, Chongqing City

Authors: Sun Ailu, Zhao Wanmin

Abstract:

Since the reform and opening-up, urbanization in China has entered a period of rapid development. In recent years, the authors participated in several urban master planning projects in China and observed that rapidly urbanizing areas are experiencing frequent adjustments of their urban master plans. This phenomenon is not a natural part of urbanization; it may instead be caused by the differing roles of government at different levels. Through investigation, data comparison, and case study, this paper explores why rapidly urbanizing areas undergo frequent adjustment of master planning and proposes solution strategies. Taking Changshou district of Chongqing city as an example, the paper first introduces the phenomenon of frequent adjustment in China. It then discusses the distinct roles played in the process by the national, provincial, and local governments of China. Finally, it puts forward preliminary solution strategies for such areas, covering land use, intergovernmental cooperation, and related aspects.

Keywords: urban master planning, frequent adjustment, urbanization development, problems and strategies, China

Procedia PDF Downloads 360
16902 Factors Affecting General Practitioners’ Transfer of Specialized Self-Care Knowledge to Patients

Authors: Weidong Xia, Malgorzata Kolotylo, Xuan Tan

Abstract:

This study examines the key factors that influence general practitioners’ learning and transfer of specialized arthritis knowledge and self-care techniques to patients during normal patient visits. Drawing on the theory of planned behavior and using matched survey data collected from general practitioners before and after training sessions provided by specialized orthopedic physicians, the study suggests that a general practitioner’s intention to use and transfer the learned knowledge was influenced mainly by intrinsic motivation, organizational learning culture, and absorptive capacity, but not by extrinsic motivation. The results carry both theoretical and practical implications.

Keywords: empirical study, healthcare knowledge management, patient self-care, physician knowledge transfer

Procedia PDF Downloads 294
16901 Atomic Decomposition Audio Data Compression and Denoising Using Sparse Dictionary Feature Learning

Authors: T. Bryan, V. Kepuska, I. Kostnaic

Abstract:

A method of data compression and denoising is introduced that is based on atomic decomposition of audio data using “basis vectors” learned from the audio data itself. The basis vectors are shown to yield higher data compression and better signal-to-noise enhancement than the Gabor and gammatone “seed atoms” used to generate them. The basis vectors are the input weights of a sparse autoencoder (SAE) trained on “envelope samples” of windowed segments of the audio data. The envelope samples are extracted by identifying audio segments that are locally coherent with the Gabor or gammatone seed atoms, found by matching pursuit, and are formed by taking the Kronecker products of the atomic envelopes with the locally coherent data segments. Oracle signal-to-noise ratio (SNR) versus data compression curves are generated for the seed atoms as well as for the basis vectors learned from them, for speech signals and for early American music recordings. The basis vectors are shown to have higher denoising capability at data compression rates ranging from 90% to 99.84% for both speech and music. Envelope samples are displayed as images by folding the time series into column vectors; this display is used to compare the output of the SAE with the envelope samples that produced it. The basis vectors are also displayed as images. Sparsity is shown to play an important role in producing the basis vectors with the highest denoising capability.
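The matching-pursuit step described above can be illustrated in miniature. The sketch below builds a tiny Gabor dictionary and greedily decomposes a toy signal; it is an illustrative analogue only (atom parameters, dictionary size, and the signal are invented), not the authors' implementation.

```python
# Minimal matching pursuit with a small Gabor dictionary, illustrating
# the identification of locally coherent segments described above.
import numpy as np

def gabor_atom(n, freq, width):
    """Unit-norm Gaussian-windowed cosine atom of length n."""
    t = np.arange(n) - n // 2
    atom = np.exp(-(t / width) ** 2) * np.cos(2 * np.pi * freq * t / n)
    return atom / np.linalg.norm(atom)

n = 256
dictionary = np.stack([gabor_atom(n, f, w)
                       for f in (4, 8, 16, 32)
                       for w in (16, 32, 64)])   # 12 unit-norm atoms

# Toy signal: two dictionary atoms plus a little noise
rng = np.random.default_rng(1)
signal = 3.0 * dictionary[2] + 1.5 * dictionary[7] + 0.05 * rng.normal(size=n)

residual = signal.copy()
decomposition = []                       # (atom index, coefficient) pairs
for _ in range(2):                       # two greedy iterations
    corr = dictionary @ residual         # inner product with each atom
    k = int(np.argmax(np.abs(corr)))     # best-matching atom
    decomposition.append((k, corr[k]))
    residual = residual - corr[k] * dictionary[k]

print(decomposition)
```

In the paper's pipeline, the atoms selected this way delimit the locally coherent segments from which envelope samples are then formed.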

Keywords: sparse dictionary learning, autoencoder, sparse autoencoder, basis vectors, atomic decomposition, envelope sampling, envelope samples, Gabor, gammatone, matching pursuit

Procedia PDF Downloads 249
16900 The Delone and McLean Model: A Review and Reconceptualisation for Explaining Organisational IS Success

Authors: Probir Kumar Banerjee

Abstract:

Though the revised DeLone and McLean (DM) model of IS success has been found effective at the individual level of analysis, there is a lack of consensus regarding its effectiveness at the organisational level. This research reviews the DM model in the light of business/IT alignment theory and the supporting literature, and suggests its reconceptualisation. Specifically, arguments are made for augmenting the model with business process quality, which, it is argued, captures the effect of the intention-to-use, use, and user-satisfaction interactions, thus eliminating the need to capture their interaction effects when explaining organisational IS success. It is also argued that ‘operational performance’ driven by systems and business process quality, together with higher-order measures of organisational performance tied to operational performance, are appropriate measures of ‘net benefit’. Suggestions are made for reconceptualising the other constructs, and an adapted model of organisational IS success is proposed.

Keywords: organisational IS success, business/IT alignment, systems quality, business process quality, operational performance, market performance

Procedia PDF Downloads 392
16899 Importance of an E-Learning Program in Stress Field for Postgraduate Courses of Doctors

Authors: Ramona-Niculina Jurcau, Ioana-Marieta Jurcau

Abstract:

Background: Preparation in the stress field (SF) is increasingly a concern for doctors of different specialties. Aims: The aim was to evaluate the importance of an e-learning program in SF for doctors’ postgraduate courses. Methods: Doctors (n = 80; 40 male, 40 female) of different specialties and ages (31-71 years), who attended postgraduate courses in SF, voluntarily responded to a questionnaire covering the following themes: the importance of SF courses for the specialty practiced by each respondent (using a visual analogue scale, VAS); which SF themes would be suitable as e-learning (EL); the preferred form of assimilating SF information: classical lectures (CL), EL, or a combination of the two (CL+EL); which aspects of the SF course are facilitated by the EL model versus CL; and, in their view, the first four advantages and first four disadvantages of EL compared to CL for SF. Results: For most respondents, the SF courses are important for the specialty they practice (VAS average of 4). The SF themes suggested for EL were: stress mechanisms; stress factor models for different medical specialties; stress assessment methods; and primary stress management methods for different specialties. The preferred form of information assimilation was CL+EL. Aspects of the course facilitated by the EL model versus CL: active reading of theoretical information, with fast access to details via keywords; watching documentaries in any preferred order; and practicing through tests with rapid checking of results. The first four EL advantages mentioned for SF were: autonomy in managing the time allocated to study; saving the time needed to travel to the venue; the ability to read information in various contexts of time and space; and communication with colleagues at times convenient for everyone.
The first three EL disadvantages mentioned for SF were: reduced capacity for group discussion and mobilization for active participation; access to EL information may depend on a power source and/or the Internet; and learning may slow down through the temptation to postpone the work. Answers were partially influenced by the respondent’s age and gender. Conclusions: 1) Postgraduate courses in SF are of interest to doctors of different specialties. 2) The majority of participating doctors preferred EL, but combined with CL (CL+EL). 3) Preference for EL was expressed mainly by young and middle-aged male doctors. 4) It is important to find the right balance for EL so that it is as efficient, interesting, useful, and agreeable as possible.

Keywords: stress field, doctors’ postgraduate courses, classical lectures, e-learning lecture

Procedia PDF Downloads 236
16898 Vocational and Technical Educators’ Acceptance and Use of Digital Learning Environments Beyond Working Hours: Implications for Work-Life Balance and the Role of Integration Preference

Authors: Jacinta Ifeoma Obidile

Abstract:

Teachers, including vocational and technical educators, use Information and Communications Technology (ICT) for tasks outside their normal working hours. This expansion of work duties into non-work time challenges their work-life balance, yet results on how these factors relate have been inconsistent, calling for further research into the moderating mechanisms of such relationships. The present study therefore ascertained how vocational and technical educators’ technology acceptance relates to their work-related ICT use beyond working hours and to their work-life balance, as well as how their integration preference affects these relationships. The population of the study comprised 320 vocational and technical educators from the Southeast geopolitical zone of Nigeria. Data were collected from the respondents using a structured questionnaire validated by three experts. Reliability was established using 20 vocational and technical educators from the South who were not part of the population; an overall reliability coefficient of 0.81 was obtained using Cronbach’s alpha. The data collected were analyzed using structural equation modeling. Findings revealed, among others, that vocational and technical educators’ work-life balance was mediated by increased digital learning environment use after work hours, although reduced by social influence.

Keywords: vocational and technical educators, digital learning environment, working hours, work-life balance, integration preference

Procedia PDF Downloads 61
16897 Design of Visual Repository, Constraint and Process Modeling Tool Based on Eclipse Plug-Ins

Authors: Rushiraj Heshi, Smriti Bhandari

Abstract:

Master data management requires the creation of a central repository, the application of constraints on that repository, and the design of processes to manage the data. Designing the repository, its constraints, and the business processes is a tedious and time-consuming task for a large enterprise; visual modeling of repositories, constraints, and processes (workflows) is therefore a critical step in master data management. In this paper, we realize a visual modeling tool for implementing repositories, constraints, and processes as Eclipse plug-ins using GMF/EMF, following the principles of Model Driven Engineering (MDE).

Keywords: EMF, GMF, GEF, repository, constraint, process

Procedia PDF Downloads 489
16896 A Systematic Review Of Literature On The Importance Of Cultural Humility In Providing Optimal Palliative Care For All Persons

Authors: Roseanne Sharon Borromeo, Mariana Carvalho, Mariia Karizhenskaia

Abstract:

Healthcare providers need to understand cultural diversity to deliver optimal patient-centered care, especially near the end of life. Although a universal method for navigating cultural differences would be ideal, the high complexity of culture makes such a strategy impossible. Cultural humility, a process of self-reflection aimed at understanding personal and systemic biases and humbly acknowledging oneself as a learner of another’s experience, brings a meaningful process to palliative care, generating respectful, honest, and trustworthy relationships. This study is a systematic review of the literature on cultural humility in palliative care research and best practices. Race, religion, language, values, and beliefs can affect an individual’s access to palliative care, underscoring the importance of culture in this field. Cultural influences shape perceptions of end-of-life care, affecting bereavement rituals, decision-making, and attitudes toward death. Cultural factors affecting the delivery of care identified in a scoping review of the Canadian literature include cultural competency, cultural sensitivity, and cultural accessibility. As different parts of the world become increasingly diverse and multicultural, healthcare providers have been encouraged to give culturally competent care at the bedside, and many organizations have made cultural competence training mandatory to expose professionals to the special needs and vulnerability of diverse populations. Cultural competence is easily standardized, taught, and implemented; however, this theoretically finite form of knowledge can dangerously lead to false assumptions or stereotyping, generating poor communication, loss of bonds and trust, and a poor healthcare provider-patient relationship.
In contrast, cultural humility is a dynamic process that includes self-reflection, personal critique, and growth, allowing healthcare providers to respond to cultural differences with an open mind, curiosity, and the awareness that one is never truly a “cultural” expert, and that overcoming common biases and ingrained societal influences requires lifelong learning. Its core concepts include self-awareness and attention to power imbalances. While being culturally competent requires being skilled and knowledgeable in a culture, being culturally humble involves the sometimes uncomfortable position of healthcare providers as students of the patient. Incorporating cultural humility emphasizes the need to approach end-of-life care with openness and responsiveness to various cultural perspectives; healthcare workers therefore need to embrace lifelong learning about individual beliefs and values concerning suffering, death, and dying. Different approaches to this exist: some adopt strategies for cultural humility that address conflicts and challenges through relational and health-system approaches. In practice and research, clinicians and researchers must embrace cultural humility to advance palliative care, using qualitative methods to capture culturally nuanced experiences. A consistent self-awareness and a desire to understand patients’ beliefs drive the practice of cultural humility; this dynamic process requires practitioners to learn continuously, fostering empathy and understanding. Cultural humility thus enhances palliative care, ensuring it resonates genuinely across cultural backgrounds and enriches patient-provider interactions.

Keywords: cultural competency, cultural diversity, cultural humility, palliative care, self-awareness

Procedia PDF Downloads 60
16895 A Methodology Based on Image Processing and Deep Learning for Automatic Characterization of Graphene Oxide

Authors: Rafael do Amaral Teodoro, Leandro Augusto da Silva

Abstract:

Originated from graphite, graphene is a two-dimensional (2D) material that promises to revolutionize technology in many areas, such as energy, telecommunications, civil construction, aviation, textiles, and medicine. This is possible because its structure, formed by carbon bonds, provides desirable optical, thermal, and mechanical characteristics that interest multiple areas of the market. Thus, several research and development centers are studying manufacturing methods and applications of graphene, efforts that are often hampered by the scarcity of agile and accurate methodologies to characterize the material, that is, to determine its composition, shape, size, and the number of layers and crystals. To address this gap, this study proposes a computational methodology that applies deep learning to identify graphene oxide crystals in order to characterize samples by crystal size. To achieve this, a fully convolutional neural network called U-Net was trained to segment SEM images of graphene oxide. The segmentation generated by the U-Net is refined with a per-class standard deviation technique, which allows crystals to be distinguished with different labels through an object delimitation algorithm. Next, the position, area, perimeter, and lateral measures of each detected crystal are extracted from the images. This information generates a database with the dimensions of the crystals that compose the samples. Finally, graphs are automatically created showing the frequency distributions of crystal area and perimeter. This methodological process achieved a high capacity for segmenting graphene oxide crystals, with accuracy and F-score of 95% and 94%, respectively, over the test set.
Such performance demonstrates a high generalization capacity of the method in crystal segmentation, since it holds under significant changes in image acquisition quality. The measurement of non-overlapping crystals presented an average error of 6% across the different measurement metrics, suggesting that the model provides high-accuracy measurements for non-overlapping segmentations. For overlapping crystals, however, a limitation of the model was identified. To overcome it, the samples to be analyzed must be properly prepared, minimizing crystal overlap during SEM image acquisition and guaranteeing a lower measurement error without greater effort in data handling. All in all, the method developed is a substantial time saver with high measurement value: it can measure hundreds of graphene oxide crystals in seconds, saving weeks of manual work.
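The per-crystal measurement step described above can be illustrated on a synthetic mask. The sketch below assumes a binary segmentation (standing in for the U-Net output) and uses connected-component labeling to extract position, area, and lateral measures; perimeter extraction and the U-Net itself are omitted, and all shapes and values are invented for illustration.

```python
# Hedged sketch of post-segmentation crystal measurement: label
# connected regions in a synthetic binary mask and record per-crystal
# position, area, and lateral extents.
import numpy as np
from scipy import ndimage

# Synthetic segmentation mask standing in for the U-Net output
mask = np.zeros((64, 64), dtype=bool)
mask[5:15, 5:20] = True      # "crystal" 1: 10 x 15 px rectangle
mask[30:50, 30:45] = True    # "crystal" 2: 20 x 15 px rectangle

labels, n_crystals = ndimage.label(mask)
crystals = []
for k, obj_slice in enumerate(ndimage.find_objects(labels), start=1):
    region = labels[obj_slice] == k             # pixels of crystal k
    h, w = region.shape                         # lateral measures (px)
    crystals.append({
        "area_px": int(region.sum()),
        "height_px": h,
        "width_px": w,
        "row": obj_slice[0].start,              # top-left position
        "col": obj_slice[1].start,
    })

print(n_crystals, crystals)
```

In the paper's pipeline, records like these feed the crystal-dimension database from which the area and perimeter frequency distributions are plotted.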

Keywords: characterization, graphene oxide, nanomaterials, U-net, deep learning

Procedia PDF Downloads 157
16894 Effect of Rubber Tyre and Plastic Wastes Use in Asphalt Concrete Pavement

Authors: F. Onyango, Salim R. Wanjala, M. Ndege, L. Masu

Abstract:

Asphalt concrete pavements have a short life cycle, failing mainly due to temperature changes, traffic loading, and ageing. Modified asphalt mixtures provide the technology to produce a bituminous binder with improved viscoelastic properties that remain in balance over a wider range of temperatures and loading conditions. In this research, a 60/70 penetration grade asphalt binder was modified by adding low-density polyethylene (LDPE) at 2, 4, 6, 8, and 10 percent by weight of the binder, following the wet process, while the mineral aggregate was modified by adding crumb rubber at 1, 2, 3, 4, and 5 percent by volume of the aggregate, following the dry process. The rheological properties of the LDPE-modified asphalt binder were evaluated. The laboratory results showed an increase in the viscosity, softening point, and stiffness of the binder. The modified asphalt was then used to prepare asphalt mixtures by the Marshall mix design procedure. The Marshall stability values for mixes containing 2% crumb rubber and 4% LDPE were found to be 30% higher than those of the conventional asphalt concrete mix.

Keywords: crumb rubber, dry process, hot mix asphalt, wet process

Procedia PDF Downloads 364
16893 Levels of Reflection in Engineers EFL Learners: The Path to Content and Language Integrated Learning Implementation in Chilean Higher Education

Authors: Sebastián Olivares Lizana, Marianna Oyanedel González

Abstract:

This study is part of a larger project on implementing a CLIL (Content and Language Integrated Learning) program at Universidad Técnica Federico Santa María, a leading Chilean tertiary institution. It examines the relationship between the development of reflective processes (RP) and cognitive academic language proficiency (CALP) in weekly learning logs written by faculty members participating in an initial professional development online course on English for Academic Purposes (EAP). The course was designed with a genre-based approach and consists of multiple tasks directed at academic writing proficiency. The results of this analysis will be described and classified on a scale of key indicators that represent both the reflective processes and advances in CALP, while also considering linguistic proficiency and task progression. These indicators will evidence the affordances and constraints of using a genre-based approach in implementing an EFL engineering CLIL program at the tertiary level in Chile, and will serve as the starting point for the design of a professional development course on teaching methodologies in a CLIL EFL environment in engineering education at Universidad Técnica Federico Santa María.

Keywords: EFL, EAL, genre, CLIL, engineering

Procedia PDF Downloads 392
16892 Intensive Intercultural English Language Pedagogy among Parents from Culturally and Linguistically Diverse Backgrounds (CALD)

Authors: Ann Dashwood

Abstract:

Using Standard Australian English with confidence is a cultural expectation of parents of primary-school-aged children who want to engage effectively with their children’s teachers and school administration. That confidence in supporting their children’s learning at school is seldom experienced by parents whose first language is not English. Sharing language competently in an intercultural environment is the common denominator for meaningful communication and engagement in a school community, and experience in relevant, interactive sessions is known to enhance engagement and participation. The purpose of this paper is to identify a pedagogy that enables parents otherwise isolated from daily use of functional Australian English and culture to engage effectively in their children’s learning at school. The outcomes measure parents’ intercultural engagement with classroom teachers and attention to the school’s administrative procedures using quantitative and qualitative methods. A principled communicative task-based language learning approach, combined with intercultural communication strategies, provides the theoretical base for intensive English inquiry-based learning and engagement. The quantitative analysis examines data samples collected by classroom teachers and administrators together with parents’ writing samples; interviews and observations inform the study qualitatively. Significant numbers of projects are currently active in community centers and schools to enhance the English language knowledge of parents from Language Backgrounds Other Than English (LBOTE). The study is significant in exploring the effects of an intensive English pedagogy with parents of varied English language backgrounds, targeting inquiry-based language use for social interactions in the school and wider community, and specific engagement and cultural interaction with teachers and with school activities and procedures.

Keywords: engagement, intercultural communication, language teaching pedagogy, LBOTE, school community

Procedia PDF Downloads 118
16891 Examining Language as a Crucial Factor in Determining Academic Performance: A Case of Business Education in Hong Kong

Authors: Chau So Ling

Abstract:

I. INTRODUCTION: Educators have always been interested in exploring the factors that contribute to students’ academic success. It is beyond question that language, as a medium of instruction, affects student learning. This paper investigates whether language is a crucial factor in determining students’ achievement in their studies. II. BACKGROUND AND SIGNIFICANCE OF STUDY: The use of English as a medium of instruction in Hong Kong is a special topic because Hong Kong, a former British colony, is a post-colonial and international city. In such a specific language environment, researchers in the education field have long investigated students’ language proficiency and its relation to academic achievement and related educational indicators such as motivation to learn, self-esteem, learning effectiveness, and self-efficacy. Along this line of thought, this study focused specifically on business education. III. METHODOLOGY: The methodology involved two sequential stages, namely a focus group interview and a data analysis, and the study combined qualitative and quantitative approaches. The subjects were divided into two groups. For the first group, participating in the interview, a total of ten high school students were invited. They studied Business Studies, and their English standards varied. The theme of the discussion was “Does English affect your learning and examination results in Business Studies?” The students were facilitated to discuss the extent to which their English standard affected their learning of Business subjects and were asked to rate the correlation between English and performance in Business Studies on a five-point scale. The second stage of the study involved another group of students: high school graduates who had taken the public examination for entering universities.
A database containing their public examination results for different subjects was obtained for statistical analysis. Hypotheses were tested, and evidence from the focus group interview was used to triangulate the findings. IV. MAJOR FINDINGS AND CONCLUSION: Through the sharing of personal experience, the focus group discussion indicated that a higher English standard could help students achieve better learning and examination performance. To conclude the interview, the students were asked to rate the correlation between English proficiency and performance in Business Studies on a five-point scale. With point one meaning least correlated, ninety percent of the students gave point four. These preliminary results illustrated that English plays an important role in students’ learning of Business Studies, or at least that this was what the students perceived, which set the hypotheses for the study. After the focus group interview, further evidence had to be gathered to support the hypotheses. The data analysis stage examined the relationship by correlating the students’ public examination results in Business Studies with their levels of English proficiency. The results indicated a positive correlation between English standard and Business Studies examination performance. To highlight the importance of the English language to the study of Business Studies, the correlation with the public examination results of other, non-business subjects was also tested. Statistical results showed that language plays a greater role in students’ performance in Business subjects than in the other subjects. Possible explanations include the dynamic subject nature, the examination format and study requirements, and the specialist language used. 
Unlike in Science and Geography, students might find it more difficult to relate business concepts or terminologies to their own experience, and there are few obvious physical or practical activities or visual aids to serve as evidence or experiments. It is well researched in Hong Kong that English proficiency is a determinant of academic success, and other studies have verified this notion. For example, research revealed that the more enriched the language experience, the better the cognitive performance in conceptual tasks; the ability to perform this kind of task is particularly important to students taking Business subjects. Another study, carried out in the UK, was geared towards identifying and analyzing the reasons for underachievement across a cohort of GCSE students taking Business Studies; its results showed that weak language ability was the main barrier to raising students’ performance levels. The interview results were thus successfully triangulated with the data findings. Although educational failure cannot be reduced to linguistic failure, and language is just one of the variables at play in determining academic achievement, it is generally accepted that language does affect students’ academic performance; it is only a matter of extent. This paper provides recommendations for business educators on students’ language training and sheds light on further research possibilities in this area.

Keywords: academic performance, language, learning, medium of instruction

Procedia PDF Downloads 114
16890 Application of Digital Tools for Improving Learning

Authors: José L. Jiménez

Abstract:

The use of technology in the classroom is an issue that is constantly evolving. Digital-age students learn differently than their teachers did, so teachers should constantly evolve their methods and teaching techniques to stay in touch with their students. This paper presents a case study of how some of these technologies were used to accompany a classroom course, in order to provide students with a different and more innovative experience than the way their teacher usually presented the activities. As students worked on the various activities, they increased their digital skills by employing previously unknown tools that helped them in their professional training. The twenty-first-century teacher should consider the use of Information and Communication Technologies in the classroom with the skills that students of the digital age should possess in mind. The paper also takes a brief look at the history of distance education and highlights the importance of integrating technology as part of the student's training.

Keywords: digital tools, on-line learning, social networks, technology

Procedia PDF Downloads 396
16889 Determination of the Economic Planning Depth for Assembly Process Planning

Authors: A. Kampker, P. Burggräf, Y. Bäumers

Abstract:

In order to be competitive, companies have to reduce their production costs while meeting increasing quality requirements. Therefore, companies try to plan their assembly processes in as much detail as possible. However, increasing product individualization, leading to a higher number of variants, smaller batch sizes, and shorter product life cycles, raises the question of to what extent the effort of detailed planning is still justified. An important approach in this field of research is the concept of determining the economic planning depth for assembly process planning based on production-specific influencing factors. In this paper, first solution hypotheses as well as a first draft of the resulting method are presented.

Keywords: assembly process planning, economic planning depth, planning benefit, planning effort

Procedia PDF Downloads 500
16888 Hand Gesture Interpretation Using Sensing Glove Integrated with Machine Learning Algorithms

Authors: Aqsa Ali, Aleem Mushtaq, Attaullah Memon, Monna

Abstract:

In this paper, we present a low-cost design for a smart glove that can perform sign language recognition to assist speech-impaired people. Specifically, we have designed and developed an Assistive Hand Gesture Interpreter that recognizes hand movements relevant to the American Sign Language (ASL) and translates them into text for display on a Thin-Film-Transistor Liquid Crystal Display (TFT LCD) screen as well as into synthetic speech. Linear Bayes Classifiers and Multilayer Neural Networks have been used to classify 11-element feature vectors obtained from the sensors on the glove into one of the 27 ASL alphabet gestures or a predefined gesture for space. Three types of features are used: bending, using six bend sensors; orientation in three dimensions, using accelerometers; and contacts at vital points, using contact sensors. To gauge the performance of the presented design, the training database was prepared using five volunteers. The accuracy of the current version on the prepared dataset was found to be up to 99.3% for the target user. The solution combines electronics, e-textile technology, sensor technology, embedded systems, and machine learning techniques to build a low-cost wearable glove that is accurate, elegant, and portable.
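The linear (Gaussian) Bayes classification step described above can be sketched in a few lines: fit a per-class mean and variance over the sensor feature vectors, then classify by maximum log-likelihood. The synthetic 11-dimensional data below (and the reduced number of classes) are hypothetical stand-ins for the glove's real bend/accelerometer/contact readings:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in data: 11 sensor features (6 bend, 3 accelerometer,
# 2 contact) for a handful of gesture classes; the real glove used 28 classes.
n_classes, n_features, n_per_class = 4, 11, 30
centers = rng.normal(0, 3, size=(n_classes, n_features))
X = np.vstack([c + rng.normal(0, 0.5, (n_per_class, n_features)) for c in centers])
y = np.repeat(np.arange(n_classes), n_per_class)

# Gaussian Bayes: per-class mean and variance, classify by log-likelihood.
class_means = np.array([X[y == c].mean(axis=0) for c in range(n_classes)])
class_vars = np.array([X[y == c].var(axis=0) + 1e-6 for c in range(n_classes)])

def predict(x):
    ll = -0.5 * (np.log(2 * np.pi * class_vars)
                 + (x - class_means) ** 2 / class_vars).sum(axis=1)
    return int(np.argmax(ll))

preds = np.array([predict(x) for x in X])
print(f"training accuracy: {(preds == y).mean():.3f}")
```

With well-separated gesture classes, such a classifier is cheap enough to run on the glove's embedded processor, which is consistent with the low-cost design goal.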

Keywords: American sign language, assistive hand gesture interpreter, human-machine interface, machine learning, sensing glove

Procedia PDF Downloads 295
16887 Defining Methodology for Multi Model Software Process Improvement Framework

Authors: Aedah Abd Rahman

Abstract:

Software organisations may implement single or multiple frameworks in order to remain competitive. There is a wide selection of generic Software Process Improvement (SPI) frameworks, best practices, and standards, implemented with different focuses and goals. Issues and difficulties emerge in SPI practice in the context of software development and IT Service Management (ITSM). This research looks into the integration of multiple frameworks from the perspectives of software development and ITSM. The research question of this study is how to define the steps of a methodology to solve the multi-model software process improvement problem. The objective is to define the research approach and methodologies needed to produce a more integrated and efficient Multi Model Process Improvement (MMPI) solution. A multi-step methodology is used, comprising a case study, framework mapping, and a Delphi study. The research outcome has proven the usefulness and appropriateness of the proposed framework in SPI and quality practice in the Malaysian software industry. This mixed-method research approach is used to tackle problems from every angle in the context of software development and services, and the methodology facilitates the implementation and management of a multi-model environment of SPI frameworks across multiple domains.

Keywords: Delphi study, methodology, multi model software process improvement, service management

Procedia PDF Downloads 257
16886 A Full Factorial Analysis of Microhardness Variation in Bead Welds Deposited by the Process Cold Wire Gas Metal Arc Welding (CW-GMAW)

Authors: R. A. Ribeiro, P. D. Angelo Assunção, E. M. Braga

Abstract:

The microhardness of weld beads is a function of the microstructure obtained in the welding process, which in turn depends on the input variables established at the outset of the process. In this study, the angle between the plate and the cold wire, the position in which the cold wire is introduced, and the rate at which it is fed are assessed as input parameters of the CW-GMAW process. This paper aims to show that ordinary changes in the frame of CW-GMAW can improve microhardness, which is expected to vary as the input parameters change. To properly correlate the changes in the input parameters with the consequent changes in the microhardness of the weld bead, a full factorial design was employed. Indeed, changes in the operational parameters improved the overall microhardness of the weld bead, which in turn can indicate improved resistance to abrasive wear, constituting a cheap way to augment the abrasion wear resistance of welds used for cladding.
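A two-level full factorial design over the three inputs named above can be analysed as sketched below; the coded design matrix and effect estimation are standard, but the microhardness responses are illustrative values, not the paper's measurements:

```python
import itertools
import numpy as np

# Hypothetical 2^3 full factorial over the three CW-GMAW inputs: wire angle
# (A), insertion position (B), and feed rate (C), coded at -1 / +1 levels.
design = np.array(list(itertools.product([-1, 1], repeat=3)), dtype=float)

# Illustrative microhardness responses (HV), one run per design point.
response = np.array([182., 190., 178., 195., 185., 204., 181., 209.])

# Model matrix: intercept, main effects, and two-factor interactions.
A, B, C = design.T
X = np.column_stack([np.ones(8), A, B, C, A * B, A * C, B * C])
coef, *_ = np.linalg.lstsq(X, response, rcond=None)

# In factorial terminology, an "effect" is twice the regression coefficient.
for name, c in zip(["A", "B", "C", "AB", "AC", "BC"], coef[1:]):
    print(f"effect {name}: {2 * c:+.2f} HV")
```

Because the design is orthogonal, each effect is estimated independently, which is what makes the full factorial attractive for ranking the three welding parameters.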

Keywords: abrasion, CW-GMAW, full factorial design, microhardness

Procedia PDF Downloads 542
16885 Investigation of Operational Conditions for Treatment of Industrial Wastewater Contaminated with Pesticides Using Electro-Fenton Process

Authors: Mohamed Gar Alalm

Abstract:

This study aims to investigate the various operating conditions that affect the performance of the electro-Fenton process for the degradation of pesticides. Stainless steel electrodes were utilized in the electro-Fenton cell due to their relatively low cost. The favored conditions of current intensity, pH, iron loading, and pesticide concentration are discussed in depth. Complete removal of the pesticide was attained at the optimum conditions. The degradation kinetics followed a pseudo-first-order pattern. In addition, a response surface model was developed to describe the performance of the electro-Fenton process under different operational conditions; the model fit the data well, with a coefficient of determination of R² = 0.995.
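Fitting a pseudo-first-order rate constant is a simple log-linear regression: C(t) = C₀·exp(−kt) implies ln(C₀/C) = kt. The concentration-time series below is hypothetical, chosen only to illustrate the fit, not taken from the study:

```python
import numpy as np

# Pseudo-first-order kinetics: C(t) = C0 * exp(-k t), so ln(C0 / C) = k t.
# Hypothetical pesticide concentration data over an electro-Fenton run.
t = np.array([0., 10., 20., 30., 40., 50.])       # minutes
C = np.array([50., 30.3, 18.4, 11.1, 6.7, 4.1])   # mg/L

y = np.log(C[0] / C)
# Least-squares slope through the origin gives the apparent rate constant k.
k = (t @ y) / (t @ t)
r2 = 1 - np.sum((y - k * t) ** 2) / np.sum((y - y.mean()) ** 2)
print(f"k = {k:.4f} 1/min, R^2 = {r2:.4f}")
```

A high R² on the linearized data is the usual check that the pseudo-first-order assumption holds before building a response surface model on top of the rate constants.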

Keywords: electro-Fenton, stainless steel, pesticide, wastewater

Procedia PDF Downloads 138
16884 Student Perceptions on Administrative Support in the Delivering of Open Distance Learning Programmes – A Case Study

Authors: E. J. Spamer, J. M. Van Zyl, MHA Combrinck

Abstract:

The Unit for Open Distance Learning (UODL) at the North-West University (NWU), South Africa, was established in 2013, with its main function being to deliver open distance learning (ODL) programmes to approximately 30,000 students from the Faculties of Education Sciences, Health Sciences, Theology, and Arts and Culture. Quality operational and administrative processes are key components in the delivery of these programmes and need to function optimally for students to be successful in their studies. These processes include aspects such as applications, registration, dissemination of study material, availability of electronic platforms, management of assessment, and dissemination of important information. To ensure and enhance quality in these processes, it is vital to determine students’ perceptions of them. The purpose of this research was therefore to determine the perceptions of ODL students at NWU regarding operational and administrative processes. A questionnaire was made available online and also distributed to the 63 tuition centres; 1,903 students completed and submitted it. The data were quantitatively analysed and discussed. The results indicated that the majority of students are satisfied with the operational and administrative processes; however, they also indicated some areas that need improvement. The data gathered are important for identifying strengths and areas for improvement and form part of a bigger strategy of quality assurance at the UODL.

Keywords: administrative support, ODL programmes, quantitative study, students' perceptions

Procedia PDF Downloads 265
16883 Fraud Detection in Credit Cards with Machine Learning

Authors: Anjali Chouksey, Riya Nimje, Jahanvi Saraf

Abstract:

Online transactions have increased dramatically in this new ‘social-distancing’ era, and with them, fraud in online payments has also increased significantly. Fraud is a significant problem in various industries such as insurance and banking. These frauds include leaking sensitive information related to the credit card, which can easily be misused. With governments also pushing online transactions, e-commerce is booming, but due to increasing fraud in online payments, e-commerce businesses are suffering a great loss of trust from their customers and find credit card fraud to be a big problem. As people have started using online payment options, they have become easy targets of credit card fraud. In this research paper, we discuss machine learning algorithms for fraud detection. We have applied a decision tree, XGBoost, k-nearest neighbour, logistic regression, random forest, and SVM to a dataset of online credit card transactions. We test all these algorithms for detecting fraud cases using the confusion matrix and F1 score, and calculate the accuracy score for each model, in order to identify which algorithm is best suited to detecting fraud.
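The evaluation step the abstract describes reduces to counting the four confusion-matrix cells and deriving precision, recall, F1, and accuracy. The binary labels below are hypothetical model outputs, not the paper's data:

```python
import numpy as np

# Hypothetical ground truth (1 = fraud) and one classifier's predictions.
y_true = np.array([0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 0, 0, 1, 0, 1])
y_pred = np.array([0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 1, 0, 0, 1, 0, 1])

# Confusion-matrix cells.
tp = np.sum((y_true == 1) & (y_pred == 1))
fp = np.sum((y_true == 0) & (y_pred == 1))
fn = np.sum((y_true == 1) & (y_pred == 0))
tn = np.sum((y_true == 0) & (y_pred == 0))

precision = tp / (tp + fp)
recall = tp / (tp + fn)
f1 = 2 * precision * recall / (precision + recall)
accuracy = (tp + tn) / len(y_true)

print(f"confusion matrix: [[{tn} {fp}] [{fn} {tp}]]")
print(f"precision={precision:.3f} recall={recall:.3f} f1={f1:.3f} acc={accuracy:.3f}")
```

On heavily imbalanced fraud data, F1 and recall are far more informative than raw accuracy, which is why the abstract reports the confusion matrix and F1 alongside accuracy when ranking the six models.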

Keywords: machine learning, fraud detection, artificial intelligence, decision tree, k nearest neighbour, random forest, XGBOOST, logistic regression, support vector machine

Procedia PDF Downloads 143
16882 State Estimation of a Biotechnological Process Using Extended Kalman Filter and Particle Filter

Authors: R. Simutis, V. Galvanauskas, D. Levisauskas, J. Repsyte, V. Grincas

Abstract:

This paper deals with advanced state estimation algorithms for the estimation of biomass concentration and specific growth rate in a typical fed-batch biotechnological process, represented by a nonlinear mass-balance-based process model. An Extended Kalman Filter (EKF) and a Particle Filter (PF) were used to estimate the unmeasured state variables from oxygen uptake rate (OUR) and base consumption (BC) measurements. To obtain more general results, a simplified process model was used in the EKF and PF estimation algorithms. This model does not require any special growth kinetic equations and can be applied for state estimation in various bioprocesses. The investigation focused on comparing the estimation quality of the EKF and PF estimators under different measurement noises. The simulation results show that the Particle Filter algorithm requires significantly more computation time for state estimation but gives lower estimation errors for both biomass concentration and specific growth rate. The tuning procedure for the Particle Filter is also simpler than for the EKF. Consequently, the Particle Filter should be preferred in real applications, especially for the monitoring of industrial bioprocesses, where simple implementation procedures are always desirable.
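A bootstrap particle filter, the simplest PF variant, can be sketched in a few lines: propagate particles through the process model, weight them by the measurement likelihood, estimate, and resample. The scalar growth dynamics, noise levels, and particle count below are illustrative stand-ins, not the paper's bioprocess model:

```python
import numpy as np

rng = np.random.default_rng(1)

# Scalar state (think: biomass that grows a little each step) observed
# through noisy measurements; all parameters here are illustrative.
n_steps, n_particles = 50, 500
growth, q_std, r_std = 1.02, 0.05, 0.3

true_x = 1.0
particles = rng.normal(1.0, 0.2, n_particles)
errors = []
for _ in range(n_steps):
    true_x *= growth
    z = true_x + rng.normal(0, r_std)                # noisy measurement
    # Propagate particles through the process model with process noise.
    particles = particles * growth + rng.normal(0, q_std, n_particles)
    # Weight by Gaussian measurement likelihood, then normalize.
    w = np.exp(-0.5 * ((z - particles) / r_std) ** 2)
    w /= w.sum()
    estimate = np.sum(w * particles)
    errors.append(abs(estimate - true_x))
    # Multinomial resampling to avoid weight degeneracy.
    particles = particles[rng.choice(n_particles, n_particles, p=w)]

print(f"mean absolute estimation error: {np.mean(errors):.3f}")
```

The per-step cost scales linearly with the particle count, which is the computation-time penalty the abstract notes, while the averaged estimate filters out much of the measurement noise.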

Keywords: biomass concentration, extended Kalman filter, particle filter, state estimation, specific growth rate

Procedia PDF Downloads 420
16881 Development of an EEG-Based Real-Time Emotion Recognition System on Edge AI

Authors: James Rigor Camacho, Wansu Lim

Abstract:

Over the last few years, the development of new wearable and processing technologies has accelerated in order to harness physiological data such as electroencephalograms (EEGs) for EEG-based applications. Among physiological signals, EEG has been demonstrated to be the source of emotion recognition signals with the highest classification accuracy. However, when emotion recognition systems are used for real-time classification, the training unit is frequently left to run offline or in the cloud rather than working locally on the edge. That strategy has hampered research, and the full potential of using an edge AI device has yet to be realized. Edge AI devices are high-performance computers that can collect, process, and store data on their own, and can run complicated algorithms such as localization, detection, and recognition in real-time applications, making them powerful embedded devices. The NVIDIA Jetson series, specifically the Jetson Nano device, was used in the implementation. The cEEGrid, which is integrated with the open-source brain-computer interface platform OpenBCI, is used to collect EEG signals. An EEG-based real-time emotion recognition system on edge AI is proposed in this paper. Machine learning-based classifiers were used to perform graphical spectrogram categorization of EEG signals and to predict emotional states based on input data properties. The EEG signals were analyzed using the K-Nearest Neighbors (KNN) technique, a supervised learning method, until the emotional state was identified. In EEG signal processing, after each EEG signal is received in real time, the Fast Fourier Transform (FFT) is applied to translate it from the time to the frequency domain and observe its frequency bands. To appropriately represent the variance of each EEG frequency band, the power density, standard deviation, and mean are calculated and employed as features. 
The next stage is to use the chosen features to predict emotion in the EEG data with the KNN technique; arousal and valence datasets are used to train the parameters of the KNN classifier. Because classification, recognition of specific classes, and emotion prediction are all conducted online and locally on the edge, the KNN technique improved the performance of the emotion recognition system on the NVIDIA Jetson Nano. Finally, this implementation aims to bridge the research gap on cost-effective and efficient real-time emotion recognition using a resource-constrained hardware device such as the NVIDIA Jetson Nano. EEG-based emotion identification on edge AI can be employed in applications that could rapidly expand research and industrial adoption in this area.
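The FFT-band-feature plus KNN pipeline described above can be sketched as follows; the sampling rate, band edges, synthetic "alpha-dominant vs beta-dominant" signals, and the choice of k are all hypothetical, chosen only to make the pipeline concrete:

```python
import numpy as np

rng = np.random.default_rng(2)
fs, n_samples = 128, 256  # 2-second windows at a hypothetical 128 Hz rate

def band_powers(signal):
    # FFT, then mean power in illustrative alpha (8-13 Hz) and beta
    # (13-30 Hz) bins, plus std and mean of the raw window as extra features.
    spec = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(n_samples, 1 / fs)
    alpha = spec[(freqs >= 8) & (freqs < 13)].mean()
    beta = spec[(freqs >= 13) & (freqs < 30)].mean()
    return np.array([alpha, beta, signal.std(), signal.mean()])

def make_signal(alpha_amp, beta_amp):
    t = np.arange(n_samples) / fs
    return (alpha_amp * np.sin(2 * np.pi * 10 * t)
            + beta_amp * np.sin(2 * np.pi * 20 * t)
            + rng.normal(0, 0.3, n_samples))

# Two synthetic "emotional states": alpha-dominant vs beta-dominant windows.
X = np.array([band_powers(make_signal(1.0, 0.2)) for _ in range(20)]
             + [band_powers(make_signal(0.2, 1.0)) for _ in range(20)])
y = np.array([0] * 20 + [1] * 20)

def knn_predict(x, k=5):
    # Majority vote over the k nearest training feature vectors.
    d = np.linalg.norm(X - x, axis=1)
    votes = y[np.argsort(d)[:k]]
    return int(np.bincount(votes).argmax())

test_x = band_powers(make_signal(1.0, 0.2))
print("predicted state:", knn_predict(test_x))
```

KNN has no training phase beyond storing the feature vectors, which is part of what makes it attractive for on-device inference on a board like the Jetson Nano.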

Keywords: edge AI device, EEG, emotion recognition system, supervised learning algorithm, sensors

Procedia PDF Downloads 103
16880 Deep Learning-Based Classification of 3D CT Scans with Real Clinical Data: Impact of Image Format

Authors: Maryam Fallahpoor, Biswajeet Pradhan

Abstract:

Background: Artificial intelligence (AI) serves as a valuable tool in mitigating the scarcity of the human resources required for the evaluation and categorization of vast quantities of medical imaging data. When AI operates with optimal precision, it minimizes the demand for human interpretation and thereby reduces the burden on radiologists. Among various AI approaches, deep learning (DL) stands out because it obviates the need for feature extraction, a process that can impede classification, especially with intricate datasets. The advent of DL models has ushered in a new era in medical imaging, particularly in the context of COVID-19 detection. Traditional 2D imaging techniques exhibit limitations when applied to volumetric data such as Computed Tomography (CT) scans. Medical images predominantly exist in one of two formats: Neuroimaging Informatics Technology Initiative (NIfTI) and Digital Imaging and Communications in Medicine (DICOM). Purpose: This study aims to employ DL for the classification of COVID-19-infected pulmonary patients versus normal cases based on 3D CT scans, while investigating the impact of image format. Material and Methods: The dataset used for model training and testing consisted of 1,245 patients from IranMehr Hospital. All scans shared a matrix size of 512 × 512, although they exhibited varying slice numbers. Consequently, after loading the DICOM CT scans, image resampling and interpolation were performed to standardize the slice count. All images underwent cropping and resampling, resulting in uniform dimensions of 128 × 128 × 60. Resolution uniformity was achieved through resampling to 1 mm × 1 mm × 1 mm, and image intensities were confined to the range of (−1000, 400) Hounsfield units (HU). For classification purposes, positive pulmonary COVID-19 involvement was labeled 1, while normal images were labeled 0. Subsequently, a U-Net-based lung segmentation module was applied to obtain 3D segmented lung regions. 
The pre-processing stage included normalization, zero-centering, and shuffling. Four distinct 3D CNN models (ResNet152, ResNet50, DenseNet169, and DenseNet201) were employed in this study. Results: The findings revealed that the segmentation technique yielded superior results for DICOM images, which could be attributed to a potential loss of information during the conversion of the original DICOM images to NIfTI format. Notably, ResNet152 and ResNet50 exhibited the highest accuracy at 90.0%, and the same models achieved the best F1 score at 87%. ResNet152 also secured the highest area under the curve (AUC) at 0.932. Regarding sensitivity and specificity, DenseNet201 achieved the highest values, at 93% and 96%, respectively. Conclusion: This study underscores the capacity of deep learning to classify COVID-19 pulmonary involvement using real 3D hospital data. The results underscore the significance of employing DICOM-format 3D CT images alongside appropriate pre-processing techniques when training DL models for COVID-19 detection. This approach enhances the accuracy and reliability of diagnostic systems for COVID-19 detection.
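The intensity pre-processing steps listed above can be sketched as follows: clip the volume to the stated Hounsfield-unit window, scale to [0, 1], and zero-center. The volume shape and HU window follow the abstract (128 × 128 × 60, −1000 to 400 HU), but the random voxel data is a stand-in for a real DICOM series:

```python
import numpy as np

rng = np.random.default_rng(3)
# Stand-in CT volume with raw values outside the window of interest.
volume = rng.integers(-1200, 1500, size=(128, 128, 60)).astype(np.float32)

hu_min, hu_max = -1000.0, 400.0
volume = np.clip(volume, hu_min, hu_max)            # confine to HU window
volume = (volume - hu_min) / (hu_max - hu_min)      # normalize to [0, 1]
volume -= volume.mean()                             # zero-center

print(volume.shape, round(float(volume.mean()), 6))
```

Clipping first removes air and metal extremes so that the subsequent scaling spends the network's input range on the soft-tissue and lung window that matters for COVID-19 involvement.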

Keywords: deep learning, COVID-19 detection, NIfTI format, DICOM format

Procedia PDF Downloads 80
16879 On Driving Forces of Cultural Globalization and its Retroaction: Under the Guidance of Skopos Theory

Authors: Zhai Yujia

Abstract:

No scholar or researcher has ever stepped into this field, though quite a few papers have addressed various topics relevant to cultural and economic globalization separately. Economic globalization is older than cultural globalization. Since the invention of currency, people have had the sense of making money for the purpose of living, supporting their families, or other personal reasons. This strong desire to earn a living is one of the incentives that propelled trade, tourism, and other related economic activities, which provided services within the homeland at first and expanded into the whole world later, as global markets grew and matured. The need for operation impels international communication and interaction. To achieve this, it is vital to recognize other cultures to some degree, including the language, customs, social etiquette, and history of different nations. All of this drives the cultural globalization process. Conversely, it is clear that the development of cultural globalization does accelerate the process of economic globalization in return. Under the guidance of Skopos theory (first proposed by Hans Vermeer; its core principle is that the translation process is determined by its purpose), this paper aims to demonstrate, by thoroughly analyzing its driving forces and retroaction from an overview approach, that cultural globalization is not a process in isolation: it intertwines with economic globalization, and the two push each other to prosper gradually during their development, serving as indispensable parts of the globalization process.

Keywords: cultural globalization, driving forces, retroaction, Skopos theory

Procedia PDF Downloads 151
16878 A Study on the Impacts of Computer Aided Design on the Architectural Design Process

Authors: Halleh Nejadriahi, Kamyar Arab

Abstract:

Computer-aided design (CAD) tools have been extensively used by architects for several decades. CAD has evolved from a simple drafting tool into intelligent architectural software and a powerful means of communication for architects. It plays an essential role in the profession of architecture and is a basic tool for any architectural firm; due to the high demand and competition in the architectural industry, it is not possible for a firm to compete without taking advantage of computer software. The aim of this study is to evaluate the impacts of CAD on the architectural design process from the conceptual level to the final product, particularly in architectural practice. It examines the range of benefits of integrating CAD into the industry and discusses the possible shortcomings limiting architects. The method of this study is qualitative, based on data collected from professionals’ perspectives. The identified benefits and limitations of CAD in the architectural design process will raise professionals’ awareness of the potential of CAD and its proper utilization in the industry, which would result in higher productivity along with better quality in architectural offices.

Keywords: architecture, architectural practice, computer aided design (CAD), design process

Procedia PDF Downloads 354
16877 Using SMS Mobile Technology to Assess the Mastery of Subject Content Knowledge of Science and Mathematics Teachers of Secondary Schools in Tanzania

Authors: Joel S. Mtebe, Aron Kondoro, Mussa M. Kissaka, Elia Kibga

Abstract:

Sub-Saharan Africa is described as having the second fastest growing mobile phone penetration in the world, growing faster than in the United States or the European Union. Mobile phones have provided many opportunities to improve people’s lives in the region, such as in banking, marketing, and entertainment, and in paying various bills such as water, TV, and electricity. However, the potential of using mobile phones to enhance teaching and learning has not been fully explored. This study presents the experience of developing and delivering SMS quiz questions that were used to assess the mastery of subject content knowledge of science and mathematics secondary school teachers in Tanzania. The SMS quizzes were used as a follow-up support mechanism for 500 teachers who participated in a project to upgrade subject content knowledge in science and mathematics. Quizzes of 10 to 15 questions were sent to the teachers each week for 8 weeks, and the results were analyzed using SPSS. The results showed that chemistry and biology had better performance than mathematics and physics. Teachers reported some challenges that led to poor performance, invalid answers, and non-responses; these are presented. This research has several practical implications for those who are implementing or planning to use mobile phones for teaching and learning, especially in rural secondary schools in sub-Saharan Africa.

Keywords: mobile learning, e-learning, educational technologies, SMS, secondary education, assessment

Procedia PDF Downloads 279