Search results for: learning transitions
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7461

5121 Machine Learning Assisted Prediction of Sintered Density of Binary W(MO) Alloys

Authors: Hexiong Liu

Abstract:

Powder metallurgy is the optimal method for the consolidation and preparation of W(Mo) alloys, which exhibit excellent application prospects at high temperatures. The properties of W(Mo) alloys are closely related to the sintered density. However, controlling the sintered density and porosity of these alloys is still challenging. In the past, regulation methods mainly relied on time-consuming and costly trial-and-error experiments. In this study, the sintering data for more than a dozen W(Mo) alloys constituted a small-scale dataset covering both solid-phase and liquid-phase sintering. Simple descriptors were then used to predict the sintered density of W(Mo) alloys based on a descriptor selection strategy and machine learning (ML) methods, where the ML algorithms included least absolute shrinkage and selection operator (Lasso) regression, k-nearest neighbors (k-NN), random forest (RF), and the multi-layer perceptron (MLP). The results showed that the interpretable descriptors extracted by our proposed selection strategy, combined with the MLP neural network, achieved a high prediction accuracy (R > 0.950). When further predicting the sintered density of W(Mo) alloys produced under different sintering processes, the error between predicted and experimental values was less than 0.063, confirming the application potential of the model.
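
As a rough illustration of the modelling setup, the sketch below compares the four named regressors with scikit-learn on a small synthetic stand-in for the sintering dataset; the descriptor names and data are assumptions for illustration, not the study's descriptors.

```python
# A minimal sketch (not the authors' code): cross-validated comparison of the
# four regressors named in the abstract on a small tabular dataset.
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.neighbors import KNeighborsRegressor
from sklearn.ensemble import RandomForestRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Hypothetical descriptors: Mo fraction, sintering temperature (K), hold time (h)
X = rng.uniform([0.0, 1600, 1], [0.5, 2300, 10], size=(40, 3))
y = 0.7 + 0.1 * X[:, 1] / 2300 + 0.02 * X[:, 2] / 10 + rng.normal(0, 0.01, 40)

models = {
    "Lasso": Lasso(alpha=0.01),
    "k-NN": KNeighborsRegressor(n_neighbors=3),
    "RF": RandomForestRegressor(n_estimators=200, random_state=0),
    "MLP": MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=5000, random_state=0),
}
for name, model in models.items():
    pipe = make_pipeline(StandardScaler(), model)   # scale features before fitting
    scores = cross_val_score(pipe, X, y, cv=5, scoring="r2")
    print(f"{name}: mean CV R^2 = {scores.mean():.3f}")
```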

Keywords: sintered density, machine learning, interpretable descriptors, W(Mo) alloy

Procedia PDF Downloads 79
5120 Modeling and Mapping of Soil Erosion Risk Using Geographic Information Systems, Remote Sensing, and Deep Learning Algorithms: Case of the Oued Mikkes Watershed, Morocco

Authors: My Hachem Aouragh, Hind Ragragui, Abdellah El-Hmaidi, Ali Essahlaoui, Abdelhadi El Ouali

Abstract:

This study investigates soil erosion susceptibility in the Oued Mikkes watershed, located in the Meknes-Fez region of northern Morocco, utilizing advanced techniques such as deep learning algorithms and remote sensing integrated within Geographic Information Systems (GIS). Spanning approximately 1,920 km², the watershed is characterized by a semi-arid Mediterranean climate with irregular rainfall and limited water resources. The waterways within the watershed, especially the Oued Mikkes, are vital for agricultural irrigation and potable water supply. The research assesses the extent of erosion risk upstream of the Sidi Chahed dam while developing a spatial model of soil loss. Several important factors, including topography, land use/land cover, and climate, were analyzed, with data on slope, NDVI, and rainfall erosivity processed using deep learning models (DLNN, CNN, RNN). The results demonstrated excellent predictive performance, with AUC values of 0.92, 0.90, and 0.88 for DLNN, CNN, and RNN, respectively. The resulting susceptibility maps provide critical insights for soil management and conservation strategies, identifying 24% of the study area as being at high risk of erosion. The highest-risk areas are concentrated on steep slopes, particularly near the Ifrane district and the surrounding mountains, while low-risk areas are located in flatter regions with less rugged topography. The combined use of remote sensing and deep learning offers a powerful tool for accurate erosion risk assessment and resource management in the Mikkes watershed, highlighting the implications of soil erosion for dam siltation and operational efficiency.
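
The AUC-based model comparison reported above can be illustrated with a minimal scikit-learn sketch; a generic feed-forward classifier stands in for the DLNN/CNN/RNN models, and the per-pixel factor samples are synthetic assumptions.

```python
# A minimal sketch of AUC evaluation for an erosion-susceptibility classifier;
# not the study's models or data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
# Each row is one sampled pixel: [slope, NDVI, rainfall erosivity]
X = rng.normal(size=(5000, 3))
y = (0.9 * X[:, 0] - 0.6 * X[:, 1] + 0.4 * X[:, 2] + rng.normal(0, 1, 5000)) > 0.5

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)
clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=1)
clf.fit(X_tr, y_tr)
auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
print(f"AUC = {auc:.2f}")  # the abstract reports 0.92/0.90/0.88 for DLNN/CNN/RNN
```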

Keywords: soil erosion, GIS, remote sensing, deep learning, Mikkes Watershed, Morocco

Procedia PDF Downloads 15
5119 The Impact of Information and Communication Technology in Education: Opportunities and Challenges

Authors: M. Nadeem, S. Nasir, K. A. Moazzam, R. Kashif

Abstract:

The remarkable growth and evolution of information and communication technology (ICT) in the past few decades has transformed modern society in almost every aspect of life. The impact and application of ICT can be observed in almost all walks of life, including science, arts, business, health, management, engineering, sports, and education. ICT in education is used extensively for student learning, creativity, interaction, and knowledge sharing, and as a valuable teaching instrument. Apart from the student's perspective, it plays a vital role in teacher education, instructional methods, and curriculum development. There is a significant difference in the growth of ICT-enabled education in developing countries compared to developed nations, and according to research, this gap is widening. ICT has gradually infiltrated almost every aspect of life. It has a deep and profound impact on our social, economic, health, environmental, developmental, work, learning, and educational environments. ICT provides very effective and powerful tools for information and knowledge processing. It is firmly believed that the coming generation should be proficient and confident in the use of ICT to cope with existing international standards. This is only possible if schools can provide basic ICT infrastructure to students and develop an ICT-integrated curriculum that covers all aspects of learning and creativity in students. However, there is a digital divide, and steps must be taken to reduce it considerably for ICT to have a profound impact on education all around the globe. This study is based on a theoretical approach, and an extensive literature review was conducted to examine successful implementations of ICT integration in education and to identify technologies and models that have been used in education in developed countries. This paper deals with the modern applications of ICT in schools for both teachers and students to uplift learning and creativity amongst students. A brief history of technology in education is presented, and some important ICT tools are discussed from both the student's and the teacher's perspectives. Basic ICT-based infrastructure for academic institutions is presented. The overall conclusion points to the positive impact of ICT in education in providing an interactive, collaborative, and challenging environment for students and teachers for knowledge sharing, learning, and critical thinking.

Keywords: information and communication technology, ICT, education, ICT infrastructure, learning

Procedia PDF Downloads 121
5118 Digital Literacy Transformation and Implications in Institutions of Higher Learning in Kenya

Authors: Emily Cherono Sawe, Elisha Ondieki Makori

Abstract:

Knowledge and digital economies have brought challenges and potential opportunities for universities to innovate and improve the quality of learning. Disruptive technologies and information dynamics continue to transform and change the landscape of teaching, scholarship, and research activities across universities. Digital literacy is a fundamental and imperative element in higher education and training, as witnessed during the new normal. COVID-19 caused unprecedented disruption in universities, where teaching and learning depended on digital innovations and applications. Academic services and activities were provided online, including library information services. Information professionals were forced to adopt various digital platforms in order to provide information services to patrons. University libraries' roles in fulfilling educational responsibilities continue to evolve in response to changes in pedagogy, technology, economy, society, policies, and strategies of parent institutions. Libraries are currently undergoing considerable transformational change as a result of the inclusion of a digital environment. Academic libraries have been at the forefront of providing online learning resources and online information services, as well as supporting students and staff in developing digital literacy skills via online courses, tutorials, and workshops. Digital literacy transformation and information staff are crucial elements in the prioritization of skills and knowledge for lifelong learning. The purpose of this baseline research is to assess the implications of digital literacy transformation in institutions of higher learning in Kenya and share appropriate strategies to leverage and sustain teaching and research. Objectives include examining the leverage and preparedness of the digital literacy environment in streamlining learning in the universities, exploring and benchmarking imperative digital competences for information professionals, establishing the perception of information professionals towards digital literacy skills, and determining lessons, best practices, and strategies to accelerate digital literacy transformation for effective research and learning in the universities. The study will adopt a descriptive research design using questionnaires and document analysis as the instruments for data collection. The target population is librarians and information professionals, as well as academics in public and private universities teaching information literacy programmes. Data and information will be collected through an online structured questionnaire and digital face-to-face interviews. Findings and results will provide promising lessons together with best practices and strategies to transform and change digital literacies in university libraries in Kenya.

Keywords: digital literacy, digital innovations, information professionals, librarians, higher education, university libraries, digital information literacy

Procedia PDF Downloads 91
5117 Teaching Academic Writing for Publication: A Liminal Threshold Experience Towards Development of Scholarly Identity

Authors: Belinda du Plooy, Ruth Albertyn, Christel Troskie-De Bruin, Ella Belcher

Abstract:

In the academy, scholarliness or intellectual craftsmanship is considered the highest level of achievement, culminating in consistent, successful publication in impactful, peer-reviewed journals and books. Scholarliness implies rigorous methods, systematic exposition, in-depth analysis and evaluation, and the highest level of critical engagement and reflexivity. However, being a scholar does not happen automatically when one becomes an academic or completes graduate studies. A graduate qualification is an indication of one's level of research competence but does not necessarily prepare one for the type of scholarly writing for publication required after a postgraduate qualification has been conferred. Scholarly writing for publication requires a high-level skillset and a specific mindset, which must be intentionally developed. The rite of passage to becoming a scholar is an iterative process with liminal spaces, thresholds, transitions, and transformations. The journey from researcher to published author is often fraught with rejection, insecurity, and disappointment and requires resilience and tenacity from those who eventually triumph. It cannot be achieved without support, guidance, and mentorship. In this article, the authors use collective auto-ethnography (CAE) to describe the phases and types of liminality encountered during the liminal journey toward scholarship. The authors speak as long-time facilitators of Writing for Academic Publication (WfAP) capacity development events (training workshops and writing retreats) presented at South African universities. Their WfAP facilitation practice is structured around experiential learning principles that allow them to act as critical reading partners and reflective witnesses for the writer-participants of their WfAP events. They identify three essential facilitation features for effectively holding a generative, liminal, and transformational writing space for novice academic writers, enabling their safe passage through the various liminal spaces they encounter during their scholarly development journey. These features are that facilitators should be agents of disruption and liminality while also guiding writers through these liminal spaces; that there should be a sense of mutual trust and respect, shared responsibility, and accountability in order for writers to produce publication-worthy scholarly work; and that this can only be accomplished with the continued application of high levels of sensitivity and discernment by WfAP facilitators. These are key features of successful WfAP scholarship training events, where focused, individual input triggers personal and professional transformational experiences, which in turn translate into high-quality scholarly outputs.

Keywords: academic writing, liminality, scholarship, scholarliness, threshold experience, writing for publication

Procedia PDF Downloads 42
5116 Probing Syntax Information in Word Representations with Deep Metric Learning

Authors: Bowen Ding, Yihao Kuang

Abstract:

In recent years, with the development of large-scale pre-trained language models, building vector representations of text with deep neural network models has become standard practice for natural language processing tasks. From performance on downstream tasks, we know that the text representations constructed by these models contain linguistic information, but the mode and extent of its encoding are unclear. In this work, a structural probe is proposed to detect whether the vector representation produced by a deep neural network embeds a syntax tree. The probe is trained with a deep metric learning method so that the distance between word vectors in the metric space it defines encodes the distance between words on the syntax tree, and the norm of a word vector encodes the depth of the word in the syntax tree. Experimental results on ELMo and BERT show that the syntax tree is encoded in their parameters and in the word representations they produce.
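
A minimal sketch of a distance-style structural probe of the kind described, assuming PyTorch; the random vectors and distances below stand in for ELMo/BERT representations and gold parse-tree distances, and the companion depth probe on vector norms is analogous and omitted.

```python
# A minimal sketch, not the paper's implementation: learn a linear map B such
# that squared L2 distances between projected word vectors approximate
# syntax-tree distances.
import torch

dim, rank, n_words = 768, 64, 12
B = torch.nn.Parameter(torch.randn(rank, dim) * 0.01)
opt = torch.optim.Adam([B], lr=1e-3)

h = torch.randn(n_words, dim)                  # contextual word vectors (stand-in)
tree_dist = torch.randint(1, 8, (n_words, n_words)).float()
tree_dist = (tree_dist + tree_dist.T) / 2      # symmetric gold tree distances
tree_dist.fill_diagonal_(0)

for step in range(200):
    z = h @ B.T                                # project into the probe's metric space
    diff = z.unsqueeze(0) - z.unsqueeze(1)
    pred = (diff ** 2).sum(-1)                 # squared distances between all pairs
    loss = torch.abs(pred - tree_dist).mean()  # match predicted to tree distances
    opt.zero_grad()
    loss.backward()
    opt.step()
print(f"final probe loss: {loss.item():.3f}")
```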

Keywords: deep metric learning, syntax tree probing, natural language processing, word representations

Procedia PDF Downloads 64
5115 Designing Automated Embedded Assessment to Assess Student Learning in a 3D Educational Video Game

Authors: Mehmet Oren, Susan Pedersen, Sevket C. Cetin

Abstract:

Despite the frequently criticized disadvantages of traditional paper-and-pencil assessment, it remains the most frequently used method in our schools. Although such assessments provide acceptable measurement, they are not capable of measuring all the aspects and richness of learning and knowledge. Also, many assessments used in schools decontextualize assessment from learning; they focus on a learner's standing on a particular topic but do not capture how student learning changes over time. For these reasons, many scholars argue that using simulations and games (S&G) as assessment tools has significant potential to overcome the problems of traditional methods. S&G can benefit from changes in technology and provide a contextualized medium for assessment and teaching. Furthermore, S&G can serve as an instructional tool rather than merely a method to test students' learning at a particular point in time. To investigate the potential of using educational games as an assessment and teaching tool, this study presents the implementation and validation of an automated embedded assessment (AEA), which can constantly monitor student learning in the game and assess performance without interrupting learning. The experiment was conducted in an undergraduate engineering course (Digital Circuit Design) with 99 participating students over a period of five weeks in the Spring 2016 semester. The purpose of this research study is to examine whether the proposed AEA method is valid for assessing student learning in a 3D educational game and to present the implementation steps. To address this question, this study inspects three aspects of the AEA for validation. First, the evidence-centered design model was used to lay out the design and measurement steps of the assessment. Then, a confirmatory factor analysis was conducted to test whether the assessment measures the targeted latent constructs. Finally, the assessment scores were compared with an external measure (a validated test of student learning in digital circuit design) to evaluate the convergent validity of the assessment. The confirmatory factor analysis showed that the fit of the model with three latent factors and one higher-order factor was acceptable (RMSEA ≈ 0.00, CFI = 1, TLI = 1.013, WRMR = 0.390). All of the observed variables loaded significantly onto the latent factors in the latent factor model. In the second analysis, a multiple regression analysis was used to test whether the external measure significantly predicts students' performance in the game. The regression results indicated that the two predictors explained 36.3% of the variance (R² = .36, F(2, 96) = 27.42, p < .001). Students' posttest scores significantly predicted game performance (β = .60, p < .001). The statistical results of the analyses show that the AEA can distinctly measure three major components of the digital circuit design course. It is hoped that this study helps researchers understand how to design an AEA and showcases an implementation by providing an example methodology for validating this type of assessment.
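
The convergent-validity step, regressing game performance on the external measure, can be sketched with statsmodels; the variable names and synthetic data below are illustrative assumptions, not the study's data.

```python
# A minimal sketch of a multiple regression predicting game performance from
# external test scores; synthetic stand-in data only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 99                                    # participants, as in the study
posttest = rng.normal(70, 10, n)          # external validated test score
pretest = rng.normal(60, 10, n)
game_perf = 0.6 * posttest + 0.1 * pretest + rng.normal(0, 8, n)

X = sm.add_constant(np.column_stack([posttest, pretest]))
model = sm.OLS(game_perf, X).fit()
print(model.summary())                    # reports R^2, F(2, 96), and betas
```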

Keywords: educational video games, automated embedded assessment, assessment validation, game-based assessment, assessment design

Procedia PDF Downloads 420
5114 Assessing Performance of Data Augmentation Techniques for a Convolutional Network Trained for Recognizing Humans in Drone Images

Authors: Masood Varshosaz, Kamyar Hasanpour

Abstract:

In recent years, there has been growing interest in recognizing humans in drone images for post-disaster search and rescue operations. Deep learning algorithms have shown great promise in this area, but they often require large amounts of labeled data to train the models. To keep the data acquisition cost low, augmentation techniques can be used to create additional data from existing images. Many such techniques can generate variations of an original image to improve the performance of deep learning algorithms. While data augmentation is generally assumed to improve the accuracy and robustness of models, it is important to ensure that the performance gains are not outweighed by the additional computational cost or complexity of implementing the techniques. To this end, it is important to evaluate the impact of data augmentation on the performance of deep learning models. In this paper, we evaluated the most common currently available 2D data augmentation techniques on a standard convolutional network trained to recognize humans in drone images. The techniques include rotation, scaling, random cropping, flipping, shifting, and their combinations. The results showed that the augmented models perform 1-3% better than the base network. However, as the augmented images only contain the human parts already visible in the original images, a new data augmentation approach is needed to include the invisible parts of the human body. We therefore suggest a new method that employs simulated 3D human models to generate new data for training the network.
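
A minimal sketch of the evaluated augmentation pipeline, assuming torchvision; the parameter ranges are illustrative guesses, not the paper's settings.

```python
# A minimal sketch of the five named 2D augmentations; not the paper's code.
from torchvision import transforms

augment = transforms.Compose([
    transforms.RandomRotation(degrees=15),                     # rotation
    transforms.RandomResizedCrop(224, scale=(0.8, 1.0)),       # scaling + random crop
    transforms.RandomHorizontalFlip(p=0.5),                    # flipping
    transforms.RandomAffine(degrees=0, translate=(0.1, 0.1)),  # shifting
    transforms.ToTensor(),
])
# Applied on the fly during training, e.g.:
# dataset = torchvision.datasets.ImageFolder("drone_images/", transform=augment)
```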

Keywords: human recognition, deep learning, drones, disaster mitigation

Procedia PDF Downloads 91
5113 Constructivist Design Approaches to Video Production for Distance Education in Business and Economics

Authors: C. von Essen

Abstract:

This study outlines and evaluates a constructivist design approach to the creation of educational video on postgraduate business degree programmes. Many online courses are tapping into the educational affordances of video, as this form of online learning has the potential to create rich, multimodal experiences. And yet, in many learning contexts, video is still being used to transmit instruction to passive learners rather than to promote learner engagement and knowledge creation. Constructivism posits that learning is shaped as students make connections between their experiences and ideas. This paper pivots on the following research question: how can we design educational video in ways that promote constructivist learning and stimulate analytic viewing? By exploring and categorizing over two thousand educational videos created since 2014 for over thirty postgraduate courses in business, economics, mathematics, and statistics, this paper presents and critically reflects on a taxonomy of video styles and features. It links the pedagogical intent of video (be it concept explanation, skill demonstration, feedback, real-world application of ideas, community creation, or the cultivation of course narrative) to specific presentational characteristics such as visual effects, including diagrammatic and real-life graphics and animations, commentary and sound options, chronological sequencing, interactive elements, and presenter set-up. The findings of this study inform a framework that captures the pedagogical, technological, and production considerations instructional designers and educational media specialists should be conscious of when planning and preparing video. More broadly, the paper demonstrates how learning theory and technology can coalesce to produce informed and pedagogically grounded instructional design choices. This paper reveals how crafting video in a more conscious and critical manner can produce powerful new educational designs.

Keywords: educational video, constructivism, instructional design, business education

Procedia PDF Downloads 236
5112 Spontaneous Message Detection of Annoying Situation in Community Networks Using Mining Algorithm

Authors: P. Senthil Kumari

Abstract:

Major concerns in data mining research include handling ambiguity, noise, and incompleteness in text data. We describe an innovative classification-based approach for detecting spontaneous (unplanned) text messages in community networks. A tangible application, in which a community network with modest privacy safeguards seeks to filter annoying content, is presented for partitioning consumer messages. The mining methodology provides the capability to directly filter such messages and likewise improve the quality of their ordering. We adopt learning-centered mining approaches combined with a pre-processing technique to accomplish this. Our work deals with rule-based personalization for automatic text categorization, which is appropriate in many different frameworks and offers a tolerance value that permits comments to be handled according to a variety of conditions associated with the policy or rule arrangements processed by the learning algorithm. Remarkably, we find that the chosen classifier predicts class labels for controlling inappropriate documents on the community network with great effect.
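
As a generic illustration of learning-based text categorization of community messages (not the paper's exact algorithm), a minimal scikit-learn sketch follows; the tiny training set is an assumption for demonstration only.

```python
# A minimal sketch: TF-IDF features plus a naive Bayes classifier for flagging
# annoying messages; illustrative data, not the paper's dataset.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

train_msgs = ["great meetup thanks all", "click here to win money now",
              "schedule for tomorrow posted", "you are an idiot"]
labels = ["ok", "annoying", "ok", "annoying"]

clf = make_pipeline(TfidfVectorizer(lowercase=True, stop_words="english"),
                    MultinomialNB())
clf.fit(train_msgs, labels)
print(clf.predict(["win free money today"]))  # expected: ['annoying']
```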

Keywords: text mining, data classification, community network, learning algorithm

Procedia PDF Downloads 508
5111 Working with Interpreters: Using Role Play to Teach Social Work Students

Authors: Yuet Wah Echo Yeung

Abstract:

Working with people from minority ethnic groups, refugees, and asylum-seeking communities who have limited proficiency in the language of the host country often presents a major challenge for social workers. Because of language differences, social workers need to work with interpreters to ensure accurate information is collected for their assessment and intervention. Drawing on social learning theory, this paper discusses how role play was used as an experiential learning exercise in a training session to help social work students develop skills for working with interpreters. Social learning theory posits that learning is a cognitive process that takes place in a social context when people observe, imitate, and model others' behaviours. The role play also helped students understand the role of the interpreter and the challenges they may face when they rely on interpreters to communicate with service users and their families. The first part of the session involved role play. A tutor played the role of a social worker and deliberately behaved in an unprofessional manner and used inappropriate body language when working alongside the interpreter during a home visit. The purpose of the role play is not to provide a positive role model for students to imitate. Rather, it aims to activate and provoke internal thinking processes and encourage students to critically consider the impacts of poor practice on relationship building and the intervention process. Having critically reflected on the implications of poor practice, students were then asked to play the role of the social worker and demonstrate what good practice should look like. At the end of the session, students remarked that they learnt a lot by observing good and bad examples; it showed them what not to do. The exercise served to remind students how easily practitioners can slip into bad habits and of the importance of respecting cultural differences when working with people from different cultural backgrounds.

Keywords: role play, social learning theory, social work practice, working with interpreters

Procedia PDF Downloads 179
5110 The Reflections of the K-12 English Language Teachers on the Implementation of the K-12 Basic Education Program in the Philippines

Authors: Dennis Infante

Abstract:

This paper examined the reflections of teachers on curriculum reform, specifically the implementation of the K-12 Basic Education Program in the Philippines. The results revealed that the problems and concerns raised by teachers could be classified into curriculum materials and design; competence, readiness, and motivation of teachers; the learning environment and support systems; readiness, competence, and motivation of students; and other relevant factors. The best features of the K-12 curriculum reforms included (1) the components and curriculum materials; (2) the design, structure, and delivery of the lessons; (3) the framework and theoretical approach; (4) the qualities of the teaching-learning activities; and (5) other relevant features. Given the demanding task of implementing the new curriculum, the teachers expressed their needs, which included (1) making the curriculum materials available to achieve the goals of the curriculum reforms; (2) enriching the learning environments; (3) motivating and encouraging teachers to embrace change; (4) providing appropriate support systems; (5) re-tooling and empowering teachers to implement the curriculum reforms; and (6) other relevant factors. The research concludes with a synthesis that provides a paradigm for implementing curriculum reforms which recognizes the needs of teachers and the features of the new curriculum.

Keywords: curriculum reforms, K-12, teachers' reflections, implementing curriculum change

Procedia PDF Downloads 276
5109 Assessing the Corporate Identity of Malaysia Universities in the East Coast Region with the Market Conditions in Ensuring Self-Sustainability: A Study on Universiti Sultan Zainal Abidin

Authors: Suffian Hadi Ayub, Mohammad Rezal Hamzah, Nor Hafizah Abdullah, Sharipah Nur Mursalina Syed Azmy, Hishamuddin Salim

Abstract:

The liberalisation of the education industry has exposed institutes of higher learning (IHL) in Malaysia to financial challenges. Without good financial standing, public institutions must rely on government funding. Ostensibly, this contradicts the government's aspiration to make universities self-sufficient. With stiff competition from private institutes of higher learning, IHLs need to be prepared at the forefront. The corporate identity itself is the entrance to the world of higher learning, and it is through this uniqueness that a university can distinguish itself from competitors. This paper examined the perceptions of stakeholders at one of the public universities in the east coast region of Malaysia on the university's reputation and on how the university communicates its preparedness for self-sustainability through corporate identity. The findings indicated that while the stakeholders embraced the challenges of stiff competition and struggling market conditions, most of them felt the university should put more effort into mobilising its corporate identity to its constituencies.

Keywords: communication, corporate identity, market conditions, universities

Procedia PDF Downloads 313
5108 Machine Learning Invariants to Detect Anomalies in Secure Water Treatment

Authors: Jonathan Heng, Yoong Cheah Huei

Abstract:

A strategic model that does not trigger any false alarms while detecting anomalies in the Secure Water Treatment (SWaT) test bed is presented. This model uses machine learning invariants formulated by streamlining the general form of auto-regressive models with exogenous input (ARX). A generalized CUSUM algorithm integrating the invariants with the detection strategy is developed and successfully tested on the SWaT programmable logic controllers (PLCs). Three steps to fine-tune the parameters b and τ in the generalized algorithm are stated, and an example demonstrating the tuning process is discussed. This approach can swiftly and effectively detect cyber-attacks of various scopes, such as multiple-point single-stage and multiple-point multi-stage attacks, in SWaT. The technique can also be applied in water treatment plants and other cyber-physical systems such as power and gas plants.
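
A minimal generic sketch of a CUSUM-style detector over model residuals, illustrating the roles of the tuning parameters b and τ; this is an illustration under assumptions, not the authors' generalized algorithm.

```python
# A minimal sketch: CUSUM over residuals of an ARX-type model. b absorbs
# normal residual noise; tau sets the alarm threshold.
import numpy as np

def cusum_alarm(residuals, b, tau):
    """Return the first time step at which the drift statistic exceeds tau."""
    s = 0.0
    for t, r in enumerate(residuals):
        s = max(0.0, s + abs(r) - b)   # accumulate drift above the allowance b
        if s > tau:
            return t
    return None

rng = np.random.default_rng(3)
res = rng.normal(0, 0.1, 500)          # residuals under normal operation
res[300:] += 0.5                       # injected anomaly (e.g., a cyber-attack)
print("alarm at step:", cusum_alarm(res, b=0.15, tau=2.0))
```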

Keywords: machine learning invariants, generalized CUSUM algorithm with invariants and detection strategy, scope of cyber attacks, strategic model, tuning parameters

Procedia PDF Downloads 179
5107 Promoting Non-Formal Learning Mobility in the Field of Youth

Authors: Juha Kettunen

Abstract:

The purpose of this study is to develop a framework for the assessment of research and development projects. The assessment map is developed in this study based on the strategy map of the balanced scorecard approach. The assessment map is applied in a project that aims to reduce the inequality and risk of exclusion of young people from disadvantaged social groups. The assessment map denotes that not only funding but also necessary skills and qualifications should be carefully assessed in the implementation of the project plans so as to achieve the objectives of projects and the desired impact. The results of this study are useful for those who want to develop the implementation of the Erasmus+ Programme and the project teams of research and development projects.

Keywords: non-formal learning, youth work, social inclusion, innovation

Procedia PDF Downloads 294
5106 Satisfaction Among Preclinical Medical Students with Low-Fidelity Simulation-Based Learning

Authors: Shilpa Murthy, Hazlina Binti Abu Bakar, Juliet Mathew, Chandrashekhar Thummala Hlly Sreerama Reddy, Pathiyil Ravi Shankar

Abstract:

Simulation is defined as a technique that replaces or expands real experiences with guided experiences that interactively imitate real-world processes or systems. Simulation enables learners to train in a safe and non-threatening environment. For decades, simulation has been considered an integral part of clinical teaching and learning strategy in medical education. Several types of simulation are used in medical education and the clinical environment, including full-body mannequins, task trainers, standardized simulated patients, virtual or computer-generated simulation, and hybrid simulation, all of which can be used to facilitate learning. Simulation allows healthcare practitioners to acquire skills and experience while safeguarding patient safety. The recent COVID pandemic also led to an increase in simulation use, as there were limitations on medical student placements in hospitals and clinics. Learning is tailored to the educational needs of students to make the learning experience more valuable. Simulation in the pre-clinical years faces challenges with resource constraints, effective curricular integration, student engagement and motivation, and evidence of educational impact, to mention a few. As instructors, we may rely heavily on simulation for pre-clinical students, while students' confidence levels and perceived competence remain to be evaluated. Our research question was whether the implementation of simulation-based learning positively influences preclinical medical students' confidence levels and perceived competence. This study was done to align teaching activities with the students' learning experience, to introduce more low-fidelity simulation-based teaching sessions in the pre-clinical years, and to obtain students' input into curriculum development as part of inclusivity. The study was carried out at the International Medical University, involving pre-clinical year (medical) students who began low-fidelity simulation-based medical education in their first semester and were gradually introduced to medium fidelity as well. The Student Satisfaction and Self-Confidence in Learning Scale questionnaire from the National League for Nursing was employed to collect responses. The internal consistency reliability of the survey items was tested with Cronbach's alpha using an Excel file. IBM SPSS for Windows version 28.0 was used to analyze the data. Spearman's rank correlation was used to analyze the correlation between students' satisfaction and self-confidence in learning. The significance level was set at a p-value of less than 0.05. The results from this study have prompted the researchers to undertake a larger-scale evaluation, which is currently underway. The current results show that 70% of students agreed that the teaching methods used in the simulation were helpful and effective. The sessions depend on the learning materials provided and on how the facilitators engage the students and make the session more enjoyable. The feedback highlighted the following areas to focus on when designing simulations for pre-clinical students: quality learning materials, an interactive environment, motivating content, the skills and knowledge of the facilitator, and effective feedback.
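
The two reported analyses, Cronbach's alpha and Spearman's rank correlation, can be sketched as follows; the response matrix is a synthetic stand-in for the survey data.

```python
# A minimal sketch of the reported reliability and correlation analyses;
# synthetic Likert responses only.
import numpy as np
from scipy.stats import spearmanr

def cronbach_alpha(items):
    items = np.asarray(items, dtype=float)   # rows = respondents, cols = items
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

rng = np.random.default_rng(4)
base = rng.integers(3, 6, size=(100, 1))                 # shared response tendency
responses = np.clip(base + rng.integers(-1, 2, size=(100, 5)), 1, 5)

satisfaction = responses[:, :3].mean(axis=1)
confidence = responses[:, 3:].mean(axis=1)
rho, p = spearmanr(satisfaction, confidence)             # rank correlation
print(f"alpha={cronbach_alpha(responses):.2f}, rho={rho:.2f}, p={p:.3f}")
```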

Keywords: low-fidelity simulation, pre-clinical simulation, students satisfaction, self-confidence

Procedia PDF Downloads 75
5105 Partial Knowledge Transfer Between the Source Problem and the Target Problem in Genetic Algorithms

Authors: Terence Soule, Tami Al Ghamdi

Abstract:

To study how partial knowledge transfer may affect Genetic Algorithm (GA) performance, we model the Transfer Learning (TL) process using a GA as the model solver. The objective of TL is to transfer knowledge from one problem to another related problem, a process that imitates how humans think in their daily lives. In this paper, we study a case where the knowledge transferred from the source (S) problem contains less information than the target (T) problem needs. We sampled the transferred population using different TL strategies. The results showed that transferring part of the knowledge is helpful and speeds up the GA's search for a solution to the problem.
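
A minimal sketch of the seeding idea: part of the GA's initial population is copied from source-problem solutions that carry only partial information about the target. The fitness function, encoding, and parameters below are illustrative assumptions, not the paper's experimental setup.

```python
# A minimal sketch of a GA whose initial population mixes transferred source
# solutions with random individuals; OneMax is a stand-in target problem.
import random

GENES, POP, TRANSFER_FRACTION = 20, 40, 0.25

def fitness(ind):
    return sum(ind)                       # target: maximize number of 1-bits

# Source knowledge solves only the first half of the genome (partial transfer)
source_solutions = [[1] * 10 + [0] * 10 for _ in range(POP)]

def seeded_population():
    n_transfer = int(POP * TRANSFER_FRACTION)
    pop = [list(s) for s in random.sample(source_solutions, n_transfer)]
    pop += [[random.randint(0, 1) for _ in range(GENES)]
            for _ in range(POP - n_transfer)]
    return pop

def evolve(pop, generations=50):
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: POP // 2]                     # truncation selection
        children = []
        for _ in range(POP - len(parents)):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, GENES)
            child = a[:cut] + b[cut:]                 # one-point crossover
            i = random.randrange(GENES)
            child[i] ^= random.random() < 0.1         # occasional bit-flip mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

print("best fitness:", fitness(evolve(seeded_population())))
```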

Keywords: transfer learning, partial transfer, evolutionary computation, genetic algorithm

Procedia PDF Downloads 130
5104 Optimizing Production Yield Through Process Parameter Tuning Using Deep Learning Models: A Case Study in Precision Manufacturing

Authors: Tolulope Aremu

Abstract:

This paper is based on the idea of using deep learning to optimize production yield by tuning a few key process parameters in a manufacturing environment. The study explicitly addresses how to maximize production yield and minimize operational costs by utilizing advanced neural network models, specifically Long Short-Term Memory (LSTM) networks and Convolutional Neural Networks (CNNs). These models were implemented using Python-based frameworks: TensorFlow and Keras. The targets of the research are precision molding processes in which temperature ranges between 150°C and 220°C, pressure ranges between 5 and 15 bar, and material flow rate ranges between 10 and 50 kg/h; these are critical parameters with a great effect on yield. A dataset of 1 million production cycles spanning five continuous years was considered, with detailed logs showing the exact parameter settings and yield output. The LSTM model captured time-dependent trends in the production data, while the CNN analyzed spatial correlations between parameters. The models were trained in a supervised manner with a mean-squared-error (MSE) loss optimized by the Adam optimizer. After a total of 100 training epochs, the models achieved 95% accuracy in recommending optimal parameter configurations. Results indicated a 12% increase in production yield compared with traditional methods such as response surface methodology (RSM) and design of experiments (DOE). In addition, the error margin was reduced by 8%, yielding consistently high-quality products from the deep learning models. The monetary value was around $2.5 million annually, saved from reduced material waste, energy consumption, and equipment wear resulting from the implementation of optimized process parameters. The system was deployed in an industrial production environment on a hybrid cloud: Microsoft Azure for data storage, with model training and deployment performed on Google Cloud AI. Real-time process monitoring and automatic parameter tuning depend on this cloud infrastructure. In summary, deep learning models, especially those employing LSTMs and CNNs, optimize production yield by fine-tuning process parameters. Future research will consider reinforcement learning with a view to further enhancing system autonomy and scalability across manufacturing sectors.
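
A minimal Keras sketch of an LSTM yield model over windows of the three named process parameters; the shapes, hyperparameters, and random data are illustrative assumptions, not the deployed system.

```python
# A minimal sketch: LSTM regression from process-parameter windows to yield,
# trained with MSE + Adam as stated in the abstract; synthetic data only.
import numpy as np
import tensorflow as tf

window, n_params = 30, 3                       # 30 cycles x (temp, pressure, flow)
X = np.random.rand(1000, window, n_params).astype("float32")
y = np.random.rand(1000, 1).astype("float32")  # normalized yield

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(window, n_params)),
    tf.keras.layers.LSTM(64),                  # captures time-dependent trends
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1),                  # predicted yield
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
```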

Keywords: production yield optimization, deep learning, tuning of process parameters, LSTM, CNN, precision manufacturing, TensorFlow, Keras, cloud infrastructure, cost saving

Procedia PDF Downloads 26
5103 A Dynamic Ensemble Learning Approach for Online Anomaly Detection in Alibaba Datacenters

Authors: Wanyi Zhu, Xia Ming, Huafeng Wang, Junda Chen, Lu Liu, Jiangwei Jiang, Guohua Liu

Abstract:

Anomaly detection is a first and imperative step needed to respond to unexpected problems and to assure high performance and security in large data center management. This paper presents an online anomaly detection system through an innovative approach of ensemble machine learning and adaptive differentiation algorithms, and applies them to performance data collected from a continuous monitoring system for multi-tier web applications running in Alibaba data centers. We evaluate the effectiveness and efficiency of this algorithm with production traffic data and compare with the traditional anomaly detection approaches such as a static threshold and other deviation-based detection techniques. The experiment results show that our algorithm correctly identifies the unexpected performance variances of any running application, with an acceptable false positive rate. This proposed approach has already been deployed in real-time production environments to enhance the efficiency and stability in daily data center operations.
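
As a generic illustration (not Alibaba's production algorithm), the sketch below combines a static-threshold detector, one of the baselines named above, with a deviation-based detector by majority vote:

```python
# A minimal sketch of ensembling two simple detectors over a performance metric.
import numpy as np

def static_threshold(x, limit=3.0):
    return x > limit                            # baseline: fixed limit

def zscore_detector(x, window=50, k=3.0):
    flags = np.zeros_like(x, dtype=bool)
    for t in range(window, len(x)):
        mu = x[t - window:t].mean()
        sd = x[t - window:t].std() + 1e-9
        flags[t] = abs(x[t] - mu) > k * sd      # deviation from recent behavior
    return flags

rng = np.random.default_rng(5)
latency = rng.normal(1.0, 0.2, 400)
latency[350:360] += 2.5                         # injected performance anomaly

votes = static_threshold(latency).astype(int) + zscore_detector(latency).astype(int)
anomalies = np.where(votes >= 2)[0]             # require both detectors to agree
print("anomalous timesteps:", anomalies)
```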

Keywords: Alibaba data centers, anomaly detection, big data computation, dynamic ensemble learning

Procedia PDF Downloads 198
5102 From Bureaucracy to Organizational Learning Model: An Organizational Change Process Study

Authors: Vania Helena Tonussi Vidal, Ester Eliane Jeunon

Abstract:

This article aims to analyze the process of organizational change from a bureaucratic management model to a learning organization model. The theoretical framework was based on the Beer and Nohria (2001) model, known as Theory E and Theory O. Based on this theory, empirical research was conducted around six key dimensions: goals, leadership, focus, process, reward systems, and consulting. We used a case study of an educational institution located in Barbacena, Minas Gerais. This traditional center of technical education long adopted a bureaucratic management style. After many changes in its business model, such as the creation of graduate and undergraduate courses, the institution decided to make a deep change in its management model, which is the focus of our research. The data were collected through semi-structured interviews with the director, managers, and course supervisors. The analysis followed the Collective Subject Discourse (CSD) method developed by Lefèvre & Lefèvre (2000). Results showed an incremental evolution of the management model toward a learning organization. Many impacts could be seen. Negative factors included resistance from staff, poor information about the planning and implementation process, and old politics persisting inside the new model. Positive impacts included new human resources procedures, mainly related to managerial skills and empowerment; structural downsizing; open discussion channels; and an integrated information system. The process is still underway, and strong encouragement is now given to manager and employee commitment.

Keywords: bureaucracy, organizational learning, organizational change, E and O theory

Procedia PDF Downloads 433
5101 Reflective Thinking and Experiential Learning – A Quasi-Experimental Quanti-Quali Response to Greater Diversification of Activities, Greater Integration of Student Profiles

Authors: Paulo Sérgio Ribeiro de Araújo Bogas

Abstract:

Although several studies have assumed (at least implicitly) that learners' approaches to learning develop into deeper approaches during higher education, there appears to be no clear theoretical basis for this assumption and no empirical evidence. As a scientific contribution to this discussion, a pedagogical intervention of a quasi-experimental nature was developed, with a mixed methodology, evaluating the intervention within a single curricular unit of Marketing, using cases based on real brand challenges, business simulation, and customer projects. Primary and secondary experiences were incorporated in the intervention: the primary experiences are the experiential activities themselves; the secondary experiences result from the primary experiences, such as reflection and discussion in work teams. A diversified learning relationship was encouraged through the various connections between the different members of the learning community. The present study concludes that, in the same context, students' responses can be described as those of students who reinforce their initial deep approach, students who maintain their initial deep approach level, and others who change from an emphasis on the deep approach to one closer to the surface approach. This typology did not always confirm studies reported in the literature, namely on whether the initial level of deep processing influences surface processing and vice versa. The results of this investigation point to the inclusion of pedagogical and didactic activities that integrate different motivations and initial strategies, leading to the possible adoption of deep approaches to learning, since the investigation revealed statistically significant differences in deep/surface approach scores and in experiential level. In the case of real challenges, the categories of 'attribution of meaning to what is studied' and the possibility of 'contact with an aspirational context' for students' future professions stand out. In this category, the dimensions of autonomy that will be required of students were also revealed when comparing the classroom context of real cases with the future professional context and the impact they may have on the world. Regarding the simulated practice, two categories of response stand out: on the one hand, the motivation associated with the possibility of measuring the results of decisions taken and increased self-awareness; on the other hand, the additional effort that this practice required from some of the students.

Keywords: experiential learning, higher education, mixed methods, reflective learning, marketing

Procedia PDF Downloads 83
5100 An Interactive Voice Response Storytelling Model for Learning Entrepreneurial Mindsets in Media Dark Zones

Authors: Vineesh Amin, Ananya Agrawal

Abstract:

In a prolonged period of uncertainty and disruption to the established normal order, non-cognitive skills, especially entrepreneurial mindsets, have become a pillar that can reform educational models to inform the economy. Dreamverse Learning Lab's IVR-based storytelling program, 'Call-a-Kahaani', is an evolving experiment with the aim of kindling entrepreneurial mindsets in the remotest locations of India in an accessible and engaging manner. At the heart of this experiment is the belief that at every phase in our life's story, we have a choice that brings us closer to achieving our true potential. This interactive program is designed using real-time storytelling principles to empower learners, ages 24 and below, to make choices and take decisions as they become more self-aware, practice grit, and try new things through stories, guided activities, and interactions, simply over a phone call. This research paper highlights the framework behind an ongoing scalable, data-oriented, low-tech program to kindle entrepreneurial mindsets in media dark zones, supported by iterative design and prototyping. Within one and a half years of its inception, the program reached 13,700+ unique learners who made 59,000+ calls totaling 183,900+ minutes of listening to content pieces of around 3 to 4 minutes, with a last monitored (March 2022) serious listenership of 34%. The paper provides an in-depth account of the technical development, content creation, learning and assessment frameworks, and the mobilization models that have been leveraged to build this end-to-end system.

Keywords: non-cognitive skills, entrepreneurial mindsets, speech interface, remote learning, storytelling

Procedia PDF Downloads 208
5099 Developing Second Language Learners’ Reading Comprehension through Content and Language Integrated Learning

Authors: Kaine Gulozer

Abstract:

Content and language integrated learning (CLIL), a strong methodological conception in teaching practice, was adapted to boost efficiency in second language (L2) instruction across a range of proficiency levels. This study investigates whether the incorporation of meaningful CLIL reading activities in two different settings (in-school and out-of-school) influences L2 students' development of comprehension skills differently. A CLIL-based instructional methodology was adopted, and a total of 50 preparatory-year students (N = 50, 25 students per proficiency level) at two distinct proficiency levels (elementary and intermediate), majoring in engineering, were recruited for the study. Both qualitative and quantitative methods were adopted through a post-test design. Data were collected through a questionnaire, a reading comprehension test, and a semi-structured interview addressed to the two proficiency groups. The results show that both settings are beneficial for the development of reading comprehension, whereas, based on the reported interview results, the impact of reading activities conducted in in-school settings was higher for elementary-level students than that of activities conducted in out-of-school settings. This study suggests that incorporating meaningful CLIL reading activities in both settings, for both proficiency levels, could foster students' self-awareness of their language learning process and a sense of ownership of successful improvements in field-specific reading comprehension. Further suggestions and implications of the study are discussed.

Keywords: content and language integrated learning, in-school setting, language proficiency, out-of-school setting, reading comprehension

Procedia PDF Downloads 144
5098 Opinions of Pre-Service Teachers on Online Language Teaching: COVID-19 Pandemic Perspective

Authors: Neha J. Nandaniya

Abstract:

In this research paper, the researcher focuses on the opinions of pre-service teachers regarding online language teaching, which was introduced during the COVID-19 pandemic and is still ongoing. The researcher developed a three-point rating scale in Google Forms to elicit trainees' views on online language learning; 167 B. Ed. trainees with language as their content and method subject gave their responses. After scoring the responses, the investigator calculated the chi-square value and drew conclusions from the findings. The major finding of the study is that online language teaching is not as effective as offline teaching.
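
The chi-square analysis on three-point rating-scale counts can be sketched with SciPy; the observed counts below are hypothetical stand-ins, not the study's data.

```python
# A minimal sketch of a chi-square goodness-of-fit test on rating counts.
from scipy.stats import chisquare

# Hypothetical counts of 167 trainees across Agree / Neutral / Disagree
observed = [52, 38, 77]
result = chisquare(observed)        # null hypothesis: responses evenly distributed
print(f"chi2 = {result.statistic:.2f}, p = {result.pvalue:.4f}")
```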

Keywords: online language teaching, ICT competency, B. Ed. trainees, COVID-19 pandemic

Procedia PDF Downloads 82
5097 Modelling the Impact of Installation of Heat Cost Allocators in District Heating Systems Using Machine Learning

Authors: Danica Maljkovic, Igor Balen, Bojana Dalbelo Basic

Abstract:

Following the EU Energy Efficiency Directive, specifically Article 9, individual metering in district heating systems had to be introduced by the end of 2016. These requirements have been implemented in member states' legal frameworks; Croatia is one of these states. The directive allows installation of both heat metering devices and heat cost allocators. Mainly due to poor communication and PR, a false public image was created that heat cost allocators are devices that save energy. This notion is wrong; the aim of this work is to develop a model that precisely expresses the influence of installing heat cost allocators on potential energy savings in each unit within multifamily buildings. In recent years, machine learning has gained wider application in various fields, as it has proven to give good results where large amounts of data must be processed to recognize patterns and correlations among relevant parameters, as well as where the problem is too complex for human intelligence to solve. One machine learning method, the decision tree, has achieved an accuracy of over 92% in predicting general building consumption. In this paper, machine learning algorithms are used to isolate the sole impact of installing heat cost allocators on individual buildings in multifamily houses connected to district heating systems. Special emphasis is given to regression analysis, logistic regression, support vector machines, decision trees, and the random forest method.
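
One simple way to isolate the installation effect, sketched below under assumptions, is a regression with a binary indicator for allocator installation alongside control variables; the variables and synthetic data are illustrative, not the paper's model.

```python
# A minimal sketch: the coefficient on the installation indicator isolates the
# allocator effect while controlling for unit area and outdoor temperature.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
n = 500
has_allocator = rng.integers(0, 2, n)          # 1 = heat cost allocators installed
area = rng.normal(60, 15, n)                   # unit area (m^2)
temp = rng.normal(5, 5, n)                     # mean outdoor temperature (C)
consumption = (120 + 1.5 * area - 4 * temp
               - 8 * has_allocator + rng.normal(0, 10, n))

X = sm.add_constant(np.column_stack([has_allocator, area, temp]))
fit = sm.OLS(consumption, X).fit()
print(fit.params)   # second coefficient estimates the allocator's sole impact
```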

Keywords: district heating, heat cost allocator, energy efficiency, machine learning, decision tree model, regression analysis, logistic regression, support vector machines, decision trees and random forest method

Procedia PDF Downloads 249
5096 Simulation Study on Particle Fluidization and Drying in a Spray Fluidized Bed

Authors: Jinnan Guo, Daoyin Liu

Abstract:

The quality of final products in the coating process depends significantly on particle fluidization and drying in the spray fluidized bed. In this study, the fluidizing gas temperature and velocity are varied, and their effects on particle flow, moisture content, and heat transfer in a spray fluidized bed are investigated with a coupled computational fluid dynamics-discrete element method (CFD-DEM) model. The gas velocity distribution in the fluidized bed is symmetrical, with high velocity in the middle and low velocity on both sides. During the heating process, the particles inside the central tube and at the bottom of the bed are heated rapidly, while particles circulating in the annular area are heated slowly and remain at a lower temperature. This inconsistency in particle circulation produces two peaks in the probability density distribution of particle temperature during heating, after which the overall particle temperature increases uniformly. During the drying process, the distribution of particle moisture transitions from an initially uniform distribution to one with two peaks, and then the number of completely dried particles (moisture content of 0) gradually increases. Increasing the fluidizing gas temperature and velocity improves particle circulation, drying, and heat transfer in the bed. The current study provides an effective method for studying the hydrodynamics of spray fluidized beds with simultaneous heating and particle fluidization.

Keywords: heat transfer, CFD-DEM, spray fluidized bed, drying

Procedia PDF Downloads 69
5095 Integrating Natural Language Processing (NLP) and Machine Learning in Lung Cancer Diagnosis

Authors: Mehrnaz Mostafavi

Abstract:

The assessment and categorization of incidental lung nodules present a considerable challenge in healthcare, often necessitating resource-intensive multiple computed tomography (CT) scans for growth confirmation. This research addresses this issue by introducing a distinct computational approach leveraging radiomics and deep-learning methods. However, understanding local services is essential before implementing these advancements. With diverse tracking methods in place, there is a need for efficient and accurate identification approaches, especially in the context of managing lung nodules alongside pre-existing cancer scenarios. This study explores the integration of text-based algorithms in medical data curation, indicating their efficacy in conjunction with machine learning and deep-learning models for identifying lung nodules. Combining medical images with text data has demonstrated superior data retrieval compared to using each modality independently. While deep learning and text analysis show potential in detecting previously missed nodules, challenges persist, such as increased false positives. The presented research introduces a Structured-Query-Language (SQL) algorithm designed for identifying pulmonary nodules in a tertiary cancer center, externally validated at another hospital. Leveraging natural language processing (NLP) and machine learning, the algorithm categorizes lung nodule reports based on sentence features, aiming to facilitate research and assess clinical pathways. The hypothesis posits that the algorithm can accurately identify lung nodule CT scans and predict concerning nodule features using machine-learning classifiers. Through a retrospective observational study spanning a decade, CT scan reports were collected, and an algorithm was developed to extract and classify data. Results underscore the complexity of lung nodule cohorts in cancer centers, emphasizing the importance of careful evaluation before assuming a metastatic origin. The SQL and NLP algorithms demonstrated high accuracy in identifying lung nodule sentences, indicating potential for local service evaluation and research dataset creation. Machine-learning models exhibited strong accuracy in predicting concerning changes in lung nodule scan reports. While limitations include variability in disease group attribution, the potential for correlation rather than causality in clinical findings, and the need for further external validation, the algorithm's accuracy and potential to support clinical decision-making and healthcare automation represent a significant stride in lung nodule management and research.
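
A minimal sqlite3 sketch of an SQL-style report filter of the kind described above; the table schema, search phrases, and reports are hypothetical assumptions, not the validated algorithm itself.

```python
# A minimal sketch: SQL keyword filtering selects candidate reports, which a
# downstream NLP/ML classifier would then categorize by sentence features.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE reports (id INTEGER, text TEXT)")
conn.executemany("INSERT INTO reports VALUES (?, ?)", [
    (1, "6 mm pulmonary nodule in the right upper lobe."),
    (2, "No focal lung lesion identified."),
    (3, "Spiculated lung nodule, concerning for malignancy."),
])

rows = conn.execute("""
    SELECT id, text FROM reports
    WHERE text LIKE '%nodule%' OR text LIKE '%lesion%'
""").fetchall()
for rid, text in rows:
    print(rid, text)   # candidate sentences for the classifier
```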

Keywords: lung cancer diagnosis, structured-query-language (SQL), natural language processing (NLP), machine learning, CT scans

Procedia PDF Downloads 98
5094 Spectrogram Pre-Processing to Improve Isotopic Identification to Discriminate Gamma and Neutrons Sources

Authors: Mustafa Alhamdi

Abstract:

An industrial application for classifying gamma-ray and neutron events is investigated in this study using deep machine learning. Identification using a convolutional neural network and a recursive neural network showed a significant improvement in prediction accuracy in a variety of applications. The ability to identify the isotope type and activity from spectral information depends on feature extraction methods, followed by classification. The features extracted from the spectrum profiles seek patterns and relationships that represent the actual spectrum energy in a low-dimensional space. Increasing the level of separation between classes in feature space improves the possibility of enhancing classification accuracy. Neural networks extract features through a variety of nonlinear transformations and mathematical optimizations, while principal component analysis depends on linear transformations to extract features and subsequently improve classification accuracy. In this paper, the isotope spectrum information is preprocessed by finding the frequency components relative to time and using them as a training dataset. The Fourier transform implementation used to extract frequency components is optimized with a suitable windowing function. Training and validation samples of different isotope profiles interacting with a CdTe crystal were simulated using Geant4. The readout electronic noise was simulated by optimizing the mean and variance of a normal distribution. Ensemble learning, combining the votes of many models, improved the classification accuracy of the neural networks. The ability to discriminate gamma and neutron events in a single prediction approach has shown high accuracy using deep learning. The paper's findings show that classification accuracy can be improved by applying the spectrogram preprocessing stage to the gamma and neutron spectra of different isotopes. Tuning the deep machine learning models by hyperparameter optimization enhanced the separation in the latent space and provided the ability to extend the number of detected isotopes in the training database. Ensemble learning contributed significantly to improving the final prediction.
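
The spectrogram preprocessing step can be sketched with SciPy's windowed Fourier transform; the synthetic detector pulse, sampling rate, and Hann window choice are assumptions for illustration.

```python
# A minimal sketch: time-frequency features from a detector pulse via a
# windowed Fourier transform (spectrogram); not the paper's pipeline.
import numpy as np
from scipy.signal import spectrogram

fs = 1_000_000                              # sampling rate (Hz), assumed
t = np.arange(0, 0.001, 1 / fs)
pulse = np.exp(-t * 5000) * np.sin(2 * np.pi * 50_000 * t)
pulse += np.random.normal(0, 0.01, t.size)  # simulated readout noise

f, seg_t, Sxx = spectrogram(pulse, fs=fs, window="hann",
                            nperseg=128, noverlap=64)
features = np.log1p(Sxx)                    # 2D time-frequency training input
print(features.shape)                       # (n_freqs, n_segments)
```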

Keywords: machine learning, nuclear physics, Monte Carlo simulation, noise estimation, feature extraction, classification

Procedia PDF Downloads 150
5093 Land Suitability Prediction Modelling for Agricultural Crops Using Machine Learning Approach: A Case Study of Khuzestan Province, Iran

Authors: Saba Gachpaz, Hamid Reza Heidari

Abstract:

The sharp increase in population growth puts more pressure on agricultural areas to satisfy the food supply. Meeting this demand consumes more resources and, alongside other environmental concerns, highlights the need for sustainable agricultural development. Land-use management is a crucial factor in obtaining optimum productivity. Machine learning is a widely used technique in the agricultural sector, from yield prediction to customer behavior; it learns patterns and correlations from a data set. In this study, nine physical control factors, namely soil classification, electrical conductivity, normalized difference water index (NDWI), groundwater level, elevation, annual precipitation, pH of water, annual mean temperature, and slope, in the alluvial plain of Khuzestan (an agricultural hotspot in Iran) are used to decide the best agricultural land use for both rainfed and irrigated agriculture for ten different crops. For this purpose, each variable was imported into ArcGIS, and a raster layer was obtained. In the next step, using training samples, all layers were imported into the Python environment. A random forest model was applied, and the weight of each variable was determined. In the final step, the results were visualized using a digital elevation model, and the importance of all factors for each crop was obtained. Our results show that although 62% of the study area is allocated to agricultural purposes, only 42.9% of these areas can be classified as suitable for cultivation.
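
A minimal scikit-learn sketch of the random forest step with per-factor importances; the factor names follow the abstract, while the sampled data are synthetic assumptions.

```python
# A minimal sketch: fit a random forest on sampled raster cells and read the
# per-factor weights; not the study's data or tuned model.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

factors = ["soil_class", "EC", "NDWI", "groundwater", "elevation",
           "precipitation", "pH", "mean_temp", "slope"]
rng = np.random.default_rng(7)
X = rng.normal(size=(2000, len(factors)))          # one row per sampled cell
y = rng.integers(0, 2, 2000)                       # 1 = suitable for a given crop

rf = RandomForestClassifier(n_estimators=300, random_state=7).fit(X, y)
for name, imp in sorted(zip(factors, rf.feature_importances_),
                        key=lambda p: -p[1]):
    print(f"{name}: {imp:.3f}")                    # weight of each variable
```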

Keywords: land suitability, machine learning, random forest, sustainable agriculture

Procedia PDF Downloads 83
5092 Deepnic, A Method to Transform Each Variable into Image for Deep Learning

Authors: Nguyen J. M., Lucas G., Brunner M., Ruan S., Antonioli D.

Abstract:

Deep learning based on convolutional neural networks (CNNs) is a very powerful technique for classifying information from an image. We propose a new method, DeepNic, to transform each variable of a tabular dataset into an image in which each pixel represents a set of conditions that allow the variable to make an error-free prediction. The contrast of each pixel is proportional to its prediction performance, and the color of each pixel corresponds to a sub-family of NICs. NICs are probabilities that depend on the number of inputs to each neuron and the range of coefficients of the inputs. Each variable can therefore be expressed as a function of a matrix of two vectors corresponding to an image whose pixels express predictive capabilities. Our objective is to transform each variable of tabular data into an image that can be analysed by CNNs, unlike other methods, which use all the variables to construct a single image. We analyse the NIC information of each variable and express it as a function of the number of neurons and the range of coefficients used. The predictive value and the category of the NIC are expressed by the contrast and the color of the pixel. We have developed a pipeline to implement this technology and have successfully applied it to genomic expression data from an Affymetrix chip.
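
A loose, heavily simplified sketch of the general idea (one variable rendered as an image whose pixel intensities reflect per-bin predictive performance); this is an interpretation for illustration only, not the DeepNic method itself.

```python
# A loose sketch under assumptions: map one tabular variable to a small image
# whose intensities reflect how well each condition bin predicts the label.
import numpy as np

def variable_to_image(x, y, bins=8):
    """Map variable x to a bins x bins grid; intensity = per-cell accuracy."""
    edges = np.quantile(x, np.linspace(0, 1, bins + 1)[1:-1])
    qx = np.digitize(x, edges)                   # assign each value to a bin
    img = np.zeros((bins, bins))
    for i in range(bins):
        mask = qx == i
        if mask.any():
            majority = y[mask].mean() > 0.5      # the bin's best constant guess
            acc = (y[mask] == majority).mean()   # how error-free that guess is
            img[i, :] = acc                      # one row per condition bin
    return img

rng = np.random.default_rng(8)
x = rng.normal(size=1000)
y = (x + rng.normal(0, 0.5, 1000) > 0).astype(int)
print(variable_to_image(x, y).round(2))          # image-like input for a CNN
```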

Keywords: tabular data, deep learning, perfect trees, NICs

Procedia PDF Downloads 89