Search results for: Gagne’s learning model
19396 Analogy to Continental Divisions: An Attention-Grabbing Approach to Teach Taxonomic Hierarchy to Students
Authors: Sagheer Ahmad
Abstract:
Teaching is a sacred profession whereby students develop the mental abilities to cope with the challenges of the remote world. Thinkers have devised plenty of interesting ways to make the learning process quick and absorbing for students. However, institutions in third-world countries still lack these remote facilities, and therefore teaching depends entirely on the skills of the teachers. Skillful teachers use self-devised, stimulating ideas to grab the attention of their students. Most of the time, these ideas are grounded in local contexts with which the students are already familiar. This self-explanatory quality underlies several local approaches to disseminating scientific knowledge to new generations. Biology is a subject that relies heavily on hypotheses, and teaching it in an interesting way is necessary to create a friendly relationship between teacher and student and to build an engaging learning environment. Taxonomic classification, if presented as it is, may not be attractive to secondary school students who are just starting to learn biology at an elementary level. Presenting the hierarchy of Kingdom, Phylum, Class, Order, Family, Genus, and Species as an analogy to our division into continents, countries, cities, towns, villages, homes, and finally individuals can be an attention-grabbing way to make the concept stick with students. Many other interesting approaches have likewise been adopted to teach students in a fascinating way so that learning science subjects is not boring for them. Discussing these appealing ways of teaching can be a valuable stimulus for refining science teaching methodologies, thereby promoting the concept of friendly learning.
Keywords: biology, innovative approaches, taxonomic classification, teaching
Procedia PDF Downloads 253
19395 Hidden Markov Model for Financial Limit Order Book and Its Application to Algorithmic Trading Strategy
Authors: Sriram Kashyap Prasad, Ionut Florescu
Abstract:
This study models intraday asset prices as driven by a Markov process. This work identifies the latent states of the Hidden Markov Model, using limit order book data (trades and quotes) to continuously estimate the states throughout the day. This work builds a trading strategy that uses the estimated states to generate signals. The strategy uses the current state to recalibrate buy/sell levels and the transitions between states to trigger a stop-loss when adverse price movements occur. The proposed trading strategy is tested on the Stevens High Frequency Trading (SHIFT) platform. SHIFT is a highly realistic market simulator with functionalities for creating an artificial market simulation by deploying agents, trading strategies, distributing initial wealth, etc. In the implementation, several assets on the NASDAQ exchange are used for testing. In comparison to a strategy with static buy/sell levels, this study shows that the number of limit orders that get matched and executed can be increased. Executing limit orders earns rebates on NASDAQ. The system can capture jumps in the limit order book prices, provide dynamic buy/sell levels, and trigger stop-loss signals to improve the PnL (Profit and Loss) performance of the strategy.
Keywords: algorithmic trading, Hidden Markov model, high frequency trading, limit order book learning
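As a rough illustration of the state-estimation step described above, the sketch below fits a Gaussian hidden Markov model to synthetic order-book features with the hmmlearn library; the feature set, the three-state choice, and the signal rule are assumptions for illustration, not details taken from the study.
```python
# Sketch: fitting a Gaussian HMM to limit-order-book features and decoding latent states.
# The feature set, the choice of three states, and the signal rule are illustrative assumptions.
import numpy as np
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(0)
# Hypothetical intraday features: mid-price returns, bid-ask spread, order-book imbalance.
X = np.column_stack([
    rng.normal(0, 1e-4, 2000),   # mid-price returns
    rng.gamma(2.0, 1e-3, 2000),  # spread
    rng.uniform(-1, 1, 2000),    # imbalance
])

model = GaussianHMM(n_components=3, covariance_type="full", n_iter=200, random_state=0)
model.fit(X)                      # EM estimation of transition matrix and emission parameters
states = model.predict(X)         # Viterbi decoding of the latent state path

# A simple signal rule: recalibrate buy/sell levels whenever the decoded state changes.
transitions = np.flatnonzero(np.diff(states) != 0)
print("estimated transition matrix:\n", model.transmat_.round(3))
print("number of state changes:", transitions.size)
```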
Procedia PDF Downloads 152
19394 Finding Data Envelopment Analysis Targets Using Multi-Objective Programming in DEA-R with Stochastic Data
Authors: R. Shamsi, F. Sharifi
Abstract:
In this paper, we obtain the projection of inefficient units in data envelopment analysis (DEA) in the case of stochastic inputs and outputs using the multi-objective programming (MOP) structure. In some problems, the inputs might be stochastic while the outputs are deterministic, and vice versa. In such cases, we propose a multi-objective DEA-R model because in some cases (e.g., when unnecessary and irrational weights by the BCC model reduce the efficiency score), an efficient decision-making unit (DMU) is introduced as inefficient by the BCC model, whereas the DMU is considered efficient by the DEA-R model. In some other cases, only the ratio of stochastic data may be available (e.g., the ratio of stochastic inputs to stochastic outputs). Thus, we provide a multi-objective DEA model without explicit outputs and prove that the input-oriented MOP DEA-R model in the invariable return to scale case can be replaced by the MOP-DEA model without explicit outputs in the variable return to scale and vice versa. Using the interactive methods for solving the proposed model yields a projection corresponding to the viewpoint of the DM and the analyst, which is nearer to reality and more practical. Finally, an application is provided.
Keywords: DEA-R, multi-objective programming, stochastic data, data envelopment analysis
Procedia PDF Downloads 107
19393 Mapping Iron Content in the Brain with Magnetic Resonance Imaging and Machine Learning
Authors: Gabrielle Robertson, Matthew Downs, Joseph Dagher
Abstract:
Iron deposition in the brain has been linked with a host of neurological disorders such as Alzheimer’s, Parkinson’s, and multiple sclerosis. While some treatment options exist, there are no objective measurement tools that allow for the monitoring of iron levels in the brain in vivo. An emerging Magnetic Resonance Imaging (MRI) method has recently been proposed to deduce iron concentration through quantitative measurement of magnetic susceptibility. This is a multi-step process that involves repeated modeling of physical processes via approximate numerical solutions. For example, the last two steps of this Quantitative Susceptibility Mapping (QSM) method involve (I) mapping the magnetic field into magnetic susceptibility and (II) mapping magnetic susceptibility into iron concentration. Process I involves solving an ill-posed inverse problem by using regularization via the injection of prior belief. The end result of Process II depends strongly on the model used to describe the molecular content of each voxel (type of iron, water fraction, etc.). Due to these factors, the accuracy and repeatability of QSM have been an active area of research in the MRI and medical imaging community. This work aims to estimate iron concentration in the brain in a single step. A synthetic numerical model of the human head was created by automatically and manually segmenting the human head on a high-resolution grid (640x640x640, 0.4 mm³), yielding detailed structures such as microvasculature and subcortical regions as well as bone, soft tissue, cerebrospinal fluid, sinuses, arteries, and eyes. Each segmented region was then assigned tissue properties such as relaxation rates, proton density, electromagnetic tissue properties, and iron concentration. These tissue property values were randomly selected from a probability distribution function derived from a thorough literature review. In addition to having unique tissue property values, different synthetic head realizations also possess unique structural geometry created by morphing the boundary regions of different areas within normal physical constraints. This model of the human brain is then used to create synthetic MRI measurements. This is repeated thousands of times, for different head shapes, volumes, tissue properties, and noise realizations. Collectively, this constitutes a training set that is similar to in vivo data but larger than the datasets available from clinical measurements. A 3D convolutional U-Net neural network architecture was used to train data-driven deep learning models to solve for iron concentrations from raw MRI measurements. The performance was then tested on both synthetic data not used in training and real in vivo data. Results showed that the model trained on synthetic MRI measurements is able to directly learn iron concentrations in areas of interest more effectively than other existing QSM reconstruction methods. For comparison, models trained on random geometric shapes (as proposed in the DeepQSM method) are less effective than models trained on realistic synthetic head models. Such an accurate method for the quantitative measurement of iron deposits in the brain would be of significant value in clinical studies aiming to understand the role of iron in neurological disease.
Keywords: magnetic resonance imaging, MRI, iron deposition, machine learning, quantitative susceptibility mapping
Procedia PDF Downloads 138
19392 Supervised Machine Learning Approach for Studying the Effect of Different Joint Sets on Stability of Mine Pit Slopes Under the Presence of Different External Factors
Authors: Sudhir Kumar Singh, Debashish Chakravarty
Abstract:
Slope stability analysis is an important aspect of the field of geotechnical engineering. It is also important from a safety and economic point of view, as any slope failure can lead to the loss of valuable lives and damage to property worth millions. This paper aims at mitigating the risk of slope failure by studying the effect of different joint sets on the stability of mine pit slopes under the influence of various external factors, namely the degree of saturation, rainfall intensity, and seismic coefficients. A supervised machine learning approach has been utilized for making accurate and reliable predictions regarding the stability of slopes based on the value of the factor of safety. Numerous cases have been studied for analyzing the stability of slopes using the popular finite element method, and the data thus obtained have been used as training data for the supervised machine learning models. Different supervised machine learning models, namely Random Forest, Decision Tree, Support Vector Machine, and XGBoost, have been trained on the input data. Distinct test data not present in the training data have been used for measuring the performance and accuracy of the different models. Although all models performed well on the test dataset, Random Forest stands out from the others due to its high accuracy of greater than 95%, thus providing a valuable tool at our disposal which is neither computationally expensive nor time-consuming and is in good accordance with the numerical analysis results.
Keywords: finite element method, geotechnical engineering, machine learning, slope stability
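A minimal sketch of the kind of workflow the abstract describes, using scikit-learn's random forest on synthetic samples; the feature names, the placeholder factor-of-safety formula, and the threshold of 1.0 are assumptions, since the study derives its labels from finite element analysis.
```python
# Sketch: classifying slope stability from FEM-derived samples with a random forest.
# Feature names and the FoS threshold of 1.0 are assumptions for illustration only.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(1)
n = 1000
df = pd.DataFrame({
    "joint_dip": rng.uniform(10, 80, n),          # degrees
    "joint_spacing": rng.uniform(0.2, 5.0, n),    # metres
    "saturation": rng.uniform(0, 1, n),           # degree of saturation
    "rainfall": rng.uniform(0, 150, n),           # mm/day
    "seismic_coeff": rng.uniform(0, 0.3, n),      # horizontal seismic coefficient
})
# Placeholder factor of safety; in the study this comes from finite element analysis.
fos = 1.8 - 0.01 * df["joint_dip"] - 0.5 * df["saturation"] - 2.0 * df["seismic_coeff"] + rng.normal(0, 0.1, n)
y = (fos >= 1.0).astype(int)  # 1 = stable, 0 = unstable

X_train, X_test, y_train, y_test = train_test_split(df, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_train, y_train)
print("test accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```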
Procedia PDF Downloads 103
19391 Application of Generalized Autoregressive Score Model to Stock Returns
Authors: Katleho Daniel Makatjane, Diteboho Lawrence Xaba, Ntebogang Dinah Moroke
Abstract:
The current study investigates the behaviour of time-varying parameters that are based on the score function of the predictive model density at time t. The mechanism for updating the parameters over time is the scaled score of the likelihood function. The results revealed high persistence of the time-varying parameters, as the location parameter is higher, and the skewness parameter implied a departure of the scale parameter from normality, with the unconditional parameter at 1.5. The results also revealed a persistence of leptokurtic behaviour in stock returns, which implies that the returns are heavy-tailed. Prior to model estimation, the White Neural Network test showed that the stock price can be modelled by a GAS model. Finally, we propose further research, specifically to model the time-varying parameters with a more detailed model that accounts for the heavy-tailed distribution of the series and computes the risk measures associated with the returns.
Keywords: generalized autoregressive score model, South Africa, stock returns, time-varying
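For reference, the generic GAS(1,1) updating recursion that this family of models is built on is shown below in its standard textbook form; it is not the authors' specific specification for South African stock returns.
```latex
% Generic GAS(1,1) recursion for a time-varying parameter f_t
\nabla_t = \frac{\partial \log p(y_t \mid f_t)}{\partial f_t}, \qquad
s_t = S_t \,\nabla_t, \qquad
f_{t+1} = \omega + A\, s_t + B\, f_t
% where S_t is a scaling matrix, commonly based on the inverse Fisher information.
```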
Procedia PDF Downloads 502
19390 Reduction of Rotor-Bearing-Support Finite Element Model through Substructuring
Authors: Abdur Rosyid, Mohamed El-Madany, Mohanad Alata
Abstract:
Due to simplicity and low cost, rotordynamic systems are often modeled using lumped parameters. Recently, finite elements have been used to model rotordynamic systems, as they offer higher accuracy. However, finite element models involve a large number of degrees of freedom. In some applications, such as control design, this incurs a higher cost. For this reason, various model reduction methods have been proposed. This work demonstrates the quality of model reduction of a rotor-bearing-support system through substructuring. The quality of the model reduction is evaluated by comparing the first few natural frequencies, modal damping ratios, critical speeds, and responses of the full system and the reduced system. The simulation shows that substructuring is adequate for reducing the finite element rotor model in the frequency range of interest as long as the number and the locations of master nodes are determined appropriately. However, the reduction is less accurate in an unstable or nearly unstable system.
Keywords: rotordynamic, finite element model, Timoshenko beam, 3D solid elements, Guyan reduction method
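The sketch below shows Guyan (static) condensation, one common substructuring-style reduction, applied to toy stiffness and mass matrices with NumPy; the matrices and the master-DOF choice are placeholders, not the rotor-bearing-support model from the paper.
```python
# Sketch: Guyan (static) condensation of stiffness and mass matrices onto master DOFs.
# K and M here are small random symmetric matrices standing in for the FE rotor model.
import numpy as np

def guyan_reduce(K, M, master):
    """Reduce K and M to the master degrees of freedom by static condensation."""
    n = K.shape[0]
    slave = np.setdiff1d(np.arange(n), master)
    Kss = K[np.ix_(slave, slave)]
    Ksm = K[np.ix_(slave, master)]
    # Transformation T maps master DOFs to all DOFs: u = T @ u_master
    T = np.zeros((n, len(master)))
    T[master, np.arange(len(master))] = 1.0
    T[np.ix_(slave, np.arange(len(master)))] = -np.linalg.solve(Kss, Ksm)
    return T.T @ K @ T, T.T @ M @ T

# Toy example with a 6-DOF system and DOFs 0, 2, 4 kept as masters.
rng = np.random.default_rng(2)
A = rng.normal(size=(6, 6))
K = A @ A.T + 6 * np.eye(6)          # symmetric positive definite stiffness
M = np.diag(rng.uniform(1, 2, 6))    # lumped mass
Kr, Mr = guyan_reduce(K, M, master=np.array([0, 2, 4]))
print(np.sort(np.linalg.eigvals(np.linalg.solve(Mr, Kr)).real))  # reduced eigenvalues (omega^2)
```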
Procedia PDF Downloads 274
19389 A Unified Model for Orotidine Monophosphate Synthesis: Target for Inhibition of Growth of Mycobacterium tuberculosis
Authors: N. Naga Subrahmanyeswara Rao, Parag Arvind Deshpande
Abstract:
Understanding the nucleotide synthesis reactions of an organism is beneficial for understanding its growth, as in Mycobacterium tuberculosis for the design of anti-TB drugs. One of the reactions of the de novo pathway, which takes place in all organisms, was considered. The reaction takes place between phosphoribosyl pyrophosphate and orotate, catalyzed by orotate phosphoribosyl transferase and a divalent metal ion, and gives orotidine monophosphate, a nucleotide. All the reaction steps of three experimentally proposed mechanisms for this reaction were considered to develop a kinetic rate expression. The model was validated using the data for four organisms and could successfully describe the kinetics for the reported data. The developed model can serve as a reliable model to describe the kinetics in new organisms without the need for mechanistic determination. Thus, an organism-independent model was developed.
Keywords: mechanism, nucleotide, organism, tuberculosis
Procedia PDF Downloads 336
19388 Controlling the Expense of Political Contests Using a Modified N-Players Tullock’s Model
Abstract:
This work introduces a generalization of the classical Tullock model of one-stage contests under complete information with an unlimited number of contestants. In the classical Tullock model, the contest winner is not necessarily the highest bidder. Instead, the winner is determined by a draw in which the winning probabilities are proportional to the contestants' relative efforts. The Tullock model fits political contests well, in which the winner is not necessarily the contestant with the highest effort. This work presents a modified model that uses a simple non-discriminating rule, namely a parameter that influences the total costs planned for an election, so that the contest designer can control the contestants' efforts. The winner pays a fee, and the losers are reimbursed the same amount. Our proposed model includes a mechanism that controls the efforts exerted and balances competition, creating a tighter, less predictable, and more interesting contest. Additionally, the proposed model follows the fairness criterion in the sense that it does not alter the contestants' probabilities of winning compared to the classic Tullock model. We provide an analytic solution for the contestants' optimal effort and expected reward.
Keywords: contests, Tullock's model, political elections, control expenses
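For context, the classical n-player Tullock contest success function and its symmetric equilibrium effort are recalled below; the fee-and-reimbursement mechanism proposed in the abstract is not reproduced here.
```latex
% Classical Tullock contest: player i exerts effort x_i for a prize of value V
p_i(x_1,\dots,x_n) = \frac{x_i}{\sum_{j=1}^{n} x_j}, \qquad
\mathbb{E}[\pi_i] = \frac{x_i}{\sum_{j} x_j}\,V - x_i, \qquad
x^{*} = \frac{(n-1)\,V}{n^{2}} \ \text{(symmetric equilibrium)}
```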
Procedia PDF Downloads 145
19387 Fama French Four Factor Model: A Study of Nifty Fifty Companies
Authors: Deeksha Arora
Abstract:
The study aims to explore the applicability of the widely used asset pricing models, namely the Capital Asset Pricing Model (CAPM) and the Fama-French Four Factor Model, in the Indian equity market. The study will be based on the companies that form part of the Nifty Fifty Index for a period of five years: 2011 to 2016. The asset pricing model is examined by forming portfolios on the basis of three variables: market capitalization (size effect), book-to-market equity ratio (value effect), and profitability. The study provides a basis to test the presence of the Fama-French Four Factor Model in the Indian stock market. This study may provide a basis for future research on generalized asset pricing models comprising multiple risk factors.
Keywords: book to market equity, Fama French four factor model, market capitalization, profitability, size effect, value effect
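One common way to write the time-series regression behind a four-factor specification with market, size, value, and profitability factors is shown below; the exact factor construction for the Nifty Fifty sample is defined in the study itself.
```latex
R_{it} - R_{ft} = \alpha_i + \beta_i\,(R_{mt} - R_{ft}) + s_i\,\mathrm{SMB}_t + h_i\,\mathrm{HML}_t + r_i\,\mathrm{RMW}_t + \varepsilon_{it}
```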
Procedia PDF Downloads 264
19386 The Effectiveness of a Hybrid Diffie-Hellman-RSA-Advanced Encryption Standard Model
Authors: Abdellahi Cheikh
Abstract:
With the emergence of quantum computers with very powerful capabilities, the security of shared-key exchange between two interlocutors poses a major problem, given the rapid development of computing power and speed. The Diffie-Hellman (DH) algorithm is therefore more vulnerable than ever: no mechanism guarantees the security of the key exchange, so if an intermediary manages to intercept it, the key is easily compromised. In this regard, several studies have been conducted to improve the security of key exchange between two interlocutors, which has led to interesting results. The modification made to our Diffie-Hellman-RSA-AES (DRA) model, which encrypts the information exchanged between two users using the three encryption algorithms DH, RSA, and AES, consists in using steganographic photos to hide the contents of the p, g, and ClesAES values that are otherwise sent unencrypted in the DRA model to compute each user's public key. This work includes a comparative study between the DRA model and existing solutions, as well as the modification made to this model, with an emphasis on reliability in terms of security. The study presents a simulation to demonstrate the effectiveness of the modification made to the DRA model. The obtained results show that our model has a security advantage over the existing solutions, so these changes were made to reinforce the security of the DRA model.
Keywords: Diffie-Hellman, DRA, RSA, advanced encryption standard
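A toy sketch of the Diffie-Hellman key-agreement step that the DRA model builds on is given below; the small prime, the generator, and the SHA-256 key derivation are illustrative assumptions, and the RSA, AES, and steganography components of the DRA model are not shown.
```python
# Toy sketch of Diffie-Hellman key agreement followed by symmetric key derivation.
# The prime, generator, and SHA-256 derivation are illustrative only; a real deployment
# would use standardized DH groups (or ECDH), a proper KDF, and AES encryption afterwards.
import hashlib
import secrets

p = 0xFFFFFFFB  # toy prime for illustration only -- far too small for real security
g = 5           # toy generator

a = secrets.randbelow(p - 2) + 1     # Alice's private exponent
b = secrets.randbelow(p - 2) + 1     # Bob's private exponent
A = pow(g, a, p)                     # Alice's public value (g^a mod p)
B = pow(g, b, p)                     # Bob's public value (g^b mod p)

shared_alice = pow(B, a, p)          # (g^b)^a mod p
shared_bob = pow(A, b, p)            # (g^a)^b mod p
assert shared_alice == shared_bob

# Derive a 256-bit symmetric key that could then be fed to an AES cipher.
aes_key = hashlib.sha256(shared_alice.to_bytes(16, "big")).digest()
print("derived symmetric key (hex):", aes_key.hex())
```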
Procedia PDF Downloads 94
19385 Project Management Agile Model Based on Project Management Body of Knowledge Guideline
Authors: Mehrzad Abdi Khalife, Iraj Mahdavi
Abstract:
This paper presents an agile model for the project management process. The Project Management Body of Knowledge (PMBOK) guideline has been selected as the platform for the project management process. A combination of computational science and artificial intelligence methodology has been added to the guideline to transform the standard into an agile project management process. The model is the combination of a practical standard, computational science, and artificial intelligence. In this model, we present a communication model and protocols to keep the process agile. Here, we illustrate the collaboration of man and machine in the project management area with an artificial intelligence approach.
Keywords: artificial intelligence, conceptual model, man-machine collaboration, project management, standard
Procedia PDF Downloads 342
19384 Design of the Intelligent Virtual Learning Coach. A Contextual Learning Approach to Digital Literacy of Senior Learners in the Context of Electronic Health Record (EHR)
Authors: Ilona Buchem, Carolin Gellner
Abstract:
The call for the support of senior learners in the development of digital literacy has become prevalent in recent years, especially in view of aging societies paired with advances in digitalization in all spheres of life, including e-health. The goal has been to create opportunities for learning that incorporate the use of context in a reflective and dialogical way. Contextual learning has focused on developing skills through the application of authentic problems. While major research efforts in supporting senior learners in developing digital literacy have so far been invested in e-learning, focusing on knowledge acquisition and cognitive tasks, little research exists on reflective mentoring and coaching with the help of pedagogical agents that addresses the contextual dimensions of learning. This paper describes an approach to creating opportunities for senior learners to improve their digital literacy in the authentic context of the electronic health record (EHR) with the support of an intelligent virtual learning coach. The paper focuses on the design of the virtual coach as part of an e-learning system, which was developed in the EPA-Coach project funded by the German Ministry of Education and Research. The paper starts with the theoretical underpinnings of contextual learning and the related design considerations for a virtual learning coach based on previous studies. Since previous research in the area was mostly designed to cater to the needs of younger audiences, the results had to be adapted to the specific needs of senior learners. Next, the paper outlines the stages in the design of the virtual coach, which included the adaptation of the design requirements, the iterative development of the prototypes, the results of the two evaluation studies, and how these results were used to improve the design of the virtual coach. The paper then presents the four prototypes of a senior-friendly virtual learning coach, which were designed to represent different preferences related to visual appearance, communication and social interaction styles, and pedagogical roles. The first evaluation of the virtual coach design was an exploratory, qualitative study, which was carried out in October 2020 with eight seniors aged 64 to 78 and included a range of questions about the preferences of senior learners related to visual design, gender, age, communication, and role. Based on the results of the first evaluation, the design was adapted to the preferences of the senior learners, and new versions of the prototypes were created to represent two male and two female options for the virtual coach. The second evaluation followed a quantitative approach with an online questionnaire and was conducted in May 2021 with 41 seniors aged 66 to 93 years. Following three research questions, the survey asked about (1) the intention to use, (2) the perceived characteristics, and (3) the preferred communication/interaction style of the virtual coach, i.e., task-oriented, relationship-oriented, or a mix. The paper follows with a discussion of the results of the design process and ends with conclusions and next steps in the development of the virtual coach, including recommendations for further research.
Keywords: virtual learning coach, virtual mentor, pedagogical agent, senior learners, digital literacy, electronic health records
Procedia PDF Downloads 181
19383 Parameter Estimation for the Oral Minimal Model and Parameter Distinctions Between Obese and Non-obese Type 2 Diabetes
Authors: Manoja Rajalakshmi Aravindakshan, Devleena Ghosh, Chittaranjan Mandal, K. V. Venkatesh, Jit Sarkar, Partha Chakrabarti, Sujay K. Maity
Abstract:
The Oral Glucose Tolerance Test (OGTT) is the primary test used to diagnose type 2 diabetes mellitus (T2DM) in a clinical setting. Analysis of OGTT data using the Oral Minimal Model (OMM), along with the rate of appearance of ingested glucose (Ra), is performed to study differences in model parameters between control and T2DM groups. The differentiation of the model parameters gives insight into the behaviour and physiology of T2DM. The model is also studied to find parameter differences between obese and non-obese T2DM subjects, and the sensitive parameters were correlated with known physiological findings. Sensitivity analysis is performed to understand changes in parameter values with model output, and appropriate statistical tests are done to support the findings. This seems to be the first preliminary application of the OMM with obesity as a distinguishing factor in understanding T2DM from the estimated parameters of an insulin-glucose model and in relating the statistical differences in parameters to diabetes pathophysiology.
Keywords: oral minimal model, OGTT, obese and non-obese T2DM, mathematical modeling, parameter estimation
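A sketch of the oral minimal model equations in their commonly cited form, integrated with SciPy; the parameter values, insulin curve, and rate-of-appearance function are placeholders rather than the estimates reported in the study.
```python
# Sketch: simulating the oral minimal model of glucose kinetics with placeholder parameters.
# dG/dt = -(S_G + X) G + S_G * Gb + Ra(t)/V ;  dX/dt = -p2 * (X - S_I * (I(t) - Ib))
import numpy as np
from scipy.integrate import solve_ivp

S_G, S_I, p2, V = 0.02, 7e-4, 0.03, 1.7      # placeholder parameter values
Gb, Ib = 90.0, 10.0                           # basal glucose (mg/dL) and insulin (uU/mL)

def Ra(t):       # toy rate of appearance of ingested glucose after the OGTT drink
    return 8.0 * np.exp(-t / 60.0) * (t / 30.0) if t > 0 else 0.0

def insulin(t):  # toy plasma insulin excursion
    return Ib + 60.0 * np.exp(-((t - 45.0) / 40.0) ** 2)

def rhs(t, y):
    G, X = y
    dG = -(S_G + X) * G + S_G * Gb + Ra(t) / V
    dX = -p2 * (X - S_I * (insulin(t) - Ib))
    return [dG, dX]

sol = solve_ivp(rhs, (0, 300), [Gb, 0.0], t_eval=np.linspace(0, 300, 61))
print("peak glucose (mg/dL):", sol.y[0].max().round(1))
```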
Procedia PDF Downloads 94
19382 Model Order Reduction Using Hybrid Genetic Algorithm and Simulated Annealing
Authors: Khaled Salah
Abstract:
Model order reduction has been one of the most challenging topics in recent years. In this paper, a hybrid of the genetic algorithm (GA) and the simulated annealing algorithm (SA) is used to approximate high-order transfer functions (TFs) with lower-order TFs. In this approach, the hybrid algorithm is applied to model order reduction, taking into consideration improving accuracy and preserving the properties of the original model, which are two important issues for improving the performance of simulation and computation and for maintaining the behavior of the original complex models being reduced. Compared to conventional mathematical methods that have been used to obtain reduced-order models of high-order complex models, our proposed method provides better results in terms of reduced run-time. Thus, the proposed technique could be used in electronic design automation (EDA) tools.
Keywords: genetic algorithm, simulated annealing, model reduction, transfer function
Procedia PDF Downloads 144
19381 Application of Machine Learning on Google Earth Engine for Forest Fire Severity, Burned Area Mapping and Land Surface Temperature Analysis: Rajasthan, India
Authors: Alisha Sinha, Laxmi Kant Sharma
Abstract:
Forest fires are a recurring issue in many parts of the world, including India. These fires can have various causes, including human activities (such as agricultural burning, campfires, or discarded cigarettes) and natural factors (such as lightning). This study presents a comprehensive and advanced methodology for assessing wildfire susceptibility by integrating diverse environmental variables and leveraging cutting-edge machine learning techniques across Rajasthan, India. The primary goal of the study is to utilize Google Earth Engine to compare locations in Sariska National Park, Rajasthan (India), before and after forest fires. High-resolution satellite data were used to assess the amount and types of changes caused by forest fires. The present study meticulously analyzes various environmental variables, i.e., slope orientation, elevation, normalized difference vegetation index (NDVI), drainage density, precipitation, and temperature, to understand landscape characteristics and assess wildfire susceptibility. In addition, a sophisticated random forest regression model is used to predict land surface temperature based on a set of environmental parameters.
Keywords: wildfire susceptibility mapping, LST, random forest, GEE, MODIS, climatic parameters
Procedia PDF Downloads 23
19380 The Use of Semantic Mapping Technique When Teaching English Vocabulary at Saudi Schools
Authors: Mohammed Hassan Alshaikhi
Abstract:
Vocabulary is an essential factor in learning and mastering any language, as it helps learners communicate with others and be understood. The aim of this study was to examine whether the semantic mapping technique was helpful in improving students' English vocabulary learning compared to the traditional technique. The students' ages were between 11 and 13 years old. A total of 60 students participated in this study: 30 students were in the treatment group (target vocabulary items were taught with semantic mapping), and the other 30 students were in the control group (target vocabulary items were taught by a traditional technique). A t-test was used with the results of the pre-test and post-test in order to examine the outcomes of using semantic mapping when teaching vocabulary. The results showed that vocabulary mastery increased more in the treatment group than in the control group.
Keywords: English language, learning vocabulary, Saudi teachers, semantic mapping, teaching vocabulary strategies
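A small sketch of the statistical comparison described above, assuming gain scores (post-test minus pre-test) for the two groups of 30 students; the numbers are made-up placeholders.
```python
# Sketch: independent-samples t-test on gain scores for the treatment and control groups.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
gain_treatment = rng.normal(6.0, 2.0, 30)  # semantic-mapping group (post minus pre)
gain_control = rng.normal(3.5, 2.0, 30)    # traditional-technique group

t_stat, p_value = stats.ttest_ind(gain_treatment, gain_control)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```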
Procedia PDF Downloads 248
19379 Towards an Enhanced Compartmental Model for Profiling Malware Dynamics
Authors: Jessemyn Modiini, Timothy Lynar, Elena Sitnikova
Abstract:
We present a novel enhanced compartmental model for malware spread analysis in cyber security. This paper applies cyber security data features to epidemiological compartmental models in order to model the infectious potential of malware. Compartmental models are the most efficient tools for calculating the infectious potential of a disease. In this paper, we discuss and profile epidemiologically relevant data features from a Domain Name System (DNS) dataset and then map these network traffic features onto epidemiological compartmental models. This paper demonstrates how epidemiological principles can be applied to the novel analysis of key cybersecurity behaviours and trends and provides insight into threat modelling beyond that of kill-chain analysis. In applying deterministic compartmental models to a cyber security use case, the authors analyse their deficiencies and provide an enhanced stochastic model for cyber epidemiology. This enhanced compartmental model (the SUEICRN model) is contrasted with the traditional SEIR model to demonstrate its efficacy.
Keywords: cybersecurity, epidemiology, cyber epidemiology, malware
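For reference, the deterministic SEIR baseline that the proposed SUEICRN model is contrasted with can be simulated as below; the rate parameters and population size are arbitrary placeholders.
```python
# Sketch: the deterministic SEIR baseline contrasted with the enhanced SUEICRN model.
# Parameter values are arbitrary placeholders chosen to produce a plausible epidemic curve.
import numpy as np
from scipy.integrate import solve_ivp

beta, sigma, gamma = 0.4, 1 / 3.0, 1 / 7.0   # infection, incubation, and recovery rates
N = 10_000                                    # number of hosts (devices)

def seir(t, y):
    S, E, I, R = y
    dS = -beta * S * I / N
    dE = beta * S * I / N - sigma * E
    dI = sigma * E - gamma * I
    dR = gamma * I
    return [dS, dE, dI, dR]

sol = solve_ivp(seir, (0, 160), [N - 10, 0, 10, 0], t_eval=np.linspace(0, 160, 161))
print("peak infected devices:", int(sol.y[2].max()))
```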
Procedia PDF Downloads 110
19378 Predicting the Next Offensive Play Types will be Implemented to Maximize the Defense’s Chances of Success in the National Football League
Authors: Chris Schoborg, Morgan C. Wang
Abstract:
In the realm of the National Football League (NFL), substantial dedication of time and effort is invested by both players and coaches in meticulously analyzing the game footage of their opponents. The primary aim is to anticipate the actions of the opposing team. Defensive players and coaches are especially focused on deciphering their adversaries' intentions to effectively counter their strategies. Acquiring insights into the specific play type and its intended direction on the field would confer a significant competitive advantage. This study establishes pre-snap information as the cornerstone for predicting both the play type (e.g., deep pass, short pass, or run) and its spatial trajectory (right, left, or center). The dataset for this research spans the regular NFL season data for all 32 teams from 2013 to 2022. This dataset is acquired using the nflreadr package, which conveniently extracts play-by-play data from NFL games and imports it into the R environment as structured datasets. In this study, we employ a recently developed machine learning algorithm, XGBoost. The final predictive model achieves an impressive lift of 2.61. This signifies that the presented model is 2.61 times more effective than random guessing, a significant improvement. Such a model has the potential to markedly enhance defensive coaches' ability to formulate game plans and adequately prepare their players, thus mitigating the opposing offense's yardage and point gains.
Keywords: lift, NFL, sports analytics, XGBoost
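A minimal sketch of a multi-class play-type classifier of the kind described, using XGBoost on hypothetical pre-snap features and reporting a simple lift figure; the feature names and labels are stand-ins, not the variables engineered in the study.
```python
# Sketch: multi-class play-type prediction from pre-snap features with XGBoost.
# Feature names are hypothetical stand-ins for pre-snap variables in play-by-play data.
import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

rng = np.random.default_rng(4)
n = 5000
X = pd.DataFrame({
    "down": rng.integers(1, 5, n),
    "ydstogo": rng.integers(1, 20, n),
    "yardline_100": rng.integers(1, 100, n),
    "score_differential": rng.integers(-21, 22, n),
    "seconds_remaining": rng.integers(0, 3600, n),
})
y = rng.integers(0, 3, n)  # 0 = run, 1 = short pass, 2 = deep pass (placeholder labels)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
clf = XGBClassifier(n_estimators=200, max_depth=4, learning_rate=0.1).fit(X_tr, y_tr)

acc = (clf.predict(X_te) == y_te).mean()
baseline = pd.Series(y_te).value_counts(normalize=True).max()  # majority-guess rate
print("accuracy:", round(acc, 3), " lift over baseline:", round(acc / baseline, 2))
```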
Procedia PDF Downloads 57
19377 Using Happening Performance in Vocabulary Teaching
Authors: Mustafa Gultekin
Abstract:
It is believed that drama can be used in language classes to create a positive atmosphere for students to use the target language in an interactive way. Thus, drama has been used extensively in many settings in language classes. Although happening has generally been used as a performance art in theatre, this kind of performance is not widely known in the language teaching field. Therefore, it can be an innovative idea to use happening in language classes, creating a positive environment for students to use the language interactively. Happening can be defined as an art performance that puts emphasis on interaction with an audience. Because of this interactive feature, happening can also be used in language classes to motivate students to use the language in an interactive environment. The present study aims to explain how a happening performance can be applied to a learning environment to teach vocabulary in English. In line with this purpose, a learning environment was designed for a vocabulary presentation lesson. At the end of the performance, students were asked to compare the traditional way of teaching and the happening performance in terms of effectiveness. It was found that the happening performance provided the students with a more creative and interactive environment in which to use the language. Therefore, happening can be used in language classrooms as an innovative tool for education.
Keywords: English, happening, language learning, vocabulary teaching
Procedia PDF Downloads 368
19376 The Opinions of Nursing Students Regarding Humanized Care through Volunteer Activities at Boromrajonani College of Nursing, Chonburi
Authors: P. Phenpun, S. Wareewan
Abstract:
This qualitative study aimed to describe the opinions regarding humanized care that emerged from the volunteer activities of nursing students at Boromarajonani College of Nursing, Chonburi, Thailand. One hundred and twenty-seven second-year nursing students participated in this study. The volunteer activity model was composed of preparation, implementation, and evaluation through a learning log, in which students were encouraged to write up their daily activities after completing practical training at the healthcare center. The preparation content included three main categories: service mindedness, analytical thinking, and client participation. The preparation process took place over three days, accumulating up to 20 hours. The implementation process was held over 10 days, with a total of 70 hours, with participants taking part in volunteer work activities at a healthcare center. A learning log was used for evaluation, and data were analyzed using content analysis. The findings were as follows. For service mindedness, two subcategories emerged from the volunteer activities: service mindedness towards patients and within themselves. There were three categories under service mindedness towards patients, which were rapport, compassion, and empathy service behaviors, and there were four categories under service mindedness within themselves, which were self-esteem, self-value, management potential, and preparedness in providing good healthcare services. In line with analytical thinking, there were two components: analytical skill for their work and analytical thinking for themselves. There were four subcategories under analytical thinking for their work, which were evidence-based thinking, real situational thinking, cause analysis thinking, and systematic thinking, respectively. There were four subcategories under analytical thinking for themselves, which were comparative thinking between themselves and their clients that leads to changes in their service behaviors, open-minded thinking, modernized thinking, and verifying both verbal and non-verbal cues. Lastly, there were three categories under participation, which were mutual rapport relationship, reconsidering clients' needs for services, and providing useful healthcare information.
Keywords: humanized care service, volunteer activity, nursing student, learning log
Procedia PDF Downloads 308
19375 Integrated Machine Learning Framework for At-Home Patients Personalized Risk Prediction Using Activities, Biometric, and Demographic Features
Authors: Claire Xu, Welton Wang, Manasvi Pinnaka, Anqi Pan, Michael Han
Abstract:
Hospitalizations account for one-third of total health care spending in the US. Early risk detection and intervention can reduce this high cost and increase the satisfaction of both patients and physicians. Because potential risks arising in the home environment often go unnoticed, opportunities for patients to seek early clinical visits are dramatically reduced. This research aims to offer a highly personalized remote patient monitoring and risk assessment AI framework to identify potentially preventable hospitalizations for both acute and chronic diseases. A hybrid AI framework is trained with data from clinical settings, patient surveys, and online databases. More than 20 risk factors are analyzed, ranging from activities, biometric information, demographic information, socio-economic information, hospitalization history, and medication information to lifestyle information. The AI model yields a high performance of 87% accuracy and 88% sensitivity with the 20+ features. This hybrid AI framework is proven effective in identifying potentially preventable hospitalizations. Further, the most indicative features are identified by the models, which guide us towards healthy lifestyle and early intervention suggestions.
Keywords: hospitalization prevention, machine learning, remote patient monitoring, risk prediction
Procedia PDF Downloads 237
19374 An Intelligent Baby Care System Based on IoT and Deep Learning Techniques
Authors: Chinlun Lai, Lunjyh Jiang
Abstract:
Due to the heavy burden and pressure of caring for infants, an integrated automatic baby watching system based on IoT smart sensing and deep learning machine vision techniques is proposed in this paper. By monitoring infant body conditions such as heartbeat, breathing, body temperature, and sleeping posture, as well as surrounding conditions such as dangerous/sharp objects, light, noise, humidity, and temperature, the proposed system can analyze and predict obvious or potential dangerous conditions according to the observed data and then take suitable actions in real time to protect the infant from harm, thus reducing the burden on the caregiver and improving the safety and efficiency of the caring work. The experimental results show that the proposed system works successfully for infant care and thus can be implemented practically in various fields of daily life.
Keywords: baby care system, Internet of Things, deep learning, machine vision
Procedia PDF Downloads 227
19373 Development of a Decision-Making Method by Using Machine Learning Algorithms in the Early Stage of School Building Design
Authors: Rajaian Hoonejani Mohammad, Eshraghi Pegah, Zomorodian Zahra Sadat, Tahsildoost Mohammad
Abstract:
Over the past decade, energy consumption in educational buildings has steadily increased. The purpose of this research is to provide a method to quickly predict the energy consumption of buildings at the early design stage by evaluating zones separately and decomposing the building to eliminate the complexity of its geometry. To produce this framework, machine learning algorithms such as support vector regression (SVR) and artificial neural networks (ANN) are used to predict energy consumption and thermal comfort metrics in a school as a case study. The database consists of more than 55,000 samples in three climates of Iran. Cross-validation and unseen data have been used for validation. For a specific label, cooling energy, the prediction accuracy is at least 84% and 89% for SVR and ANN, respectively. The results show that the SVR performed much better than the ANN.
Keywords: early stage of design, energy, thermal comfort, validation, machine learning
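A brief sketch of cross-validated support vector regression for a single energy label, mirroring the evaluation approach described above; the features, target, and hyperparameters are synthetic placeholders.
```python
# Sketch: cross-validated SVR for a single energy label (e.g., cooling energy).
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)
n = 800
X = rng.uniform(size=(n, 6))          # e.g., zone area, window ratio, orientation, U-values...
y = 50 + 30 * X[:, 0] - 20 * X[:, 1] + rng.normal(0, 2, n)   # placeholder cooling energy

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.5))
scores = cross_val_score(model, X, y, cv=5, scoring="r2")
print("cross-validated R^2:", scores.round(3))
```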
Procedia PDF Downloads 75
19372 University Lecturers' Attitudes towards Learner Autonomy in the EFL Context in Vietnam
Authors: Nhung T. Bui
Abstract:
Part of the dilemma facing educational reforms in Vietnam, as in other Asian contexts, is how to encourage more independence in students’ learning approaches. Since 2005, the Ministry of Education and Training of Vietnam has included the students’ ability to learn independently in its national education objectives. While learner autonomy has been viewed as a goal in the teaching and learning of English as a foreign language (EFL), and there has been a considerable literature on strategies to stimulate autonomy in learners, teachers’ voices have rarely been heard. Given that teachers play a central role in helping their students to be more autonomous, especially in an inherently Confucian heritage culture like Vietnam, their attitudes towards learner autonomy should be investigated before any practical implementations are undertaken. This paper reports significant findings of a survey questionnaire with 262 lecturers of English from 5 universities in Hanoi, Vietnam, giving opinions regarding the practices and prospects of learner autonomy in their classrooms. The study reveals that lecturers perceive they should be more responsible than their students in all class-related activities, that they most appreciate their students’ ability to learn cooperatively, and that they consider stimulating students’ interest the most important teaching strategy to promote learner autonomy. Lecturers, then, are strongly encouraged to gradually ‘empower’ their students through the application of out-of-classroom activities; of learning activities which require collaboration and team spirit; and of activities which could boost students’ interest in learning English.
Keywords: English as a foreign language, higher education, learner autonomy, Vietnam
Procedia PDF Downloads 268
19371 Performance Comparison of Situation-Aware Models for Activating Robot Vacuum Cleaner in a Smart Home
Authors: Seongcheol Kwon, Jeongmin Kim, Kwang Ryel Ryu
Abstract:
We assume an IoT-based smart-home environment where the on-off status of each of the electrical appliances, including the room lights, can be recognized in real time by monitoring and analyzing the smart meter data. At any moment in such an environment, we can recognize what the household member or user is doing by referring to the status data of the appliances. In this paper, we focus on a smart-home service that activates a robot vacuum cleaner at the right time by recognizing the user situation, which requires a situation-aware model that can distinguish the situations that allow vacuum cleaning (Yes) from those that do not (No). We train as candidate models a few classifiers, such as naïve Bayes, decision tree, and logistic regression, that can map the appliance-status data into Yes and No situations. Our training and test data are obtained from simulations of user behaviors, in which a sequence of user situations such as cooking, eating, dish washing, and so on is generated, with the status of the relevant appliances changed in accordance with the situation changes. During the simulation, both the situation transitions and the resulting appliance statuses are determined stochastically. To compare the performances of the aforementioned classifiers, we obtain their learning curves for different types of users through simulations. The result of our empirical study reveals that naïve Bayes achieves a slightly better classification accuracy than the other compared classifiers.
Keywords: situation-awareness, smart home, IoT, machine learning, classifier
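A short sketch of a naïve Bayes situation classifier over binary appliance on/off features, as a stand-in for the simulated smart-home data described above; the appliance columns and labeling rule are assumptions.
```python
# Sketch: naive Bayes classifier deciding whether vacuum cleaning is allowed (1) or not (0)
# from binary appliance on/off states. Appliance columns and the labeling rule are illustrative.
import numpy as np
import pandas as pd
from sklearn.naive_bayes import BernoulliNB
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(6)
n = 2000
X = pd.DataFrame({
    "stove_on": rng.integers(0, 2, n),
    "tv_on": rng.integers(0, 2, n),
    "bedroom_light_on": rng.integers(0, 2, n),
    "dishwasher_on": rng.integers(0, 2, n),
})
# Placeholder rule standing in for the simulated user situations: no vacuuming while the
# TV or the bedroom light is on (the user is likely present and disturbed by noise).
y = ((X["tv_on"] == 0) & (X["bedroom_light_on"] == 0)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = BernoulliNB().fit(X_tr, y_tr)
print("accuracy:", (clf.predict(X_te) == y_te).mean().round(3))
```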
Procedia PDF Downloads 423
19370 Teachers' Empathic Concern for Their Learners
Authors: Prakash Singh
Abstract:
The focus of this exploratory study is on whether teachers demonstrate empathic concern for their learners in planning, implementing, and assessing learning outcomes in their regular classrooms. Empathy must be shown to all learners equally, and not only to high-risk learners at the expense of learners of other abilities. Empathy demonstrated by teachers allows them to build a stronger bond with all their learners. This bond, based on trust, leads to positive outcomes that enable learners to excel in their work. Empathic teachers must make every effort to simplify the subject matter for high-risk learners so that these learners not only enjoy their learning activities but are also successful like their more able peers. A total of 87.5% of the participants agreed that empathy allows teachers to demonstrate humanistic values in their choice of learning materials for learners of different abilities. It is therefore important for teachers to select content and instructional materials that will contribute to the learners’ success in the mainstream of education. It is also imperative for teachers to demonstrate empathic skills and, consequently, to be attuned to the emotions and emotional needs of their learners. Schools need to be reformed, not by simply lengthening the school day or adding more content to the curriculum, but by making school more satisfying to learners. This must be consistent with their diverse learning needs and interests so that they gain a sense of power, fulfillment, and importance in their regular classrooms. Hence, teacher-pupil relationships based on empathic concern for the learners’ educational needs lay the foundation for quality education.
Keywords: emotional intelligence, empathy, learners’ emotional needs, teachers’ empathic skills
Procedia PDF Downloads 437
19369 Autonomous Kuka Youbot Navigation Based on Machine Learning and Path Planning
Authors: Carlos Gordon, Patricio Encalada, Henry Lema, Diego Leon, Dennis Chicaiza
Abstract:
The following work presents a proposal for the autonomous navigation of mobile robots, implemented on the omnidirectional robot Kuka Youbot. We have been able to integrate the Robot Operating System (ROS) and machine learning algorithms. ROS mainly provides two distributions: ROS Hydro and ROS Kinetic. ROS Hydro allows managing the nodes for odometry, kinematics, and path planning with statistical and probabilistic, global and local algorithms based on Adaptive Monte Carlo Localization (AMCL) and Dijkstra. Meanwhile, ROS Kinetic is responsible for the detection of dynamic objects that may lie on the points of the planned trajectory and obstruct the path of the Kuka Youbot. The detection is managed by an artificial vision module with a trained neural network based on the Single Shot MultiBox Detector (SSD), where the main dynamic objects targeted for detection are human beings and domestic animals, among other objects. When objects are detected, the system modifies the trajectory or waits on the decision of the dynamic obstacle. Finally, the obstacles are excluded from the planned trajectory, and the Kuka Youbot can reach its goal thanks to the machine learning algorithms.
Keywords: autonomous navigation, machine learning, path planning, robotic operative system, open source computer vision library
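A compact sketch of Dijkstra path planning on a 2D occupancy grid, the global-planning component mentioned above; the grid, start, and goal are toy values, and in the actual system ROS supplies the map and planner.
```python
# Sketch: Dijkstra shortest-path planning on a 2D occupancy grid (0 = free, 1 = occupied).
import heapq

def dijkstra(grid, start, goal):
    """Return the shortest 4-connected path from start to goal, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    dist = {start: 0}
    prev = {}
    pq = [(0, start)]
    while pq:
        d, node = heapq.heappop(pq)
        if node == goal:                      # reconstruct the path back to the start
            path = [node]
            while node in prev:
                node = prev[node]
                path.append(node)
            return path[::-1]
        if d > dist.get(node, float("inf")):
            continue
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                nd = d + 1
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = node
                    heapq.heappush(pq, (nd, (nr, nc)))
    return None

grid = [[0, 0, 0, 0],
        [1, 1, 0, 1],
        [0, 0, 0, 0],
        [0, 1, 1, 0]]
print(dijkstra(grid, (0, 0), (3, 3)))
```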
Procedia PDF Downloads 178
19368 Optimization for Autonomous Robotic Construction by Visual Guidance through Machine Learning
Authors: Yangzhi Li
Abstract:
Network transfer of information and performance customization are now viable methods of digital industrial production in the era of Industry 4.0. Robot platforms and network platforms have grown more important in digital design and construction. The pressing need for novel building techniques is driven by the growing labor scarcity problem and increased awareness of construction safety. Robotic approaches in construction research are regarded as an extension of operational and production tools. Several technological theories related to autonomous robot recognition, including high-performance computing, physical system modeling, extensive sensor coordination, and deep learning on large datasets, have not yet been explored for intelligent construction, and relevant transdisciplinary theory and practice research still has specific gaps. Optimizing high-performance computing and autonomous visual guidance technologies improves the robot's grasp of the scene and its capacity for autonomous operation. Intelligent vision guidance for industrial robots faces a serious issue with camera calibration, and the use of intelligent visual guidance and identification technologies in industrial production imposes strict accuracy requirements; visual recognition systems therefore face precision challenges. In such a situation, this directly impacts the effectiveness and standard of industrial production, necessitating further study of positioning precision in visual guidance and recognition technology. To best facilitate the handling of complicated components, an approach for the visual recognition of parts utilizing machine learning algorithms is proposed. This study identifies the position of target components by detecting the information at the boundary and corners of a dense point cloud and determining the aspect ratio in accordance with guidelines for the modularization of building components. To collect and use components, the operational processing system assigns them to the same coordinate system based on their locations and postures. The inclination detected in the RGB image and verification against the depth image are used to determine the component's current posture. Finally, a virtual environment model for the robot's obstacle-avoidance route is constructed using the point cloud information.
Keywords: robotic construction, robotic assembly, visual guidance, machine learning
Procedia PDF Downloads 87
19367 Towards Visual Personality Questionnaires Based on Deep Learning and Social Media
Authors: Pau Rodriguez, Jordi Gonzalez, Josep M. Gonfaus, Xavier Roca
Abstract:
Image sharing in social networks has increased exponentially in the past years. Officially, there are 600 million Instagrammers uploading around 100 million photos and videos per day. Consequently, there is a need for developing new tools to understand the content expressed in shared images, which will greatly benefit social media communication and will enable broad and promising applications in education, advertisement, entertainment, and also psychology. Following these trends, our work aims to take advantage of the existing relationship between text and personality, already demonstrated by multiple researchers, so that we can prove that there exists a relationship between images and personality as well. To achieve this goal, we consider that images posted on social networks are typically conditioned on specific words, or hashtags; therefore, any relationship between text and personality can also be observed with those posted images. Our proposal makes use of the most recent image understanding models based on neural networks to process the vast amount of data generated by social users to determine those images most correlated with personality traits. The final aim is to train a weakly supervised image-based model for personality assessment that can be used even when textual data is not available, which is an increasing trend. The procedure is described next: we explore the images directly and publicly shared by users based on those accompanying texts or hashtags most strongly related to personality traits as described by the OCEAN model. These images will be used for personality prediction since they have the potential to convey more complex ideas, concepts, and emotions. As a result, the use of images in personality questionnaires will provide a deeper understanding of respondents than through words alone. In other words, from the images posted with specific tags, we train a deep learning model based on neural networks that learns to extract a personality representation from a picture and use it to automatically find the personality that best explains such a picture. Subsequently, a deep neural network model is learned from thousands of images associated with hashtags correlated to OCEAN traits. We then analyze the network activations to identify those pictures that maximally activate the neurons: the most characteristic visual features per personality trait will thus emerge, since the filters of the convolutional layers of the neural model are learned to be optimally activated depending on each personality trait. For example, among the pictures that maximally activate the high Openness trait, we can see pictures of books, the moon, and the sky. For high Conscientiousness, most of the images are photographs of food, especially healthy food. The high Extraversion output is mostly activated by pictures of a lot of people. In high Agreeableness images, we mostly see flower pictures. Lastly, for the Neuroticism trait, we observe that the high score is maximally activated by pictures of pet animals like cats or dogs. In summary, despite the huge intra-class and inter-class variabilities of the images associated with each OCEAN trait, we found that there are consistencies between the visual patterns of those images whose hashtags are most correlated to each trait.
Keywords: emotions and effects of mood, social impact theory in social psychology, social influence, social structure and social networks
Procedia PDF Downloads 198