Search results for: student network
3629 Physics Informed Deep Residual Networks Based Type-A Aortic Dissection Prediction
Abstract:
Purpose: Acute Type A aortic dissection is a well-known cause of extremely high mortality. A highly accurate and cost-effective non-invasive predictor is critically needed so that patients can be treated at an earlier stage. Although various CFD approaches have been tried to establish prediction frameworks, they are sensitive to uncertainty in both image segmentation and boundary conditions. Tedious pre-processing and demanding calibration requirements further compound the issue, hampering their clinical applicability. Using the latest physics-informed deep learning methods to establish an accurate and cost-effective predictor framework is among the main goals for better Type A aortic dissection treatment. Methods: By training a novel physics-informed deep residual network with non-invasive 4D MRI displacement vectors as inputs, the trained model can cost-effectively calculate the biomarkers used to predict potential Type A aortic dissection and so avoid high-mortality events down the road: aortic blood pressure, wall shear stress (WSS), and the oscillatory shear index (OSI). Results: The proposed deep learning method has been successfully trained and tested with both a synthetic 3D aneurysm dataset and a clinical dataset in the aortic dissection context using the Google Colab environment. In both cases, the model generated aortic blood pressure, WSS, and OSI results matching the expected patient health status. Conclusion: The proposed novel physics-informed deep residual network shows great potential for creating a cost-effective, non-invasive predictor framework. An additional physics-based de-noising algorithm will be added to make the model more robust to clinical data noise. 
Further studies will be conducted in collaboration with large institutions such as the Cleveland Clinic, using more clinical samples, to further improve the model's clinical applicability. Keywords: type-A aortic dissection, deep residual networks, blood flow modeling, data-driven modeling, non-invasive diagnostics, deep learning, artificial intelligence.
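The abstract names its output biomarkers but does not give their formulas. As a hedged illustration, the oscillatory shear index is conventionally defined from the time history of the wall-shear-stress vector as OSI = 0.5 * (1 - |∫τ dt| / ∫|τ| dt), ranging from 0 (unidirectional shear) to 0.5 (purely oscillatory shear). A minimal pure-Python sketch, with WSS sample values invented for illustration:

```python
# Sketch: oscillatory shear index (OSI) from a sampled wall-shear-stress (WSS)
# time series over one cardiac cycle, using the conventional definition
#   OSI = 0.5 * (1 - |integral of tau dt| / integral of |tau| dt).
# The sample WSS vectors below are invented for illustration only.
import math

def osi(wss_series, dt):
    """wss_series: list of (tau_x, tau_y, tau_z) WSS vectors; dt: sample spacing."""
    # Magnitude of the time-integrated WSS vector (rectangle rule for brevity).
    sx = sum(t[0] for t in wss_series) * dt
    sy = sum(t[1] for t in wss_series) * dt
    sz = sum(t[2] for t in wss_series) * dt
    mean_mag = math.sqrt(sx * sx + sy * sy + sz * sz)
    # Time integral of the instantaneous magnitude |tau|.
    total_mag = sum(math.sqrt(t[0]**2 + t[1]**2 + t[2]**2) for t in wss_series) * dt
    if total_mag == 0.0:
        return 0.0
    return 0.5 * (1.0 - mean_mag / total_mag)

# Unidirectional shear gives OSI = 0; perfectly reversing shear gives OSI = 0.5.
steady = [(1.0, 0.0, 0.0)] * 10
oscillating = [(1.0, 0.0, 0.0), (-1.0, 0.0, 0.0)] * 5
```

In a trained model such as the one described, this quantity would be evaluated per wall point from the predicted WSS history rather than from hand-written samples.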
Procedia PDF Downloads 88
3628 The Aftermath of Insurgency on Educational Attainment in Nigeria: A Peril on National Development
Authors: David Chapola Nggada
Abstract:
This survey was designed to find out the impact of the ongoing insurgency in north-eastern Nigeria on educational attainment. It is a mixed qualitative and quantitative study of a sample of 71 secondary school students displaced from the Baga, Biu, and Monguno areas of Borno State, now residing as internally displaced persons (IDPs) in the Gombe and Yola IDP camps. Data were gathered through both semi-structured interviews and questionnaire administration. Statistical methods used include percentages and cross-tabulations to gain specific insight into different dimensions of the problem. Two major aspects of the impact were covered: the impact on the individual student and the impact on societal development. These two dimensions were measured against national development variables and analyzed against reviewed literature and findings from across the globe. A combination of theories from different fields led to a deeper and better insight. The results confirm a significant relationship between educational attainment and the development of the north-east region and Nigeria as a whole. Recommendations were made on ways of reintegrating this group back into the educational system. Keywords: education, insurgency, national development, threat
Procedia PDF Downloads 242
3627 “To Err Is Human…” Revisiting Oral Error Correction in Class
Authors: David Steven Rosenstein
Abstract:
The widely accepted “Input Theory” of language acquisition proposes that language is basically acquired unconsciously through extensive exposure to all kinds of natural oral and written sources, especially those where the level of the input is slightly above the learner’s competence. As such, it implies that oral error correction by teachers in a classroom is unnecessary, a waste of time, and maybe even counterproductive. And yet, oral error correction by teachers in the classroom continues to be a very common phenomenon. While input theory advocates claim that such correction doesn’t work, interrupts a student’s train of thought, harms fluency, and may cause students embarrassment and fear, many teachers would disagree. They would claim that students know they make mistakes and want to be corrected in order to know they are improving, thereby encouraging students’ desire to keep studying. Moreover, good teachers can create a positive atmosphere where students will not be embarrassed or fearful. Perhaps now is the time to revisit oral error correction in the classroom and consider the results of research carried out long ago by the present speaker. The research indicates that oral error correction may be beneficial in many cases. Keywords: input theory, language acquisition, teachers' corrections, recurrent errors
Procedia PDF Downloads 31
3626 Start Talking in an E-Learning Environment: Building and Sustaining Communities of Practice
Authors: Melissa C. LaDuke
Abstract:
The purpose of this literature review was to identify the use of online communities of practice (CoPs) within e-learning environments as a method to build social interaction and student-centered educational experiences. A literature review was conducted to survey and collect scholarly thoughts concerning CoPs from a variety of sources. Data collected included best practices, ties to educational theories, and examples of online CoPs. Social interaction has been identified as a critical piece of the learning infrastructure, specifically for adult learners. CoPs are an effective way to help students connect to each other and the material of interest. The use of CoPs falls in line with many educational theories, including situated learning theory, social constructivism, connectivism, adult learning theory, and motivation. New literacies such as social media and gamification can help increase social interaction in online environments and provide methods to host CoPs. Steps to build and sustain a CoP were discussed in addition to CoP considerations and best practices. Keywords: community of practice, knowledge sharing, social interaction, online course design, new literacies
Procedia PDF Downloads 90
3625 Teaching Children about Their Brains: Evaluating the Role of Neuroscience Undergraduates in Primary School Education
Authors: Clea Southall
Abstract:
Many children leave primary school having formed preconceptions about their relationship with science. Thus, primary school represents a critical window for stimulating scientific interest in younger children. Engagement relies on the provision of hands-on activities coupled with an ability to capture a child’s innate curiosity. This requires children to perceive science topics as interesting and relevant to their everyday life. Teachers and pupils alike have suggested the school curriculum be tailored to help stimulate scientific interest. Young children are naturally inquisitive about the human body; the brain is one topic which frequently engages pupils, although it is not currently included in the UK primary curriculum. Teaching children about the brain could have wider societal impacts, such as increasing knowledge of neurological disorders. However, many primary school teachers do not receive formal neuroscience training and may feel apprehensive about delivering lessons on the nervous system. This is exacerbated by a lack of educational neuroscience resources. One solution is for undergraduates to form partnerships with schools, delivering engaging lessons and supplementing teacher knowledge. The aim of this project was to evaluate the success of a short lesson on the brain delivered by an undergraduate neuroscientist to primary school pupils. Prior to entering schools, semi-structured online interviews were conducted with teachers to gain pedagogical advice, and relevant websites were searched for neuroscience resources. Subsequently, a single lesson plan was created comprising four hands-on activities. The activities were devised in a top-down manner, beginning with learning about the brain as an entity, before focusing on individual neurons. Students were asked to label a ‘brain map’ to assess prior knowledge of brain structure and function. 
They viewed animal brains and created ‘pipe-cleaner neurons’ which were later used to depict electrical transmission. The same session was delivered by an undergraduate student to 570 key stage 2 (KS2) pupils across five schools in Leeds, UK. Post-session surveys, designed for teachers and pupils respectively, were used to evaluate the session. Children in all year groups had relatively poor knowledge of brain structure and function at the beginning of the session. When asked to label four brain regions with their respective functions, older pupils labeled a mean of 1.5 (± 1.0) brain regions compared to 0.8 (± 0.96) for younger pupils (p=0.002). However, by the end of the session, 95% of pupils felt their knowledge of the brain had increased. Hands-on activities were rated most popular by pupils and were considered the most successful aspect of the session by teachers. Although only half the teachers were aware of neuroscience educational resources, nearly all (95%) felt they would have more confidence in teaching a similar session in the future. All teachers felt the session was engaging and that the content could be linked to the current curriculum. Thus, a short fifty-minute session can successfully enhance pupils’ knowledge of a new topic: the brain. Partnerships with undergraduate students can provide an alternative method for supplementing teacher knowledge, increasing teachers' confidence in delivering future lessons on the nervous system. Keywords: education, neuroscience, primary school, undergraduate
Procedia PDF Downloads 209
3624 Computational Linguistic Implications of Gender Bias: Machines Reflect Misogyny in Society
Authors: Irene Yi
Abstract:
Machine learning, natural language processing, and neural network models of language are becoming more and more prevalent in the fields of technology and linguistics today. Training data for machines are, at best, large corpora of human literature and, at worst, a reflection of the ugliness in society. Computational linguistics is a growing field dealing with such issues of data collection for technological development. Machines have been trained on millions of human books, only for researchers to find that, over the course of human history, derogatory and sexist adjectives are used significantly more frequently when describing females in history and literature than when describing males. This is extremely problematic, both as training data and as the outcome of natural language processing. As machines start to handle more responsibilities, it is crucial to ensure that they do not carry forward historical sexist and misogynistic notions. This paper gathers data and algorithms from neural network models of language dealing with syntax, semantics, sociolinguistics, and text classification. Computational analysis of such linguistic data is used to find patterns of misogyny. The results are significant in showing the existing intentional and unintentional misogynistic notions used to train machines, as well as in developing better technologies that take into account the semantics and syntax of text so as to be more mindful and reflect gender equality. Further, this paper deals with the idea of non-binary gender pronouns and how machines can process these pronouns correctly, given their semantic and syntactic context. This paper also delves into the implications of gendered grammar and its effect, cross-linguistically, on natural language processing. Languages such as French or Spanish not only have rigid gendered grammar rules but also historically patriarchal societies. 
The progression of a society goes hand in hand not only with its language but also with how machines process those natural languages. These ideas are all vital to the development of natural language models in technology, and they must be taken into account immediately. Keywords: computational analysis, gendered grammar, misogynistic language, neural networks
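The abstract does not specify its algorithms. As a hedged illustration of the kind of corpus analysis it describes, one simple baseline counts how often descriptive adjectives co-occur with gendered words within a small token window; real studies scale this to large literary corpora or learned embeddings. The word lists and toy corpus below are invented for illustration:

```python
# Sketch: a naive co-occurrence baseline for detecting gendered adjective use.
# Counts how often each adjective of interest appears within a +/-3-token
# window of a gendered word. The word lists and toy corpus are invented for
# illustration; they are not from the paper.
from collections import Counter

FEMALE = {"she", "her", "woman", "women", "girl"}
MALE = {"he", "him", "man", "men", "boy"}
ADJECTIVES = {"hysterical", "shrill", "brilliant", "strong"}

def gendered_adjective_counts(tokens, window=3):
    female, male = Counter(), Counter()
    for i, tok in enumerate(tokens):
        if tok not in ADJECTIVES:
            continue
        # Tokens within `window` positions on either side of the adjective.
        neighborhood = tokens[max(0, i - window): i + window + 1]
        if any(w in FEMALE for w in neighborhood):
            female[tok] += 1
        if any(w in MALE for w in neighborhood):
            male[tok] += 1
    return female, male

corpus = "she seemed hysterical to everyone else while he seemed brilliant to everyone else".split()
f, m = gendered_adjective_counts(corpus)
```

Comparing the two counters over a large corpus (e.g. the relative frequency of derogatory adjectives per gender) is one crude way to surface the asymmetry the abstract reports.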
Procedia PDF Downloads 119
3623 Application of Self-Efficacy Theory in Counseling Deaf and Hard of Hearing Students
Authors: Nancy A. Delich, Stephen D. Roberts
Abstract:
This case study explores using self-efficacy theory in counseling deaf and hard of hearing students in one California school district. Self-efficacy is described as the confidence a student has for performing a set of skills required to succeed at a specific task. When students need to learn a skill, self-efficacy can be a major factor in influencing behavioral change. Self-efficacy is domain specific, meaning that students can have high confidence in their abilities to accomplish a task in one domain, while at the same time having low confidence in their abilities to accomplish another task in a different domain. The communication isolation experienced by deaf and hard of hearing children and adolescents can negatively impact their belief about their ability to navigate life challenges. There is a need to address issues that impact deaf and hard of hearing students’ social-emotional development. Failure to address these needs may result in depression, suicidal ideation, and anxiety among other mental health concerns. Self-efficacy training can be used to address these socio-emotional developmental issues with this population. Four sources of experiences are applied during an intervention: (a) enactive mastery experience, (b) vicarious experience, (c) verbal persuasion, and (d) physiological and affective states. This case study describes the use of self-efficacy training with a coed group of 12 deaf and hard of hearing high school students who experienced bullying at school. Beginning with enactive mastery experience, the counselor introduced the topic of bullying to the group. The counselor educated the students about the different types of bullying while teaching them the terminology, signs and their meanings. The most effective way to increase self-efficacy is through extensive practice. To better understand these concepts, the students practiced through role-playing with the goal of developing self-advocacy skills. 
The second source, vicarious experience, shapes the perception that students have of their capabilities. Viewing other students advocating for themselves, cognitively rehearsing what actions they will and will not take, and teaching each other how to stand up against bullying can strengthen their belief in successfully overcoming bullying. The third source of self-efficacy beliefs is verbal persuasion. It occurs when others express belief in the capabilities of the student. Didactic training and pedagogic materials on bullying were employed as part of the group counseling sessions. The fourth source of self-efficacy appraisals is physiological and affective states. Students expect positive emotions to be associated with successful skilled performance. When students practice new skills, the counselor can apply several strategies to enhance self-efficacy while reducing and controlling emotional and physical states. The intervention plan incorporated all four sources of self-efficacy training during several interactive group sessions regarding bullying. There was an increased understanding of the issues around bullying, resulting in the students' belief in their ability to perform protective behaviors and deter future occurrences. The outcome of the intervention plan was a reduction in reported bullying incidents. In conclusion, self-efficacy training can be an effective counseling and teaching strategy for addressing and enhancing the social-emotional functioning of deaf and hard of hearing adolescents. Keywords: counseling, self-efficacy, bullying, social-emotional development, mental health, deaf and hard of hearing students
Procedia PDF Downloads 351
3622 Development of an Improved Paradigm for the Tourism Sector in the Department of Huila, Colombia: A Theoretical and Empirical Approach
Authors: Laura N. Bolivar T.
Abstract:
The importance of tourism for regional development is mainly highlighted by the collaborative, cooperative, and competitive relationships of the agents involved. The fostering of associativity processes, and in particular the cluster approach, emphasizes the beneficial outcomes of concentrating enterprises, where innovation and entrepreneurship flourish and shape the dynamics for tourism empowerment. The department of Huila is located in the south-west of Colombia and holds the biggest coffee production in the country, although it contributes only marginally to the national GDP. Hence, its economic development strategy is looking for more dynamism, and Huila could be consolidated as a leading destination for cultural, ecological, and heritage tourism if, at a minimum, the public policy-making processes for the tourism management of the Tatacoa Desert, San Agustín Park, and Bambuco's National Festival were implemented more efficiently. Against this background, this study addresses the potential restrictions on, and beneficial factors for, the consolidation of the tourism sector of Huila, Colombia as a cluster, and how this could impact its regional development. Therefore, a set of theoretical frameworks, such as the Tourism Routes Approach, the Tourism Breeding Environment, and the Community-Based Tourism Method, together with a collection of international experiences describing tourism clustering processes and their most prominent problems, is analyzed to draw up learning points, structures of proceedings, and success-driven factors to be contrasted with the local characteristics of Huila, the region under study. 
This characterization involves primary and secondary information collection methods and covers the South American and Colombian context, together with the identification of the actors involved and their roles, the main interactions among them, the major tourism products and their infrastructure, the visitors' perspective on the situation, and a recap of the related needs and benefits for the host community. Considering the umbrella concepts, the theoretical and empirical approaches, and their comparison with the local specificities of the tourism sector in Huila, an array of shortcomings is analytically constructed, and a series of guidelines is proposed as a way to overcome them and, simultaneously, to raise economic development and positively impact Huila's well-being. This non-exhaustive bundle of guidelines focuses on fostering cooperative linkages in the actors' network, adopting innovations in information and communication technologies, reinforcing the supporting infrastructure, promoting the destinations, including the lesser-known places, designing an information system that enables the tourism network to assess the situation based on reliable data, increasing competitiveness, developing participative public policy-making processes, and educating the host community about its touristic richness. Accordingly, cluster dynamics would drive the tourism sector towards articulation and joint effort, and the agents involved and local particularities would be adequately assisted in coping with the current changing environment of globalization and competition. Keywords: innovative strategy, local development, network of tourism actors, tourism cluster
Procedia PDF Downloads 141
3621 Digital Dialogue Game, Epistemic Beliefs, Argumentation and Learning
Authors: Omid Noroozi, Martin Mulder
Abstract:
The motivational potential of educational games is undeniable, especially for teaching topics and skills that are difficult to address in traditional educational situations, such as argumentation competence. Willingness to argue is associated with students' epistemic beliefs, which can influence whether, and in what way, students engage in argumentative discourse activities and critical discussion. The goal of this study was to explore how undergraduate students engage with argumentative discourse activities designed to intensify debate, and whether epistemic beliefs are significant to the outcomes. A pre-test, post-test design was used with students who were assigned to groups of four. They were asked to argue a controversial topic with the aim of exploring various perspectives and the pros and cons of the topic of 'Genetically Modified Organisms (GMOs)'. The results show that the game facilitated argumentative discourse and a willingness to argue and to challenge peers, regardless of students' epistemic beliefs. Furthermore, the game was evaluated positively in terms of students' motivation and satisfaction with the learning experience. Keywords: argumentation, attitudinal change, epistemic beliefs, dialogue, digital game
Procedia PDF Downloads 403
3620 An ANOVA-based Sequential Forward Channel Selection Framework for Brain-Computer Interface Application based on EEG Signals Driven by Motor Imagery
Authors: Forouzan Salehi Fergeni
Abstract:
A brain-computer interface (BCI) system converts a person's movement intents into commands for action using brain signals such as the electroencephalogram (EEG). When left- or right-hand motions are imagined, different patterns of brain activity appear, which can be employed as BCI control signals. To improve BCI systems, effective and accurate techniques for increasing the classification precision of motor imagery (MI) based on EEG are greatly needed. Subject dependency and non-stationarity are two characteristics of EEG signals, so EEG signals must be effectively processed before being used in BCI applications. In the present study, after applying an 8-30 Hz band-pass filter, a common average reference (CAR) spatial filter is applied for denoising, and then an analysis-of-variance method is used to select the more appropriate and informative channels from a large number of candidates. After ordering the channels by their efficiency, sequential forward channel selection is employed to choose just a few reliable ones. Features from the time and wavelet domains are extracted and shortlisted with the help of a statistical technique, namely the t-test. Finally, the selected features are classified with different machine learning and neural network classifiers, namely k-nearest neighbors, probabilistic neural network, support vector machine, extreme learning machine, decision tree, multi-layer perceptron, and linear discriminant analysis, in order to compare their performance in this application. Using a ten-fold cross-validation approach, tests are performed on a motor imagery dataset from BCI Competition III. The outcomes demonstrate that the SVM classifier achieved the greatest classification precision of 97%, compared to the other available approaches. 
The overall findings confirm that the suggested framework is reliable and computationally efficient for the construction of BCI systems and surpasses the existing methods. Keywords: brain-computer interface, channel selection, motor imagery, support-vector-machine
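The abstract does not spell out the channel-ranking criterion; a common choice for the ANOVA step it names is the one-way F statistic computed per channel between the two imagery classes, with channels then sorted by F before the sequential forward pass. A minimal pure-Python sketch, with tiny per-trial feature values invented for illustration:

```python
# Sketch: rank EEG channels by the one-way ANOVA F statistic between two
# motor-imagery classes. For two groups, F = between-group mean square /
# within-group mean square; a larger F suggests a more discriminative channel.
# Channel names and per-trial feature values are invented for illustration.

def anova_f(group_a, group_b):
    na, nb = len(group_a), len(group_b)
    ma = sum(group_a) / na
    mb = sum(group_b) / nb
    grand = (sum(group_a) + sum(group_b)) / (na + nb)
    # Between-group sum of squares (df = 1 for two groups).
    ss_between = na * (ma - grand) ** 2 + nb * (mb - grand) ** 2
    # Within-group sum of squares (df = na + nb - 2).
    ss_within = sum((x - ma) ** 2 for x in group_a) + \
                sum((x - mb) ** 2 for x in group_b)
    return (ss_between / 1) / (ss_within / (na + nb - 2))

def rank_channels(trials_a, trials_b):
    """trials_*: per-class dict mapping channel name -> list of trial features."""
    scores = {ch: anova_f(trials_a[ch], trials_b[ch]) for ch in trials_a}
    return sorted(scores, key=scores.get, reverse=True)

# Channel "C3" separates the two classes well; "Fz" barely does.
left = {"C3": [1.0, 1.1, 0.9], "Fz": [0.5, 0.7, 0.6]}
right = {"C3": [2.0, 2.1, 1.9], "Fz": [0.55, 0.65, 0.6]}
rank = rank_channels(left, right)
```

In the framework described, the sequential forward step would then grow a channel subset greedily down this ranked list, keeping a channel only if it improves cross-validated accuracy.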
Procedia PDF Downloads 47
3619 Assessing Secondary School Curricula in the light of Developing Quality of Life Standards of High School Students
Authors: Othman Ali Alghtani, Yahya Abdul-Ekhalq Ali, Abdullah Abdul-Ekhalq Ali, Ahmed Al Sadiq Abdul Majeed, Najwa Attian Al-Mohammadi, Obead Mozel Alharbi, Sabri Mohamed Ismail, Omar Ibrahim Asiri
Abstract:
This study assessed secondary school curricula in light of the requirements for enhancing students' quality of life. The components of quality of life were described in order to build a list of standards and indicators. A questionnaire assessing the dimensions of mental (cognitive and emotional), physical, digital, and social health, as well as environmental awareness, was prepared. A descriptive-analytical approach was used on a sample of 258 teachers and educational supervisors in Tabuk. The results indicated shortcomings in the secondary school curricula regarding the development of standards and indicators for the components of quality of life. Results also indicated that secondary school curricula incorporated few practices to improve students' quality of life. No significant differences were found regarding core subject, job, gender, or years of experience. Keywords: assessing curricula, teacher practices, quality of life, teaching practices
Procedia PDF Downloads 265
3618 Enhancing Teacher Retention and Professional Satisfaction: An Analysis of Salaries, Policies, and Educational Frameworks
Authors: Melissa Beck Wells
Abstract:
This study examines the complex factors affecting teacher retention across states, focusing on the roles of salaries, educational policies, and professional development. Despite efforts to reduce teacher turnover, it remains a significant challenge, impacting the quality of education and student outcomes. Analysis of data from the National Education Association, the ‘Raise the Bar’ initiative, and the Education Commission of the States reveals a minimal negative correlation between teacher salaries and retention, indicating that salary alone does not determine retention. Additionally, thematic analysis of educational policies and development programs highlights effective strategies for addressing retention challenges. The research emphasizes the need for holistic support systems, including mentorship and professional growth opportunities, to improve retention. These findings urge policymakers and educational leaders to develop comprehensive strategies to maintain a qualified teaching workforce and enhance educational quality and equity nationwide. Keywords: teacher retention, salary levels, educational policies, professional development, teacher turnover
Procedia PDF Downloads 44
3617 Forest Fire Burnt Area Assessment in a Part of West Himalayan Region Using Differenced Normalized Burnt Ratio and Neural Network Approach
Authors: Sunil Chandra, Himanshu Rawat, Vikas Gusain, Triparna Barman
Abstract:
Forest fires are a recurrent phenomenon in the Himalayan region owing to the presence of vulnerable forest types, topographical gradients, climatic conditions, and anthropogenic pressure. The present study focuses on the identification of forest fire-affected areas in a small part of the West Himalayan region using the differenced normalized burnt ratio method and spectral unmixing methods. The study area has rugged terrain with the presence of sub-tropical pine forest, montane temperate forest, and sub-alpine forest and scrub. The major cause of fires in this region is anthropogenic: fires are set to obtain fresh leaves, to scare wild animals away from agricultural crops, in the course of grazing within reserved forests, and for cooking and other purposes. Fires from these causes affect a large area on the ground, necessitating precise estimation of the burnt area for further management and policy making. In the present study, two approaches have been used for the burnt area analysis. The first uses the differenced normalized burnt ratio (dNBR) index, computed from burn ratio values generated using the short-wave infrared (SWIR) and near-infrared (NIR) bands of the Sentinel-2 image. The results of the dNBR have been compared with the outputs of the spectral unmixing methods. It was found that the dNBR produces good results in fire-affected areas with a homogeneous forest stratum and slopes of less than 5 degrees. However, in rugged terrain, where the landscape is largely influenced by topographical variation, vegetation type, and tree density, the results may be strongly affected by topography, complexity in tree composition, fuel load composition, and soil moisture. 
Hence, such variations in the factors influencing burnt area assessment may not be effectively handled by the dNBR approach commonly followed for burnt area assessment over large areas. Therefore, another approach attempted in the present study utilizes a spectral unmixing method in which each individual pixel is tested before an information class is assigned to it. The method uses a neural network approach utilizing the Sentinel-2 bands. The training and testing data are generated from the Sentinel-2 data and the national field inventory, and are further used for generating outputs with machine learning tools. The analysis of the results indicates that fire-affected regions and their severity can be better estimated using spectral unmixing methods, which have the capability to resolve noise in the data and can assign each individual pixel to the precise burnt/unburnt class. Keywords: categorical data, log linear modeling, neural network, shifting cultivation
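For reference, the dNBR index used above derives from the normalized burn ratio NBR = (NIR - SWIR) / (NIR + SWIR), computed before and after the fire, with dNBR = NBR_pre - NBR_post; larger dNBR values indicate more severe burning, since burning lowers NIR and raises SWIR reflectance. A minimal sketch follows; the reflectance values are invented, and the severity thresholds follow commonly cited USGS ranges rather than anything specified in this abstract:

```python
# Sketch: differenced normalized burnt ratio (dNBR) from NIR and SWIR
# reflectances. NBR = (NIR - SWIR) / (NIR + SWIR); dNBR = NBR_pre - NBR_post.
# Reflectance values below are invented; the severity thresholds follow
# commonly cited USGS burn-severity ranges and vary between studies.

def nbr(nir, swir):
    return (nir - swir) / (nir + swir)

def dnbr(nir_pre, swir_pre, nir_post, swir_post):
    return nbr(nir_pre, swir_pre) - nbr(nir_post, swir_post)

def severity(d):
    if d < 0.1:
        return "unburnt"
    if d < 0.27:
        return "low"
    if d < 0.66:
        return "moderate"
    return "high"

# Healthy vegetation pre-fire (high NIR, low SWIR); charred surface post-fire.
d = dnbr(nir_pre=0.5, swir_pre=0.2, nir_post=0.2, swir_post=0.45)
label = severity(d)
```

For Sentinel-2 these bands are typically band 8 (NIR) and band 12 (SWIR); applying the two functions per pixel over co-registered pre/post scenes yields the dNBR map the abstract compares against the spectral unmixing output.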
Procedia PDF Downloads 52
3616 Mathematics Professional Development: Uptake and Impacts on Classroom Practice
Authors: Karen Koellner, Nanette Seago, Jennifer Jacobs, Helen Garnier
Abstract:
Although studies of teacher professional development (PD) are prevalent, surprisingly most have produced only incremental shifts in teachers’ learning and in their impact on students. There is a critical need to understand what teachers take up and use in their classroom practice after attending PD, and why we often do not see greater changes in learning and practice. This paper is based on a mixed-methods efficacy study of the Learning and Teaching Geometry (LTG) video-based mathematics professional development materials. The extent to which the materials produce a beneficial impact on teachers’ mathematics knowledge, classroom practices, and their students’ knowledge in the domain of geometry is considered through a group-randomized experimental design. Included is a close-up examination of a small group of teachers to better understand their interpretations of the workshops and their classroom uptake. The participants included 103 secondary mathematics teachers serving grades 6-12 in two US states in different regions. Randomization was conducted at the school level, with 23 schools and 49 teachers assigned to the treatment group and 18 schools and 54 teachers assigned to the comparison group. The case study examination included twelve treatment teachers. PD workshops for treatment teachers began in Summer 2016. Nine full days of professional development were offered to teachers, beginning with a one-week institute (Summer 2016) followed by four days of PD throughout the academic year. The same facilitator led all of the workshops, after completing a facilitator preparation process that included a multi-faceted assessment of fidelity. The overall impact of the LTG PD program was assessed from multiple sources: two teacher content assessments, two PD-embedded assessments, pre-post-post videotaped classroom observations, and student assessments. 
Additional data were collected from the case study teachers, including further videotaped classroom observations and interviews. Repeated-measures ANOVA analyses were used to detect patterns of change in the treatment teachers’ content knowledge before and after completion of the LTG PD, relative to the comparison group. No significant effects were found across the two groups of teachers on the two teacher content assessments. Teachers were rated on the quality of the mathematics instruction captured in the videotaped classroom observations using the Math in Common Observation Protocol. On average, teachers who attended the LTG PD intervention improved their ability to engage students in mathematical reasoning and to provide accurate, coherent, and well-justified mathematical content. In addition, both the LTG PD intervention and instruction that engaged students in mathematical practices positively and significantly predicted greater student knowledge gains. Teacher knowledge was not a significant predictor. Twelve treatment teachers self-selected to serve as case study teachers and provided additional videotapes of lessons in which they felt they were using something they had learned and experienced in the PD. Project staff analyzed the videos, compared them to previous videos, and interviewed the teachers regarding their uptake of the PD in relation to content knowledge, pedagogical knowledge, and the resources used. The full paper will include the case study of Ana to illustrate the factors involved in what teachers take up and use from participating in the LTG PD. Keywords: geometry, mathematics professional development, pedagogical content knowledge, teacher learning
Procedia PDF Downloads 124
3615 High-Fidelity Materials Screening with a Multi-Fidelity Graph Neural Network and Semi-Supervised Learning
Authors: Akeel A. Shah, Tong Zhang
Abstract:
Computational approaches to learning the properties of materials are commonplace, motivated by the need to screen or design materials for a given application, e.g., semiconductors and energy storage. Experimental approaches can be both time-consuming and costly. Unfortunately, computational approaches such as ab-initio electronic structure calculations and classical or ab-initio molecular dynamics can themselves be too slow for the rapid evaluation of materials, which often involves thousands to hundreds of thousands of candidates. Machine learning assisted approaches have been developed to overcome the time limitations of purely physics-based approaches. These approaches, on the other hand, require large volumes of data for training (hundreds of thousands of examples on many standard data sets such as QM7b). This means that they are limited by how quickly such a large data set of physics-based simulations can be established. At high fidelity, such as configuration interaction, composite methods such as G4, and coupled cluster theory, gathering such a large data set can become infeasible, which can compromise the accuracy of the predictions; many applications require high accuracy, for example, band structures and energy levels in semiconductor materials and the energetics of charge transfer in energy storage materials. In order to circumvent this problem, multi-fidelity approaches can be adopted, for example the Δ-ML method, which learns a high-fidelity output from a low-fidelity result such as Hartree-Fock or density functional theory (DFT).
The general strategy is to learn a map between the low- and high-fidelity outputs, so that the high-fidelity output is obtained as a simple sum of the physics-based low-fidelity result and a learned correction. Although this requires a low-fidelity calculation, it typically requires far fewer high-fidelity results to learn the correction map; furthermore, the low-fidelity result, such as Hartree-Fock or semi-empirical ZINDO, is typically quick to obtain. For high-fidelity outputs the result can be a speed-up of an order of magnitude or more. In this work, a new multi-fidelity approach is developed, based on a graph convolutional network (GCN) combined with semi-supervised learning. The GCN allows for the material or molecule to be represented as a graph, which is known to improve accuracy, as in, for example, SchNet and MEGNet. The graph incorporates information regarding the numbers, types, and properties of atoms; the types of bonds; and the bond angles. The key to the accuracy in multi-fidelity methods, however, is the incorporation of the low-fidelity output to learn the high-fidelity equivalent, in this case by learning their difference. Semi-supervised learning is employed to allow for different numbers of low- and high-fidelity training points, by using an additional GCN-based low-fidelity map to predict high-fidelity outputs. It is shown on 4 different data sets that a significant (at least one order of magnitude) increase in accuracy is obtained, using one to two orders of magnitude fewer low- and high-fidelity training points. One of the data sets is developed in this work, pertaining to 1000 simulations of quinone molecules (up to 24 atoms) at 5 different levels of fidelity, furnishing the energy, dipole moment and HOMO/LUMO.
Keywords: materials screening, computational materials, machine learning, multi-fidelity, graph convolutional network, semi-supervised learning
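The Δ-ML strategy described in the abstract can be sketched in a few lines. This is an illustrative toy only (synthetic descriptors and energies, not the QM7b or quinone data, and a linear correction in place of a GCN): a correction from low- to high-fidelity outputs is fitted on a small number of expensive labels and then added to the cheap result.

```python
import numpy as np

# Synthetic stand-ins: 5 molecular descriptors per "molecule", a cheap
# low-fidelity energy, and an expensive high-fidelity energy.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
E_low = X @ np.array([1.0, 0.5, -0.2, 0.0, 0.3])
E_high = E_low + X @ np.array([0.1, -0.05, 0.02, 0.2, 0.0]) + 0.01 * rng.normal(size=200)

# Delta-ML: fit only the difference E_high - E_low, which needs far
# fewer expensive labels than fitting E_high from scratch would.
n_high = 20  # only 20 high-fidelity calculations available
delta = E_high[:n_high] - E_low[:n_high]
w, *_ = np.linalg.lstsq(X[:n_high], delta, rcond=None)

E_pred = E_low + X @ w  # prediction = cheap result + learned correction
rmse = np.sqrt(np.mean((E_pred - E_high) ** 2))
baseline = np.sqrt(np.mean((E_low - E_high) ** 2))
print(rmse < baseline)  # the correction shrinks the error
```

The same additive structure carries over when the linear map is replaced by a graph network, as in the approach the abstract proposes.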
Procedia PDF Downloads 37
3614 AINA: Disney Animation Information as Educational Resources
Authors: Piedad Garrido, Fernando Repulles, Andy Bloor, Julio A. Sanguesa, Jesus Gallardo, Vicente Torres, Jesus Tramullas
Abstract:
With the emergence and development of Information and Communications Technologies (ICTs), Higher Education is experiencing rapid changes, not only in its teaching strategies but also in students’ learning skills. However, we have noticed that students often have difficulty when seeking innovative, useful, and interesting learning resources for their work. This is due to the lack of supervision in the selection of good query tools. This paper presents AINA, an Information Retrieval (IR) computer system aimed at providing motivating and stimulating content to both students and teachers working on different areas and at different educational levels. In particular, our proposal consists of an open virtual resource environment oriented to the vast universe of Disney comics and cartoons. Our test suite includes Disney’s long and short films, and we have performed some activities based on the Just In Time Teaching (JiTT) methodology. More specifically, it has been tested by groups of university and secondary school students.
Keywords: information retrieval, animation, educational resources, JiTT
Procedia PDF Downloads 346
3613 Investigating Factors Impacting Student Motivation in Classroom Use of Digital Games
Authors: Max Neu
Abstract:
A large variety of studies on the utilization of games in classroom settings report positive effects on students’ motivation for learning. Still, most of those studies can rarely give any specifics about the factors that might lead to changes in students’ motivation. The study presented here was conducted in tandem with the development of a highly classroom-optimized serious game, with the intent of providing a subjectively positive initial contact with the subject of political participation and of enabling the development of personal motivation towards further engagement with the topic. The goal of this explorative study was to identify the factors that influence students’ motivation towards the subject when serious games are being used in classroom education. To this end, students were exposed to a set of classes in which a classroom-optimized serious game was used. Afterwards, a selection of them were questioned in guided interviews, which were evaluated through qualitative content analysis. The study indicates that at least 23 factors in the categories of mechanics, content, and context potentially influence students’ motivation to engage with the subject of the classes. The conclusions are of great value for the further production of classroom games as well as curricula involving digital games in general.
Keywords: formal education, games in classroom, motivation, political education
Procedia PDF Downloads 108
3612 Integrated Models of Reading Comprehension: Understanding to Impact Teaching—The Teacher’s Central Role
Authors: Sally A. Brown
Abstract:
Over the last 30 years, researchers have developed models or frameworks to provide a more structured understanding of the reading comprehension process. Cognitive information processing models and social cognitive theories both provide frameworks to inform reading comprehension instruction. The purpose of this paper is to (a) provide an overview of the historical development of reading comprehension theory, (b) review the literature framed by cognitive information processing, social cognitive, and integrated reading comprehension theories, and (c) demonstrate how these frameworks inform instruction. As integrated models of reading can guide the interpretation of various factors related to student learning, an integrated framework designed by the researcher will be presented. Results indicated that features of cognitive processing and social cognitivism theory—represented in the integrated framework—highlight the importance of the role of the teacher. This model can aid teachers in not only improving reading comprehension instruction but in identifying areas of challenge for students.
Keywords: explicit instruction, integrated models of reading comprehension, reading comprehension, teacher’s role
Procedia PDF Downloads 96
3611 Artificial Neural Networks in Environmental Psychology: Application in Architectural Projects
Authors: Diego De Almeida Pereira, Diana Borchenko
Abstract:
Artificial neural networks are used for many applications as they are able to learn complex nonlinear relationships between input and output data. As the number of neurons and layers in a neural network increases, it is possible to represent more complex behaviors. The present study proposes that artificial neural networks are a valuable tool for architecture and engineering professionals concerned with understanding how buildings influence human and social well-being based on theories of environmental psychology.
Keywords: environmental psychology, architecture, neural networks, human and social well-being
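As a concrete illustration of the capacity claim above (a toy example, not from the study): a purely linear model cannot represent the XOR relation, but a single hidden layer with two ReLU neurons can, using hand-set weights.

```python
import numpy as np

# Hand-set 2-neuron hidden layer computing XOR:
#   h1 = relu(x1 + x2), h2 = relu(x1 + x2 - 1), y = h1 - 2*h2
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)

W1 = np.array([[1.0, 1.0], [1.0, 1.0]])  # input -> hidden weights
b1 = np.array([0.0, -1.0])               # hidden biases
w2 = np.array([1.0, -2.0])               # hidden -> output weights

hidden = np.maximum(X @ W1 + b1, 0.0)    # ReLU activation
y = hidden @ w2
print(y.tolist())  # -> [0.0, 1.0, 1.0, 0.0], i.e. XOR of the two inputs
```

Adding neurons and layers enlarges the family of representable functions in the same way, which is what makes such networks candidates for modeling nonlinear building-occupant relationships.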
Procedia PDF Downloads 494
3610 Diversity in the Community - The Disability Perspective
Authors: Sarah Reker, Christiane H. Kellner
Abstract:
From the perspective of people with disabilities, inequalities can also emerge from spatial segregation, the lack of social contacts or limited economic resources. In order to reduce or even eliminate these disadvantages and increase general well-being, community-based participation as well as decentralisation efforts within exclusively residential homes is essential. Therefore, the new research project “Index for participation development and quality of life for persons with disabilities” (TeLe-Index, 2014-2016), which is anchored at the Technische Universität München in Munich and at a large residential complex and service provider for persons with disabilities in the outskirts of Munich, aims to assist the development of community-based living environments. People with disabilities should be able to participate in social life beyond the confines of the institution. Since a diverse society is a society in which different individual needs and wishes can emerge and be catered to, the ultimate goal of the project is to create an environment for all citizens–regardless of disability, age or ethnic background–that accommodates their daily activities and requirements. The UN Convention on the Rights of Persons with Disabilities, which Germany also ratified, postulates the necessity of user-centered design, especially when it comes to evaluating the individual needs and wishes of all citizens. Therefore, a multidimensional approach is required. Based on this insight, the structure of the town-like center will be remodeled to open up the community to all people. This strategy should lead to more equal opportunities and open the way for a much more diverse community. Therefore, macro-level research questions were inspired by quality of life theory and were formulated as follows for different dimensions:
• The user dimension: what needs and necessities can we identify? Are needs person-related? Are there any options to choose from? What type of quality of life can we identify?
• The economic dimension: what resources (both material and staff-related) are available in the region? (How) are they used? What costs (can) arise and what effects do they entail?
• The environment dimension: what “environmental factors” such as access (mobility and absence of barriers) prove beneficial or impedimental?
In this context, we have provided academic supervision and support for three projects (the construction of a new school, inclusive housing for children and teenagers with disabilities and the professionalization of employees with person-centered thinking). Since we cannot present all the issues of the umbrella project within the conference framework, we will be focusing on one project more in-depth, namely “Outpatient Housing Options for Children and Teenagers with Disabilities”. The insights we have obtained until now will enable us to present the intermediary results of our evaluation. The most central questions pertaining to this part of the research were the following:
• How have the existing network relations been designed?
• What meaning (or significance) do the existing service offers and structures have for the everyday life of an external residential group?
These issues underpinned the environmental analyses as well as the qualitative guided interviews and qualitative network analyses we carried out.
Keywords: decentralisation, environmental analyses, outpatient housing options for children and teenagers with disabilities, qualitative network analyses
Procedia PDF Downloads 365
3609 The Novelty of Mobile Money Solution to Ghana’s Cashless Future: Opportunities, Challenges and Way Forward
Authors: Julius Y Asamoah
Abstract:
Mobile money has seen rapid adoption over the past decade. Its emergence serves as an essential driver of financial inclusion and an innovative financial service delivery channel, especially for the unbanked population. The rising importance of mobile money services has caught the attention of policymakers and regulators, who seek to understand the many issues emerging from this context while unlocking the potential of this new technology. Regulatory responses and support are essential, requiring significant changes to current regulatory practices in Ghana. The article aims to answer the following research questions: “What risk does an unregulated mobile money service pose to consumers and the financial system?” and “What factors stimulate and hinder the introduction of mobile payments in developing countries?” The sample size used was 250 respondents selected from the study area. The study has adopted an analytical approach comprising a combination of qualitative and quantitative data collection methods. Actor-network theory (ANT) is used as an interpretive lens to analyse this process. ANT helps analyse how actors form alliances and enrol other actors, including non-human actors (i.e. technology), to secure their interests. The study revealed that government regulatory policies are critical to mobile money services in developing countries. The regulatory environment should balance the need to advance access to finance with the stability of the financial system, and should draw extensively from Kenya’s experience for the best strategies for the system’s players. Thus, regulators need to address issues related to the enhancement of supportive regulatory frameworks. It is recommended that the government involve various stakeholders, such as mobile phone operators.
Moreover, the national regulatory authority should create a regulatory environment that promotes fair practices and competition in order to raise revenues that support the key pillars of a business-enabling environment, such as infrastructure.
Keywords: actor-network theory (ANT), cashless future, developing countries, Ghana, mobile money
Procedia PDF Downloads 137
3608 Virtual Science Hub: An Open Source Platform to Enrich Science Teaching
Authors: Enrique Barra, Aldo Gordillo, Juan Quemada
Abstract:
This paper presents the Virtual Science Hub platform. It is an open source platform that combines a social network, an e-learning authoring tool, a video conference service and a learning object repository for science teaching enrichment. These four main functionalities fit very well together. The platform was released in April 2012 and has not stopped growing since. Finally, we present the results of the surveys conducted and the statistics gathered to validate this approach.
Keywords: e-learning, platform, authoring tool, science teaching, educational sciences
Procedia PDF Downloads 395
3607 The Effect of Physical Biorhythm Cycle on Health-Related Fitness Factors
Authors: Leyli Khavari, Javad Yousefian
Abstract:
The aim of this study was to investigate the effect of the physical biorhythm cycle on health-related fitness factors. For this purpose, 120 athlete and non-athlete male and female students were selected randomly and, based on their level of physical activity, divided into athletic and non-athletic groups. The exact date of birth of each subject, and hence when the subjects were in the positive, negative, and critical phases of the physical biorhythm cycle, was determined with biorhythm calculation software. The physical fitness tests included the Queens College Step Test, AAHPERD sit-ups, the Wells stretch test, and a hand dynamometer. Students were tested in three stages: in the positive, negative, and critical phases of the physical cycle. Data were processed using SPSS software; repeated measures ANOVA and Student’s t-test for dependent samples were used. The results of this study showed that the health-related fitness factors were not affected by changes in the 23-day physical cycle.
Keywords: AAHPERD test, biorhythm, physical cycle, Queens College Step Test
Procedia PDF Downloads 182
3606 Academic Staff’s Perception and Willingness to Participate in Collaborative Research: Implication for Development in Sub-Saharan Africa
Authors: Ademola Ibukunolu Atanda
Abstract:
Research undertakings are meant to proffer solutions to issues and challenges in society. This justifies the need for research in ivory towers. Multinational and non-governmental organisations, as well as foundations, commit financial resources to support research endeavours. In recent times, the direction and dimension of research undertakings encourage collaboration, whereby experts from different disciplines or specializations bring their expertise to bear on an identified problem, whether in the humanities or the sciences. However, the extent to which collaborative research undertakings are perceived and embraced by academic staff would determine the impact collaborative research would have on society. To this end, this study investigated academic staff’s perception of and willingness to be involved in collaborative research for the purpose of proffering solutions to societal problems. The study adopted a descriptive research design. The population comprised academic staff in southern Nigeria. The sample was drawn through a convenience sampling technique. The data were collected using a questionnaire titled “Perception and Willingness to Participate in Collaborative Research Questionnaire (PWPCRQ)” administered via Google Forms. Data collected were analyzed using descriptive statistics of simple percentages, means and charts. The findings showed that academic staff’s readiness to participate in collaborative research is great (89%) and that they participate in collaborative research very often (51%). Academic staff were involved more in collaborative research with colleagues within their universities (1.98) than in interdisciplinary collaboration with colleagues outside Nigeria (1.47). Collaborative research was perceived to impact development (2.5).
Collaborative research offers the following benefits to members: aggregation of views, the building of an extensive network of contacts, enhanced sharing of skills, facilitation of tackling complex problems, increased visibility of the research network and citations, and the promotion of funding opportunities. The study concluded that academic staff in universities in the South-West of Nigeria participate in collaborative research, but with their colleagues within Nigeria rather than outside the country. Based on the findings, it was recommended that the management of universities in South-West Nigeria should encourage collaborative research with some incentives.
Keywords: collaboration, research, development, participation
Procedia PDF Downloads 62
3605 A Study of Topical and Similarity of Sebum Layer Using Interactive Technology in Image Narratives
Authors: Chao Wang
Abstract:
Under the rapid innovation of information technology, the media play a very important role in the dissemination of information, and each generation faces a very different media landscape. However, the involvement of narrative images provides more possibilities for narrative text. “Images” are manufactured through the processes of aperture, camera shutter, and developable photosensitive material, recorded and stamped on paper or displayed on a computer screen, and thereby concretely saved. They exist in different forms as files, data, or evidence of the ultimate looks of events. Through the interface of media and network platforms and the special visual field of the viewer, a class of body space exists and extends out as thin as a sebum layer, extremely soft and delicate, with real full tension. The physical space of the sebum layer confuses the fact that physical objects exist, and needs to be established under a perceived consensus. As at the scene, the existing concepts and boundaries of physical perceptions are blurred. The physical simulation of the sebum layer shapes the “Topical-Similarity” immersion, leading contemporary social practice communities, groups, and network users into a kind of illusion without presence, i.e. a non-real illusion. From the investigation and discussion of the literature, the variable characteristics of time that digital movie editing manufactures and produces (for example, slices, rupture, set, and reset) are analyzed. The interactive eBook has a unique interaction in “Waiting-Greeting” and “Expectation-Response” that functionally opens the operation of the image narrative structure to more interpretations. The works of digital editing and interactive technology are combined and their concept and results further analyzed. After the digitization of interventional imaging and interactive technology, real events remain linked, and the relationship that media handling cannot cut is examined through movies, interactive art, and practical case discussion and analysis.
The audience needs to think more rationally about the authenticity of the images carried by the text.
Keywords: sebum layer, topical and similarity, interactive technology, image narrative
Procedia PDF Downloads 388
3604 Deep Learning Based Text to Image Synthesis for Accurate Facial Composites in Criminal Investigations
Authors: Zhao Gao, Eran Edirisinghe
Abstract:
The production of an accurate sketch of a suspect based on a verbal description obtained from a witness is an essential task for most criminal investigations. The criminal investigation system employs specifically trained professional artists to manually draw a facial image of the suspect according to the descriptions of an eyewitness for subsequent identification. With the advancement of deep learning, Recurrent Neural Networks (RNN) have shown great promise in Natural Language Processing (NLP) tasks. Additionally, Generative Adversarial Networks (GAN) have also proven to be very effective in image generation. In this study, a trained GAN conditioned on textual features, such as keywords automatically encoded from a verbal description of a human face using an RNN, is used to generate photo-realistic facial images for criminal investigations. The intention of the proposed system is to map text generated from verbal descriptions onto corresponding facial features. With this, it becomes possible to generate many reasonably accurate alternatives which the witness can use to identify a suspect. This reduces subjectivity in decision making both by the eyewitness and the artist while giving the witness an opportunity to evaluate and reconsider decisions. Furthermore, the proposed approach benefits law enforcement agencies by reducing the time taken to physically draw each potential sketch, thus improving response times and mitigating potentially malicious human intervention. With the publicly available ‘CelebFaces Attributes Dataset’ (CelebA), supplemented with verbal descriptions as training data, the proposed architecture is able to effectively produce facial structures from given text. Word embeddings are learnt by applying the RNN architecture in order to perform semantic parsing, the output of which is fed into the GAN for synthesizing photo-realistic images.
Rather than the grid search method, a metaheuristic search based on genetic algorithms is applied to evolve the network, with the intent of achieving optimal hyperparameters in a fraction of the time of a typical brute-force approach. Beyond the ‘CelebA’ training database, further novel test cases are supplied to the network for evaluation. Witness reports detailing criminals from Interpol or other law enforcement agencies are sampled on the network. Using the descriptions provided, samples are generated and compared with the ground truth images of a criminal in order to calculate the similarities. Two factors are used for performance evaluation: the Structural Similarity Index (SSIM) and the Peak Signal-to-Noise Ratio (PSNR). High scores on these performance metrics should help demonstrate the accuracy of the approach, in the hope of proving that it can be an effective tool for law enforcement agencies. The proposed approach to criminal facial image generation has the potential to increase the ratio of criminal cases that can ultimately be resolved using eyewitness information gathering.
Keywords: RNN, GAN, NLP, facial composition, criminal investigation
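Of the two evaluation measures named above, PSNR is simple enough to sketch directly; a minimal numpy version on toy data (not the CelebA images) is:

```python
import numpy as np

def psnr(reference, generated, max_val=255.0):
    """Peak Signal-to-Noise Ratio in dB; higher means closer images."""
    mse = np.mean((np.asarray(reference, float) - np.asarray(generated, float)) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(max_val ** 2 / mse)

# Toy 8-bit "images": a ground-truth patch and a noisy generated one.
rng = np.random.default_rng(1)
truth = rng.integers(0, 256, size=(64, 64)).astype(float)
noisy = np.clip(truth + rng.normal(0.0, 5.0, size=(64, 64)), 0, 255)
print(round(psnr(truth, noisy), 1))  # roughly 34 dB for sigma-5 noise
```

SSIM additionally compares local luminance, contrast, and structure over sliding windows, which is why the two metrics are usually reported together.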
Procedia PDF Downloads 159
3603 A Graph Theoretic Algorithm for Bandwidth Improvement in Computer Networks
Authors: Mehmet Karaata
Abstract:
Given two distinct vertices (nodes), source s and target t, of a graph G = (V, E), the two node-disjoint paths problem is to identify two node-disjoint paths between s ∈ V and t ∈ V. Two paths are node-disjoint if they have no common intermediate vertices. In this paper, we present an algorithm with O(m)-time complexity for finding two node-disjoint paths between s and t in arbitrary graphs, where m is the number of edges. The proposed algorithm has a wide range of applications in ensuring reliability and security of sensor, mobile and fixed communication networks.
Keywords: disjoint paths, distributed systems, fault-tolerance, network routing, security
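The abstract does not spell out its O(m) construction, so as background, here is a standard (slower) way to compute two node-disjoint paths: split each vertex and run unit-capacity max-flow. This is a hedged sketch with an illustrative graph, not the paper's algorithm.

```python
from collections import defaultdict, deque

def two_node_disjoint_paths(edges, s, t):
    """Return two internally node-disjoint s-t paths, or None if none exist.

    Classic reduction: split each vertex v into v_in -> v_out with
    capacity 1 (2 for s and t), then push unit-capacity max-flow; a
    flow of value 2 decomposes into the two paths.
    """
    cap = defaultdict(int)
    adj = defaultdict(set)
    forward = set()  # original (non-residual) arcs

    def add_arc(u, v, c):
        cap[(u, v)] += c
        forward.add((u, v))
        adj[u].add(v)
        adj[v].add(u)  # residual direction

    nodes = {s, t} | {x for e in edges for x in e}
    for v in nodes:
        add_arc((v, "in"), (v, "out"), 2 if v in (s, t) else 1)
    for u, v in edges:  # undirected edges, unit capacity each way
        add_arc((u, "out"), (v, "in"), 1)
        add_arc((v, "out"), (u, "in"), 1)

    src, sink = (s, "in"), (t, "out")
    for _ in range(2):  # two augmenting rounds = flow value 2
        parent = {src: None}
        queue = deque([src])
        while queue and sink not in parent:
            u = queue.popleft()
            for v in adj[u]:
                if v not in parent and cap[(u, v)] > 0:
                    parent[v] = u
                    queue.append(v)
        if sink not in parent:
            return None  # a single vertex separates s from t
        v = sink
        while parent[v] is not None:  # push one unit along the BFS path
            u = parent[v]
            cap[(u, v)] -= 1
            cap[(v, u)] += 1
            v = u

    paths = []
    for _ in range(2):  # decode the flow into two vertex-disjoint paths
        path, u = [], (s, "in")
        while True:
            name, side = u
            if side == "in":
                path.append(name)
            if u == sink:
                break
            for v in adj[u]:  # follow and consume one unit of flow
                if (u, v) in forward and cap[(v, u)] > 0:
                    cap[(v, u)] -= 1
                    u = v
                    break
        paths.append(path)
    return paths
```

The max-flow detour is what the paper's O(m) result avoids; the reduction above costs O(m) per augmentation plus the decoding pass, but is easy to verify.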
Procedia PDF Downloads 441
3602 Using Statistical Significance and Prediction to Test Long/Short Term Public Services and Patients' Cohorts: A Case Study in Scotland
Authors: Raptis Sotirios
Abstract:
Health and social care (HSc) services planning and scheduling are facing unprecedented challenges due to pandemic pressure, and they also suffer from unplanned spending negatively impacted by the global financial crisis. Data-driven approaches can help to improve policies and to plan and design service provision schedules, using algorithms that assist healthcare managers in facing unexpected demands with fewer resources. The paper discusses service packing using statistical significance tests and machine learning (ML) to evaluate demand similarity and coupling. This is achieved by predicting the range of the demand (class) using ML methods such as CART, random forests (RF), and logistic regression (LGR). The Chi-squared and Student’s t significance tests are used on data over a 39-year span for which HSc services data exist for services delivered in Scotland. The demands are probabilistically associated through statistical hypotheses that assume, as the null hypothesis, that the target service’s demands are statistically dependent on other demands. This linkage can be confirmed or not by the data. Complementarily, ML methods are used to linearly predict the above target demands from the statistically found associations and to extend the linear dependence of the target’s demand to independent demands, thus forming groups of services. Statistical tests confirm the ML couplings, making the predictions also statistically meaningful and proving that a target service can be matched reliably to other services, while ML shows that these indicated relationships can also be linear ones. Zero padding was used for missing yearly records and better illustrated such relationships, both for limited years and over the entire span, offering long-term data visualizations, while limited-year groups explained how well patient numbers can be related over short periods or can change over time, as opposed to behaviors across more years.
The prediction performance of the associations is measured using Receiver Operating Characteristic (ROC) AUC and ACC metrics as well as the statistical tests, Chi-squared and Student’s t. Co-plots and comparison tables for RF, CART, and LGR, as well as p-values and Information Exchange (IE), are provided, showing the specific behavior of the ML methods and of the statistical tests, and the behavior at different learning ratios. The impact of k-NN, cross-correlation, and C-Means first groupings is also studied over limited years and over the entire span. It was found that CART was generally behind RF and LGR, but in some interesting cases, LGR reached an AUC of 0, falling below CART, while the ACC was as high as 0.912, showing that ML methods can be confused by padding or by data irregularities or outliers. On average, 3 linear predictors were sufficient; LGR was found to compete well with RF, and CART followed with the same performance at higher learning ratios. Services were packed only when the significance level (p-value) of their association coefficient was more than 0.05. Social factors relationships were observed between home care services and treatment of old people, birth weights, alcoholism, drug abuse, and emergency admissions. The work found that different HSc services can be well packed as plans of limited years, across various services sectors and learning configurations, as confirmed using statistical hypotheses.
Keywords: class, cohorts, data frames, grouping, prediction, probability, services
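The packing criterion above rests on a chi-squared independence test between demand classes; a minimal numpy sketch on an illustrative 2x2 contingency table (the counts are made up, not the Scottish data) is:

```python
import numpy as np

# Hypothetical counts: rows = service A demand (high/low),
# columns = service B demand (high/low).
observed = np.array([[30, 10],
                     [12, 28]])

row = observed.sum(axis=1, keepdims=True)
col = observed.sum(axis=0, keepdims=True)
expected = row @ col / observed.sum()  # counts under independence
chi2 = ((observed - expected) ** 2 / expected).sum()

# Critical value for df = (2-1)*(2-1) = 1 at the 5% level is 3.841.
associated = chi2 > 3.841
print(round(chi2, 2), associated)  # -> 16.24 True
```

When the statistic exceeds the critical value, the independence hypothesis is rejected and the two demand series are treated as coupled, which is the precondition for packing them into one service group.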
Procedia PDF Downloads 229
3601 The Canaanite Trade Network between the Shores of the Mediterranean Sea
Authors: Doaa El-Shereef
Abstract:
The Canaanite civilization was one of the early great civilizations of the Near East; it influenced and was influenced by the civilizations of the ancient world, especially those of Egypt and Mesopotamia. Canaanite trade developed from the Chalcolithic Age to the Iron Age along the oldest trade route in the Middle East. This paper will focus on defining the Canaanites, where they came from, and the meaning of the term Canaan, and on how the ancient manuscripts define the borders of the land of Canaan; it will also describe the Canaanite trade route and their exported goods, such as cedar wood and pottery.
Keywords: archaeology, bronze age, Canaanite, colonies, Massilia, pottery, shipwreck, vineyards
Procedia PDF Downloads 201
3600 Predictive Analysis of the Stock Price Market Trends with Deep Learning
Authors: Suraj Mehrotra
Abstract:
The stock market is a volatile, bustling marketplace that is a cornerstone of economics. It defines whether companies are successful or in decline. A thorough understanding of it is important; many companies have whole divisions dedicated to the analysis of both their own stock and that of rival companies. Linking the world of finance and artificial intelligence (AI), especially the stock market, has been a relatively recent development. Predicting how stocks will do considering all external factors and previous data has always been a human task. With the help of AI, however, machine learning models can help us make more complete predictions of financial trends. Looking at the stock market specifically, predicting the open, closing, high, and low prices for the next day is very hard to do. Machine learning makes this task a lot easier. A model that builds upon itself and takes in external factors as weights can predict trends far into the future. When used effectively, new doors can be opened in the business and finance world, and companies can make better and more complete decisions. This paper explores the various techniques used in the prediction of stock prices, from traditional statistical methods to deep learning and neural network based approaches, among other methods. It provides a detailed analysis of the techniques and also explores the challenges in predictive analysis. For accuracy on the testing set, looking at four different models (linear regression, neural network, decision tree, and naïve Bayes) on different stocks (Apple, Google, Tesla, Amazon, United Healthcare, Exxon Mobil, J.P. Morgan Chase, and Johnson & Johnson), the naïve Bayes and linear regression models worked best. For the testing set, the naïve Bayes model had the highest accuracy along with the linear regression model, followed by the neural network model and then the decision tree model.
The training set had similar results, except that the decision tree model was perfect, with complete accuracy in its predictions on the training data. This means that the decision tree model likely overfitted the training set, which showed when it was used on the testing set.
Keywords: machine learning, testing set, artificial intelligence, stock analysis
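As background for the comparison above, a minimal sketch of a linear regression baseline on synthetic prices (not the listed tickers; all numbers are illustrative) shows the chronological train/test split that exposes overfitting of the kind seen with the decision tree:

```python
import numpy as np

# Synthetic random-walk prices stand in for a real ticker.
rng = np.random.default_rng(2)
prices = 100 + np.cumsum(rng.normal(0.1, 1.0, size=300))

window = 5  # predict tomorrow's close from the last 5 closes
X = np.stack([prices[i:i + window] for i in range(len(prices) - window)])
y = prices[window:]
X = np.hstack([X, np.ones((len(X), 1))])  # intercept column

split = 200  # chronological split: never shuffle a time series
w, *_ = np.linalg.lstsq(X[:split], y[:split], rcond=None)

train_rmse = np.sqrt(np.mean((X[:split] @ w - y[:split]) ** 2))
test_rmse = np.sqrt(np.mean((X[split:] @ w - y[split:]) ** 2))
print(round(train_rmse, 2), round(test_rmse, 2))
```

A model whose training error is near zero while its test error is large (as with the decision tree in the study) has memorized the training set rather than learned the trend; for the linear fit the two errors should stay close.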
Procedia PDF Downloads 94