Search results for: one shot learning
3996 Collaborative Data Refinement for Enhanced Ionic Conductivity Prediction in Garnet-Type Materials
Authors: Zakaria Kharbouch, Mustapha Bouchaara, F. Elkouihen, A. Habbal, A. Ratnani, A. Faik
Abstract:
Solid-state lithium-ion batteries have garnered increasing interest in modern energy research due to their potential for safer, more efficient, and sustainable energy storage systems. Among the critical components of these batteries, the electrolyte plays a pivotal role, with LLZO garnet-based electrolytes showing significant promise. Garnet materials offer intrinsic advantages such as high Li-ion conductivity, wide electrochemical stability, and excellent compatibility with lithium metal anodes. However, optimizing ionic conductivity in garnet structures poses a complex challenge, primarily due to the multitude of potential dopants that can be incorporated into the LLZO crystal lattice. The complexity of material design, influenced by numerous dopant options, requires a systematic method to find the most effective combinations. This study highlights the utility of machine learning (ML) techniques in the materials discovery process to navigate the complex range of factors in garnet-based electrolytes. Collaborators from the materials science and ML fields worked with a comprehensive dataset previously employed in a similar study and collected from various literature sources. This dataset served as the foundation for an extensive data refinement phase, where meticulous error identification, correction, outlier removal, and garnet-specific feature engineering were conducted. This rigorous process substantially improved the dataset's quality, ensuring it accurately captured the underlying physical and chemical principles governing garnet ionic conductivity. The data refinement effort resulted in a significant improvement in the predictive performance of the machine learning model. Originally starting at an accuracy of 0.32, the model underwent substantial refinement, ultimately achieving an accuracy of 0.88. This enhancement highlights the effectiveness of the interdisciplinary approach and underscores the substantial potential of machine learning techniques in materials science research.
Keywords: lithium batteries, all-solid-state batteries, machine learning, solid state electrolytes
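As a concrete illustration of the refine-then-model workflow described above, the following sketch removes target outliers with an interquartile-range rule and scores a regressor by cross-validation. It is not the authors' pipeline: the synthetic data, column names, and the choice of a random forest are assumptions for demonstration only.

```python
# Illustrative sketch (not the authors' code): IQR outlier removal on the
# target followed by a cross-validated regressor. Data, column names and
# the model choice are assumptions.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
df = pd.DataFrame(rng.normal(size=(300, 6)),
                  columns=[f"dopant_feat_{i}" for i in range(6)])
df["log_sigma"] = df.iloc[:, :3].sum(axis=1) + rng.normal(scale=0.2, size=300)

# Data refinement step: drop duplicates and IQR-based outliers on the target
df = df.drop_duplicates()
q1, q3 = df["log_sigma"].quantile([0.25, 0.75])
iqr = q3 - q1
df = df[df["log_sigma"].between(q1 - 1.5 * iqr, q3 + 1.5 * iqr)]

X, y = df.drop(columns=["log_sigma"]), df["log_sigma"]
model = RandomForestRegressor(n_estimators=300, random_state=0)
print("mean R^2:", cross_val_score(model, X, y, cv=5, scoring="r2").mean())
```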
Procedia PDF Downloads 61
3995 Comparison of Machine Learning-Based Models for Predicting Streptococcus pyogenes Virulence Factors and Antimicrobial Resistance
Authors: Fernanda Bravo Cornejo, Camilo Cerda Sarabia, Belén Díaz Díaz, Diego Santibañez Oyarce, Esteban Gómez Terán, Hugo Osses Prado, Raúl Caulier-Cisterna, Jorge Vergara-Quezada, Ana Moya-Beltrán
Abstract:
Streptococcus pyogenes is a gram-positive bacterium involved in a wide range of diseases and a major human-specific bacterial pathogen. In Chile, this year the 'Ministerio de Salud' declared an alert due to the increase in strains throughout the year. This increase can be attributed to a multitude of factors, including antimicrobial resistance (AMR) and virulence factors (VF). Understanding these VF and AMR is crucial for developing effective strategies and improving public health responses. Moreover, experimental identification and characterization of these pathogenic mechanisms are labor-intensive and time-consuming. Therefore, new computational methods are required to provide robust techniques for accelerating this identification. Advances in machine learning (ML) algorithms represent an opportunity to refine and accelerate the discovery of VF associated with Streptococcus pyogenes. In this work, we evaluate the accuracy of various machine learning models in predicting the virulence factors and antimicrobial resistance of Streptococcus pyogenes, with the objective of providing new methods for identifying the pathogenic mechanisms of this organism. Our comprehensive approach involved the download of 32,798 GenBank files of S. pyogenes from the NCBI dataset, coupled with the incorporation of data from the Virulence Factor Database (VFDB) and the Comprehensive Antibiotic Resistance Database (CARD), which contains AMR gene sequences and resistance profiles. These datasets provided labeled examples of both virulent and non-virulent genes, enabling a robust foundation for feature extraction and model training. We employed preprocessing, characterization and feature extraction techniques on primary nucleotide/amino acid sequences and selected the optimal ones for model training. The feature set was constructed using sequence-based descriptors (e.g., k-mers and one-hot encoding) and functional annotations based on database prediction. The ML models compared are logistic regression, decision trees, support vector machines, and neural networks, among others. The results of this work show some differences in accuracy between the algorithms; these differences allow us to identify different aspects that represent unique opportunities for a more precise and efficient characterization and identification of VF and AMR. This comparative analysis underscores the value of integrating machine learning techniques in predicting S. pyogenes virulence and AMR, offering potential pathways for more effective diagnostic and therapeutic strategies. Future work will focus on incorporating additional omics data, such as transcriptomics, and exploring advanced deep learning models to further enhance predictive capabilities.
Keywords: antibiotic resistance, Streptococcus pyogenes, virulence factors, machine learning
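A minimal sketch of the k-mer featurisation and classifier comparison described above follows; the random sequences and labels are stand-ins for the VFDB/CARD-labelled gene data, not the real dataset.

```python
# Sketch: k-mer count features from nucleotide sequences, compared across
# the classifier families named in the abstract. Data is simulated.
from itertools import product
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

def kmer_counts(seq, k=3, alphabet="ACGT"):
    """Count every k-mer of the alphabet occurring in a nucleotide sequence."""
    index = {"".join(p): i for i, p in enumerate(product(alphabet, repeat=k))}
    vec = np.zeros(len(index))
    for i in range(len(seq) - k + 1):
        km = seq[i:i + k]
        if km in index:
            vec[index[km]] += 1
    return vec

rng = np.random.default_rng(0)
seqs = ["".join(rng.choice(list("ACGT"), size=300)) for _ in range(100)]
labels = rng.integers(0, 2, size=100)        # 1 = VF/AMR gene, 0 = neither

X = np.array([kmer_counts(s) for s in seqs])
for clf in [LogisticRegression(max_iter=1000), DecisionTreeClassifier(),
            SVC(), MLPClassifier(max_iter=500)]:
    acc = cross_val_score(clf, X, labels, cv=5).mean()
    print(type(clf).__name__, round(acc, 2))
```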
Procedia PDF Downloads 33
3994 Design-Based Elements to Sustain Participant Activity in Massive Open Online Courses: A Case Study
Authors: C. Zimmermann, E. Lackner, M. Ebner
Abstract:
Massive Open Online Courses (MOOCs) are increasingly popular learning hubs that boast considerable participant numbers, innovative technical features, and a multitude of instructional resources. Still, there is a high level of evidence showing that almost all MOOCs suffer from a declining frequency of participant activity and fairly low completion rates. In this paper, we would like to share the lessons learned in implementing several design patterns that have been suggested in order to foster participant activity. Our conclusions are based on experiences with the ‘Dr. Internet’ MOOC, which was created as an xMOOC to raise awareness for a more critical approach to online health information: participants had to diagnose medical case studies. There is a growing body of recommendations (based on Learning Analytics results from earlier xMOOCs) as to how the decline in participant activity can be alleviated. One promising focus in this regard is instructional design patterns, since they have a tremendous influence on the learner’s motivation, which in turn is a crucial trigger of learning processes. Ever since medieval storytelling, micro-learning units and comprehensible narrative structures have been used to keep an audience following a narration. Hence, MOOC participants are not likely to abandon a course or information channel when their curiosity is kept at a continuously high level. Critical aspects that warrant consideration in this regard include shorter course duration, a narrative structure with suspense peaks (according to the ‘storytelling’ approach), and a course schedule that is diversified and stimulating, yet easy to follow. All of these criteria were observed in the design of the Dr. Internet MOOC: 1) the standard eight-week course duration was shortened to six weeks, 2) all six case studies had a special quiz format and a corresponding resolution video which was made available in the subsequent week, 3) two out of six case studies were split into serial video sequences presented over the span of two weeks, and 4) the videos were generally scheduled in a less predictable sequence. However, the statistical results from the first run of the MOOC do not indicate any strong influences on the retention rate, so we conclude with some suggestions as to why this might be and what aspects need further consideration.
Keywords: case study, Dr. Internet, experience, MOOCs, design patterns
Procedia PDF Downloads 266
3993 Machine Learning Techniques in Bank Credit Analysis
Authors: Fernanda M. Assef, Maria Teresinha A. Steiner
Abstract:
The aim of this paper is to compare and discuss better classifier algorithm options for credit risk assessment by applying different machine learning techniques. Using records from a Brazilian financial institution, this study uses a database of 5,432 companies that are clients of the bank, where 2,600 clients are classified as non-defaulters, 1,551 are classified as defaulters and 1,281 are temporarily defaulters, meaning that the clients are overdue on their payments for up to 180 days. For each case, a total of 15 attributes was considered for a one-against-all assessment using four different techniques: Artificial Neural Networks Multilayer Perceptron (ANN-MLP), Artificial Neural Networks Radial Basis Functions (ANN-RBF), Logistic Regression (LR) and finally Support Vector Machines (SVM). For each method, different parameters were analyzed in order to obtain the best configuration of each technique for comparison. Initially the data were coded in thermometer code (numerical attributes) or dummy coding (for nominal attributes). The methods were then evaluated for each parameter, and the best result of each technique was compared in terms of accuracy, false positives, false negatives, true positives and true negatives. This comparison showed that the best method, in terms of accuracy, was ANN-RBF (79.20% for non-defaulter classification, 97.74% for defaulters and 75.37% for the temporarily defaulter classification). However, the best accuracy does not always represent the best technique. For instance, on the classification of temporarily defaulters, this technique was surpassed, in terms of false positives, by SVM, which had the lowest rate (0.07%) of false positive classifications. All these intrinsic details are discussed considering the results found, and an overview of what was presented is given in the conclusion of this study.
Keywords: artificial neural networks (ANNs), classifier algorithms, credit risk assessment, logistic regression, machine learning, support vector machines
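The one-against-all setup with thermometer coding can be sketched as below. This is a hedged illustration, not the study's code: the attribute, bin edges, and class labels are invented, and scikit-learn's one-vs-rest wrapper stands in for the paper's assessment scheme.

```python
# Sketch of one-against-all classification with thermometer coding for a
# numeric attribute; data, bin edges and class labels are illustrative only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier
from sklearn.model_selection import cross_val_score

def thermometer(x, edges):
    """Thermometer code: one bit per threshold, set while x exceeds it."""
    return (x[:, None] >= np.asarray(edges)[None, :]).astype(int)

rng = np.random.default_rng(0)
debt_ratio = rng.uniform(0, 2, size=600)            # hypothetical attribute
X = thermometer(debt_ratio, edges=[0.25, 0.5, 1.0, 1.5])
y = rng.choice([0, 1, 2], size=600)  # non-defaulter / defaulter / temporary

ovr = OneVsRestClassifier(LogisticRegression())
print("accuracy:", cross_val_score(ovr, X, y, cv=5).mean())
```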
Procedia PDF Downloads 103
3992 TutorBot+: Automatic Programming Assistant with Positive Feedback based on LLMs
Authors: Claudia Martínez-Araneda, Mariella Gutiérrez, Pedro Gómez, Diego Maldonado, Alejandra Segura, Christian Vidal-Castro
Abstract:
The purpose of this document is to showcase the preliminary work in developing an EduChatbot-type tool and measuring the effects of its use, aimed at providing effective feedback to students in programming courses. This bot, hereinafter referred to as tutorBot+, was constructed based on chatGPT and is tasked with assisting and delivering timely positive feedback to students in the field of computer science at the Universidad Católica de Concepción. The proposed working method consists of four stages: (1) immersion in the domain of Large Language Models (LLMs), (2) development of the tutorBot+ prototype and integration, (3) experiment design, and (4) intervention. The first stage involves a literature review on the use of artificial intelligence in education and the evaluation of intelligent tutors, as well as research on types of feedback for learning and the domain of chatGPT. The second stage encompasses the development of tutorBot+, and the final stage involves a quasi-experimental study with students from the Programming and Database labs, where the learning outcome involves the development of computational thinking skills, enabling the use and measurement of the tool's effects. The preliminary results of this work are promising, as a functional chatbot prototype has been developed in both conversational and non-conversational versions, integrated into an open-source online judge and programming contest platform. We are also exploring the possibility of generating a custom model, based on a pre-trained one, tailored to the domain of programming. This includes the integration of the created tool and the design of the experiment to measure its utility.
Keywords: assessment, chatGPT, learning strategies, LLMs, timely feedback
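To make the feedback-generation idea concrete, here is a hedged sketch of prompting an LLM for positive, specific feedback on student code. It is not the actual tutorBot+ implementation: the model name, system prompt, and client setup are assumptions.

```python
# Illustrative sketch of LLM-based positive feedback on student code.
# Model name and prompt are hypothetical; not the tutorBot+ code.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

SYSTEM = ("You are a programming tutor. Give encouraging, specific feedback. "
          "Point out what works before suggesting one concrete improvement.")

def positive_feedback(student_code: str, task: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # hypothetical model choice
        messages=[
            {"role": "system", "content": SYSTEM},
            {"role": "user", "content": f"Task: {task}\n\nCode:\n{student_code}"},
        ],
    )
    return response.choices[0].message.content

print(positive_feedback("for i in range(10): print(i)", "Print the numbers 0..9"))
```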
Procedia PDF Downloads 68
3991 Use of Machine Learning Algorithms to Pediatric MR Images for Tumor Classification
Authors: I. Stathopoulos, V. Syrgiamiotis, E. Karavasilis, A. Ploussi, I. Nikas, C. Hatzigiorgi, K. Platoni, E. P. Efstathopoulos
Abstract:
Introduction: Brain and central nervous system (CNS) tumors form the second most common group of cancer in children, accounting for 30% of all childhood cancers. MRI is the key imaging technique used for the visualization and management of pediatric brain tumors. Initial characterization of tumors from MRI scans is usually performed via a radiologist’s visual assessment. However, different brain tumor types do not always demonstrate clear differences in visual appearance. Using only conventional MRI to provide a definite diagnosis could potentially lead to inaccurate results, and so histopathological examination of biopsy samples is currently considered to be the gold standard for obtaining definite diagnoses. Machine learning is defined as the study of computational algorithms that can use mathematical relationships and patterns, complex or not, from empirical and scientific data to make reliable decisions. Given the above, machine learning techniques could provide effective and accurate ways to automate and speed up the analysis and diagnosis of medical images. Machine learning applications in radiology are or could potentially be useful in practice for medical image segmentation and registration, computer-aided detection and diagnosis systems for CT, MR or radiography images, and functional MR (fMRI) images for brain activity analysis and neurological disease diagnosis. Purpose: The objective of this study is to provide an automated tool, which may assist in the imaging evaluation and classification of brain neoplasms in pediatric patients by determining the glioma type and grade and differentiating between different brain tissue types. Moreover, a future purpose is to present an alternative way of quick and accurate diagnosis in order to save time and resources in the daily medical workflow. Materials and Methods: A cohort of 80 pediatric patients with a diagnosis of posterior fossa tumor was used: 20 ependymomas, 20 astrocytomas, 20 medulloblastomas and 20 healthy children. The MR sequences used for every single patient were the following: axial T1-weighted (T1), axial T2-weighted (T2), Fluid-Attenuated Inversion Recovery (FLAIR), axial diffusion-weighted images (DWI), and axial contrast-enhanced T1-weighted (T1ce). From every sequence, only a principal slice was used, manually traced by two expert radiologists. Image acquisition was carried out on a GE HDxt 1.5-T scanner. The images were preprocessed following a number of steps, including noise reduction, bias-field correction, thresholding, coregistration of all sequences (T1, T2, T1ce, FLAIR, DWI), skull stripping, and histogram matching. A large number of features for investigation were chosen, which included age, tumor shape characteristics, image intensity characteristics and texture features. After selecting the features that achieve the highest accuracy using the least number of variables, four machine learning classification algorithms were used: k-Nearest Neighbour, Support Vector Machines, C4.5 Decision Tree and Convolutional Neural Network. The machine learning schemes and the image analysis are implemented in the WEKA platform and the MatLab platform, respectively. Results-Conclusions: The results and the accuracy of image classification for each type of glioma by the four different algorithms are still in process.
Keywords: image classification, machine learning algorithms, pediatric MRI, pediatric oncology
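As a rough sketch of the classifier-comparison step on an extracted feature table, the following uses scikit-learn in place of the WEKA setup described above (the entropy-based decision tree only approximates C4.5), and the feature table is synthetic rather than the real radiomic data.

```python
# Sketch of the feature-based classifier comparison; scikit-learn stands in
# for WEKA, and the 80-patient feature table is simulated.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(80, 20))     # 80 patients x 20 intensity/texture features
y = rng.integers(0, 4, size=80)   # ependymoma/astrocytoma/medulloblastoma/healthy

for clf in [KNeighborsClassifier(n_neighbors=5), SVC(kernel="rbf"),
            DecisionTreeClassifier(criterion="entropy")]:
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(type(clf).__name__, f"accuracy: {acc:.2f}")
```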
Procedia PDF Downloads 149
3990 Generating Swarm Satellite Data Using Long Short-Term Memory and Generative Adversarial Networks for the Detection of Seismic Precursors
Authors: Yaxin Bi
Abstract:
Accurate prediction and understanding of the evolution mechanisms of earthquakes remain challenging in the fields of geology, geophysics, and seismology. This study leverages Long Short-Term Memory (LSTM) networks and Generative Adversarial Networks (GANs), a generative model tailored to time-series data, for generating synthetic time series data based on Swarm satellite data, which will be used for detecting seismic anomalies. LSTMs demonstrated commendable predictive performance in generating synthetic data across multiple countries. In contrast, the GAN models struggled to generate synthetic data, often producing non-informative values, although they were able to capture the data distribution of the time series. These findings highlight both the promise and challenges associated with applying deep learning techniques to generate synthetic data, underscoring the potential of deep learning in generating synthetic electromagnetic satellite data.
Keywords: LSTM, GAN, earthquake, synthetic data, generative AI, seismic precursors
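A minimal Keras sketch of the LSTM part of this approach follows: the network learns to predict the next value of a time series and is then rolled forward to generate a synthetic continuation. The sinusoidal signal is a stand-in, not real Swarm data, and the window size and layer widths are arbitrary choices.

```python
# Sketch: next-step LSTM forecaster rolled forward to generate synthetic
# time-series data. The signal is synthetic, not Swarm satellite data.
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

t = np.linspace(0, 100, 2000)
series = np.sin(0.3 * t) + 0.05 * np.random.randn(t.size)  # stand-in signal

WINDOW = 50
X = np.array([series[i:i + WINDOW] for i in range(len(series) - WINDOW)])
y = series[WINDOW:]
X = X[..., None]  # shape: (samples, timesteps, features)

model = Sequential([LSTM(32, input_shape=(WINDOW, 1)), Dense(1)])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=64, verbose=0)

# Generate a synthetic continuation by feeding predictions back in
window = series[-WINDOW:].tolist()
synthetic = []
for _ in range(200):
    nxt = float(model.predict(np.array(window)[None, :, None], verbose=0)[0, 0])
    synthetic.append(nxt)
    window = window[1:] + [nxt]
```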
Procedia PDF Downloads 33
3989 People Who Live in Poverty Usually Do So Due to Circumstances Far Beyond Their Control: A Multiple Case Study on Poverty Simulation Events
Authors: Tracy Smith-Carrier
Abstract:
Burgeoning research extols the benefits of innovative experiential learning activities to increase participants’ engagement, enhance their individual learning, and bridge the gap between theory and practice. This presentation discusses findings from a multiple case study on poverty simulation events conducted with two samples: undergraduate students and community participants. After exploring the nascent research on the benefits and limitations of poverty simulation activities, the study explores whether participating in a poverty simulation resulted in changes to participants’ beliefs about the causes and effects of poverty, as well as shifts in their attitudes and actions toward people experiencing poverty. For the purposes of triangulation, quantitative and qualitative data from a variety of sources were analyzed: participant feedback surveys, qualitative responses, and pre, post, and follow-up questionnaires. Findings show statistically significant results (p<.05) from both samples on cumulative scores of the modified Attitudes Toward Poverty Scale, indicating an improvement in participants’ attitudes toward poverty. However, although participants were generally positive about their experiences, participating in the simulation did not appear to have prompted them to take specific actions to reduce poverty. Conclusions drawn from the research study suggest that poverty simulation planners should be wary of adopting scenarios that emphasize, or fail to adequately contextualize, behaviours or responses that might perpetuate individual explanations of poverty. Moreover, organizers must carefully consider how to ensure that participants currently experiencing low income do not become emotionally distressed, triggered or further marginalized in the process. Moving beyond the goal of increasing participants’ understandings of poverty, interventions that foster greater engagement in poverty issues over the long term are necessary.
Keywords: empathy, experiential learning, poverty awareness, poverty simulation
Procedia PDF Downloads 267
3988 Use of Social Networks and Mobile Technologies in Education
Authors: Václav Maněna, Roman Dostál, Štěpán Hubálovský
Abstract:
Social networks play an important role in the lives of children and young people. Along with the high penetration of mobile technologies such as smartphones and tablets among the younger generation, there is an increasing use of social networks already in elementary school. The paper presents the results of research carried out at schools in the Hradec Králové region. In this research, the authors focused on issues related to communication on social networks among children, teenagers and young people in the Czech Republic. The research was conducted at selected elementary, secondary and high schools using anonymous questionnaires. The results are evaluated and compared with the results of research carried out in 2008. The authors focused on the possibilities of using social networks in education. The paper presents the possibility of using the most popular social networks in education, with emphasis on increasing motivation for learning, and also offers a comparative analysis of social networks with regard to their potential use in education.
Keywords: social networks, motivation, e-learning, mobile technology
Procedia PDF Downloads 313
3987 Machine Learning Techniques for COVID-19 Detection: A Comparative Analysis
Authors: Abeer A. Aljohani
Abstract:
The spread of the COVID-19 virus has been one of the most extreme pandemics across the globe. It is also referred to as coronavirus, a contagious disease that continuously mutates into numerous variants. Currently, the B.1.1.529 variant, labeled omicron, has been detected in South Africa. The huge spread of COVID-19 has affected many lives and placed exceptional pressure on healthcare systems worldwide. Everyday life and the global economy have also been at stake. This research aims to predict COVID-19 disease in its initial stage to reduce the death count. Machine learning (ML) is nowadays used in almost every area. The large number of COVID-19 cases has placed a huge burden on hospitals as well as health workers. To reduce this burden, this paper predicts COVID-19 disease based on the symptoms and medical history of the patient. This research presents a unique architecture for COVID-19 detection using ML techniques integrated with feature dimensionality reduction. This paper uses a standard UCI dataset for predicting COVID-19 disease, comprising symptoms of 5,434 patients. This paper also compares several supervised ML techniques to the presented architecture. The architecture utilizes a 10-fold cross-validation process for generalization and the principal component analysis (PCA) technique for feature reduction. Standard parameters are used to evaluate the proposed architecture, including F1-score, precision, accuracy, recall, receiver operating characteristic (ROC), and area under the curve (AUC). The results show that decision trees, random forests, and neural networks outperform all other state-of-the-art ML techniques. These results can help effectively in identifying COVID-19 infection cases.
Keywords: supervised machine learning, COVID-19 prediction, healthcare analytics, random forest, neural network
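The described pipeline (PCA for feature reduction, then 10-fold cross-validation of several classifiers) can be sketched as below; the symptom matrix is simulated, not the UCI dataset of 5,434 patients, and the component count is an arbitrary choice.

```python
# Sketch: PCA feature reduction followed by 10-fold cross-validation of
# several classifiers. Data is simulated, not the UCI COVID-19 dataset.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.decomposition import PCA
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(5434, 20)).astype(float)  # symptom indicators
y = rng.integers(0, 2, size=5434)                      # positive / negative

for clf in [DecisionTreeClassifier(), RandomForestClassifier(),
            MLPClassifier(max_iter=300)]:
    pipe = make_pipeline(PCA(n_components=10), clf)
    scores = cross_val_score(pipe, X, y, cv=10, scoring="f1")
    print(type(clf).__name__, f"F1: {scores.mean():.2f}")
```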
Procedia PDF Downloads 92
3986 Applying Epistemology to Artificial Intelligence in the Social Arena: Exploring Fundamental Considerations
Authors: Gianni Jacucci
Abstract:
Epistemology traditionally finds its place within human research philosophies and methodologies. Artificial intelligence methods pose challenges, particularly given the unresolved relationship between AI and pivotal concepts in social arenas such as hermeneutics and accountability. We begin by examining the essential criteria governing scientific rigor in the human sciences. We revisit the three foundational philosophies underpinning qualitative research methods: empiricism, hermeneutics, and phenomenology. We elucidate the distinct attributes, merits, and vulnerabilities inherent in the methodologies they inspire. The integration of AI, e.g., deep learning algorithms, sparks an interest in evaluating these criteria against the diverse forms of AI architectures. For instance, Interpreted AI could be viewed as a hermeneutic approach, relying on a priori interpretations, while straight AI may be perceived as a descriptive phenomenological approach, processing original and uncontaminated data. This paper serves as groundwork for such explorations, offering preliminary reflections to lay the foundation and outline the initial landscape.
Keywords: artificial intelligence, deep learning, epistemology, qualitative research, methodology, hermeneutics, accountability
Procedia PDF Downloads 38
3985 Perceived Influence of Information Communication Technology on Empowerment Amongst the College of Education Physical and Health Education Students in Oyo State
Authors: I. O. Oladipo, Olusegun Adewale Ajayi, Omoniyi Oladipupo Adigun
Abstract:
Information Communication Technology (ICT) has the potential to contribute to different facets of educational development and effective learning: expanding access, promoting efficiency, improving the quality of learning, enhancing the quality of teaching, and providing an important mechanism in times of economic crisis. This is worth considering given the prevalence of unemployment among higher-institution graduates in this nation, where much seems not to have been achieved in this direction. In view of this, the purpose of this study is to create awareness and enlightenment about ICT for empowerment opportunities after school. A self-developed, modified 4-point Likert-scale questionnaire was used for data collection among Colleges of Education Physical and Health Education students in Oyo State. Inferential statistical analysis with chi-square set at the 0.05 alpha level was used to test the stated hypotheses. The study concludes that awareness and enlightenment about ICT significantly influence empowerment opportunities and recommends that college of education students be encouraged to apply ICT for job opportunities after school.
Keywords: employment, empowerment, information communication technology, physical education
Procedia PDF Downloads 390
3984 Emotional Intelligence and Age in Open Distance Learning
Authors: Naila Naseer
Abstract:
The concept of Emotional Intelligence (EI) is not new, yet it remains unique and interesting. EI is a person’s ability to be aware of his/her own emotions and to manage, handle and communicate emotions with others effectively. The present study was conducted to assess the relationship between emotional intelligence and age of graduate-level students at Allama Iqbal Open University (AIOU). The population consisted of Allama Iqbal Open University students (B.Ed 3rd semester, Autumn 2007) from the Rawalpindi and Islamabad regions. A sample of 469 participants was randomly drawn using a table of random numbers. The Bar-On EQ-i was administered to the participants through personal contact. The instrument was also validated through a pilot study on a random sample of 50 participants (B.Ed students, Spring 2006) who had completed their B.Ed degree successfully. Data were analyzed and tabulated as percentages, frequencies, means, standard deviations, correlations, and scattergrams in SPSS (version 16.0 for Windows). The results revealed that students in the higher age groups scored lower on the scale (Bar-On EQ-i); that is, students in the lower age groups exhibited higher levels of EI compared with older students.
Keywords: emotional intelligence, age level, learning, emotion-related feelings
Procedia PDF Downloads 333
3983 New Advanced Medical Software Technology Challenges and Evolution of the Regulatory Framework in Expert Software, Artificial Intelligence, and Machine Learning
Authors: Umamaheswari Shanmugam, Silvia Ronchi, Radu Vornicu
Abstract:
Software, artificial intelligence, and machine learning can improve healthcare through innovative and advanced technologies that are able to use the large amount and variety of data generated during healthcare services every day. As we read the news, over 500 machine learning or other artificial intelligence medical devices have now received FDA clearance or approval, the first ones even preceding the year 2000. One of the big advantages of these new technologies is the ability to gain experience and knowledge from real-world use and to continuously improve their performance. Healthcare systems and institutions can benefit greatly, because the use of advanced technologies improves at the same time the efficiency and the efficacy of healthcare. Software defined as a medical device is stand-alone software intended to be used for one or more of the following specific medical purposes: diagnosis, prevention, monitoring, prediction, prognosis, treatment or alleviation of a disease or other health conditions; replacing or modifying any part of a physiological or pathological process; or managing information received from in vitro specimens derived from the human body, without achieving its principal intended action by pharmacological, immunological or metabolic means. Software qualified as a medical device must comply with the general safety and performance requirements applicable to medical devices. These requirements are necessary to ensure high performance and quality and also to protect patients’ safety. The evolution and continuous improvement of software used in healthcare must take into consideration the increase in regulatory requirements, which are becoming more complex in each market. The gap between these advanced technologies and the new regulations is the biggest challenge for medical device manufacturers. Regulatory requirements can be considered a market barrier, as they can delay or obstruct device approval, but they are necessary to ensure performance, quality, and safety; at the same time, they can be a business opportunity if the manufacturer is able to define the appropriate regulatory strategy in advance. The abstract will provide an overview of the current regulatory framework, the evolution of the international requirements, and the standards applicable to medical device software in the potential markets all over the world.
Keywords: artificial intelligence, machine learning, SaMD, regulatory, clinical evaluation, classification, international requirements, MDR, 510k, PMA, IMDRF, cyber security, health care systems
Procedia PDF Downloads 89
3982 A Machine Learning-Based Model to Screen Antituberculosis Compound Targeted against LprG Lipoprotein of Mycobacterium tuberculosis
Authors: Syed Asif Hassan, Syed Atif Hassan
Abstract:
Multidrug-resistant tuberculosis (MDR-TB) is an infection caused by resistant strains of Mycobacterium tuberculosis that do not respond to either isoniazid or rifampicin, the two most important anti-TB drugs. The increase in the occurrence of drug-resistant strains of MTB calls for an intensive search for novel target-based therapeutics. In this context, LprG (Rv1411c), a lipoprotein from MTB, plays a pivotal role in the immune evasion of MTB, leading to the survival and propagation of the bacterium within the host cell. Therefore, a machine learning method will be developed for generating a computational model that could predict the potential anti-LprG activity of novel antituberculosis compounds. The present study will utilize a dataset from the PubChem database maintained by the National Center for Biotechnology Information (NCBI). The dataset involves compounds screened against MTB, categorized as active or inactive based upon the PubChem activity score. PowerMV, a molecular descriptor generation and visualization tool, will be used to generate the 2D molecular descriptors for the active and inactive compounds present in the dataset. The 2D molecular descriptors generated from PowerMV will be used as features. We feed these features into three different classifiers, namely a random forest, a deep neural network, and a recurrent neural network, to build separate predictive models, choosing the best-performing model based on the accuracy of predicting novel antituberculosis compounds with anti-LprG activity. Additionally, the predicted active compounds will be screened using a SMARTS filter to choose molecules with drug-like features.
Keywords: antituberculosis drug, classifier, machine learning, molecular descriptors, prediction
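The final screening step could look roughly like the sketch below: classify compounds from 2D descriptors, then keep predicted actives that also pass a SMARTS substructure filter. The descriptor table is random, the SMARTS pattern is only an example, and RDKit stands in here for whatever filtering tool the study ultimately uses.

```python
# Hedged sketch: descriptor-based activity prediction plus a SMARTS filter.
# Descriptors are random stand-ins and the pattern is an arbitrary example.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from rdkit import Chem

rng = np.random.default_rng(0)
X_train = rng.normal(size=(500, 50))     # 2D descriptors (e.g., from PowerMV)
y_train = rng.integers(0, 2, size=500)   # active / inactive against MTB

clf = RandomForestClassifier(n_estimators=200).fit(X_train, y_train)

candidates = {"CCO": rng.normal(size=50)}          # SMILES -> descriptors
pattern = Chem.MolFromSmarts("[OX2H]")             # example: hydroxyl group

for smiles, desc in candidates.items():
    if clf.predict(desc[None, :])[0] == 1:         # predicted anti-LprG active
        mol = Chem.MolFromSmiles(smiles)
        if mol is not None and mol.HasSubstructMatch(pattern):
            print(smiles, "passes activity prediction and SMARTS filter")
```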
Procedia PDF Downloads 391
3981 Efficient Chiller Plant Control Using Modern Reinforcement Learning
Authors: Jingwei Du
Abstract:
The need to optimize air conditioning systems in existing buildings calls for control methods designed with energy efficiency as a primary goal. The majority of current control methods boil down to two categories: empirical and model-based. To be effective, the former relies heavily on engineering expertise and the latter requires extensive historical data. Reinforcement Learning (RL), on the other hand, is a model-free approach that explores the environment to obtain an optimal control strategy, often referred to as a “policy”. This research adopts Proximal Policy Optimization (PPO) to improve chiller plant control and enable the RL agent to collaborate with experienced engineers. It exploits the fact that while the industry lacks historical data, abundant operational data is available, which allows the agent to learn and evolve safely under human supervision. Thanks to the development of language models, renewed interest in RL has led to modern, online, policy-based RL algorithms such as PPO. This research took inspiration from “alignment”, a process that utilizes human feedback to finetune a pretrained model in case of unsafe content. The methodology can be summarized in three steps. First, an initial policy model is generated based on minimal prior knowledge. Next, the prepared PPO agent is deployed so feedback from both the critic model and human experts can be collected for future finetuning. Finally, the agent learns and adapts itself to the specific chiller plant, updates the policy model and is ready for the next iteration. Besides the proposed approach, this study also used traditional RL methods to optimize the same simulated chiller plants for comparison, and it turns out that the proposed method is safe and effective at the same time and needs little to no historical data to start up.
Keywords: chiller plant, control methods, energy efficiency, proximal policy optimization, reinforcement learning
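A conceptual sketch of this loop using stable-baselines3 PPO follows. `ChillerPlantEnv` is a hypothetical Gymnasium environment standing in for the simulated plant (observations might be loads and temperatures, actions setpoints, and reward the negative energy cost); none of it is the study's actual simulator.

```python
# Conceptual sketch with stable-baselines3 PPO; ChillerPlantEnv is a
# hypothetical stand-in for the simulated chiller plant.
import gymnasium as gym
from stable_baselines3 import PPO

class ChillerPlantEnv(gym.Env):        # hypothetical plant simulator
    def __init__(self):
        self.observation_space = gym.spaces.Box(low=-1, high=1, shape=(4,))
        self.action_space = gym.spaces.Box(low=-1, high=1, shape=(2,))
    def reset(self, seed=None, options=None):
        return self.observation_space.sample(), {}
    def step(self, action):
        obs = self.observation_space.sample()
        reward = -float((action ** 2).sum())   # stand-in for energy cost
        return obs, reward, False, False, {}

env = ChillerPlantEnv()
model = PPO("MlpPolicy", env, verbose=0)
model.learn(total_timesteps=10_000)    # iterate: collect feedback, finetune
```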
Procedia PDF Downloads 30
3980 Virtual Academy Next: Addressing Transition Challenges Through a Gamified Virtual Transition Program for Students with Disabilities
Authors: Jennifer Gallup, Joel Bocanegra, Greg Callan, Abigail Vaughn
Abstract:
Students with disabilities (SWD) engaged in a distance summer program delivered over multiple virtual mediums that used gaming principles to teach and practice self-regulated learning (SRL) through the process of exploring possible jobs. Gaming quests were developed to explore jobs and teach transition skills. Students completed specially designed quests that taught and reinforced SRL and problem-solving through individual, group, and teacher-led experiences. The SRL skills learned were reinforced through guided job explorations within MinecraftEDU, Zoom sessions with experts in the career, and team collaborations over Marco Polo and Zoom. The quests were developed and laid out on an accessible web page, with active learning opportunities and feedback conducted within multiple virtual mediums, including MinecraftEDU. Gaming mediums actively engage players in role-playing, problem-solving, critical thinking, and collaboration. Gaming has been used as a medium for education since the inception of formal education. Games, and specifically board games, are pre-historic, meaning we had board games before we had written language. Today, games are widely used in education, often as a reinforcer for behavior or as a reward for work completion. Games are not often used as a direct method of instruction and assessment; however, the inclusion of games as an assessment tool and as a form of instruction increases student engagement and participation. Games naturally include collaboration, problem-solving, and communication. Therefore, our summer program was developed using gaming principles and MinecraftEDU. This manuscript describes a virtual learning summer program called Virtual Academy New and Exciting Transitions (VAN) that was redesigned from a face-to-face setting to a completely online setting with a focus on SWD aged 14-21. The focus of VAN was to address transition planning needs such as problem-solving skills, self-regulation, interviewing, job exploration, and communication for transition-aged youth diagnosed with various disabilities (e.g., learning disabilities, attention-deficit hyperactivity disorder, intellectual disability, Down syndrome, autism spectrum disorder).
Keywords: autism, disabilities, transition, summer program, gaming, simulations
Procedia PDF Downloads 75
3979 Elaboration and Validation of a Survey about Research on the Characteristics of Mentoring of University Professors’ Lifelong Learning
Authors: Nagore Guerra Bilbao, Clemente Lobato Fraile
Abstract:
This paper outlines the design and development of the MENDEPRO questionnaire, designed to analyze mentoring performance within a professional development process carried out with professors at the University of the Basque Country, Spain. The study took into account the international research carried out over the past two decades into teachers' professional development, and was also based on a thorough review of the most common instruments used to identify and analyze mentoring styles, many of which fail to provide sufficient psychometric guarantees. The present study aimed to gather empirical data in order to verify the metric quality of the questionnaire developed. To this end, the process followed to validate the theoretical construct was as follows: The formulation of the items and indicators in accordance with the study variables; the analysis of the validity and reliability of the initial questionnaire; the review of the second version of the questionnaire and the definitive measurement instrument. Content was validated through the formal agreement and consensus of 12 university professor training experts. A reduced sample of professors who had participated in a lifelong learning program was then selected for a trial evaluation of the instrument developed. After the trial, 18 items were removed from the initial questionnaire. The final version of the instrument, comprising 33 items, was then administered to a sample group of 99 participants. The results revealed a five-dimensional structure matching theoretical expectations. Also, the reliability data for both the instrument as a whole (.98) and its various dimensions (between .91 and .97) were very high. The questionnaire was thus found to have satisfactory psychometric properties and can therefore be considered apt for studying the performance of mentoring in both induction programs for young professors and lifelong learning programs for senior faculty members.
Keywords: higher education, mentoring, professional development, university teaching
Procedia PDF Downloads 180
3978 Digital Platform of Crops for Smart Agriculture
Authors: Pascal François Faye, Baye Mor Sall, Bineta Dembele, Jeanne Ana Awa Faye
Abstract:
In agriculture, estimating crop yields is key to improving productivity and decision-making processes such as financial market forecasting and addressing food security issues. The main objective of this paper is to provide tools to predict and improve the accuracy of crop yield forecasts using machine learning (ML) algorithms such as CART, KNN and SVM. We developed a mobile app and a web app that use these algorithms for practical use by farmers. The tests show that our system (collection and deployment architecture, web application and mobile application) is operational and validates empirical knowledge on agro-climatic parameters in addition to providing proactive decision-making support. In the experimental results obtained on the agricultural data, the performance of the ML algorithms is compared using cross-validation in order to identify the most effective ones for the data at hand. The proposed applications demonstrate that the proposed approach is effective in predicting crop yields and provides timely and accurate responses to farmers for decision support.
Keywords: prediction, machine learning, artificial intelligence, digital agriculture
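The cross-validated comparison of CART, KNN and SVM can be sketched as below; the agro-climatic features and yields are simulated stand-ins, not the platform's data.

```python
# Sketch: cross-validated comparison of CART, KNN and SVM regressors for
# yield prediction. Features and yields are simulated.
import numpy as np
from sklearn.tree import DecisionTreeRegressor      # CART
from sklearn.neighbors import KNeighborsRegressor   # KNN
from sklearn.svm import SVR                         # SVM
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 6))   # e.g., rainfall, temperature, soil indices
y = X @ rng.normal(size=6) + rng.normal(scale=0.3, size=400)  # yield proxy

for model in [DecisionTreeRegressor(), KNeighborsRegressor(), SVR()]:
    r2 = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
    print(type(model).__name__, f"R^2: {r2:.2f}")
```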
Procedia PDF Downloads 80
3977 Evaluation of the CRISP-DM Business Understanding Step: An Approach for Assessing the Predictive Power of Regression versus Classification for the Quality Prediction of Hydraulic Test Results
Authors: Christian Neunzig, Simon Fahle, Jürgen Schulz, Matthias Möller, Bernd Kuhlenkötter
Abstract:
Digitalisation in production technology is a driver for the application of machine learning methods. Through the application of predictive quality, the great potential for saving necessary quality control can be exploited via the data-based prediction of product quality and states. However, the series use of machine learning applications is often prevented by various problems. Fluctuations occur in real production data sets, which are reflected in trends and systematic shifts over time. To counteract these problems, data preprocessing includes rule-based data cleaning, the application of dimensionality reduction techniques, and the identification of comparable data subsets to extract stable features. Successful process control of the target variables aims to centre the measured values around a mean and minimise variance. Competitive leaders claim to have mastered their processes. As a result, much of the real data has a relatively low variance. For the training of prediction models, the highest possible generalisability is required, which is made more difficult by this data availability. The implementation of a machine learning application can be interpreted as a production process. The CRoss Industry Standard Process for Data Mining (CRISP-DM) is a process model with six phases that describes the life cycle of data science. As in any process, the cost of eliminating errors increases significantly with each advancing process phase. For the quality prediction of hydraulic test steps of directional control valves, the question arises in the initial phase whether a regression or a classification is more suitable. In the context of this work, the initial phase of CRISP-DM, the business understanding, is critically compared for the use case at Bosch Rexroth with regard to regression and classification. The use of cross-process production data along the value chain of hydraulic valves is a promising approach to predict the quality characteristics of workpieces. Suitable methods for leakage volume flow regression and for classification of the inspection decision are applied. Impressively, classification is clearly superior to regression and achieves promising accuracies.
Keywords: classification, CRISP-DM, machine learning, predictive quality, regression
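The business-understanding comparison can be framed as the sketch below: predict leakage flow as a regression target versus thresholding it into a pass/fail classification of the inspection decision. Data, threshold and model choice are simulated stand-ins, and the printed numbers will not reproduce the paper's result.

```python
# Sketch: regression on leakage flow vs. classification of the thresholded
# inspection decision on the same (simulated) data.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor, GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 8))                 # cross-process features
leakage = X[:, 0] * 0.2 + rng.normal(scale=1.0, size=1000)  # weak, low-variance signal
passed = (leakage < 0.5).astype(int)           # hypothetical inspection threshold

r2 = cross_val_score(GradientBoostingRegressor(), X, leakage,
                     cv=5, scoring="r2").mean()
acc = cross_val_score(GradientBoostingClassifier(), X, passed,
                      cv=5, scoring="accuracy").mean()
print(f"regression R^2: {r2:.2f}  vs  classification accuracy: {acc:.2f}")
```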
Procedia PDF Downloads 144
3976 Modelling High Strain Rate Tear Open Behavior of a Bilaminate Consisting of Foam and Plastic Skin Considering Tensile Failure and Compression
Authors: Laura Pytel, Georg Baumann, Gregor Gstrein, Corina Klug
Abstract:
Premium cars often coat the instrument panels with a bilaminate consisting of a soft foam and a plastic skin. The coating is torn open during passenger airbag deployment under high strain rates. Characterizing and simulating the top coat layer is crucial for predicting the attenuation that delays the airbag deployment, affecting the design of the restraint system, and for reducing the need for simulation adjustments through expensive physical component testing. Up to now, bilaminates used within cars have been modelled either by using a two-dimensional shell formulation for the whole coating system as one, which misses the interaction of the two layers, or by combining a three-dimensional foam layer with a two-dimensional skin layer while omitting the foam in significant regions such as the expected tear line area and the hinge, where high compression is expected. In both cases, the properties of the coating causing the attenuation are not considered. Further, the material information available at present is insufficient, both regarding the failure dependencies of the two layers and because characterization covers strain rates only up to 200 1/s. The velocity of the passenger airbag flap during an airbag shot has been measured at about 11.5 m/s during first ripping; the digital image correlation evaluation showed resulting strain rates of above 1500 1/s. This paper provides a high strain rate material characterization of a bilaminate consisting of a thin polypropylene foam and a thermoplastic olefin (TPO) skin, and the creation of validated material models. With the help of a Split Hopkinson tension bar, strain rates of 1500 1/s were within reach. The experimental data was used to calibrate and validate a more physical modelling approach for the forced ripping of the bilaminate. In the presented model, the three-dimensional foam layer is continuously tied to the two-dimensional skin layer, allowing failure in both layers at any possible position. The simulation results show a higher agreement in terms of the trajectory of the flaps and their velocity during ripping. The resulting attenuation of the airbag deployment, measured by the contact force between airbag and flaps, increases and provides usable data for dimensioning modules of an airbag system.
Keywords: bilaminate ripping behavior, high strain rate material characterization and modelling, induced material failure, TPO and foam
Procedia PDF Downloads 69
3975 Transfer Learning for Protein Structure Classification at Low Resolution
Authors: Alexander Hudson, Shaogang Gong
Abstract:
Structure determination is key to understanding protein function at a molecular level. Whilst significant advances have been made in predicting structure and function from amino acid sequence, researchers must still rely on expensive, time-consuming analytical methods to visualise detailed protein conformation. In this study, we demonstrate that it is possible to make accurate (≥80%) predictions of protein class and architecture from structures determined at low (>3 Å) resolution, using a deep convolutional neural network trained on high-resolution (≤3 Å) structures represented as 2D matrices. Thus, we provide proof of concept for high-speed, low-cost protein structure classification at low resolution, and a basis for extension to prediction of function. We investigate the impact of the input representation on classification performance, showing that side-chain information may not be necessary for fine-grained structure predictions. Finally, we confirm that high resolution, low-resolution and NMR-determined structures inhabit a common feature space, and thus provide a theoretical foundation for boosting with single-image super-resolution.
Keywords: transfer learning, protein distance maps, protein structure classification, neural networks
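A minimal PyTorch sketch of a CNN classifying proteins from 2D distance-map-style matrices is shown below; the transfer step would be training such a network on high-resolution maps and then evaluating or fine-tuning on low-resolution ones. Input size, depth, and class count are illustrative, not the paper's architecture.

```python
# Minimal CNN over 2D distance-map style inputs; architecture details are
# illustrative assumptions, not the published model.
import torch
import torch.nn as nn

class StructureCNN(nn.Module):
    def __init__(self, n_classes=4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, n_classes)
        )
    def forward(self, x):          # x: (batch, 1, H, W) distance maps
        return self.head(self.features(x))

model = StructureCNN()
maps = torch.randn(8, 1, 128, 128)      # stand-in low-resolution maps
logits = model(maps)                    # class/architecture predictions
print(logits.shape)                     # torch.Size([8, 4])
```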
Procedia PDF Downloads 136
3974 The Negative Effects of Controlled Motivation on Mathematics Achievement
Authors: John E. Boberg, Steven J. Bourgeois
Abstract:
The decline in student engagement and motivation through the middle years is well documented and clearly associated with a decline in mathematics achievement that persists through high school. To combat this trend and, very often, to meet high-stakes accountability standards, a growing number of parents, teachers, and schools have implemented various methods to incentivize learning. However, according to Self-Determination Theory, forms of incentivized learning such as public praise, tangible rewards, or threats of punishment tend to undermine intrinsic motivation and learning. By focusing on external forms of motivation that thwart autonomy in children, adults also potentially threaten relatedness measures such as trust and emotional engagement. Furthermore, these controlling motivational techniques tend to promote shallow forms of cognitive engagement at the expense of more effective deep processing strategies. Therefore, any short-term gains in apparent engagement or test scores are overshadowed by long-term diminished motivation, resulting in inauthentic approaches to learning and lower achievement. The current study focuses on the relationships between student trust, engagement, and motivation during these crucial years as students transition from elementary to middle school. In order to test the effects of controlled motivational techniques on achievement in mathematics, this quantitative study was conducted on a convenience sample of 22 elementary and middle schools from a single public charter school district in the south-central United States. The study employed multi-source data from students (N = 1,054), parents (N = 7,166), and teachers (N = 356), along with student achievement data and contextual campus variables. Cross-sectional questionnaires were used to measure the students’ self-regulated learning, emotional and cognitive engagement, and trust in teachers. Parents responded to a single item on incentivizing the academic performance of their child, and teachers responded to a series of questions about their acceptance of various incentive strategies. Structural equation modeling (SEM) was used to evaluate model fit and analyze the direct and indirect effects of the predictor variables on achievement. Although a student’s trust in teacher positively predicted both emotional and cognitive engagement, none of these three predictors accounted for any variance in achievement in mathematics. The parents’ use of incentives, on the other hand, predicted a student’s perception of his or her controlled motivation, and these two variables had significant negative effects on achievement. While controlled motivation had the greatest effects on achievement, parental incentives demonstrated both direct and indirect effects on achievement through the students’ self-reported controlled motivation. Comparing upper elementary student data with middle-school student data revealed that controlling forms of motivation may be taking their toll on student trust and engagement over time. While parental incentives positively predicted both cognitive and emotional engagement in the younger sub-group, such forms of controlling motivation negatively predicted both trust in teachers and emotional engagement in the middle-school sub-group. These findings support the claims, posited by Self-Determination Theory, about the dangers of incentivizing learning. Short-term gains belie the underlying damage to motivational processes that lead to decreased intrinsic motivation and achievement. 
Such practices also appear to thwart basic human needs such as relatedness.
Keywords: controlled motivation, student engagement, incentivized learning, mathematics achievement, self-determination theory, student trust
Procedia PDF Downloads 220
3973 Didactic Games for the Development of Reading and Writing: Proeduca Program
Authors: Andreia Osti
Abstract:
The context experienced in the face of the COVID-19 pandemic substantially changed the way children communicate and the way literacy teaching was carried out. Officially, according to the Brazilian Institute of Geography and Statistics, children who should be literate were seriously impacted by the pandemic: the number of illiterate children increased from 1.4 million in 2019 to 2.4 million in 2021. In this context, this work presents partial results of an intervention project in which classroom monitoring of students in the literacy phase was carried out. Methodologically, pedagogical games were developed that work on specific reading and writing content: 1) games with direct regularities and 2) games with contextual regularities. The project involves the elaboration and production of games and their application by the classroom teacher. All work focused on literacy and on improving students' understanding of grapheme-phoneme relationships, aiming to improve reading and writing comprehension levels. The project, still under development, is carried out in two schools and supports 60 students. The teachers participate in the research, as they apply the games produced at the university and monitor the children's learning process. The project is developed with financial support for research from FAPESP, within the public education improvement program PROEDUCA. The initial results show that children are more involved in playful activities, that games provide better moments of interaction in the classroom, and that they result in effective learning, since they constitute a different way of approaching the content to be taught. It is noteworthy that the pedagogical games produced directly involve the teaching and learning processes of curricular components, in this case reading and writing, which are basic components in elementary education, and that specific and guided activities are planned as in literacy methods. In this presentation, some of the materials developed will be shown, as well as the results of the assessments carried out with the students. In relation to the Sustainable Development Goals (SDGs), this project is linked to SDG 4 (Quality Education) and SDG 10 (Reduced Inequalities). The research seeks to improve public education and promote the articulation between theory and practice in the educational context, with a view to consolidating the tripod of teaching, research and university extension and promoting a humanized education.
Keywords: didactic, teaching, games, learning, literacy
Procedia PDF Downloads 23
3972 Task Based Language Learning: A Paradigm Shift in ESL/EFL Teaching and Learning: A Case Study Based Approach
Authors: Zehra Sultan
Abstract:
The study is based on the task-based language teaching approach, which is found to be very effective in the EFL/ESL classroom. This approach engages learners in acquiring authentic language skills by interacting with the real world through a sequence of pedagogical tasks. The use of technology enhances the effectiveness of this approach. This study throws light on the historical background of TBLT and its efficacy in the EFL/ESL classroom. In addition, this study talks precisely about the implementation of this approach in the General Foundation Programme of Muscat College, Oman. It furnishes the list of pedagogical tasks embedded in the language curriculum of the General Foundation Programme (GFP), which are skillfully aligned with the College Graduate Attributes. Moreover, the study also discusses the challenges pertaining to this approach from the point of view of teachers, students, and its classroom application. Additionally, the operational success of this methodology is gauged through the formative assessments of the GFP, which is apparent in the students’ progress.
Keywords: task-based language teaching, authentic language, communicative approach, real world activities, ESL/EFL activities
Procedia PDF Downloads 124
3971 Manage an Acute Pain Unit based on the Balanced Scorecard
Authors: Helena Costa Oliveira, Carmem Oliveira, Rita Moutinho
Abstract:
The Balanced Scorecard (BSC) is a continuous strategic monitoring model focused not only on financial issues but also on internal processes, patients/users, and learning and growth. Initially dedicated to business management, it currently serves organizations of other natures, such as hospitals. This paper presents a BSC designed for a Portuguese Acute Pain Unit (APU). This study is qualitative and based on the experience of collaborators at the APU. The management of the APU is based on four perspectives: users, internal processes, learning and growth, and financial and legal. For each perspective, strategic objectives, critical factors, lead indicators and initiatives were identified. The strategic map of the APU outlines the sustained strategic relations among strategic objectives. This study contributes to the development of research in the health management area, as it explores how organizational insufficiencies and inconsistencies in this particular case can be addressed through the identification of critical factors, in order to clearly establish core outcomes and the initiatives to set up.
Keywords: acute pain unit, balanced scorecard, hospital management, organizational performance, Portugal
Procedia PDF Downloads 148
3970 Teaching Practitioners to Use Technology to Support and Instruct Students with Autism Spectrum Disorders
Authors: Nicole Nicholson, Anne Spillane
Abstract:
The purpose of this quantitative, descriptive analysis was to determine the success of a post-graduate new teacher education program designed to teach educators the knowledge and skills necessary to use technology in the classroom, improve the ability to communicate with stakeholders, and implement EBPs and UDL principles into instruction for students with ASD (Autism Spectrum Disorders). The success of candidates (n=20) in the program provided evidence as to how candidates were effectively able to use technology to create meaningful learning opportunities and implement EBPs for individuals with ASD. ≥90% of participants achieved the following competencies: podcast creation; use of technology to share information about assistive technology; and creation of a resource website on ASD (including information on EBPs, local and national support groups, ASD characteristics, and the latest research on ASD). 59% of students successfully created animations. Results of the analysis indicated that the teacher education program was successful in teaching candidates the desired competencies during its first year of implementation.
Keywords: autism spectrum disorders, ASD, evidence based practices, EBP, universal design for learning, UDL
Procedia PDF Downloads 163
3969 A Scalable Model of Fair Socioeconomic Relations Based on Blockchain and Machine Learning Algorithms-1: On Hyperinteraction and Intuition
Authors: Merey M. Sarsengeldin, Alexandr S. Kolokhmatov, Galiya Seidaliyeva, Alexandr Ozerov, Sanim T. Imatayeva
Abstract:
This series of interdisciplinary studies is an attempt to investigate and develop a scalable model of fair socioeconomic relations built on blockchain, using positive psychology techniques and machine learning algorithms for data analytics. In this particular study, we use a hyperinteraction approach and intuition to investigate their influence on the ‘wisdom of crowds’ via a mobile application created for the purpose of this research. Alongside the public blockchain and a private Decentralized Autonomous Organization (DAO) that we elaborated on the basis of the Ethereum blockchain, a model of fair financial relations among DAO members was developed. We developed a smart contract, the so-called Fair Price Protocol, and used it to implement the model. The data obtained from the mobile application were analyzed with ML algorithms, and the model was tested on football matches.
Keywords: blockchain, Naïve Bayes algorithm, hyperinteraction, intuition, wisdom of crowd, decentralized autonomous organization
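Since the keywords single out the Naïve Bayes algorithm, a minimal sketch of that analysis step is given below using scikit-learn’s GaussianNB; the feature layout (shares of app users predicting each match outcome) and all numerical values are assumptions for illustration, as the abstract does not describe the actual features.

```python
# Illustrative only: Naive Bayes classification of football-match outcomes
# from aggregated crowd predictions. Feature names and values are hypothetical.
import numpy as np
from sklearn.naive_bayes import GaussianNB

# Each row: share of app users predicting home win / draw / away win for one match.
X = np.array([
    [0.72, 0.18, 0.10],
    [0.35, 0.40, 0.25],
    [0.15, 0.25, 0.60],
    [0.55, 0.30, 0.15],
])
y = np.array([1, 0, 0, 1])  # observed outcome: 1 = home win, 0 = otherwise

model = GaussianNB().fit(X, y)
new_match = [[0.65, 0.20, 0.15]]
print(model.predict(new_match))        # predicted class for a new match
print(model.predict_proba(new_match))  # class probabilities
```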
Procedia PDF Downloads 169
3968 Representational Issues in Learning Solution Chemistry at Secondary School
Authors: Lam Pham, Peter Hubber, Russell Tytler
Abstract:
Students’ conceptual understanding of chemistry concepts and phenomena involves the capability to coordinate across the three levels of Johnstone’s triangle model. This triplet model is based on reasoning about chemical phenomena across the macro, sub-micro, and symbolic levels. In chemistry education, there is a need to further examine inquiry-based approaches that enhance students’ conceptual learning and problem-solving skills. This research adopted a directed inquiry pedagogy, based on students constructing and coordinating representations, to investigate senior school students’ capability to move flexibly across Johnstone’s levels when learning dilution and molar concentration concepts. The participants comprise 50 grade 11 students, 20 grade 10 students, and 4 chemistry teachers selected from 4 secondary schools in metropolitan Melbourne, Victoria. This research into classroom practices used ethnographic methodology and involved teachers working collaboratively with the research team to develop representational activities and lesson sequences for a unit on solution chemistry. The representational activities included challenges (Representational Challenges, RCs) that used ‘representational tools’ to assist students to move across Johnstone’s three levels for dilution phenomena. In this report, the ‘representational tool’ called the ‘cross and portion’ model was developed and used in teaching and learning the molar concentration concept. Students’ conceptual understanding and problem-solving skills when learning with this model are analysed through group case studies of year 10 and year 11 chemistry students. In learning dilution concepts, students in both group case studies actively conducted a practical experiment and used their own language and visualisation skills to represent dilution phenomena at the macroscopic level (RC1). At the sub-microscopic level, students generated and negotiated representations of the chemical interactions between solute and solvent underpinning the dilution process. At the symbolic level, students demonstrated their understanding of dilution concepts by drawing chemical structures and performing mathematical calculations. When learning molar concentration with the ‘cross and portion’ model (RC2), students coordinated across visual and symbolic representational forms and across Johnstone’s levels to construct representations. The analysis showed that in RC1, year 10 students needed more scaffolding when being introduced to representations in order to make explicit the form and function of sub-microscopic representations. In RC2, year 11 students used visual representations (drawings) with clarity and linked them to mathematics to solve representational challenges about molar concentration. In contrast, year 10 students struggled to match up the two systems: the symbolic system of moles per litre (‘cross and portion’) and the visual representation (drawing). These difficulties do not lie in the students’ capability for mathematical calculation but rather in their capability to align visual representations with the symbolic mathematical formulations. This research also found that students in both group case studies were able to coordinate representations when probed about the use of the ‘cross and portion’ model (in RC2) to demonstrate the molar concentration of diluted solutions (in RC1). Students mostly succeeded in constructing ‘cross and portion’ models to represent the reduction in molar concentration as solutions were diluted.
In conclusion, this research demonstrated how the strategic introduction and coordination of chemical representations across modes and across the macro, sub-micro, and symbolic levels supported student reasoning and problem solving in chemistry.
Keywords: cross and portion, dilution, Johnstone's triangle, molar concentration, representations
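For reference, the quantitative relations that RC1 and RC2 ask students to coordinate can be stated compactly; the numerical example below is illustrative and not drawn from the study’s data.

```latex
% Molar concentration and the dilution relation
c = \frac{n}{V} \qquad C_1 V_1 = C_2 V_2
% Illustrative example: diluting 50 mL of a 2.0 M solution to 200 mL
C_2 = \frac{C_1 V_1}{V_2}
    = \frac{2.0\,\text{M} \times 0.050\,\text{L}}{0.200\,\text{L}}
    = 0.50\,\text{M}
```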
Procedia PDF Downloads 137
3967 Development and Validation of a Quantitative Measure of Engagement in the Analysing Aspect of Dialogical Inquiry
Authors: Marcus Goh Tian Xi, Alicia Chua Si Wen, Eunice Gan Ghee Wu, Helen Bound, Lee Liang Ying, Albert Lee
Abstract:
The Map of Dialogical Inquiry provides a conceptual look at the underlying nature of future-oriented skills. According to the Map, learning is learner-oriented, with conversational time shifted from teachers to learners, who play a strong role in deciding what and how they learn. For example, in courses operating on the principles of Dialogical Inquiry, learners left the classroom with a deeper understanding of the topic, broader exposure to differing perspectives, and stronger critical thinking capabilities than under traditional approaches to teaching. Despite its contributions to learning, the Map is grounded in a qualitative approach, both in its development and in its application for providing feedback to learners and educators: studies hinge on open-ended responses from Map users, which can be time-consuming and resource-intensive to collect and analyse. The present research is motivated by this gap in practicality and aims to develop and validate a quantitative measure of the Map. A quantifiable measure may also strengthen applicability by making learning experiences trackable and comparable. The Map outlines eight learning aspects that learners should engage holistically; this research focuses on the Analysing aspect. According to the Map, Analysing has four key components: liking or engaging in logic, using interpretative lenses, seeking patterns, and critiquing and deconstructing. Existing scales of related constructs (e.g., critical thinking, rationality) were identified so that items for the current scale could be adapted from them. Specifically, items were phrased beginning with an "I" followed by an action phrase, so as to assess learners' engagement with Analysing either in general or in classroom contexts. Following standard scale development procedure, the 26-item Analysing scale was administered to 330 participants alongside existing scales with varying levels of association to Analysing, in order to establish construct validity. Subsequently, the scale was refined and its dimensionality, reliability, and validity were determined. Confirmatory factor analysis (CFA) tested whether scale items loaded onto the four factors corresponding to the components of Analysing. To refine the scale, items were systematically removed via an iterative procedure, according to their factor loadings and the results of likelihood ratio tests at each step; eight items were removed this way. The Analysing scale is better conceptualised as unidimensional, rather than comprising the four components identified by the Map, for three reasons: 1) the covariance matrix of the model specified for the CFA was not positive definite, 2) correlations among the four factors were high, and 3) exploratory factor analyses did not yield an easily interpretable factor structure of Analysing. Regarding validity, the Analysing scale had higher correlations with conceptually similar scales than with conceptually distinct scales, with minor exceptions, so construct validity was largely established. Overall, the satisfactory reliability and validity of the scale suggest that the current procedure can yield a valid and easy-to-use measure for each aspect of the Map.
Keywords: analytical thinking, dialogical inquiry, education, lifelong learning, pedagogy, scale development
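A minimal sketch of the iterative item-removal loop is given below, assuming a hypothetical fit_cfa helper (any CFA backend, e.g. semopy, could fill this role) that returns a fitted model’s log-likelihood, free-parameter count, and standardised loadings; the abstract does not specify the exact nesting used in its likelihood ratio tests, so this is one plausible reconstruction, not the authors’ procedure.

```python
# Sketch of iterative item removal guided by factor loadings and LR tests.
# fit_cfa is a hypothetical callable: fit_cfa(items) -> object with
# .loglik (float), .n_params (int), .loadings (dict: item -> loading).
from scipy.stats import chi2

def lr_pvalue(ll_larger, ll_smaller, df_diff):
    # p-value of the likelihood-ratio statistic for two nested models
    return chi2.sf(2.0 * (ll_larger - ll_smaller), df_diff)

def prune_scale(items, fit_cfa, alpha=0.05, min_loading=0.40):
    """Drop the weakest-loading item while the LR test indicates no
    significant loss of fit; stop once all loadings are acceptable."""
    items = list(items)
    while len(items) > 2:
        fitted = fit_cfa(items)
        weakest = min(items, key=lambda i: abs(fitted.loadings[i]))
        if abs(fitted.loadings[weakest]) >= min_loading:
            break  # every remaining item loads acceptably
        reduced = fit_cfa([i for i in items if i != weakest])
        p = lr_pvalue(fitted.loglik, reduced.loglik,
                      fitted.n_params - reduced.n_params)
        if p < alpha:
            break  # removal would significantly worsen fit
        items.remove(weakest)
    return items
```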
Procedia PDF Downloads 91