Search results for: computational machine learning
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 10022


5672 Iterative Dynamic Programming for 4D Flight Trajectory Optimization

Authors: Kawser Ahmed, K. Bousson, Milca F. Coelho

Abstract:

4D flight trajectory optimization is one of the key ingredients for improving flight efficiency and enhancing air traffic capacity in current air traffic management (ATM). The present paper explores iterative dynamic programming (IDP) as a potential numerical optimization method for 4D flight trajectory optimization. IDP is an iterative version of the dynamic programming (DP) method. Owing to its numerical framework, DP is well suited to nonlinear discrete dynamic systems. The 4D waypoint representation of the flight trajectory is similar to a discretization by a grid system; thus, DP is a natural method for 4D flight trajectory optimization. However, the computational time and space complexity demanded by DP are enormous due to the immense number of grid points required to find the optimum, which prevents the use of DP in many practical high-dimensional problems. On the other hand, IDP has shown potential to deal successfully with high-dimensional optimal control problems even with a small number of grid points at each stage, which reduces the computational effort over the traditional DP approach. Although IDP has been applied successfully to chemical engineering problems, it has yet to be validated on 4D flight trajectory optimization problems. In this paper, IDP is used successfully to generate a minimum-length 4D optimal trajectory that avoids any obstacle in its path, such as a no-fly zone, or residential areas when flying at low altitude to reduce noise pollution.
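To make the approach concrete, the following is a minimal sketch of the IDP idea described in this abstract: dynamic programming is run over a small grid of candidate states at each stage, and the search region is contracted around the best path after every pass. The cost function, no-fly-zone model, grid size and contraction factor are illustrative assumptions, not the authors' implementation.

```python
# A minimal sketch of the IDP idea: DP over a small grid of candidate states at
# each stage, followed by contraction of the search region around the best path.
# The cost function, no-fly-zone model, grid size and contraction factor below
# are illustrative assumptions, not the authors' setup.
import numpy as np

def segment_cost(p, q, obstacle=(5.0, 0.0), radius=1.5, penalty=100.0):
    """Length of a segment plus a penalty if its midpoint enters the no-fly zone."""
    length = np.hypot(*(np.asarray(q) - np.asarray(p)))
    mid = 0.5 * (np.asarray(p) + np.asarray(q))
    violation = max(0.0, radius - np.hypot(mid[0] - obstacle[0], mid[1] - obstacle[1]))
    return length + penalty * violation

def idp_path(n_stages=11, n_grid=7, half_width=4.0, contraction=0.7, iters=15):
    xs = np.linspace(0.0, 10.0, n_stages)          # fixed along-track stations
    best_y = np.zeros(n_stages)                    # nominal (straight) path
    for _ in range(iters):
        grids = [best_y[k] + np.linspace(-half_width, half_width, n_grid)
                 for k in range(n_stages)]
        grids[0][:] = 0.0                          # fixed start
        grids[-1][:] = 0.0                         # fixed goal
        # backward DP over the candidate grids
        cost_to_go = np.zeros(n_grid)
        choice = np.zeros((n_stages - 1, n_grid), dtype=int)
        for k in range(n_stages - 2, -1, -1):
            new_cost = np.empty(n_grid)
            for i, y in enumerate(grids[k]):
                c = [segment_cost((xs[k], y), (xs[k + 1], y2)) + cost_to_go[j]
                     for j, y2 in enumerate(grids[k + 1])]
                choice[k, i] = int(np.argmin(c))
                new_cost[i] = min(c)
            cost_to_go = new_cost
        # forward pass to recover the best path, then contract the region
        idx = 0
        for k in range(n_stages - 1):
            best_y[k] = grids[k][idx]
            idx = choice[k, idx]
        best_y[-1] = grids[-1][idx]
        half_width *= contraction
    return xs, best_y

xs, ys = idp_path()
print(np.round(ys, 2))   # lateral offsets that bend around the no-fly zone at x=5
```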

Keywords: 4D waypoint navigation, iterative dynamic programming, obstacle avoidance, trajectory optimization

Procedia PDF Downloads 144
5671 Modelling and Assessment of an Off-Grid Biogas Powered Mini-Scale Trigeneration Plant with Prioritized Loads Supported by Photovoltaic and Thermal Panels

Authors: Lorenzo Petrucci

Abstract:

This paper is intended to give insight into the potential use of small-scale off-grid trigeneration systems powered by biogas generated on a dairy farm. The off-grid plant under analysis comprises a dual-fuel genset as well as electrical and thermal storage equipment and an adsorption machine. The loads are the various pieces of equipment used on the dairy farm, a household where the workers live, and a small electric vehicle whose batteries can also be used as a power source in case of emergency. The insertion of an adsorption machine in the plant is mainly justified by the abundance of thermal energy and the simultaneous high cooling demand associated with the milk-chilling process. In the evaluated operational scenario, our research highlights the importance of prioritizing specific small loads which cannot tolerate an interruption of the power supply over time. As a consequence, a photovoltaic and thermal (PVT) panel is included in the plant and is tasked with providing energy independently of potentially disruptive events such as engine malfunctioning or scarce and unstable supplies of fuel. To manage the plant efficiently, an energy dispatch strategy is created to control the flow of energy between the power sources and the thermal and electrical storage. In this article, we elaborate models of the equipment and, from these models, extract parameters useful for building load-dependent profiles of the prime movers and storage efficiencies. We show that, under reasonable assumptions, the analysis provides a sensible estimate of the generated energy. The simulations indicate that a diesel generator sized 25% above the total electrical peak demand operates below the minimum acceptable load threshold 65% of the time. To circumvent such a critical operating mode, dump loads are added through the activation and deactivation of small resistors. In this way, the excess electric energy generated can be transformed into useful heat. The combination of the PVT panel and electrical storage to support the prioritized loads in an emergency scenario is evaluated on the two days of the year having the lowest and highest irradiation values, respectively. The results show that the renewable energy component of the plant can successfully sustain the prioritized loads, and only on the day with very low irradiation does it also need the support of the EV's battery. Finally, we show that the adsorption machine can reduce the ice builder and air conditioning energy consumption by 40%.
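As an illustration of the kind of dispatch rule described (genset minimum-load threshold, dump loads, and PVT-plus-battery support of the prioritized loads), a minimal sketch of one dispatch time step is given below; all capacities, thresholds and the simple rule structure are assumptions for illustration, not the plant model used in the paper.

```python
# A minimal sketch of a rule-based dispatch step: the genset avoids running
# below a minimum-load threshold, dump loads absorb excess generation, and the
# PVT panel plus battery cover the loads. All values are illustrative assumptions.
def dispatch_step(load_kw, priority_kw, pv_kw, soc_kwh,
                  genset_rated_kw=40.0, min_load_frac=0.30,
                  battery_capacity_kwh=20.0, dt_h=0.25):
    """One time step of a simple priority-aware dispatch rule (dt_h hours long)."""
    residual = load_kw + priority_kw - pv_kw       # demand left after the PVT panel
    genset_kw = battery_kw = dump_kw = 0.0
    if residual <= 0.0:
        # PVT surplus: charge the battery, dump whatever cannot be stored
        charge_kw = min(-residual, (battery_capacity_kwh - soc_kwh) / dt_h)
        battery_kw = -charge_kw                    # negative = charging
        dump_kw = -residual - charge_kw
    elif residual < min_load_frac * genset_rated_kw:
        # demand too low for the genset: serve it from the battery if possible,
        # otherwise run the genset at its minimum load and dump the excess
        discharge_kw = min(residual, soc_kwh / dt_h)
        battery_kw = discharge_kw
        if discharge_kw < residual:
            genset_kw = min_load_frac * genset_rated_kw
            dump_kw = genset_kw - (residual - discharge_kw)
    else:
        genset_kw = min(residual, genset_rated_kw)
    soc_kwh = min(max(soc_kwh - battery_kw * dt_h, 0.0), battery_capacity_kwh)
    return genset_kw, battery_kw, dump_kw, soc_kwh

# Hypothetical quarter-hour step: 6 kW farm load, 1 kW prioritized load, 3 kW PVT
print(dispatch_step(load_kw=6.0, priority_kw=1.0, pv_kw=3.0, soc_kwh=10.0))
```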

Keywords: hybrid power plants, mathematical modeling, off-grid plants, renewable energy, trigeneration

Procedia PDF Downloads 159
5670 Parents and Stakeholders’ Perspectives on Early Reading Intervention Implemented as a Curriculum for Children with Learning Disabilities

Authors: Bander Mohayya Alotaibi

Abstract:

Valuable partnerships between parents and teachers may develop positive and effective interactions between home and school. This helps these stakeholders share information and resources regarding student academics during ongoing interactions. Thus, partnerships build a solid foundation for both families and schools to help children succeed in school. Parental involvement can be seen as an effective tool that can change homes and communities, not just school systems. Seeking parents' and stakeholders' attitudes toward learning and learners can help schools design a curriculum. Subsequently, this information can be used to find ways to help improve the academic performance of students, especially in low-performing schools. There may be some conflicts when designing a curriculum. In addition, designing a curriculum might bring more educational expectations to all sides. There is a lack of research that targets the specific attitudes of parents toward specific concepts in curriculum content. More research is needed to study the perspectives that parents of children with learning disabilities (LD) have regarding an early reading curriculum. Parents' and stakeholders' perspectives on an early reading intervention implemented as a curriculum for children with LD were studied through advanced quantitative research. The purpose of this study is to understand stakeholders' and parents' perspectives of key concepts and essential early reading skills that impact the design of a curriculum that will serve as an intervention for early struggling readers who have LD. Those concepts or stages include phonics, phonological awareness, and reading fluency, as well as strategies used at home by parents. A survey instrument was used to gather the data. Participants were recruited through 29 schools and districts in a metropolitan area of the northern part of Saudi Arabia. Participants were stakeholders, including parents of children with learning disabilities. Data were collected by distributing a paper-and-pen survey to schools. Psychometric properties of the instrument were evaluated for the validity and reliability of the survey; face validity, content validity, and construct validity, including an exploratory factor analysis, were used to shape and reevaluate the structure of the instrument. Multivariate analysis of variance (MANOVA) was used to find differences between the variables. The study reports the perspectives of stakeholders toward reading strategies, phonics, phonological awareness, and reading fluency. Suggestions and limitations are also discussed.
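For readers unfamiliar with the analysis, a hedged sketch of a MANOVA of the kind described is shown below using statsmodels; the variable names and the synthetic data are assumptions for illustration only, not the study's instrument or data.

```python
# A hedged sketch of the kind of MANOVA described above, using statsmodels.
# The variable names (phonics, awareness, fluency, strategies, group) and the
# synthetic data are assumptions for illustration only, not the study's data.
import numpy as np
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

rng = np.random.default_rng(42)
n = 120
df = pd.DataFrame({
    "group": rng.choice(["parent", "teacher", "administrator"], size=n),
    "phonics": rng.normal(4.0, 0.6, n),        # Likert-style composite scores
    "awareness": rng.normal(3.8, 0.7, n),
    "fluency": rng.normal(3.5, 0.8, n),
    "strategies": rng.normal(4.1, 0.5, n),
})

# Do the four perspective scores differ across stakeholder groups?
maov = MANOVA.from_formula("phonics + awareness + fluency + strategies ~ group", data=df)
print(maov.mv_test())   # Wilks' lambda, Pillai's trace, etc.
```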

Keywords: stakeholders, learning disability, early reading, perspectives, parents, intervention, curriculum

Procedia PDF Downloads 139
5669 3D Multiuser Virtual Environments in Language Teaching

Authors: Hana Maresova, Daniel Ecler

Abstract:

The paper focuses on the use of 3D multi-user virtual environments (MUVE) in language teaching and presents the results of four years of research at the Faculty of Education, Palacký University in Olomouc (Czech Republic). In an experiment, mother-tongue teaching was implemented in the 3D virtual worlds Second Life and Kitely (experimental group), in parallel with traditional teaching of identical topics, representing the teacher's interpretation using a textbook (control group). A didactic test, presented to the experimental and control groups in an identical form before and after the instruction, verified the effect of the instruction in the experimental group by comparing the results obtained by both groups. Within the three components of mother-tongue teaching (vocabulary, literature, style and communication education), the students in the experimental group achieved partially better results in literature (statistically significant for items devoted to visualization of the learning topic), while in grammar and style education the respondents of the control group achieved better results. On the basis of the results obtained, we conclude that MUVE are most appropriately used for teaching topics that offer the possibility of dramatization, experiential learning, and group involvement and cooperation; conversely, given the need to divide students' attention between the topic taught and the control of the avatar and movement in virtual reality, they are less suitable for teaching aimed at memorization of topics or concepts.

Keywords: distance learning, 3D virtual environments, online teaching, language teaching

Procedia PDF Downloads 147
5668 A Modular Framework for Enabling Analysis for Educators with Different Levels of Data Mining Skills

Authors: Kyle De Freitas, Margaret Bernard

Abstract:

Enabling data mining analysis among a wider audience of educators is an active area of research within the educational data mining (EDM) community. This paper proposes a framework for developing an environment that caters both for educators who have few technical data mining skills and for more advanced users with some data mining expertise. The framework architecture was developed through a review of the strengths and weaknesses of existing models in the literature. The proposed framework provides a modular architecture that allows future researchers to focus on the development of specific areas within the EDM process. Finally, the paper highlights a strategy for enabling analysis through either the use of predefined questions or a guided data mining process, and shows how the developed questions and the analyses conducted can be reused and extended over time.
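A minimal sketch of the "predefined questions" strategy is given below: each question is registered as a reusable analysis module that a non-technical educator can invoke by name, while advanced users can add new modules. The class design, question wording and pass threshold are illustrative assumptions rather than the framework's actual interface.

```python
# A minimal sketch of the "predefined questions" idea: each question is a
# reusable analysis module. The question text and the pass-mark threshold
# are illustrative assumptions, not part of the proposed framework itself.
from typing import Callable, Dict, List

class EDMFramework:
    def __init__(self) -> None:
        self._modules: Dict[str, Callable[[List[dict]], object]] = {}

    def register(self, question: str, analysis: Callable[[List[dict]], object]) -> None:
        """Advanced users plug in new analysis modules under a plain-language question."""
        self._modules[question] = analysis

    def ask(self, question: str, records: List[dict]) -> object:
        """Non-technical educators reuse an existing module by asking its question."""
        return self._modules[question](records)

framework = EDMFramework()
framework.register(
    "Which students are at risk of failing?",
    lambda records: [r["student"] for r in records if r["grade"] < 50],
)

lms_export = [{"student": "A", "grade": 42}, {"student": "B", "grade": 71}]
print(framework.ask("Which students are at risk of failing?", lms_export))  # ['A']
```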

Keywords: educational data mining, learning management system, learning analytics, EDM framework

Procedia PDF Downloads 309
5667 Customized Temperature Sensors for Sustainable Home Appliances

Authors: Merve Yünlü, Nihat Kandemir, Aylin Ersoy

Abstract:

Temperature sensors are used in home appliances not only to monitor the basic functions of the machine but also to minimize energy consumption and ensure safe operation. In parallel with the development of smart home applications and IoT algorithms, these sensors produce important data such as the frequency of use of the machine and user preferences, and allow the compilation of critical data for diagnostic processes and fault detection throughout an appliance's operational lifespan. Commercially available thin-film resistive temperature sensors have a well-established manufacturing procedure that allows them to operate over a wide temperature range. However, these sensors are over-designed for white goods applications. The operating temperature range of these sensors is between -70°C and 850°C, while the temperature range required in home appliance applications is between 23°C and 500°C. To ensure the operation of commercial sensors over this wide temperature range, a platinum coating of approximately 1-micron thickness is usually applied to the wafer. However, the use of platinum in the coating and the high coating thickness extend the sensor production process time and therefore increase sensor costs. In this study, an attempt was made to develop a low-cost temperature sensor design and production method that meets the technical requirements of white goods applications. For this purpose, a custom design was made, and the design parameters (length, width, trim points, and thin-film deposition thickness) were optimized using statistical methods to achieve the desired resistivity value. To develop the thin-film resistive temperature sensors, a single-side-polished sapphire wafer was used. To enhance adhesion and insulation, 100 nm of silicon dioxide was deposited by the inductively coupled plasma chemical vapor deposition technique. The lithography process was performed with a direct laser writer. The lift-off process was performed after the e-beam evaporation of 10 nm titanium and 280 nm platinum layers. Standard four-point probe sheet resistance measurements were made at room temperature. Resistivity measurements were made with a probe station before and after annealing at 600°C in a rapid thermal processing machine. The temperature dependence between 25 °C and 300 °C was also tested. As a result of this study, a temperature sensor has been developed that has a lower coating thickness than commercial sensors but can produce reliable data over the white goods application temperature range. A relatively simplified but optimized production method has also been developed to produce this sensor.
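For illustration, the sketch below computes the two quantities that drive such a design: the nominal resistance set by the trace geometry (sheet resistance times the number of squares) and the near-linear resistance-temperature response of a platinum thin film. The geometry, sheet resistance and temperature coefficient values are assumptions, not the measured values of the sensor developed in this study.

```python
# Illustrative calculation of trace resistance and platinum RTD response.
# Geometry, sheet resistance and TCR are assumed example values, not the
# measured properties of the sensor developed in the study.
def trace_resistance(length_um: float, width_um: float, sheet_res_ohm_sq: float) -> float:
    """Resistance of a meandered trace: R = Rs * (L / W)."""
    return sheet_res_ohm_sq * (length_um / width_um)

def platinum_rtd(r0_ohm: float, temp_c: float, t0_c: float = 0.0,
                 alpha_per_c: float = 3.85e-3) -> float:
    """First-order approximation: R(T) = R0 * (1 + alpha * (T - T0))."""
    return r0_ohm * (1.0 + alpha_per_c * (temp_c - t0_c))

r0 = trace_resistance(length_um=20000, width_um=20, sheet_res_ohm_sq=0.1)  # ~100 ohm
for t in (25, 150, 300):        # the 25-300 C range tested in the study
    print(t, "C ->", round(platinum_rtd(r0, t), 1), "ohm")
```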

Keywords: thin film resistive sensor, temperature sensor, household appliance, sustainability, energy efficiency

Procedia PDF Downloads 56
5666 Evaluation and Assessment of Bioinformatics Methods and Their Applications

Authors: Fatemeh Nokhodchi Bonab

Abstract:

Bioinformatics, in its broad sense, involves the application of computer processes to solve biological problems. A wide range of computational tools is needed to process effectively and efficiently the large amounts of data being generated as a result of recent technological innovations in biology and medicine. A number of computational tools have been developed or adapted to deal with the experimental riches of complex and multivariate data and the transition from data collection to information or knowledge. These bioinformatics tools are being evaluated and applied in various medical areas, including early detection, risk assessment, classification, and prognosis of cancer. The goal of these efforts is to develop and identify bioinformatics methods with optimal sensitivity, specificity, and predictive capabilities. The recent flood of data from genome sequences and functional genomics has given rise to a new field, bioinformatics, which combines elements of biology and computer science. Bioinformatics conceptualizes biology in terms of macromolecules (in the sense of physical chemistry) and then applies "informatics" techniques (derived from disciplines such as applied mathematics, computer science, and statistics) to understand and organize the information associated with these molecules on a large scale. Here we propose a definition for this new field and review some of the research that is being pursued, particularly in relation to transcriptional regulatory systems.

Keywords: methods, applications, transcriptional regulatory systems, techniques

Procedia PDF Downloads 107
5665 Searching the Relationship among Components that Contribute to Interactive Plight and Educational Execution

Authors: Shri Krishna Mishra

Abstract:

In an educational context, technology can prompt interactive plight only when it is used in conjunction with interactive plight methods. This study therefore examines the relationships among components that contribute to higher levels of interactive plight and execution, such as interactive plight methods, technology, intrinsic motivation and deep learning. A total of 526 students participated in this study. Using structural equation modelling, the authors test the conceptual model and identify a satisfactory model fit. The results indicate that interactive plight methods, technology and intrinsic motivation have a significant relationship with interactive plight; deep learning mediates the relationships of the other variables with execution.

Keywords: searching the relationship among components, contribute to interactive plight, educational execution, intrinsic motivation

Procedia PDF Downloads 437
5664 Normalized P-Laplacian: From Stochastic Game to Image Processing

Authors: Abderrahim Elmoataz

Abstract:

More and more contemporary applications involve data in the form of functions defined on irregular and topologically complicated domains (images, meshes, point clouds, networks, etc.). Such data are not organized as familiar digital signals and images sampled on regular lattices. However, they can be conveniently represented as graphs, where each vertex represents measured data and each edge represents a relationship (connectivity, certain affinities or interactions) between two vertices. Processing and analyzing these types of data is a major challenge for both the image processing and machine learning communities. Hence, it is very important to transfer to graphs and networks many of the mathematical tools which were initially developed on usual Euclidean spaces and proven to be efficient for many inverse problems and applications dealing with usual image and signal domains. Historically, the main tools for the study of graphs or networks come from combinatorial and graph theory. In recent years, there has been increasing interest in the investigation on graphs of one of the major mathematical tools for signal and image analysis, namely partial differential equation (PDE) and variational methods. The normalized p-Laplacian operator has recently been introduced to model a stochastic game called the tug-of-war game with noise. Part of the interest of this class of operators arises from the fact that it includes, as particular cases, the infinity Laplacian, the mean curvature operator and the traditional Laplacian operators, which have been used extensively to model and solve problems in image processing. The purpose of this paper is to introduce and study a new class of normalized p-Laplacians on graphs. The introduction is based on the extension of p-harmonious functions, introduced as a discrete approximation for both the infinity Laplacian and p-Laplacian equations. Finally, we propose to use these operators as a framework for solving many inverse problems in image processing.
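For reference, a standard continuum form of the normalized (game) p-Laplacian from the tug-of-war-with-noise literature, together with one common discrete analogue based on p-harmonious averaging over a graph neighbourhood, is recalled below; the exact weights and construction used by the author may differ.

```latex
% Normalized (game) p-Laplacian: a convex combination of the Laplacian and the
% normalized infinity Laplacian (standard form from the tug-of-war literature).
\Delta_p^{N} u \;=\; \frac{1}{p}\,\Delta u \;+\; \frac{p-2}{p}\,\Delta_\infty^{N} u,
\qquad
\Delta_\infty^{N} u \;=\; \frac{\langle D^2 u\,\nabla u,\nabla u\rangle}{|\nabla u|^{2}}.

% One common discrete analogue on a graph: p-harmonious averaging over the
% neighbourhood N(v) of a vertex v, with weights \alpha + \beta = 1 depending on p.
\Delta_p^{N} f(v) \;=\;
\frac{\alpha}{2}\Big(\max_{w\in N(v)} f(w) + \min_{w\in N(v)} f(w) - 2 f(v)\Big)
\;+\; \beta\Big(\frac{1}{|N(v)|}\sum_{w\in N(v)} f(w) - f(v)\Big).
```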

Keywords: normalized p-laplacian, image processing, stochastic game, inverse problems

Procedia PDF Downloads 496
5663 Audio-Visual Aids and the Secondary School Teaching

Authors: Shrikrishna Mishra, Badri Yadav

Abstract:

In today's complex society, where experiences are innumerable and varied, it is not possible to present every situation in its original colours; hence, opportunities for learning by actual experience are not always available. It is only through the use of proper audio-visual aids that life situations can be brought into the classroom by an enlightened teacher in their simplest form, representing the original with the highest degree of similarity, which is totally absent in the verbal or lecture method. In the presence of audio-visual aids, attention is attracted, interest is roused and a suitable atmosphere for proper understanding is automatically created, whereas in the existing traditional method greater efforts must be made to achieve these essential requisites. In spite of the best and most sincere efforts on the part of the teacher, the net effect as regards understanding or learning in general is quite negligible.

Keywords: audio-visual aids, secondary school teaching, complex society, audio

Procedia PDF Downloads 469
5662 The Impact of Science Teachers' Epistemological Beliefs and Metacognition on Their Use of Inquiry Based Teaching Approaches

Authors: Irfan Ahmed Rind

Abstract:

Science education has recently become a top priority of the government of Pakistan. A number of schemes have been initiated for the improvement of science teaching and learning at the primary and secondary levels of education, most importantly training in-service science teachers in inquiry-based teaching and learning to empower students and encourage creativity, critical thinking, and innovation among them. Therefore, this approach has been promoted in recent continuous professional development trainings for in-service teachers. However, follow-ups on trained science teachers and educators suggest that these teachers fail to implement inquiry-based teaching and learning in their classes. In addition, these trainings also fail to bring any significant change in students' science content knowledge and understanding, as shown by the annual national-level surveys conducted by the government and independent agencies. Research suggests that science has been taught using scientific positivism, which supports objectivity based on experiments and mathematics. In contrast, inquiry-based teaching and learning are based on constructivism, which conflicts with the positivist epistemology of science teachers. It was therefore assumed that science teachers struggle to implement the inquiry-based teaching approach because it conflicts with their basic epistemological beliefs. With this assumption, this research aimed to (i) understand how science teachers conceptualize the nature of science, and how this influences their understanding of learning, learners, their own roles as teachers and their teaching strategies, (ii) identify the conflict of science teachers' epistemological beliefs with the inquiry-based teaching approach, and (iii) find the ways in which science teachers' epistemological beliefs may be developed from positivism to constructivism, so that they may effectively use the inquiry-based teaching approach in teaching science. Using a qualitative case study approach, thirty-six secondary and higher secondary science teachers (21 male and 15 female) were selected. Data were collected using interviews, participatory observations (sixty lessons were observed), and twenty interviews with students for verification of teachers' responses. The findings suggest that most of the science teachers were positivist in defining the nature of science. Most of them limit themselves to the one fixed answer that is provided in the books and believe that there is only one 'right' way to teach science. There is no room for students' or teachers' own opinion or bias when it comes to scientific concepts. Inquiry-based teaching seems 'not right' to them. They find it difficult to allow students to think outside the box. However, some interesting exercises were found to be very effective in bringing about change in teachers' epistemological beliefs. These will be discussed in detail in the paper. The findings have major implications for teachers, educators, and policymakers.

Keywords: science teachers, epistemology, metacognition, inquiry based teaching

Procedia PDF Downloads 138
5661 How to Enhance Performance of Universities by Implementing Balanced Scorecard with Using FDM and ANP

Authors: Neda Jalaliyoon, Nooh Abu Bakar, Hamed Taherdoost

Abstract:

The present research recommends a balanced scorecard (BSC) framework to appraise the performance of universities. As the original balanced scorecard model has four perspectives, the same model, with the "financial", "customer", "internal process" and "learning and growth" perspectives, is used to implement BSC in the present research. By applying the fuzzy Delphi method (FDM) and a questionnaire, sixteen performance measures were identified. Moreover, using the analytic network process (ANP), the weights of the selected indicators were determined. Results indicated that the most important BSC perspectives were Internal Process (0.3149), Customer (0.2769), Learning and Growth (0.2049), and Financial (0.2033), respectively. The proposed BSC framework can help universities to enhance their efficiency in a competitive environment.
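As a small worked example, the reported ANP weights can be used to aggregate a university's perspective scores into a single BSC index, as sketched below; the perspective scores themselves are hypothetical example values.

```python
# An illustrative use of the ANP weights reported above to aggregate perspective
# scores into a single BSC index. The perspective scores are hypothetical values.
anp_weights = {
    "internal process": 0.3149,
    "customer": 0.2769,
    "learning and growth": 0.2049,
    "financial": 0.2033,
}

example_scores = {          # hypothetical 0-100 scores built from the sixteen measures
    "internal process": 72,
    "customer": 65,
    "learning and growth": 80,
    "financial": 58,
}

bsc_index = sum(anp_weights[k] * example_scores[k] for k in anp_weights)
print(round(bsc_index, 2))  # weighted composite index, about 68.85 for these values
```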

Keywords: balanced scorecard, higher education, fuzzy delphi method, analytic network process (ANP)

Procedia PDF Downloads 411
5660 Dao Embodied – Embodying Dao: The Body as Locus of Personal Cultivation in Ancient Daoist and Confucian Philosophy

Authors: Geir Sigurðsson

Abstract:

This paper compares ancient Daoist and Confucian approaches to the human body as a locus for learning, edification or personal cultivation. While pointing out some major differences between ancient Chinese and mainstream Western visions of the body, it seeks at the same time inspiration in some seminal Western phenomenological and post-structuralist writings, in particular from Maurice Merleau-Ponty and Pierre Bourdieu. By clarifying the somewhat dissimilar scopes of foci found in Daoist and Confucian philosophies with regard to the role of and attitude to the body, the conclusion is nevertheless that their approaches are comparable, and that both traditions take the physical body to play a vital role in the cultivation of excellence. Lastly, it will be argued that cosmological underpinnings prevent the Confucian li from being rigid and invariable and that it rather emerges as a flexible learning device to train through active embodiment a refined sensibility for one’s cultural environment.

Keywords: body, Confucianism, Daoism, li (ritual), phenomenology

Procedia PDF Downloads 115
5659 Coarse Grid Computational Fluid Dynamics Fire Simulations

Authors: Wolfram Jahn, Jose Manuel Munita

Abstract:

While computational fluid dynamics (CFD) simulations of fire scenarios are commonly used in the design of buildings, less attention has been given to the use of CFD simulations as an operational tool for the fire services. The reason for this lack of attention lies mainly in the fact that CFD simulations typically take long periods of time to complete, and their results would thus not be available in time to be of use during an emergency. Firefighters often face uncertain conditions when entering a building to attack a fire. They would greatly benefit from a technology based on predictive fire simulations, able to assist their decision-making process. The principal constraint on faster CFD simulations is the fine grid necessary to resolve accurately the physical processes that govern a fire. This paper explores the possibility of overcoming this constraint by using coarse-grid CFD simulations for fire scenarios, and proposes a methodology to use the simulation results in a meaningful way that can be used by firefighters during an emergency. Data from real-scale compartment fire tests were used to compare CFD fire models with different grid arrangements, and empirical correlations were obtained to interpolate data points into the grids. The results show that the strongly predominant effect of the heat release rate of the fire on the fluid dynamics allows for the use of coarse grids with relatively low overall impact on the simulation results. Simulations with an acceptable level of accuracy could be run in real time, thus making them useful as a forecasting tool for emergency response purposes.

Keywords: CFD, fire simulations, emergency response, forecast

Procedia PDF Downloads 303
5658 Learners' Perceptions about Teacher Written Feedback in the School of Foreign Languages, Anadolu University

Authors: Gaye Senbag

Abstract:

In English language teaching, feedback is considered one of the main components of writing instruction. Teachers put a lot of time and effort into providing learners with written feedback for effective language learning. At Anadolu University School of Foreign Languages (AUSFL), students are given written feedback on each piece of writing through online platforms such as Edmodo and Turnitin, as well as through traditional methods. However, little is known regarding how learners value and respond to teacher-provided feedback. As the perceptions of students remarkably affect their learning, this study examines how they perceive the effectiveness of feedback provided by the teacher. To this end, 30 intermediate-level (B1+ CEFR level) students were given a questionnaire that includes Likert-scale questions. The results will be discussed in detail.

Keywords: feedback, perceptions, writing, English Language Teaching (ELT)

Procedia PDF Downloads 231
5657 Deploying a Platform as a Service Cloud Solution to Support Student Learning

Authors: Jiangping Wang

Abstract:

This presentation describes the design and implementation of PaaS (platform as a service) cloud-based labs that are used in database-related courses to teach students practical skills. Traditionally, all labs are implemented in a desktop-based environment where students have to install heavy client software to access database servers. In order to release students from that burden, we have successfully deployed a cloud-based solution to support database-related courses, through which students and teachers can practice and learn database topics in various database courses via cloud access. With its development environment, execution runtime, web server, database server, and collaboration capability, it offers a shared pool of configurable computing resources and a comprehensive environment that supports students' needs without the complexity of maintaining the infrastructure.

Keywords: PaaS, database environment, e-learning, web server

Procedia PDF Downloads 255
5656 The Impact of Professional Development in the Area of Technology Enhanced Learning on Higher Education Teaching Practices Across Atlantic Technological University – Research Methodology and Preliminary Findings

Authors: Annette Cosgrove

Abstract:

The objective of this research study is to examine the impact of professional development in technology enhanced learning (TEL) and the digitisation of learning on teaching communities across multiple higher education sites in the ATU (Atlantic Technological University) over 2020-2025, including the proposal of an evidence-based digital teaching model for use in a future pandemic. The research strategy undertaken for this PhD study is a multi-site study using mixed methods: qualitative and quantitative methods are being used to collect data. A pilot study was carried out initially, feedback was collected, and the research instrument was edited to reflect this feedback before being administered. The purpose of the staff questionnaire is to evaluate the impact of professional development in the area of TEL and to capture practitioners' views on the perceived impact on their teaching practice in the higher education sector across ATU (west of Ireland, five higher education locations). The phenomenon being explored is 'the impact of professional development in the area of technology enhanced learning on teaching practice in a higher education institution'. The research methodology chosen for this study is an action-based research study. The researcher has chosen this approach as it is a prime strategy for developing educational theory and enhancing educational practice. This study includes quantitative and qualitative methods to elicit data which will quantify the impact that continuous professional development in the area of digital teaching practice and technologies has on the practitioner's teaching practice in higher education. The research instruments/data collection tools for this study include a lecturer survey with a targeted TEL practice group (pre- and post-COVID experience) and semi-structured interviews with lecturers. This research is currently being conducted across the ATU multi-site campus, targeting higher education lecturers who have completed formal CPD in the area of digital teaching. ATU, a university in the west of Ireland, is the focus of the study. The research questionnaire has been deployed, with 75 respondents to date across the ATU; the primary questionnaire and semi-structured interviews are currently ongoing, the purpose being to evaluate the impact of formal professional development in the area of TEL and its perceived impact on practitioners' teaching practice in the area of digital teaching and learning. This paper will present initial findings, reflections and data from this ongoing research study.

Keywords: TEL, DTL, digital teaching, digital assessment

Procedia PDF Downloads 50
5655 The Reliability and Shape of the Force-Power-Velocity Relationship of Strength-Trained Males Using an Instrumented Leg Press Machine

Authors: Mark Ashton Newman, Richard Blagrove, Jonathan Folland

Abstract:

The force-velocity profile of an individual has been shown to influence success in ballistic movements, independent of the individual's maximal power output; therefore, effective and accurate evaluation of an individual's F-V characteristics, and not solely maximal power output, is important. The relatively narrow range of loads typically utilised during force-velocity profiling protocols, due to the difficulty in obtaining force data at high velocities, may bring into question the accuracy of the F-V slope along with predictions pertaining to the maximum force that the system can produce at zero velocity (F₀) and the theoretical maximum velocity against no load (V₀). As such, the reliability of the slope of the force-velocity profile, as well as V₀, has been shown to be relatively poor in comparison to F₀ and maximal power, and it has been recommended to assess velocity at loads closer to both F₀ and V₀. The aim of the present study was to assess the relative and absolute reliability of an instrumented novel leg press machine which enables the assessment of force and velocity data at loads equivalent to ≤ 10% of one repetition maximum (1RM) through to 1RM during a ballistic leg press movement. The reliability of maximal and mean force, velocity, and power, as well as the respective force-velocity and power-velocity relationships and the linearity of the force-velocity relationship, was evaluated. Sixteen male strength-trained individuals (23.6 ± 4.1 years; 177.1 ± 7.0 cm; 80.0 ± 10.8 kg) attended four sessions; during the initial visit, participants were familiarised with the leg press, modified to include a mounted force plate (Type SP3949, Force Logic, Berkshire, UK) and a Micro-Epsilon WDS-2500-P96 linear positional transducer (LPT) (Micro-Epsilon, Merseyside, UK). Peak isometric force (IsoMax) and a dynamic 1RM, both from a starting position of 81% leg length, were recorded for the dominant leg. In visits two to four, the participants carried out the leg press movement at loads equivalent to ≤ 10%, 30%, 50%, 70%, and 90% 1RM. IsoMax was recorded during each testing visit prior to the dynamic F-V profiling repetitions. The novel leg press machine used in the present study appears to be a reliable tool for measuring F- and V-related variables across a range of loads, including velocities closer to V₀, when compared to some of the findings within the published literature. Both linear and polynomial models demonstrated good to excellent levels of reliability for SFV and F₀, respectively, with reliability for V₀ being good using a linear model but poor using a second-order polynomial model. As such, a polynomial regression model may be most appropriate when using a similar unilateral leg press setup to predict maximal force production capabilities, due to only a 5% difference between F₀ and the obtained IsoMax values, with a linear model being best suited to predicting V₀.
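For context, the sketch below shows the standard linear force-velocity fit underlying this kind of profiling, F = F₀ − a·v, from which V₀ = F₀/a and, for the linear model, Pmax = F₀V₀/4 follow; the force and velocity pairs used are invented example values, not the study's measurements.

```python
# A minimal sketch of the linear force-velocity fit: F = F0 - a*v, with
# V0 = F0/a and, for the linear model, Pmax = F0*V0/4. The mean force/velocity
# pairs below are invented example values, not the study's measurements.
import numpy as np

velocity = np.array([0.35, 0.60, 0.90, 1.25, 1.60])        # m/s, heavy -> light loads
force = np.array([2450.0, 2150.0, 1800.0, 1350.0, 950.0])  # N (example values)

slope, intercept = np.polyfit(velocity, force, 1)   # F = intercept + slope * v
f0 = intercept                                      # force at zero velocity
v0 = -intercept / slope                             # velocity at zero force
sfv = slope                                         # slope of the F-V profile
p_max = f0 * v0 / 4.0                               # apex of the P-V parabola

print(f"F0 = {f0:.0f} N, V0 = {v0:.2f} m/s, SFV = {sfv:.0f} N.s/m, Pmax = {p_max:.0f} W")
```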

Keywords: force-velocity, leg-press, power-velocity, profiling, reliability

Procedia PDF Downloads 38
5654 Use of Large Eddy Simulations Model to Simulate the Flow of Heavy Oil-Water-Air through Pipe

Authors: Salim Al Jadidi, Shian Gao, Shivananda Moolya

Abstract:

A computational fluid dynamics (CFD) technique coupled with a sub-grid-scale (SGS) model is used to study the behavior of heavy oil-water-air flow in a horizontal pipe using the ANSYS Fluent CFD software. The technique suitable for the transport of water-lubricated heavy viscous oil in a horizontal pipe is the core annular flow (CAF) technique. The present study focuses on the numerical study of CAF adopting large eddy simulation (LES). The basic objective of the present study is to gain basic knowledge of the flow behavior of heavy oil in turbulent CAF through a conventional horizontal pipe. This work also focuses on the success and applicability of LES. The simulation of heavy oil-water-air three-phase flow and of two-phase heavy oil-water flow in a conventional horizontal pipe is performed using the ANSYS Fluent 16.2 software. The three-phase heavy oil-water-air flow in the selected pipe is affected by gravity. It is also observed from the results that the air phase and the variation in temperature impact the behavior of the annular stream and the pressure drop. Some results obtained during the study are validated against experimental and simulation results from the literature and show reasonably good agreement between the studies.

Keywords: computational fluid dynamics, gravity, heavy viscous oil, three-phase flow

Procedia PDF Downloads 63
5653 Assessment of Hypersaline Outfalls via Computational Fluid Dynamics Simulations: A Case Study of the Gold Coast Desalination Plant Offshore Multiport Brine Diffuser

Authors: Mitchell J. Baum, Badin Gibbes, Greg Collecutt

Abstract:

This study details a three-dimensional field-scale numerical investigation conducted for the Gold Coast Desalination Plant (GCDP) offshore multiport brine diffuser. Quantitative assessment of diffuser performance with regard to trajectory, dilution and mapping of seafloor concentration distributions was conducted for 100% plant operation. The quasi-steady computational fluid dynamics (CFD) simulations were performed using the Reynolds-averaged Navier-Stokes equations with a k-ω shear stress transport turbulence closure scheme. The study complements a field investigation, which measured brine plume characteristics under similar conditions. The CFD models used an iterative mesh in a domain 400 m long and 200 m wide with an average depth of 24.2 m. Acoustic Doppler current profiler measurements conducted in the companion field study exhibited considerable variability over the water column. The effect of this vertical variability on simulated discharge outcomes was examined. The seafloor slope was also accommodated in the model. Ambient currents varied predominantly in the longshore direction, perpendicular to the diffuser structure. Under these conditions, the alternating port orientation of the GCDP diffuser resulted in simultaneous subjection to co-propagating and counter-propagating ambient regimes. Results from quiescent ambient simulations suggest broad agreement with the empirical scaling arguments traditionally employed in design and regulatory assessments. Simulated dynamic ambient regimes showed that the influence of ambient crossflow upon jet trajectory, dilution and seafloor concentration is significant. The effect of the ambient flow structure and its subsequent influence on jet dynamics is discussed, along with the implications of using these different simulation approaches to inform regulatory decisions.

Keywords: computational fluid dynamics, desalination, field-scale simulation, multiport brine diffuser, negatively buoyant jet

Procedia PDF Downloads 199
5652 Fostering Inclusive Learning: The Role of Intercultural Communication in Multilingual Primary Education

Authors: Ozge Yalciner

Abstract:

Intercultural communication is crucial in the education of multilingual learners in primary grades, significantly influencing their academic and social development. This study explores how intercultural communication intersects with multilingual education, highlighting the importance of culturally responsive teaching practices. It addresses the challenges and opportunities presented by diverse linguistic backgrounds and proposes strategies for creating inclusive and supportive learning environments. The research emphasizes the need for teacher training programs that equip educators with the skills to recognize and address cultural differences, thereby enhancing student engagement and participation. This study was completed in an elementary school in a city in the Midwest, USA. The data was collected through observations and interviews with students and teachers. It discusses the integration of multicultural perspectives in curricula and the promotion of language diversity as an asset. Peer interactions and collaborative learning are highlighted as crucial for developing intercultural competence among young learners. The findings suggest that meaningful intercultural communication fosters a sense of belonging and mutual respect, leading to improved educational outcomes for multilingual students. Prioritizing intercultural communication in primary education is essential for supporting the linguistic and cultural identities of multilingual learners. By adopting inclusive pedagogical approaches and fostering an environment of cultural appreciation, educators can better support their students' academic success and personal growth.

Keywords: diversity, intercultural communication, multilingual learners, primary grades

Procedia PDF Downloads 16
5651 Near-Miss Deep Learning Approach for Neuro-Fuzzy Risk Assessment in Pipelines

Authors: Alexander Guzman Urbina, Atsushi Aoyama

Abstract:

The sustainability of traditional technologies employed in energy and chemical infrastructure poses a big challenge for our society. When making decisions related to the safety of industrial infrastructure, the values of accidental risk become relevant points of discussion. However, the challenge is the reliability of the models employed to obtain the risk data. Such models usually involve a large number of variables and large amounts of uncertainty. The most efficient techniques to overcome those problems are built using artificial intelligence (AI), and more specifically hybrid systems such as neuro-fuzzy algorithms. Therefore, this paper aims to introduce a hybrid algorithm for risk assessment trained using near-miss accident data. As mentioned above, the sustainability of traditional technologies related to energy and chemical infrastructure constitutes one of the major challenges that today's societies and firms are facing. Besides that, the adaptation of those technologies to the effects of climate change in sensitive environments represents a critical concern for safety and risk management. Regarding this issue, it can be argued that the social consequences of catastrophic risks are increasing rapidly, due mainly to the concentration of people and energy infrastructure in hazard-prone areas, aggravated by the lack of knowledge about the risks. In addition to the social consequences described above, and considering the industrial sector as critical infrastructure due to its large impact on the economy in case of failure, industrial safety has become a critical issue for today's society. Regarding this safety concern, pipeline operators and regulators have been performing risk assessments in attempts to evaluate accurately the probabilities of failure of the infrastructure and the consequences associated with those failures. However, estimating accidental risks in critical infrastructure involves substantial effort and cost due to the number of variables involved, the complexity, and the lack of information. Therefore, this paper aims to introduce a well-trained algorithm for risk assessment using deep learning, which could be capable of dealing efficiently with this complexity and uncertainty. The advantage of deep learning using near-miss accident data is that it can be employed in risk assessment as an efficient engineering tool to treat the uncertainty of risk values in complex environments. The basic idea of the near-miss deep learning approach for neuro-fuzzy risk assessment in pipelines is to improve the validity of the risk values by learning from near-miss accidents and imitating human expertise in scoring risks and setting tolerance levels. In summary, the method of deep learning for neuro-fuzzy risk assessment involves a regression analysis called the group method of data handling (GMDH), which consists of determining the optimal configuration of the risk assessment model and its parameters employing polynomial theory.
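To illustrate the GMDH component mentioned above, a minimal sketch of one self-organizing polynomial layer is given below: a quadratic (Ivakhnenko) polynomial is fitted for every pair of inputs, and the candidates with the lowest validation error survive to the next layer. The synthetic features and the number of surviving neurons are assumptions for illustration, not the authors' trained model.

```python
# A minimal sketch of a GMDH-style polynomial layer, as referenced in the abstract.
# All data and selection settings are illustrative assumptions, not the authors' model.
import numpy as np
from itertools import combinations

def quad_features(xi, xj):
    """Quadratic polynomial basis for one pair of inputs (Ivakhnenko polynomial)."""
    return np.column_stack([np.ones_like(xi), xi, xj, xi * xj, xi**2, xj**2])

def gmdh_layer(X_train, y_train, X_val, y_val, keep=4):
    """Fit a quadratic model for every input pair, keep the best ones by validation MSE."""
    candidates = []
    for i, j in combinations(range(X_train.shape[1]), 2):
        A = quad_features(X_train[:, i], X_train[:, j])
        coef, *_ = np.linalg.lstsq(A, y_train, rcond=None)
        pred_val = quad_features(X_val[:, i], X_val[:, j]) @ coef
        mse = float(np.mean((pred_val - y_val) ** 2))
        candidates.append((mse, (i, j), coef))
    candidates.sort(key=lambda c: c[0])
    return candidates[:keep]   # surviving neurons become inputs to the next layer

# Hypothetical usage with synthetic near-miss features (e.g. pressure, corrosion, age, depth)
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
y = 0.5 * X[:, 0] * X[:, 1] + 0.2 * X[:, 2] ** 2 + rng.normal(scale=0.05, size=200)
best = gmdh_layer(X[:150], y[:150], X[150:], y[150:])
print("best input pair:", best[0][1], "validation MSE:", round(best[0][0], 4))
```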

Keywords: deep learning, risk assessment, neuro fuzzy, pipelines

Procedia PDF Downloads 282
5650 Adaptive Programming for Indigenous Early Learning: The Early Years Model

Authors: Rachel Buchanan, Rebecca LaRiviere

Abstract:

Context: The ongoing effects of colonialism continue to be experienced through paternalistic policies and funding processes that cause disjuncture between and across Indigenous early childhood programming on-reserve and in urban and Northern settings in Canada. While various educational organizations and social service providers have risen to address these challenges in the short, medium and long term, there continues to be a lack in nation-wide cohesive, culturally grounded, and meaningful early learning programming for Indigenous children in Canada. Indigenous-centered early learning programs tend to face one of two scaling dilemmas: their program goals are too prescriptive to enable the program to be meaningfully replicated in different cultural/ community settings, or their program goals are too broad to be meaningfully adapted to the unique cultural and contextual needs and desires of Indigenous communities (the “franchise approach”). There are over 600 First Nations communities in Canada representing more than 50 Nations and languages. Consequently, Indigenous early learning programming cannot be applied with a universal or “one size fits all” approach. Sustainable and comprehensive programming must be responsive to each community context, building upon existing strengths and assets to avoid program duplication and irrelevance. Thesis: Community-driven and culturally adapted early childhood programming is critical but cannot be achieved on a large scale within traditional program models that are constrained by prescriptive overarching program goals. Principles, rather than goals, are an effective way to navigate and evaluate complex and dynamic systems. Principles guide an intervention to be adaptable, flexible and scalable. The Martin Family Initiative (MFI) ’s Early Years program engages a principles-based approach to programming. As will be discussed in this paper, this approach enables the program to catalyze existing community-based strengths and organizational assets toward bridging gaps across and disjuncture between Indigenous early learning programs, as well as to scale programming in sustainable, context-responsive and dynamic ways. This paper argues that using a principles-driven and adaptive scaling approach, the Early Years model establishes important learnings for culturally adapted Indigenous early learning programming in Canada. Methodology: The Early Years has leveraged this approach to develop an array of programming with partner organizations and communities across the country. The Early Years began as a singular pilot project in one First Nation. In just three years, it has expanded to five different regions and community organizations. In each context, the program supports the partner organization through different means and to different ends, the extent to which is determined in partnership with each community-based organization: in some cases, this means supporting the organization to build home visiting programming from the ground-up; in others, it means offering organization-specific culturally adapted early learning resources to support the programming that already exists in communities. Principles underpin but do not define the practices of the program in each of these relationships. 
This paper will explore numerous examples of principles-based adaptability within the context of the Early Years, concluding that the program model offers the adaptability and dynamism necessary to respond to the unique and ever-evolving community contexts and needs of Indigenous children today.

Keywords: culturally adapted programming, indigenous early learning, principles-based approach, program scaling

Procedia PDF Downloads 170
5649 Temporal and Spacial Adaptation Strategies in Aerodynamic Simulation of Bluff Bodies Using Vortex Particle Methods

Authors: Dario Milani, Guido Morgenthal

Abstract:

Fluid dynamic computation of wind-caused forces on bluff bodies, e.g., light flexible civil structures or airplane wings at high incidence approaching the ground, is one of the major criteria governing their design. For such structures a significant dynamic response may result, requiring the use of small-scale devices such as guide vanes in bridge design to control these effects. The focus of this paper is on the numerical simulation of the bluff body problem involving multiscale phenomena induced by small-scale devices. One of the solution methods for the CFD simulation that is relatively successful in this class of applications is the vortex particle method (VPM). The method is based on a grid-free Lagrangian formulation of the Navier-Stokes equations, where the velocity field is modeled by particles representing local vorticity. These vortices are convected by the free-stream velocity as well as diffused. This representation yields the main advantages of low numerical diffusion, compact discretization as the vorticity is strongly localized, implicit accounting for the free-space boundary conditions typical for this class of FSI problems, and a natural representation of the vortex creation process inherent in bluff body flows. When the particle resolution reaches the Kolmogorov dissipation length, the method becomes a direct numerical simulation (DNS). However, it is crucial to note that any solution method aims at balancing the computational cost against the achievable accuracy. In the classical VPM, if the fluid domain is discretized by Np particles, the computational cost is O(Np²). For the coupled FSI problem of interest, for example large structures such as long-span bridges, the aerodynamic behavior may be influenced or even dominated by small structural details such as barriers, handrails or fairings. For such geometrically complex and dimensionally large structures, resolving the complete domain with the conventional VPM particle discretization might become prohibitively expensive even for moderate numbers of particles. It is possible to reduce this cost either by reducing the number of particles or by controlling their local distribution. It is also possible to increase the accuracy of the solution without substantially increasing the global computational cost by computing a correction of the particle-particle interaction in some regions of interest. In this paper, different strategies are presented in order to extend the conventional VPM and reduce the computational cost whilst resolving the required details of the flow. The methods include temporal substepping to increase the accuracy of the particle convection in certain regions, as well as dynamically re-discretizing the particle map to locally control the global and local number of particles. Finally, these methods are applied to a test case, and the improvements in the efficiency as well as the accuracy of the proposed extensions to the method are presented. The important benefits in terms of accuracy and computational cost of the combination of these methods are thus presented, along with their relevant applications.
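As a concrete reference for the O(Np²) particle-particle interaction and the temporal substepping discussed here, a minimal 2D vortex particle sketch is given below; the regularization parameter, time step, substep count and simple forward-Euler convection are illustrative assumptions, not the authors' scheme.

```python
# A minimal 2D vortex particle sketch illustrating the O(Np^2) particle-particle
# interaction and temporal substepping discussed above. The blob regularization,
# time step, substep count and forward-Euler convection are illustrative assumptions.
import numpy as np

def induced_velocity(pos, gamma, delta=0.05):
    """Regularized 2D Biot-Savart velocity at every particle (O(Np^2) sum)."""
    dx = pos[:, 0][:, None] - pos[:, 0][None, :]   # pairwise separations
    dy = pos[:, 1][:, None] - pos[:, 1][None, :]
    r2 = dx**2 + dy**2 + delta**2                  # blob regularization
    u = (-dy / r2) @ gamma / (2 * np.pi)           # induced x-velocity
    v = ( dx / r2) @ gamma / (2 * np.pi)           # induced y-velocity
    return np.column_stack([u, v])

def convect(pos, gamma, u_inf=(1.0, 0.0), dt=0.01, n_sub=4):
    """Advance particles one time step with n_sub explicit Euler substeps."""
    for _ in range(n_sub):
        vel = induced_velocity(pos, gamma) + np.asarray(u_inf)
        pos = pos + vel * (dt / n_sub)
    return pos

# Hypothetical usage: a small cloud of particles shed behind a bluff body
rng = np.random.default_rng(1)
positions = rng.normal(scale=0.1, size=(200, 2))
circulations = rng.normal(scale=0.01, size=200)
positions = convect(positions, circulations)
print(positions[:3])
```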

Keywords: adaptation, fluid dynamic, remeshing, substepping, vortex particle method

Procedia PDF Downloads 246
5648 Refined Edge Detection Network

Authors: Omar Elharrouss, Youssef Hmamouche, Assia Kamal Idrissi, Btissam El Khamlichi, Amal El Fallah-Seghrouchni

Abstract:

Edge detection is one of the most challenging tasks in computer vision, due to the complexity of detecting the edges or boundaries in real-world images that contain objects of different types and scales, such as trees and buildings, as well as various backgrounds. Edge detection is also a key task for many computer vision applications. Using a set of backbones as well as attention modules, deep-learning-based methods have improved the detection of edges compared with traditional methods like Sobel and Canny. However, images of complex scenes still represent a challenge for these methods. Also, the edges detected by existing approaches suffer from unrefined results, and the output image contains many erroneous edges. To overcome this, in this paper a refined edge detection network (RED-Net) is proposed using the mechanism of residual learning. By maintaining the high resolution of edges during the training process and conserving the resolution of the edge image during the network stages, we connect the pooling outputs at each stage with the output of the previous layer. Also, after each layer, we use an affine batch normalization layer as an erosion operation for the homogeneous regions in the image. The proposed method is evaluated using the most challenging datasets, including BSDS500, NYUD, and Multicue. The obtained results outperform previously designed edge detection networks in terms of performance metrics and quality of output images.
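To illustrate the residual learning mechanism referred to above, a minimal PyTorch-style block is sketched below; it preserves spatial resolution and fuses a previous stage's output through a skip connection. This is an illustrative block, not the authors' RED-Net architecture.

```python
# A minimal PyTorch sketch of residual learning for edge features: the block
# preserves resolution and optionally fuses a previous stage's output.
# Illustrative only, not the RED-Net architecture proposed in the paper.
import torch
import torch.nn as nn

class ResidualEdgeBlock(nn.Module):
    def __init__(self, channels: int):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.bn1 = nn.BatchNorm2d(channels)          # affine batch norm by default
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x, prev_stage=None):
        identity = x
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        if prev_stage is not None:                   # fuse the previous stage's output
            out = out + prev_stage
        return self.relu(out + identity)             # residual connection

# Hypothetical usage on a feature map of size (batch, 32, H, W)
block = ResidualEdgeBlock(32)
x = torch.randn(1, 32, 128, 128)
print(block(x).shape)                                # torch.Size([1, 32, 128, 128])
```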

Keywords: edge detection, convolutional neural networks, deep learning, scale-representation, backbone

Procedia PDF Downloads 88
5647 The Impact of the COVID-19 Pandemic on the Armenian Higher Education System: Challenges and Perspectives

Authors: Armine Vahanyan

Abstract:

Humanity is still coping with the new COVID-19 pandemic. Healthcare providers, economists, psychologists, and other specialists speak about the impact of the virus on different spheres of our life. Among such discussions, the impact of the pandemic on global education is of utmost importance. Ideally, providing quality education services should be crucial, and the ways education programs are being adapted will determine the success or failure of the service providers. The paper aims to summarize research touching upon the current situation of higher education in Armenia. The research includes data from official reports and surveys among education leads, faculty, and students, as well as personal observations and considerations. Through descriptive analysis, the findings of the research are presented from various aspects. Interim results of the research unveiled two major issues in the sector of higher education in Armenia. On the one hand, the wholesale compulsory digitization of instruction, assessment, and grading has revealed serious gaps related to the lack of technical competencies. There is an urgent need for professional development programs that will address most of the concerns arising from the shift to the online instruction mode. On the other hand, online teaching and learning require revision and adaptation of the existing curricula. Given that the content of certain programs may not be compromised, the teaching methods, the assignments, and the evaluation require a profound transformation that will still be in line with course learning outcomes and student learning outcomes. The given paper focuses on the ways the mentioned issues are being addressed in Armenia. The extent of commitment to change and adaptability to the new situation varies between government-funded and private universities. In particular, the paper compares and contrasts the activities and measures taken at the Armenian State Pedagogical University and the American University of Armenia. The Pedagogical University focused on the use of Google Classroom as the only means for teaching and learning and adopted a compulsory synchronous instruction mode. The American University, on the contrary, kept practicing academic freedom, enabling both synchronous and asynchronous instruction modes and ensuring alignment of course learning outcomes and student learning outcomes. The State University utilized assignments and assessments that would work for the on-campus instruction mode, while the American University employed a variety of assignments applicable to the online teaching mode. The latter has suggested the use of multiple apps, internet sources, and online library access for better online instruction. Discussions with faculty through online forums and/or professional development workshops also facilitate the restructuring and adaptation of the courses. Finally, the paper will synthesize the results of the undertaken research and outline the e-learning perspectives and opportunities prompted by this devastating healthcare issue.

Keywords: assessment, compulsory digitization of education services, online teaching, instruction mode, program restructuring

Procedia PDF Downloads 112
5646 Study of Wake Dynamics for a Rim-Driven Thruster Based on Numerical Method

Authors: Bao Liu, Maarten Vanierschot, Frank Buysschaert

Abstract:

The present work examines the wake dynamics of a rim-driven thruster (RDT) with Computational Fluid Dynamics (CFD). The unsteady Reynolds-averaged Navier-Stokes (URANS) equations were solved in the commercial solver ANSYS Fluent in combination with the SST k-ω turbulence model. The moving reference frame (MRF) and sliding mesh (SM) approaches to handling the rotational movement of the propeller were compared in transient simulations. Validation and verification of the numerical model were performed to ensure numerical accuracy. Two representative scenarios were considered: the bollard condition (J = 0) and a very light loading condition (J = 0.7). The results confirm that, compared with the SM method, the MRF method is not suitable for resolving unsteady flow features, as it only yields the general mean flow and smooths out many characteristic details of the flow field. Based on the simulation results obtained with the SM technique, the instantaneous wake flow field under both conditions is presented and analyzed, most notably the helical vortex structure. The tip vortices, blade shed vortices, and hub vortices are all present in the wake flow field and convect downstream in a highly nonlinear way. The shear-layer vortices shed from the duct display a strong, irregular interaction with the distorted tip vortices.
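The two loading conditions are characterized by the advance coefficient J = Va / (n D), where Va is the advance speed, n the rotation rate, and D the rotor diameter. The short Python sketch below only illustrates how the quoted values of J arise; the speed, rotation rate, and diameter used are hypothetical and are not taken from the study.

```python
# Illustrative computation of the advance coefficient J = Va / (n * D), the
# parameter distinguishing the bollard (J = 0) and light-loading (J = 0.7)
# conditions. The numerical values below are hypothetical.

def advance_coefficient(advance_speed_m_s: float, rev_per_s: float, diameter_m: float) -> float:
    return advance_speed_m_s / (rev_per_s * diameter_m)

# Bollard condition: the thruster does not advance, so Va = 0 and J = 0.
print(advance_coefficient(0.0, 20.0, 0.2))   # 0.0

# Light loading: e.g. Va = 2.8 m/s, n = 20 rev/s, D = 0.2 m gives J = 0.7.
print(advance_coefficient(2.8, 20.0, 0.2))   # 0.7
```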

Keywords: computational fluid dynamics, rim-driven thruster, sliding mesh, wake dynamics

Procedia PDF Downloads 227
5645 Applying Image Schemas and Cognitive Metaphors to Teaching/Learning Italian Preposition a in Foreign/Second Language Context

Authors: Andrea Fiorista

Abstract:

The learning of prepositions is a rather problematic aspect of foreign language instruction, and Italian is certainly no exception. In their prototypical function, prepositions express schematic relations between two entities in a highly abstract, typically image-schematic way. In other terms, prepositions encode concepts such as directionality, the location of objects in space and time and, in Cognitive Linguistics' terms, the position of a trajector with respect to a landmark. Learners with different native languages may conceptualize them differently, which means they must carry out a recategorization (or create new categories) that fits the target language. However, most current Italian Foreign/Second Language handbooks and didactic grammars do not help learners carry out this task, as they tend to provide partial and idiosyncratic descriptions, which learners then try to memorize, most of the time without success. In their prototypical meaning, prepositions specify precise topographical positions in the physical environment, which become less and less accurate as they radiate out from what might be termed a concrete prototype. Accordingly, the present study aims to elaborate a cognitive and conceptually well-grounded analysis of some extended uses of the Italian preposition a, in order to propose effective pedagogical solutions for the teaching/learning process. Image schemas, cognitive metaphors, and embodiment are efficient cognitive tools for such a task. While learning the merely spatial use of the preposition a (e.g. Sono a Roma = I am in Rome; vado a Roma = I am going to Rome, …) is quite straightforward, matters are more complex when a appears in constructions such as verbs of motion + a + infinitive (e.g. Vado a studiare = I am going to study), the inchoative periphrasis (e.g. Tra poco mi metto a leggere = In a moment I will start reading), and the causative construction (e.g. Lui mi ha mandato a lavorare = He sent me to work). The study reports data from a Focus on Form teaching intervention, in which a basic cognitive schema is used to help teachers explain, and students understand, the extended uses of a. The educational material employed translates Cognitive Linguistics' theoretical assumptions, such as image schemas and cognitive metaphors, into simple images or proto-scenes that are easily comprehensible for learners. Illustrative material, indeed, is meant to make metalinguistic content more accessible. Moreover, the concept of embodiment is applied pedagogically through activities involving motion and learners' bodily engagement. It is expected that replacing rote learning with a methodology that gives grammatical elements a proper meaning makes the learning process more effective in both the short and the long term.

Keywords: cognitive approaches to language teaching, image schemas, embodiment, Italian as FL/SL

Procedia PDF Downloads 76
5644 Measuring the Unmeasurable: A Project of High Risk Families Prediction and Management

Authors: Peifang Hsieh

Abstract:

The prevention of child abuse has aroused serious concern in Taiwan because of the disparity between the number of reported child abuse cases, which doubled over the past decade, and the scarcity of social workers. New Taipei City is the most populous city in Taiwan, and over 70% of its 4 million citizens belong to migrant families in which children's needs can easily be neglected owing to insufficient support from relatives and communities. The city therefore urgently needs a social support system that preemptively identifies and reaches out to families at high risk of child abuse, so as to offer timely assistance and preventive measures that safeguard children's welfare. Big data analysis is the inspiration. Since high-risk families share certain characteristics, New Taipei City decided to consolidate detailed background data from the departments of social affairs, education, labor, and health (for example, the parents' employment and health status, and whether they are imprisoned, fugitives, or substance abusers) and to cross-reference them for the accurate and prompt identification of high-risk families in need. The Service Center for High-Risk Families (SCHF) was established to integrate these data across departments. Using the machine learning random forest method to build a risk prediction model that can detect early the families most likely to experience child abuse, the SCHF marks high-risk families red, yellow, or green to indicate the urgency of intervention, so that the families concerned can be provided with timely services. The accuracy and recall rates of the model were 80% and 65%, respectively. This prediction model can not only improve the child abuse prevention process by helping social workers differentiate the risk levels of newly reported cases, which may significantly reduce their workload, but can also be referenced in future policy-making.
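The abstract does not describe the modelling pipeline; the scikit-learn sketch below only illustrates how a random forest classifier of the kind mentioned above could be trained and then evaluated on accuracy and recall. The synthetic data, class balance, and hyperparameters are assumptions for illustration and are not the SCHF's cross-departmental records or model settings.

```python
# Minimal sketch of a random-forest risk model evaluated on accuracy and
# recall, the two metrics quoted in the abstract. The data are synthetic
# stand-ins for consolidated background features (employment, health,
# imprisonment, substance-abuse flags, ...); label 1 marks a high-risk family.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, recall_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=5000, n_features=12,
                           weights=[0.85, 0.15], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, stratify=y, test_size=0.3, random_state=0)

model = RandomForestClassifier(n_estimators=200, class_weight="balanced",
                               random_state=0)
model.fit(X_train, y_train)

y_pred = model.predict(X_test)
print("accuracy:", accuracy_score(y_test, y_pred))  # the study reports 80%
print("recall:  ", recall_score(y_test, y_pred))    # the study reports 65%
```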

Keywords: child abuse, high-risk families, big data analysis, risk prediction model

Procedia PDF Downloads 118
5643 The Analysis of Gizmos Online Program as Mathematics Diagnostic Program: A Story from an Indonesian Private School

Authors: Shofiayuningtyas Luftiani

Abstract:

Some private schools in Indonesia have started integrating the online program Gizmos into the teaching-learning process. Gizmos was developed to supplement the existing curriculum by being integrated into instructional programs. The program offers inquiry-based simulations in which students conduct explorations using a worksheet, while teachers use the teacher guidelines to direct and assess students' performance. In this study, the discussion of Gizmos highlights its features as an assessment medium for mathematics learning of secondary school students. The discussion is based on a case study and a literature review from the Indonesian context. The purpose of applying Gizmos as an assessment medium is diagnostic assessment. As part of the diagnostic assessment, teachers review the student exploration sheets, analyze the students' difficulties in particular, and consider the findings when planning the future learning process. This assessment is important because the teacher needs data about students' persistent weaknesses. In addition, the program helps build students' understanding through its interactive simulations. Currently, the assessment over-emphasizes students' worksheet answers judged against the provided answer keys, even though students exercise skills in interpreting the question, running the simulation, and answering the question. Instead, the assessment should draw on multiple perspectives and sources of evidence of students' performance, since the teacher should adjust instructional programs to the complexity of students' learning needs and styles. Consequently, an approach to improving the assessment components is selected to challenge the current assessment, the purpose being to involve not only cognitive diagnosis but also skills and error analysis. In the selected setting, where the diagnostic assessment combines cognitive diagnosis, skills analysis, and error analysis, the teachers should create an assessment rubric. The rubric plays an important role as a guide that provides a set of assessment criteria. Without a precise rubric, the teacher risks documenting and following up the data on students at risk of failure ineffectively. Furthermore, teachers who employ Gizmos for diagnostic assessment may encounter obstacles; given the assessment conditions in the selected setting, these involve time constraints, reluctance to take on a higher teaching burden, and students' behavior. Consequently, the teacher who chooses Gizmos with these approaches has to plan, implement, and evaluate the assessment. The main point of this assessment is not the result in the students' worksheets; rather, the diagnostic assessment is a two-stage process that prompts and effectively follows up both individual weaknesses and those of the learning process. Ultimately, the discussion of Gizmos as a medium of diagnostic assessment is part of the effort to improve the mathematics learning process.

Keywords: diagnostic assessment, error analysis, Gizmos online program, skills analysis

Procedia PDF Downloads 166