Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3

Assessment Design Related Abstracts

3 Removing Barriers in Assessment and Feedback for Blind Students in Open Distance Learning

Authors: Sindile Ngubane-Mokiwa

Abstract:

This paper addresses two questions: (1) What barriers do blind students face with assessment and feedback in open distance learning contexts? (2) How can these barriers be removed? The paper focuses on distance education, through which most students with disabilities improve their chances of accessing higher education. A lack of genuine inclusion is evident in the challenges blind students face during assessment, at both formative and summative stages. The insights in this paper emanate from a case study carried out through qualitative approaches. Data were collected through in-depth interviews, life stories, and telephonic interviews. The paper reviews local, continental, and international views on how assessment barriers can best be removed. A group of five blind students, comprising two honours students, two master's students, and one doctoral student, participated in the study. Data were analysed thematically. The findings revealed that (a) feedback on assignments is often inaccessible; (b) the software used is incompatible; (c) learning and assessment are designed in exclusionary ways; (d) assessment facilities are not conducive; and (e) proactive, innovative assessment strategies are lacking. The article concludes by recommending ways in which barriers to assessment can be removed, including addressing inclusive assessment and feedback strategies in professional development initiatives.

Keywords: disabilities, feedback, universal design for learning, barriers, assessment design, blind students

2 Designing Automated Embedded Assessment to Assess Student Learning in a 3D Educational Video Game

Authors: Mehmet Oren, Susan Pedersen, Sevket C. Cetin

Abstract:

Despite its frequently criticized disadvantages, traditional paper-and-pencil assessment remains the most frequently used method in our schools. Although such assessments provide acceptable measurement, they cannot capture all the aspects and richness of learning and knowledge. Many assessments used in schools also decontextualize assessment from learning: they focus on a learner's standing on a particular topic but not on how student learning changes over time. For these reasons, many scholars advocate using simulations and games (S&G) as assessment tools with significant potential to overcome the problems of traditional methods. S&G can benefit from changes in technology and provide a contextualized medium for assessment and teaching. Furthermore, S&G can serve as an instructional tool rather than a method for testing students' learning at a single time point. To investigate the potential of educational games as assessment and teaching tools, this study presents the implementation and validation of an automated embedded assessment (AEA) that can continuously monitor student learning in a game and assess performance without interrupting learning. The experiment was conducted in an undergraduate engineering course (Digital Circuit Design) with 99 participating students over five weeks of the Spring 2016 semester. The purpose of this study is to examine whether the proposed AEA is valid for assessing student learning in a 3D educational game and to present the implementation steps. To address this question, the study inspects three aspects of the AEA for validation. First, the evidence-centered design model was used to lay out the design and measurement steps of the assessment. Then, a confirmatory factor analysis was conducted to test whether the assessment measures the targeted latent constructs. Finally, the assessment scores were compared with an external measure (a validated test of student learning in digital circuit design) to evaluate the convergent validity of the assessment. The confirmatory factor analysis showed that the fit of the model with three latent factors and one higher-order factor was acceptable (RMSEA < 0.00, CFI = 1, TLI = 1.013, WRMR = 0.390). All observed variables loaded significantly on the latent factors. In the second analysis, a multiple regression was used to test whether the external measure significantly predicts students' performance in the game. The regression indicated that the two predictors explained 36.3% of the variance (R² = .36, F(2, 96) = 27.42, p < .00). Students' posttest scores significantly predicted game performance (β = .60, p < .000). These statistical results show that the AEA can distinctly measure three major components of the digital circuit design course. This study is intended to help researchers understand how to design an AEA and showcases an implementation by providing an example methodology for validating this type of assessment.
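The convergent-validity step described in this abstract amounts to an ordinary least-squares regression of game performance on external test scores, with R² reporting the variance explained. The sketch below illustrates that procedure only; the synthetic data, variable names, and coefficients are assumptions for illustration, not the study's actual dataset or results.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data for n students: pretest and posttest scores on an
# external validated test, plus an in-game performance score.
n = 99
pretest = rng.normal(50.0, 10.0, n)
posttest = pretest + rng.normal(10.0, 8.0, n)
game_score = 0.6 * posttest + 0.1 * pretest + rng.normal(0.0, 6.0, n)

# Ordinary least squares with an intercept column.
X = np.column_stack([np.ones(n), pretest, posttest])
beta, *_ = np.linalg.lstsq(X, game_score, rcond=None)

# R^2: share of variance in game performance explained by the two predictors.
pred = X @ beta
ss_res = np.sum((game_score - pred) ** 2)
ss_tot = np.sum((game_score - game_score.mean()) ** 2)
r_squared = 1.0 - ss_res / ss_tot
print(f"coefficients = {beta}, R^2 = {r_squared:.3f}")
```

A full replication would add the F-test and coefficient significance tests (e.g. via a statistics package), but the R² computed here is the quantity the abstract reports as the variance explained.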

Keywords: assessment design, educational video games, automated embedded assessment, assessment validation, game-based assessment

1 Bridging the Divide: Mixed-Method Analysis of Student Engagement and Outcomes in Diverse Postgraduate Cohorts

Authors: A. Knox

Abstract:

Student diversity in postgraduate classes poses major challenges for educators seeking to encourage student engagement and desired learning outcomes. This paper outlines the impact of a set of teaching initiatives aimed at addressing the challenges of teaching and learning in an environment characterized by a diverse student cohort. The study examines postgraduate students completing the core capstone unit of a specialized business degree. Although relatively small, the student cohort is highly diverse in the cultural backgrounds represented, prior learning and/or qualifications, and the duration and type of work experience relevant to the degree being completed. This wide range of cultures, existing knowledge, and experience creates enormous challenges with respect to students' learning needs and outcomes. Consequently, a suite of teaching innovations has been adopted to enhance curriculum content and delivery and the design of assessments. This paper explores the impact of these specific teaching and learning practices, examining how they have supported students' diverse needs and enhanced their learning outcomes. Data from surveys and focus groups are used to assess the effectiveness of these practices. The results highlight the effectiveness of peer-assisted learning, cultural competence-building, and advanced assessment options in addressing diverse student needs and enhancing student engagement and learning outcomes. These findings suggest that such practices would benefit students' learning in environments marked by cohort diversity. Specific recommendations are offered for other educators working with diverse classes.

Keywords: assessment design, curriculum content, curriculum delivery, student diversity
