Evaluating the Effectiveness of Electronic Response Systems in Technology-Oriented Classes

Authors: Ahmad Salman

Abstract:

Electronic Response Systems such as Kahoot, Poll Everywhere, and Google Classroom are gaining popularity for surveying audiences at events, in meetings, and in classrooms. This popularity stems largely from their ease of use and convenience, since these tools provide mobile applications with simple user interfaces. In this paper, we present a case study on the effectiveness of Electronic Response Systems in improving student participation and the learning experience in the classroom. We use a polling application for class exercises in two different technology-oriented classes. We evaluate its effectiveness through a statistical analysis of students' performance in these two classes, compared with the performance of students who took the same classes without using the polling application for class participation. Our results show that students who used the Electronic Response System outperformed those who did not by an average of 11%.
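To make the cohort comparison concrete, the sketch below shows one way such an analysis could be carried out in Python. It is an illustration only: the scores are invented placeholder data, the variable names are hypothetical, and the Welch two-sample t-test is an assumed choice of test; it does not reproduce the authors' actual data or statistical procedure.

# Illustrative sketch only: hypothetical scores, not the study's data.
import numpy as np
from scipy import stats

# Hypothetical final-grade percentages for two cohorts of the same class:
# one that used the Electronic Response System (ERS) and one that did not.
with_ers = np.array([82, 75, 90, 68, 88, 79, 85, 91, 73, 84], dtype=float)
without_ers = np.array([70, 65, 80, 62, 77, 71, 74, 83, 60, 72], dtype=float)

mean_with = with_ers.mean()
mean_without = without_ers.mean()
relative_gain = (mean_with - mean_without) / mean_without * 100

# Welch's t-test: compares the two cohort means without assuming equal variances.
t_stat, p_value = stats.ttest_ind(with_ers, without_ers, equal_var=False)

print(f"Mean score with ERS:    {mean_with:.1f}")
print(f"Mean score without ERS: {mean_without:.1f}")
print(f"Relative gain:          {relative_gain:.1f}%")
print(f"Welch's t = {t_stat:.2f}, p = {p_value:.4f}")

Welch's test is used here because two separate course offerings need not have equal score variance; any other between-groups test could be substituted under different assumptions.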

Keywords: Interactive learning, classroom technology, electronic response systems, polling applications, learning evaluation.

Digital Object Identifier (DOI): doi.org/10.6084/m9.figshare.12489896

