Eye Tracking: Biometric Evaluations of Instructional Materials for Improved Learning

Author: Janet Holland

Abstract:

Eye tracking provides a way to triangulate multiple data sources for a deeper, more complete understanding of how instructional materials are actually used and of the emotional connections learners make with them. Sensor-based biometrics deliver detailed, localized analysis in real time, expanding the ability to collect objective, science-based data on teaching and learning at a level not previously possible. The knowledge gained can be used to improve instructional materials, tools, and interactions. The literature was examined and a preliminary pilot test was implemented to develop a methodology for research in Instructional Design and Technology. Eye tracking, combined with other biometric data collection and analysis, now adds objective metrics that offer a fresh perspective.

Keywords: Area of interest, eye tracking, biometrics, fixation, fixation count, fixation sequence, fixation time, gaze points, heat map, saccades, time to first fixation.
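
The metrics listed above can be made concrete with a short sketch. The following is a minimal, illustrative example rather than the analysis pipeline used in the study: it assumes fixation events have already been exported from an eye tracker, and the field names, AOI coordinates, and sample values are hypothetical. It computes three of the listed metrics (fixation count, total fixation time, and time to first fixation) for one rectangular area of interest.

# Illustrative sketch only: the Fixation fields, AOI coordinates, and sample
# values below are assumptions for demonstration, not data from the study.
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class Fixation:
    # A single fixation event exported from an eye tracker.
    x: float            # horizontal gaze position in pixels
    y: float            # vertical gaze position in pixels
    start_ms: float     # onset time relative to stimulus onset
    duration_ms: float  # fixation duration

def in_aoi(f: Fixation, aoi: Tuple[float, float, float, float]) -> bool:
    # True if the fixation lands inside a rectangular AOI (left, top, right, bottom).
    left, top, right, bottom = aoi
    return left <= f.x <= right and top <= f.y <= bottom

def aoi_metrics(fixations: List[Fixation],
                aoi: Tuple[float, float, float, float]) -> dict:
    # Fixation count, total fixation time, and time to first fixation for one AOI.
    hits = [f for f in fixations if in_aoi(f, aoi)]
    ttff: Optional[float] = min((f.start_ms for f in hits), default=None)
    return {
        "fixation_count": len(hits),
        "fixation_time_ms": sum(f.duration_ms for f in hits),
        "time_to_first_fixation_ms": ttff,
    }

if __name__ == "__main__":
    # Hypothetical fixations on a 1920x1080 instructional slide; the AOI covers
    # a diagram in the upper-right quadrant.
    data = [
        Fixation(x=400, y=300, start_ms=120, duration_ms=180),
        Fixation(x=1500, y=250, start_ms=340, duration_ms=260),
        Fixation(x=1550, y=300, start_ms=620, duration_ms=310),
    ]
    print(aoi_metrics(data, aoi=(960, 0, 1920, 540)))

In practice these quantities would be reported by the eye tracker's own analysis software; the sketch only shows how the terms relate to the underlying fixation data.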

Digital Object Identifier (DOI): doi.org/10.5281/zenodo.3346724

