Analysis of Aiming Performance for Games Using Mapping Method of Corneal Reflections Based on Two Different Light Sources
Authors: Yoshikazu Onuki, Itsuo Kumazawa
Abstract:
The fundamental motivation of this paper is how gaze estimation can be used effectively in game applications. In games, precise point-of-gaze (POG) estimation is not always essential for aiming at targets; the ability to move a cursor onto an intended target accurately is equally significant. Moreover, from a game-production point of view, expressing head movement and gaze movement separately can be advantageous for conveying a sense of presence; panning a background image with head movement while moving a cursor with gaze movement is a representative example. On the other hand, the widely used POG estimation technique is based on the relative position between the center of the corneal reflections of infrared light sources and the center of the pupil. However, computing the pupil center requires relatively complicated image processing, so processing delay is a concern, since minimizing input delay is one of the most significant requirements in games. In this paper, a method is proposed that estimates head movement using only the corneal reflections of two infrared light sources placed at different locations. Furthermore, a method to control a cursor using gaze movement as well as head movement is proposed. The proposed methods are evaluated with game-like applications; the results confirm performance similar to conventional methods, achieving aiming control with lower computational cost and stress-free, intuitive operation.
Keywords: Point-of-gaze, gaze estimation, head movement, corneal reflections, two infrared light sources, game.
Digital Object Identifier (DOI): https://doi.org/10.5281/zenodo.1082387
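The core idea of the abstract — tracking head movement from the glints of two infrared light sources alone, without computing the pupil center — can be illustrated with a minimal sketch. This is an assumption-laden illustration, not the paper's exact algorithm: it assumes the two glint centers `g1` and `g2` have already been detected as pixel coordinates, and that a calibration step recorded a reference glint midpoint and inter-glint distance.

```python
# Illustrative sketch (not the authors' exact method): estimating head
# movement from the corneal reflections (glints) of two infrared light
# sources. Glint detection and calibration are assumed to be done elsewhere.

def head_movement(g1, g2, ref_mid, ref_dist):
    """Estimate lateral head movement and a relative depth cue.

    g1, g2   : current glint centers as (x, y) pixel coordinates
    ref_mid  : glint midpoint (x, y) recorded at calibration
    ref_dist : inter-glint distance recorded at calibration
    """
    # Lateral head movement shifts both glints together, so the
    # displacement of their midpoint tracks it.
    mid = ((g1[0] + g2[0]) / 2.0, (g1[1] + g2[1]) / 2.0)
    dx = mid[0] - ref_mid[0]
    dy = mid[1] - ref_mid[1]
    # Moving toward or away from the camera changes the apparent glint
    # separation, giving a coarse depth cue.
    dist = ((g1[0] - g2[0]) ** 2 + (g1[1] - g2[1]) ** 2) ** 0.5
    depth_ratio = dist / ref_dist if ref_dist else 1.0
    return dx, dy, depth_ratio
```

Because only two bright-spot centroids must be located per frame, such a scheme avoids the heavier image processing needed for pupil-center extraction, which is the latency advantage the abstract argues for.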