Cursor Position Estimation Model for Virtual Touch Screen Using Camera

Authors: Somkiat Wangsiripitak

Abstract:

A camera-based virtual touch screen is an ordinary screen that uses a camera to imitate a touch screen: it captures an image of an indicator, e.g., a finger, placed on the screen, converts the position of the indicator tip in the image to a position on the screen, and moves the cursor to that position. In practice, the indicator does not touch the screen surface directly; a cover separates them by some gap. Despite this gap, if the eye-indicator-camera angle is small, the mapping from indicator tip positions in the image to the corresponding cursor positions on the screen is straightforward and incurs only a small error. The larger the angle, however, the larger the mapping error becomes. This paper proposes a cursor position estimation model for a camera-based virtual touch screen that eliminates this kind of error. The proposed model (i) moves an on-screen pilot cursor to the screen position that lies directly behind the indicator tip as seen from the camera, and then (ii) converts that pilot cursor position to the desired cursor position (the position on the screen as seen from the user's eye through the indicator tip) by means of a bilinear transformation. Simulation results confirm the correctness of the cursor position estimated by the proposed model.
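Step (ii) of the model maps the camera-aligned pilot cursor position to the eye-aligned cursor position with a bilinear transformation. A minimal sketch of such a transformation is shown below; the four corner correspondences used for calibration are hypothetical illustration values, not data from the paper.

```python
def _solve4(A, b):
    """Solve a 4x4 linear system A x = b by Gaussian elimination
    with partial pivoting."""
    n = 4
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        # Pick the row with the largest entry in this column as pivot.
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_bilinear(src, dst):
    """Fit u = c0 + c1*x + c2*y + c3*x*y (and likewise v) from four
    (x, y) -> (u, v) corner correspondences."""
    A = [[1.0, x, y, x * y] for x, y in src]
    cu = _solve4(A, [u for u, _ in dst])
    cv = _solve4(A, [v for _, v in dst])
    return cu, cv

def apply_bilinear(coeffs, x, y):
    """Map a pilot cursor position (x, y) to the estimated cursor position."""
    cu, cv = coeffs
    basis = [1.0, x, y, x * y]
    return (sum(c * b for c, b in zip(cu, basis)),
            sum(c * b for c, b in zip(cv, basis)))
```

In use, the four `src` points would be pilot cursor positions recorded while the user touches the four screen corners, and the four `dst` points the true corner coordinates; any interior pilot position is then mapped by `apply_bilinear`.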

Keywords: Bilinear transformation, cursor position, pilot cursor, virtual touch screen.

Digital Object Identifier (DOI): https://doi.org/10.5281/zenodo.1334928

