Users’ Preferences for Map Navigation Gestures
Authors: Y. Y. Pang, N. A. Ismail
Abstract:
Maps are powerful and convenient tools for navigating to different places, but the use of indirect input devices often makes them cumbersome to operate. This study proposes a new map navigation dialogue based on hand gestures. A set of dialogues was developed from the users' perspective to give users complete freedom in panning, zooming, rotating, tilting and finding directions. A participatory design experiment was conducted in which one-hand and two-hand gesture dialogues were analysed to develop a set of usable dialogues. The major finding was that users prefer one-hand gestures over two-hand gestures for map navigation.
Keywords: Hand gesture, map navigation, participatory design, intuitive interaction.
Digital Object Identifier (DOI): doi.org/10.5281/zenodo.1337801
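To make the notion of a gesture dialogue concrete, the following sketch (hypothetical, not taken from the paper) shows one way recognised one-hand gestures could be dispatched to the five map operations studied: panning, zooming, rotating, tilting and finding directions. The gesture names and the MapView methods are assumptions for illustration only.

```python
# Hypothetical sketch (not from the paper): dispatching recognised one-hand
# gestures to the five map navigation operations studied (pan, zoom, rotate,
# tilt, find direction). Gesture names and the MapView API are assumptions.
from enum import Enum, auto


class Gesture(Enum):
    SWIPE = auto()        # one-hand swipe -> pan
    PINCH = auto()        # thumb-index pinch -> zoom
    CIRCLE = auto()       # circular hand motion -> rotate
    PUSH_PULL = auto()    # push/pull along the depth axis -> tilt
    POINT = auto()        # pointing at a destination -> find direction


class MapView:
    """Minimal stand-in for a map widget; methods are illustrative only."""

    def pan(self, dx: float, dy: float) -> None:
        print(f"pan by ({dx}, {dy})")

    def zoom(self, factor: float) -> None:
        print(f"zoom by factor {factor}")

    def rotate(self, degrees: float) -> None:
        print(f"rotate by {degrees} degrees")

    def tilt(self, degrees: float) -> None:
        print(f"tilt by {degrees} degrees")

    def find_direction(self, target: str) -> None:
        print(f"route to {target}")


def handle_gesture(view: MapView, gesture: Gesture, **params) -> None:
    """Map a recognised gesture to the corresponding map operation."""
    if gesture is Gesture.SWIPE:
        view.pan(params.get("dx", 0.0), params.get("dy", 0.0))
    elif gesture is Gesture.PINCH:
        view.zoom(params.get("factor", 1.0))
    elif gesture is Gesture.CIRCLE:
        view.rotate(params.get("degrees", 0.0))
    elif gesture is Gesture.PUSH_PULL:
        view.tilt(params.get("degrees", 0.0))
    elif gesture is Gesture.POINT:
        view.find_direction(params.get("target", "destination"))


if __name__ == "__main__":
    view = MapView()
    handle_gesture(view, Gesture.SWIPE, dx=40.0, dy=-10.0)
    handle_gesture(view, Gesture.PINCH, factor=1.5)
```

A single dispatch point such as handle_gesture keeps the gesture vocabulary easy to revise as participatory design iterations refine which gestures users prefer.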