{"title":"Sound Selection for Gesture Sonification and Manipulation of Virtual Objects","authors":"Benjamin Bressolette, S\u00b4ebastien Denjean, Vincent Roussarie, Mitsuko Aramaki, S\u00f8lvi Ystad, Richard Kronland-Martinet","volume":121,"journal":"International Journal of Mechanical and Mechatronics Engineering","pagesStart":147,"pagesEnd":153,"ISSN":"1307-6892","URL":"https:\/\/publications.waset.org\/pdf\/10006330","abstract":"New sensors and technologies – such as microphones,
touchscreens, or infrared sensors – are making their way into the automotive sector, introducing new kinds of Human-Machine Interfaces (HMIs). Interacting with such tools can be cognitively demanding, and therefore unsuitable during driving tasks. For instance, using a touchscreen with visual feedback while driving can be dangerous, as it draws the driver’s visual attention away from the road. Furthermore, new technologies in car cockpits change how users interact with the central system. In particular, touchscreens are preferred to arrays of buttons to save space and for design purposes. However, the tactile feedback that buttons provide is no longer available to the driver, which makes such interfaces harder to operate while driving. Gestures combined with auditory feedback might therefore constitute an interesting alternative for interacting with the HMI. Indeed, gestures can be performed without vision, so the driver’s visual attention can remain fully dedicated to the driving task. The auditory feedback can inform the driver both about the task performed on the interface and about the gesture itself, which may compensate for the missing tactile information. As audition is a relatively underused sense in automotive contexts, gesture sonification can help reduce the cognitive load through this multisensory exploitation. Our approach uses a virtual object (VO) to sonify the consequences of the gesture rather than the gesture itself. This approach is motivated by an ecological point of view: gestures do not make sound, but their consequences do. The aim of this experiment was to identify efficient sound strategies for transmitting dynamic information about VOs to users through sound. The swipe gesture was chosen for this purpose, as it is commonly used in current and emerging interfaces. We chose two VO parameters to sonify: the hand-VO distance and the VO velocity. Two kinds of sound parameters can be used to sonify the VO behavior: spectral or temporal parameters. Pitch and brightness were tested as spectral parameters, and amplitude modulation as a temporal parameter. Performance showed a positive effect of sound
compared to a no-sound condition, revealing the usefulness of sounds for accomplishing the task.","references":"[1] G. Kramer et al., Sonification report: Status of the field and research agenda, 2010.
[2] J. M. Loomis, R. G. Golledge, and R. L. Klatzky, Navigation system for the blind: Auditory display modes and guidance, Presence: Teleoperators and Virtual Environments, vol. 7, no. 2, pp. 193-203, 1998.
[3] K. Wegner, Surgical navigation system and method using audio feedback, 1998.
[4] M. Dozza, L. Chiari, and F. B. Horak, A portable audio-biofeedback system to improve postural control, in Engineering in Medicine and Biology Society (IEMBS'04), 26th Annual International Conference of the IEEE, 2004, vol. 2, pp. 4799-4802.
[5] J. Danna et al., The effect of real-time auditory feedback on learning new characters, Human Movement Science, vol. 43, pp. 216-228, Oct. 2015.
[6] J. Danna et al., Handwriting movement sonification for the rehabilitation of dysgraphia, in 10th International Symposium on Computer Music Multidisciplinary Research (CMMR) - Sound, Music and Motion - 15-18 Oct. 2013 - Marseille, France, 2013, pp. 200-208.
[7] M. Aramaki, C. Gondre, R. Kronland-Martinet, T. Voinier, and S. Ystad, Thinking the sounds: an intuitive control of an impact sound synthesizer, in International Conference on Auditory Display (ICAD'09), 2009, pp. 119-124.
[8] G. Parseihian, C. Gondre, M. Aramaki, S. Ystad, and R. Kronland-Martinet, Comparison and evaluation of sonification strategies for guidance tasks, IEEE Transactions on Multimedia, vol. 18, no. 4, pp. 674-686, 2016.
[9] G. Jakus, C. Dicke, and J. Sodnik, A user study of auditory, head-up and multi-modal displays in vehicles, Applied Ergonomics, vol. 46, pp. 184-192, Jan. 2015.
[10] M. Rath and R. Schleicher, On the relevance of auditory feedback for quality of control in a balancing task, Acta Acustica united with Acustica, vol. 94, no. 1, pp. 12-20, 2008.
[11] I. S. MacKenzie, Fitts' law as a research and design tool in human-computer interaction, Human-Computer Interaction, vol. 7, no. 1, pp. 91-139, 1992.
[12] P. M. Fitts, The information capacity of the human motor system in controlling the amplitude of movement, Journal of Experimental Psychology, vol. 47, no. 6, p. 381, 1954.
[13] M. M. J. Houben, The sound of rolling objects: perception of size and speed, PhD thesis, 2002.
[14] S. Conan, O. Derrien, M. Aramaki, S. Ystad, and R. Kronland-Martinet, A synthesis model with intuitive control capabilities for rolling sounds, IEEE\/ACM Transactions on Audio, Speech, and Language Processing, vol. 22, no. 8, pp. 1260-1273, 2014.
[15] E. Thoret, M. Aramaki, R. Kronland-Martinet, J.-L. Velay, and S. Ystad, From sound to shape: auditory perception of drawing movements, Journal of Experimental Psychology: Human Perception and Performance, vol. 40, no. 3, p. 983, 2014.
[16] Leap Motion - Mac & PC Motion Controller for Games, Design, Virtual Reality & More (online). Available: https:\/\/www.leapmotion.com\/
[17] Spat (online). Available: http:\/\/forumnet.ircam.fr\/fr\/produit\/spat\/
[18] Unity - Game Engine (online). Available: https:\/\/unity3d.com\/fr\/
[19] A. Merer, M. Aramaki, S. Ystad, and R. Kronland-Martinet, Perceptual characterization of motion evoked by sounds for synthesis control purposes, ACM Transactions on Applied Perception, vol. 10, no. 1, pp. 1-24, Feb. 2013.
[20] M. M. Houben, A. Kohlrausch, and D. Hermes, Auditory cues determining the perception of the size and speed of rolling balls, 2001.
[21] M. M. J. Houben, A. Kohlrausch, and D. J. Hermes, Perception of the size and speed of rolling balls by sound, Speech Communication, vol. 43, no. 4, pp. 331-345, Sep. 2004.","publisher":"World Academy of Science, Engineering and Technology","index":"Open Science Index 121, 2017"}