Sound Selection for Gesture Sonification and Manipulation of Virtual Objects

Authors: Benjamin Bressolette, Sébastien Denjean, Vincent Roussarie, Mitsuko Aramaki, Sølvi Ystad, Richard Kronland-Martinet

Abstract:

New sensors and technologies, such as microphones, touchscreens, and infrared sensors, are currently making their appearance in the automotive sector, introducing new kinds of Human-Machine Interfaces (HMIs). Interactions with such tools can be cognitively demanding and therefore unsuitable for driving tasks. For instance, using touchscreens with visual feedback while driving can be dangerous, as it draws the driver's visual attention away from the road. Furthermore, new technologies in car cockpits modify how users interact with the central system. In particular, touchscreens are preferred to arrays of buttons to save space and for design purposes. However, the tactile feedback of buttons is no longer available to the driver, which makes such interfaces more difficult to manipulate while driving. Gestures combined with auditory feedback might therefore constitute an interesting alternative for interacting with the HMI. Indeed, gestures can be performed without vision, so the driver's visual attention can be fully dedicated to the driving task. The auditory feedback can inform the driver both about the task performed on the interface and about the gesture itself, which may compensate for the lack of tactile information. As audition is a relatively underused sense in automotive contexts, gesture sonification can contribute to reducing cognitive load through this multisensory exploitation. Our approach consists in using a virtual object (VO) to sonify the consequences of the gesture rather than the gesture itself. This approach is motivated by an ecological point of view: gestures do not make sound, but their consequences do. The aim of this experiment was to identify efficient sound strategies for transmitting the dynamic information of VOs to users through sound. The swipe gesture was chosen for this purpose, as it is commonly used in current and emerging interfaces. We chose two VO parameters to sonify: the hand-VO distance and the VO velocity. Two kinds of sound parameters can be chosen to sonify the VO behavior: spectral or temporal parameters. Pitch and brightness were tested as spectral parameters, and amplitude modulation as a temporal parameter. Performance showed a positive effect of sound compared to a no-sound situation, revealing the usefulness of sound in accomplishing the task.
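
As a concrete illustration of the three mapping strategies named in the abstract, the sketch below maps a normalized VO parameter (e.g., the hand-VO distance or the VO velocity) onto pitch, brightness, and amplitude-modulation rate. This is a minimal sketch, not the authors' implementation: all function names, parameter ranges, and mapping shapes are illustrative assumptions.

# Minimal sketch (illustrative, not the authors' method) of mapping a
# virtual-object (VO) parameter onto the three sound parameters tested:
# pitch and brightness (spectral) and amplitude modulation (temporal).
# All ranges and mapping shapes below are assumptions.

def normalize(value, lo, hi):
    """Clamp and scale a VO parameter (e.g., hand-VO distance) into [0, 1]."""
    return min(max((value - lo) / (hi - lo), 0.0), 1.0)

def pitch_mapping(x, f_min=220.0, f_max=880.0):
    """Spectral strategy 1: normalized parameter -> fundamental frequency (Hz)."""
    return f_min * (f_max / f_min) ** x

def brightness_mapping(x, fc_min=500.0, fc_max=8000.0):
    """Spectral strategy 2: normalized parameter -> low-pass cutoff (Hz),
    shifting the spectral centroid, i.e., perceived brightness."""
    return fc_min * (fc_max / fc_min) ** x

def am_mapping(x, rate_min=2.0, rate_max=16.0):
    """Temporal strategy: normalized parameter -> amplitude-modulation rate (Hz)."""
    return rate_min + x * (rate_max - rate_min)

# Example: sonifying the hand-VO distance (assumed range 0 to 0.5 m).
distance = 0.12
x = normalize(distance, 0.0, 0.5)
print(f"pitch: {pitch_mapping(x):.1f} Hz, "
      f"brightness cutoff: {brightness_mapping(x):.0f} Hz, "
      f"AM rate: {am_mapping(x):.1f} Hz")

The exponential form of the two frequency mappings is a design assumption: it makes equal steps of the VO parameter correspond to roughly equal perceived pitch or brightness steps, whereas the modulation rate is mapped linearly.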

Keywords: Auditory feedback, gesture, sonification, sound perception, virtual object.

Digital Object Identifier (DOI): https://doi.org/10.5281/zenodo.1339974

