Authoring Tactile Gestures: Case Study for Emotion Stimulation

Authors: Rodrigo Lentini, Beatrice Ionascu, Friederike A. Eyssel, Scandar Copti, Mohamad Eid

Abstract:

The haptic modality has brought a new dimension to human-computer interaction by engaging the human sense of touch. However, designing appropriate haptic stimuli, and in particular tactile stimuli, for various applications remains challenging. To tackle this issue, we present an intuitive system that facilitates the authoring of tactile gestures for various applications. The system transforms a hand gesture into a tactile gesture that can be rendered using a custom-made haptic jacket. A case study demonstrates the ability of the system to develop tactile gestures that are recognizable by human subjects. Four tactile gestures are identified and tested to intensify the following four emotional responses: high valence – high arousal, high valence – low arousal, low valence – high arousal, and low valence – low arousal. A usability study with 20 participants showed a high correlation between the selected tactile gestures and the intended emotional reactions. Results from this study can be applied in a wide spectrum of applications, ranging from gaming to interpersonal communication and multimodal simulation.
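The four emotional responses above correspond to the quadrants of the valence-arousal emotion space. The following is a minimal illustrative sketch, not part of the paper's system: the threshold value, the quadrant labels, and the gesture names are assumptions chosen for illustration, showing how a normalized valence-arousal rating might be mapped to a quadrant and then to an authored tactile gesture.

```python
def quadrant(valence: float, arousal: float, midpoint: float = 0.5) -> str:
    """Classify a normalized (0..1) valence-arousal pair into one of the
    four quadrants used in the study. The 0.5 midpoint is an assumption."""
    v = "high valence" if valence >= midpoint else "low valence"
    a = "high arousal" if arousal >= midpoint else "low arousal"
    return f"{v} - {a}"

# Hypothetical lookup: each quadrant indexes the tactile gesture
# authored for it (gesture names are placeholders, not from the paper).
gesture_for_quadrant = {
    "high valence - high arousal": "gesture_1",
    "high valence - low arousal": "gesture_2",
    "low valence - high arousal": "gesture_3",
    "low valence - low arousal": "gesture_4",
}

# A high-valence, low-arousal stimulus selects the matching gesture.
print(gesture_for_quadrant[quadrant(0.8, 0.2)])  # prints "gesture_2"
```

In a real authoring pipeline, the selected gesture identifier would drive the rendering of the corresponding vibration pattern on the haptic jacket's actuators.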

Keywords: Tactile stimulation, tactile gesture, emotional reactions, arousal, valence.

Digital Object Identifier (DOI): https://doi.org/10.5281/zenodo.1127422

