Hand Gestures Based Emotion Identification Using Flex Sensors

Authors: S. Ali, R. Yunus, A. Arif, Y. Ayaz, M. Baber Sial, R. Asif, N. Naseer, M. Jawad Khan

Abstract:

In this study, we propose a gesture-based emotion recognition method using flex sensors mounted over the metacarpophalangeal joints. The flex sensors are fixed in a wearable glove, and the glove data are transmitted to a PC over Wi-Fi. Four gestures (finger pointing, thumbs up, fist open, and fist close) were performed by five subjects. Each gesture is categorized into a sad, happy, or excited class based on the velocity and acceleration of the hand movement. Seventeen observers assessed the emotions and hand gestures of the five subjects, and the emotional states inferred from their assessments were compared with those derived from the acquired movement-speed data. Overall, we achieved 77% accuracy. The proposed design can therefore be used in emotional-state detection applications.

Keywords: Emotion identification, emotion models, gesture recognition, user perception.
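To illustrate the classification idea summarized in the abstract (mapping the velocity and acceleration of a glove gesture to a sad, happy, or excited class), a minimal sketch is given below. The UDP port, packet format, finite-difference speed estimate, and class thresholds are assumptions for illustration only, not the authors' implementation.

```python
# Hypothetical sketch: classify emotion from Wi-Fi flex-sensor glove data by
# movement speed. Port, packet layout, and thresholds are assumed values.
import socket
import struct

UDP_PORT = 5005            # assumed port for the glove's Wi-Fi stream
SLOW, FAST = 0.2, 0.8      # assumed speed thresholds (normalized units)

def classify_emotion(velocity: float, acceleration: float) -> str:
    """Map movement speed to one of the three emotion classes."""
    if velocity < SLOW and acceleration < SLOW:
        return "sad"        # slow, low-energy movement
    if velocity > FAST or acceleration > FAST:
        return "excited"    # fast, high-energy movement
    return "happy"          # moderate movement

def main() -> None:
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", UDP_PORT))
    prev_flex, prev_vel = None, 0.0
    while True:
        data, _ = sock.recvfrom(64)
        if len(data) < 16:
            continue
        # assumed packet: four little-endian floats, one per MCP flex sensor
        flex = struct.unpack("<4f", data[:16])
        if prev_flex is not None:
            # finite-difference estimates of speed and its change per sample
            vel = max(abs(a - b) for a, b in zip(flex, prev_flex))
            acc = abs(vel - prev_vel)
            print(classify_emotion(vel, acc))
            prev_vel = vel
        prev_flex = flex

if __name__ == "__main__":
    main()
```

In practice, the velocity and acceleration would be smoothed over a window of samples and the thresholds calibrated per subject; the sketch only shows the thresholding step.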

Digital Object Identifier (DOI): https://doi.org/10.5281/zenodo.1474499

