Hand Gesture Interpretation Using Sensing Glove Integrated with Machine Learning Algorithms
Authors: Aqsa Ali, Aleem Mushtaq, Attaullah Memon, Monna

Abstract:

In this paper, we present a low-cost design for a smart glove that performs sign language recognition to assist speech-impaired people. Specifically, we have designed and developed an Assistive Hand Gesture Interpreter that recognizes hand movements corresponding to the American Sign Language (ASL) alphabet and translates them into text, displayed on a Thin-Film-Transistor Liquid Crystal Display (TFT LCD) screen, as well as synthetic speech. Linear Bayes classifiers and multilayer neural networks are used to classify 11-element feature vectors, obtained from the sensors on the glove, into one of 27 classes: the 26 letters of the ASL alphabet and a predefined gesture for space. Three types of features are used: finger bending, measured by six bend sensors; hand orientation in three dimensions, measured by accelerometers; and contact at key points, measured by contact sensors. To gauge the performance of the presented design, a training database was prepared using five volunteers. The accuracy of the current version on the prepared dataset was found to be up to 99.3% for the target user. The solution combines electronics, e-textile technology, sensor technology, embedded systems, and machine learning techniques to build a low-cost wearable glove that is accurate, elegant, and portable.
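As a minimal illustration of the classification step described above, the sketch below builds an 11-element feature vector (six bend-sensor readings, three accelerometer axes, two contact-sensor states; the split between contact and bend channels is an assumption, as the abstract does not give the exact breakdown) and applies a linear Bayes rule. With equal priors and a shared isotropic covariance, the linear Bayes classifier reduces to assigning a sample to the nearest class mean; the gesture labels and sensor values here are toy data, not the paper's dataset.

```python
import math

# Hypothetical 11-element feature vector: 6 bend-sensor readings,
# 3 accelerometer axes, 2 contact-sensor states (the 6/3/2 split is
# illustrative; the paper only states 11 features in total).
def make_feature(bend, accel, contact):
    assert len(bend) == 6 and len(accel) == 3 and len(contact) == 2
    return list(bend) + list(accel) + list(contact)

# Toy training examples for two gestures; the real system is trained
# on all 27 classes using data recorded from five volunteers.
train = {
    "A": [make_feature([0.9] * 6, [0.0, 0.0, 1.0], [1, 0]),
          make_feature([0.8] * 6, [0.1, 0.0, 0.9], [1, 0])],
    "B": [make_feature([0.1] * 6, [0.0, 1.0, 0.0], [0, 1]),
          make_feature([0.2] * 6, [0.1, 0.9, 0.1], [0, 1])],
}

def class_means(train):
    # Per-class mean of each of the 11 feature dimensions.
    means = {}
    for label, vecs in train.items():
        means[label] = [sum(v[i] for v in vecs) / len(vecs)
                        for i in range(11)]
    return means

def classify(x, means):
    # Linear Bayes rule with equal priors and shared isotropic
    # covariance: pick the class whose mean is nearest to x.
    def dist(m):
        return math.sqrt(sum((xi - mi) ** 2 for xi, mi in zip(x, m)))
    return min(means, key=lambda label: dist(means[label]))

means = class_means(train)
query = make_feature([0.85] * 6, [0.05, 0.0, 0.95], [1, 0])
print(classify(query, means))  # nearest class mean for this toy query is "A"
```

In the full system, this linear decision rule is compared against a multilayer neural network trained on the same 11-dimensional feature vectors; the nearest-mean form above is just the special case that makes the linear Bayes boundary easy to see.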

Keywords: American Sign Language, assistive hand gesture interpreter, human-machine interface, machine learning, sensing glove.

Digital Object Identifier (DOI): https://doi.org/10.5281/zenodo.1127466

