Search results for: assistive hand gesture interpreter
3857 Hand Gesture Interpretation Using Sensing Glove Integrated with Machine Learning Algorithms
Authors: Aqsa Ali, Aleem Mushtaq, Attaullah Memon, Monna
Abstract:
In this paper, we present a low-cost design for a smart glove that performs sign language recognition to assist speech-impaired people. Specifically, we have designed and developed an Assistive Hand Gesture Interpreter that recognizes hand movements from the American Sign Language (ASL) alphabet and translates them into text, displayed on a Thin-Film-Transistor Liquid Crystal Display (TFT LCD) screen, as well as synthetic speech. Linear Bayes classifiers and multilayer neural networks are used to classify 11-element feature vectors obtained from the sensors on the glove into one of the 26 ASL letters or a predefined gesture for space (27 classes in total). Three types of features are used: bending, measured by six bend sensors; orientation in three dimensions, measured by accelerometers; and contact at vital points, measured by contact sensors. To gauge the performance of the presented design, a training database was prepared using five volunteers. The accuracy of the current version on this dataset reached up to 99.3% for the target user. The solution combines electronics, e-textile technology, sensor technology, embedded systems and machine learning to build a low-cost wearable glove that is accurate, elegant and portable.
Keywords: American sign language, assistive hand gesture interpreter, human-machine interface, machine learning, sensing glove
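A minimal sketch of the "linear Bayes classifier" idea the abstract mentions: Gaussian class models with a shared, diagonal covariance yield linear decision boundaries over the glove's feature vectors. The feature values and sign labels below are invented for illustration, not the paper's data.

```python
# Hypothetical sketch: per-class Gaussian means with a shared diagonal
# variance -> linear decision boundaries (a "linear Bayes classifier").
# Toy feature vectors (bend, tilt, contact) are invented for illustration.

def train(samples):
    """samples: {label: [feature_vector, ...]} -> per-class means + shared variance."""
    dims = len(next(iter(samples.values()))[0])
    means, all_vecs = {}, []
    for label, vecs in samples.items():
        means[label] = [sum(v[d] for v in vecs) / len(vecs) for d in range(dims)]
        all_vecs.extend((label, v) for v in vecs)
    var = [0.0] * dims
    for label, v in all_vecs:
        for d in range(dims):
            var[d] += (v[d] - means[label][d]) ** 2
    var = [max(s / len(all_vecs), 1e-6) for s in var]   # avoid zero variance
    return means, var

def classify(x, means, var):
    """Pick the class with the highest log Gaussian likelihood."""
    def log_lik(mu):
        return -sum((x[d] - mu[d]) ** 2 / (2 * var[d]) for d in range(len(x)))
    return max(means, key=lambda label: log_lik(means[label]))

# Two toy signs, two training samples each.
data = {
    "A": [[0.9, 0.1, 1.0], [0.8, 0.2, 1.0]],
    "B": [[0.1, 0.8, 0.0], [0.2, 0.9, 0.0]],
}
means, var = train(data)
print(classify([0.85, 0.15, 1.0], means, var))  # → A
```

A real glove would feed 11 such features per frame; the classifier structure is unchanged.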
Procedia PDF Downloads 292
3856 Users’ Preferences for Map Navigation Gestures
Authors: Y. Y. Pang, N. A. Ismail
Abstract:
The map is a powerful and convenient tool for navigating to different places, but the use of indirect input devices often makes it cumbersome. This study proposes a new map navigation dialogue based on hand gestures. A set of dialogues was developed from the users’ perspective to give users complete freedom in panning, zooming, rotating, and finding directions. A participatory design experiment was conducted in which one-hand and two-hand gesture dialogues were analysed to develop a set of usable dialogues. The major finding was that users prefer one-hand gestures to two-hand gestures for map navigation.
Keywords: hand gesture, map navigation, participatory design, intuitive interaction
Procedia PDF Downloads 273
3855 Static and Dynamic Hand Gesture Recognition Using Convolutional Neural Network Models
Authors: Keyi Wang
Abstract:
Similar to the touchscreen, hand gesture based human-computer interaction (HCI) is a technology that could allow people to perform a variety of tasks faster and more conveniently. This paper proposes a training method for an image- and video-based hand gesture recognition system using convolutional neural networks (CNNs). A dataset containing 6 hand gestures is used to train a 2D CNN model, achieving approximately 98% accuracy. Furthermore, a 3D CNN model trained on a dataset containing 4 hand gesture video clips reaches approximately 83% accuracy. It is demonstrated that a Cozmo robot loaded with the pre-trained models is able to recognize static and dynamic hand gestures.
Keywords: deep learning, hand gesture recognition, computer vision, image processing
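Not the paper's model, but a hedged sketch of the operation at the core of every CNN layer: slide a small kernel over an image and sum element-wise products. The image and kernel values below are invented for illustration.

```python
# Minimal 2D "valid" convolution (really cross-correlation, as in most
# deep-learning libraries). Image and kernel values are invented.

def conv2d(image, kernel):
    """Slide kernel over image; sum of element-wise products at each offset."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    out = [[0] * out_w for _ in range(out_h)]
    for i in range(out_h):
        for j in range(out_w):
            out[i][j] = sum(
                image[i + a][j + b] * kernel[a][b]
                for a in range(kh) for b in range(kw)
            )
    return out

# A vertical-edge detector applied to an image with a hard left/right boundary.
image = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
]
kernel = [
    [-1, 0, 1],
    [-1, 0, 1],
]  # responds where intensity rises from left to right
print(conv2d(image, kernel))  # → [[2, 2], [2, 2]]
```

A trained CNN stacks many such kernels with learned weights, nonlinearities and pooling in between.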
Procedia PDF Downloads 131
3854 Hand Detection and Recognition for Malay Sign Language
Authors: Mohd Noah A. Rahman, Afzaal H. Seyal, Norhafilah Bara
Abstract:
Developing software applications that interface with computers and peripheral devices through human body gestures, such as hand movements, is a growing area of interest. Hand gesture detection and recognition based on computer vision techniques remains a very challenging task. The goal is to provide a more natural, innovative and sophisticated form of non-verbal communication, such as sign language, in human-computer interaction. This paper explores hand detection and hand gesture recognition using a vision-based approach. For hand detection and recognition, skin color spaces such as HSV and YCrCb are applied. However, there are limitations that need to be considered: almost all skin color space models are sensitive to rapidly changing or mixed lighting conditions. Certain restrictions must also be met for hand recognition to give better results, such as the distance of the user’s hand from the webcam and the posture and size of the hand.
Keywords: hand detection, hand gesture, hand recognition, sign language
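A hedged sketch of the HSV skin-color segmentation the abstract describes: convert each RGB pixel to HSV and keep those inside a skin-tone range. The threshold values are illustrative assumptions, not the paper's calibrated ones, and (as the abstract warns) they break down under changing lighting.

```python
# HSV skin thresholding sketch. Threshold values are assumptions.
import colorsys

def is_skin(r, g, b):
    """Rough HSV skin test on 0-255 RGB values."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    # Skin tones cluster at low hue (reddish) with moderate saturation.
    return h <= 50 / 360.0 and 0.15 <= s <= 0.75 and v >= 0.35

def segment(pixels):
    """Binary mask: 1 where the pixel looks like skin."""
    return [[1 if is_skin(*px) else 0 for px in row] for row in pixels]

frame = [
    [(210, 160, 130), (20, 30, 200)],   # skin-ish pixel vs blue background
    [(225, 170, 140), (15, 25, 190)],
]
print(segment(frame))  # → [[1, 0], [1, 0]]
```

The mask would then feed contour extraction or tracking; YCrCb thresholding follows the same pattern with a different color conversion.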
Procedia PDF Downloads 301
3853 Modeling and Control of a 4DoF Robotic Assistive Device for Hand Rehabilitation
Authors: Christopher Spiewak, M. R. Islam, Mohammad Arifur Rahaman, Mohammad H. Rahman, Roger Smith, Maarouf Saad
Abstract:
For those who have lost the ability to move their hand, going through repetitious motions with the assistance of a therapist is the main method of recovery. We have developed a robotic assistive device to rehabilitate hand motions in place of traditional therapy. The developed assistive device (RAD-HR) comprises four degrees of freedom, enabling basic movements and hand function and supporting the hand during rehabilitation. We used a nonlinear computed torque control technique to control the RAD-HR. The accuracy of the controller was evaluated in simulations (MATLAB/Simulink environment). To test the robustness of the controller, an external disturbance representing modelling uncertainty (±10% of joint torques) was added at each joint.
Keywords: biorobotics, rehabilitation, robotic assistive device, exoskeleton, nonlinear control
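A hedged, single-joint sketch of computed torque control (the RAD-HR has 4 DoF, but one rotary joint shows the idea): the controller cancels the modelled dynamics and imposes linear PD error dynamics. All gains and plant parameters below are invented for illustration, not the paper's values.

```python
# Computed torque control on one rotary joint with dynamics
#   I*q'' + b*q' + m*g*l*sin(q) = tau.
# The control law cancels the model terms and adds a PD term, giving
# e'' + Kd*e' + Kp*e = 0 in the ideal case. Parameters are invented.
import math

I, b, m, g, l = 0.05, 0.1, 0.5, 9.81, 0.2   # inertia, damping, mass, gravity, arm
Kp, Kd = 100.0, 20.0                        # critically damped: Kd^2 = 4*Kp
dt = 0.001

def computed_torque(q, dq, q_des):
    e, de = q_des - q, -dq                  # setpoint: desired velocity = 0
    # Cancel damping and gravity, then impose PD error dynamics.
    return I * (Kp * e + Kd * de) + b * dq + m * g * l * math.sin(q)

q, dq = 0.0, 0.0
q_des = math.radians(30)
for _ in range(5000):                       # 5 s of simulated motion
    tau = computed_torque(q, dq, q_des)
    ddq = (tau - b * dq - m * g * l * math.sin(q)) / I
    dq += ddq * dt
    q += dq * dt
print(round(math.degrees(q), 2))            # settles near 30.0 degrees
```

The paper's ±10% torque disturbance could be modelled here by scaling `tau` before the plant update and checking that the joint still converges.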
Procedia PDF Downloads 471
3852 Hand Gestures Based Emotion Identification Using Flex Sensors
Authors: S. Ali, R. Yunus, A. Arif, Y. Ayaz, M. Baber Sial, R. Asif, N. Naseer, M. Jawad Khan
Abstract:
In this study, we propose a gesture-to-emotion recognition method using flex sensors mounted on the metacarpophalangeal joints. The flex sensors are fixed in a wearable glove, and data from the glove are sent to a PC over Wi-Fi. Four gestures: finger pointing, thumbs up, fist open and fist close, were performed by five subjects. Each gesture is categorized into a sad, happy, or excited class based on the velocity and acceleration of the hand movement. Seventeen inspectors observed the emotions and hand gestures of the five subjects, and their assessments were compared with the emotional state inferred from the acquired movement speed data. Overall, we achieved 77% accurate results; the proposed design can therefore be used in emotional state detection applications.
Keywords: emotion identification, emotion models, gesture recognition, user perception
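A hedged sketch of the speed-based emotion mapping the abstract suggests: finite-difference velocity and acceleration from a sampled flex-sensor trace, then simple thresholds into sad / happy / excited. The thresholds and sensor traces are invented for illustration.

```python
# Map a flex-sensor time series to an emotion class via motion speed.
# Thresholds and sample traces are invented assumptions.

def motion_stats(samples, dt):
    """Peak |velocity| and |acceleration| from evenly sampled sensor values."""
    vel = [(b - a) / dt for a, b in zip(samples, samples[1:])]
    acc = [(b - a) / dt for a, b in zip(vel, vel[1:])]
    return max(map(abs, vel)), max(map(abs, acc))

def emotion(samples, dt=0.05):
    v, a = motion_stats(samples, dt)
    if v > 8.0 or a > 60.0:     # abrupt, fast motion
        return "excited"
    if v > 3.0:                 # brisk but smooth motion
        return "happy"
    return "sad"                # slow motion

slow_wave = [0.0, 0.05, 0.1, 0.15, 0.2]        # gentle gesture
fast_snap = [0.0, 0.6, 1.2, 0.5, 0.0]          # abrupt gesture
print(emotion(slow_wave), emotion(fast_snap))  # → sad excited
```

The same gesture (e.g. thumbs up) lands in different emotion classes purely according to how quickly it is performed, which matches the abstract's design.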
Procedia PDF Downloads 2813851 Vision-Based Hand Segmentation Techniques for Human-Computer Interaction
Abstract:
This work is part of a vision-based hand gesture recognition system for a natural human-computer interface. Hand tracking and segmentation are the primary steps in any hand gesture recognition system. The aim of this paper is to develop a robust and efficient hand segmentation algorithm to serve as input to another system that attempts to bring HCI performance close to human-human interaction, by modeling an intelligent sign language recognition system based on prediction in the context of a dialogue between the system (avatar) and the interlocutor. For hand segmentation, an occlusion-handling approach is proposed that yields superior results in detecting the hand in an image.
Keywords: HCI, sign language recognition, object tracking, hand segmentation
Procedia PDF Downloads 4043850 Patient-Friendly Hand Gesture Recognition Using AI
Authors: K. Prabhu, K. Dinesh, M. Ranjani, M. Suhitha
Abstract:
During the tough times of COVID, hospitalized patients often found it difficult to convey what they wanted or needed to an attendee, and sometimes no attendee was present. In such cases, patients can use simple hand gestures to control electrical appliances (for example, switching a zero-watt bulb), with three further gestures reserved for voice-note intimation. In this AI-based hand recognition project, a NodeMCU is used for the control action of the relay; it is connected to Firebase for storing values in the cloud and is interfaced with the Python code via a Raspberry Pi. For three hand gestures, a voice clip is played to alert the attendee, using Google’s text-to-speech and the built-in audio output of the Raspberry Pi 4. All five gestures are detected when shown to the webcam placed for gesture detection. A personal computer is used for displaying the gestures and for running the code through the Raspberry Pi Imager.
Keywords: NodeMCU, AI technology, gesture, patient
Procedia PDF Downloads 1583849 Hand Motion and Gesture Control of Laboratory Test Equipment Using the Leap Motion Controller
Authors: Ian A. Grout
Abstract:
In this paper, the design and development of a system providing hand motion and gesture control of laboratory test equipment is considered and discussed. The Leap Motion controller provides the input used to control a laboratory power supply as part of an electronic circuit experiment. Through suitable hand motions and gestures, the power supply is controlled remotely, without the need to physically touch the equipment. As such, the system provides an alternative means of controlling electronic equipment via a PC and is considered here within the field of human-computer interaction (HCI).
Keywords: control, hand gesture, human computer interaction, test equipment
Procedia PDF Downloads 3093848 Hand Gesture Recognition Interface Based on IR Camera
Authors: Yang-Keun Ahn, Kwang-Soon Choi, Young-Choong Park, Kwang-Mo Jung
Abstract:
Vision-based user interfaces for controlling TVs and PCs have the advantage of enabling natural control without being limited to a specific device. Accordingly, various studies on hand gesture recognition using RGB cameras or depth cameras have been conducted. However, such cameras either lack accuracy or are costly to deploy. The proposed method uses a low-cost IR camera to accurately differentiate between the hand and the background. Complicated learning and template matching methodologies are not used; instead, the correlation between fingertips extracted through curvature analysis is utilized to recognize Click and Move gestures.
Keywords: recognition, hand gestures, infrared camera, RGB cameras
Procedia PDF Downloads 4003847 Hand Gesture Recognition for Sign Language: A New Higher Order Fuzzy HMM Approach
Authors: Saad M. Darwish, Magda M. Madbouly, Murad B. Khorsheed
Abstract:
Sign languages (SL) are the most accomplished forms of gestural communication, so their automatic analysis is a real challenge, closely tied to their lexical and syntactic organization. Hidden Markov models (HMMs) have been used prominently and successfully in speech recognition and, more recently, in handwriting recognition; consequently, they seem ideal for visual recognition of complex, structured hand gestures such as those found in sign language. In this paper, several results concerning static hand gesture recognition using an algorithm based on type-2 fuzzy HMMs (T2FHMMs) are presented. The features used as observables in both the training and recognition phases are based on Singular Value Decomposition (SVD), an extension of eigendecomposition to non-square matrices, which reduces multi-attribute hand gesture data to feature vectors while optimally exposing the geometric structure of a matrix. In our approach, we replace the basic HMM arithmetic operators with adequate type-2 fuzzy operators, which permit us to relax the additivity constraint of probability measures. T2FHMMs are therefore able to handle both the random and fuzzy uncertainties that exist universally in sequential data. Experimental results show that T2FHMMs effectively handle noise and dialect uncertainties in hand signals and give better classification performance than classical HMMs. The recognition rate of the proposed system is 100% for uniform hand images and 86.21% for cluttered hand images.
Keywords: hand gesture recognition, hand detection, type-2 fuzzy logic, hidden Markov model
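As a hedged illustration of the SVD feature idea (not the paper's pipeline): the singular values of an image patch summarize its geometric structure in a few numbers, which can serve as a compact feature vector. For a 2×2 matrix they follow in closed form from the eigenvalues of AᵀA; a real system would run a library SVD on full hand images.

```python
# Singular values of a 2x2 matrix via the eigenvalues of A^T A.
# sigma_i = sqrt(eigenvalue_i), and for a symmetric 2x2 matrix the
# eigenvalues have a closed form. The sample matrix is illustrative.
import math

def singular_values_2x2(A):
    (a, b), (c, d) = A
    # Symmetric A^T A = [[p, q], [q, r]]
    p, q, r = a * a + c * c, a * b + c * d, b * b + d * d
    mean, span = (p + r) / 2, math.hypot((p - r) / 2, q)
    return math.sqrt(mean + span), math.sqrt(max(mean - span, 0.0))

s1, s2 = singular_values_2x2([[3, 0], [4, 5]])
print(round(s1, 3), round(s2, 3))  # → 6.708 2.236  (i.e. 3*sqrt(5), sqrt(5))
```

Stacking the leading singular values of image blocks gives the kind of low-dimensional observable vector an HMM (fuzzy or classical) can consume.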
Procedia PDF Downloads 456
3846 PYTHEIA: A Scale for Assessing Rehabilitation and Assistive Robotics
Authors: Yiannis Koumpouros, Effie Papageorgiou, Alexandra Karavasili, Foteini Koureta
Abstract:
The objective of the present study was to develop a scale called PYTHEIA, a self-reported measure for the assessment of rehabilitation and assistive robotics and other assistive technology devices. PYTHEIA was developed in response to the absence of a valid instrument for evaluating assistive robotic devices both as a whole and in terms of any of their individual components or functionalities. According to the results presented, PYTHEIA is a valid and reliable scale that can be applied to different target groups for the subjective evaluation of various assistive technology devices.
Keywords: rehabilitation, assistive technology, assistive robots, rehabilitation robots, scale, psychometric test, assessment, validation, user satisfaction
Procedia PDF Downloads 307
3845 Information Retrieval from Internet Using Hand Gestures
Authors: Aniket S. Joshi, Aditya R. Mane, Arjun Tukaram
Abstract:
In the 21st-century e-world, people continuously receive daily information such as weather conditions, news, stock exchange updates, new projects, cricket scores and other sports updates. In busy situations, they want this information with minimal use of the keyboard and minimal time. Today, to get such information, users have to repeat the same mouse and keyboard actions, which costs time and causes inconvenience. In India, owing to rural backgrounds, many people are not very familiar with using computers and the internet. Moreover, in small clinics, small offices, hotels and airports, there should be a system that retrieves daily information with minimal keyboard and mouse actions. We plan to design an application-based project that can easily retrieve information with minimal keyboard and mouse actions, making the task more convenient and easier. This is possible with an image processing application that captures real-time hand gestures, matches them in the system and retrieves information. Once a function is selected with a hand gesture, the system reports the action information to the user. In this project, real-time hand gesture movements select the required option, which is presented on the screen in the form of RSS feeds; the gesture selects the option and the corresponding information pops up. Real-time hand gestures make the application handier and easier to use.
Keywords: hand detection, hand tracking, hand gesture recognition, HSV color model, blob detection
Procedia PDF Downloads 282
3844 Hand Motion Trajectory Analysis for Dynamic Hand Gestures Used in Indian Sign Language
Authors: Daleesha M. Viswanathan, Sumam Mary Idicula
Abstract:
Dynamic hand gestures are an intrinsic component of sign language communication. Extracting spatial-temporal features of the hand gesture trajectory plays an important role in a dynamic gesture recognition system. Finding a discrete feature descriptor for the motion trajectory based on the orientation feature is the main concern of this paper. A Kalman filter algorithm and Hidden Markov Models (HMMs) are incorporated into this recognition system for hand trajectory tracking and for spatial-temporal classification, respectively.
Keywords: orientation features, discrete feature vector, HMM, Indian sign language
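A hedged sketch of the Kalman-filter tracking step the abstract mentions, reduced to one coordinate of the hand trajectory with a constant-velocity model. The noise levels and measurement trace below are invented for illustration.

```python
# 1-D Kalman filter with state (position, velocity) and a constant-velocity
# model; F = [[1, dt], [0, 1]], H = [1, 0]. Process noise q, measurement
# noise r, and the sample trace are invented assumptions.

def kalman_track(measurements, q=0.01, r=4.0, dt=1.0):
    """Filter noisy 1-D positions; returns the smoothed position estimates."""
    x, v = measurements[0], 0.0        # initial state
    p00, p01, p11 = 1.0, 0.0, 1.0      # covariance P = [[p00, p01], [p01, p11]]
    out = []
    for z in measurements:
        # Predict: x <- F x, P <- F P F^T + Q.
        x, v = x + v * dt, v
        p00 += dt * (2 * p01 + dt * p11) + q
        p01 += dt * p11
        p11 += q
        # Update with measurement z: K = P H^T / (H P H^T + r).
        k0, k1 = p00 / (p00 + r), p01 / (p00 + r)
        innov = z - x
        x += k0 * innov
        v += k1 * innov
        p11 -= k1 * p01                # (I - K H) P, using pre-update terms
        p01 -= k1 * p00
        p00 -= k0 * p00
        out.append(x)
    return out

noisy = [0.0, 1.3, 1.8, 3.2, 3.9, 5.1, 6.2, 6.8]  # roughly x = t plus jitter
smooth = kalman_track(noisy)
print([round(p, 2) for p in smooth])
```

The smoothed positions (or the orientation of successive displacement vectors) can then be quantized into the discrete symbols an HMM classifier expects.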
Procedia PDF Downloads 363
3843 Hand Gesture Detection via EmguCV Canny Pruning
Authors: N. N. Mosola, S. J. Molete, L. S. Masoebe, M. Letsae
Abstract:
Hand gesture recognition is a technique used to locate, detect, and recognize a hand gesture. Detection and recognition are concepts of Artificial Intelligence (AI), and AI concepts are applicable in Human-Computer Interaction (HCI), expert systems (ES), etc. Hand gesture recognition can be used in sign language interpretation. Sign language is a visual communication tool used mostly by deaf communities and people with speech disorders; communication barriers exist when these communities interact with others. This research aims to build a hand recognition system for interpretation between Lesotho’s Sesotho language and English, to help bridge the communication problems these communities encounter. The system has various processing modules: a hand detection engine, an image processing engine, feature extraction, and sign recognition. Detection is the process of identifying an object. The proposed system uses Canny-pruned Haar and Haar cascade detection algorithms. Canny pruning implements Canny edge detection, an optimal image processing algorithm used to detect the edges of an object. The system also employs a skin detection algorithm that performs background subtraction and computes the convex hull and centroid to assist in the detection process. Recognition is the process of gesture classification; template matching classifies each hand gesture in real time. The system was tested in various experiments. The results show that time, distance, and lighting are factors that affect the rate of detection and, ultimately, recognition. The detection rate is directly proportional to the distance of the hand from the camera. Different lighting conditions were also considered: the greater the light intensity, the faster the detection rate.
Based on the results obtained from this research, the applied methodologies are efficient and provide a plausible solution towards a lightweight, inexpensive system that can be used for sign language interpretation.
Keywords: canny pruning, hand recognition, machine learning, skin tracking
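A hedged sketch of the first stage of the Canny edge detection used by the Canny-pruning step: Sobel gradients produce an edge-strength map, which full Canny then thins (non-maximum suppression) and thresholds with hysteresis. The tiny image below is invented for illustration.

```python
# Sobel gradient magnitude, the first stage of Canny edge detection.
# The 4x4 image (dark left half, bright right half) is invented.
import math

SOBEL_X = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
SOBEL_Y = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]

def gradient_magnitude(img):
    """Edge strength at each interior pixel of a grayscale image."""
    h, w = len(img), len(img[0])
    out = [[0.0] * (w - 2) for _ in range(h - 2)]
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            gx = sum(img[i + a][j + b] * SOBEL_X[a + 1][b + 1]
                     for a in (-1, 0, 1) for b in (-1, 0, 1))
            gy = sum(img[i + a][j + b] * SOBEL_Y[a + 1][b + 1]
                     for a in (-1, 0, 1) for b in (-1, 0, 1))
            out[i - 1][j - 1] = math.hypot(gx, gy)
    return out

img = [  # strong vertical edge between columns 1 and 2
    [0, 0, 255, 255],
    [0, 0, 255, 255],
    [0, 0, 255, 255],
    [0, 0, 255, 255],
]
print(gradient_magnitude(img))  # every interior pixel straddles the edge
```

Canny pruning uses such edge maps to discard image regions with too little edge content before running the (more expensive) Haar cascade.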
Procedia PDF Downloads 178
3842 Hand Controlled Mobile Robot Applied in Virtual Environment
Authors: Jozsef Katona, Attila Kovari, Tibor Ujbanyi, Gergely Sziladi
Abstract:
With the development of IT systems, human-computer interaction is evolving ever faster, and newer communication methods are becoming available for human-machine interaction. In this article, the application of a hand gesture controlled human-computer interface is introduced through the example of a mobile robot. Control of the mobile robot is implemented in a realistic virtual environment, which is advantageous for running different tests and parallel examinations, making the purchase of expensive equipment unnecessary. The usability of the implemented hand gesture control was evaluated by test subjects. According to the testing subjects, the system is easy to use, and they would recommend its application in other fields as well.
Keywords: human-machine interface (HMI), mobile robot, hand control, virtual environment
Procedia PDF Downloads 292
3841 Hand Gesture Interface for PC Control and SMS Notification Using MEMS Sensors
Authors: Keerthana E., Lohithya S., Harshavardhini K. S., Saranya G., Suganthi S.
Abstract:
In an epoch of expanding human-machine interaction, the development of innovative interfaces that bridge the gap between physical gestures and digital control has gained significant momentum. This study introduces a distinct solution that leverages a combination of MEMS (Micro-Electro-Mechanical Systems) sensors, an Arduino Mega microcontroller, and a PC to create a hand gesture interface for PC control and SMS notification. The core of the system is an ADXL335 MEMS accelerometer sensor integrated with an Arduino Mega, which communicates with a PC via a USB cable. The ADXL335 provides real-time acceleration data, which is processed by the Arduino to detect specific hand gestures. These gestures, such as left, right, up, down, or custom patterns, are interpreted by the Arduino, and corresponding actions are triggered. In the context of SMS notifications, when a gesture indicative of a new SMS is recognized, the Arduino relays this information to the PC through the serial connection. The PC application, designed to monitor the Arduino's serial port, displays these SMS notifications in the serial monitor. This study offers an engaging and interactive means of interfacing with a PC by translating hand gestures into meaningful actions, opening up opportunities for intuitive computer control. Furthermore, the integration of SMS notifications adds a practical dimension to the system, notifying users of incoming messages as they interact with their computers. The use of MEMS sensors, Arduino, and serial communication serves as a promising foundation for expanding the capabilities of gesture-based control systems.
Keywords: hand gestures, multiple cables, serial communication, sms notification
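A hedged sketch of the gesture-detection step: under tilt, the ADXL335's x/y outputs carry gravity components, so thresholding the raw ADC readings yields discrete left/right/up/down gestures. The mid-scale value and thresholds below are assumptions (a real Arduino build would calibrate them), and the logic is shown in Python rather than Arduino C for brevity.

```python
# Map raw accelerometer ADC readings to discrete gestures via thresholds.
# ZERO_G and THRESH are invented calibration assumptions (10-bit ADC).

ZERO_G = 512      # mid-scale ADC reading at 0 g
THRESH = 120      # counts of deviation that count as a deliberate tilt

def gesture(x_raw, y_raw):
    dx, dy = x_raw - ZERO_G, y_raw - ZERO_G
    if abs(dx) < THRESH and abs(dy) < THRESH:
        return "rest"
    if abs(dx) >= abs(dy):                    # dominant axis wins
        return "right" if dx > 0 else "left"
    return "up" if dy > 0 else "down"

readings = [(512, 512), (700, 530), (512, 340), (380, 500)]
print([gesture(x, y) for x, y in readings])  # → ['rest', 'right', 'down', 'left']
```

On the Arduino side the same comparisons would run in `loop()`, writing the detected gesture over the serial link for the PC application to act on.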
Procedia PDF Downloads 57
3840 The History and Plausible Future of Assistive Technology and What It Might Mean for Singapore Students With Disabilities
Authors: Thomas Chong, Irene Victor
Abstract:
This paper discusses the history and plausible future of assistive technology and what it means for students with disabilities in Singapore, a country known for the high quality of its education. Over more than a century, students with disabilities have benefitted from relatively low-tech assistive technology (like eye-glasses, Braille, magnifiers and wheelchairs) to high-tech assistive technology including electronic mobility switches, alternative keyboards, computer-screen enlargers, text-to-speech readers, electronic sign-language dictionaries and signing avatars for individuals with hearing impairments. Driven by legislation, the use of assistive technology in many countries is becoming so ubiquitous that more and more students with disabilities are able to perform as well as, if not better than, their counterparts. Yet in many other learning environments where assistive technology is not affordable or mandated, the learning gaps can be quite significant. Without stronger legislation, Singapore may still have a long way to go in levelling the playing field for its students with disabilities.
Keywords: assistive technology, students with disabilities, disability laws in Singapore, inclusiveness
Procedia PDF Downloads 66
3839 Prototyping a Portable, Affordable Sign Language Glove
Authors: Vidhi Jain
Abstract:
Communication between speakers and non-speakers of American Sign Language (ASL) can be problematic, inconvenient, and expensive. This project attempts to bridge the communication gap by designing a portable glove that captures the user’s ASL gestures and outputs the translated text on a smartphone. The glove is equipped with flex sensors, contact sensors, and a gyroscope to measure the flexion of the fingers, the contact between fingers, and the rotation of the hand. The glove’s Arduino UNO microcontroller analyzes the sensor readings to identify the gesture from a library of learned gestures, and a Bluetooth module transmits the gesture to a smartphone. Using this device, speakers of ASL may one day be able to communicate with others in an affordable and convenient way.
Keywords: sign language, morse code, convolutional neural network, American sign language, gesture recognition
Procedia PDF Downloads 57
3838 A Novel Combined Finger Counting and Finite State Machine Technique for ASL Translation Using Kinect
Authors: Rania Ahmed Kadry Abdel Gawad Birry, Mohamed El-Habrouk
Abstract:
This paper presents a brief survey of the techniques used for sign language recognition, along with the types of sensors used to perform the task. It presents a modified method for identifying an isolated sign language gesture using the Microsoft Kinect with the OpenNI framework, showing how to extract robust features from the depth image provided by the Kinect and the OpenNI interface and use them to create a robust and accurate gesture recognition system for the purpose of ASL translation. PrimeSense’s Natural Interaction Technology for End-user (NITE™) was also used in the C++ implementation of the system. The algorithm presents a simple finger counting routine for static signs, as well as a directional Finite State Machine (FSM) description of the hand motion, to help translate a sign language gesture. This covers both letters and numbers performed by a user, which in turn may be used as input for voice pronunciation systems.
Keywords: American sign language, finger counting, hand tracking, Microsoft Kinect
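A hedged sketch of a directional FSM for a dynamic sign: a gesture is accepted only when the hand's motion directions occur in a prescribed order. The "J"-like pattern (down, then left, then up) and the tolerance for repeated directions are invented for illustration, not taken from the paper.

```python
# Directional finite state machine: advance a state counter when the
# observed motion direction matches the next expected one; reset on an
# unexpected direction. The example pattern is invented.

def make_fsm(pattern):
    """Return a recognizer that consumes direction symbols one by one."""
    state = 0

    def step(direction):
        nonlocal state
        if state < len(pattern) and direction == pattern[state]:
            state += 1                 # advance on the expected direction
        elif direction != (pattern[state - 1] if state else None):
            state = 0                  # unexpected move resets the machine
        return state == len(pattern)   # True once the full pattern is seen

    return step

recognize_j = make_fsm(["down", "left", "up"])
trace = ["down", "down", "left", "up"]   # repeated directions are tolerated
print([recognize_j(d) for d in trace])   # → [False, False, False, True]
```

In the Kinect pipeline, the direction symbols would come from quantizing the tracked hand's frame-to-frame displacement, and one FSM per dynamic sign runs in parallel.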
Procedia PDF Downloads 291
3837 CONDUCTHOME: Gesture Interface Control of Home Automation Boxes
Authors: J. Branstett, V. Gagneux, A. Leleu, B. Levadoux, J. Pascale
Abstract:
This paper presents the CONDUCTHOME interface, which controls home automation systems with a Leap Motion using ‘invariant gesture protocols’. The function of this interface is to simplify the user’s interaction with their environment. A hardware component allows the Leap Motion to be carried around the house, while a software component interacts with the home automation box and displays useful information for the user. An objective of this work is the development of a natural, invariant, simple gesture control interface to help elderly people and people with disabilities.
Keywords: automation, ergonomics, gesture recognition, interoperability
Procedia PDF Downloads 424
3836 Real-Time Gesture Recognition System Using Microsoft Kinect
Authors: Ankita Wadhawan, Parteek Kumar, Umesh Kumar
Abstract:
A gesture is any body movement that expresses an attitude or sentiment. Gestures in the form of sign language are used by deaf people to convey messages, helping eliminate the communication barrier between deaf and hearing persons. Nowadays, mobile phones and computers are essential gadgets in everyday life, but for some physically challenged people who are blind or deaf, using such devices is very difficult. There is therefore an immense need for a system that takes body gestures or sign language as input. In this research, the Microsoft Kinect sensor, SDK v2 and the Hidden Markov Model Toolkit (HTK) are used to recognize objects, object motion and human body joints through a touchless NUI (Natural User Interface) in real time. The depth data collected from the Microsoft Kinect have been used to recognize gestures of Indian Sign Language (ISL). The recorded clips are analysed using depth, IR and skeletal data at different angles and positions. The proposed system has an average accuracy of 85%. The developed touchless NUI provides an interface that recognizes gestures and controls cursor and click operations on a computer simply by waving hand gestures. This research will help deaf people make use of mobile phones and computers and socialize with other people in society.
Keywords: gesture recognition, Indian sign language, Microsoft Kinect, natural user interface, sign language
Procedia PDF Downloads 299
3835 Design for Sentiment-ancy: Conceptual Framework to Improve User’s Well-being Through Fostering Emotional Attachment in the Use Experience with Their Assistive Devices
Authors: Seba Quqandi
Abstract:
This study investigates the bond that people form with their assistive devices and the tactics applied during the product design process to improve the user experience, leading to a long-term product relationship. The aim is to develop a conceptual framework with which to describe and analyze the bond people form with their assistive devices and to integrate human emotions as a factor in the product design process. The focus is on the assistive technology market, namely the Aid-For-Daily-Living market for situational impairments, to increase quality of wellbeing. The findings will help us better understand the real issues of the product experience concerning people’s interaction throughout the product’s performance, establish awareness of the emotional effects of daily interaction that foster product attachment, and help product developers and future designers create a connection between users and their assistive devices. The research concludes by discussing the implications of these findings for professionals and academics in the form of experiments, in order to identify new areas that can stimulate new or further design directions.
Keywords: experience design, interaction design, emotion, design psychology, assistive tools, customization, user-centred design
Procedia PDF Downloads 221
3834 Hands-off Parking: Deep Learning Gesture-based System for Individuals with Mobility Needs
Authors: Javier Romera, Alberto Justo, Ignacio Fidalgo, Joshue Perez, Javier Araluce
Abstract:
Nowadays, individuals with mobility needs face a significant challenge when docking vehicles: in many cases, after parking, they encounter insufficient space to exit, leading to two undesired outcomes, either avoiding parking in that spot or settling for an improperly placed vehicle. To address this issue, this paper presents a parking control system employing gestural teleoperation. The system comprises three main phases: capturing body markers, interpreting gestures, and transmitting orders to the vehicle. The initial phase is centered around the MediaPipe framework, a versatile tool optimized for real-time gesture recognition. MediaPipe excels at detecting and tracing body markers, with a special emphasis on hand gestures; hand detection generates 21 reference points for each hand. After data capture, the project employs a multilayer perceptron (MLP) for in-depth gesture classification. This tandem of MediaPipe’s extraction prowess and the MLP’s analytical capability ensures that human gestures are translated into actionable commands with high precision. The system has been trained and validated on a purpose-built dataset. To prove domain adaptation, a framework based on the Robot Operating System (ROS) as a communication backbone, alongside the CARLA simulator, is used. Following successful simulations, the system was transitioned to a real-world platform, marking a significant milestone in the project. This real-vehicle implementation verifies the practicality and efficiency of the system beyond theoretical constructs.
Keywords: gesture detection, MediaPipe, multilayer perceptron, robot operating system
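A hedged sketch of the landmark-to-gesture step: flatten hand landmarks into a feature vector and push it through a small multilayer perceptron. The weights below are hand-picked toys, not trained values, and the two-feature input stands in for the 21 three-dimensional MediaPipe landmarks a real system would use.

```python
# Tiny MLP forward pass: linear layers with ReLU between them and a
# softmax on the output. Weights and the 2-feature input are invented
# stand-ins for trained parameters and flattened hand landmarks.
import math

def mlp_forward(x, layers):
    """layers: list of (weight_matrix, bias) pairs; ReLU on hidden layers."""
    for idx, (W, b) in enumerate(layers):
        x = [sum(w * xi for w, xi in zip(row, x)) + bi for row, bi in zip(W, b)]
        if idx < len(layers) - 1:
            x = [max(0.0, v) for v in x]       # ReLU
    m = max(x)                                  # softmax (stable form)
    exps = [math.exp(v - m) for v in x]
    s = sum(exps)
    return [e / s for e in exps]

# One hidden layer of 2 units, 2 output classes ("stop", "go").
layers = [
    ([[2.0, -1.0], [-1.5, 2.0]], [0.0, 0.0]),  # hidden layer
    ([[3.0, -2.0], [-2.0, 3.0]], [0.0, 0.0]),  # output layer
]
probs = mlp_forward([0.9, 0.1], layers)
print("stop" if probs[0] > probs[1] else "go")  # → stop
```

The class probabilities would then be mapped to ROS messages (e.g. steering or brake commands) for the simulated or real vehicle.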
Procedia PDF Downloads 92
3833 Disability, Technology and Inclusion: Fostering an Inclusive Pedagogical Approach in an Interdisciplinary Project
Authors: M. Lopez-Pereyra, I. Cisneros Alvarado, M. Del Socorro Lobato Alba
Abstract:
This paper discusses a conceptual, pedagogical approach that fosters inclusive education and creates awareness of the use of assistive technology in Mexico. Interdisciplinary understanding of disabilities and the use of assistive technology as a frame for inclusive education have challenged the reality of the researchers’ participation in decision-making. Drawing upon a pedagogical inquiry process within an interdisciplinary academic project involving the sciences, design, biotechnology, psychology and education fields, this paper discusses the challenges of assistive technology and inclusive education in an interdisciplinary research project on disabilities and technology. The study is framed by an educational action research design in which the team is interested in integrating disability, technology and inclusion, in theory and practice. Major findings include: (1) the concept of inclusive education as a strategy for interdisciplinary research; (2) inclusion as a pedagogical approach that challenges the creation of assistive technology from diverse academic fields; and (3) inclusion as a problem-focused frame for decision-making. The findings suggest that inclusive pedagogical approaches provide unique insight for interdisciplinary teams working on disability and assistive technology in education.
Keywords: assistive technology, inclusive education, inclusive pedagogy, interdisciplinary research
Procedia PDF Downloads 186
3832 Using Assistive Technologies in Teaching Children with Disabilities in Jordan: Teachers' Perceptions
Authors: Kholoud Adeeb Al-Dababneh
Abstract:
This study investigated teachers' perceptions of using assistive technologies in teaching children with disabilities in Jordan. The researcher developed a study instrument (questionnaire) to examine these perceptions, and the validity and reliability of the instrument were verified. A random sample of 260 teachers who teach children with disabilities participated by completing the questionnaire; fifteen teachers were later interviewed. Results revealed that teachers' use of assistive technology in teaching children with disabilities was high. The results also revealed statistically significant differences at (α = .05) according to the type of disability, in favor of teachers of children with specific learning disabilities (SLD), and according to educational setting, in favor of local public schools (inclusion settings). There were no statistically significant differences attributable to the teacher's level of education or gender. In light of these results, the researcher offers several recommendations and future implications.
Keywords: assistive technologies, children with disabilities, Jordan, teachers
Procedia PDF Downloads 110
3831 Divergences in Interpreters' Oral Interpretation among Pentecostal Churches: Sermonic Reflections
Authors: Rufus Olufemi Adebayo, Sylvia Phiwani Zulu
Abstract:
Interpreting in a setting of diverse languages and multicultural congregants is often understood as conveying the content of the message. Preaching, like any communication, takes people's multiple contexts seriously. Traditionally speaking, the one who provides the best insight into understanding "the other" in a multilingual context is the interpreter. Nonetheless, there are losses in spiritual communication, translation, and interpretive dialogue. No matter how eloquent the preacher is, an interpreter can make or mar the sermon (speech): the sermon the preacher preaches is not always the one the congregation hears from the interpreter. On other occasions, interpreting can lead not only to distorted messages but also to dissatisfied audiences and a preacher overshadowed by the antics of the interpreter. Using qualitative methodology, this paper explores the challenges and conventional assumptions about preachers' interpreters as influenced by spirituality, culture, and language, from empirical and theoretical perspectives. An emphasis on biased translation and on constructions of reality that suppress or devalue spiritual communication is examined. The results indicate that interpretation of the declaration of guilt, the history of the congregation, spirituality, attitudes, morals, customs, the specific practices of a preacher, education, and the environment become entangled and lead to misinterpretation. The article concludes by re-examining these qualities and rearticulating them into a preliminary theory for practice, as distinguished from theory, which could enhance the development of more sustainable multilingual interpretation in South African Pentecostal churches.
Keywords: congregants, divergences, interpreting/translation, language and communication, sermon/preaching
Procedia PDF Downloads 159
3830 Control Strategies for a Robot for Interaction with Children with Autism Spectrum Disorder
Authors: Vinicius Binotte, Guilherme Baldo, Christiane Goulart, Carlos Valadão, Eliete Caldeira, Teodiano Bastos
Abstract:
Socially assistive robotics has become increasingly active and is present in therapies for people affected by several neurobehavioral conditions, such as Autism Spectrum Disorder (ASD). In fact, robots have played a significant role in positive interaction with children with ASD by stimulating their social and cognitive skills. This work introduces a mobile socially assistive robot built for interaction with children with ASD, using non-linear control techniques for this interaction.
Keywords: socially assistive robotics, mobile robot, autonomous control, autism
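The abstract does not detail which non-linear control technique the mobile robot uses. One widely taught non-linear approach for driving a differential-drive (unicycle) robot to a goal pose is the polar-coordinate kinematic controller; the sketch below, with illustrative gains, shows the control law and a simple forward simulation. All gains and the goal pose are assumptions for demonstration, not values from the paper.

```python
import math

# Illustrative gains; local stability requires k_rho > 0, k_beta < 0,
# and k_alpha - k_rho > 0.
K_RHO, K_ALPHA, K_BETA = 0.5, 1.5, -0.3

def pose_controller(x, y, theta, x_g=0.0, y_g=0.0, theta_g=0.0):
    """Non-linear polar-coordinate controller steering a unicycle robot
    from pose (x, y, theta) toward a goal pose. Returns (v, omega)."""
    dx, dy = x_g - x, y_g - y
    rho = math.hypot(dx, dy)                              # distance to goal
    alpha = math.atan2(dy, dx) - theta                    # heading error
    alpha = math.atan2(math.sin(alpha), math.cos(alpha))  # wrap to [-pi, pi]
    beta = theta_g - theta - alpha                        # final-heading error
    beta = math.atan2(math.sin(beta), math.cos(beta))
    v = K_RHO * rho                       # slow down as the goal nears
    omega = K_ALPHA * alpha + K_BETA * beta
    return v, omega

# Forward-simulate the unicycle kinematics toward the origin.
x, y, theta, dt = 2.0, 1.0, 0.0, 0.02
for _ in range(2000):
    v, omega = pose_controller(x, y, theta)
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += omega * dt
```

Because the linear velocity is proportional to the remaining distance, the robot decelerates smoothly near the child rather than stopping abruptly, which matters in interaction scenarios.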
Procedia PDF Downloads 495
3829 Authoring Tactile Gestures: Case Study for Emotion Stimulation
Authors: Rodrigo Lentini, Beatrice Ionascu, Friederike A. Eyssel, Scandar Copti, Mohamad Eid
Abstract:
The haptic modality has brought a new dimension to human-computer interaction by engaging the human sense of touch. However, designing appropriate haptic stimuli, and in particular tactile stimuli, for various applications remains challenging. To tackle this issue, we present an intuitive system that facilitates the authoring of tactile gestures for various applications. The system transforms a hand gesture into a tactile gesture that can be rendered using a home-made haptic jacket. A case study demonstrates the ability of the system to develop tactile gestures that are recognizable by human subjects. Four tactile gestures are identified and tested to intensify the following four emotional responses: high valence – high arousal, high valence – low arousal, low valence – high arousal, and low valence – low arousal. A usability study with 20 participants demonstrated high correlation between the selected tactile gestures and the intended emotional reactions. Results from this study can be used in a wide spectrum of applications, ranging from gaming to interpersonal communication and multimodal simulations.
Keywords: tactile stimulation, tactile gesture, emotion reactions, arousal, valence
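The four emotional targets above partition the valence–arousal plane into quadrants, each paired with one tactile gesture. A small sketch of that mapping follows; the gesture names, the 1–9 rating scale, and the midpoint threshold are hypothetical, since the abstract does not name the four gestures.

```python
# Hypothetical gesture names for each valence-arousal quadrant; the study
# identifies four tactile gestures but does not name them in the abstract.
QUADRANT_GESTURES = {
    ("high", "high"): "rapid_double_tap",   # high valence, high arousal
    ("high", "low"):  "slow_stroke",        # high valence, low arousal
    ("low", "high"):  "sharp_squeeze",      # low valence, high arousal
    ("low", "low"):   "faint_pulse",        # low valence, low arousal
}

def select_gesture(valence, arousal, midpoint=5.0):
    """Pick a tactile gesture for a target emotion given valence and
    arousal ratings on an assumed 1-9 scale (midpoint splits high/low)."""
    v = "high" if valence >= midpoint else "low"
    a = "high" if arousal >= midpoint else "low"
    return QUADRANT_GESTURES[(v, a)]

print(select_gesture(8, 2))  # high valence, low arousal -> slow_stroke
```

In the authoring system described above, the selected gesture identifier would then drive the actuation pattern rendered on the haptic jacket.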
Procedia PDF Downloads 365
3828 Students' Competencies in the Use of Computer Assistive Technology at Akropong School for the Blind in the Eastern Region of Ghana
Authors: Joseph Ampratwum, Yaw Nyadu Offei, Afua Ntoaduro, Frank Twum
Abstract:
The use of computer assistive technology has captured the attention of individuals with visual impairment. Children with visual impairments who are tactual learners have one unique need that is quite different from those of all other disability groups: they depend on computer assistive technology for reading, writing, and receiving and sending information. The objective of the study was to assess students' competencies in the use of computer assistive technology at Akropong School for the Blind in Ghana. This became necessary because little research has been conducted to document the competencies of, and challenges in, computer use among students with visual impairments in Africa. A case study design with a mixed research strategy was adopted. A purposive sampling technique was used to sample 35 students from Akropong School for the Blind in the Eastern Region of Ghana. The researcher gathered both quantitative and qualitative data to measure students' competencies in keyboarding skills and Job Access with Speech (JAWS), as well as other challenges. The findings indicated that students' competency in keyboarding skills was higher than in the use of the JAWS application; students had reached higher stages in the conscious-competence matrix in the former than in the latter. It was generally noted that the challenges limiting effective use of computer assistive technology in the school were more personal than external, because most stemmed from each individual's response to the training and familiarity in developing competencies in using computer assistive technology. Based on this, it was recommended that efforts be made to equip the laboratory with additional computers and, in line with this, that more practice time be created for students to maximize computer use. Licensed JAWS software should also be acquired by the school to advance students' competence in using computer assistive technology.
Keywords: computer assistive technology, job access with speech, keyboard, visual impairment
Procedia PDF Downloads 334