Search results for: tactile gesture
Paper Count: 159

159 Authoring Tactile Gestures: Case Study for Emotion Stimulation

Authors: Rodrigo Lentini, Beatrice Ionascu, Friederike A. Eyssel, Scandar Copti, Mohamad Eid

Abstract:

The haptic modality has brought a new dimension to human-computer interaction by engaging the human sense of touch. However, designing appropriate haptic stimuli, and in particular tactile stimuli, for various applications is still challenging. To tackle this issue, we present an intuitive system that facilitates the authoring of tactile gestures for various applications. The system transforms a hand gesture into a tactile gesture that can be rendered using a home-made haptic jacket. A case study is presented to demonstrate the ability of the system to develop tactile gestures that are recognizable by human subjects. Four tactile gestures are identified and tested to stimulate the following four emotional responses: high valence – high arousal, high valence – low arousal, low valence – high arousal, and low valence – low arousal. A usability study with 20 participants demonstrated a high correlation between the selected tactile gestures and the intended emotional reactions. Results from this study can be used in a wide spectrum of applications ranging from gaming to interpersonal communication and multimodal simulations.
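
As a rough illustration of how emotion-targeted vibration patterns of this kind might be parameterized, the sketch below maps each valence-arousal quadrant to an actuator intensity envelope. The pulse rates and envelope shapes are invented placeholders, not the gestures the authors designed:

```python
import numpy as np

# Hypothetical mapping from a valence-arousal quadrant to a vibration
# envelope for a haptic actuator: arousal drives pulse rate, valence
# drives the sharpness of the envelope. All parameter values are illustrative.
def quadrant_envelope(valence, arousal, duration_s=2.0, fs=200):
    """Return a [0, 1] actuator-intensity envelope sampled at fs Hz."""
    t = np.linspace(0.0, duration_s, int(duration_s * fs), endpoint=False)
    pulse_hz = 4.0 if arousal == "high" else 1.0      # faster pulses for high arousal
    base = 0.5 * (1.0 + np.sin(2.0 * np.pi * pulse_hz * t))
    if valence == "low":
        base = np.square(base)                        # sharper, "rougher" pulses
    return base

for v in ("high", "low"):
    for a in ("high", "low"):
        env = quadrant_envelope(v, a)
        print(f"valence={v:4s} arousal={a:4s} mean intensity={env.mean():.2f}")
```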

Keywords: tactile stimulation, tactile gesture, emotion reactions, arousal, valence

Procedia PDF Downloads 341
158 Influence of Tactile Symbol Size on Its Perceptibility in Consideration of Effect of Aging

Authors: T. Nishimura, K. Doi, H. Fujimoto, T. Wada

Abstract:

We conducted perception experiments on tactile symbols to elucidate the impact of symbol size on perceptibility. This study was based on the accessible design perspective and aimed at expanding the availability of tactile symbols for visually impaired people who are unable to read Braille. In particular, this study targeted people with acquired visual impairments as users of the tactile symbols. The subjects (young and elderly individuals) in this study had normal vision. They were asked to identify tactile symbols while unable to see their hands during the experiments. This study investigated the relation between the size and perceptibility of tactile symbols based on an examination using test pieces of the symbols in different sizes. The results revealed that the error rates for both young and elderly subjects converged to almost 0% when 12 mm tactile symbols were used. The findings also showed that the error rate was low and subjects could identify the symbols within 5 s when 16 mm tactile symbols were used.

Keywords: accessible design, tactile sense, tactile symbols, bioinformatic

Procedia PDF Downloads 318
157 The Design of Smart Tactile Textiles for Therapeutic Applications

Authors: Karen Hong

Abstract:

Smart tactile textiles are a series of textile-based products that incorporate embedded smart technology for tactile therapeutic applications, aimed at two main groups of target users. The first group is children with sensory processing disorder who suffer from tactile sensory dysfunction. Children with tactile sensory issues may have difficulty tolerating the sensations generated by the touch of certain fabric textures. A series of smart tactile textiles, collectively known as 'Tactile Toys', was developed as tactile therapy play objects, exposing children to different types of touch sensations within textiles and enabling them to enjoy tactile experiences through interactive play, which helps them overcome the fear of certain touch sensations. The second group is elderly or geriatric patients who suffer from a deteriorating sense of touch. Common consequences of aging include a deteriorating sense of touch and a decline in motor function. Focusing on stimulating the sense of touch for this group of end users, another series of smart tactile textiles, collectively known as 'Tactile Aids', was developed, also as tactile therapy. This range of products can help maintain touch sensitivity while allowing the elderly to enjoy interactive play that exercises their hand-eye coordination and enhances their motor skills. These smart tactile textile products have been designed and tested by the end users and have proved their efficacy as tactile therapy, enabling the users to lead a better quality of life.

Keywords: smart textiles, embedded technology, tactile therapy, tactile aids, tactile toys

Procedia PDF Downloads 149
156 Users’ Preferences for Map Navigation Gestures

Authors: Y. Y. Pang, N. A. Ismail

Abstract:

The map is a powerful and convenient tool that helps us navigate to different places, but the use of indirect devices often makes its usage cumbersome. This study proposes a new map navigation dialogue that uses hand gestures. A set of dialogues was developed from the users' perspective to give users complete freedom in panning, zooming, rotating, and finding directions. A participatory design experiment was conducted in which one-hand and two-hand gesture dialogues were analysed to develop a set of usable dialogues. The major finding was that users prefer one-hand gestures to two-hand gestures in map navigation.

Keywords: hand gesture, map navigation, participatory design, intuitive interaction

Procedia PDF Downloads 247
155 Sound Selection for Gesture Sonification and Manipulation of Virtual Objects

Authors: Benjamin Bressolette, Sébastien Denjean, Vincent Roussarie, Mitsuko Aramaki, Sølvi Ystad, Richard Kronland-Martinet

Abstract:

New sensors and technologies, such as microphones, touchscreens, and infrared sensors, are currently making their appearance in the automotive sector, introducing new kinds of Human-Machine Interfaces (HMIs). The interactions with such tools can be cognitively expensive and thus unsuitable for driving tasks. It could, for instance, be dangerous to use touchscreens with visual feedback while driving, as they draw the driver's visual attention away from the road. Furthermore, new technologies in car cockpits modify the interactions of the users with the central system. In particular, touchscreens are preferred to arrays of buttons for space and design reasons. However, the buttons' tactile feedback is no longer available to the driver, which makes such interfaces more difficult to manipulate while driving. Gestures combined with auditory feedback might therefore constitute an interesting alternative for interacting with the HMI. Indeed, gestures can be performed without vision, which means that the driver's visual attention can be fully dedicated to the driving task. The auditory feedback can inform the driver both about the task performed on the interface and about the performed gesture, which might compensate for the lack of tactile information. As audition is a relatively unused sense in automotive contexts, gesture sonification can contribute to reducing the cognitive load thanks to this multisensory exploitation. Our approach consists of using a virtual object (VO) to sonify the consequences of the gesture rather than the gesture itself. This approach is motivated by an ecological point of view: gestures do not make sound, but their consequences do. In this experiment, the aim was to identify efficient sound strategies for transmitting dynamic information about VOs to users through sound. The swipe gesture was chosen for this purpose, as it is commonly used in current and new interfaces. We chose two VO parameters to sonify: the hand-VO distance and the VO velocity. Two kinds of sound parameters can be chosen to sonify the VO behavior: spectral or temporal parameters. Pitch and brightness were tested as spectral parameters, and amplitude modulation as a temporal parameter. Performances showed a positive effect of sound compared to a no-sound situation, revealing the usefulness of sounds for accomplishing the task.
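
A minimal sketch of the two mapping families the abstract describes, with invented parameter ranges: hand-object distance drives pitch (a spectral parameter) and object velocity drives amplitude-modulation rate (a temporal parameter):

```python
import numpy as np

FS = 44100  # audio sample rate (Hz)

# Illustrative sonification: closer hand -> higher pitch; faster virtual
# object -> faster amplitude modulation. Ranges are placeholder assumptions.
def sonify(distance_m, velocity_ms, dur_s=0.5):
    t = np.arange(int(FS * dur_s)) / FS
    pitch_hz = np.clip(880.0 - 600.0 * distance_m, 220.0, 880.0)
    carrier = np.sin(2.0 * np.pi * pitch_hz * t)
    am_hz = np.clip(2.0 + 10.0 * velocity_ms, 2.0, 20.0)
    envelope = 0.5 * (1.0 + np.sin(2.0 * np.pi * am_hz * t))
    return (carrier * envelope).astype(np.float32)

buf = sonify(distance_m=0.3, velocity_ms=0.8)
print(buf.shape, buf.dtype)  # (22050,) float32, ready for any audio sink
```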

Keywords: auditory feedback, gesture sonification, sound perception, virtual object

Procedia PDF Downloads 270
154 Static and Dynamic Hand Gesture Recognition Using Convolutional Neural Network Models

Authors: Keyi Wang

Abstract:

Similar to the touchscreen, hand gesture based human-computer interaction (HCI) is a technology that could allow people to perform a variety of tasks faster and more conveniently. This paper proposes a method for training an image- and video-based hand gesture recognition system using convolutional neural networks (CNNs). An image dataset covering 6 hand gestures is used to train a 2D CNN model, achieving approximately 98% accuracy. Furthermore, a 3D CNN model is trained on a dataset of video clips covering 4 hand gestures, resulting in approximately 83% accuracy. It is demonstrated that a Cozmo robot loaded with the pre-trained models is able to recognize static and dynamic hand gestures.
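
A minimal 2D CNN of the kind described, sketched in Keras under assumed 64x64 grayscale inputs and 6 gesture classes; the layer sizes are illustrative, not the authors' architecture:

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Small 2D CNN for 6-way static gesture classification (illustrative).
def build_gesture_cnn(input_shape=(64, 64, 1), n_classes=6):
    return models.Sequential([
        layers.Conv2D(32, 3, activation="relu", input_shape=input_shape),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(n_classes, activation="softmax"),
    ])

model = build_gesture_cnn()
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
# model.fit(train_images, train_labels, epochs=10, validation_split=0.2)
```

The dynamic case would swap Conv2D/MaxPooling2D for their Conv3D counterparts over short frame stacks.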

Keywords: deep learning, hand gesture recognition, computer vision, image processing

Procedia PDF Downloads 106
153 Development of a Method to Prepare In-School Tactile Guide Maps for Visually Impaired School Children

Authors: K. Doi, T. Nishimura, M. Kawano, H. Fujimoto, Y. Tanaka, M. Sawada, S. Oouchi, T. Kaneko, K. Kanamori

Abstract:

As part of reasonable accommodation for people with disabilities in Japan, which has ratified the Convention on the Rights of Persons with Disabilities, tactile guide maps are necessary. Such maps can enable visually impaired children attending schools for special needs education (visual impairments) to grasp the arrangement of classrooms on their school campuses. However, it takes many years to learn to use a tactile guide map without difficulty. Thus, information support in which audio information is added to the tactile information is required. In the present research, a method to prepare an in-school tactile guide map with an additional audio reading function was developed, enabling visually impaired school children attending schools for special needs education (visual impairments) to grasp the arrangement of classrooms on their school campuses.

Keywords: accessible design, visually impaired, braille, tactile map, in-school tactile guide map

Procedia PDF Downloads 330
152 Tactile Cues and Spatial Navigation in Mice

Authors: Rubaiyea Uddin

Abstract:

The hippocampus, located in the limbic system, is most commonly known for its role in memory and spatial navigation (as cited in Brain Reward and Pathways). It plays an especially important role in episodic and declarative memory. The hippocampus has also recently been linked to dopamine, the reward pathway's primary neurotransmitter. Since research has found that dopamine also contributes to memory consolidation and hippocampal plasticity, this neurotransmitter is potentially responsible for contributing to the hippocampus's role in memory formation. In this experiment, we tested the effect of tactile cues on spatial navigation in eight mice. We used a radial arm maze that had one designated 'reward' arm containing sucrose. The presence or absence of bedding was our tactile cue. We examined whether the memory of that cue would enhance the mice's memory of having received the reward in that arm. The results showed no significant effect of tactile cues on spatial navigation in our 129-strain mice. Tactile cues therefore did not influence spatial navigation.

Keywords: mice, radial arm maze, memory, spatial navigation, tactile cues, hippocampus, reward, sensory skills, Alzheimer's, neurodegenerative disease

Procedia PDF Downloads 621
151 Evaluation Study of Easily Identification of Tactile Symbol on Body Soap Bottle

Authors: K. Doi, T. Nishimura, H. Fujimoto, Y. Hoshikawa, T. Wada

Abstract:

The Japanese Industrial Standards (JIS) association established a standard (JIS S 0021) on accessible packaging design for people with visual impairments and elderly people in 2000. The tactile symbol on shampoo bottles has since become a well-known and effectively used example of accessible packaging design. However, people with visual impairments have reported difficulty distinguishing the body soap bottle among three similar bottles: body soap, shampoo, and conditioner. The Japanese low vision association asked the JIS association to solve this problem. The JIS association and the Japan cosmetic industry association formed a review team for this purpose, and the review team asked our research team to propose a new tactile symbol for body soap bottles. We conducted user and manufacturer surveys on easily identifiable tactile symbols for body soap bottles, and seven test tactile symbol marks were selected from our proposed tactile symbols. In this study, we evaluate how easily the tactile symbols on body soap bottles can be identified. Six visually impaired subjects participated in our experiment. They were asked to identify the body soap bottle among three bottles: body soap, shampoo, and conditioner. The test tactile symbols were presented in random order and were produced using equipment we originally developed for raised 3D printing. From our study, the most easily identified tactile symbol marks were shortlisted from our proposed tactile symbols. This knowledge will be helpful in the revision of ISO 11156.

Keywords: tactile symbol, easy identification, body soap, people with visual impairments

Procedia PDF Downloads 287
150 Linear Regression Estimation of Tactile Comfort for Denim Fabrics Based on In-Plane Shear Behavior

Authors: Nazli Uren, Ayse Okur

Abstract:

Tactile comfort of a textile product is an essential property and a major concern when it comes to customer perceptions and preferences. The subjective nature of comfort and the difficulties of simulating human hand sensory feelings make it hard to establish a well-accepted link between tactile comfort and objective evaluations. On the other hand, the shear behavior of a fabric is a mechanical parameter that can be measured by various objective test methods. The principal aim of this study is to determine the tactile comfort of commercially available denim fabrics by subjective measurements, create a tactile score database for denim fabrics, and investigate the relations between tactile comfort and shear behavior. The in-plane shear behavior of 17 commercially available denim fabrics with a variety of raw materials and weave structures was measured with a custom-designed shear frame and the conventional bias extension method in the two corresponding diagonal directions. The tactile comfort of the denim fabrics was determined via subjective customer evaluations. These relations were statistically investigated and expressed as regression equations. The analyses showed considerably high correlation coefficients between tactile comfort and shear behavior, and the suggested regression equations were found to be statistically significant. Accordingly, it was concluded that the tactile comfort of denim fabrics can be estimated with high precision from in-plane shear behavior measurements.
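
A sketch of the estimation step with scikit-learn, using invented shear-rigidity readings and comfort scores in place of the paper's 17-fabric dataset:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Illustrative only: fabricated shear-rigidity readings (e.g. from a
# bias-extension test) paired with mean subjective comfort scores.
shear_rigidity = np.array([[0.42], [0.55], [0.31], [0.68], [0.49], [0.60]])
comfort_score  = np.array([ 3.9,    3.1,    4.4,    2.6,    3.5,    2.9 ])

reg = LinearRegression().fit(shear_rigidity, comfort_score)
r2 = reg.score(shear_rigidity, comfort_score)
print(f"comfort ~ {reg.intercept_:.2f} + {reg.coef_[0]:.2f} * shear rigidity, R2 = {r2:.3f}")
```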

Keywords: denim fabrics, in-plane shear behavior, linear regression estimation, tactile comfort

Procedia PDF Downloads 270
149 Exposure to Tactile Cues Does Not Influence Spatial Navigation in 129 S1/SvLm Mice

Authors: Rubaiyea Uddin, Rebecca Taylor, Emily Levesque

Abstract:

The hippocampus, located in the limbic system, is most commonly known for its role in memory and spatial navigation (as cited in Brain Reward and Pathways). It plays an especially important role in episodic and declarative memory. The hippocampus has also recently been linked to dopamine, the reward pathway's primary neurotransmitter. Since research has found that dopamine also contributes to memory consolidation and hippocampal plasticity, this neurotransmitter is potentially responsible for contributing to the hippocampus's role in memory formation. In this experiment, we tested the effect of tactile cues on spatial navigation in eight mice. We used a radial arm maze that had one designated "reward" arm containing sucrose. The presence or absence of bedding was our tactile cue. We examined whether the memory of that cue would enhance the mice's memory of having received the reward in that arm. The results showed no significant effect of tactile cues on spatial navigation in our 129 S1/SvLm mice. Tactile cues therefore did not influence spatial navigation.

Keywords: mice, radial arm maze, memory, spatial navigation, tactile cues, hippocampus, reward, sensory skills, Alzheimer's, neurodegenerative diseases

Procedia PDF Downloads 650
148 Teachers’ Perceptions on Communicating with Students Who Are Deaf-Blind in Regular Classes

Authors: Phillimon Mahanya

Abstract:

Learners with deaf-blindness use touch to communicate. However, teachers are not well versed in the technicalities of tactile communication, and this lack of technical know-how is compounded by a lack of standardisation of tactile signs the world over. This study thus arose from the need for efficient and effective tactile sign communication for learners who are deaf-blind. A qualitative approach with a case study design was used. A sample of 22 participants comprising school administrators and teachers was purposively drawn from institutions that enrol learners who are deaf-blind. Data generated through semi-structured interviews, non-participant observations and document analysis were thematically analysed. It emerged that administrators and teachers used mammoth and solo touches that are not standardised to communicate with learners who are deaf-blind. It is recommended that a standardised tactile sign manual be developed in Zimbabwe to promote the inclusion of learners who are deaf-blind.

Keywords: communication, deaf-blind, signing, tactile

Procedia PDF Downloads 199
147 CONDUCTHOME: Gesture Interface Control of Home Automation Boxes

Authors: J. Branstett, V. Gagneux, A. Leleu, B. Levadoux, J. Pascale

Abstract:

This paper presents CONDUCTHOME, an interface that controls home automation systems with a Leap Motion using 'invariant gesture protocols'. The function of this interface is to simplify the user's interaction with the home environment. A hardware part allows the Leap Motion to be carried around the house, while a software part interacts with the home automation box and displays useful information for the user. An objective of this work is the development of a natural, invariant, and simple gesture control interface to help elderly people and people with disabilities.

Keywords: automation, ergonomics, gesture recognition, interoperability

Procedia PDF Downloads 392
146 Muscle: The Tactile Texture Designed for the Blind

Authors: Chantana Insra

Abstract:

The research focuses on creating prototype tactile-texture media of human muscles for educational institutes, to help visually impaired students learn massage beyond the ordinary curriculum. This media is designed as an extra learning material. The population in this study was 30 blind students in grades 4-6 who were able to read Braille, chosen by purposive sampling. The research was conducted during the second semester of 2012 at the Bangkok School for the Blind. The methodology includes collecting data related to visually impaired people, the production of tactile-texture media, human anatomy, and Thai traditional massage from literature reviews and field studies. This information was used to analyze and design 14 tactile-texture pictures, which were presented to experts to evaluate and test the media.

Keywords: blind, tactile texture, muscle, visual arts and design

Procedia PDF Downloads 247
145 Basic Examination of Easily Distinguishable Tactile Symbols Attached to Containers and Packaging

Authors: T. Nishimura, K. Doi, H. Fujimoto, Y. Hoshikawa, T. Wada

Abstract:

In Japan, reasonable accommodation for persons with disabilities is expected to progress further. In particular, there is an urgent need to enhance information support for visually impaired persons, who have difficulty accessing information. Recently, tactile symbols have been attached to various surfaces, such as the content labels on containers and packaging of various everyday products. The advantage of tactile symbols is that they are useful for visually impaired persons who cannot read Braille. The method of displaying tactile symbols is prescribed by the International Organization for Standardization (ISO). However, quantitative data on the shapes and dimensions of tactile symbols are insufficient. In this study, through evaluation experiments, we examine easily distinguishable shapes and dimensions of tactile symbols used for various applications, including the content labels on containers and packaging. Visually impaired persons who use tactile symbols on a daily basis participated in the experiments. The details and procedures of the experiments were orally explained to the participants beforehand, and their informed consent was obtained. They were instructed to touch the test pieces of tactile symbols freely with both hands. These tactile symbols were selected as likely to be easily distinguishable on the top-surface content labels of containers and packaging, based on a hearing survey involving employees of an organization of visually impaired persons and a social welfare corporation, as well as academic experts in support technology for the visually impaired. The participants then rated the ease of distinguishing the tactile symbols on a scale of 1 ('difficult to distinguish') to 5 ('easy to distinguish'). Follow-up hearing surveys with free oral answers were also conducted with the participants after the experiments. This study revealed the shapes and dimensions that make tactile symbols attached to containers and packaging easily distinguishable. We expect this knowledge to contribute to improving the quality of life of visually impaired persons.

Keywords: visual impairment, accessible design, tactile symbol, containers and packaging

Procedia PDF Downloads 191
144 Buddha Images in Mudras Representing Days of a Week: Tactile Texture Design for the Blind

Authors: Chantana Insra

Abstract:

The research "Buddha Images in Mudras Representing Days of a Week: Tactile Texture Design for the Blind" aims to provide an original tactile format to institutions for the blind as supplementary textbooks for accumulating Buddhist knowledge through extracurricular learning. The study involved 33 students with total or partial blindness, the latter able to read Braille, in elementary grades 4-6, studying in the second semester of the 2013 academic year at the Bangkok School for the Blind. The researcher selected the sample purposively and studied data acquired from both documents and fieldwork related to the blind, tactile format production, and Buddha images in mudras representing the days of the week. The formats were then analyzed and designed to yield eight tactile pictures of Buddha images in mudras representing the days of the week, which experts subsequently evaluated and tried out.

Keywords: blind, tactile texture, Thai Buddha images, Mudras, texture design

Procedia PDF Downloads 326
143 Vibro-Tactile Equalizer for Musical Energy-Valence Categorization

Authors: Dhanya Nair, Nicholas Mirchandani

Abstract:

Musical haptic systems can enhance a listener's musical experience while providing an alternative platform for the hearing impaired to experience music. Current music tactile technologies focus on representing tactile metronomes to synchronize performers or on encoding musical notes into distinguishable (albeit distracting) tactile patterns. There is growing interest in the development of musical haptic systems to augment the auditory experience, although the haptic-music relationship is still not well understood. This paper presents a tactile music interface that provides vibrations to multiple fingertips in synchrony with auditory music. Like an audio equalizer, different frequency bands are filtered out, and the power in each frequency band is computed and converted to a corresponding vibrational strength. These vibrations are felt on different fingertips, each corresponding to a different frequency band. Songs with music from different spectrums, as classified by their energy and valence, were used to test the effectiveness of the system and to understand the relationship between music and tactile sensations. Three participants were trained on one song categorized as sad (low energy and low valence score) and one song categorized as happy (high energy and high valence score). They were trained both with and without auditory feedback (listening to the song while experiencing the tactile music on their fingertips, and then experiencing the vibrations alone without the music). The participants were then tested on three songs from both categories, without any auditory feedback, and were asked to classify the tactile vibrations they felt into either category. The participants were blinded to the songs being tested and were not given any feedback on the accuracy of their classifications. They were able to classify the music with 100% accuracy. Although the songs tested were on two opposite spectrums (sad/happy), these preliminary results show the potential of a vibrotactile equalizer like the one presented for augmenting the musical experience while furthering the current understanding of the music-tactile relationship.
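
A minimal sketch of the equalizer stage: band-pass filter the audio, compute per-band RMS power, and normalize it into per-fingertip vibration strengths. The band edges are placeholders, as the abstract does not specify them:

```python
import numpy as np
from scipy.signal import butter, sosfilt

FS = 44100
# Hypothetical band edges, one band per fingertip actuator.
BANDS_HZ = [(20, 250), (250, 1000), (1000, 4000), (4000, 12000)]

def band_intensities(audio, fs=FS):
    """Split audio into bands and return one RMS 'vibration strength' each."""
    strengths = []
    for lo, hi in BANDS_HZ:
        sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
        band = sosfilt(sos, audio)
        strengths.append(float(np.sqrt(np.mean(band ** 2))))
    peak = max(strengths) or 1.0
    return [s / peak for s in strengths]   # normalized actuator drive levels

t = np.arange(FS) / FS
demo = np.sin(2 * np.pi * 440 * t) + 0.3 * np.sin(2 * np.pi * 5000 * t)
print(band_intensities(demo))  # strongest response in the 250-1000 Hz band
```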

Keywords: haptic music relationship, tactile equalizer, tactile music, vibrations and mood

Procedia PDF Downloads 145
142 Roughness Discrimination Using Bioinspired Tactile Sensors

Authors: Zhengkun Yi

Abstract:

Surface texture discrimination using artificial tactile sensors has attracted increasing attention in the past decade, as it can endow technical and robot systems with a key missing ability. However, roughness, a major component of texture, has rarely been explored. This paper presents an approach for tactile surface roughness discrimination that includes two parts: (1) design and fabrication of a bioinspired artificial fingertip, and (2) tactile signal processing for surface roughness discrimination. The bioinspired fingertip comprises two polydimethylsiloxane (PDMS) layers, a polymethyl methacrylate (PMMA) bar, and two perpendicular polyvinylidene difluoride (PVDF) film sensors. This artificial fingertip mimics human fingertips in three aspects: (1) the elastic properties of the epidermis and dermis in human skin are replicated by the two PDMS layers with different stiffness, (2) the PMMA bar serves a role analogous to that of a bone, and (3) the PVDF film sensors emulate Meissner's corpuscles in terms of both location and response to vibratory stimuli. Various extracted features and classification algorithms, including support vector machines (SVM) and k-nearest neighbors (kNN), are examined for tactile surface roughness discrimination. Eight standard rough surfaces with roughness values (Ra) of 50 μm, 25 μm, 12.5 μm, 6.3 μm, 3.2 μm, 1.6 μm, 0.8 μm, and 0.4 μm are explored. The highest classification accuracy, (82.6 ± 10.8)%, is achieved using solely one PVDF film sensor with a kNN (k = 9) classifier and the standard deviation feature.
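
A sketch of the reported best configuration, a kNN (k = 9) classifier on the standard-deviation feature, using synthetic stand-in signals whose variability grows with roughness (illustrative only, not the authors' data):

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-in for the PVDF signals: one sliding trial per row.
ra_values = [50, 25, 12.5, 6.3, 3.2, 1.6, 0.8, 0.4]   # Ra in μm, as in the abstract
X, y = [], []
for label, ra in enumerate(ra_values):
    for _ in range(20):                                # 20 trials per surface
        signal = rng.normal(0.0, 0.05 + 0.01 * ra, size=500)
        X.append([signal.std()])                       # standard deviation feature
        y.append(label)

knn = KNeighborsClassifier(n_neighbors=9)              # k = 9, as reported
scores = cross_val_score(knn, np.array(X), np.array(y), cv=5)
print(f"accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```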

Keywords: bioinspired fingertip, classifier, feature extraction, roughness discrimination

Procedia PDF Downloads 273
141 Hand Gestures Based Emotion Identification Using Flex Sensors

Authors: S. Ali, R. Yunus, A. Arif, Y. Ayaz, M. Baber Sial, R. Asif, N. Naseer, M. Jawad Khan

Abstract:

In this study, we propose a gesture-to-emotion recognition method using flex sensors mounted on the metacarpophalangeal joints. The flex sensors are fixed in a wearable glove, and the data from the glove are sent to a PC over Wi-Fi. Four gestures (finger pointing, thumbs up, fist open, and fist close) were performed by five subjects. Each gesture is categorized into a sad, happy, or excited class based on the velocity and acceleration of the hand movement. Seventeen inspectors observed the emotions and hand gestures of the five subjects. The emotional states based on the inspectors' assessments were compared with those derived from the acquired movement speed data. Overall, we achieved 77% accuracy. The proposed design can therefore be used for emotional state detection applications.
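
A minimal sketch of the velocity/acceleration rule the abstract implies; the thresholds and units are invented placeholders, not the authors' values:

```python
# Map hand-movement dynamics to a sad / happy / excited label.
# Thresholds are hypothetical; velocity in m/s, acceleration in m/s^2.
def classify_emotion(peak_velocity, peak_acceleration):
    if peak_velocity < 0.3 and peak_acceleration < 1.0:
        return "sad"        # slow, gentle motion
    if peak_velocity < 0.8:
        return "happy"      # moderate motion
    return "excited"        # fast, abrupt motion

for v, a in [(0.1, 0.5), (0.5, 2.0), (1.2, 6.0)]:
    print(v, a, "->", classify_emotion(v, a))
```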

Keywords: emotion identification, emotion models, gesture recognition, user perception

Procedia PDF Downloads 242
140 Development of Sound Tactile Interface by Use of Human Sensation of Stiffness

Authors: K. Doi, T. Nishimura, M. Umeda

Abstract:

There are very few sound interfaces that hearing and hearing-impaired people can use to play together. In this study, we developed a sound tactile interface that makes use of the human sensation of stiffness. The interface comprises eight elastic, column-shaped objects of varying stiffness. When a user with or without a hearing disability presses an elastic object, a different sound is produced depending on the stiffness of the object; the sounds used were the 'Do Re Mi' notes. The interface has the major advantage that people with or without hearing disabilities can play with it. We found that users were able to recognize the stiffness sensation and relate it to the corresponding Do Re Mi sounds.

Keywords: tactile sense, sound interface, stiffness perception, elastic object

Procedia PDF Downloads 256
139 Investigating the Online Effect of Language on Gesture in Advanced Bilinguals of Two Structurally Different Languages in Comparison to L1 Native Speakers of L2, Exploring Whether Bilinguals Will Follow Target L2 Patterns in Speech and Co-Speech Gesture

Authors: Armita Ghobadi, Samantha Emerson, Seyda Ozcaliskan

Abstract:

Being a bilingual involves mastery of both speech and gesture patterns in a second language (L2). We know from earlier work in first language (L1) production contexts that speech and co-speech gesture form a tightly integrated system: co-speech gesture mirrors the patterns observed in speech, suggesting an online effect of language on the nonverbal representation of events in gesture during the act of speaking (i.e., 'thinking for speaking'). Relatively less is known about the online effect of language on gesture in bilinguals speaking structurally different languages. The few existing studies, mostly with small sample sizes, suggest inconclusive findings: some show greater attainment of L2 patterns in gesture with more advanced L2 speech production, while others show preferences for L1 gesture patterns even in advanced bilinguals. In this study, we focus on advanced bilingual speakers of two structurally different languages (Spanish L1 with English L2) in comparison to L1 English speakers. We ask whether bilingual speakers follow target L2 patterns not only in speech but also in gesture, or alternatively follow L2 patterns in speech but resort to L1 patterns in gesture. We examined this question by studying the speech and gestures produced by 23 advanced adult Spanish (L1)-English (L2) bilinguals (Mage=22; SD=7) and 23 monolingual English speakers (Mage=20; SD=2). Participants were shown 16 animated motion event scenes that included distinct manner and path components (e.g., "run over the bridge"). We recorded and transcribed all participant responses for speech and segmented them into sentence units that included at least one motion verb and its associated arguments. We also coded all gestures that accompanied each sentence unit. We focused on motion event descriptions because they show strong crosslinguistic differences in the packaging of motion elements in speech and co-speech gesture in first language production contexts. English speakers synthesize manner and path into a single clause or gesture (he runs over the bridge; running fingers forward), while Spanish speakers express each component separately (manner-only: el corre='he is running', with arms circled next to the body conveying running; path-only: el cruza el puente='he crosses the bridge', with a finger traced forward conveying trajectory). We tallied all responses by group and packaging type, separately for speech and co-speech gesture. Our preliminary results (n=4/group) showed that productions in English L1 and Spanish L1 differed, with a greater preference for conflated packaging in L1 English and separated packaging in L1 Spanish, a pattern that was also largely evident in co-speech gesture. Bilinguals' production in L2 English, however, followed the patterns of the target language in speech, with a greater preference for conflated packaging, but not in gesture. Bilinguals used separated and conflated strategies in gesture at roughly similar rates in their L2 English, showing an effect of both L1 and L2 on co-speech gesture. Our results suggest that online production of an L2 has more limited effects on L2 gestures, and that mastery of native-like patterns in L2 gesture may take longer than mastery of native-like L2 speech patterns.

Keywords: bilingualism, cross-linguistic variation, gesture, second language acquisition, thinking for speaking hypothesis

Procedia PDF Downloads 46
138 A Holographic Infotainment System for Connected and Driverless Cars: An Exploratory Study of Gesture Based Interaction

Authors: Nicholas Lambert, Seungyeon Ryu, Mehmet Mulla, Albert Kim

Abstract:

In this paper, an interactive in-car interface called HoloDash is presented. It is intended to provide information and infotainment in both autonomous vehicles and 'connected cars', vehicles equipped with Internet access via cellular services. The research focuses on the development of interactive avatars for this system and its gesture-based control system. This is a case study in the development of a possible human-centred means of presenting a connected or autonomous vehicle's On-Board Diagnostics through a projected 'holographic' infotainment system. This system is termed a Holographic Human Vehicle Interface (HHIV), as it utilises a dashboard projection unit and gesture detection. The research also examines the suitability of gestures in an automotive environment, given that they might be used in both driver-controlled and driverless vehicles. Using Human-Centred Design methods, questions were posed to test subjects, and preferences were discovered regarding the gesture interface and the user experience of passengers within the vehicle. These affirm the benefits of this mode of visual communication for both connected and driverless cars.

Keywords: gesture, holographic interface, human-computer interaction, user-centered design

Procedia PDF Downloads 283
137 Visualization-Based Feature Extraction for Classification in Real-Time Interaction

Authors: Ágoston Nagy

Abstract:

This paper introduces a method of using unsupervised machine learning to visualize the feature space of a dataset in 2D, in order to find the most characteristic segments in the set. After dimensionality reduction, users can select clusters by manual drawing. The selected clusters are recorded into a data model that is used for later predictions on real-time data. Predictions are made with supervised learning, using the Gesture Recognition Toolkit. The paper introduces two example applications: a semantic audio organizer for analyzing incoming sounds, and a gesture database organizer in which gestural data recorded by a Leap Motion are visualized for further manipulation.
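
An analogous pipeline sketched in Python with scikit-learn (the Gesture Recognition Toolkit itself is a C++ library): reduce gesture features to 2D, stand in for the user's manual cluster selection with a simple region rule, then train a classifier for later predictions:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(1)

# Placeholder high-dimensional gesture features forming three fake clusters.
features = rng.normal(size=(300, 12))
features[100:200] += 3.0
features[200:] -= 3.0

embedding = PCA(n_components=2).fit_transform(features)   # 2D view for the user

# Stand-in for manual cluster selection by drawing: label by region of the view.
labels = np.where(embedding[:, 0] > 1.5, 2,
                  np.where(embedding[:, 0] < -1.5, 0, 1))

# Supervised model used for later real-time predictions.
clf = KNeighborsClassifier(n_neighbors=5).fit(features, labels)
new_sample = rng.normal(size=(1, 12)) + 3.0
print("predicted cluster:", clf.predict(new_sample)[0])
```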

Keywords: gesture recognition, machine learning, real-time interaction, visualization

Procedia PDF Downloads 320
136 Prototyping a Portable, Affordable Sign Language Glove

Authors: Vidhi Jain

Abstract:

Communication between speakers and non-speakers of American Sign Language (ASL) can be problematic, inconvenient, and expensive. This project attempts to bridge the communication gap by designing a portable glove that captures the user's ASL gestures and outputs the translated text on a smartphone. The glove is equipped with flex sensors, contact sensors, and a gyroscope to measure the flexion of the fingers, the contact between fingers, and the rotation of the hand. The glove's Arduino UNO microcontroller analyzes the sensor readings to identify the gesture from a library of learned gestures, and a Bluetooth module transmits the result to a smartphone. Using this device, speakers of ASL may one day be able to communicate with others in an affordable and convenient way.
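
A sketch of the matching step against a library of learned gestures, using nearest-neighbor distance on a combined sensor vector; the template values and threshold are invented for illustration:

```python
import numpy as np

# Each template: five flex values, a contact bit, and a rotation value
# (all hypothetical), concatenated into one reading vector.
GESTURE_LIBRARY = {
    "A": np.array([0.9, 0.9, 0.9, 0.9, 0.2, 1.0, 0.0]),
    "B": np.array([0.1, 0.1, 0.1, 0.1, 0.8, 0.0, 0.0]),
    "L": np.array([0.1, 0.9, 0.9, 0.9, 0.1, 0.0, 0.0]),
}

def match_gesture(reading, max_distance=0.8):
    """Return the closest library gesture, or None if nothing is close."""
    best, best_d = None, float("inf")
    for name, template in GESTURE_LIBRARY.items():
        d = float(np.linalg.norm(reading - template))
        if d < best_d:
            best, best_d = name, d
    return best if best_d <= max_distance else None

live = np.array([0.85, 0.95, 0.9, 0.88, 0.25, 1.0, 0.0])
print(match_gesture(live))  # -> "A"
```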

Keywords: sign language, morse code, convolutional neural network, American sign language, gesture recognition

Procedia PDF Downloads 20
135 Defect Localization and Interaction on Surfaces with Projection Mapping and Gesture Recognition

Authors: Qiang Wang, Hongyang Yu, MingRong Lai, Miao Luo

Abstract:

This paper presents a method for accurately localizing and interacting with known surface defects by overlaying patterns onto real-world surfaces using a projection system. Given the world coordinates of the defects, we project corresponding patterns onto the surfaces, providing an intuitive visualization of the specific defect locations. To enable users to interact with and retrieve more information about individual defects, we implement a gesture recognition system based on a pruned and optimized version of YOLOv6. This lightweight model achieves an accuracy of 82.8% and is suitable for deployment on low-performance devices. Our approach demonstrates the potential for enhancing defect identification, inspection processes, and user interaction in various applications.
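
One common way to realize the mapping from surface coordinates to projector pixels is a calibrated homography; the paper's exact method is not given, so the sketch below assumes four known surface-to-projector correspondences, with invented coordinates:

```python
import numpy as np
import cv2

# Calibration: four surface points (mm) and where they land in projector pixels.
surface_pts = np.array([[0, 0], [500, 0], [500, 300], [0, 300]], dtype=np.float32)
proj_pts    = np.array([[102, 85], [1180, 96], [1165, 710], [95, 698]], dtype=np.float32)

H, _ = cv2.findHomography(surface_pts, proj_pts)

# Map known defect locations (mm) into projector pixels for overlay patterns.
defects_mm = np.array([[[120.0, 45.0]], [[390.0, 220.0]]], dtype=np.float32)
defects_px = cv2.perspectiveTransform(defects_mm, H)
for (x, y), (u, v) in zip(defects_mm[:, 0], defects_px[:, 0]):
    print(f"defect at ({x:.0f}, {y:.0f}) mm -> projector pixel ({u:.0f}, {v:.0f})")
```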

Keywords: defect localization, projection mapping, gesture recognition, YOLOv6

Procedia PDF Downloads 51
134 Patient-Friendly Hand Gesture Recognition Using AI

Authors: K. Prabhu, K. Dinesh, M. Ranjani, M. Suhitha

Abstract:

During the COVID-19 pandemic, hospitalized patients often found it difficult to convey what they wanted or needed to an attendant, and sometimes no attendant was present. In such cases, patients can use simple hand gestures to control electrical appliances (for example, switching a zero-watt bulb) and three further gestures to trigger voice-note notifications. In this AI-based hand gesture recognition project, a NodeMCU performs the control action of the relay; it is connected to Firebase for storing values in the cloud and is interfaced with the Python code via a Raspberry Pi. For three of the hand gestures, a voice clip is played to notify the attendant, generated with Google's text-to-speech service and the built-in audio playback of the Raspberry Pi 4. All five gestures are detected when shown via the webcam placed for gesture detection. A personal computer is used for displaying the gestures and for running the code on the Raspberry Pi.
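
A minimal sketch of the gesture-dispatch logic described, with hypothetical gesture labels and file names; the relay flag stands in for the value a cloud-connected NodeMCU would poll:

```python
# Hypothetical gesture label -> pre-generated voice-note file.
VOICE_NOTES = {
    "need_water": "need_water.mp3",
    "call_nurse": "call_nurse.mp3",
    "need_help": "need_help.mp3",
}

relay_state = {"bulb": 0}   # flag the cloud-connected relay side would read

def play(note_file):
    # On the Raspberry Pi this would hand the file to an audio player;
    # here we just log the action so the sketch runs anywhere.
    print("playing voice note:", note_file)

def handle_gesture(label):
    if label == "toggle_bulb":
        relay_state["bulb"] ^= 1
        print("relay state ->", relay_state["bulb"])
    elif label in VOICE_NOTES:
        play(VOICE_NOTES[label])

handle_gesture("toggle_bulb")
handle_gesture("call_nurse")
```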

Keywords: nodeMCU, AI technology, gesture, patient

Procedia PDF Downloads 130
133 Hand Motion and Gesture Control of Laboratory Test Equipment Using the Leap Motion Controller

Authors: Ian A. Grout

Abstract:

In this paper, the design and development of a system to provide hand motion and gesture control of laboratory test equipment are considered and discussed. The Leap Motion controller is used to provide an input for controlling a laboratory power supply as part of an electronic circuit experiment. Through suitable hand motions and gestures, the power supply is controlled remotely and without the need to physically touch the equipment. As such, the system provides an alternative manner of controlling electronic equipment via a PC and is considered here within the field of human-computer interaction (HCI).

Keywords: control, hand gesture, human computer interaction, test equipment

Procedia PDF Downloads 291
132 Automatic Detection of Suicidal Behaviors Using an RGB-D Camera: Azure Kinect

Authors: Maha Jazouli

Abstract:

Suicide is one of the leading causes of death in the prison environment, both in Canada and internationally. Rates of suicide attempts and self-harm have been rising in recent years, with hanging being the most frequent method. The objective of this article is to propose a method for automatically detecting suicidal behaviors in real time. We present a gesture recognition system that consists of three modules: model-based movement tracking, feature extraction, and gesture recognition using machine learning algorithms (MLA). Our proposed system gives satisfactory results. This smart video surveillance system can assist staff responsible for the safety and health of inmates by alerting them when suicidal behavior is detected, helping to reduce mortality rates and save lives.
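
A sketch of the classification module on synthetic joint-derived features, here with an SVM (one of the machine learning algorithms the abstract leaves unspecified); all data, dimensions, and class proportions are invented:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)

# Each row: a feature vector derived from tracked joints over a short
# window, e.g. joint velocities and inter-joint distances (placeholders).
X_normal = rng.normal(0.0, 1.0, size=(200, 16))
X_alert  = rng.normal(1.5, 1.2, size=(40, 16))         # rarer alert class
X = np.vstack([X_normal, X_alert])
y = np.array([0] * 200 + [1] * 40)

# class_weight="balanced" compensates for the class imbalance.
clf = make_pipeline(StandardScaler(),
                    SVC(kernel="rbf", class_weight="balanced"))
clf.fit(X, y)
print("alert?", bool(clf.predict(rng.normal(1.5, 1.2, size=(1, 16)))[0]))
```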

Keywords: suicide detection, Kinect azure, RGB-D camera, SVM, machine learning, gesture recognition

Procedia PDF Downloads 150
131 Real-Time Gesture Recognition System Using Microsoft Kinect

Authors: Ankita Wadhawan, Parteek Kumar, Umesh Kumar

Abstract:

A gesture is any body movement that expresses an attitude or sentiment. Gestures, as sign language, are used by deaf people to convey messages, which helps eliminate the communication barrier between deaf and hearing persons. Nowadays, mobile phones and computers are essential gadgets in everyday life, but for some people who are blind or deaf, using such devices is very difficult. There is therefore an immense need for a system that accepts body gestures or sign language as input. In this research, the Microsoft Kinect sensor, SDK V2, and the Hidden Markov Model Toolkit (HTK) are used to recognize objects, object motion, and human body joints through a touchless Natural User Interface (NUI) in real time. The depth data collected from the Microsoft Kinect have been used to recognize gestures of Indian Sign Language (ISL). The recorded clips are analyzed using depth, IR, and skeletal data at different angles and positions. The proposed system has an average accuracy of 85%. The developed touchless NUI provides an interface that recognizes gestures and controls the cursor and click operations on a computer simply through hand gestures. This research will help deaf people to use mobile phones and computers and to socialize with other people.
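
An analogous sketch in Python with hmmlearn (HTK itself is a separate C toolkit): train one Gaussian HMM per gesture on synthetic joint-trajectory sequences and classify a new sequence by the highest log-likelihood:

```python
import numpy as np
from hmmlearn import hmm

rng = np.random.default_rng(3)

# Synthetic joint-trajectory sequences: cumulative drift stands in for
# real skeletal data; dimensions and drifts are invented.
def make_sequences(drift, n_seq=20, seq_len=30, dim=6):
    seqs = [np.cumsum(rng.normal(drift, 0.1, size=(seq_len, dim)), axis=0)
            for _ in range(n_seq)]
    return np.vstack(seqs), [seq_len] * n_seq

models = {}
for gesture, drift in [("wave", 0.05), ("swipe", -0.05)]:
    X, lengths = make_sequences(drift)
    m = hmm.GaussianHMM(n_components=3, covariance_type="diag", n_iter=50)
    m.fit(X, lengths)                    # one HMM per gesture class
    models[gesture] = m

test, _ = make_sequences(0.05, n_seq=1)
print(max(models, key=lambda g: models[g].score(test)))  # expected: "wave"
```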

Keywords: gesture recognition, Indian sign language, Microsoft Kinect, natural user interface, sign language

Procedia PDF Downloads 276
130 Fashion through Senses: A Study of the Impact of Sensory Cues on the Consumption of Fashion Accessories by Female Shoppers

Authors: Vaishali Joshi

Abstract:

Purpose: A literature gap exists on sensory marketing elements, such as tactile, auditory, visual, and olfactory elements, studied together in the context of retailing; an investigation is required into the impact of these sensory cues, taken together, on consumer behaviour. This study therefore examines the impact of sensory marketing in fashion accessories stores on female shoppers' purchasing activities, highlighting the role of tactile, visual, auditory, and olfactory cues in shoppers' emotional states and purchase intention. Design/methodology/approach: The emotional states and purchase intention of female shoppers as influenced by the visual, tactile, olfactory, and auditory cues present in fashion accessories stores were measured. The mall intercept technique was used for data collection, and data analysis was done through Structural Equation Modelling. Research limitations/implications: The restricted geographical range and limited sample size of the study constrain the generalizability of its outcomes; moreover, the sample consisted of female respondents only.

Keywords: sensory marketing, visual cues, olfactory cues, tactile cues, auditory cues

Procedia PDF Downloads 53