Wearable Interface for Telepresence in Robotics

Authors: Uriel Martinez-Hernandez, Luke W. Boorman, Hamideh Kerdegari, Tony J. Prescott

Abstract:

In this paper, we present an architecture for the study of telepresence, immersion and human-robot interaction. The architecture is built around a wearable interface, developed here, that provides the human with visual, audio and tactile feedback from a remote location. We have chosen to interface the system with the iCub humanoid robot, as it mimics many human sensory modalities, such as vision with gaze control and tactile feedback. This not only allows straightforward integration of multiple sensory modalities, but also offers a more complete immersion experience for the human. These systems are integrated, controlled and synchronised by an architecture developed for telepresence and human-robot interaction. Our wearable interface allows human participants to observe and explore a remote location, while also being able to communicate verbally with humans located in the remote environment. Our approach has been tested in local, domestic and business venues, using wired, wireless and Internet-based connections. This involved implementing data compression to maintain data quality and so improve the immersion experience. Initial testing has shown the wearable interface to be robust. The system will endow humans with the ability to explore and interact with other humans at remote locations using multiple sensing modalities.
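
The abstract notes that data compression was used to keep the feedback streams usable over wireless and Internet connections, but the compression scheme itself is not described here. The following is only a minimal sketch, assuming JPEG compression of video frames sent over a length-prefixed TCP stream; the camera index, remote address and quality setting are hypothetical and do not reflect the authors' implementation.

```python
# Hypothetical sketch: JPEG-compressing camera frames before streaming them
# to a remote wearable display. The compression scheme, host, port and
# quality values are illustrative assumptions, not the paper's design.
import socket
import struct

import cv2

ROBOT_EYE_CAMERA = 0                    # assumed local capture device standing in for a robot eye camera
REMOTE_HEADSET = ("192.0.2.10", 5000)   # hypothetical address of the wearable interface
JPEG_QUALITY = 70                       # trade image fidelity for bandwidth on slow links


def stream_frames() -> None:
    cap = cv2.VideoCapture(ROBOT_EYE_CAMERA)
    sock = socket.create_connection(REMOTE_HEADSET)
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            # Compress each frame to JPEG so the stream fits wireless/Internet links.
            ok, buf = cv2.imencode(".jpg", frame, [cv2.IMWRITE_JPEG_QUALITY, JPEG_QUALITY])
            if not ok:
                continue
            payload = buf.tobytes()
            # Length-prefix each frame so the receiver can reassemble the stream.
            sock.sendall(struct.pack("!I", len(payload)) + payload)
    finally:
        cap.release()
        sock.close()


if __name__ == "__main__":
    stream_frames()
```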

Keywords: telepresence, telerobotics, human-robot interaction, virtual reality
