Intention Recognition using a Graph Representation
Authors: So-Jeong Youn, Kyung-Whan Oh
Abstract:
Human-friendly interaction is a key function of a human-centered system, and developing convenient interaction through intention recognition has received much attention over the years. Intention recognition processes multimodal inputs, including speech, face images, and body gestures. In this paper, we propose a novel approach to intention recognition using a graph representation called the Intention Graph, and we introduce the concept of a valid intention as the target of intention recognition. Our approach has two phases: a goal recognition phase and an intention recognition phase. In the goal recognition phase, we generate an action graph from the observed actions and then recognize the candidate goals and their plans. In the intention recognition phase, the intention is recognized using the relevant goals and the user profile. We show that the algorithm has polynomial time complexity. To test our model, we apply the Intention Graph to a simple briefcase domain.
Keywords: Intention recognition, intention, graph, HCI.
Digital Object Identifier (DOI): https://doi.org/10.5281/zenodo.1077777
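The abstract describes the pipeline but not its internals, so the following Python sketch only illustrates the two-phase idea under stated assumptions: the class and function names (ActionGraph, candidate_goals, recognize_intention) and the intersection-and-rank logic are hypothetical stand-ins, not the paper's actual Intention Graph construction.

```python
from collections import defaultdict

# Hypothetical sketch of the two-phase pipeline from the abstract:
# phase 1 builds an action graph from observed actions and extracts
# candidate goals; phase 2 picks an intention using a user profile.

class ActionGraph:
    """Directed structure linking each observed action to the goals it supports."""

    def __init__(self):
        self.edges = defaultdict(set)  # action -> set of goals it supports

    def add_observation(self, action, supported_goals):
        self.edges[action].update(supported_goals)

    def candidate_goals(self):
        # Phase 1 (goal recognition): keep only goals consistent with
        # every observed action, i.e. the intersection of goal sets.
        goal_sets = list(self.edges.values())
        if not goal_sets:
            return set()
        return set.intersection(*goal_sets)


def recognize_intention(candidate_goals, user_profile):
    # Phase 2 (intention recognition): rank the surviving goals by a
    # preference weight from the user profile; the top goal stands in
    # for the recognized (valid) intention.
    if not candidate_goals:
        return None
    return max(candidate_goals, key=lambda g: user_profile.get(g, 0.0))


if __name__ == "__main__":
    # Toy briefcase-style domain: each observation narrows the goal set.
    g = ActionGraph()
    g.add_observation("pick_up_briefcase", {"go_to_office", "travel"})
    g.add_observation("put_in_laptop", {"go_to_office"})

    profile = {"go_to_office": 0.9, "travel": 0.4}
    print(recognize_intention(g.candidate_goals(), profile))  # -> go_to_office
```

Both phases here run in time polynomial in the number of observed actions and goals, which is consistent with, though not a reconstruction of, the complexity claim in the abstract.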