A Hybrid Distributed Vision System for Robot Localization
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 32797

Authors: Hsiang-Wen Hsieh, Chin-Chia Wu, Hung-Hsiu Yu, Shu-Fan Liu

Abstract:

Localization is one of the critical issues in robot navigation. With an accurate estimate of its pose, a robot can navigate its environment autonomously and efficiently. In this paper, a hybrid Distributed Vision System (DVS) for robot localization is presented. The approach integrates odometry data from the robot with images captured by overhead cameras installed in the environment, reducing the likelihood of localization failures caused by illumination changes, accumulated encoder errors, and low-quality range data. An odometry-based motion model predicts robot poses, and robot images captured by the overhead cameras are then used to update the pose estimates through an HSV histogram-based measurement model. Experimental results show that the presented approach can localize robots in a global world coordinate system with localization errors within 100 mm.
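The abstract describes a predict/update localization cycle: an odometry-based motion model propagates pose hypotheses, and overhead-camera observations, matched via HSV histograms, correct them. The sketch below illustrates that cycle in a minimal particle-filter form; it is not the authors' implementation, and all function names, noise parameters, and the Bhattacharyya similarity used for histogram matching are assumptions for illustration.

```python
# Illustrative sketch (not the paper's code): particle-filter localization
# combining an odometry motion model with an HSV histogram measurement model.
import math
import random

def predict(particles, d_trans, d_rot, trans_noise=5.0, rot_noise=0.02):
    """Odometry-based motion model: propagate each (x, y, theta, w) particle
    by the odometry increment plus Gaussian noise (units: mm, rad)."""
    out = []
    for x, y, theta, w in particles:
        t = theta + d_rot + random.gauss(0.0, rot_noise)
        d = d_trans + random.gauss(0.0, trans_noise)
        out.append((x + d * math.cos(t), y + d * math.sin(t), t, w))
    return out

def hsv_histogram(pixels, bins=8):
    """Normalized hue histogram of an iterable of (h, s, v) tuples, h in [0, 360)."""
    hist = [0.0] * bins
    for h, s, v in pixels:
        hist[min(int(h / 360.0 * bins), bins - 1)] += 1.0
    total = sum(hist) or 1.0
    return [c / total for c in hist]

def bhattacharyya(p, q):
    """Similarity between two normalized histograms (1.0 = identical);
    used to detect the robot's color signature in the overhead image."""
    return sum(math.sqrt(a * b) for a, b in zip(p, q))

def update(particles, z_x, z_y, sigma=50.0):
    """Measurement update: reweight particles by proximity to the robot
    position (z_x, z_y) detected by an overhead camera, then normalize."""
    weighted = []
    for x, y, t, w in particles:
        d2 = (x - z_x) ** 2 + (y - z_y) ** 2
        weighted.append((x, y, t, w * math.exp(-d2 / (2 * sigma ** 2))))
    total = sum(w for _, _, _, w in weighted) or 1.0
    return [(x, y, t, w / total) for x, y, t, w in weighted]

def estimate(particles):
    """Weighted mean position of the particle set."""
    return (sum(p[0] * p[3] for p in particles),
            sum(p[1] * p[3] for p in particles))
```

In a full system, `bhattacharyya` would score candidate image regions against a stored robot histogram to produce the detection fed to `update`, and a resampling step would follow each update; both are omitted here for brevity.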

Keywords: Distributed Vision System, Localization, Measurement model, Motion model

Digital Object Identifier (DOI): doi.org/10.5281/zenodo.1332348

