WASET
	%0 Journal Article
	%A Hsiang-Wen Hsieh and Chin-Chia Wu and Hung-Hsiu Yu and Shu-Fan Liu
	%D 2008
	%J International Journal of Computer and Information Engineering
	%B World Academy of Science, Engineering and Technology
	%I Open Science Index 17, 2008
	%T A Hybrid Distributed Vision System for Robot Localization
	%U https://publications.waset.org/pdf/4482
	%V 17
	%X Localization is one of the critical issues in the field of robot navigation. With an accurate estimate of its pose, a robot can navigate the environment autonomously and efficiently. In this paper, a hybrid Distributed Vision System (DVS) for robot localization is presented. The presented approach integrates odometry data from the robot with images captured by overhead cameras installed in the environment, reducing the risk of localization failure caused by illumination effects, accumulated encoder errors, and low-quality range data. An odometry-based motion model is applied to predict robot poses, and robot images captured by the overhead cameras are then used to update the pose estimates with an HSV histogram-based measurement model. Experimental results show that the presented approach can localize robots in a global world coordinate system with localization errors within 100 mm.
	%P 1742-1748