{"title":"Map Matching Performance under Various Similarity Metrics for Heterogeneous Robot Teams","authors":"M. C. Akay, A. Aybakan, H. Temeltas ","volume":144,"journal":"International Journal of Electrical and Information Engineering","pagesStart":869,"pagesEnd":874,"ISSN":"1307-6892","URL":"https:\/\/publications.waset.org\/pdf\/10009841","abstract":"
Aerial and ground robots offer complementary advantages in different missions. Aerial robots can move quickly and provide a wide overhead view of an area, but they cannot carry heavy payloads. Unmanned ground vehicles (UGVs), on the other hand, move more slowly but can carry heavier payloads than unmanned aerial vehicles (UAVs). In this context, we investigate the performance of various similarity metrics for providing a common map for a heterogeneous robot team (HRT) in complex environments. Local 3D maps of the environment are gathered using a lidar odometry and octree mapping technique. To obtain a common map for the HRT, information-theoretic similarity metrics are exploited. All of these similarity metrics produced accurate results within acceptable simulation time and can therefore be used in different types of applications. For a heterogeneous multi-robot team, these methods can be used to match different types of maps.\r\n","references":"[1]\tL. E. Parker, B. Kannan, F. T. F. Tang, and M. Bailey, \u201cTightly-coupled navigation assistance in heterogeneous multi-robot teams,\u201d 2004 IEEE\/RSJ Int. Conf. Intell. Robot. Syst. (IEEE Cat. No.04CH37566), vol. 1, pp. 1016\u20131022, 2004.\r\n[2]\tM. Hofmeister, M. Kronfeld, and A. Zell, \u201cCooperative visual mapping in a heterogeneous team of mobile robots,\u201d Proc. - IEEE Int. Conf. Robot. Autom., pp. 1491\u20131496, 2011.\r\n[3]\tT. Bailey, M. Bryson, H. Mu, J. Vial, L. McCalman, and H. Durrant-Whyte, \u201cDecentralised cooperative localisation for heterogeneous teams of mobile robots,\u201d Proc. - IEEE Int. Conf. Robot. Autom., pp. 2859\u20132865, 2011.\r\n[4]\tM. Langerwisch, T. Wittmann, S. Thamke, T. Remmersmann, A. Tiderko, and B. Wagner, \u201cHeterogeneous teams of unmanned ground and aerial robots for reconnaissance and surveillance - a field experiment,\u201d Safety, Secur. Rescue Robot. (SSRR), 2013 IEEE Int. Symp., pp. 
1\u20136, 2013.\r\n[5]\tY. Ktiri and M. Inaba, \u201cA framework for multiple heterogeneous robots exploration using laser data and MARG sensor,\u201d 2012 IEEE\/SICE Int. Symp. Syst. Integr. SII 2012, pp. 635\u2013640, 2012.\r\n[6]\tA. Husain et al., \u201cMapping planetary caves with an autonomous, heterogeneous robot team,\u201d IEEE Aerosp. Conf. Proc., 2013.\r\n[7]\tC. Forster, M. Pizzoli, and D. Scaramuzza, \u201cAir-ground localization and map augmentation using monocular dense reconstruction,\u201d IEEE Int. Conf. Intell. Robot. Syst., pp. 3971\u20133978, 2013.\r\n[8]\tL. Kneip, M. Chli, and R. Y. Siegwart, \u201cRobust Real-Time Visual Odometry with a Single Camera and an IMU,\u201d BMVC, pp. 16.1\u201316.11, 2011.\r\n[9]\tR. Kaeslin et al., \u201cCollaborative Localization of Aerial and Ground Robots through Elevation Maps,\u201d IEEE Int. Symp. Safety, Secur. Rescue Robot. (SSRR), 2016.\r\n[10]\tS. Cha, \u201cComprehensive Survey on Distance\/Similarity Measures between Probability Density Functions,\u201d Int. J. Math. Models Methods Appl. Sci., vol. 1, no. 4, 2007.\r\n[11]\tJ. Zhang and S. Singh, \u201cLOAM: Lidar Odometry and Mapping in Real-time,\u201d Robot. Sci. Syst. Conf., 2014.\r\n[12]\tA. Hornung, K. M. Wurm, M. Bennewitz, C. Stachniss, and W. Burgard, \u201cOctoMap: An efficient probabilistic 3D mapping framework based on octrees,\u201d Auton. Robots, vol. 34, no. 3, pp. 189\u2013206, 2013.\r\n[13]\tM. Santana, K. R. T. Aires, and R. M. S. Veras, \u201cAn Approach for 2D Visual Occupancy Grid Map Using Monocular Vision,\u201d vol. 281, pp. 175\u2013191, 2011.","publisher":"World Academy of Science, Engineering and Technology","index":"Open Science Index 144, 2018"}