Real Time Lidar and Radar High-Level Fusion for Obstacle Detection and Tracking with Evaluation on a Ground Truth

Authors: Hatem Hajri, Mohamed-Cherif Rahal

Abstract:

Both Lidars and Radars are sensors for obstacle detection. While Lidars are very accurate on obstacle positions and less accurate on their velocities, Radars are more precise on obstacle velocities and less precise on their positions. Sensor fusion between Lidar and Radar aims at improving obstacle detection by exploiting the advantages of the two sensors. The present paper proposes a real-time Lidar/Radar data fusion algorithm for obstacle detection and tracking based on the global nearest neighbor (GNN) standard filter. This algorithm is implemented and embedded in an automotive vehicle as a component generated by a real-time multisensor software framework. The benefits of data fusion compared with the use of a single sensor are illustrated through several tracking scenarios (on a highway and on a bend), using real-time kinematic sensors mounted on the ego and tracked vehicles as a ground truth.
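
As a concrete illustration of the method summarized above, the short Python sketch below combines the two ingredients the abstract and keywords name: global nearest neighbor (GNN) association solved with the Hungarian algorithm, followed by Kalman measurement updates in which the Lidar is modeled as accurate on position and the Radar as accurate on velocity. All function names, the gate threshold, and the noise values are illustrative assumptions, not the authors' implementation.

```python
# Minimal, illustrative sketch (not the authors' code) of GNN association
# via the Hungarian algorithm plus Kalman updates with complementary
# Lidar/Radar noise. Gate, noise values and names are assumptions.
import numpy as np
from scipy.optimize import linear_sum_assignment

GATE = 9.21   # chi-square gate at 99% for a 2-D measurement (assumed)
BIG = 1e6     # cost assigned to gated-out (infeasible) pairs

def gnn_associate(pred_pos, det_pos, S):
    """Global nearest neighbor: one-to-one track/detection assignment.

    pred_pos: (T, 2) predicted track positions
    det_pos:  (D, 2) detected positions
    S:        (T, 2, 2) innovation covariances
    Returns (track, detection) index pairs that pass the gate.
    """
    cost = np.full((len(pred_pos), len(det_pos)), BIG)
    for t, (p, S_t) in enumerate(zip(pred_pos, S)):
        S_inv = np.linalg.inv(S_t)
        for d, z in enumerate(det_pos):
            nu = z - p                        # innovation
            d2 = float(nu @ S_inv @ nu)       # squared Mahalanobis distance
            if d2 <= GATE:
                cost[t, d] = d2
    rows, cols = linear_sum_assignment(cost)  # Hungarian algorithm
    return [(t, d) for t, d in zip(rows, cols) if cost[t, d] < BIG]

def kalman_update(x, P, z, H, R):
    """Standard Kalman measurement update for state x, covariance P."""
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)            # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# Toy association: two tracks, two detections (values assumed).
pairs = gnn_associate(np.array([[0.0, 0.0], [10.0, 0.0]]),
                      np.array([[9.8, 0.1], [0.2, -0.1]]),
                      np.stack([np.eye(2)] * 2))
print(pairs)  # -> [(0, 1), (1, 0)]

# Toy state [x, y, vx, vy] with complementary sensors (values assumed):
# Lidar observes position accurately, Radar observes velocity accurately.
H_lidar = np.hstack([np.eye(2), np.zeros((2, 2))])   # measures (x, y)
H_radar = np.hstack([np.zeros((2, 2)), np.eye(2)])   # measures (vx, vy)
R_lidar = 0.05 * np.eye(2)                           # tight on position
R_radar = 0.10 * np.eye(2)                           # tight on velocity

x, P = np.zeros(4), 10.0 * np.eye(4)
x, P = kalman_update(x, P, np.array([9.9, 0.1]), H_lidar, R_lidar)
x, P = kalman_update(x, P, np.array([20.2, -0.1]), H_radar, R_radar)
print(np.round(x, 2))  # position driven by Lidar, velocity by Radar
```

In a GNN standard filter, this single best gated assignment, rather than a multi-hypothesis one, decides which detection updates which track; the Kalman updates then let each sensor correct the state components it measures best.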

Keywords: Ground truth, Hungarian algorithm, Lidar/Radar data fusion, global nearest neighbor filter.

Digital Object Identifier (DOI): https://doi.org/10.5281/zenodo.1474353

