Multi-Sensor Image Fusion for Visible and Infrared Thermal Images
Authors: Amit Kr. Happy
Abstract:
This paper is motivated by the importance of multi-sensor image fusion, with a specific focus on infrared (IR) and visible image (VI) fusion for applications including military reconnaissance. Image fusion can be defined as the process of combining two or more source images into a single composite image with extended information content that improves visual perception or feature extraction. The source images can come from different modalities, such as a visible camera and an IR thermal imager: visible images are captured from reflected radiation in the visible spectrum, whereas thermal images are formed from thermal (IR) radiation that may be reflected or self-emitted. In this work, a digital color camera captures the visible source image and a thermal IR camera acquires the thermal source image. This paper proposes image fusion algorithms based on the Multi-Scale Transform (MST) and a region-based coefficient selection rule with consistency verification. The research includes an implementation of the proposed algorithms in MATLAB, along with a comparative analysis to determine the optimum number of MST decomposition levels and the coefficient fusion rule. Results are presented and assessed with several commonly used evaluation metrics to validate the proposed method. Experiments show that the proposed approach produces good fusion results. In developing the approach, we observed a recurring limitation of popular image fusion methods: although their high computational cost and complex processing steps yield accurate fused results, they are difficult to deploy in systems and applications that require real-time operation, high flexibility, and low computational capability. The methods presented in this paper therefore aim to deliver good fusion quality with minimal time complexity.
Keywords: Image fusion, IR thermal imager, multi-sensor, Multi-Scale Transform.
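As an illustration of the kind of pipeline described in the abstract, the following is a minimal sketch in Python using NumPy, SciPy, and PyWavelets; it is not the paper's MATLAB implementation. It assumes a discrete wavelet transform as the MST, a local window-energy rule for the detail sub-bands, and a median (majority) filter for consistency verification. The function name fuse_vis_ir, the wavelet choice, the number of levels, and the window size are illustrative assumptions, not values taken from the paper.

    # Hedged sketch: DWT-based visible/IR fusion with a region-energy rule
    # and consistency verification. Parameter values are assumptions.
    import numpy as np
    import pywt
    from scipy.ndimage import uniform_filter, median_filter

    def fuse_vis_ir(vis, ir, wavelet="db2", levels=3, win=3):
        # Multi-scale decomposition of both co-registered grayscale sources.
        cv = pywt.wavedec2(vis, wavelet, level=levels)
        ci = pywt.wavedec2(ir, wavelet, level=levels)
        # Approximation (low-frequency) band: simple averaging.
        fused = [0.5 * (cv[0] + ci[0])]
        # Detail (high-frequency) bands: region-energy selection followed by
        # consistency verification of the binary decision map.
        for dv, di in zip(cv[1:], ci[1:]):
            bands = []
            for bv, bi in zip(dv, di):
                ev = uniform_filter(bv * bv, size=win)   # local energy, visible
                ei = uniform_filter(bi * bi, size=win)   # local energy, infrared
                decision = ev >= ei                      # True -> keep visible coefficient
                # Majority (median) filtering removes isolated, inconsistent decisions.
                decision = median_filter(decision.astype(np.uint8), size=win).astype(bool)
                bands.append(np.where(decision, bv, bi))
            fused.append(tuple(bands))
        # Inverse transform reconstructs the fused image.
        return pywt.waverec2(fused, wavelet)

In practice the two source images must be co-registered and of equal size; the decomposition level and the fusion rule can then be varied to reproduce the kind of comparative analysis described in the abstract.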