Infrared Weak-Small Targets Fusion Based on Latent Low-Rank Representation and DWT

Bibliographic Details
Published in:IEEE access 2019, Vol.7, p.112681-112692
Main Authors: Wang, Xiaozhu, Yin, Jianfei, Zhang, Kai, Li, Shayi, Yan, Jie
Format: Article
Language:English
Summary:For the problem of anti-background interference of weak-small targets in infrared images, target extraction and texture-detail processing are key tasks in an image fusion algorithm. Single-band infrared data cannot fully reflect image details and contour information, and texture differences between bands make targets difficult to recognize, so dual-band data must be fused to identify weak-small targets clearly. To solve these problems, this paper proposes an effective image fusion framework using Latent Low-Rank Representation (LatLRR) and the Discrete Wavelet Transform (DWT). First, all source images are trained by LatLRR to obtain a projection matrix L, which is used to extract salient features, and the original images are decomposed into high-frequency and low-frequency components by the DWT. The high-frequency parts are then fused by the maximum-absolute-value rule and the low-frequency parts by weighted averaging. On this basis, the trained matrix L and the fused high-frequency parts are combined by contrast-modulation fusion. Finally, the fused image is reconstructed by combining the contour parts and the feature parts. Experimental results demonstrate that the proposed method achieves state-of-the-art performance in both objective and subjective assessment.
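The DWT fusion stage described in the summary (maximum absolute value for the high-frequency subbands, weighted averaging for the low-frequency subband) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the single-level Haar transform, the weight `w`, and all function names are assumptions, and the LatLRR salient-feature extraction and contrast-modulation steps are omitted.

```python
import numpy as np

def haar_dwt2(img):
    """Single-level 2-D Haar DWT; img must have even height and width.
    Returns the low-frequency subband LL and high-frequency subbands (LH, HL, HH)."""
    a = (img[0::2, :] + img[1::2, :]) / 2.0  # row-wise approximation
    d = (img[0::2, :] - img[1::2, :]) / 2.0  # row-wise detail
    ll = (a[:, 0::2] + a[:, 1::2]) / 2.0
    lh = (a[:, 0::2] - a[:, 1::2]) / 2.0
    hl = (d[:, 0::2] + d[:, 1::2]) / 2.0
    hh = (d[:, 0::2] - d[:, 1::2]) / 2.0
    return ll, (lh, hl, hh)

def haar_idwt2(ll, bands):
    """Exact inverse of haar_dwt2."""
    lh, hl, hh = bands
    a = np.empty((ll.shape[0], ll.shape[1] * 2))
    a[:, 0::2], a[:, 1::2] = ll + lh, ll - lh
    d = np.empty_like(a)
    d[:, 0::2], d[:, 1::2] = hl + hh, hl - hh
    out = np.empty((a.shape[0] * 2, a.shape[1]))
    out[0::2, :], out[1::2, :] = a + d, a - d
    return out

def fuse_dwt(img1, img2, w=0.5):
    """Fuse two registered images: weighted average of the low-frequency
    subbands, maximum absolute value for each high-frequency subband."""
    ll1, hi1 = haar_dwt2(img1)
    ll2, hi2 = haar_dwt2(img2)
    ll = w * ll1 + (1.0 - w) * ll2                      # low freq: weighted average
    hi = tuple(np.where(np.abs(b1) >= np.abs(b2), b1, b2)
               for b1, b2 in zip(hi1, hi2))             # high freq: max absolute value
    return haar_idwt2(ll, hi)
```

Because the Haar pair above is exactly invertible, fusing an image with itself returns the image unchanged, which is a quick sanity check for any implementation of these fusion rules.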
ISSN:2169-3536
DOI:10.1109/ACCESS.2019.2934523