
Attention-based multi-scale recursive residual network for low-light image enhancement

Bibliographic Details
Published in: Signal, Image and Video Processing, 2024-04, Vol. 18 (3), p. 2521-2531
Main Authors: Wang, Kaidi, Zheng, Yuanlin, Liao, Kaiyang, Liu, Haiwen, Sun, Bangyong
Format: Article
Language:English
Description
Summary: To address the problems of color distortion, low processing efficiency, and the imbalance between rich contextual information and spatial information in current convolutional-neural-network-based low-light image enhancement algorithms, this paper proposes an attention-based multi-scale recursive residual network for low-light image enhancement (AMR-Net) built on high-resolution, single-scale image processing. First, shallow features are extracted using convolution and channel attention. Within the recursive residual unit, a parallel multi-scale residual block extracts image features at three scales: the original resolution, 1/2 resolution, and 1/4 resolution. The deep and shallow features are then combined by selective kernel feature fusion to capture both rich contextual information and spatial information. Finally, a residual image is produced by convolving the deep features, and the enhanced image is obtained by adding the residual image to the original image. Experimental results on the LOL, LIME, DICM, and MEF datasets show that the proposed method achieves good scores on multiple metrics and reasonably restores image brightness, contrast, and detail, visibly improving perceived image quality.
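
The pipeline described in the abstract (a channel-attention head, parallel multi-scale residual blocks at full, 1/2, and 1/4 resolution, selective kernel feature fusion of deep and shallow features, and a final residual connection back to the input) can be sketched in PyTorch. The module names, channel widths, block count, and the particular channel-attention and SKFF formulations below are illustrative assumptions, not the authors' implementation:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style channel attention (assumed variant)."""
    def __init__(self, ch, r=8):
        super().__init__()
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(ch, ch // r, 1), nn.ReLU(inplace=True),
            nn.Conv2d(ch // r, ch, 1), nn.Sigmoid())

    def forward(self, x):
        return x * self.gate(x)

class SKFF(nn.Module):
    """Selective kernel feature fusion: softmax-weighted sum of branches."""
    def __init__(self, ch, n_branches, r=8):
        super().__init__()
        self.squeeze = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(ch, ch // r, 1), nn.ReLU(inplace=True))
        self.attn = nn.ModuleList(nn.Conv2d(ch // r, ch, 1) for _ in range(n_branches))

    def forward(self, feats):  # feats: list of (B, ch, H, W) tensors, same size
        u = self.squeeze(sum(feats))
        # Per-branch attention maps, softmax-normalized across branches
        w = torch.softmax(torch.stack([a(u) for a in self.attn], dim=0), dim=0)
        return sum(wi * fi for wi, fi in zip(w, feats))

class MultiScaleResBlock(nn.Module):
    """Parallel branches at full, 1/2, and 1/4 resolution, fused by SKFF."""
    def __init__(self, ch):
        super().__init__()
        self.branches = nn.ModuleList(
            nn.Sequential(nn.Conv2d(ch, ch, 3, padding=1), nn.ReLU(inplace=True),
                          nn.Conv2d(ch, ch, 3, padding=1))
            for _ in range(3))
        self.fuse = SKFF(ch, 3)

    def forward(self, x):
        outs = []
        for i, branch in enumerate(self.branches):
            xs = F.interpolate(x, scale_factor=1 / 2 ** i) if i else x  # down to 1/2, 1/4
            y = branch(xs)
            outs.append(F.interpolate(y, size=x.shape[-2:]) if i else y)  # back to full size
        return x + self.fuse(outs)

class AMRNet(nn.Module):
    def __init__(self, ch=64, n_blocks=4):  # widths/depth are assumptions
        super().__init__()
        self.head = nn.Sequential(nn.Conv2d(3, ch, 3, padding=1), ChannelAttention(ch))
        self.body = nn.Sequential(*[MultiScaleResBlock(ch) for _ in range(n_blocks)])
        self.fuse = SKFF(ch, 2)   # fuses deep and shallow features, per the abstract
        self.tail = nn.Conv2d(ch, 3, 3, padding=1)

    def forward(self, x):
        shallow = self.head(x)
        deep = self.body(shallow)
        residual = self.tail(self.fuse([deep, shallow]))
        return x + residual       # enhanced image = input + predicted residual
```

The final `x + residual` line mirrors the abstract's enhancement-by-residual formulation: the network only has to predict a correction to the dark input rather than reconstruct the whole image.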
ISSN:1863-1703
1863-1711
DOI:10.1007/s11760-023-02927-y