
A Deep Motion Deblurring Network Using Channel Adaptive Residual Module

Bibliographic Details
Published in: IEEE Access, 2021, Vol. 9, pp. 65638-65649
Main Authors: Chen, Ying; Cui, Guangmang; Zhang, Jitong; Zhao, Jufeng
Format: Article
Language: English
Description
Summary: In this paper, we address the problem of deblurring dynamic scenes affected by motion blur. Restoring images degraded by motion blur requires a network whose receptive field fully covers all regions that need to be deblurred, whereas existing networks enlarge the receptive field by stacking ordinary convolutional layers or increasing the convolution kernel size. These approaches inevitably increase the computational burden of the network. We propose a novel architecture built on a channel adaptive residual module. Different features of the blurred image are extracted and distributed across the feature channels; the network learns a weight for each channel and extracts image features adaptively according to the degree of blurring and the importance of the information. We embed this module in a modified encoder-decoder design with skip connections to achieve multi-scale feature fusion for further performance improvement. Extensive comparison with existing techniques on the baseline dynamic scene deblurring dataset shows that the proposed network effectively deblurs images, with accuracy and speed comparable to existing methods.
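The abstract does not specify the internal structure of the channel adaptive residual module, but its description of learned per-channel weights suggests a residual block with squeeze-and-excitation-style channel gating. The PyTorch sketch below is an illustrative assumption of such a block, not the authors' implementation; the class name, reduction ratio, and layer choices are hypothetical.

```python
import torch
import torch.nn as nn


class ChannelAdaptiveResidualBlock(nn.Module):
    """Residual block with learned per-channel weights (SE-style gating).

    Sketch based on the abstract only: a gating branch learns one weight
    per feature channel and rescales the residual branch before the skip
    addition. Layer choices here are assumptions, not the paper's design.
    """

    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        # Plain residual branch: two 3x3 convolutions.
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
        )
        # Channel-weight branch: global average pooling followed by a
        # small bottleneck produces one gating weight per channel.
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        features = self.body(x)
        weights = self.gate(features)   # shape (N, C, 1, 1)
        return x + features * weights   # channel-reweighted residual


if __name__ == "__main__":
    block = ChannelAdaptiveResidualBlock(channels=64)
    out = block(torch.randn(1, 64, 128, 128))
    print(out.shape)  # torch.Size([1, 64, 128, 128])
```

In an encoder-decoder deblurring network of the kind the abstract describes, blocks like this would sit at each scale, with skip connections between encoder and decoder stages providing the multi-scale feature fusion.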
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2021.3076241