A Comprehensive Review of Deep Learning-Based Real-World Image Restoration

Bibliographic Details
Published in: IEEE Access, 2023-01, Vol. 11, p. 1-1
Main Authors: Zhai, Lujun, Wang, Yonghui, Cui, Suxia, Zhou, Yu
Format: Article
Language:English
Description
Summary: Real-world imagery does not always exhibit good visibility and clean content; it often suffers from various degradations (e.g., noise, blur, raindrops, fog, and color distortion), which severely affect vision-driven tasks such as image classification, target recognition, and tracking. Restoring the true scene from such degraded images is therefore of real significance. In recent years, a large body of deep learning-based image processing work has emerged thanks to advances in deep neural networks. This paper provides a comprehensive review of real-world image restoration algorithms and beyond. More specifically, it surveys critical benchmark datasets, image quality assessment methods, and four major categories of deep learning-based image restoration methods: those based on the convolutional neural network (CNN), the generative adversarial network (GAN), the Transformer, and the multi-layer perceptron (MLP). The paper highlights the latest developments and advances in each category of network architecture to provide an up-to-date overview, and representative state-of-the-art image restoration methods are compared visually and numerically. Finally, the current state of real-world image restoration is objectively assessed, challenges are discussed, and future directions and trends are presented.
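The image quality assessment methods the review surveys include full-reference metrics such as PSNR, which scores a restored image against its clean reference. As an illustration only (the function and toy images below are not taken from the paper), a minimal NumPy sketch of PSNR looks like this:

```python
import numpy as np

def psnr(reference, restored, max_val=255.0):
    """Peak Signal-to-Noise Ratio (dB) between a reference and a restored image."""
    mse = np.mean((reference.astype(np.float64) - restored.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical images: distortion-free
    return 10.0 * np.log10((max_val ** 2) / mse)

# Toy example: a clean horizontal ramp corrupted by Gaussian noise (sigma = 10),
# standing in for a degraded real-world capture.
rng = np.random.default_rng(0)
clean = np.tile(np.arange(256, dtype=np.float64), (64, 1))
noisy = np.clip(clean + rng.normal(0.0, 10.0, clean.shape), 0.0, 255.0)
print(f"PSNR of noisy vs. clean: {psnr(clean, noisy):.1f} dB")
```

Higher PSNR means the restoration is closer to the reference; for sigma = 10 noise on an 8-bit range, values around 28 dB are typical, and a successful restorer should raise this figure.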
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2023.3250616