Real-Time Image Super-Resolution Using Recursive Depthwise Separable Convolution Network
Published in: IEEE Access, 2019, Vol. 7, pp. 99804-99816
Format: Article
Language: English
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2019.2929223
Summary: In recent years, deep convolutional neural networks (CNNs) have been widely used for image super-resolution (SR) and have achieved increasingly sophisticated performance. Despite this advancement, CNNs remain difficult to deploy in practical SR applications because deep convolutions are computationally expensive. In this paper, we propose two lightweight deep neural networks using depthwise separable convolution for real-time image SR. Specifically, depthwise separable convolution splits a standard convolution into a depthwise convolution and a pointwise convolution, significantly reducing the number of model parameters and multiplication operations. Moreover, recursive learning is adopted to increase the depth and receptive field of the network, improving SR quality without adding model parameters. Finally, we propose a novel technique called Super-Sampling (SS), which learns richer high-resolution information by over-sampling the output image and then adaptively down-sampling it. The two proposed models, SSNet-M and SSNet, outperform existing state-of-the-art real-time image SR networks, including SRCNN, FSRCNN, ESPCN, and VDSR, in terms of model complexity as well as subjective and PSNR/SSIM evaluations on Set5, Set14, B100, Urban100, and Manga109.
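As the summary notes, depthwise separable convolution factors a standard k×k convolution into a per-channel depthwise convolution followed by a 1×1 pointwise convolution. A minimal sketch of such a block follows; PyTorch is an assumption here (the record names no framework), and this shows the generic technique rather than the paper's exact layer:

```python
import torch
import torch.nn as nn

class DepthwiseSeparableConv(nn.Module):
    """k x k depthwise convolution (one filter per input channel)
    followed by a 1 x 1 pointwise convolution that mixes channels."""
    def __init__(self, in_ch, out_ch, k=3):
        super().__init__()
        # groups=in_ch -> each input channel gets its own k x k filter:
        # k*k*in_ch weights instead of k*k*in_ch*out_ch.
        self.depthwise = nn.Conv2d(in_ch, in_ch, k, padding=k // 2,
                                   groups=in_ch, bias=False)
        # 1 x 1 convolution: in_ch*out_ch weights.
        self.pointwise = nn.Conv2d(in_ch, out_ch, 1, bias=False)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.act(self.pointwise(self.depthwise(x)))

# Weight count for a 3x3, 64 -> 64 layer:
#   standard convolution: 3*3*64*64      = 36,864
#   depthwise separable:  3*3*64 + 64*64 =  4,672  (~7.9x fewer)
x = torch.randn(1, 64, 32, 32)
print(DepthwiseSeparableConv(64, 64)(x).shape)  # torch.Size([1, 64, 32, 32])
```

The reduction factor is roughly 1/C_out + 1/k², so the savings grow with both the channel count and the kernel size; the number of multiplications shrinks by the same factor, which is what makes real-time inference plausible.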
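The recursive learning mentioned in the summary reuses one set of weights several times, so effective depth and receptive field grow while the parameter count stays fixed. A sketch of that idea, reusing the DepthwiseSeparableConv block above; the residual connection and the recursion count are illustrative assumptions, not the paper's stated configuration:

```python
class RecursiveBlock(nn.Module):
    """Apply one shared block T times: effective depth and receptive
    field scale with T, the parameter count does not."""
    def __init__(self, channels, recursions=5):
        super().__init__()
        self.shared = DepthwiseSeparableConv(channels, channels)
        self.recursions = recursions

    def forward(self, x):
        out = x
        for _ in range(self.recursions):
            out = self.shared(out)  # same weights on every pass
        return out + x              # residual skip (assumed; common in recursive SR)
```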
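The Super-Sampling (SS) head is characterized in the summary only as over-sampling the output and then adaptively down-sampling it. The sketch below is one plausible reading of that description, not the paper's design: sub-pixel convolution (PixelShuffle, as in ESPCN) for the over-sampling and average pooling for the adaptive down-sampling are both assumptions:

```python
import torch.nn.functional as F

class SuperSamplingHead(nn.Module):
    """Reconstruct at (scale * over) resolution via sub-pixel convolution,
    then down-sample adaptively to the target scale."""
    def __init__(self, channels, scale=2, over=2):
        super().__init__()
        r = scale * over                   # over-sampling ratio
        self.expand = nn.Conv2d(channels, 3 * r * r, 3, padding=1)
        self.shuffle = nn.PixelShuffle(r)  # (B, 3*r*r, h, w) -> (B, 3, r*h, r*w)
        self.scale = scale

    def forward(self, x):
        h, w = x.shape[-2:]
        over_sampled = self.shuffle(self.expand(x))
        # Adaptive down-sampling to the target HR size; the paper's
        # "adaptive" operator may be learned rather than average pooling.
        return F.adaptive_avg_pool2d(over_sampled, (self.scale * h, self.scale * w))
```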