A Hybrid Single Image Super-Resolution Technique Using Fractal Interpolation and Convolutional Neural Network

Bibliographic Details
Published in: Pattern Recognition and Image Analysis, 2021, Vol. 31 (1), p. 18-23
Main Authors: Pandey, Garima, Ghanekar, Umesh
Format: Article
Language:English
Description
Summary: The advent of convolutional neural networks (CNNs) in the field of single image super-resolution (SISR) has brought immense improvement to the generation of high resolution (HR) images. A CNN learns an end-to-end mapping, through non-linear feature extraction, between a low resolution (LR) image and its HR counterpart. The performance of a CNN can be improved by increasing the depth of the architecture, which generally incurs higher computational cost and running time; it can also be improved by providing a more appropriate input. Presently, in most cases the input to a CNN is an LR image that is bicubically interpolated to the desired HR size. However, bicubic interpolation smooths out image details. Therefore, in this paper, a hybrid SISR algorithm based on CNN and fractal interpolation is proposed for the reconstruction of HR images. A three-layer, light-weight CNN architecture is utilized, which produces performance comparable to traditional SISR techniques, while fractal interpolation helps better preserve the structural and textural properties of the HR image. Experimental results are provided to prove the efficacy of the proposed algorithm.
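The three-layer pipeline described in the abstract resembles the classic SRCNN stages (feature extraction, non-linear mapping, reconstruction) applied to an already-interpolated LR image. A minimal NumPy sketch of such a forward pass is shown below; the kernel sizes, channel counts, and random weights are illustrative assumptions, not the authors' exact configuration:

```python
import numpy as np

def conv2d(x, w):
    """'Same' 2-D convolution: x is (C_in, H, W), w is (C_out, C_in, k, k)."""
    c_out, c_in, k, _ = w.shape
    p = k // 2
    xp = np.pad(x, ((0, 0), (p, p), (p, p)))  # zero-pad spatial dims
    _, H, W = x.shape
    out = np.zeros((c_out, H, W))
    for o in range(c_out):
        for i in range(c_in):
            for di in range(k):
                for dj in range(k):
                    out[o] += w[o, i, di, dj] * xp[i, di:di + H, dj:dj + W]
    return out

def relu(x):
    return np.maximum(x, 0.0)

def three_layer_sr(y, w1, w2, w3):
    """Feature extraction -> non-linear mapping -> reconstruction.
    y is the interpolated LR image (1, H, W); output keeps the same size."""
    return conv2d(relu(conv2d(relu(conv2d(y, w1)), w2)), w3)

rng = np.random.default_rng(0)
# Hypothetical layer shapes (SRCNN-like, scaled down): 9x9/1->16, 1x1/16->8, 5x5/8->1.
w1 = rng.normal(0, 0.01, (16, 1, 9, 9))
w2 = rng.normal(0, 0.01, (8, 16, 1, 1))
w3 = rng.normal(0, 0.01, (1, 8, 5, 5))

y = rng.random((1, 32, 32))         # stand-in for an interpolated LR patch
hr = three_layer_sr(y, w1, w2, w3)
print(hr.shape)                     # spatial size is preserved by 'same' padding
```

In the paper's hybrid scheme, the key change is the input `y`: rather than bicubic interpolation, a fractal-interpolated upscaling of the LR image would be fed to the network, with the aim of preserving structural and textural detail that bicubic smoothing discards.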
ISSN: 1054-6618; 1555-6212
DOI: 10.1134/S1054661821010144