Lightweight Feature Fusion Network for Single Image Super-Resolution


Bibliographic Details
Published in: IEEE Signal Processing Letters, 2019-04, Vol. 26 (4), p. 538-542
Main Authors: Yang, Wenming, Wang, Wei, Zhang, Xuechen, Sun, Shuifa, Liao, Qingmin
Format: Article
Language:English
Description
Summary: Single image super-resolution (SISR) has witnessed great progress as convolutional neural networks (CNNs) have grown deeper and wider. However, their enormous parameter counts hinder application to real-world problems. In this letter, we propose a lightweight feature fusion network (LFFN) that fully exploits multi-scale contextual information and greatly reduces network parameters while maximizing SISR performance. LFFN is built on spindle blocks and a softmax feature fusion module (SFFM). Specifically, a spindle block is composed of a dimension extension unit, a feature exploration unit, and a feature refinement unit. The dimension extension unit expands low-dimensional features to a higher dimension and implicitly learns feature maps suited to the next unit. The feature exploration unit performs linear and nonlinear feature exploration tailored to different feature maps. The feature refinement unit fuses and refines features. The SFFM fuses features from different modules in a self-adaptive manner using the softmax function, making full use of hierarchical information at a small parameter cost. Both qualitative and quantitative experiments on benchmark datasets show that LFFN achieves favorable performance against state-of-the-art methods with comparable parameter counts.
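The core of the SFFM described above is softmax-normalized weighting of hierarchical feature maps. The following is a minimal NumPy sketch of that fusion idea; the function name, the fixed score vector, and the feature shapes are illustrative assumptions (in the paper the per-module scores are learned during training), not the authors' implementation.

```python
import numpy as np

def softmax(x, axis=0):
    # numerically stable softmax along the given axis
    e = np.exp(x - np.max(x, axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def softmax_feature_fusion(features, scores):
    """Fuse same-shaped feature maps with softmax-normalized weights.

    `features`: list of arrays of shape (C, H, W) from different modules.
    `scores`:   one scalar score per module; in the paper these would be
                learned, here they are supplied directly for illustration.
    """
    w = softmax(np.asarray(scores, dtype=float))  # weights sum to 1
    return sum(wi * f for wi, f in zip(w, features))

# three hypothetical hierarchical feature maps of shape (C, H, W)
feats = [np.full((2, 4, 4), v) for v in (1.0, 2.0, 3.0)]
fused = softmax_feature_fusion(feats, scores=[0.0, 0.0, 0.0])
# equal scores -> equal weights -> fused map is the elementwise mean (2.0)
```

Because the weights are softmax-normalized, the fusion is a convex combination of the module outputs: raising one module's score smoothly shifts weight toward that module without adding per-pixel parameters.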
ISSN: 1070-9908, 1558-2361
DOI: 10.1109/LSP.2018.2890770