Exponential linear unit dilated residual network for digital image denoising

Bibliographic Details
Published in: Journal of Electronic Imaging, 2018-09, Vol. 27 (5), p. 053024
Main Authors: Panda, Aditi, Naskar, Ruchira, Pal, Snehanshu
Format: Article
Language:English
Description
Summary: Over the past couple of years, deep learning with convolutional neural networks (CNNs) has proven extremely powerful in image processing tasks, including image denoising. We address digital image denoising using deep learning. In this regard, dilated CNNs have achieved impressive denoising quality by striking an effective trade-off between receptive field size and network depth, thereby reducing the computational intensity of the CNN architecture. Such networks generally use rectified linear units (ReLUs) as the activation function for introducing nonlinearity into the network. We propose a dilated CNN with exponential linear units (ELUs) for image denoising. Our aim is to enhance the performance of dilated networks in image denoising and to develop an effective denoising CNN architecture with optimized computational intensity. We experiment with a (conventional) 10-layer network as well as a 5-layer version of the proposed model. We observe that the proposed 5-layer network achieves state-of-the-art denoised image quality together with high computational efficiency. Our experimental results demonstrate the effectiveness of the proposed models in both image denoising and complexity reduction.
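The trade-off described in the abstract (dilated convolutions widening the receptive field without adding depth) and the ELU activation can be illustrated with a short numeric sketch; the 5-layer dilation schedule below is a hypothetical example for illustration, not the paper's exact design:

```python
import numpy as np

def elu(x, alpha=1.0):
    # Exponential linear unit: identity for x > 0,
    # smooth saturation toward -alpha for x <= 0.
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def receptive_field(dilations, kernel=3):
    # Receptive field of a stack of dilated convolutions
    # with a k x k kernel: RF = 1 + sum_l d_l * (k - 1).
    return 1 + sum(d * (kernel - 1) for d in dilations)

# Hypothetical 5-layer dilation schedule (assumed, not from the paper).
dilated = [1, 2, 3, 2, 1]
# A plain (dilation-1) 3x3 stack needs 9 layers for the same coverage.
plain = [1] * 9

print(receptive_field(dilated))  # 19
print(receptive_field(plain))    # 19
print(elu(np.array([-2.0, 0.0, 2.0])))
```

With the same 19-pixel receptive field reached in 5 layers instead of 9, the dilated stack carries roughly half the convolutional cost, which is the kind of depth reduction the abstract attributes to dilation.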
ISSN: 1017-9909, 1560-229X
DOI: 10.1117/1.JEI.27.5.053024