Multiscale deep network for compressive sensing image reconstruction
Published in: Journal of Electronic Imaging, 2022-01, Vol. 31 (1), p. 013025
Main Authors: , , ,
Format: Article
Language: English
Summary: Deep learning-based image compressive sensing methods have received extensive attention in recent years due to their superior learning ability and fast processing speed. The majority of existing image compressive sensing neural networks use single-scale sampling, whereas multiscale sampling has demonstrated excellent performance compared to single-scale sampling. We propose a multiscale deep network for compressive sensing image reconstruction that consists of a multiscale sampling network and a reconstruction network. First, we use convolution to mimic the linear decomposition of images; the convolution is learned during the training process. A sampling network then captures compressive measurements across the multiple decomposed scales. The reconstruction network, which includes both an initial and an enhanced reconstruction network, learns an end-to-end mapping between the compressed sensing (CS) measurements and the recovered images. Experimental results indicate that the proposed network framework outperforms existing CS methods in terms of objective metrics, peak signal-to-noise ratio (PSNR) and structural similarity index, as well as subjective visual quality. Specifically, at a 0.1 sampling rate over 10 test images, the maximum (minimum) average PSNR gain is 5.95 dB (0.25 dB).
ISSN: 1017-9909, 1560-229X
DOI: 10.1117/1.JEI.31.1.013025
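
The summary above outlines the pipeline at a high level: a learned convolution performs a linear multiscale decomposition, a sampling network takes block-wise CS measurements at each scale, and an initial plus an enhanced reconstruction network maps the measurements back to an image. The following is a minimal PyTorch sketch of that flow, not the authors' implementation; the class name MultiscaleCSNet, the three-scale decomposition, the block size, and the layer widths are assumptions made only to illustrate the structure described in the abstract.

```python
# Hypothetical sketch of the described pipeline; layer sizes are illustrative only.
import torch
import torch.nn as nn

class MultiscaleCSNet(nn.Module):
    def __init__(self, sampling_rate=0.1, block=32, scales=3):
        super().__init__()
        # Learned convolution that mimics a linear multiscale decomposition
        # (one output channel per assumed scale).
        self.decompose = nn.Conv2d(1, scales, kernel_size=3, padding=1, bias=False)
        # Sampling network: a strided convolution per scale acts as a block-wise
        # measurement operator; the channel count sets the per-scale sampling rate.
        m = max(1, int(sampling_rate * block * block) // scales)
        self.sample = nn.ModuleList(
            [nn.Conv2d(1, m, kernel_size=block, stride=block, bias=False)
             for _ in range(scales)]
        )
        # Initial reconstruction: map each scale's measurements back to image size.
        self.init_rec = nn.ModuleList(
            [nn.ConvTranspose2d(m, 1, kernel_size=block, stride=block, bias=False)
             for _ in range(scales)]
        )
        # Enhanced reconstruction: a small residual CNN refines the initial estimate.
        self.enhance = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 1, 3, padding=1),
        )

    def forward(self, x):
        scales = self.decompose(x).split(1, dim=1)              # learned scales
        y = [s(z) for s, z in zip(self.sample, scales)]          # CS measurements
        init = sum(rec(meas) for rec, meas in zip(self.init_rec, y))  # initial estimate
        return init + self.enhance(init)                         # enhanced output

img = torch.randn(1, 1, 96, 96)   # toy grayscale input (side divisible by block)
out = MultiscaleCSNet()(img)
print(out.shape)                  # torch.Size([1, 1, 96, 96])
```

In this reading, end-to-end training would jointly learn the decomposition, the per-scale measurement operators, and both reconstruction stages; the actual network configuration and training setup are given in the full article (DOI above).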