GAN-Based Local Lightness-Aware Enhancement Network for Underexposed Images
Published in: Journal of Information Processing Systems, 2022, 18(4), 76, pp. 575-586
Main Authors:
Format: Article
Language: English
Summary: Uneven light in the real world causes visual degradation in underexposed regions. For these regions, insufficient consideration during the enhancement procedure results in over-/under-exposure, loss of detail, and color distortion. Confronting such challenges, an unsupervised low-light image enhancement network guided by unpaired low-/normal-light images is proposed in this paper. The key components of the network are a super-resolution module (SRM), a GAN-based low-light image enhancement network (LLIEN), and a denoising-scaling module (DSM). The SRM increases the resolution of the low-light input images before illumination enhancement; operating in high-resolution space helps preserve texture details. Subsequently, a local lightness attention module in LLIEN distinguishes unevenly illuminated areas and puts emphasis on low-light areas, ensuring spatial consistency of illumination for locally underexposed images. Then, multiple discriminators, i.e., a global discriminator, a local region discriminator, and a color discriminator, perform assessment from different perspectives to avoid over-/under-exposure and color distortion, guiding the network to generate images that are in line with human aesthetic perception. Finally, the DSM removes noise and yields high-quality enhanced images. Both qualitative and quantitative experiments demonstrate that the approach achieves favorable results, indicating its superior capacity for illumination and texture detail restoration.
KCI Citation Count: 0
ISSN: 1976-913X, 2092-805X
DOI: 10.3745/JIPS.02.0179
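
The three-stage pipeline described in the summary (SRM, then LLIEN with a local lightness attention map, then DSM) can be sketched roughly as follows. This is a minimal illustrative sketch in PyTorch and not the authors' implementation: every layer choice, the residual structure, and the attention heuristic (weighting dark regions by inverted lightness) are assumptions made only to show how the described components could fit together; the GAN discriminators mentioned in the abstract are omitted and would attach to the LLIEN output during training.

```python
# Hypothetical sketch of the pipeline described in the abstract:
# SRM (upsample) -> LLIEN (attention-weighted enhancement) -> DSM (denoise + scale back).
# All module internals are illustrative assumptions, not the paper's architecture.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SRM(nn.Module):
    """Super-resolution module: raise input resolution before enhancement."""
    def __init__(self, scale=2):
        super().__init__()
        self.scale = scale
        self.refine = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 3, 3, padding=1),
        )

    def forward(self, x):
        up = F.interpolate(x, scale_factor=self.scale, mode="bicubic", align_corners=False)
        return up + self.refine(up)  # residual refinement in high-resolution space


class LLIEN(nn.Module):
    """Enhancement generator with a local lightness attention map.

    The attention map (1 - rough illumination) weights dark regions more heavily,
    so the enhancement concentrates on underexposed areas.
    """
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(4, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 3, 3, padding=1),
        )

    def forward(self, x):
        lightness = x.max(dim=1, keepdim=True).values   # crude illumination estimate
        attention = 1.0 - lightness                      # emphasize dark regions
        residual = self.body(torch.cat([x, attention], dim=1))
        # Apply a stronger change where the scene is darker; discriminators would
        # score this output during adversarial training (not shown here).
        return torch.clamp(x + attention * residual, 0.0, 1.0)


class DSM(nn.Module):
    """Denoising-scaling module: suppress noise, then return to the input size."""
    def __init__(self, scale=2):
        super().__init__()
        self.scale = scale
        self.denoise = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 3, 3, padding=1),
        )

    def forward(self, x):
        x = x + self.denoise(x)
        return F.interpolate(x, scale_factor=1.0 / self.scale,
                             mode="bicubic", align_corners=False)


class Pipeline(nn.Module):
    def __init__(self):
        super().__init__()
        self.srm, self.llien, self.dsm = SRM(), LLIEN(), DSM()

    def forward(self, low_light):
        return self.dsm(self.llien(self.srm(low_light)))


if __name__ == "__main__":
    out = Pipeline()(torch.rand(1, 3, 64, 64))
    print(out.shape)  # torch.Size([1, 3, 64, 64])
```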