Depth-Aware Unpaired Video Dehazing
Published in: IEEE Transactions on Image Processing, 2024, Vol. 33, pp. 2388-2403
Format: Article
Language: English
Summary: This paper investigates a novel unpaired video dehazing framework, which is a good candidate in practice because it relieves the pressure of collecting paired data. In such a paradigm, two key issues must be addressed for satisfactory performance: 1) temporal consistency, which does not arise in single-image dehazing, and 2) stronger dehazing ability. To handle these problems, we resort to introducing depth information to construct additional regularization and supervision. Specifically, we synthesize realistic motions from depth information to improve the effectiveness and applicability of traditional temporal losses, and thus better regularize spatiotemporal consistency. Moreover, depth information is also exploited in adversarial learning: for haze removal, it guides the local discriminator to focus on regions where haze residuals are more likely to remain. The dehazing performance is consequently improved by the more pertinent guidance from our depth-aware local discriminator. Extensive experiments validate our effectiveness and superiority over other competitors. To the best of our knowledge, this study is the first foray into the task of unpaired video dehazing. Our code is available at https://github.com/YaN9-Y/DUVD .
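The summary describes two depth-driven ideas: synthesizing realistic motion from depth to strengthen a temporal-consistency loss, and a depth-aware discriminator. The first idea can be illustrated with a minimal sketch. This is not the authors' implementation (see their repository for that); the functions `depth_to_flow`, `warp`, and `temporal_consistency_loss` are hypothetical names, and the parallax model (motion inversely proportional to depth under a lateral camera shift) is an assumed simplification:

```python
import numpy as np

def depth_to_flow(depth, cam_tx=2.0):
    # Assumed parallax model: under a lateral camera translation cam_tx,
    # apparent pixel motion is inversely proportional to scene depth
    # (nearer pixels move more). A simplification for illustration only.
    return cam_tx / np.maximum(depth, 1e-6)

def warp(img, flow_x):
    # Backward warp along x with nearest-neighbour sampling.
    h, w = img.shape
    xs = np.broadcast_to(np.arange(w), (h, w))
    src = np.clip(np.round(xs - flow_x).astype(int), 0, w - 1)
    return np.take_along_axis(img, src, axis=1)

def temporal_consistency_loss(dehaze, frame, depth):
    # If the restorer is temporally stable, dehazing the depth-warped
    # frame should match warping the dehazed frame.
    flow = depth_to_flow(depth)
    a = dehaze(warp(frame, flow))
    b = warp(dehaze(frame), flow)
    return float(np.abs(a - b).mean())
```

A perfectly warp-equivariant restorer drives this loss to zero; penalizing the residual during training regularizes frame-to-frame consistency without requiring real paired video.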
ISSN: 1057-7149, 1941-0042
DOI: 10.1109/TIP.2024.3378472