Deep Label Prior: Pre-Training-Free Salient Object Detection Network based on Label Learning
Published in: IEEE Transactions on Multimedia, 2023-01, Vol. 25, pp. 1-13
Format: Article
Language: English
Summary: Due to their excellent semantic extraction capabilities, deep learning methods have made significant progress in salient object detection (SOD). However, these methods often require time-consuming pre-training and large training datasets with ground truth. To address these issues, drawing on the framework known as "deep image prior" (DIP), we propose an SOD method called the deep label prior network (DLPNet), which consists of an A-stream and a B-stream. The A-stream comprises two cascaded UNets and a simple CNN module that extract the initial saliency map, while the B-stream contains only two cascaded UNets, which refine the extracted initial saliency map. Unlike most current deep learning methods, DLPNet views the SOD task as a conditional image generation problem, relying only on the internal prior of the input itself to generate the saliency map. Hence, DLPNet requires neither pre-training nor large annotated or unannotated datasets. Furthermore, we propose a morphology operation scheme that creates rich pseudo-labels to facilitate the updating of the network weights. Extensive experiments demonstrate that our method outperforms state-of-the-art unsupervised techniques and is even comparable to state-of-the-art supervised and weakly supervised methods on different evaluation metrics.
ISSN: 1520-9210 (print); 1941-0077 (electronic)
DOI: 10.1109/TMM.2022.3204440
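
The summary above describes a DIP-style, single-image training loop: a randomly initialized two-stream network is fitted to one input image, with morphology operations generating varied pseudo-labels from the coarse map to drive the weight updates. Below is a minimal, hypothetical PyTorch sketch of that idea. The module names (TinyUNet, AStream, BStream), the use of max-pooling as dilation/erosion, the loss, and all hyperparameters are assumptions for illustration, not the authors' actual DLPNet implementation.

```python
# Hypothetical sketch of a DIP-style, single-image SOD loop with
# morphology-generated pseudo-labels. Names and hyperparameters are
# illustrative assumptions, not the authors' DLPNet code.
import torch
import torch.nn as nn
import torch.nn.functional as F

def morph_pseudo_labels(mask, kernel_size=5):
    """Create varied pseudo-labels from a coarse saliency mask using
    max-pooling as dilation and dilation-of-complement as erosion."""
    pad = kernel_size // 2
    dilated = F.max_pool2d(mask, kernel_size, stride=1, padding=pad)
    eroded = 1.0 - F.max_pool2d(1.0 - mask, kernel_size, stride=1, padding=pad)
    return [mask, dilated, eroded]

class TinyUNet(nn.Module):
    """A very small encoder-decoder standing in for one cascaded UNet."""
    def __init__(self, in_ch, out_ch, width=16):
        super().__init__()
        self.enc = nn.Sequential(
            nn.Conv2d(in_ch, width, 3, padding=1), nn.ReLU(),
            nn.Conv2d(width, width, 3, stride=2, padding=1), nn.ReLU())
        self.dec = nn.Sequential(
            nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False),
            nn.Conv2d(width, out_ch, 3, padding=1))
    def forward(self, x):
        return self.dec(self.enc(x))

class AStream(nn.Module):
    """Two cascaded UNets plus a simple CNN head: initial saliency map."""
    def __init__(self):
        super().__init__()
        self.u1, self.u2 = TinyUNet(3, 16), TinyUNet(16, 16)
        self.head = nn.Conv2d(16, 1, 3, padding=1)
    def forward(self, x):
        return torch.sigmoid(self.head(self.u2(self.u1(x))))

class BStream(nn.Module):
    """Two cascaded UNets that refine the initial saliency map."""
    def __init__(self):
        super().__init__()
        self.u1, self.u2 = TinyUNet(4, 16), TinyUNet(16, 1)
    def forward(self, x, coarse):
        return torch.sigmoid(self.u2(self.u1(torch.cat([x, coarse], dim=1))))

image = torch.rand(1, 3, 128, 128)  # the single input image
a_net, b_net = AStream(), BStream()
opt = torch.optim.Adam(
    list(a_net.parameters()) + list(b_net.parameters()), lr=1e-3)

for step in range(200):        # single-image fitting; no pre-training
    coarse = a_net(image)      # initial saliency map from the A-stream
    refined = b_net(image, coarse)  # refined map from the B-stream
    # Morphology scheme: richer supervision derived from the coarse map.
    loss = sum(F.binary_cross_entropy(refined, pl.detach())
               for pl in morph_pseudo_labels(coarse))
    opt.zero_grad()
    loss.backward()
    opt.step()
```

The key design point the sketch tries to capture is that nothing here is pre-trained: both streams start from random weights and are optimized against pseudo-labels manufactured from the network's own coarse prediction, so the only supervisory signal comes from the input image and the morphology operations, consistent with the "internal prior" framing in the summary.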