Camouflaged Object Detection That Does Not Require Additional Priors
Published in: Applied Sciences, 2024-03, Vol. 14 (6), p. 2621
Main Authors:
Format: Article
Language: English
Summary: Camouflaged object detection (COD) is an arduous challenge due to the striking resemblance of camouflaged objects to their surroundings. The abundance of similar background information can significantly impede the performance of camouflaged object detection algorithms. Prior research in this domain has often relied on supplementary prior knowledge to guide model training. However, acquiring such prior knowledge is resource-intensive. Furthermore, the supplementary prior information is typically already embedded in the original image, yet it remains underutilized. To address these issues, in this paper we introduce a novel Camouflage Cues Guidance Network (CCGNet) for camouflaged object detection that does not rely on additional prior knowledge. Specifically, we use an adaptive approach to track the learning state of the model with respect to the camouflaged object and dynamically extract camouflage cues from the original image. In addition, we introduce a foreground separation module and an edge refinement module to effectively utilize these camouflage cues, assisting the model in fully separating camouflaged objects and enabling precise edge prediction. Extensive experimental results demonstrate that our proposed method achieves superior performance compared with state-of-the-art approaches.
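The summary describes CCGNet only at a high level, and the paper's actual implementation is not part of this record. As a purely illustrative aid, below is a minimal PyTorch sketch of how cue-guided foreground separation and boundary prediction might be wired. Everything here is an assumption: the module names (ForegroundSeparation, EdgeRefinement), the single-channel cue map, and the tensor shapes are hypothetical stand-ins, not the authors' architecture.

```python
import torch
import torch.nn as nn


class ForegroundSeparation(nn.Module):
    """Hypothetical foreground separation: fuses backbone features with a
    camouflage-cue map and re-weights them to suppress background responses."""

    def __init__(self, channels: int):
        super().__init__()
        self.fuse = nn.Sequential(
            nn.Conv2d(channels + 1, channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
        )

    def forward(self, feats: torch.Tensor, cue: torch.Tensor) -> torch.Tensor:
        # Concatenate the single-channel cue map with the features, then use
        # the fused result as a gate over the original features.
        fused = self.fuse(torch.cat([feats, cue], dim=1))
        return feats * torch.sigmoid(fused) + feats


class EdgeRefinement(nn.Module):
    """Hypothetical edge refinement: predicts an object-boundary map
    from the cue-guided features."""

    def __init__(self, channels: int):
        super().__init__()
        self.edge_head = nn.Conv2d(channels, 1, kernel_size=1)

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        return torch.sigmoid(self.edge_head(feats))


# Toy usage: 64-channel features at 88x88 with a stand-in cue map mined
# from the input image (here just random values for demonstration).
feats = torch.randn(2, 64, 88, 88)
cue = torch.rand(2, 1, 88, 88)
refined = ForegroundSeparation(64)(feats, cue)
edge_map = EdgeRefinement(64)(refined)
print(refined.shape, edge_map.shape)  # (2, 64, 88, 88) and (2, 1, 88, 88)
```

The gating-plus-residual design in ForegroundSeparation is one common way to let an auxiliary map modulate features without discarding the original signal; the paper itself may use a different fusion scheme.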
ISSN: 2076-3417
DOI: 10.3390/app14062621