
A Deep Model for Partial Multi-label Image Classification with Curriculum-based Disambiguation

Bibliographic Details
Published in: International Journal of Automation and Computing, 2024-08, Vol. 21 (4), pp. 801-814
Main Authors: Sun, Feng; Xie, Ming-Kun; Huang, Sheng-Jun
Format: Article
Language:English
Summary: In this paper, we study the partial multi-label (PML) image classification problem, where each image is annotated with a candidate label set consisting of multiple relevant labels and other noisy labels. Existing PML methods typically design a disambiguation strategy to filter out noisy labels by utilizing prior knowledge with extra assumptions, which unfortunately is unavailable in many real tasks. Furthermore, because the objective function for disambiguation is usually elaborately designed on the whole training set, it can hardly be optimized in a deep model with stochastic gradient descent (SGD) on mini-batches. In this paper, for the first time, we propose a deep model for PML to enhance the representation and discrimination ability. On the one hand, we propose a novel curriculum-based disambiguation strategy to progressively identify ground-truth labels by incorporating the varied difficulties of different classes. On the other hand, consistency regularization is introduced for model training to balance fitting identified easy labels and exploiting potential relevant labels. Extensive experimental results on the commonly used benchmark datasets show that the proposed method significantly outperforms the state-of-the-art (SOTA) methods.
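The core idea in the abstract — progressively identifying ground-truth labels from the candidate set, with per-class difficulty taken into account — can be illustrated with a minimal NumPy sketch. This is not the paper's actual formulation (the record does not give the equations); the difficulty proxy (mean predicted probability of each class over its candidate occurrences), the threshold schedule, and the function name `curriculum_select` are all illustrative assumptions.

```python
import numpy as np

def curriculum_select(probs, candidate_mask, epoch, max_epoch, base_tau=0.9):
    """Hypothetical curriculum-based disambiguation step.

    probs          : (n, c) array of predicted label probabilities.
    candidate_mask : (n, c) binary array; 1 marks a candidate label.
    Returns a (n, c) boolean mask of labels currently treated as ground truth.
    """
    # Class difficulty proxy (an assumption, not the paper's definition):
    # mean predicted probability over candidate occurrences of each class.
    # A higher mean is taken to indicate an easier class.
    cand_counts = candidate_mask.sum(axis=0).clip(min=1)
    class_ease = (probs * candidate_mask).sum(axis=0) / cand_counts
    # Curriculum schedule: start strict, relax the threshold as training
    # proceeds, and lower it further for easier classes so their labels
    # are identified first.
    progress = epoch / max_epoch
    tau = base_tau * (1.0 - 0.5 * progress) * (1.0 - 0.3 * class_ease)
    # Only labels inside the candidate set can be identified.
    return (probs >= tau) & candidate_mask.astype(bool)
```

With this sketch, an easy class (high average confidence among candidates) gets a lower effective threshold, so its relevant labels are identified early, while harder classes are only disambiguated as the threshold relaxes over epochs — matching the "progressively identify ground-truth labels by incorporating the varied difficulties of different classes" idea in the abstract.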
ISSN: 2731-538X; 1476-8186; 2731-5398; 1751-8520
DOI: 10.1007/s11633-023-1439-3