Weight matrix sharing for multi-label learning

Bibliographic Details
Published in: Pattern Recognition, 2023-04, Vol. 136, Article 109156
Main Authors: Qian, Kun, Min, Xue-Yang, Cheng, Yusheng, Min, Fan
Format: Article
Language:English
Description
Summary:
• We propose the shared weight matrix with low-rank and sparse regularization for multi-label learning (2SML) algorithm.
• The feature manifold and the label manifold share the weight matrix.
• We use high-representativeness instances to learn implicit correlations for sparse labels.
• We employ the nuclear norm to model the low-rank structure of missing labels.
• We employ the l1 norm to learn label-specific features for sparse structures.

Multi-label learning on real-world data is challenging due to sparse labels, missing labels, and sparse structures. Some existing approaches address the former two issues effectively. In this paper, we propose the shared weight matrix with low-rank and sparse regularization for multi-label learning (2SML) algorithm to address all three issues simultaneously. First, two explicit correlation matrices are constructed from the feature matrix and the label matrix. Second, we select informative labels by instance representativeness to learn implicit correlations. Third, a feature manifold and a label manifold are employed to guide the shared weight learning process. Extensive experiments are undertaken on multiple benchmark datasets, with and without missing labels. The results show that the proposed method outperforms state-of-the-art methods.
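The core objective sketched in the abstract, a single weight matrix W penalized by the nuclear norm (low rank, for missing labels) and the l1 norm (sparsity, for label-specific features), can be illustrated with a proximal-gradient loop. This is a minimal sketch of the regularization scheme only, not the authors' 2SML implementation: the manifold terms, correlation matrices, and instance-representativeness selection are omitted, and the function name `fit_shared_weights` and all hyperparameter values are hypothetical.

```python
import numpy as np

def svt(M, tau):
    # Singular value thresholding: proximal operator of the nuclear norm.
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def soft(M, tau):
    # Elementwise soft thresholding: proximal operator of the l1 norm.
    return np.sign(M) * np.maximum(np.abs(M) - tau, 0.0)

def fit_shared_weights(X, Y, lam_rank=0.1, lam_sparse=0.01, iters=200):
    """Sketch: minimize ||XW - Y||_F^2 + lam_rank*||W||_* + lam_sparse*||W||_1
    by gradient steps on the loss followed by the two proximal operators in
    turn (a heuristic splitting, not the exact prox of the combined penalty)."""
    d, q = X.shape[1], Y.shape[1]
    W = np.zeros((d, q))
    lr = 1.0 / (np.linalg.norm(X, 2) ** 2)  # step size from the spectral norm
    for _ in range(iters):
        G = X.T @ (X @ W - Y)          # gradient of the squared loss
        W = W - lr * G
        W = svt(W, lr * lam_rank)      # shrink singular values (low rank)
        W = soft(W, lr * lam_sparse)   # shrink entries (sparsity)
    return W

# Toy usage: 50 instances, 8 features, 4 labels, with a rank-2 ground truth.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 8))
W_true = np.zeros((8, 4))
W_true[:2] = rng.standard_normal((2, 4))
Y = (X @ W_true > 0).astype(float)
W = fit_shared_weights(X, Y)
print(W.shape)  # (8, 4)
```

The two proximal operators are applied sequentially here for brevity; a faithful solver would use an ADMM-style splitting so each penalty gets its own auxiliary variable.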
ISSN:0031-3203
1873-5142
DOI:10.1016/j.patcog.2022.109156