
Label Enhancement for Label Distribution Learning

Bibliographic Details
Published in: IEEE Transactions on Knowledge and Data Engineering, 2021-04, Vol. 33 (4), pp. 1632-1643
Main Authors: Xu, Ning; Liu, Yun-Peng; Geng, Xin
Format: Article
Language: English
Description
Summary: Label distribution is more general than both single-label and multi-label annotation. It covers a certain number of labels and represents the degree to which each label describes the instance. Learning from instances labeled with label distributions is called label distribution learning (LDL). Unfortunately, many training sets contain only simple logical labels rather than label distributions, because label distributions are difficult to obtain directly. One way to solve this problem is to recover the label distributions from the logical labels in the training set by leveraging the topological information of the feature space and the correlation among the labels. This process of recovering label distributions from logical labels is defined as label enhancement (LE), and it reinforces the supervision information in the training set. This paper proposes a novel LE algorithm called Graph Laplacian Label Enhancement (GLLE). Experimental results on one artificial dataset and fourteen real-world LDL datasets show clear advantages of GLLE over several existing LE algorithms. Furthermore, experimental results on eleven multi-label learning datasets validate the advantage of GLLE over state-of-the-art multi-label learning approaches.
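
The summary describes label enhancement as recovering soft label distributions from 0/1 logical labels by exploiting the topology of the feature space. The Python snippet below is a minimal illustrative sketch of that idea, not the paper's GLLE algorithm: it builds a k-nearest-neighbor graph over the instances, smooths the logical labels with a graph-Laplacian regularizer in closed form, and normalizes each row into a distribution. The function name label_enhancement_sketch, the choice of k, the weight alpha, and the softmax normalization are all assumptions made for this example.

import numpy as np
from scipy.special import softmax
from sklearn.neighbors import kneighbors_graph

def label_enhancement_sketch(X, L, k=5, alpha=0.5):
    """Recover soft label distributions from logical 0/1 labels.

    Illustrative graph-Laplacian smoothing only, under the assumptions
    stated in the lead-in; this is not the GLLE algorithm of the paper.
    X: (n, d) feature matrix; L: (n, c) logical label matrix.
    """
    n = X.shape[0]
    # Symmetric k-NN connectivity graph capturing feature-space topology.
    W = kneighbors_graph(X, n_neighbors=k, mode="connectivity").toarray()
    W = np.maximum(W, W.T)
    # Unnormalized graph Laplacian.
    Lap = np.diag(W.sum(axis=1)) - W
    # Closed-form minimizer of ||F - L||_F^2 + alpha * tr(F^T Lap F),
    # i.e. solve (I + alpha * Lap) F = L.
    F = np.linalg.solve(np.eye(n) + alpha * Lap, L.astype(float))
    # Turn each row into a non-negative vector that sums to one.
    return softmax(F, axis=1)

# Toy usage: six instances, three labels, purely logical annotations.
rng = np.random.default_rng(0)
X = rng.normal(size=(6, 4))
L = np.array([[1, 0, 0], [1, 0, 0], [0, 1, 0],
              [0, 1, 1], [0, 0, 1], [0, 0, 1]])
print(label_enhancement_sketch(X, L).round(2))

Under this formulation, instances that are close in feature space receive similar recovered distributions, which matches the intuition the summary attributes to label enhancement.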
ISSN: 1041-4347, 1558-2191
DOI: 10.1109/TKDE.2019.2947040