Partial Label Feature Selection: An Adaptive Approach
Published in: IEEE Transactions on Knowledge and Data Engineering, 2024-08, Vol. 36 (8), pp. 4178-4191
Main Authors: , , , , ,
Format: Article
Language: English
Summary: As an emerging weakly supervised learning framework, partial label learning aims to induce a multi-class classifier from ambiguous supervision information, where each training example is associated with a set of candidate labels, among which only one is the true label. Traditional feature selection methods, whether for single-label or multi-label problems, are not applicable to partial label learning, as the ambiguous information contained in the label space obfuscates the importance of features and misleads the selection process. This makes the selection of a proper feature subset from partial label examples particularly challenging, and it has therefore rarely been investigated. In this paper, we propose a novel feature selection algorithm for partial label learning, named PLFS, which not only considers the relationships between features and labels but also exploits the relationships between instances to select the most informative and important features, enhancing the performance of partial label learning. PLFS constructs an adaptive weighted graph to exploit the similarity information among instances, differentiate the label space, and weight the feature space, which leads to the selection of a proper feature subset. Extensive experiments over a broad range of benchmark data sets clearly validate the effectiveness of our proposed feature selection approach.
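The summary describes the approach only at a high level. As a minimal illustrative sketch (not the authors' PLFS algorithm, whose details are not given here), the general idea can be approximated in two stages: a kNN similarity graph over instances propagates confidences within each example's candidate label set to disambiguate it, and the disambiguated labels then score each feature with a Fisher-like criterion. The function name `plfs_sketch`, the fixed Gaussian kernel, and the Fisher-style score are all assumptions chosen for illustration.

```python
import numpy as np

def plfs_sketch(X, candidate_sets, n_classes, k=3, n_iters=10):
    """Illustrative sketch only: graph-based candidate-label disambiguation
    followed by Fisher-like feature scoring (not the paper's PLFS)."""
    n, d = X.shape
    # Gaussian similarity, restricted to the k strongest edges per instance.
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    sigma = np.mean(dist) + 1e-12
    W = np.exp(-(dist / sigma) ** 2)
    np.fill_diagonal(W, 0.0)
    for i in range(n):
        W[i, np.argsort(W[i])[:-k]] = 0.0  # keep k largest weights per row
    W = (W + W.T) / 2.0
    P = W / (W.sum(axis=1, keepdims=True) + 1e-12)  # row-stochastic graph

    # Label confidences: uniform over each candidate set initially.
    F = np.zeros((n, n_classes))
    for i, cands in enumerate(candidate_sets):
        F[i, list(cands)] = 1.0 / len(cands)

    # Propagate confidences over the graph, re-projecting onto candidate sets.
    for _ in range(n_iters):
        F = P @ F
        for i, cands in enumerate(candidate_sets):
            mask = np.zeros(n_classes)
            mask[list(cands)] = 1.0
            F[i] *= mask
            F[i] /= F[i].sum() + 1e-12
    y = F.argmax(axis=1)  # disambiguated label per instance

    # Fisher-like score: between-class over within-class variance per feature.
    scores = np.zeros(d)
    overall = X.mean(axis=0)
    for j in range(d):
        num = den = 0.0
        for c in range(n_classes):
            Xc = X[y == c, j]
            if len(Xc) == 0:
                continue
            num += len(Xc) * (Xc.mean() - overall[j]) ** 2
            den += ((Xc - Xc.mean()) ** 2).sum()
        scores[j] = num / (den + 1e-12)
    return scores, y

# Toy example: feature 0 separates the two classes, feature 1 is noise;
# some candidate sets contain a distractor label alongside the true one.
X = np.array([[0.0, 1.0], [0.2, 3.0], [0.1, 2.0],
              [5.0, 2.5], [5.2, 1.5], [4.9, 3.0]])
candidate_sets = [{0, 1}, {0}, {0, 1}, {1}, {0, 1}, {1}]
scores, y = plfs_sketch(X, candidate_sets, n_classes=2)
```

On this toy data the graph disambiguates the ambiguous examples toward their cluster's label, and the discriminative feature 0 receives a much higher score than the noise feature 1, so selecting the top-ranked feature recovers the informative one.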
ISSN: 1041-4347, 1558-2191
DOI: 10.1109/TKDE.2024.3365691