Nonnegative Laplacian embedding guided subspace learning for unsupervised feature selection
Published in: Pattern Recognition, 2019-09, Vol. 93, pp. 337-352
Main Authors: , , ,
Format: Article
Language: English
Summary: Unsupervised feature selection plays an important role in machine learning and data mining, but it is challenging because class labels are unavailable. In this paper, we propose an unsupervised feature selection framework that combines the discriminative information of pseudo class labels with subspace learning. In the proposed framework, nonnegative Laplacian embedding is first used to produce pseudo labels, so as to improve classification accuracy. Then, an optimal feature subset is selected through subspace learning guided by the discriminative information of the pseudo labels, while preserving the local structure of the data. We develop an iterative strategy for updating the similarity matrix and the pseudo labels, which yields more accurate pseudo labels, and we prove the convergence of the proposed strategy. Finally, experimental results on six real-world datasets demonstrate the superiority of the proposed approach over seven state-of-the-art methods.
ISSN: 0031-3203; 1873-5142
DOI: 10.1016/j.patcog.2019.04.020
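The summary above outlines a pipeline that is common in graph-based unsupervised feature selection: build a similarity graph over the samples, compute a nonnegative spectral (Laplacian) embedding whose rows act as soft pseudo labels, and rank features by how well a learned projection maps them onto those pseudo labels. The sketch below is only a minimal illustration of that general pipeline, not the authors' exact formulation; it omits, among other things, the paper's joint iterative update of the similarity matrix and pseudo labels, and every function name, parameter choice, and update rule here is an assumption made for illustration.

```python
import numpy as np


def similarity_matrix(X, k=5, sigma=1.0):
    """Symmetrized k-nearest-neighbour graph with Gaussian (heat-kernel) weights."""
    n = X.shape[0]
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    S = np.exp(-d2 / (2.0 * sigma ** 2))
    np.fill_diagonal(S, 0.0)
    # keep only each sample's k strongest edges, then symmetrize
    idx = np.argsort(-S, axis=1)[:, :k]
    mask = np.zeros_like(S, dtype=bool)
    mask[np.arange(n)[:, None], idx] = True
    return np.where(mask | mask.T, S, 0.0)


def nonnegative_laplacian_embedding(S, c, n_iter=300, eps=1e-12):
    """Nonnegative embedding F (n x c) that roughly maximizes tr(F^T S F) with F >= 0,
    via the standard multiplicative update F <- F * (S F) / (F F^T S F).
    Each row of F is treated as a soft pseudo-label assignment over c clusters."""
    rng = np.random.default_rng(0)
    F = np.abs(rng.standard_normal((S.shape[0], c))) + eps
    for _ in range(n_iter):
        F *= (S @ F) / np.maximum(F @ (F.T @ (S @ F)), eps)
    # normalize rows so each row reads as a soft label distribution
    return F / np.maximum(F.sum(axis=1, keepdims=True), eps)


def feature_scores(X, F, lam=1.0):
    """Ridge-style projection W mapping the d features onto the pseudo labels;
    the row norms of W score each feature's discriminative power."""
    d = X.shape[1]
    W = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ F)
    return np.linalg.norm(W, axis=1)


# Toy usage: rank the features of a random 100 x 50 data matrix and keep the top 10.
X = np.random.default_rng(1).standard_normal((100, 50))
S = similarity_matrix(X, k=5)
F = nonnegative_laplacian_embedding(S, c=3)
selected = np.argsort(-feature_scores(X, F))[:10]
print("selected feature indices:", selected)
```

As a design note, the ridge-style feature scoring stands in for the paper's subspace-learning term purely for simplicity; the actual method couples the projection, the similarity matrix, and the pseudo labels in one iterative optimization with a convergence guarantee, as described in the summary.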