Consensus learning guided multi-view unsupervised feature selection
Published in: Knowledge-Based Systems, 2018-11, Vol. 160, pp. 49-60
Main Authors:
Format: Article
Language: English
Summary: Multi-view unsupervised feature selection has been proven to be an effective approach to reduce the dimensionality of multi-view data. One of its key issues is how to exploit the underlying common structures across different views. In this paper, we propose a consensus learning guided multi-view unsupervised feature selection method, which embeds multi-view feature selection into a non-negative matrix factorization based clustering with a sparse constraint. The proposed method learns latent feature matrices from all the views, and optimizes a consensus matrix such that the difference between the cluster indicator matrix of each view and the consensus matrix is minimized. The parameters for balancing the weights of different views are automatically adjusted, and a sparse constraint is imposed on the latent feature matrices to perform feature selection. We then design an effective iterative algorithm to solve the resulting optimization problem. Extensive experiments have been conducted on six publicly available multi-view datasets, and the results demonstrate that the proposed algorithm outperforms several other state-of-the-art single-view and multi-view unsupervised feature selection methods on clustering tasks, validating the effectiveness of the proposed multi-view unsupervised feature selection method. The source code of our algorithm will be available on our online page: http://tangchang.net/. (A hedged sketch of the described scheme follows this record.)
ISSN: 0950-7051, 1872-7409
DOI: 10.1016/j.knosys.2018.06.016
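
The summary describes per-view non-negative matrix factorization with a sparsity penalty, a consensus cluster-indicator matrix, and automatically adjusted view weights, but gives no formulas. The sketch below is one plausible reading of that kind of scheme in NumPy; the function name `consensus_mv_feature_selection`, the multiplicative updates, the inverse-residual view weighting, and the parameters `alpha` and `n_iter` are illustrative assumptions, not the authors' published algorithm (their code is promised at http://tangchang.net/).

```python
import numpy as np

def consensus_mv_feature_selection(views, n_clusters, alpha=0.1, n_iter=200, seed=0):
    """Hedged sketch: consensus-guided multi-view NMF with a row-sparsity
    (l2,1-style) penalty on the per-view latent feature matrices.

    views : list of non-negative arrays, each of shape (n_samples, n_features_v).
    Returns per-view feature scores, the consensus matrix, and the view weights.
    """
    rng = np.random.default_rng(seed)
    n, k, eps = views[0].shape[0], n_clusters, 1e-10

    # Per-view factorization X_v ~= V_v @ U_v.T, with V_v (n x k) acting as a
    # soft cluster indicator and U_v (d_v x k) as the latent feature matrix.
    Us = [rng.random((X.shape[1], k)) for X in views]
    Vs = [rng.random((n, k)) for _ in views]
    V_star = rng.random((n, k))                 # consensus indicator matrix
    weights = np.ones(len(views)) / len(views)  # view weights, updated each pass

    for _ in range(n_iter):
        for v, X in enumerate(views):
            U, V, w = Us[v], Vs[v], weights[v]
            # Multiplicative update for U with the l2,1 term: D reweights each
            # row of U by the inverse of its current l2 norm.
            D = 1.0 / (2.0 * np.linalg.norm(U, axis=1, keepdims=True) + eps)
            U *= (X.T @ V) / (U @ (V.T @ V) + alpha * D * U + eps)
            # Multiplicative update for V, pulled toward the consensus V_star.
            V *= (X @ U + w * V_star) / (V @ (U.T @ U) + w * V + eps)
        # Consensus: weighted average of the per-view indicator matrices.
        V_star = sum(w * V for w, V in zip(weights, Vs)) / (weights.sum() + eps)
        # Re-weight views inversely to their disagreement with the consensus,
        # one common heuristic for "automatically adjusted" view weights.
        residuals = np.array([np.linalg.norm(V - V_star) for V in Vs]) + eps
        weights = (1.0 / residuals) / (1.0 / residuals).sum()

    # Feature score per view: l2 norm of each row of U_v; the highest-scoring
    # features in each view would be the ones kept.
    scores = [np.linalg.norm(U, axis=1) for U in Us]
    return scores, V_star, weights
```

As a usage illustration, calling the sketch on two random non-negative views and ranking features by `np.argsort(scores[v])[::-1]` would return, for each view, the features whose latent rows have the largest norms, which is the selection criterion the l2,1-style penalty is meant to induce.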