Balance guided incomplete multi-view spectral clustering
Published in: Neural Networks, 2023-09, Vol. 166, pp. 260-272
Main Authors: , , , ,
Format: Article
Language: English
Summary: There is a large volume of incomplete multi-view data in the real world. Partitioning such incomplete multi-view data is a pressing practical problem, since almost all conventional multi-view clustering methods are inapplicable to cases with missing views. In this paper, a novel graph learning-based incomplete multi-view clustering (IMVC) method is proposed to address this issue. Unlike existing works, our method learns a common consensus graph from all incomplete views and obtains a clustering indicator matrix within a unified framework. To achieve a stable clustering result, a relaxed spectral clustering model is introduced to obtain a probability consensus representation whose elements are all positive and reflect the data clustering result. Considering the different contributions of the views to the clustering task, a weighted multi-view learning mechanism is introduced to automatically balance the effects of different views during model optimization. In this way, the intrinsic information of the incomplete multi-view data can be fully exploited. Experiments on several incomplete multi-view datasets show that our method outperforms the compared state-of-the-art clustering methods, demonstrating its effectiveness for IMVC.
ISSN: 0893-6080, 1879-2782
DOI: 10.1016/j.neunet.2023.07.022
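
The abstract above describes the method only at a high level. The sketch below is a hypothetical illustration of the weighted consensus-graph idea it mentions: per-view affinity graphs of incomplete views are fused with automatically learned view weights, and standard spectral clustering is then run on the consensus graph. The function name, the inverse-error auto-weighting rule, and the handling of missing samples are assumptions made for illustration; this is not the paper's actual objective or optimization algorithm.

```python
import numpy as np
from numpy.linalg import eigh
from sklearn.cluster import KMeans


def consensus_spectral_clustering(views, masks, n_clusters, n_iter=10):
    """Hypothetical sketch of weighted consensus-graph spectral clustering.

    views : list of (n, n) affinity matrices, zero-filled where samples are missing
    masks : list of (n,) boolean arrays, True where the sample is present in that view
    """
    n = views[0].shape[0]
    m = len(views)
    weights = np.full(m, 1.0 / m)            # start with equal view weights

    for _ in range(n_iter):
        # Weighted average of the observed entries of each view's graph.
        consensus = np.zeros((n, n))
        denom = np.zeros((n, n))
        for w, A, mask in zip(weights, views, masks):
            obs = np.outer(mask, mask)        # entries observed in this view
            consensus += w * A * obs
            denom += w * obs
        consensus = consensus / np.maximum(denom, 1e-12)

        # Re-weight views inversely to their disagreement with the consensus,
        # so better-aligned views contribute more (a common auto-weighting rule,
        # standing in for the paper's balance-guided weighting).
        errs = np.array([
            np.linalg.norm((A - consensus) * np.outer(mask, mask))
            for A, mask in zip(views, masks)
        ])
        weights = 1.0 / (errs + 1e-12)
        weights /= weights.sum()

    # Standard spectral clustering on the consensus graph.
    deg = consensus.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(deg, 1e-12))
    lap = np.eye(n) - d_inv_sqrt[:, None] * consensus * d_inv_sqrt[None, :]
    _, vecs = eigh(lap)                       # eigenvalues in ascending order
    embedding = vecs[:, :n_clusters]
    return KMeans(n_clusters=n_clusters, n_init=10).fit_predict(embedding)


if __name__ == "__main__":
    # Toy usage: one RBF affinity observed through two incomplete "views".
    rng = np.random.default_rng(0)
    X = rng.normal(size=(60, 5))
    dist = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    A = np.exp(-dist)
    masks = [rng.random(60) > 0.2, rng.random(60) > 0.2]
    views = [A * np.outer(mk, mk) for mk in masks]
    print(consensus_spectral_clustering(views, masks, n_clusters=3))
```

The inverse-error weighting used here is only one simple way to let reliable views dominate the fused graph; the paper's balance-guided scheme learns its view weights jointly with the consensus representation inside a single objective.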