
Co-Learning Non-Negative Correlated and Uncorrelated Features for Multi-View Data

Bibliographic Details
Published in: IEEE Transactions on Neural Networks and Learning Systems, 2021-04, Vol. 32 (4), p. 1486-1496
Main Authors: Zhao, Liang, Yang, Tao, Zhang, Jie, Chen, Zhikui, Yang, Yi, Wang, Z. Jane
Format: Article
Language: English
Summary: Multi-view data can represent objects from different perspectives and thus provide complementary information for data analysis. A topic of great importance in multi-view learning is to locate a low-dimensional latent subspace in which common semantic features are shared by multiple data sets. However, most existing methods ignore uncorrelated items (i.e., view-specific features) and may introduce semantic bias during common feature learning. In this article, we propose a non-negative correlated and uncorrelated feature co-learning (CoUFC) method to address this concern. More specifically, view-specific (uncorrelated) features are identified for each view while the common (correlated) features across views are learned in the latent semantic subspace. By eliminating the effects of uncorrelated information, useful inter-view feature correlations can be captured. We design a new objective function in CoUFC and derive an optimization approach to solve it, together with an analysis of its convergence. Experiments on real-world sensor, image, and text data sets demonstrate that the proposed method outperforms state-of-the-art multi-view learning methods.
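
To illustrate the general idea described in the abstract, the sketch below factorizes each non-negative view into a component driven by a representation shared across views (correlated features) and a component driven by a view-specific representation (uncorrelated features). This is only a minimal illustration under assumed choices, not the authors' CoUFC formulation: the factor names (W_v, B_v, H, S_v), the plain Frobenius-norm objective, the Lee-Seung-style multiplicative updates, and the function couple_views are all hypothetical.

# Minimal sketch (assumed, not the authors' CoUFC objective): each view
# X_v (d_v x n) is approximated as W_v @ H + B_v @ S_v, where H is the
# common (correlated) code shared by all views and S_v is the view-specific
# (uncorrelated) code; all factors are kept non-negative.
import numpy as np

def couple_views(views, k_shared=5, k_specific=3, n_iter=200, eps=1e-9, seed=0):
    """Jointly factorize a list of non-negative views sharing the same samples."""
    rng = np.random.default_rng(seed)
    n = views[0].shape[1]
    W = [rng.random((X.shape[0], k_shared)) for X in views]    # loadings of shared features
    B = [rng.random((X.shape[0], k_specific)) for X in views]  # loadings of specific features
    S = [rng.random((k_specific, n)) for _ in views]            # view-specific (uncorrelated) codes
    H = rng.random((k_shared, n))                                # common (correlated) codes

    for _ in range(n_iter):
        # Multiplicative updates for sum_v ||X_v - W_v H - B_v S_v||_F^2
        for v, X in enumerate(views):
            R = W[v] @ H + B[v] @ S[v]
            W[v] *= (X @ H.T) / (R @ H.T + eps)
            R = W[v] @ H + B[v] @ S[v]
            B[v] *= (X @ S[v].T) / (R @ S[v].T + eps)
            R = W[v] @ H + B[v] @ S[v]
            S[v] *= (B[v].T @ X) / (B[v].T @ R + eps)
        # The shared code H aggregates evidence from every view
        num = sum(W[v].T @ X for v, X in enumerate(views))
        den = sum(W[v].T @ (W[v] @ H + B[v] @ S[v]) for v in range(len(views))) + eps
        H *= num / den
    return W, B, H, S

if __name__ == "__main__":
    # Two synthetic non-negative views observing the same 100 samples
    rng = np.random.default_rng(1)
    views = [np.abs(rng.normal(size=(40, 100))), np.abs(rng.normal(size=(60, 100)))]
    W, B, H, S = couple_views(views)
    err = sum(np.linalg.norm(X - W[v] @ H - B[v] @ S[v]) for v, X in enumerate(views))
    print("total reconstruction error:", round(err, 3))

Running the sketch prints the total reconstruction error over both views; separating the view-specific term B_v @ S_v keeps the shared code H from absorbing patterns that exist in only one view, which is the intuition behind removing uncorrelated information before capturing inter-view correlations.
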
ISSN: 2162-237X, 2162-2388
DOI: 10.1109/TNNLS.2020.2984810