Multi-view low-rank dictionary learning for image classification
Published in: Pattern Recognition, 2016-02, Vol. 50, pp. 143-154
Main Authors: , , , , ,
Format: Article
Language: English
Summary: Recently, multi-view dictionary learning (DL) techniques have received much attention. Although several multi-view DL methods have been presented, they suffer from performance degeneration when heavy noise exists in multiple views. In this paper, we propose a novel multi-view DL approach named multi-view low-rank DL (MLDL) for image classification. Specifically, inspired by low-rank matrix recovery theory, we introduce a multi-view dictionary low-rank regularization term to address the noise problem. We further design a structural incoherence constraint for multi-view DL, so that redundancy among dictionaries of different views is reduced. In addition, to enhance the efficiency of the classification procedure, we design a classification scheme for MLDL based on the idea of collaborative representation based classification. We apply MLDL to face recognition, object classification, and digit classification tasks. Experimental results demonstrate the effectiveness and efficiency of the proposed approach. (Illustrative sketches of the objective and of the classification scheme follow this record.)
Highlights:
•We offer a multi-view low-rank dictionary learning method for image classification.
•A multi-view dictionary low-rank regularization term is designed to handle noise.
•A structural incoherence constraint is introduced to reduce redundancy among dictionaries.
•A multi-view collaborative representation based classification scheme is provided.
•The effectiveness and efficiency of our method are demonstrated on four datasets.
ISSN: 0031-3203, 1873-5142
DOI: 10.1016/j.patcog.2015.08.012
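
The summary names three ingredients: per-view dictionary learning, a low-rank regularizer on each view's dictionary (inspired by low-rank matrix recovery), and a structural incoherence constraint across views. As a hedged illustration only (the data matrices \(X_v\), dictionaries \(D_v\), coding matrices \(A_v\), and weights \(\lambda_1, \lambda_2, \eta\) are assumed symbols, and this generic objective is not taken from the paper), such pieces are commonly combined as

\[
\min_{\{D_v,\,A_v\}_{v=1}^{V}} \;\sum_{v=1}^{V}\Big(\lVert X_v - D_v A_v\rVert_F^2 \;+\; \lambda_1 \lVert A_v\rVert_1 \;+\; \lambda_2 \lVert D_v\rVert_*\Big) \;+\; \eta \sum_{u \neq v} \lVert D_u^{\top} D_v\rVert_F^2 ,
\]

where the nuclear norm \(\lVert D_v\rVert_*\) promotes a low-rank, noise-robust dictionary for each view and the cross terms \(\lVert D_u^{\top} D_v\rVert_F^2\) penalize coherence (redundancy) between dictionaries of different views.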
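The summary also states that the classification scheme builds on collaborative representation based classification (CRC). The sketch below implements standard CRC with a ridge-regularized code and class-wise reconstruction residuals, plus a hypothetical multi-view wrapper that sums residuals over views; the function names, the residual-summing rule, and the parameter `lam` are assumptions for illustration, not the authors' exact MLDL scheme.

```python
import numpy as np

def crc_projection(D, lam=1e-2):
    """Precompute the CRC coding operator P = (D^T D + lam*I)^{-1} D^T.

    D: (d, n) dictionary with one atom per column; lam: ridge parameter.
    """
    n = D.shape[1]
    return np.linalg.solve(D.T @ D + lam * np.eye(n), D.T)

def crc_classify(y, D, labels, P):
    """Assign y to the class whose atoms best reconstruct it."""
    labels = np.asarray(labels)          # class label of each atom
    alpha = P @ y                        # collaborative (ridge) code of y
    classes = np.unique(labels)
    residuals = [np.linalg.norm(y - D[:, labels == c] @ alpha[labels == c])
                 for c in classes]
    return classes[int(np.argmin(residuals))]

def multiview_crc_classify(ys, Ds, labels, lam=1e-2):
    """Hypothetical multi-view rule: sum class-wise residuals over views."""
    labels = np.asarray(labels)
    classes = np.unique(labels)
    total = np.zeros(len(classes))
    for y, D in zip(ys, Ds):             # one sample and dictionary per view
        alpha = crc_projection(D, lam) @ y
        for i, c in enumerate(classes):
            total[i] += np.linalg.norm(y - D[:, labels == c] @ alpha[labels == c])
    return classes[int(np.argmin(total))]

# Toy usage: two views, 3 classes, 4 atoms per class; the test sample is a
# noisy copy of atom 5 (class 1) in each view, so class 1 should win.
rng = np.random.default_rng(0)
labels = np.repeat(np.arange(3), 4)
Ds = [rng.standard_normal((20, 12)) for _ in range(2)]
ys = [D[:, 5] + 0.1 * rng.standard_normal(20) for D in Ds]
print(multiview_crc_classify(ys, Ds, labels))  # typically prints 1
```

Summing per-view residuals is the simplest fusion rule; a weighted sum, with weights reflecting per-view reliability, would be a natural refinement under the same assumptions.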