Competitive and collaborative representation for classification


Bibliographic Details
Published in: Pattern Recognition Letters, 2020-04, Vol. 132, pp. 46–55
Main Authors: Chi, Hongmei, Xia, Haifeng, Zhang, Lifang, Zhang, Chunjiang, Tang, Xin
Format: Article
Language:English
Description
Summary: Highlights:
• The correct category's coding has the same direction as the test sample.
• The contributions of incorrect classes are diminished.
• Competition among classes is enhanced by a newly defined weight.
• Competition among classes makes the coding sparse.

Deep networks have recently achieved promising performance on classification tasks with massive training samples. Their performance, however, degrades noticeably when the training set is small. Meanwhile, linear-representation-based classifiers have been widely applied in many fields. These classifiers mostly encourage the correct class to code the test sample by appending l1-norm or nuclear-norm regularization, which is computationally expensive. Motivated by these observations, we present a novel competitive and collaborative representation classification (Co-CRC) method that exploits properties of the training data with l2-norm regularization to create a competitive environment in which the correct class contributes more to the coding. Additionally, the competitive weight proposed in this paper strengthens the competitive relation among all classes and helps the classifier find the correct class. Extensive experiments on popular benchmarks, including object, scene, and face image databases, indicate that the proposed algorithm, based entirely on l2-norm regularization, requires less computation, yields rather sparse coding, and mostly outperforms several state-of-the-art approaches.
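As background for the abstract, the following is a minimal sketch of the plain collaborative representation classifier (CRC) that Co-CRC builds on: the test sample is coded over all training samples with l2-norm (ridge) regularization, and the label is assigned by the class-wise reconstruction residual. The function name, the regularization parameter `lam`, and the toy data are illustrative assumptions; the paper's competitive weight is not reproduced here.

```python
import numpy as np

def crc_classify(X, y, x_test, lam=0.01):
    """Plain CRC baseline (a sketch, not the paper's Co-CRC).

    X      : (d, n) dictionary whose columns are training samples
    y      : (n,) class label of each column of X
    x_test : (d,) test sample
    lam    : l2 regularization strength (assumed value)
    """
    n = X.shape[1]
    # Ridge-regularized coding: a = argmin_a ||x_test - X a||^2 + lam ||a||^2,
    # solved in closed form via the normal equations.
    a = np.linalg.solve(X.T @ X + lam * np.eye(n), X.T @ x_test)

    classes = np.unique(y)
    residuals = []
    for c in classes:
        mask = (y == c)
        # Reconstruct the test sample using only class c's columns and codes,
        # normalized by the code energy of that class (the standard CRC rule).
        r = np.linalg.norm(x_test - X[:, mask] @ a[mask])
        residuals.append(r / (np.linalg.norm(a[mask]) + 1e-12))
    return classes[int(np.argmin(residuals))]
```

Because the coding uses only an l2 penalty, classification reduces to one linear solve (precomputable as a projection matrix for repeated queries), which is the computational advantage the abstract contrasts with l1-norm and nuclear-norm methods.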
ISSN:0167-8655
1872-7344
DOI:10.1016/j.patrec.2018.06.019