Sparse Supervised Representation-Based Classifier for Uncontrolled and Imbalanced Classification
Published in: IEEE Transactions on Neural Networks and Learning Systems, 2020-08, Vol. 31 (8), pp. 2847-2856
Main Authors:
Format: Article
Language: English
Summary: Sparse representation-based classification (SRC) is an effective machine-learning algorithm that has been used in many applications. However, its performance depends heavily on the data distribution, and existing work has shown that SRC cannot obtain satisfactory results on uncontrolled data sets. Beyond uncontrolled data sets, SRC also cannot handle imbalanced classification. In this paper, we propose the sparse supervised representation classifier (SSRC) to address these issues. SSRC incorporates class label information during the test-sample representation phase to handle uncontrolled data sets: each class linearly represents the test sample within its own subspace, which reduces the influence of an uncontrolled data distribution. To classify imbalanced data sets, a class weight learning model is added to SSRC, where each class weight is learned from that class's training samples. Experimental results on the AR face database (uncontrolled) and 15 KEEL data sets (imbalanced, with imbalance ratios ranging from 1.48 to 61.18) show that SSRC can effectively classify uncontrolled and imbalanced data sets.
ISSN: 2162-237X, 2162-2388
DOI: 10.1109/TNNLS.2018.2884444
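
The summary above describes SSRC's core mechanics at a high level: each class represents the test sample in its own subspace, and the decision comes from weighted per-class reconstruction residuals. The sketch below is a minimal illustration of that class-wise representation idea, not the authors' SSRC model: the function name `classwise_representation_classify` is hypothetical, ridge-regularized least squares stands in for the paper's sparse coding, and a simple inverse-frequency weight stands in for the learned class weights.

```python
import numpy as np

def classwise_representation_classify(X_train, y_train, x_test, lam=0.01):
    """Classify x_test by class-wise representation residuals.

    Each class builds a dictionary from its own training samples, codes the
    test sample against that dictionary (ridge-regularized least squares here,
    a simple stand-in for sparse coding), and the class with the smallest
    weighted reconstruction residual wins. The inverse-frequency weight below
    is an assumption; SSRC learns its class weights from the training samples.
    """
    classes = np.unique(y_train)
    n_total, n_classes = len(y_train), len(classes)
    best_class, best_score = None, np.inf
    for c in classes:
        D = X_train[y_train == c].T                 # features x n_c dictionary
        n_c = D.shape[1]
        # Ridge coding: alpha = (D^T D + lam I)^{-1} D^T x_test
        alpha = np.linalg.solve(D.T @ D + lam * np.eye(n_c), D.T @ x_test)
        residual = np.linalg.norm(x_test - D @ alpha)
        # Assumed inverse-frequency weight: shrink the residuals of minority
        # classes so they are not overwhelmed by majority classes.
        weight = n_total / (n_classes * n_c)
        score = residual / weight
        if score < best_score:
            best_class, best_score = c, score
    return best_class

# Toy usage with synthetic, imbalanced two-class data.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (50, 20)),      # majority class 0
               rng.normal(3.0, 1.0, (5, 20))])      # minority class 1
y = np.array([0] * 50 + [1] * 5)
print(classwise_representation_classify(X, y, rng.normal(3.0, 1.0, 20)))
```

Swapping the ridge step for an L1-regularized solver would move the sketch closer to true sparse coding, and replacing the fixed inverse-frequency weight with weights estimated from each class's training samples is the direction the paper's class weight learning model takes.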