Neurodynamical classifiers with low model complexity
Published in: Neural Networks, 2020-12, Vol. 132, pp. 405–415
Main Authors:
Format: Article
Language: English
Summary: The recently proposed Minimal Complexity Machine (MCM) finds a hyperplane classifier by minimizing an upper bound on the Vapnik–Chervonenkis (VC) dimension, which measures the capacity, or model complexity, of a learning machine. Vapnik's risk bound indicates that models with smaller VC dimension are expected to generalize better. On many benchmark datasets, the MCM generalizes better than SVMs while using far fewer support vectors. In this paper, we describe a neural network that converges to the MCM solution. We employ the MCM neurodynamical system as the final layer of a neural network architecture, and we optimize the weights of all layers to minimize an objective that combines a bound on the VC dimension with the classification error. We illustrate the use of this model for robust binary and multi-class classification. Numerical experiments on benchmark datasets from the UCI repository show that the proposed approach is scalable and learns accurate models with fewer support vectors.
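The joint objective the abstract describes (a VC-dimension bound plus the classification error, minimized over the weights of all layers) can be sketched as end-to-end gradient descent. The sketch below is a hedged illustration, not the authors' neurodynamical system: it uses a penalty-based reading of the soft-margin MCM objective, and the network shape, the trade-off constant C, and the toy dataset are all assumptions introduced for illustration.

```python
# Hedged sketch: a network whose final linear layer is trained with an
# MCM-style objective (bound on the VC dimension + hinge penalty), with all
# layers optimized jointly. This is an illustrative reading, not the paper's
# neurodynamical system; architecture, C, and data are assumptions.

import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy binary data: two Gaussian blobs, labels in {-1, +1}.
n = 200
X = torch.cat([torch.randn(n, 2) + 2.0, torch.randn(n, 2) - 2.0])
y = torch.cat([torch.ones(n), -torch.ones(n)])

# Feature layers followed by a single linear output (the "MCM layer").
model = nn.Sequential(
    nn.Linear(2, 16), nn.Tanh(),
    nn.Linear(16, 1),
)

C = 1.0  # trade-off between the VC-dimension surrogate and the hinge penalty
opt = torch.optim.Adam(model.parameters(), lr=1e-2)

for epoch in range(500):
    opt.zero_grad()
    f = model(X).squeeze(1)                   # signed scores w^T phi(x) + b
    margins = y * f                           # y_i f(x_i)
    h = margins.max()                         # surrogate for the bound h >= y_i f(x_i)
    hinge = torch.relu(1.0 - margins).sum()   # penalizes margins below 1
    loss = h + C * hinge                      # VC-bound surrogate + classification error
    loss.backward()
    opt.step()

acc = (model(X).squeeze(1).sign() == y).float().mean().item()
print(f"training accuracy: {acc:.3f}, h = {h.item():.3f}")
```

In this reading, minimizing the largest margin h while the hinge term keeps every margin at least 1 squeezes all margins toward 1, which is the mechanism by which the MCM bounds the VC dimension of the resulting classifier.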
ISSN: 0893-6080 (print), 1879-2782 (online)
DOI: 10.1016/j.neunet.2020.08.013