Hyper-sausage coverage function neuron model and learning algorithm for image classification
Published in: Pattern Recognition, 2023-04, Vol. 136, Article 109216
Main Authors: , , , , ,
Format: Article
Language: English
Summary:
•We propose a flexible HSCF neuron model that adaptively changes the positions and directions of the one-dimensional simplex, as well as the radii of the hyperspheres. Higher variability is thus assured for constructing the geometries, which helps to mine the potential data distribution.
•A novel CE_VC loss function is proposed by constructing a volume-coverage loss function, which compresses the volume of the hyper-sausage to the hilt, thus ensuring the intra-class compactness of the samples.
•We introduce a network learning algorithm that primarily conducts a divisive iteration method to determine the optimal hyperparameters adaptively.
•Experiments on several datasets demonstrate the effectiveness and generalization ability of the proposed HSCF neuron, which achieves excellent performance in terms of classification accuracy, complexity, and computation.
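The geometry behind the HSCF neuron, as described in the highlights, is a one-dimensional simplex (a line segment) thickened by hyperspheres of a learnable radius. A minimal sketch of such a coverage response, based on point-to-segment distance, is given below; the function name, the Gaussian-shaped response, and all parameters are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def hsc_coverage(x, a, b, r):
    """Soft coverage response of a hyper-sausage with endpoints a, b
    (the 1-D simplex) and radius r (illustrative Gaussian form)."""
    ab = b - a
    # project x onto the segment [a, b], clamping to the endpoints
    t = np.clip(np.dot(x - a, ab) / np.dot(ab, ab), 0.0, 1.0)
    d = np.linalg.norm(x - (a + t * ab))  # distance to the segment
    return float(np.exp(-(d / r) ** 2))   # 1 on the segment, decaying outside
```

Under this reading, learning the endpoints a, b moves the position and direction of the simplex, while learning r controls the radius of the covering hyperspheres.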
Recently, deep neural networks (DNNs) have advanced mainly through network architectures and loss functions; however, the development of neuron models has been quite limited. In this study, inspired by the mechanism of human cognition, a hyper-sausage coverage function (HSCF) neuron model possessing highly flexible plasticity is proposed. Then, a novel cross-entropy and volume-coverage (CE_VC) loss is defined, which compresses the volume of the hyper-sausage to the hilt and helps alleviate confusion among different classes, thus ensuring the intra-class compactness of the samples. Finally, a divisive iteration method is introduced, which treats each neuron model as a weak classifier and iteratively increases the number of weak classifiers. Thus, the optimal number of HSCF neurons is adaptively determined and an end-to-end learning framework is constructed. In particular, the HSCF neuron can be applied to classical DNNs to improve classification performance. Comprehensive experiments on eight datasets in several domains demonstrate the effectiveness of the proposed method. The proposed method exhibits the feasibility of boosting DNNs with neuron plasticity and provides a novel perspective for further developments in DNNs. The source code is available at https://github.com/Tough2011/HSCFNet.git .
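The abstract describes the CE_VC loss as a cross-entropy term combined with a penalty that compresses hyper-sausage volume. A minimal sketch of such a combination is shown below; the volume proxy (cylinder-plus-ball terms with constants dropped), the weight `lam`, and all names are assumptions for illustration, not the paper's exact CE_VC definition:

```python
import numpy as np

def ce_vc_loss(logits, labels, seg_lengths, radii, dim, lam=0.1):
    """Cross-entropy plus a volume-coverage penalty (illustrative sketch).
    logits: (N, C) raw scores; labels: (N,) integer class ids.
    seg_lengths, radii: per-neuron segment lengths and sphere radii."""
    # standard softmax cross-entropy with a log-sum-exp stabilizer
    z = logits - logits.max(axis=1, keepdims=True)
    log_p = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    ce = -log_p[np.arange(len(labels)), labels].mean()
    # proxy for hyper-sausage volume in `dim` dimensions:
    # cylinder term (length * r^(dim-1)) plus ball term (r^dim)
    vc = np.mean(seg_lengths * radii ** (dim - 1) + radii ** dim)
    return ce + lam * vc
```

Minimizing the second term shrinks the sausages' lengths and radii, which is one way to realize the intra-class compactness the abstract attributes to the volume-coverage term.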
ISSN: 0031-3203, 1873-5142
DOI: 10.1016/j.patcog.2022.109216