Reduced HyperBF Networks: Regularization by Explicit Complexity Reduction and Scaled Rprop-Based Training
Published in: IEEE Transactions on Neural Networks, May 2011, Vol. 22, No. 5, pp. 673-686
Main Authors: ,
Format: Article
Language: English
Summary: Hyper basis function (HyperBF) networks are generalized radial basis function neural networks in which the activation function is a radial function of a weighted distance. This generalization gives HyperBF networks a high capacity to learn complex functions, which in turn makes them susceptible to overfitting and poor generalization. Moreover, training a HyperBF network requires the weights, centers, and local scaling factors to be optimized simultaneously; for a relatively large dataset with a large network structure, this optimization becomes computationally challenging. In this paper, a new regularization method that performs soft local dimension reduction in addition to weight decay is proposed. The regularized HyperBF network is shown to provide classification accuracy competitive with a support vector machine while requiring a significantly smaller network structure. Furthermore, a practical training procedure for constructing HyperBF networks is presented: hierarchical clustering is used to initialize the neurons, followed by gradient optimization using a scaled version of the Rprop algorithm with a localized partial backtracking step. Experimental results on seven datasets show that the proposed training provides faster and smoother convergence than the regular Rprop algorithm.
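The record carries no formulas, but the summary's core object is easy to make concrete. Below is a minimal sketch of a HyperBF forward pass, assuming a Gaussian radial function and per-neuron diagonal scaling matrices; the function name and array shapes are illustrative, not taken from the paper.

```python
import numpy as np

def hyperbf_forward(x, centers, scales, weights):
    """Evaluate a HyperBF network at a single input x.

    Each hidden unit j fires as a radial function (here a Gaussian, an
    illustrative choice) of the *weighted* distance between x and its
    center c_j, where the per-neuron diagonal scaling matrix S_j has
    diagonal scales[j]; the output is a weighted sum of unit activations.

    x       : (d,)   input vector
    centers : (m, d) neuron centers c_j
    scales  : (m, d) diagonals of the scaling matrices S_j
    weights : (m,)   output weights w_j
    """
    diff = scales * (x - centers)      # (m, d): S_j (x - c_j) for each unit
    r2 = np.sum(diff ** 2, axis=1)     # squared weighted distances
    return weights @ np.exp(-r2)       # Gaussian radial activations, summed
```

Driving an entry of `scales[j]` to zero removes that input dimension from neuron j's weighted distance, which is one way to picture the "soft local dimension reduction" the proposed regularizer performs alongside weight decay.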
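The training pipeline the summary outlines can be sketched the same way. The snippet below initializes centers by cutting a hierarchical-clustering dendrogram (Ward linkage is an assumption; the abstract does not name the criterion) and shows a plain sign-based Rprop update (the iRprop- variant) standing in for the paper's scaled version with localized partial backtracking, which the record does not specify.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

def init_centers(X, m):
    """Initialize m neuron centers by cutting a hierarchical-clustering
    dendrogram of the training data X into m clusters and taking the
    cluster means (Ward linkage is an assumption, not from the paper)."""
    labels = fcluster(linkage(X, method="ward"), t=m, criterion="maxclust")
    return np.array([X[labels == k].mean(axis=0) for k in range(1, m + 1)])

def rprop_step(theta, grad, prev_grad, step,
               eta_plus=1.2, eta_minus=0.5, step_min=1e-6, step_max=50.0):
    """One plain iRprop- update (standing in for the paper's variant).

    Per-parameter step sizes grow where the gradient sign is stable and
    shrink where it flips; the move is -sign(grad) * step, so gradient
    magnitudes never enter the update. All arrays are float, shape (n,).
    """
    s = grad * prev_grad
    step = np.where(s > 0, np.minimum(step * eta_plus, step_max), step)
    step = np.where(s < 0, np.maximum(step * eta_minus, step_min), step)
    grad = np.where(s < 0, 0.0, grad)  # skip the move where the sign flipped
    return theta - np.sign(grad) * step, grad, step
```

A training loop would pack the centers, scaling factors, and output weights into `theta`, recompute `grad` from the regularized loss each iteration, and feed the returned `grad` and `step` back into the next call.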
ISSN: 1045-9227, 1941-0093
DOI: 10.1109/TNN.2011.2109736