On-line learning in RBF neural networks: a stochastic approach
Published in: Neural Networks, 2000-09, Vol. 13 (7), p. 719-729
Main Authors:
Format: Article
Language: English
Summary: The on-line learning of Radial Basis Function neural networks (RBFNs) is analyzed. Our approach makes use of a master equation that describes the dynamics of the weight space probability density. An approximate solution of the master equation is obtained in the limit of a small learning rate. In this limit, the on-line learning dynamics is analyzed and it is shown that, since fluctuations are small, the dynamics can be well described in terms of the evolution of the mean. This allows us to analyze the learning process of RBFNs in which the number of hidden nodes K is larger than the typically small number of input nodes N. The work represents a complementary analysis of on-line RBFNs with respect to previous works (Phys. Rev. E 56 (1997a) 907; Neur. Comput. 9 (1997) 1601), in which RBFNs with N ≫ K have been analyzed. The generalization error equation and the equations of motion of the weights are derived for generic RBF architectures, and numerically integrated in specific cases. Analytical results are then confirmed by numerical simulations. Unlike the case of large N > K, we find that the dynamics in the case N < K is not affected by the problems of symmetric phases and subsequent symmetry breaking.
ISSN: 0893-6080, 1879-2782
DOI: 10.1016/S0893-6080(00)00052-6
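For readers who want to experiment with the setting described in the summary above, here is a minimal sketch (Python/NumPy) of on-line, stochastic-gradient training of a student RBF network against a fixed teacher RBF network in the N < K regime. The network sizes, Gaussian width, learning rate, and teacher/student construction are illustrative assumptions, not the paper's exact formulation, and the Monte Carlo average stands in for the generalization error equation derived analytically in the paper.

```python
import numpy as np

# Illustrative sketch: on-line (one example per step) stochastic gradient
# learning of a student RBF network from a fixed teacher RBF network.
# Sizes, width and learning rate are assumptions for demonstration only.

rng = np.random.default_rng(0)

N, K = 2, 8        # few input nodes, more hidden nodes (the N < K regime)
sigma = 1.0        # common width of the Gaussian basis functions
eta = 0.05         # small learning rate (the limit studied analytically)


def rbf(x, centers, weights):
    """Network output: weighted sum of Gaussian basis functions."""
    phi = np.exp(-np.sum((centers - x) ** 2, axis=1) / (2.0 * sigma ** 2))
    return phi @ weights, phi


# Fixed teacher network generating the targets, and a randomly initialized student.
teacher_centers = rng.normal(size=(K, N))
teacher_weights = rng.normal(size=K)
centers = rng.normal(size=(K, N))
weights = rng.normal(size=K)

for step in range(20001):
    x = rng.normal(size=N)                       # a new random example each step
    y_teacher, _ = rbf(x, teacher_centers, teacher_weights)
    y_student, phi = rbf(x, centers, weights)
    err = y_student - y_teacher

    # Gradient descent on the instantaneous squared error 0.5 * err**2,
    # updating both the output weights and the centers of the basis functions.
    grad_centers = err * (weights * phi)[:, None] * (x - centers) / sigma ** 2
    weights -= eta * err * phi
    centers -= eta * grad_centers

    if step % 5000 == 0:
        # Crude Monte Carlo estimate of the generalization error:
        # average squared teacher-student mismatch over the input distribution.
        xs = rng.normal(size=(2000, N))
        mismatch = [rbf(v, centers, weights)[0] - rbf(v, teacher_centers, teacher_weights)[0]
                    for v in xs]
        print(f"step {step:6d}   generalization error ~ {0.5 * np.mean(np.square(mismatch)):.5f}")
```

In this small-learning-rate setting the per-step fluctuations are weak, so the printed Monte Carlo estimate should decrease roughly along the mean trajectory that the paper describes with its deterministic equations of motion.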