
Learning algorithms and fixed dynamics


Saved in:
Bibliographic Details
Main Authors: Cotter, N.E., Conwell, P.R.
Format: Conference Proceeding
Language: English
Subjects:
Online Access: Request full text
Description
Summary: The authors discuss the equivalence of learning algorithms and nonlinear dynamic systems whose differential equations have fixed coefficients. They show how backpropagation transforms into a fixed-weight recursive neural network suitable for VLSI or optical implementations. The transformation is quite general and implies that understanding physiological networks may require one to determine the values of fixed parameters distributed throughout a network. Equivalently, a particular synaptic weight update mechanism such as Hebbian learning could likely be used to implement many known learning algorithms. The authors use the transformation process to illustrate why a network whose only variable weights are hidden-layer thresholds is capable of universal approximation.
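The core idea in the summary — that a learning algorithm can be viewed as a dynamical system with fixed coefficients, whose state trajectory performs the learning — can be illustrated with a minimal sketch. This is not the authors' construction; all names, the task (a toy one-parameter regression), and the fixed coefficient `ETA` are assumptions chosen for illustration:

```python
# Illustrative sketch (not the paper's construction): treat the synaptic
# weight w as the STATE of a dynamical system whose update rule has fixed
# coefficients. "Learning" is then just the system's trajectory as it is
# driven by the incoming data stream (x, y).
import random

random.seed(0)

TRUE_W = 2.0   # target weight of the hypothetical regression task
ETA = 0.1      # fixed coefficient of the dynamics (learning rate)

def step(w, x, y):
    """One tick of the fixed-coefficient dynamics:
    a gradient step on the squared error (w*x - y)**2."""
    return w - ETA * 2.0 * (w * x - y) * x

w = 0.0
for _ in range(200):
    x = random.uniform(-1.0, 1.0)
    y = TRUE_W * x
    w = step(w, x, y)

print(round(w, 3))  # the state converges toward TRUE_W
```

The point of the sketch is that `step` never changes: the same fixed-coefficient map, iterated on the data stream, carries out the learning. The paper's contribution is the much stronger statement that backpropagation itself admits such a fixed-weight recurrent realization.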
DOI:10.1109/IJCNN.1991.155280