
Separable non-linear least-squares minimization - possible improvements for neural net fitting



Bibliographic Details
Main Authors: Sjöberg, J., Viberg, M.
Format: Conference Proceeding
Language: English
Description
Summary: Neural network minimization problems are often ill-conditioned, and in this contribution two ways to handle this are discussed. It is shown that a better-conditioned minimization problem can be obtained if the problem is separated with respect to the linear parameters, which increases the convergence speed of the minimization. The Levenberg-Marquardt method is often reported to perform better than the Gauss-Newton and steepest-descent methods on neural network minimization problems. The reason for this is investigated, and it is shown that the Levenberg-Marquardt method divides the parameters into two subsets: on one subset the convergence is almost quadratic, like that of the Gauss-Newton method, while on the other subset the parameters hardly converge at all. In this way fast convergence among the important parameters is obtained.
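
The separation referred to in the summary is the variable-projection idea: for any fixed value of the non-linear (hidden-layer) parameters, the linear output weights have a closed-form linear least-squares solution, so they can be eliminated from the outer minimization. Below is a minimal sketch of that idea for a one-input tanh network, using SciPy's Levenberg-Marquardt solver for the outer iteration; the function names, network shape, and toy data are illustrative assumptions, not taken from the paper.

    import numpy as np
    from scipy.optimize import least_squares

    def hidden_basis(x, w, n_hidden):
        """tanh basis functions; w packs the hidden-layer weights and biases."""
        a = w[:n_hidden]                    # input weights, one per hidden unit
        b = w[n_hidden:]                    # biases
        return np.tanh(np.outer(x, a) + b)  # shape (N, n_hidden)

    def vp_residual(w, x, y, n_hidden):
        """Variable-projection residual: the linear output weights c are
        eliminated by solving the inner linear least-squares problem exactly."""
        Phi = hidden_basis(x, w, n_hidden)
        c, *_ = np.linalg.lstsq(Phi, y, rcond=None)  # optimal c for this w
        return Phi @ c - y

    # Toy data; the outer solver only ever sees the non-linear parameters w.
    rng = np.random.default_rng(0)
    x = np.linspace(-2.0, 2.0, 100)
    y = np.sin(2.0 * x) + 0.05 * rng.standard_normal(x.size)

    n_hidden = 5
    w0 = rng.standard_normal(2 * n_hidden)
    sol = least_squares(vp_residual, w0, args=(x, y, n_hidden), method="lm")
    print("final cost:", sol.cost)

Because the inner linear problem is solved exactly at every step, the outer Levenberg-Marquardt iteration runs over fewer and better-conditioned parameters, which is the convergence-speed benefit the summary describes.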
ISSN: 1089-3555, 2379-2329
DOI: 10.1109/NNSP.1997.622415