Qualitative analysis of a recurrent neural network for nonlinear continuously differentiable convex minimization over a nonempty closed convex subset

Bibliographic Details
Published in: IEEE Transactions on Neural Networks, 2001-11, Vol. 12 (6), p. 1521-1525
Main Author: Liang, X. B.
Format: Article
Language:English
Description
Summary: We investigate the qualitative properties of a recurrent neural network (RNN) for minimizing a nonlinear, continuously differentiable, convex objective function over any given nonempty, closed, convex subset, which may be bounded or unbounded, by exploiting some key inequalities from mathematical programming. Global existence and boundedness of the solution of the RNN are proved when the objective function is convex and has a nonempty constrained minimum set. Under the same assumption, the RNN is shown to be globally convergent, in the sense that every trajectory of the RNN converges to some equilibrium point of the RNN. If the objective function is uniformly convex and its gradient is a locally Lipschitz continuous mapping, then the RNN is globally exponentially convergent: every trajectory converges exponentially to the unique equilibrium point of the RNN. These qualitative properties make the network model well suited to solving convex minimization over any given nonempty, closed, convex subset, whether or not the constrained subset is bounded.
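The abstract does not state the exact network dynamics, but RNNs of this kind are commonly written as the projection-type system dx/dt = -x + P_Ω(x - α∇f(x)), whose equilibria coincide with the constrained minimizers. The following is a minimal sketch under that assumption, using forward-Euler integration of a small convex example (the objective, the box constraint set, and all step sizes are illustrative choices, not taken from the paper):

```python
import numpy as np

# Assumed projection-type dynamics: dx/dt = -x + P_Omega(x - alpha * grad_f(x)).
# Example problem (hypothetical): minimize f(x) = (x0 - 2)^2 + (x1 + 1)^2
# over the box Omega = [0, 1]^2, a nonempty closed convex set.

def grad_f(x):
    # Gradient of the smooth convex objective above.
    return 2.0 * (x - np.array([2.0, -1.0]))

def project_box(x, lo=0.0, hi=1.0):
    # P_Omega for the box [lo, hi]^2: componentwise clipping.
    return np.clip(x, lo, hi)

def simulate(x0, alpha=0.1, dt=0.05, steps=4000):
    # Forward-Euler discretization of the network trajectory.
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x + dt * (-x + project_box(x - alpha * grad_f(x)))
    return x

x_star = simulate([0.5, 0.5])
```

Here the unconstrained minimizer (2, -1) lies outside the box, so the trajectory should settle at the constrained minimizer (1, 0), the unique equilibrium of the dynamics for this strongly convex objective, consistent with the exponential-convergence regime described in the abstract.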
ISSN:1045-9227
2162-237X
1941-0093
2162-2388
DOI:10.1109/72.963790