Neural approach for solving several types of optimization problems
Published in: Journal of Optimization Theory and Applications, 2006-03, Vol. 128 (3), p. 563-580
Main Authors:
Format: Article
Language: English
Subjects:
Summary: Neural networks consist of highly interconnected and parallel nonlinear processing elements that are shown to be extremely effective in computation. This paper presents an architecture of recurrent neural networks that can be used to solve several classes of optimization problems. More specifically, a modified Hopfield network is developed and its internal parameters are computed explicitly using the valid-subspace technique. These parameters guarantee the convergence of the network to the equilibrium points, which represent a solution of the problem considered. The problems that can be treated by the proposed approach include combinatorial optimization problems, dynamic programming problems, and nonlinear optimization problems.
ISSN: 0022-3239 (print); 1573-2878 (electronic)
DOI: 10.1007/s10957-006-9032-9
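The summary describes a modified Hopfield network whose parameters are chosen via the valid-subspace technique, so that the network state stays on the set of feasible solutions while an energy function decreases toward an equilibrium. The sketch below is only a rough illustration of that general idea, not the authors' formulation: it assumes a quadratic cost 0.5 vᵀQv + cᵀv, equality constraints A v = b defining the valid subspace, box bounds [0, 1], and an illustrative step size and toy data.

```python
import numpy as np

def valid_subspace_hopfield(Q, c, A, b, n_iter=1000, lr=1e-2, seed=0):
    """Hypothetical projection-based Hopfield-style iteration (illustrative only).

    Minimizes 0.5 * v'Qv + c'v subject to A v = b and 0 <= v <= 1 by
    alternating a projection onto the affine "valid subspace" with a
    gradient-like energy-decreasing step and a ramp activation.
    """
    n = Q.shape[0]
    rng = np.random.default_rng(seed)

    # Projection onto {v : A v = b}:  v_valid = P v + s,
    # with P = I - A^T (A A^T)^{-1} A and s = A^T (A A^T)^{-1} b.
    pinv = A.T @ np.linalg.inv(A @ A.T)
    P = np.eye(n) - pinv @ A
    s = pinv @ b

    v = rng.uniform(0.0, 1.0, size=n)
    for _ in range(n_iter):
        v = P @ v + s                 # confine the state to the valid subspace
        v = v - lr * (Q @ v + c)      # gradient-like step that lowers the energy
        v = np.clip(v, 0.0, 1.0)      # piecewise-linear (ramp) activation
    return v

if __name__ == "__main__":
    # Toy example (assumed data): minimize ||v - t||^2 subject to sum(v) = 1.
    t = np.array([0.7, 0.2, 0.4])
    Q = 2.0 * np.eye(3)
    c = -2.0 * t
    A = np.ones((1, 3))
    b = np.array([1.0])
    print(valid_subspace_hopfield(Q, c, A, b))  # approaches [0.6, 0.1, 0.3]
```

The interleaved projection is what distinguishes this style of network from a plain gradient descent: every iterate is pulled back onto the constraint set before the energy-reducing step is applied.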