Memory gradient method for the minimization of functions
Main Authors: ,
Format: Book Chapter
Language: English
Online Access: Get full text
Summary: A new accelerated gradient method for finding the minimum of a function $f(x)$ whose variables are unconstrained is investigated. The new algorithm can be stated as follows:
$$\tilde x = x + \delta x, \qquad \delta x = -\alpha\, g(x) + \beta\, \delta \hat x$$
where $\delta x$ is the change in the position vector $x$, $g(x)$ is the gradient of the function $f(x)$, and $\alpha$ and $\beta$ are scalars chosen at each step so as to yield the greatest decrease in the function. The symbol $\delta \hat x$ denotes the change in the position vector for the iteration preceding the one under consideration. For a nonquadratic function, initial convergence of the present method is faster than that of the Fletcher-Reeves method because of the extra degree of freedom available. Three test problems are considered, and a comparison is made among the ordinary gradient method, the Fletcher-Reeves method, and the memory gradient method.
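The abstract fully specifies the update rule but not the bivariate search used to choose $\alpha$ and $\beta$. The following Python sketch illustrates one iteration of the method under stated assumptions: the objective `f` and gradient `grad` are hypothetical caller-supplied callables, and scipy's Nelder-Mead routine stands in for whatever two-dimensional minimization over $(\alpha, \beta)$ the paper actually employs; it is a sketch, not the authors' implementation.

```python
# Minimal sketch of the memory gradient step: delta_x = -alpha*g(x) + beta*delta_x_hat,
# with (alpha, beta) chosen at each step to yield the greatest decrease in f.
# The inner 2-D search method (Nelder-Mead) is an assumption, not from the paper.
import numpy as np
from scipy.optimize import minimize


def memory_gradient(f, grad, x0, iters=50):
    x = np.asarray(x0, dtype=float)
    dx_prev = np.zeros_like(x)  # delta-x-hat: previous step; zero on the first iteration,
                                # so the first step reduces to an ordinary gradient step
    for _ in range(iters):
        g = grad(x)

        def along(params):
            alpha, beta = params
            return f(x - alpha * g + beta * dx_prev)

        # Two-dimensional search over (alpha, beta) for the greatest decrease in f.
        res = minimize(along, x0=np.array([1e-2, 0.0]), method="Nelder-Mead")
        alpha, beta = res.x
        dx = -alpha * g + beta * dx_prev
        x = x + dx
        dx_prev = dx
    return x


# Illustrative usage on a simple quadratic test function.
if __name__ == "__main__":
    f = lambda x: 0.5 * (x @ x)
    grad = lambda x: x
    print(memory_gradient(f, grad, np.array([3.0, -4.0])))
```

The extra degree of freedom relative to a plain gradient step is the memory term $\beta\, \delta \hat x$: when $\beta = 0$ the update collapses to steepest descent, which is why the first iteration above behaves like the ordinary gradient method.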
ISSN: 0075-8434, 1617-9692
DOI: 10.1007/BFb0066685