A Second-Order Finite-Difference Method for Derivative-Free Optimization
Published in: Journal of Mathematics (Hindawi), 2024, Vol. 2024, p. 1-12
Main Authors: , ,
Format: Article
Language: English
Summary: This paper proposes a second-order finite-difference method for finding second-order stationary points of derivative-free, nonconvex, unconstrained optimization problems. Forward-difference and central-difference techniques are used to approximate the gradient and the Hessian matrix of the objective function. Within the traditional trust-region framework, an approximate trust-region subproblem is minimized to obtain the search direction. Global convergence of the algorithm is established without the fully quadratic assumption. Numerical results demonstrate the effectiveness of the algorithm with both forward-difference and central-difference approximations.
ISSN: 2314-4629, 2314-4785
DOI: 10.1155/2024/1947996
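The finite-difference approximations the summary describes can be sketched as follows. This is a minimal illustration of forward- and central-difference formulas for the gradient and Hessian, not the authors' implementation; the function names and step sizes are chosen for the example only.

```python
import numpy as np

def forward_diff_grad(f, x, h=1e-6):
    """Forward-difference gradient: g_i ~ (f(x + h*e_i) - f(x)) / h."""
    n = len(x)
    g = np.zeros(n)
    fx = f(x)
    for i in range(n):
        e = np.zeros(n)
        e[i] = h
        g[i] = (f(x + e) - fx) / h
    return g

def central_diff_grad(f, x, h=1e-5):
    """Central-difference gradient: g_i ~ (f(x + h*e_i) - f(x - h*e_i)) / (2h)."""
    n = len(x)
    g = np.zeros(n)
    for i in range(n):
        e = np.zeros(n)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)
    return g

def central_diff_hessian(f, x, h=1e-4):
    """Central-difference Hessian:
    H_ij ~ (f(x+h*e_i+h*e_j) - f(x+h*e_i-h*e_j)
            - f(x-h*e_i+h*e_j) + f(x-h*e_i-h*e_j)) / (4h^2)."""
    n = len(x)
    H = np.zeros((n, n))
    for i in range(n):
        ei = np.zeros(n)
        ei[i] = h
        for j in range(n):
            ej = np.zeros(n)
            ej[j] = h
            H[i, j] = (f(x + ei + ej) - f(x + ei - ej)
                       - f(x - ei + ej) + f(x - ei - ej)) / (4 * h * h)
    return H
```

In a trust-region method of the kind the summary outlines, such approximations would supply the gradient and Hessian of the local quadratic model; the central-difference variants cost more function evaluations but have O(h^2) rather than O(h) truncation error.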