
An augmented subgradient method for minimizing nonsmooth DC functions

Bibliographic Details
Published in: Computational Optimization and Applications, 2021-11, Vol. 80 (2), pp. 411-438
Main Authors: Bagirov, A. M., Hoseini Monjezi, N., Taheri, S.
Format: Article
Language: English
Summary: A method, called an augmented subgradient method, is developed to solve unconstrained nonsmooth difference of convex (DC) optimization problems. At each iteration of this method, search directions are found using several subgradients of the first DC component and one subgradient of the second DC component of the objective function. The method applies an Armijo-type line search procedure to find the next iterate. It is proved that the sequence of points generated by the method converges to a critical point of the unconstrained DC optimization problem. The performance of the method is demonstrated on academic test problems with nonsmooth DC objective functions and compared with that of two general nonsmooth optimization solvers and five solvers specifically designed for unconstrained DC optimization. Computational results show that the developed method is efficient and robust for solving nonsmooth DC optimization problems.
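
The abstract describes the algorithm only at a high level. The Python sketch below is a heavily simplified, illustrative DC subgradient step with an Armijo-type backtracking line search, based solely on the description above; it is not the authors' augmented subgradient method (which builds search directions from several subgradients of the first DC component), and the toy objective f(x) = ||x||_1 - max_i |x_i| together with all parameter values are assumptions chosen purely for illustration.

import numpy as np

# Illustrative sketch only: a generic subgradient step for a DC objective
# f(x) = f1(x) - f2(x) with an Armijo-type backtracking line search.
# This simplifies the paper's method (which uses several subgradients of f1);
# here a single subgradient of each component is used for clarity.

def f1(x):
    # First convex component: l1 norm (assumed toy example).
    return np.sum(np.abs(x))

def subgrad_f1(x):
    # One subgradient of the l1 norm (sign vector, 0 mapped to 0).
    return np.sign(x)

def f2(x):
    # Second convex component: max of absolute coordinates (assumed toy example).
    return np.max(np.abs(x))

def subgrad_f2(x):
    # One subgradient of max_i |x_i|: signed unit vector at a maximizing index.
    i = int(np.argmax(np.abs(x)))
    g = np.zeros_like(x)
    g[i] = np.sign(x[i]) if x[i] != 0 else 1.0
    return g

def f(x):
    return f1(x) - f2(x)

def armijo_dc_step(x, sigma=0.2, beta=0.5, t0=1.0, max_halvings=30):
    # One DC subgradient step: direction from one subgradient of each
    # component, step size from an Armijo-type sufficient-decrease test.
    g = subgrad_f1(x) - subgrad_f2(x)   # approximate (sub)gradient of f at x
    d = -g                              # search direction
    t = t0
    fx = f(x)
    for _ in range(max_halvings):
        if f(x + t * d) <= fx - sigma * t * np.dot(g, g):
            return x + t * d
        t *= beta
    return x                            # no sufficient decrease found

if __name__ == "__main__":
    x = np.array([1.5, -0.7])
    for _ in range(50):
        x = armijo_dc_step(x)
    print("approximate critical point:", x, "f(x) =", f(x))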
ISSN: 0926-6003, 1573-2894
DOI: 10.1007/s10589-021-00304-4