Huber Loss-Based Penalty Approach to Problems with Linear Constraints

Bibliographic Details
Published in: arXiv.org 2023-11
Main Authors: Nedich, Angelia, Tatarenko, Tatiana
Format: Article
Language: English
Description
Summary: We consider a convex optimization problem with many linear inequality constraints. To deal with the large number of constraints, we provide a penalty reformulation of the problem, where the penalty is a variant of the one-sided Huber loss function with two penalty parameters. We study the infeasibility properties of the solutions of the penalized problems for nonconvex and convex objective functions, as the penalty parameters vary with time. We then propose a random incremental penalty method for solving the original problem and investigate its convergence properties for convex and strongly convex objective functions. We show that the iterates of the method converge to a solution of the original problem almost surely and in expectation for suitable choices of the penalty parameters and the stepsize. We also establish the convergence rate of the method in terms of the expected function values by utilizing appropriately defined weighted averages of the iterates: an \(O(\ln^{1/2+\epsilon} k/\sqrt{k})\) convergence rate when the objective function is convex and an \(O(\ln^{\epsilon} k/k)\) convergence rate when it is strongly convex, where \(\epsilon>0\) is an arbitrarily small scalar. To the best of our knowledge, these are the first results on the convergence rate for the penalty-based incremental subgradient method with time-varying penalty parameters.
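
To make the construction concrete, below is a minimal Python sketch of the two ingredients the abstract describes: a one-sided Huber penalty \(H_\delta(u)\) (zero for \(u \le 0\), quadratic on \((0,\delta]\), linear beyond \(\delta\), which is the standard one-sided Huber form; the paper's exact parameterization may differ) and an incremental step that samples one random constraint per iteration. The function names, the toy objective, and the specific schedules for the stepsize, the penalty weight, and the smoothing parameter are illustrative assumptions, not the paper's stated method or conditions.

```python
import numpy as np

def one_sided_huber(u, delta):
    """One-sided Huber penalty: 0 for u <= 0, quadratic on (0, delta],
    linear with slope 1 for u > delta (assumed standard form)."""
    if u <= 0.0:
        return 0.0
    if u <= delta:
        return u * u / (2.0 * delta)
    return u - delta / 2.0

def one_sided_huber_grad(u, delta):
    """Derivative of the one-sided Huber penalty with respect to u."""
    if u <= 0.0:
        return 0.0
    if u <= delta:
        return u / delta
    return 1.0

def incremental_step(x, subgrad_f, A, b, gamma, delta, alpha, rng):
    """One step on f(x) + gamma * H_delta(a_i^T x - b_i), where the
    constraint index i is sampled uniformly at random."""
    i = rng.integers(A.shape[0])
    u = A[i] @ x - b[i]                # signed violation of constraint i
    g = subgrad_f(x) + gamma * one_sided_huber_grad(u, delta) * A[i]
    return x - alpha * g

# Illustrative run on a toy strongly convex objective f(x) = 0.5*||x - c||^2
# subject to A x <= b; all parameter schedules below are assumptions.
rng = np.random.default_rng(0)
m, n = 200, 5
A = rng.standard_normal((m, n))
b = np.ones(m)
c = np.full(n, 2.0)
subgrad_f = lambda x: x - c            # gradient of the toy objective

x = np.zeros(n)
for k in range(1, 50001):
    alpha = 1.0 / k                    # diminishing stepsize
    gamma = np.log(k + 1.0)            # slowly growing penalty weight
    delta = 1.0 / np.sqrt(k)           # shrinking Huber smoothing parameter
    x = incremental_step(x, subgrad_f, A, b, gamma, delta, alpha, rng)

print("max constraint violation:", float(np.max(A @ x - b)))
print("total penalty:", sum(one_sided_huber(A[i] @ x - b[i], delta) for i in range(m)))
```

Each iteration touches only one of the \(m\) constraints, which is what makes the method attractive when \(m\) is large; the paper's analysis specifies the joint conditions on the stepsize and the two time-varying penalty parameters under which the iterates converge and the stated rates hold.
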
ISSN: 2331-8422