
Solving Convex Min-Min Problems with Smoothness and Strong Convexity in One Group of Variables and Low Dimension in the Other

Bibliographic Details
Published in: Automation and Remote Control, 2021-10, Vol. 82 (10), p. 1679-1691
Main Authors: Gladin, E., Alkousa, M., Gasnikov, A.
Format: Article
Language:English
Summary: The article deals with approaches to solving convex problems of the min-min type that are smooth and strongly convex in only one of the two groups of variables. It is shown that the proposed approaches, based on Vaidya's method, the fast gradient method, and the accelerated gradient method with variance reduction, have linear convergence. It is proposed to use Vaidya's method to solve the outer problem and the fast gradient method to solve the inner (smooth and strongly convex) one. Because of its importance for machine learning applications, the case where the objective function is a sum of a large number of functions is considered separately; in this case, the accelerated gradient method with variance reduction is used instead of the fast gradient method. Results of numerical experiments are presented that illustrate the advantages of the proposed procedures on a logistic regression problem in which an a priori distribution is available for one of the two groups of variables.
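
The record contains no code; the following is a minimal Python sketch of the nested min-min scheme described in the summary, using synthetic data and hypothetical names (A, C, b, F_grad_x, inner_fgm, g). The inner problem, smooth and strongly convex in x, is solved with Nesterov's fast gradient method. The paper's outer solver is Vaidya's cutting-plane method, which is not available in SciPy, so Nelder-Mead from scipy.optimize stands in here purely to illustrate the two-level structure, not the paper's actual algorithm.

    import numpy as np
    from scipy.optimize import minimize

    # Hypothetical synthetic data (not from the paper): the features split into
    # a high-dimensional block A (paired with x) and a low-dimensional block C
    # (paired with y); labels b take values in {-1, +1}.
    rng = np.random.default_rng(0)
    n, dx, dy = 200, 50, 2
    A = rng.standard_normal((n, dx))
    C = rng.standard_normal((n, dy))
    b = np.sign(rng.standard_normal(n))
    mu = 0.1  # strong-convexity parameter of the regularizer in x

    def F_grad_x(x, y):
        """Value and x-gradient of the regularized logistic loss F(x, y)."""
        z = A @ x + C @ y
        val = np.sum(np.logaddexp(0.0, -b * z)) + 0.5 * mu * x @ x
        s = -b / (1.0 + np.exp(b * z))       # derivative of log(1 + exp(-b z)) in z
        return val, A.T @ s + mu * x

    def inner_fgm(y, x0, L, n_iter=300):
        """Nesterov's fast gradient method for the inner problem min_x F(x, y);
        L is a Lipschitz constant of grad_x F, mu its strong-convexity constant."""
        q = mu / L
        beta = (1 - np.sqrt(q)) / (1 + np.sqrt(q))
        x, v = x0.copy(), x0.copy()
        for _ in range(n_iter):
            _, grad_v = F_grad_x(v, y)
            x_new = v - grad_v / L
            v = x_new + beta * (x_new - x)
            x = x_new
        return x

    L = 0.25 * np.linalg.norm(A, 2) ** 2 + mu  # smoothness constant in x

    def g(y):
        """Outer objective g(y) = min_x F(x, y), evaluated inexactly via FGM."""
        x_star = inner_fgm(np.asarray(y), np.zeros(dx), L)
        return F_grad_x(x_star, np.asarray(y))[0]

    # Stand-in for the low-dimensional outer solver (Vaidya's method in the paper).
    res = minimize(g, np.zeros(dy), method="Nelder-Mead")
    print("outer solution y* ~", res.x, " g(y*) ~", res.fun)

The point of the sketch is only the composition: each outer evaluation of g(y) triggers an inner run of the fast gradient method, which is cheap because the inner problem is smooth and strongly convex, while the outer method needs few iterations because y is low-dimensional.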
ISSN: 0005-1179, 1608-3032
DOI: 10.1134/S0005117921100064