Low-Rank Matrix Learning Using Biconvex Surrogate Minimization
Published in: IEEE Transactions on Neural Networks and Learning Systems, 2019-11, Vol. 30 (11), pp. 3517-3527
Main Authors:
Format: Article
Language: English
Summary: Many machine learning problems involve learning a low-rank positive semidefinite matrix. However, existing solvers for this low-rank semidefinite program (SDP) are often expensive. In this paper, by factorizing the target matrix as a product of two matrices and using a Courant penalty to penalize their difference, we reformulate the SDP as a biconvex optimization problem. This allows the use of multiconvex optimization techniques to define simple surrogates, which can be minimized easily by block coordinate descent. Moreover, while traditionally this biconvex problem approaches the original problem only when the penalty parameter is infinite, we show that the two problems are equivalent when the penalty parameter is sufficiently large. Experiments on a number of SDP applications in machine learning show that the proposed algorithm is as accurate as other state-of-the-art algorithms, but is much faster, especially on large data sets.
ISSN: 2162-237X, 2162-2388
DOI: 10.1109/TNNLS.2019.2927819
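To make the reformulation in the abstract concrete, the sketch below applies the Courant-penalty idea to a toy problem: approximating a symmetric matrix M by a rank-r positive semidefinite matrix X. For a convex loss f, the SDP min_X f(X) subject to X PSD with rank at most r is replaced by the biconvex problem min_{U,V} f(U V^T) + (rho/2) * ||U - V||_F^2, which block coordinate descent solves one convex block at a time. This is a minimal sketch under assumed choices (a squared Frobenius loss, closed-form block updates, and the names biconvex_psd_approx and rho), not the paper's surrogate-based algorithm.

```python
# Illustrative sketch of the Courant-penalty biconvex reformulation from the
# abstract, on a toy instance: approximating a symmetric matrix M by a rank-r
# PSD matrix. The loss, variable names, and exact block updates are assumptions
# for illustration; they are not the paper's algorithm.
import numpy as np

def biconvex_psd_approx(M, r, rho=10.0, n_iter=200):
    """Minimize 0.5*||U V^T - M||_F^2 + 0.5*rho*||U - V||_F^2 by block
    coordinate descent; each block subproblem is a convex least-squares
    problem with a closed-form solution."""
    n = M.shape[0]
    rng = np.random.default_rng(0)
    U = rng.standard_normal((n, r))
    V = U.copy()
    I = np.eye(r)
    for _ in range(n_iter):
        # U-step: fix V, minimize the convex subproblem in U exactly.
        U = (M @ V + rho * V) @ np.linalg.inv(V.T @ V + rho * I)
        # V-step: fix U, minimize the convex subproblem in V exactly.
        V = (M.T @ U + rho * U) @ np.linalg.inv(U.T @ U + rho * I)
    return U, V

# Toy usage: a random symmetric matrix, approximated at rank 5.
n, r = 50, 5
A = np.random.default_rng(1).standard_normal((n, n))
M = (A + A.T) / 2
U, V = biconvex_psd_approx(M, r)
X = U @ V.T
# ||U - V||_F is small when rho is large enough (cf. the abstract's
# equivalence result), in which case X is close to U U^T and hence near-PSD.
print("||U - V||_F =", np.linalg.norm(U - V))
print("min eigenvalue of (X + X.T)/2 =", np.linalg.eigvalsh((X + X.T) / 2).min())
```

Each block update exactly minimizes a convex least-squares subproblem, so the objective decreases monotonically; and, as the abstract states, for a sufficiently large penalty parameter the two factors coincide, making X = U V^T positive semidefinite.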