
A unified linear convergence analysis of k-SVD


Bibliographic Details
Published in: Memetic Computing, 2020-12, Vol. 12 (4), pp. 343-353
Main Authors: Xu, Zhiqiang, Ke, Yiping, Cao, Xin, Zhou, Chunlai, Wei, Pengfei, Gao, Xin
Format: Article
Language:English
Description
Summary: Eigenvector computation, e.g., k-SVD for finding top-k singular subspaces, is often of central importance to many scientific and engineering tasks. There has been resurgent interest recently in analyzing relevant methods in terms of their dependence on the singular value gap. In particular, when the gap vanishes, the convergence of k-SVD is considered to be capped by a gap-free sub-linear rate. We argue in this work, both theoretically and empirically, that this is not necessarily the case, refreshing our understanding of this significant problem. Specifically, we leverage the recently proposed structured gap in a careful analysis to establish a unified linear convergence of k-SVD to one of the ground-truth solutions, regardless of the given target matrix and target rank k. The theoretical results are evaluated and verified by experiments on synthetic and real data.
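The abstract's notion of k-SVD (computing the top-k singular subspace) can be illustrated with a minimal sketch of simultaneous subspace iteration, whose per-step contraction is governed by the singular value ratio the abstract discusses. This is a generic textbook method, not the paper's specific algorithm or its structured-gap analysis; the matrix sizes, spectrum, and iteration count below are arbitrary assumptions chosen to make the gap visible.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, k = 40, 20, 3

# Build a test matrix with a clear gap between sigma_k and sigma_{k+1}
# (spectrum is an illustrative assumption, not data from the paper).
s = np.array([10.0, 9.0, 8.0] + [1.0] * (n - 3))
U0, _ = np.linalg.qr(rng.standard_normal((m, n)))
V0, _ = np.linalg.qr(rng.standard_normal((n, n)))
A = U0 @ np.diag(s) @ V0.T

# Simultaneous (block power) iteration on A A^T with QR re-orthonormalization;
# each step contracts the error roughly by (sigma_{k+1} / sigma_k)^2.
Q, _ = np.linalg.qr(rng.standard_normal((m, k)))
for _ in range(100):
    Q, _ = np.linalg.qr(A @ (A.T @ Q))

# Compare against the top-k left singular vectors from a direct SVD via the
# projector distance ||U U^T - Q Q^T||_2, which is small after convergence.
U = np.linalg.svd(A, full_matrices=False)[0][:, :k]
dist = np.linalg.norm(U @ U.T - Q @ Q.T, 2)
print(dist < 1e-10)
```

With the constructed gap (sigma_4 / sigma_3 = 1/8), the iteration reaches machine precision in a handful of steps; when the gap shrinks toward zero, this classical rate degrades, which is exactly the regime the paper's unified analysis addresses.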
ISSN: 1865-9284
      1865-9292
DOI: 10.1007/s12293-020-00315-4