Accelerated orthogonal least-squares for large-scale sparse reconstruction
Published in: Digital Signal Processing, 2018-11, Vol. 82, p. 91-105
Main Authors:
Format: Article
Language: English
Summary: We study the problem of inferring a sparse vector from random linear combinations of its components. We propose the Accelerated Orthogonal Least-Squares (AOLS) algorithm, which improves the performance of the well-known Orthogonal Least-Squares (OLS) algorithm while requiring significantly lower computational cost. While OLS greedily selects columns of the coefficient matrix that correspond to non-zero components of the sparse vector, AOLS employs a novel, computationally efficient procedure that speeds up the search by anticipating future selections, choosing L columns in each step, where L is an adjustable hyper-parameter. We analyze the performance of AOLS and establish lower bounds on the probability of exact recovery for both noiseless and noisy random linear measurements. In the noiseless scenario, it is shown that when the coefficients are sampled from a Gaussian distribution, AOLS with high probability recovers a k-sparse m-dimensional vector using O(k log(m/(k+L−1))) measurements. A similar result is established for the bounded-noise scenario, where an additional condition on the smallest nonzero element of the unknown vector is required. The asymptotic sampling complexity of AOLS is lower than that of existing sparse reconstruction algorithms. In simulations, AOLS is compared to state-of-the-art sparse recovery techniques and shown to provide better performance in terms of accuracy, running time, or both. Finally, we consider an application of AOLS to clustering high-dimensional data lying on a union of low-dimensional subspaces and demonstrate its superiority over existing methods.
• A novel scheme for compressed sensing and sparse recovery with performance guarantees.
• Theoretical analysis of sparse recovery from random linear measurements.
• Low complexity and high accuracy compared to competing schemes.
• State-of-the-art results in subspace clustering applications.
ISSN: 1051-2004, 1095-4333
DOI: 10.1016/j.dsp.2018.07.010
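
To make the L-column selection idea described in the summary concrete, below is a minimal Python sketch of a greedy recovery loop that picks L columns per iteration, scored by correlation with the current residual, and then refits the estimate on the chosen support by least squares. This illustrates the concept only: the function and parameter names are hypothetical, and the efficient recursive updates that distinguish the actual AOLS algorithm are omitted.

```python
import numpy as np

def greedy_multicolumn_ls(A, y, k, L=2, tol=1e-8):
    """Greedy sparse recovery sketch: select L columns per iteration by
    correlation with the residual, then refit on the chosen support via
    least squares. Illustrative only; not the exact AOLS recursion."""
    m = A.shape[1]
    support = []            # indices of selected columns (may slightly exceed k)
    residual = y.copy()
    while len(support) < k:
        scores = np.abs(A.T @ residual)      # score columns against the residual
        if support:
            scores[support] = -np.inf        # never reselect a column
        new_idx = np.argsort(scores)[-L:]    # take the L best columns this step
        support.extend(int(i) for i in new_idx)
        # refit the estimate on the current support by least squares
        x_sub, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ x_sub
        if np.linalg.norm(residual) < tol:
            break
    x_hat = np.zeros(m)
    x_hat[support] = x_sub
    return x_hat, sorted(support)

# Toy example: recover a 5-sparse vector from 40 Gaussian measurements.
rng = np.random.default_rng(0)
n, m, k = 40, 200, 5
A = rng.standard_normal((n, m)) / np.sqrt(n)
x_true = np.zeros(m)
x_true[rng.choice(m, size=k, replace=False)] = rng.standard_normal(k)
y = A @ x_true
x_hat, support = greedy_multicolumn_ls(A, y, k, L=2)
print("recovered support:", support)
print("reconstruction error:", np.linalg.norm(x_hat - x_true))
```

Larger L front-loads more of the support search into each iteration, which is the intuition behind the O(k log(m/(k+L−1))) measurement bound quoted in the summary; the paper's analysis of course applies to AOLS itself, not to this simplified sketch.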