
Group Sparse Bayesian Learning Via Exact and Fast Marginal Likelihood Maximization

Bibliographic Details
Published in: IEEE Transactions on Signal Processing, 2017-05, Vol. 65 (10), p. 2741-2753
Main Authors: Ma, Zeqiang, Dai, Wei, Liu, Yimin, Wang, Xiqin
Format: Article
Language: English
Description
Summary: This paper concerns the sparse Bayesian learning (SBL) problem for group sparse signals. Group sparsity means that the signal coefficients can be divided into groups and that the entries within a group are simultaneously zero or nonzero. In SBL, each group is controlled by a hyperparameter, which is estimated by solving the marginal likelihood maximization (MLM) problem: the marginal likelihood is maximized with respect to a single hyperparameter while all the others are held fixed. The main contribution of this paper is to extend the fast marginal likelihood maximization (FMLM) method to the group-sparse case. The key step in this extension reduces to finding the roots of a polynomial, so the local maximum of the marginal likelihood can be found exactly and the final estimate is obtained more efficiently. Furthermore, most of the large matrix inverses involved in MLM are replaced with eigenvalue decompositions of much smaller matrices, which substantially reduces the computational complexity. Numerical results with both simulated and real data demonstrate the excellent performance and relatively high computational efficiency of the proposed method.
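The per-group update the abstract describes can be sketched as follows. In Tipping-and-Faul-style fast SBL, the change in log marginal likelihood from assigning hyperparameter gamma to one group takes the form l(gamma) = -log det(I + gamma*S) + gamma * q^T (I + gamma*S)^{-1} q, where S and q are the group's sparsity and quality factors. This is a minimal sketch under that assumed decomposition, not the authors' exact formulation: diagonalizing the small matrix S (the eigendecomposition mentioned in the abstract) turns the stationarity condition dl/dgamma = 0 into a polynomial root-finding problem, so the maximizer can be located exactly rather than by iterative search.

```python
import numpy as np
from numpy.polynomial import polynomial as P

def best_group_gamma(S, q):
    """Exactly maximize, over gamma >= 0,
        l(gamma) = -log det(I + gamma*S) + gamma * q^T (I + gamma*S)^{-1} q,
    where S (d x d, symmetric PSD) and q (d,) play the role of one group's
    sparsity and quality factors. Hypothetical illustration of the
    eigendecomposition-plus-polynomial-roots strategy from the abstract."""
    lam, U = np.linalg.eigh(S)            # eigendecomposition of a small matrix
    qt2 = (U.T @ q) ** 2                  # squared quality factors in eigenbasis

    def ell(g):                           # objective, separable in the eigenbasis
        t = 1.0 + g * lam
        return float(np.sum(-np.log(t) + g * qt2 / t))

    # Stationarity: sum_j (qt2_j - lam_j - g*lam_j^2) / (1 + g*lam_j)^2 = 0.
    # Clearing denominators yields a polynomial in g (ascending coefficients).
    poly = np.zeros(1)
    for j in range(lam.size):
        term = np.array([qt2[j] - lam[j], -lam[j] ** 2])
        for k in range(lam.size):
            if k != j:
                fac = np.array([1.0, lam[k]])        # (1 + g*lam_k)
                term = P.polymul(term, P.polymul(fac, fac))
        poly = P.polyadd(poly, term)

    # Candidates: real positive roots plus the boundary gamma = 0
    # (gamma = 0 means the group is pruned from the model).
    cand = [0.0] + [r.real for r in P.polyroots(poly)
                    if abs(r.imag) < 1e-9 and r.real > 0]
    return max(cand, key=ell)
```

For a group of size one this recovers the classical scalar FMLM update: with S = [[s]] and q = [q0], the polynomial is linear and the optimum is gamma = (q0^2 - s)/s^2 whenever q0^2 > s, i.e. the reciprocal of the familiar alpha = s^2/(q0^2 - s).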
ISSN: 1053-587X, 1941-0476
DOI: 10.1109/TSP.2017.2675867