Generalizing expectation propagation with mixtures of exponential family distributions and an application to Bayesian logistic regression

Bibliographic Details
Published in: Neurocomputing (Amsterdam), 2019-04, Vol. 337, pp. 180-190
Main Authors: Sun, Shiliang; He, Shaojie
Format: Article
Language: English

Description
Summary: Expectation propagation (EP) is a widely used deterministic approximate inference algorithm in Bayesian machine learning. Traditional EP approximates an intractable posterior distribution through a set of local approximations that are updated iteratively. In this paper, we propose a generalized version of EP, called generalized EP (GEP), a new method for approximate inference based on minimizing the Kullback-Leibler (KL) divergence. However, when the variance of the gradient estimates used in this minimization is large, the algorithm may take a long time to converge. We therefore use control variates to develop a variance-reduced version of the method, called GEP-CV. We evaluate our approach on Bayesian logistic regression, where it converges faster and performs better than other state-of-the-art approaches.
ISSN: 0925-2312, 1872-8286
DOI: 10.1016/j.neucom.2019.01.065
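
The summary attributes GEP-CV's speedup to control variates that reduce the variance of the gradient estimates used in the KL-divergence minimization. The sketch below is not the authors' code; the one-dimensional Gaussian q(z; mu, sigma), the integrand f, and all names are illustrative assumptions. It shows the generic control-variate trick on a score-function gradient estimator: subtracting a scaled copy of the zero-mean score function leaves the estimator unbiased while shrinking its variance.

    import numpy as np

    rng = np.random.default_rng(0)

    def score_mu(z, mu, sigma):
        # d/dmu of log N(z; mu, sigma^2); its expectation under q is zero,
        # which makes it a valid (bias-free) control variate.
        return (z - mu) / sigma**2

    def f(z):
        # Illustrative integrand, e.g. a negative log-sigmoid likelihood term
        # of the kind that appears in Bayesian logistic regression.
        return np.log1p(np.exp(-z))

    mu, sigma, n = 0.5, 1.0, 10_000
    z = rng.normal(mu, sigma, size=n)

    h = score_mu(z, mu, sigma)
    naive = f(z) * h  # plain score-function estimator of d/dmu E_q[f(z)]

    # Optimal scale a* = Cov(naive, h) / Var(h); since E_q[h] = 0,
    # subtracting a*h changes the variance but not the mean.
    C = np.cov(naive, h)
    a = C[0, 1] / C[1, 1]
    reduced = naive - a * h

    print("naive estimator:      mean %.4f, var %.4f" % (naive.mean(), naive.var()))
    print("with control variate: mean %.4f, var %.4f" % (reduced.mean(), reduced.var()))

With a well-correlated control variate, the reduced estimator typically shows a substantially smaller sample variance at the same mean, which is the mechanism the summary credits for GEP-CV's faster convergence.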