Unnormalized Variational Bayes


Bibliographic Details
Published in: arXiv.org 2020-07
Main Author: Saremi, Saeed
Format: Article
Language: English
Description
Summary: We unify empirical Bayes and variational Bayes for approximating unnormalized densities. This framework, named unnormalized variational Bayes (UVB), is based on formulating a latent variable model for the random variable \(Y=X+N(0,\sigma^2 I_d)\) and using the evidence lower bound (ELBO), computed by a variational autoencoder, as a parametrization of the energy function of \(Y\), which is then used to estimate \(X\) with the empirical Bayes least-squares estimator. In this intriguing setup, the \(\textit{gradient}\) of the ELBO with respect to the noisy inputs plays the central role in learning the energy function. Empirically, we demonstrate that UVB has a higher capacity to approximate energy functions than the MLP parametrization used in neural empirical Bayes (DEEN). We especially showcase \(\sigma=1\), where the differences between UVB and DEEN become visible and qualitative in the denoising experiments. At this high noise level, the distribution of \(Y\) is heavily smoothed, and we demonstrate that one can traverse all MNIST classes in a variety of styles in a single run, without a restart, via walk-jump sampling with a fast-mixing Langevin MCMC sampler. We finish by probing the encoder/decoder of the trained models and confirm that UVB \(\neq\) VAE.
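For readers who want the mechanics, here is a minimal PyTorch sketch of the two ingredients the summary describes: an energy function \(E_\theta(y)\) parametrized as the negative ELBO of a small VAE, and the empirical Bayes least-squares estimator \(\hat{x}(y) = y - \sigma^2 \nabla_y E_\theta(y)\) computed from the gradient of the ELBO with respect to the noisy input. The architecture, layer sizes, and names (`UVBEnergy`, `least_squares_denoiser`) are hypothetical illustrations under these assumptions, not the paper's exact implementation.

```python
import torch
import torch.nn as nn

class UVBEnergy(nn.Module):
    """Energy E(y) parametrized as the negative ELBO of a small VAE.

    Hypothetical architecture for illustration; the paper's networks differ.
    """
    def __init__(self, d=784, h=256, z=32):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(d, h), nn.SiLU(), nn.Linear(h, 2 * z))
        self.dec = nn.Sequential(nn.Linear(z, h), nn.SiLU(), nn.Linear(h, d))

    def forward(self, y):
        # Encoder outputs mean and log-variance of q(z | y).
        mu, logvar = self.enc(y).chunk(2, dim=-1)
        # Reparametrization: the ELBO below is a one-sample stochastic estimate.
        zs = mu + torch.randn_like(mu) * (0.5 * logvar).exp()
        y_hat = self.dec(zs)
        # ELBO = reconstruction term - KL(q(z|y) || N(0, I)), up to constants.
        recon = -0.5 * ((y - y_hat) ** 2).sum(-1)
        kl = 0.5 * (mu ** 2 + logvar.exp() - 1.0 - logvar).sum(-1)
        return -(recon - kl)  # energy = negative ELBO

def least_squares_denoiser(energy, y, sigma):
    """Empirical Bayes estimator: x_hat(y) = y - sigma^2 * grad_y E(y)."""
    y = y.detach().requires_grad_(True)
    (grad_y,) = torch.autograd.grad(energy(y).sum(), y)
    return (y - sigma ** 2 * grad_y).detach()

# Toy usage: denoise a batch of noisy MNIST-sized vectors at sigma = 1.
model = UVBEnergy()
y = torch.randn(8, 784)  # stand-in for samples of Y = X + N(0, I)
x_hat = least_squares_denoiser(model, y, sigma=1.0)
```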
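The walk-jump sampling mentioned in the summary can be sketched on top of the same energy: Langevin MCMC "walks" in the smoothed \(Y\)-space, and the least-squares estimator "jumps" each visited \(y\) back to an estimate of \(X\), which is how a single chain can drift across all classes without restarts. The step size and chain length below are placeholder values under the same assumptions as the sketch above, not the paper's settings.

```python
def walk_jump_sample(energy, y0, sigma, n_steps=1000, step=1e-2):
    """Sketch of walk-jump sampling (hypothetical hyperparameters).

    Walk: unadjusted Langevin dynamics on the smoothed density of Y.
    Jump: empirical Bayes estimate of X at each visited y.
    """
    y, xs = y0, []
    for _ in range(n_steps):
        y = y.detach().requires_grad_(True)
        (grad_y,) = torch.autograd.grad(energy(y).sum(), y)
        # Jump: x_hat(y) = y - sigma^2 * grad_y E(y), recorded along the chain.
        xs.append((y - sigma ** 2 * grad_y).detach())
        # Walk: y <- y - step * grad_y + sqrt(2 * step) * noise.
        y = y.detach() - step * grad_y + (2 * step) ** 0.5 * torch.randn_like(y)
    return torch.stack(xs)

# Single-run usage, reusing `model` from the sketch above: start from noise.
samples = walk_jump_sample(model, torch.randn(1, 784), sigma=1.0)
```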
ISSN: 2331-8422