CONVERGENCE COMPLEXITY ANALYSIS OF ALBERT AND CHIB’S ALGORITHM FOR BAYESIAN PROBIT REGRESSION

Bibliographic Details
Published in: The Annals of Statistics, 2019-08, Vol. 47 (4), pp. 2320–2347
Main Authors: Qin, Qian; Hobert, James P.
Format: Article
Language:English
Description
Summary: The use of MCMC algorithms in high dimensional Bayesian problems has become routine. This has spurred so-called convergence complexity analysis, the goal of which is to ascertain how the convergence rate of a Monte Carlo Markov chain scales with the sample size, n, and/or the number of covariates, p. This article provides a thorough convergence complexity analysis of Albert and Chib’s [J. Amer. Statist. Assoc. 88 (1993) 669–679] data augmentation algorithm for the Bayesian probit regression model. The main tools used in this analysis are drift and minorization conditions. The usual pitfalls associated with this type of analysis are avoided by utilizing centered drift functions, which are minimized in high posterior probability regions, and by using a new technique to suppress high-dimensionality in the construction of minorization conditions. The main result is that the geometric convergence rate of the underlying Markov chain is bounded strictly below 1 both as n → ∞ (with p fixed) and as p → ∞ (with n fixed). Furthermore, the first computable bounds on the total variation distance to stationarity are byproducts of the asymptotic analysis.
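
The data augmentation (DA) algorithm studied in the article is Albert and Chib's two-block Gibbs sampler, which alternates between drawing latent variables from truncated normal distributions given the regression coefficients and drawing the coefficients from a multivariate normal full conditional. The Python sketch below illustrates that two-step structure under an assumed proper N(0, prior_var * I_p) prior on the coefficient vector; the function name, the prior choice, and all numerical settings are illustrative assumptions, not details taken from the article.

import numpy as np
from scipy.stats import truncnorm

def albert_chib_sampler(y, X, n_iter=5000, prior_var=100.0, seed=None):
    """Minimal sketch of Albert and Chib's (1993) DA Gibbs sampler for
    Bayesian probit regression: y_i | beta ~ Bernoulli(Phi(x_i' beta)),
    with an assumed N(0, prior_var * I) prior on beta."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    B_inv = np.eye(p) / prior_var              # prior precision matrix
    V = np.linalg.inv(X.T @ X + B_inv)         # covariance of beta | z
    L = np.linalg.cholesky(V)
    beta = np.zeros(p)
    draws = np.empty((n_iter, p))
    for t in range(n_iter):
        # Step 1: z_i | beta, y_i is N(x_i' beta, 1) truncated to (0, inf)
        # when y_i = 1 and to (-inf, 0] when y_i = 0.
        mu = X @ beta
        lower = np.where(y == 1, -mu, -np.inf)
        upper = np.where(y == 1, np.inf, -mu)
        z = mu + truncnorm.rvs(lower, upper, size=n, random_state=rng)
        # Step 2: beta | z ~ N(V X'z, V) under the zero-mean normal prior.
        beta = V @ (X.T @ z) + L @ rng.standard_normal(p)
        draws[t] = beta
    return draws

# Illustrative usage on synthetic data:
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 3))
y = (X @ np.array([1.0, -0.5, 0.25]) + rng.standard_normal(200) > 0).astype(int)
draws = albert_chib_sampler(y, X, n_iter=2000, seed=1)
print(draws[500:].mean(axis=0))   # posterior means after a short burn-in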
ISSN: 0090-5364, 2168-8966
DOI: 10.1214/18-aos1749