Dimension free convergence rates for Gibbs samplers for Bayesian linear mixed models


Abstract

The emergence of big data has led to a growing interest in so-called convergence complexity analysis, which is the study of how the convergence rate of a Monte Carlo Markov chain (for an intractable Bayesian posterior distribution) scales as the underlying data set grows in size. Convergence complexity analysis of practical Monte Carlo Markov chains on continuous state spaces is quite challenging, and there have been very few successful analyses of such chains. One fruitful analysis was recently presented by Qin and Hobert (2021b), who studied a Gibbs sampler for a simple Bayesian random effects model. These authors showed that, under regularity conditions, the geometric convergence rate of this Gibbs sampler converges to zero as the data set grows in size. It is shown herein that similar behavior is exhibited by Gibbs samplers for more general Bayesian models that possess both random effects and traditional continuous covariates, so-called linear mixed models. The analysis employs the Wasserstein-based techniques introduced by Qin and Hobert (2021b).
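To make the object of study concrete, the following is a minimal sketch of a two-block Gibbs sampler for a one-way Bayesian random effects model (y_ij = mu + u_i + e_ij) with a flat prior on mu and conjugate inverse-gamma priors on the variance components. The model, priors, hyperparameters, and function names here are illustrative assumptions for exposition only; they are not necessarily the exact sampler analyzed in the paper or in Qin and Hobert (2021b).

```python
import numpy as np

def block_gibbs_random_effects(y, n_iter=5000, a_e=2.0, b_e=2.0,
                               a_u=2.0, b_u=2.0, seed=0):
    """Illustrative two-block Gibbs sampler for the one-way random effects model
        y_ij = mu + u_i + e_ij,  u_i ~ N(0, s2_u),  e_ij ~ N(0, s2_e),
    with a flat prior on mu and inverse-gamma(a, b) priors on s2_u and s2_e.
    y is a list of 1-D NumPy arrays, one array of observations per group."""
    rng = np.random.default_rng(seed)
    q = len(y)                                         # number of groups
    n_i = np.array([len(yi) for yi in y], dtype=float) # group sizes
    N = n_i.sum()
    ybar_i = np.array([yi.mean() for yi in y])         # group means

    mu = np.concatenate(y).mean()
    u = np.zeros(q)
    s2_e = s2_u = 1.0
    draws = np.empty((n_iter, 3))
    for t in range(n_iter):
        # Block 1: variance components given (mu, u) -- conjugate inverse-gamma draws.
        sse = sum(np.sum((yi - mu - ui) ** 2) for yi, ui in zip(y, u))
        s2_e = 1.0 / rng.gamma(a_e + N / 2.0, 1.0 / (b_e + sse / 2.0))
        s2_u = 1.0 / rng.gamma(a_u + q / 2.0, 1.0 / (b_u + np.dot(u, u) / 2.0))

        # Block 2: joint draw of (mu, u) given the variance components,
        # done by composition: mu | y (with u integrated out), then u | mu, y.
        w = n_i / (s2_e + n_i * s2_u)                  # precision of ybar_i about mu
        mu = rng.normal((w * ybar_i).sum() / w.sum(), np.sqrt(1.0 / w.sum()))
        prec = n_i / s2_e + 1.0 / s2_u
        u = rng.normal((n_i / s2_e) * (ybar_i - mu) / prec, np.sqrt(1.0 / prec))

        draws[t] = (mu, s2_u, s2_e)
    return draws

if __name__ == "__main__":
    # Simulated data: 5 groups of 20 observations with shifted means.
    rng = np.random.default_rng(1)
    data = [rng.normal(loc=i, scale=1.0, size=20) for i in range(5)]
    samples = block_gibbs_random_effects(data, n_iter=2000)
    print(samples[-5:])  # last few draws of (mu, s2_u, s2_e)
```

The paper's interest is in how fast chains of this type mix as the number of groups and observations grows, so the sketch is only meant to fix ideas about the two-block structure (variance components in one block, location parameters in the other), not to reproduce the dimension-free rate bounds themselves.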
