In this article, we analyze Hamiltonian Monte Carlo (HMC) by placing it in the setting of Riemannian geometry via the Jacobi metric, so that each step corresponds to a geodesic on a suitable Riemannian manifold. We then combine the notion of curvature of a Markov chain due to Joulin and Ollivier with the classical sectional curvature from Riemannian geometry to derive error bounds for HMC in important cases where the curvature is positive. These cases include several classical distributions, such as multivariate Gaussians, as well as distributions arising in the study of Bayesian image registration. The theoretical development suggests sectional curvature as a new diagnostic tool for the convergence of certain Markov chains.
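For reference, the standard fact behind the Jacobi-metric construction mentioned above (stated here for orientation, not as a result of the article): for a Hamiltonian of the form $H(q,p) = \tfrac{1}{2}\|p\|^{2} + V(q)$, the Maupertuis–Jacobi principle says that trajectories of fixed total energy $E$ coincide, up to reparametrization, with geodesics of the conformally rescaled metric
$$ g_E = 2\,\bigl(E - V(q)\bigr)\,\langle\cdot,\cdot\rangle \qquad \text{on } \{\,q : V(q) < E\,\}. $$
This is the sense in which each HMC trajectory can be identified with a geodesic on a Riemannian manifold whose sectional curvature can then be analyzed.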
In this article, we consider the preconditioned Hamiltonian Monte Carlo (pHMC) algorithm defined directly on an infinite-dimensional Hilbert space. In this context, and under a condition reminiscent of strong log-concavity of the target measure, we …
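For context, the standard finite-dimensional notion that such a condition echoes (given here for reference, not quoted from the paper): a target density $\pi \propto e^{-U}$ on $\mathbb{R}^d$ is strongly log-concave with constant $\kappa > 0$ if $\nabla^2 U(x) \succeq \kappa I$ for all $x$, i.e. if $U$ is $\kappa$-strongly convex.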
The iterated conditional sequential Monte Carlo (i-CSMC) algorithm from Andrieu, Doucet and Holenstein (2010) is an MCMC approach for efficiently sampling from the joint posterior distribution of the $T$ latent states in challenging time-series models.
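To make the mechanics concrete, here is a minimal sketch of one conditional SMC update, assuming scalar latent states, a bootstrap proposal, and multinomial resampling; the callbacks init_sample, transition_sample and log_obs_density are hypothetical placeholders for the model, not part of the paper.

```python
import numpy as np

def csmc_update(x_ref, N, init_sample, transition_sample, log_obs_density, rng):
    """One conditional SMC update: run a particle filter in which the last
    particle slot is pinned to the reference trajectory x_ref, then draw a
    new trajectory from the resulting particle system."""
    T = len(x_ref)
    particles = np.empty((T, N))
    ancestors = np.zeros((T, N), dtype=int)

    # t = 0: fresh draws from the prior; the reference path occupies slot N-1.
    particles[0, :N - 1] = init_sample(rng, N - 1)
    particles[0, N - 1] = x_ref[0]
    logw = log_obs_density(0, particles[0])

    for t in range(1, T):
        # Multinomial resampling of ancestors for the N-1 free particles.
        w = np.exp(logw - logw.max())
        w /= w.sum()
        ancestors[t, :N - 1] = rng.choice(N, size=N - 1, p=w)
        ancestors[t, N - 1] = N - 1   # the reference particle keeps its own lineage
        # Propagate through the transition density (bootstrap proposal).
        particles[t, :N - 1] = transition_sample(rng, particles[t - 1, ancestors[t, :N - 1]])
        particles[t, N - 1] = x_ref[t]
        logw = log_obs_density(t, particles[t])

    # Sample one trajectory index from the final weights and trace its ancestry back.
    w = np.exp(logw - logw.max())
    w /= w.sum()
    k = rng.choice(N, p=w)
    x_new = np.empty(T)
    x_new[T - 1] = particles[T - 1, k]
    for t in range(T - 1, 0, -1):
        k = ancestors[t, k]
        x_new[t - 1] = particles[t - 1, k]
    return x_new
```

Iterating `x_ref = csmc_update(x_ref, ...)` across sweeps yields the i-CSMC chain, whose stationary distribution is the joint posterior over the latent states.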
Probabilistic programming uses programs to express generative models whose posterior probability is then computed by built-in inference engines. A challenging goal is to develop general-purpose inference algorithms that work out-of-the-box for arbitrary programs.
We prove a non-asymptotic concentration inequality for the spectral norm of sparse inhomogeneous random tensors with Bernoulli entries. For an order-$k$ inhomogeneous random tensor $T$ with sparsity $p_{\max} \geq \frac{c\log n}{n}$, we show that $\|T - \mathbb{E}T\|$ …
We present a method for performing Hamiltonian Monte Carlo that largely eliminates sample rejection for typical hyperparameters. In situations that would normally lead to rejection, a longer trajectory is instead computed until a new state is reached.
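For orientation, here is a minimal sketch of the standard HMC update (leapfrog integration followed by a Metropolis accept/reject) that the quoted approach modifies; the trajectory-extension rule itself is not reproduced here. The functions log_prob and grad_log_prob and the one-dimensional state layout are assumptions of this sketch.

```python
import numpy as np

def hmc_step(x, log_prob, grad_log_prob, step_size, n_leapfrog, rng):
    """Standard HMC update: leapfrog integration plus Metropolis accept/reject.
    The rejection-avoiding variant described above would replace the plain
    'return x' on rejection with a further-extended trajectory; that rule is
    not reproduced here."""
    p0 = rng.standard_normal(x.shape)          # fresh Gaussian momentum
    x_new, p = x.copy(), p0.copy()

    # Leapfrog integration of the Hamiltonian dynamics for n_leapfrog steps.
    p += 0.5 * step_size * grad_log_prob(x_new)
    for _ in range(n_leapfrog - 1):
        x_new += step_size * p
        p += step_size * grad_log_prob(x_new)
    x_new += step_size * p
    p += 0.5 * step_size * grad_log_prob(x_new)

    # Metropolis correction from the change in total energy H = -log p(x) + |p|^2 / 2.
    h0 = -log_prob(x) + 0.5 * np.dot(p0, p0)
    h1 = -log_prob(x_new) + 0.5 * np.dot(p, p)
    if np.log(rng.uniform()) < h0 - h1:
        return x_new
    return x
```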