Markov chain Monte Carlo (MCMC) is a simulation method commonly used for estimating expectations with respect to a given distribution. We consider estimating the covariance matrix of the asymptotic multivariate normal distribution of a vector of sample means. Geyer (1992) developed a Monte Carlo error estimation method for estimating a univariate mean. We propose a novel multivariate version of Geyer's method that provides an asymptotically valid estimator for the covariance matrix and results in stable Monte Carlo estimates. The finite-sample properties of the proposed method are investigated via simulation experiments.
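As a point of reference for the estimation problem this abstract describes, here is a minimal sketch of a standard multivariate batch means estimator of the asymptotic covariance matrix. Note this is a common baseline technique, not the paper's multivariate extension of Geyer's initial-sequence method; the function name and the choice of 30 batches are illustrative assumptions.

```python
import numpy as np

def batch_means_cov(chain, n_batches=30):
    """Multivariate batch means estimate of the asymptotic covariance
    matrix Sigma in sqrt(n) * (sample_mean - mu) -> N(0, Sigma).

    chain: (n, p) array of correlated MCMC draws.
    Returns a (p, p) covariance matrix estimate.
    """
    chain = np.asarray(chain, dtype=float)
    n, p = chain.shape
    b = n // n_batches                       # batch size
    trimmed = chain[: b * n_batches]         # drop the remainder draws
    # Mean of each non-overlapping batch: shape (n_batches, p)
    batch_means = trimmed.reshape(n_batches, b, p).mean(axis=1)
    centered = batch_means - trimmed.mean(axis=0)
    # Scale by the batch size; divide by (a - 1) degrees of freedom
    return b * (centered.T @ centered) / (n_batches - 1)
```

For an i.i.d. chain this estimate is close to the ordinary sample covariance; for a positively autocorrelated chain it is inflated accordingly, which is what makes it suitable for Monte Carlo error assessment.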
A novel class of non-reversible Markov chain Monte Carlo schemes relying on continuous-time piecewise-deterministic Markov Processes has recently emerged. In these algorithms, the state of the Markov process evolves according to a deterministic dynam
This paper proposes a family of weighted batch means variance estimators, which are computationally efficient and can be conveniently applied in practice. The focus is on Markov chain Monte Carlo simulations and estimation of the asymptotic covarianc
Markov chain Monte Carlo (MCMC) produces a correlated sample for estimating expectations with respect to a target distribution. A fundamental question is: when should sampling stop so that we have good estimates of the desired quantities? The key to a
We propose Adaptive Incremental Mixture Markov chain Monte Carlo (AIMM), a novel approach to sample from challenging probability distributions defined on a general state-space. While adaptive MCMC methods usually update a parametric proposal kernel w
Markov chain Monte Carlo (MCMC) requires evaluating the full-data likelihood at different parameter values iteratively and is often computationally infeasible for large data sets. In this paper, we propose to approximate the log-likelihood with subs