Markov Chain Monte Carlo (MCMC) is a powerful method for drawing samples from non-standard probability distributions and is utilised across many fields and disciplines. Methods such as the Metropolis-Adjusted Langevin Algorithm (MALA) and Hamiltonian Monte Carlo (HMC), which use gradient information to explore the target distribution, are popular variants of MCMC. The Sequential Monte Carlo (SMC) sampler is an alternative sampling method which, unlike MCMC, can readily utilise parallel computing architectures and also has tuning parameters not available to MCMC. One such parameter is the L-kernel, which can be used to minimise the variance of the estimates from an SMC sampler. In this letter, we show how the proposal used in the No-U-Turn Sampler (NUTS), an advanced variant of HMC, can be incorporated into an SMC sampler to improve the efficiency of the exploration of the target space. We also show how the SMC sampler can be optimised using both a near-optimal L-kernel and a Hamiltonian proposal.
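As a point of reference for the methods this abstract contrasts, the following is a minimal sketch of the baseline random-walk Metropolis-Hastings sampler that gradient-based variants such as MALA, HMC, and NUTS improve upon. The function name, parameters, and the Gaussian example target are illustrative choices, not taken from the paper.

```python
import numpy as np

def metropolis_hastings(log_target, x0, n_samples, step=0.5, seed=None):
    # Illustrative random-walk Metropolis-Hastings sampler; unlike MALA/HMC,
    # the proposal uses no gradient information about the target.
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    samples = np.empty((n_samples, x.size))
    accepted = 0
    log_p = log_target(x)
    for i in range(n_samples):
        # Symmetric Gaussian random-walk proposal
        x_prop = x + step * rng.standard_normal(x.size)
        log_p_prop = log_target(x_prop)
        # Metropolis acceptance step leaves the target distribution invariant
        if np.log(rng.uniform()) < log_p_prop - log_p:
            x, log_p = x_prop, log_p_prop
            accepted += 1
        samples[i] = x
    return samples, accepted / n_samples

# Example: sample a standard 2-D Gaussian (an unnormalised log-density suffices)
log_target = lambda x: -0.5 * np.dot(x, x)
samples, acc_rate = metropolis_hastings(log_target, np.zeros(2), 5000, seed=0)
```

Unlike an SMC sampler, which propagates a weighted population of particles and can exploit parallel hardware, this chain is inherently sequential, which is one motivation for the SMC approach described above.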
Key to any cosmic microwave background (CMB) analysis is the separation of the CMB from foreground contaminants. In this paper we present a novel implementation of Bayesian CMB component separation. We sample from the full posterior distribution usin
In this article, we derive a novel non-reversible, continuous-time Markov chain Monte Carlo (MCMC) sampler, called Coordinate Sampler, based on a piecewise deterministic Markov process (PDMP), which can be seen as a variant of the Zigzag sampler. In
The Bouncy Particle Sampler is a Markov chain Monte Carlo method based on a nonreversible piecewise deterministic Markov process. In this scheme, a particle explores the state space of interest by evolving according to a linear dynamics which is alte
The self-learning Metropolis-Hastings algorithm is a powerful Monte Carlo method that, with the help of machine learning, adaptively generates an easy-to-sample probability distribution for approximating a given hard-to-sample distribution. This pape
Markov Chain Monte Carlo (MCMC) methods sample from unnormalized probability distributions and offer guarantees of exact sampling. However, in the continuous case, unfavorable geometry of the target distribution can greatly limit the efficiency of MC