Sequentially guided MCMC proposals for synthetic likelihoods and correlated synthetic likelihoods


Abstract

Synthetic likelihood (SL) is a strategy for parameter inference when the likelihood function is analytically or computationally intractable. In SL, the likelihood function of the data is replaced by a multivariate Gaussian density over summary statistics of the data. SL requires the simulation of many replicate datasets at every parameter value considered by a sampling algorithm, such as MCMC, making the method computationally intensive. We propose two strategies to alleviate the computational burden imposed by SL algorithms. We first introduce a novel MCMC algorithm for SL in which the proposal distribution is sequentially tuned and made conditional on the data, so that it rapidly guides the proposed parameters towards high posterior probability regions. Second, we exploit strategies borrowed from the correlated pseudo-marginal MCMC literature to improve the mixing of the chains in an SL framework. Our methods enable inference for challenging case studies when the chain is initialised in low posterior probability regions of the parameter space, where standard samplers fail. Our guided sampler can also potentially be used with MCMC samplers for approximate Bayesian computation (ABC). Our goal is to provide ways to make the most of each expensive MCMC iteration, which will broaden the scope of likelihood-free inference for models with costly simulators. To illustrate the advantages stemming from our framework, we consider four benchmark examples, including estimation of parameters for a cosmological model and a stochastic model with highly non-Gaussian summary statistics.
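
For readers unfamiliar with the basic SL mechanism described above, the following is a minimal Python sketch of a single synthetic log-likelihood evaluation: replicate datasets are simulated at a parameter value, a multivariate Gaussian is fitted to their summary statistics, and the observed summaries are evaluated under that fit. The functions simulate(theta, rng) and summaries(data), the replicate count, and all other names are hypothetical placeholders; this is not the paper's guided proposal or correlated SL scheme, only the standard SL building block.

import numpy as np
from scipy.stats import multivariate_normal

def synthetic_loglik(theta, s_obs, simulate, summaries, n_replicates=100, rng=None):
    # Estimate the synthetic log-likelihood at parameter value `theta`:
    # simulate replicate datasets, fit a multivariate Gaussian to their
    # summary statistics, and evaluate it at the observed summaries `s_obs`.
    rng = np.random.default_rng(rng)
    sims = np.array([summaries(simulate(theta, rng)) for _ in range(n_replicates)])
    mu_hat = sims.mean(axis=0)              # sample mean of simulated summaries
    sigma_hat = np.cov(sims, rowvar=False)  # sample covariance of simulated summaries
    return multivariate_normal.logpdf(s_obs, mean=mu_hat, cov=sigma_hat)

In an MCMC setting this estimate would be recomputed at every proposed parameter value, which is precisely the cost the paper's guided proposals and correlated-likelihood strategies aim to spend more effectively.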
