emcee is a Python library implementing a class of affine-invariant ensemble samplers for Markov chain Monte Carlo (MCMC). The package has been widely applied to probabilistic modeling problems in astrophysics, where it was originally published, with some applications in other fields. When it was first released in 2012, the interface implemented in emcee was fundamentally different from the MCMC libraries popular at the time, such as PyMC, because it was specifically designed to work with black-box models instead of structured graphical models. This interface has been popular for applications in astrophysics because it is often non-trivial to implement realistic physics within the modeling frameworks required by other libraries. Since emcee's release, other libraries have been developed with similar interfaces, such as dynesty (Speagle 2019). The version 3.0 release of emcee is the first major release of the library in about six years, and it includes a full re-write of the computational backend, several commonly requested features, and a set of new move implementations.
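To illustrate the black-box interface described above, the following is a minimal sketch using the emcee version 3 API. The Gaussian log-probability, dimensions, and chain length are placeholders chosen for illustration, not taken from the paper; any Python callable returning a scalar log-probability can be used in their place.

```python
import numpy as np
import emcee

# Hypothetical black-box model: any callable mapping a parameter vector
# to a scalar log-probability works; no structured model graph is needed.
def log_prob(theta):
    return -0.5 * np.sum(theta ** 2)

ndim, nwalkers = 5, 32
p0 = np.random.randn(nwalkers, ndim)   # initial positions of the walkers

sampler = emcee.EnsembleSampler(nwalkers, ndim, log_prob)
sampler.run_mcmc(p0, 1000)

# Flattened chain of shape (nwalkers * nsteps, ndim)
samples = sampler.get_chain(flat=True)
```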