Since its debut by John Skilling in 2004, nested sampling has proven a valuable tool for scientists, providing evidence calculations for hypothesis comparison and parameter inference for complicated posterior distributions, particularly in astronomy. Due to its computational cost and long-running nature, nested sampling has in the past been reserved for offline Bayesian inference, leaving tools such as variational inference and MCMC for online, time-constrained Bayesian computations. Those tools do not easily handle complicated multi-modal posteriors, discrete random variables, or posteriors lacking gradients, nor do they enable practical calculation of the Bayesian evidence. An opening thus remains for a high-performance, out-of-the-box nested sampling package that can close the gap in computational time and let nested sampling become commonplace in the data science toolbox. We present JAX-based nested sampling (JAXNS), a high-performance nested sampling package written in XLA primitives using JAX, and show that it is several orders of magnitude faster than the currently available nested sampling implementations PolyChord, MultiNest, and dynesty, while maintaining the same accuracy of evidence calculation. The JAXNS package is publicly available at https://github.com/joshuaalbert/jaxns.
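To make the evidence calculation concrete, the following is a minimal illustrative sketch of the nested sampling algorithm itself — not the JAXNS implementation, and none of the variable names or the toy model below come from the package. It estimates ln Z for a standard-normal likelihood under a uniform prior on [-5, 5], where the analytic answer is ln Z ≈ ln(0.1). Because the likelihood-constrained region is a known interval in this toy problem, the constrained prior draw is done directly rather than with the MCMC or region-based schemes real samplers use:

```python
import numpy as np

rng = np.random.default_rng(0)

def log_likelihood(x):
    # Standard normal log-likelihood.
    return -0.5 * x**2 - 0.5 * np.log(2.0 * np.pi)

LO, HI = -5.0, 5.0          # uniform prior support; prior density 1/10
n_live, n_iter = 200, 2000

live = rng.uniform(LO, HI, n_live)
live_logL = log_likelihood(live)

log_Z = -np.inf
log_X_prev = 0.0            # log of the enclosed prior volume, starts at 1
for i in range(1, n_iter + 1):
    worst = np.argmin(live_logL)
    logL_min = live_logL[worst]
    # Deterministic volume shrinkage: X_i = exp(-i / n_live).
    log_X = -i / n_live
    log_w = np.log(np.exp(log_X_prev) - np.exp(log_X))
    log_Z = np.logaddexp(log_Z, logL_min + log_w)
    log_X_prev = log_X
    # Replace the worst point with a draw from the constrained prior.
    # Here the constraint logL > logL_min is just |x| < r, so we can
    # sample it exactly instead of running a constrained MCMC chain.
    r = np.sqrt(-2.0 * (logL_min + 0.5 * np.log(2.0 * np.pi)))
    live[worst] = rng.uniform(-min(r, HI), min(r, HI))
    live_logL[worst] = log_likelihood(live[worst])

# Spread the remaining prior volume evenly over the surviving live points.
log_w_final = log_X_prev - np.log(n_live)
for logL in live_logL:
    log_Z = np.logaddexp(log_Z, logL + log_w_final)

print(log_Z)  # should be close to ln(0.1) ≈ -2.30
```

The sorting-and-shrinking loop above is the part JAXNS expresses in XLA primitives, which is what allows the whole run to be JIT-compiled and vectorised rather than executed as a slow Python loop.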