
Solar Bayesian Analysis Toolkit -- a new Markov chain Monte Carlo IDL code for Bayesian parameter inference

Posted by: Sergey Anfinogentov
Publication date: 2020
Research field: Physics
Paper language: English





We present the Solar Bayesian Analysis Toolkit (SoBAT), a new, easy-to-use tool for Bayesian analysis of observational data, including parameter inference and model comparison. SoBAT is aimed at (but not limited to) the analysis of solar observational data. We describe a new Interactive Data Language (IDL) code designed to facilitate the comparison of user-supplied models with data. Bayesian inference allows prior information to be taken into account. The use of Markov chain Monte Carlo (MCMC) sampling allows efficient exploration of large parameter spaces and provides reliable estimates of model parameters and their uncertainties. The Bayesian evidence for different models can be used for quantitative model comparison. The code is tested to demonstrate its ability to accurately recover a variety of parameter probability distributions. Its application to practical problems is demonstrated using studies of the structure and oscillation of coronal loops.
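SoBAT itself is an IDL package, and its actual calling interface is not reproduced here. As a rough illustration of the workflow the abstract describes — a user-supplied model, priors on its parameters, and MCMC sampling of the posterior — the following is a minimal Python sketch using a random-walk Metropolis-Hastings sampler. The damped-oscillation model, prior ranges, noise level, and step sizes are invented for the example.

```python
# Illustrative sketch of the Bayesian workflow described above
# (SoBAT itself is an IDL package; nothing here is its actual API).
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "observations": a damped oscillation, a toy stand-in for a
# coronal loop oscillation signal (all values invented).
t = np.linspace(0, 30, 200)
y = 2.0 * np.cos(2 * np.pi * t / 5.0) * np.exp(-t / 15.0)
y += rng.normal(0.0, 0.3, t.size)

def model(t, theta):
    amp, period, tau = theta
    return amp * np.cos(2 * np.pi * t / period) * np.exp(-t / tau)

def log_prior(theta):
    amp, period, tau = theta
    # Flat priors on boxes encode the prior information the abstract mentions.
    ok = (0 < amp < 10) and (1 < period < 20) and (1 < tau < 100)
    return 0.0 if ok else -np.inf

def log_posterior(theta, sigma=0.3):
    lp = log_prior(theta)
    if not np.isfinite(lp):
        return -np.inf
    resid = y - model(t, theta)
    return lp - 0.5 * np.sum((resid / sigma) ** 2)

# Random-walk Metropolis-Hastings. Start near the apparent period:
# period likelihoods are multimodal, so a sensible initial guess matters.
theta = np.array([1.5, 4.8, 12.0])
logp = log_posterior(theta)
step = np.array([0.1, 0.05, 1.0])
chain = []
for _ in range(20000):
    prop = theta + step * rng.standard_normal(3)
    logp_prop = log_posterior(prop)
    if np.log(rng.random()) < logp_prop - logp:   # MH accept/reject
        theta, logp = prop, logp_prop
    chain.append(theta)
chain = np.array(chain[5000:])                    # discard burn-in

# Parameter estimates and uncertainties from the posterior sample.
for name, col in zip(("amp", "period", "tau"), chain.T):
    lo, med, hi = np.percentile(col, [16, 50, 84])
    print(f"{name}: {med:.2f} (+{hi - med:.2f} / -{med - lo:.2f})")
```

With flat priors the posterior mode coincides with the least-squares fit; the value of the Bayesian machinery, as the abstract notes, is the full posterior (hence credible intervals) and the evidence for model comparison, which this sketch omits.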




Read also

This work discusses the implementation of Markov chain Monte Carlo (MCMC) sampling from an arbitrary Gaussian mixture model (GMM) within SRAM. We show a novel SRAM architecture that embeds random number generators (RNGs), digital-to-analog converters (DACs), and analog-to-digital converters (ADCs), so that SRAM arrays can be used for high-performance Metropolis-Hastings (MH) algorithm-based MCMC sampling. Most of the expensive computations are performed within the SRAM and can be parallelized for high-speed sampling. Our iterative compute flow minimizes data movement during sampling. We characterize the power-performance trade-off of our design by simulating it in a 45 nm CMOS technology. For a two-dimensional, two-mixture GMM, the implementation consumes ~91 microwatts per sampling iteration and produces 500 samples in 2000 clock cycles on average at a 1 GHz clock frequency. Our study highlights interesting insights into how low-level hardware non-idealities can affect high-level sampling characteristics, and recommends ways to operate SRAM optimally within area/power constraints for high-performance sampling.
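For reference, the sampling task that this hardware accelerates has a compact software expression. The Python sketch below runs random-walk Metropolis-Hastings against a two-dimensional, two-component GMM and draws 500 samples, mirroring the workload quoted above; the mixture weights, means, and proposal scale are arbitrary assumptions, not the paper's configuration.

```python
# Software reference for the sampling task described above: random-walk
# Metropolis-Hastings targeting a 2-D, two-component Gaussian mixture.
import numpy as np
from scipy.special import logsumexp

rng = np.random.default_rng(1)

# Arbitrary example mixture: weights, means, shared isotropic variance.
weights = np.array([0.4, 0.6])
means = np.array([[-2.0, 0.0], [2.0, 1.0]])
var = 1.0

def log_gmm(x):
    # log p(x) = logsumexp_k [ log w_k + log N(x | mu_k, var * I) ]
    sq = np.sum((x - means) ** 2, axis=1)
    return logsumexp(np.log(weights) - 0.5 * sq / var - np.log(2 * np.pi * var))

x = np.zeros(2)
logp = log_gmm(x)
samples = []
for _ in range(500):                  # 500 samples, as quoted above
    prop = x + 0.8 * rng.standard_normal(2)
    logp_prop = log_gmm(prop)
    if np.log(rng.random()) < logp_prop - logp:   # MH accept/reject
        x, logp = prop, logp_prop
    samples.append(x)
samples = np.array(samples)
print("sample mean:", samples.mean(axis=0))
```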
Bayesian inference for nonlinear diffusions, observed at discrete times, is a challenging task that has prompted the development of a number of algorithms, mainly within the computational statistics community. We propose a new direction, and accompanying methodology, borrowing ideas from statistical physics and computational chemistry, for inferring the posterior distribution of latent diffusion paths and model parameters, given observations of the process. Joint configurations of the underlying process noise and of parameters, mapping onto diffusion paths consistent with observations, form an implicitly defined manifold. Then, by making use of a constrained Hamiltonian Monte Carlo algorithm on the embedded manifold, we are able to perform computationally efficient inference for an extensive class of discretely observed diffusion models. Critically, in contrast with other approaches proposed in the literature, our methodology is highly automated, requiring minimal user intervention and applying alike in a range of settings, including: elliptic or hypo-elliptic systems; observations with or without noise; linear or non-linear observation operators. Exploiting Markovianity, we propose a variant of the method with complexity that scales linearly in the resolution of path discretisation and the number of observation times.
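The constrained-sampling machinery is easiest to see on a toy manifold. Below is a minimal Python sketch of a RATTLE-style constrained leapfrog inside HMC, targeting the uniform distribution on the unit circle; it is a stand-in for the paper's implicitly defined manifold of noise/parameter configurations, where the position projection would require a Newton solve rather than the closed form available here.

```python
# Toy constrained HMC: sample the uniform distribution on the circle
# g(q) = |q|^2 - 1 = 0 with a RATTLE-style leapfrog (potential V = 0).
import numpy as np

rng = np.random.default_rng(2)

def rattle_step(q, p, h):
    # Position update q_new = q + h*(p - lam*grad g(q)), with lam chosen
    # so |q_new| = 1. Using q.p = 0 and |q| = 1, this gives the closed
    # form q_new = a*q + h*p with a = sqrt(1 - h^2 |p|^2).
    a2 = 1.0 - h**2 * np.dot(p, p)
    if a2 <= 0:                        # step too long for this momentum
        return None
    q_new = np.sqrt(a2) * q + h * p
    p_half = (q_new - q) / h
    # Momentum projection onto the tangent space at q_new.
    p_new = p_half - np.dot(q_new, p_half) * q_new
    return q_new, p_new

q = np.array([1.0, 0.0])
angles = []
for _ in range(5000):
    # Resample momentum in the tangent space: p ~ N(0, I), projected.
    p = rng.standard_normal(2)
    p -= np.dot(q, p) * q
    H0 = 0.5 * np.dot(p, p)
    q_prop, p_prop, ok = q, p, True
    for _ in range(10):                # 10 leapfrog steps of size 0.1
        out = rattle_step(q_prop, p_prop, h=0.1)
        if out is None:
            ok = False                 # failed projection => reject
            break
        q_prop, p_prop = out
    if ok:
        H1 = 0.5 * np.dot(p_prop, p_prop)
        if np.log(rng.random()) < H0 - H1:   # MH energy correction
            q = q_prop
    angles.append(np.arctan2(q[1], q[0]))
# The histogram of `angles` should be flat over (-pi, pi].
```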
Density-functional theory is widely used to predict the physical properties of materials. However, it usually fails for strongly correlated materials. A popular solution is to use the Hubbard corrections to treat strongly correlated electronic states. Unfortunately, the exact values of the Hubbard $U$ and $J$ parameters are initially unknown, and they can vary from one material to another. In this semi-empirical study, we explore the $U$ and $J$ parameter space of a group of iron-based compounds to simultaneously improve the prediction of physical properties (volume, magnetic moment, and bandgap). We used a Bayesian calibration assisted by Markov chain Monte Carlo sampling for three different exchange-correlation functionals (LDA, PBE, and PBEsol). We found that LDA requires the largest $U$ correction. PBE has the smallest standard deviation and its $U$ and $J$ parameters are the most transferable to other iron-based compounds. Lastly, PBE predicts lattice parameters reasonably well without the Hubbard correction.
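The calibration loop itself reduces to ordinary MCMC over the (U, J) space with a likelihood comparing predicted and reference properties. A hedged Python sketch follows: `predict_properties` is a hypothetical placeholder for a real DFT+U calculation, and the reference values and error scales are invented for illustration.

```python
# Sketch of Bayesian calibration of Hubbard (U, J) against reference
# properties. `predict_properties` is a hypothetical stand-in for a
# real DFT+U run; all numbers below are invented.
import numpy as np

rng = np.random.default_rng(3)

ref = np.array([11.8, 4.1, 2.2])      # volume, moment, gap targets (made up)
sigma = np.array([0.3, 0.2, 0.15])    # tolerated misfit per property (made up)

def predict_properties(U, J):
    # Placeholder smooth response surface; a real study would call a
    # DFT+U code here (the expensive step the MCMC has to budget for).
    return np.array([11.0 + 0.20 * U - 0.10 * J,
                     3.5 + 0.15 * U,
                     1.5 + 0.25 * U - 0.30 * J])

def log_post(theta):
    U, J = theta
    if not (0 <= U <= 10 and 0 <= J <= 3):   # flat priors on a box
        return -np.inf
    resid = (predict_properties(U, J) - ref) / sigma
    return -0.5 * np.sum(resid**2)

theta = np.array([4.0, 1.0])
logp = log_post(theta)
chain = []
for _ in range(20000):
    prop = theta + rng.normal(0.0, [0.3, 0.1])
    lp = log_post(prop)
    if np.log(rng.random()) < lp - logp:
        theta, logp = prop, lp
    chain.append(theta)
chain = np.array(chain[5000:])
print("U = %.2f +/- %.2f eV" % (chain[:, 0].mean(), chain[:, 0].std()))
print("J = %.2f +/- %.2f eV" % (chain[:, 1].mean(), chain[:, 1].std()))
```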
Markov chain Monte Carlo (MCMC) produces a correlated sample for estimating expectations with respect to a target distribution. A fundamental question is when should sampling stop so that we have good estimates of the desired quantities? The key to answering this question lies in assessing the Monte Carlo error through a multivariate Markov chain central limit theorem (CLT). The multivariate nature of this Monte Carlo error largely has been ignored in the MCMC literature. We present a multivariate framework for terminating simulation in MCMC. We define a multivariate effective sample size, estimating which requires strongly consistent estimators of the covariance matrix in the Markov chain CLT; a property we show for the multivariate batch means estimator. We then provide a lower bound on the number of minimum effective samples required for a desired level of precision. This lower bound depends on the problem only in the dimension of the expectation being estimated, and not on the underlying stochastic process. This result is obtained by drawing a connection between terminating simulation via effective sample size and terminating simulation using a relative standard deviation fixed-volume sequential stopping rule; which we demonstrate is an asymptotically valid procedure. The finite sample properties of the proposed method are demonstrated in a variety of examples.
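The two estimators at the core of this procedure are short to state in code. The Python sketch below implements the multivariate batch means estimator of the asymptotic covariance and the multivariate effective sample size, mESS = n (det Λ / det Σ)^(1/p), together with the minimum-ESS lower bound W = 2^(2/p) π (p Γ(p/2))^(-2/p) χ²_{1−α,p} / ε². The sqrt(n) batch size and the AR(1) demo chain are illustrative choices.

```python
# Multivariate effective sample size via the batch means estimator, and
# the minimum-ESS lower bound for a desired relative precision.
import numpy as np
from scipy.special import gamma
from scipy.stats import chi2

def batch_means_cov(chain):
    """Multivariate batch means estimator, batch size ~ sqrt(n)."""
    n, p = chain.shape
    b = int(np.floor(np.sqrt(n)))          # batch size
    a = n // b                             # number of batches
    means = chain[: a * b].reshape(a, b, p).mean(axis=1)
    centered = means - chain[: a * b].mean(axis=0)
    return b * centered.T @ centered / (a - 1)

def multi_ess(chain):
    """mESS = n * (det Lambda / det Sigma)^(1/p)."""
    n, p = chain.shape
    lam = np.cov(chain.T)                  # sample covariance Lambda
    sig = batch_means_cov(chain)           # asymptotic covariance Sigma
    _, ld_lam = np.linalg.slogdet(lam)
    _, ld_sig = np.linalg.slogdet(sig)
    return n * np.exp((ld_lam - ld_sig) / p)

def min_ess(p, alpha=0.05, eps=0.05):
    """Lower bound on effective samples for relative precision eps."""
    return (2 ** (2 / p) * np.pi / (p * gamma(p / 2)) ** (2 / p)
            * chi2.ppf(1 - alpha, p) / eps ** 2)

# Demo on a 2-D AR(1) chain (a correlated toy Markov chain).
rng = np.random.default_rng(4)
n, rho = 20000, 0.7
x = np.zeros((n, 2))
for i in range(1, n):
    x[i] = rho * x[i - 1] + rng.standard_normal(2)

print("mESS   :", multi_ess(x))
print("min ESS:", min_ess(p=2))   # stop once mESS exceeds this bound
```

For p = 1 and the defaults α = 0.05, ε = 0.05, `min_ess` returns roughly 6146, the familiar univariate figure, which is a quick sanity check on the bound.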
We present orbital elements and mass sums for eighteen visual binary stars of spectral types B to K (five of which are new orbits) with periods ranging from 20 to more than 500 yr. For two double-line spectroscopic binaries with no previous orbits, the individual component masses, using combined astrometric and radial velocity data, have a formal uncertainty of ~0.1 MSun. Adopting published photometry, and trigonometric parallaxes, plus our own measurements, we place these objects on an H-R diagram, and discuss their evolutionary status. These objects are part of a survey to characterize the binary population of stars in the Southern Hemisphere, using the SOAR 4m telescope+HRCAM at CTIO. Orbital elements are computed using a newly developed Markov Chain Monte Carlo algorithm that delivers maximum likelihood estimates of the parameters, as well as posterior probability density functions that allow us to evaluate the uncertainty of our derived parameters in a robust way. For spectroscopic binaries, using our approach, it is possible to derive a self-consistent parallax for the system from the combined astrometric plus radial velocity data (orbital parallax), which compares well with the trigonometric parallaxes. We also present a mathematical formalism that allows a dimensionality reduction of the feature space from seven to three search parameters (or from ten to seven dimensions - including parallax - in the case of spectroscopic binaries with astrometric data), which makes it possible to explore a smaller number of parameters in each case, improving the computational efficiency of our Markov Chain Monte Carlo code.
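The dimensionality-reduction idea — solving exactly for the model parameters that enter linearly, so the chain searches only the nonlinear ones — can be shown in a generic toy. In the Python sketch below, a four-parameter sinusoid-plus-offset fit collapses to a one-dimensional MCMC search over the period, with the linear coefficients recovered by least squares at each step. The data, prior range, and proposal scale are invented, and the sketch profiles the likelihood rather than reproducing the paper's specific seven-to-three orbital formalism.

```python
# Generic illustration of the dimensionality reduction described above:
# linearly-entering parameters are solved by least squares inside the
# likelihood, so MCMC explores only the nonlinear parameter (here, P).
import numpy as np

rng = np.random.default_rng(5)

# Synthetic data: offset + sinusoid with true period 7.3 (all made up).
t = np.sort(rng.uniform(0, 50, 120))
w = 2 * np.pi * t / 7.3
y = 1.0 + 2.0 * np.sin(w) + 0.5 * np.cos(w) + rng.normal(0, 0.3, t.size)

def design(P):
    # Columns multiply the linear parameters (offset, A, B).
    w = 2 * np.pi * t / P
    return np.column_stack([np.ones_like(t), np.sin(w), np.cos(w)])

def profiled_log_like(P, sigma=0.3):
    # Solve the linear parameters exactly, then score the residuals.
    X = design(P)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ beta
    return -0.5 * np.sum((r / sigma) ** 2)

# 1-D Metropolis-Hastings over the period only; start near the true
# mode, since period likelihoods are strongly multimodal.
P, logp = 7.0, profiled_log_like(7.0)
chain = []
for _ in range(20000):
    P_new = P + 0.1 * rng.standard_normal()
    lp = profiled_log_like(P_new) if 2.0 < P_new < 20.0 else -np.inf
    if np.log(rng.random()) < lp - logp:
        P, logp = P_new, lp
    chain.append(P)
print("P = %.3f +/- %.3f" % (np.mean(chain), np.std(chain)))
```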