
Bayesian model comparison in cosmology with Population Monte Carlo

Posted by Martin Kilbinger
Publication date: 2009
Research field: Physics
Paper language: English





We use Bayesian model selection techniques to test extensions of the standard flat $\Lambda$CDM paradigm. Dark-energy and curvature scenarios, and primordial perturbation models are considered. To that end, we calculate the Bayesian evidence in favour of each model using Population Monte Carlo (PMC), a new adaptive sampling technique which was recently applied in a cosmological context. The Bayesian evidence is immediately available from the PMC sample used for parameter estimation without further computational effort, and it comes with an associated error evaluation. Moreover, it provides an unbiased estimator of the evidence after any fixed number of iterations, and it is naturally parallelizable, in contrast with MCMC and nested-sampling methods. By comparison with analytical predictions for simulated data, we show that our results obtained with PMC are reliable and robust. The variability in the evidence evaluation and the stability for various cases are estimated both from simulations and from data. For the cases we consider, the log-evidence is calculated with a precision of better than 0.08. Using a combined set of recent CMB, SNIa and BAO data, we find inconclusive evidence between flat $\Lambda$CDM and simple dark-energy models. A curved Universe is moderately to strongly disfavoured with respect to a flat cosmology. Using physically well-motivated priors within the slow-roll approximation of inflation, we find a weak preference for a running spectral index. A Harrison-Zeldovich spectrum is weakly disfavoured. With the current data, tensor modes are not detected; the large prior volume on the tensor-to-scalar ratio r results in moderate evidence in favour of r=0. [Abridged]
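Purely as a minimal sketch of the mechanics (an illustrative toy likelihood, a flat prior box, and a single fixed Gaussian proposal; not the paper's likelihoods and not the adapted mixture proposals that PMC actually uses), the following shows how the evidence estimate and its Monte Carlo error fall out of the same importance sample used for parameter estimation:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    # Toy two-parameter problem (illustrative, not a cosmological likelihood):
    # flat prior on the box [-5, 5]^2 and a Gaussian "likelihood" at the origin.
    prior_lo, prior_hi = -5.0, 5.0
    prior_density = 1.0 / (prior_hi - prior_lo) ** 2
    likelihood = stats.multivariate_normal(mean=[0.0, 0.0], cov=0.5 * np.eye(2))

    # Importance-sampling proposal; in PMC this is a mixture that gets adapted
    # over the iterations, here a single fixed Gaussian for brevity.
    proposal = stats.multivariate_normal(mean=[0.5, -0.5], cov=2.0 * np.eye(2))

    N = 20_000
    samples = proposal.rvs(size=N, random_state=rng)

    # Importance weights w_i = likelihood(theta_i) * prior(theta_i) / proposal(theta_i),
    # with zero prior (hence zero weight) outside the prior box.
    log_w = likelihood.logpdf(samples) + np.log(prior_density) - proposal.logpdf(samples)
    inside = np.all((samples >= prior_lo) & (samples <= prior_hi), axis=1)
    w = np.where(inside, np.exp(log_w), 0.0)

    # The evidence estimate is simply the mean of the unnormalised weights, so it
    # comes for free from the sample already used for parameter estimation, and
    # its Monte Carlo error follows from the spread of the weights.
    evidence = w.mean()
    rel_err = w.std(ddof=1) / np.sqrt(N) / evidence
    print(f"log-evidence = {np.log(evidence):.3f} +/- {rel_err:.3f}")

Because the estimator is a plain average of independent weights, it is unbiased after any fixed number of iterations and its uncertainty can be read off directly, which is the property the abstract contrasts with MCMC and nested sampling.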




Read also

We present the public release of the Bayesian sampling algorithm for cosmology, CosmoPMC (Cosmology Population Monte Carlo). CosmoPMC explores the parameter space of various cosmological probes, and also provides a robust estimate of the Bayesian evidence. CosmoPMC is based on an adaptive importance sampling method called Population Monte Carlo (PMC). Various cosmology likelihood modules are implemented, and new modules can be added easily. The importance-sampling algorithm is written in C, and fully parallelised using the Message Passing Interface (MPI). Due to very little overhead, the sampling speed-up scales approximately with the number of CPUs. The CosmoPMC package contains post-processing and plotting programs, and in addition a Markov chain Monte Carlo (MCMC) algorithm. The sampling engine is implemented in the library pmclib, and can be used independently. The software is available for download at http://www.cosmopmc.info.
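CosmoPMC itself is C/MPI code; the toy Python sketch below (an illustrative stand-in likelihood and a generic process pool instead of MPI, none of it CosmoPMC's actual interface) only illustrates why the method parallelises so well: the likelihood evaluations for the proposal samples are independent, so wall-clock time drops roughly in proportion to the number of workers.

    import numpy as np
    from concurrent.futures import ProcessPoolExecutor

    def expensive_log_likelihood(theta):
        # Stand-in for an expensive cosmology likelihood evaluation (illustrative
        # only; CosmoPMC's likelihood modules are C code behind its own interfaces).
        return -0.5 * float(np.sum(theta ** 2))

    def evaluate_chunk(chunk):
        # Each worker handles its own chunk of proposal samples; the evaluations
        # do not depend on one another, which makes PMC embarrassingly parallel.
        return [expensive_log_likelihood(theta) for theta in chunk]

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        samples = rng.normal(size=(8000, 4))        # toy proposal draws
        n_workers = 4
        chunks = np.array_split(samples, n_workers)

        with ProcessPoolExecutor(max_workers=n_workers) as pool:
            log_like = np.concatenate(list(pool.map(evaluate_chunk, chunks)))

        print(log_like.shape)   # (8000,) log-likelihoods, evaluated in parallel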
Weak lensing by large-scale structure is a powerful probe of cosmology and of the dark universe. This cosmic shear technique relies on the accurate measurement of the shapes and redshifts of background galaxies and requires precise control of systematic errors. The Monte Carlo Control Loops (MCCL) is a forward modelling method designed to tackle this problem. It relies on the Ultra Fast Image Generator (UFig) to produce simulated images tuned to match the target data statistically, followed by calibrations and tolerance loops. We present the first end-to-end application of this method, on the Dark Energy Survey (DES) Year 1 wide field imaging data. We simultaneously measure the shear power spectrum $C_\ell$ and the redshift distribution $n(z)$ of the background galaxy sample. The method includes maps of the systematic sources, Point Spread Function (PSF), an Approximate Bayesian Computation (ABC) inference of the simulation model parameters, a shear calibration scheme, and the fast estimation of the covariance matrix. We find a close statistical agreement between the simulations and the DES Y1 data using an array of diagnostics. In a non-tomographic setting, we derive a set of $C_\ell$ and $n(z)$ curves that encode the cosmic shear measurement, as well as the systematic uncertainty. Following a blinding scheme, we measure the combination of $\Omega_m$, $\sigma_8$, and intrinsic alignment amplitude $A_{\rm IA}$, defined as $S_8 D_{\rm IA} = \sigma_8(\Omega_m/0.3)^{0.5}D_{\rm IA}$, where $D_{\rm IA}=1-0.11(A_{\rm IA}-1)$. We find $S_8 D_{\rm IA}=0.895^{+0.054}_{-0.039}$, where systematics are at the level of roughly 60% of the statistical errors. We discuss these results in the context of earlier cosmic shear analyses of the DES Y1 data. Our findings indicate that this method and its fast runtime offer good prospects for cosmic shear measurements with future wide-field surveys.
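The quoted constraint combines the three parameters through the two definitions given in the abstract; a small hypothetical helper makes that dependence explicit (the example parameter values are illustrative, not the DES Y1 best fit):

    def s8_d_ia(omega_m, sigma_8, a_ia):
        # Definitions quoted in the abstract:
        #   D_IA     = 1 - 0.11 (A_IA - 1)
        #   S_8 D_IA = sigma_8 * (Omega_m / 0.3)^0.5 * D_IA
        d_ia = 1.0 - 0.11 * (a_ia - 1.0)
        return sigma_8 * (omega_m / 0.3) ** 0.5 * d_ia

    # Illustrative inputs (not the DES Y1 best-fit values):
    print(s8_d_ia(omega_m=0.30, sigma_8=0.85, a_ia=1.0))   # -> 0.85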
Despite the ability of the cosmological concordance model ($\Lambda$CDM) to describe the cosmological observations exceedingly well, power law expansion of the Universe scale radius, $R(t)\propto t^n$, has been proposed as an alternative framework. We examine here these models, analyzing their ability to fit cosmological data using robust model comparison criteria. Type Ia supernovae (SNIa), baryonic acoustic oscillations (BAO) and acoustic scale information from the cosmic microwave background (CMB) have been used. We find that SNIa data either alone or combined with BAO can be well reproduced by both $\Lambda$CDM and power law expansion models with $n\sim 1.5$, while the constant expansion rate model $(n=1)$ is clearly disfavored. Allowing for some redshift evolution in the SNIa luminosity essentially removes any clear preference for a specific model. The CMB data are well known to provide the most stringent constraints on standard cosmological models, in particular, through the position of the first peak of the temperature angular power spectrum, corresponding to the sound horizon at recombination, a scale physically related to the BAO scale. Models with $n\geq 1$ lead to a divergence of the sound horizon and do not naturally provide the relevant scales for the BAO and the CMB. We retain an empirical footing to overcome this issue: we let the data choose the preferred values for these scales, while we recompute the ionization history in power law models, to obtain the distance to the CMB. In doing so, we find that the scale coming from the BAO data is not consistent with the observed position of the first peak of the CMB temperature angular power spectrum for any power law cosmology. Therefore, we conclude that when the three standard probes are combined, the $\Lambda$CDM model is very strongly favored over any of these alternative models, which are then essentially ruled out.
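For a power-law scale factor $R(t)\propto t^n$ the expansion rate is $H(z)=H_0(1+z)^{1/n}$, which is all that is needed to compare low-redshift distances with flat $\Lambda$CDM numerically. The sketch below uses toy parameter values (flat geometry, $H_0$ and $\Omega_m$ chosen only for illustration) and integrates the comoving distance for both expansion histories:

    import numpy as np
    from scipy.integrate import quad

    C_KMS = 299792.458     # speed of light [km/s]
    H0 = 70.0              # Hubble constant [km/s/Mpc], illustrative value

    def E_lcdm(z, omega_m=0.3):
        # H(z)/H0 for flat LambdaCDM.
        return np.sqrt(omega_m * (1.0 + z) ** 3 + (1.0 - omega_m))

    def E_powerlaw(z, n=1.5):
        # For R(t) proportional to t^n one has H(z) = H0 (1 + z)^(1/n).
        return (1.0 + z) ** (1.0 / n)

    def comoving_distance(z, E):
        # D_C = (c / H0) * integral_0^z dz' / E(z')   [Mpc]
        integral, _ = quad(lambda zp: 1.0 / E(zp), 0.0, z)
        return C_KMS / H0 * integral

    for z in (0.5, 1.0, 2.0):
        print(f"z={z}: LCDM {comoving_distance(z, E_lcdm):7.1f} Mpc, "
              f"power law (n=1.5) {comoving_distance(z, E_powerlaw):7.1f} Mpc")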
Monte Carlo methods are widely used for approximating complicated, multidimensional integrals for Bayesian inference. Population Monte Carlo (PMC) is an important class of Monte Carlo methods, which utilizes a population of proposals to generate weighted samples that approximate the target distribution. The generic PMC framework iterates over three steps: samples are simulated from a set of proposals, weights are assigned to such samples to correct for mismatch between the proposal and target distributions, and the proposals are then adapted via resampling from the weighted samples. When the target distribution is expensive to evaluate, PMC faces a computational limitation, since its convergence rate is $\mathcal{O}(N^{-1/2})$. To address this, we propose in this paper a new Population Quasi-Monte Carlo (PQMC) framework, which integrates Quasi-Monte Carlo ideas within the sampling and adaptation steps of PMC. A key novelty in PQMC is the idea of importance support points resampling, a deterministic method for finding an optimal subsample from the weighted proposal samples. Moreover, within the PQMC framework, we develop an efficient covariance adaptation strategy for multivariate normal proposals. Lastly, a new set of correction weights is introduced for the weighted PMC estimator to improve the efficiency from the standard PMC estimator. We demonstrate the improved empirical convergence of PQMC over PMC in extensive numerical simulations and a friction drilling application.
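The motivation for PQMC is that quasi-Monte Carlo point sets converge faster than the $\mathcal{O}(N^{-1/2})$ rate of plain Monte Carlo on sufficiently smooth integrands. The toy comparison below (scrambled Sobol points against pseudo-random points on a simple smooth test function; not the PQMC algorithm itself) illustrates that gap:

    import numpy as np
    from scipy.stats import qmc

    def f(u):
        # Smooth test integrand on [0, 1]^2 with known mean (2/pi)^2.
        return np.prod(np.sin(np.pi * u), axis=1)

    exact = (2.0 / np.pi) ** 2
    rng = np.random.default_rng(2)

    for m in (8, 10, 12):                # sample sizes N = 2^m
        n = 2 ** m
        u_mc = rng.random((n, 2))                                  # pseudo-random points
        u_qmc = qmc.Sobol(d=2, scramble=True, seed=2).random(n)    # scrambled Sobol points
        err_mc = abs(f(u_mc).mean() - exact)
        err_qmc = abs(f(u_qmc).mean() - exact)
        print(f"N={n:5d}   MC error = {err_mc:.2e}   QMC error = {err_qmc:.2e}")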
Online solvers for partially observable Markov decision processes have difficulty scaling to problems with large action spaces. Monte Carlo tree search with progressive widening attempts to improve scaling by sampling from the action space to construct a policy search tree. The performance of progressive widening search is dependent upon the action sampling policy, often requiring problem-specific samplers. In this work, we present a general method for efficient action sampling based on Bayesian optimization. The proposed method uses a Gaussian process to model a belief over the action-value function and selects the action that will maximize the expected improvement in the optimal action value. We implement the proposed approach in a new online tree search algorithm called Bayesian Optimized Monte Carlo Planning (BOMCP). Several experiments show that BOMCP is better able to scale to large action space POMDPs than existing state-of-the-art tree search solvers.
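As a self-contained sketch of the action-selection idea only (a hand-rolled one-dimensional Gaussian process and expected-improvement rule; the kernel, action space, and values are made-up illustrations, not BOMCP's implementation or interface):

    import numpy as np
    from scipy.stats import norm

    def rbf(a, b, length=0.3):
        # Squared-exponential kernel on a 1-D action space (illustrative choice).
        return np.exp(-0.5 * ((a[:, None] - b[None, :]) / length) ** 2)

    def gp_posterior(tried, values, candidates, noise=1e-3):
        # Gaussian-process regression: posterior mean and std of the action value
        # at candidate actions, conditioned on the actions already evaluated.
        K = rbf(tried, tried) + noise * np.eye(len(tried))
        K_inv = np.linalg.inv(K)
        Ks = rbf(candidates, tried)
        mean = Ks @ K_inv @ values
        var = 1.0 - np.einsum("ij,ij->i", Ks @ K_inv, Ks)
        return mean, np.sqrt(np.clip(var, 1e-12, None))

    def expected_improvement(mean, std, best):
        # Expected improvement over the best value seen so far (maximization).
        z = (mean - best) / std
        return (mean - best) * norm.cdf(z) + std * norm.pdf(z)

    # Toy state of the search tree: a few actions with made-up value estimates.
    tried = np.array([0.1, 0.4, 0.8])
    values = np.array([0.2, 0.9, 0.5])
    candidates = np.linspace(0.0, 1.0, 101)

    mu, sigma = gp_posterior(tried, values, candidates)
    next_action = candidates[np.argmax(expected_improvement(mu, sigma, values.max()))]
    print(f"action to add to the tree next: {next_action:.2f}")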