
Quantifying dimensionality: Bayesian cosmological model complexities

Published by: W.J. Handley
Publication date: 2019
Research field: Physics
Paper language: English





We demonstrate a measure for the effective number of parameters constrained by a posterior distribution in the context of cosmology. In the same way that the mean of the Shannon information (i.e. the Kullback-Leibler divergence) provides a measure of the strength of constraint between prior and posterior, we show that the variance of the Shannon information gives a measure of the dimensionality of constraint. We examine this quantity in a cosmological context, applying it to likelihoods derived from Cosmic Microwave Background, large-scale structure and supernovae data. We show that this measure of Bayesian model dimensionality compares favourably, both analytically and numerically, with the existing measure of model complexity used in the literature.
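
The dimensionality measure described in the abstract can be estimated directly from posterior samples. The sketch below is a minimal illustration of the idea (our own code, not the authors'): the Shannon information differs from the log-likelihood only by the constant log-evidence, so twice the posterior variance of the log-likelihood estimates the Bayesian model dimensionality.

    import numpy as np

    # Minimal sketch (not the paper's code): estimate the Bayesian model
    # dimensionality d = 2 * Var_P[ ln(P/pi) ] from equally-weighted posterior
    # samples.  Since ln(P/pi) = ln L - ln Z and the evidence Z is constant,
    # the posterior variance of the log-likelihood is all that is needed.

    def bayesian_model_dimensionality(loglikes):
        """Return 2 * Var_P(ln L) computed over posterior samples."""
        loglikes = np.asarray(loglikes, dtype=float)
        return 2.0 * np.var(loglikes)

    # Toy check: a Gaussian likelihood constraining n parameters against a broad
    # prior should give a dimensionality close to n.
    rng = np.random.default_rng(0)
    n = 6
    theta = rng.standard_normal((100_000, n))      # posterior samples
    loglike = -0.5 * np.sum(theta**2, axis=1)      # Gaussian ln L up to a constant
    print(bayesian_model_dimensionality(loglike))  # ~ 6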




Read also

We provide a new interpretation for the Bayes factor combination used in the Dark Energy Survey (DES) first year analysis to quantify the tension between the DES and Planck datasets. The ratio quantifies a Bayesian confidence in our ability to combine the datasets. This interpretation is prior-dependent, with wider prior widths boosting the confidence. We therefore propose that if there are any reasonable priors which reduce the confidence to below unity, then we cannot assert that the datasets are compatible. Computing the evidence ratios for the DES first year analysis and Planck, given that narrower priors drop the confidence to below unity, we conclude that DES and Planck are, in a Bayesian sense, incompatible under $\Lambda$CDM. Additionally we compute ratios which confirm the consensus that measurements of the acoustic scale by the Baryon Oscillation Spectroscopic Survey (SDSS) are compatible with Planck, whilst direct measurements of the acceleration rate of the Universe by the SH0ES collaboration are not. We propose a modification to the Bayes ratio which removes the prior dependency using Kullback-Leibler divergences, and using this statistical test find Planck in strong tension with SH0ES, in moderate tension with DES, and in no tension with SDSS. We propose this statistic as the optimal way to compare datasets, ahead of the next DES data releases, as well as future surveys. Finally, as an element of these calculations, we introduce in a cosmological setting the Bayesian model dimensionality, which is a parameterisation-independent measure of the number of parameters that a given dataset constrains.
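
As a rough illustration of the statistics quoted above, the sketch below (our notation, with placeholder numbers rather than DES or Planck values) combines log-evidences and Kullback-Leibler divergences into the Bayes ratio R, the prior-independent modification log S = log R - log I, and an approximate tension probability obtained by assuming that d - 2 log S follows a chi-squared distribution with d the Bayesian model dimensionality.

    import numpy as np
    from scipy.stats import chi2

    def log_R(logZ_AB, logZ_A, logZ_B):
        """Bayes ratio R = Z_AB / (Z_A * Z_B) in log form (prior-dependent)."""
        return logZ_AB - logZ_A - logZ_B

    def log_S(logZ_AB, logZ_A, logZ_B, D_A, D_B, D_AB):
        """Prior-independent statistic log S = log R - log I, with log I = D_A + D_B - D_AB."""
        return log_R(logZ_AB, logZ_A, logZ_B) - (D_A + D_B - D_AB)

    def tension_probability(logS, d):
        """Approximate tension probability, assuming d - 2 log S ~ chi^2 with d degrees of freedom."""
        return chi2.sf(d - 2.0 * logS, df=d)

    # Placeholder inputs purely for illustration; real values would come from
    # nested-sampling runs on the individual and combined datasets.
    p = tension_probability(log_S(-10.0, -4.0, -5.5, 3.0, 2.5, 6.0), d=4.0)
    print(f"tension probability ~ {p:.3f}")
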
Analyses of next-generation galaxy data require accurate treatment of systematic effects such as the bias between observed galaxies and the underlying matter density field. However, proposed models of the phenomenon are either numerically expensive or too inaccurate to achieve unbiased inferences of cosmological parameters even at mildly non-linear scales of the data. As an alternative to constructing accurate galaxy bias models, which would require understanding galaxy formation, we propose to construct likelihood distributions for Bayesian forward modeling approaches that are insensitive to linear, scale-dependent bias and provide robustness against model misspecification. We use maximum entropy arguments to construct likelihood distributions designed to account only for correlations between data and inferred quantities. By design these correlations are insensitive to linear galaxy biasing relations, providing the desired robustness. The method is implemented and tested within a Markov Chain Monte Carlo approach. The method is assessed using a halo mock catalog based on standard full, cosmological, N-body simulations. We obtain unbiased and tight constraints on cosmological parameters exploiting only linear cross-correlation rates for $k \le 0.10$ Mpc/h. Tests for halos of masses $\sim 10^{12}\,\mathrm{M}_\odot$ to $\sim 10^{13}\,\mathrm{M}_\odot$ indicate that it is possible to ignore all details of the linear, scale-dependent bias function while obtaining robust constraints on cosmology. Our results provide a promising path forward to analyses of galaxy surveys without the requirement of having to accurately model the details of galaxy biasing but by designing robust likelihoods for the inference.
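
The robustness to linear biasing described above can be illustrated with a toy statistic: a normalised cross-correlation between the observed galaxy field and the modelled matter field is unchanged by any bias factor that is constant within a k-bin. The sketch below is our own toy demonstration of that cancellation, not the maximum-entropy likelihood construction used in the paper.

    import numpy as np

    def binned_cross_correlation(delta_g, delta_m, nbins=8):
        """Normalised cross-correlation r(k) of two fields in spherical k-bins."""
        n = delta_g.shape[0]
        gk = np.fft.rfftn(delta_g)
        mk = np.fft.rfftn(delta_m)
        kfreq = np.fft.fftfreq(n)
        kx, ky = np.meshgrid(kfreq, kfreq, indexing="ij")
        kz = np.fft.rfftfreq(n)
        kmag = np.sqrt(kx[..., None]**2 + ky[..., None]**2 + kz[None, None, :]**2)
        edges = np.linspace(0.0, kmag.max(), nbins + 1)
        r = np.zeros(nbins)
        for i in range(nbins):
            sel = (kmag >= edges[i]) & (kmag < edges[i + 1])
            num = np.real(np.sum(gk[sel] * np.conj(mk[sel])))
            den = np.sqrt(np.sum(np.abs(gk[sel])**2) * np.sum(np.abs(mk[sel])**2))
            r[i] = num / den if den > 0 else 0.0
        return r

    # A linear bias (here constant, the simplest case) drops out of r(k) entirely,
    # so a likelihood built from such correlations is insensitive to it.
    rng = np.random.default_rng(1)
    delta_m = rng.standard_normal((32, 32, 32))
    delta_g = 2.3 * delta_m
    print(binned_cross_correlation(delta_g, delta_m))   # all bins ~ 1
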
Creating accurate and low-noise covariance matrices represents a formidable challenge in modern-day cosmology. We present a formalism to compress arbitrary observables into a small number of bins by projection into a model-specific subspace that minimizes the prior-averaged log-likelihood error. The lower dimensionality leads to a dramatic reduction in covariance matrix noise, significantly reducing the number of mocks that need to be computed. Given a theory model, a set of priors, and a simple model of the covariance, our method works by using singular value decompositions to construct a basis for the observable that is close to Euclidean; by restricting to the first few basis vectors, we can capture almost all the constraining power in a lower-dimensional subspace. Unlike conventional approaches, the method can be tailored for specific analyses and captures non-linearities that are not present in the Fisher matrix, ensuring that the full likelihood can be reproduced. The procedure is validated with full-shape analyses of power spectra from BOSS DR12 mock catalogs, showing that the 96-bin power spectra can be replaced by 12 subspace coefficients without biasing the output cosmology; this allows for accurate parameter inference using only $\sim 100$ mocks. Such decompositions facilitate accurate testing of power spectrum covariances; for the largest BOSS data chunk, we find that: (a) analytic covariances provide accurate models (with or without trispectrum terms); and (b) using the sample covariance from the MultiDark-Patchy mocks incurs a $\sim 0.5\sigma$ shift in $\Omega_m$, unless the subspace projection is applied. The method is easily extended to higher order statistics; the $\sim 2000$-bin bispectrum can be compressed into only $\sim 10$ coefficients, allowing for accurate analyses using few mocks and without having to increase the bin sizes.
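
A schematic of the compression step, under simplifying assumptions (a Gaussian covariance model and theory vectors sampled from the prior; the function names are ours, not the paper's code): whiten the prior-sampled theory predictions with the Cholesky factor of the covariance, take a singular value decomposition, and keep only the leading basis vectors.

    import numpy as np

    def build_subspace(theory_samples, cov, n_keep):
        """Return an (n_keep, n_bins) projection for whitened data, plus the Cholesky factor."""
        L = np.linalg.cholesky(cov)                                   # cov = L L^T
        white = np.linalg.solve(L, (theory_samples - theory_samples.mean(0)).T).T
        _, _, vt = np.linalg.svd(white, full_matrices=False)
        return vt[:n_keep], L

    def compress(data_vector, proj, L):
        """Project a data vector into the low-dimensional subspace."""
        return proj @ np.linalg.solve(L, data_vector)

    # Toy example: 96 'power spectrum' bins compressed to 12 coefficients.
    rng = np.random.default_rng(2)
    nbins, nsamples = 96, 500
    theory = np.cumsum(rng.standard_normal((nsamples, nbins)), axis=1)  # smooth-ish curves
    cov = np.diag(0.1 + 0.01 * np.arange(nbins))
    proj, L = build_subspace(theory, cov, n_keep=12)
    data = theory[0] + rng.multivariate_normal(np.zeros(nbins), cov)
    print(compress(data, proj, L).shape)   # (12,)
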
We present a large-scale Bayesian inference framework to constrain cosmological parameters using galaxy redshift surveys, via an application of the Alcock-Paczynski (AP) test. Our physical model of the non-linearly evolved density field, as probed by galaxy surveys, employs Lagrangian perturbation theory (LPT) to connect Gaussian initial conditions to the final density field, followed by a coordinate transformation to obtain the redshift space representation for comparison with data. We generate realizations of primordial and present-day matter fluctuations given a set of observations. This hierarchical approach encodes a novel AP test, extracting several orders of magnitude more information from the cosmological expansion compared to classical approaches, to infer cosmological parameters and jointly reconstruct the underlying 3D dark matter density field. The novelty of this AP test lies in constraining the comoving-redshift transformation to infer the appropriate cosmology which yields isotropic correlations of the galaxy density field, with the underlying assumption relying purely on the cosmological principle. Such an AP test does not rely explicitly on modelling the full statistics of the field. We verify in depth via simulations that this renders our test robust to model misspecification. This leads to another crucial advantage, namely that the cosmological parameters exhibit extremely weak dependence on the currently unresolved phenomenon of galaxy bias, thereby circumventing a potentially key limitation. This is consequently among the first methods to extract a large fraction of information from statistics other than that of direct density contrast correlations, without being sensitive to the amplitude of density fluctuations. We perform several statistical efficiency and consistency tests on a mock galaxy catalogue, using the SDSS-III survey as template.
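
The geometric core of an Alcock-Paczynski test can be sketched in a few lines (flat LCDM is assumed here; this is an illustration of the principle, not the authors' hierarchical pipeline): the product D_M(z) H(z)/c differs between a trial and the true cosmology, so mapping redshifts to comoving coordinates with the wrong cosmology imprints anisotropy on otherwise isotropic clustering.

    import numpy as np
    from scipy.integrate import quad

    def E(z, om):
        """Dimensionless Hubble rate H(z)/H0 for flat LCDM (assumption of this sketch)."""
        return np.sqrt(om * (1.0 + z)**3 + (1.0 - om))

    def comoving_distance(z, om):
        """Comoving distance in units of the Hubble distance c/H0."""
        return quad(lambda zp: 1.0 / E(zp, om), 0.0, z)[0]

    def F_AP(z, om):
        """Alcock-Paczynski parameter D_M(z) * H(z) / c (dimensionless in these units)."""
        return comoving_distance(z, om) * E(z, om)

    # A trial cosmology differing from the 'true' one distorts the ratio of
    # line-of-sight to transverse scales, i.e. the inferred field looks anisotropic.
    z, om_true = 0.57, 0.31
    for om_trial in (0.25, 0.31, 0.40):
        print(om_trial, F_AP(z, om_trial) / F_AP(z, om_true))   # = 1 only when om_trial = om_true
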
We present $\it{CosmoPower}$, a suite of neural cosmological power spectrum emulators providing orders-of-magnitude acceleration for parameter estimation from two-point statistics analyses of Large-Scale Structure (LSS) and Cosmic Microwave Background (CMB) surveys. The emulators replace the computation of matter and CMB power spectra from Boltzmann codes; thus, they do not need to be re-trained for different choices of astrophysical nuisance parameters or redshift distributions. The matter power spectrum emulation error is less than $0.4\%$ in the wavenumber range $k \in [10^{-5}, 10]\,\mathrm{Mpc}^{-1}$, for redshift $z \in [0, 5]$. $\it{CosmoPower}$ emulates CMB temperature, polarisation and lensing potential power spectra in the $5\sigma$ region of parameter space around the $\it{Planck}$ best fit values with an error $\lesssim 20\%$ of the expected shot noise for the forthcoming Simons Observatory. $\it{CosmoPower}$ is showcased on a joint cosmic shear and galaxy clustering analysis from the Kilo-Degree Survey, as well as on a Stage IV $\it{Euclid}$-like simulated cosmic shear analysis. For the CMB case, $\it{CosmoPower}$ is tested on a $\it{Planck}$ 2018 CMB temperature and polarisation analysis. The emulators always recover the fiducial cosmological constraints with differences in the posteriors smaller than sampling noise, while providing a speed-up factor up to $O(10^4)$ to the complete inference pipeline. This acceleration allows posterior distributions to be recovered in just a few seconds, as we demonstrate in the $\it{Planck}$ likelihood case. $\it{CosmoPower}$ is written entirely in Python, can be interfaced with all commonly used cosmological samplers and is publicly available at https://github.com/alessiospuriomancini/cosmopower .
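
The emulation idea can be illustrated with a generic regressor. The sketch below is not CosmoPower's interface (see the linked repository for the real one) and uses a toy spectrum as a stand-in for Boltzmann-code output; it only shows how a trained network replaces the expensive power-spectrum computation inside an inference loop.

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(3)
    k = np.logspace(-3, 0, 50)

    def toy_spectrum(omega_m, sigma8):
        """Purely illustrative stand-in for a Boltzmann-code matter power spectrum."""
        return sigma8**2 * k / (1.0 + (k / (0.02 * omega_m))**2)

    # Training set: parameters drawn from broad ranges, spectra stored in log-space.
    params = np.column_stack([rng.uniform(0.2, 0.4, 2000), rng.uniform(0.6, 1.0, 2000)])
    spectra = np.log(np.array([toy_spectrum(*p) for p in params]))

    emulator = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
    emulator.fit(params, spectra)

    # At inference time the emulator call replaces the expensive spectrum computation.
    test = np.array([[0.31, 0.81]])
    print(np.exp(emulator.predict(test))[:, :3])   # fast approximate P(k) values
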