
Bayesian cosmological inference through implicit cross-correlation statistics

Posted by: Guilhem Lavaux
Publication date: 2021
Research field: Physics
Paper language: English





Analyses of next-generation galaxy data require accurate treatment of systematic effects such as the bias between observed galaxies and the underlying matter density field. However, proposed models of the phenomenon are either numerically expensive or too inaccurate to achieve unbiased inferences of cosmological parameters even at mildly nonlinear scales of the data. As an alternative to constructing accurate galaxy bias models, which requires understanding galaxy formation, we propose to construct likelihood distributions for Bayesian forward modeling approaches that are insensitive to linear, scale-dependent bias and provide robustness against model misspecification. We use maximum entropy arguments to construct likelihood distributions designed to account only for correlations between data and inferred quantities. By design these correlations are insensitive to linear galaxy biasing relations, providing the desired robustness. The method is implemented and tested within a Markov Chain Monte Carlo approach and assessed using a halo mock catalog based on standard, full cosmological N-body simulations. We obtain unbiased and tight constraints on cosmological parameters exploiting only linear cross-correlation rates for $k \le 0.10$ Mpc/h. Tests for halos of masses $\sim\!10^{12}$ M$_\odot$ to $\sim\!10^{13}$ M$_\odot$ indicate that it is possible to ignore all details of the linear, scale-dependent bias function while obtaining robust constraints on cosmology. Our results provide a promising path forward for analyses of galaxy surveys that does not require accurately modeling the details of galaxy biasing, relying instead on robust likelihoods designed for the inference.
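To make the bias insensitivity concrete, the sketch below (an illustration of the invariance, not the paper's maximum-entropy likelihood itself) computes the per-bin cross-correlation coefficient $r(k)$ between a galaxy field and a model matter field; multiplying the galaxy modes by any linear bias $b(k)$ that is constant within a bin leaves $r(k)$ unchanged.

```python
import numpy as np

def cross_correlation_coefficients(delta_g, delta_m, boxsize, k_max=0.10, n_bins=10):
    """r(k) = P_gm / sqrt(P_gg * P_mm) per k-bin for two 3D density fields.

    Invariant under delta_g -> b(k) * delta_g for any bias b(k) that is
    (approximately) constant within each bin, which is the property a
    cross-correlation-based likelihood exploits.
    """
    n = delta_g.shape[0]
    gk = np.fft.rfftn(delta_g)
    mk = np.fft.rfftn(delta_m)
    freq = 2 * np.pi * np.fft.fftfreq(n, d=boxsize / n)
    freq_z = 2 * np.pi * np.fft.rfftfreq(n, d=boxsize / n)
    kmag = np.sqrt(freq[:, None, None]**2 + freq[None, :, None]**2
                   + freq_z[None, None, :]**2)
    edges = np.linspace(2 * np.pi / boxsize, k_max, n_bins + 1)  # skip the DC mode
    r = np.full(n_bins, np.nan)
    for i in range(n_bins):
        sel = (kmag >= edges[i]) & (kmag < edges[i + 1])
        p_gm = np.sum(gk[sel] * np.conj(mk[sel])).real
        p_gg = np.sum(np.abs(gk[sel])**2)
        p_mm = np.sum(np.abs(mk[sel])**2)
        if p_gg > 0 and p_mm > 0:
            r[i] = p_gm / np.sqrt(p_gg * p_mm)
    return r
```

Any constant rescaling of the galaxy modes in a bin cancels between numerator and denominator, so a statistic built on $r(k)$ alone never needs the bias amplitude.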




Read also

We present a large-scale Bayesian inference framework to constrain cosmological parameters using galaxy redshift surveys, via an application of the Alcock-Paczynski (AP) test. Our physical model of the non-linearly evolved density field, as probed by galaxy surveys, employs Lagrangian perturbation theory (LPT) to connect Gaussian initial conditions to the final density field, followed by a coordinate transformation to obtain the redshift space representation for comparison with data. We generate realizations of primordial and present-day matter fluctuations given a set of observations. This hierarchical approach encodes a novel AP test, extracting several orders of magnitude more information from the cosmological expansion compared to classical approaches, to infer cosmological parameters and jointly reconstruct the underlying 3D dark matter density field. The novelty of this AP test lies in constraining the comoving-redshift transformation to infer the appropriate cosmology which yields isotropic correlations of the galaxy density field, with the underlying assumption relying purely on the cosmological principle. Such an AP test does not rely explicitly on modelling the full statistics of the field. We verify in depth via simulations that this renders our test robust to model misspecification. This leads to another crucial advantage, namely that the cosmological parameters exhibit extremely weak dependence on the currently unresolved phenomenon of galaxy bias, thereby circumventing a potentially key limitation. This is consequently among the first methods to extract a large fraction of information from statistics other than that of direct density contrast correlations, without being sensitive to the amplitude of density fluctuations. We perform several statistical efficiency and consistency tests on a mock galaxy catalogue, using the SDSS-III survey as a template.
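For concreteness, the comoving-redshift transformation being constrained is the standard background relation (flat $\Lambda$CDM shown as an example):

$$ d_C(z) = \int_0^z \frac{c \, \mathrm{d}z'}{H(z')}, \qquad H(z) = H_0 \sqrt{\Omega_m (1+z)^3 + 1 - \Omega_m}. $$

In one common convention, assuming wrong parameters rescales radial separations by $\alpha_\parallel = H^{\rm fid}(z)/H(z)$ and transverse ones by $\alpha_\perp = d_C(z)/d_C^{\rm fid}(z)$; the galaxy correlations appear isotropic only when the two factors agree, which is the condition such an AP test exploits.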
We present $\it{CosmoPower}$, a suite of neural cosmological power spectrum emulators providing orders-of-magnitude acceleration for parameter estimation from two-point statistics analyses of Large-Scale Structure (LSS) and Cosmic Microwave Background (CMB) surveys. The emulators replace the computation of matter and CMB power spectra from Boltzmann codes; thus, they do not need to be re-trained for different choices of astrophysical nuisance parameters or redshift distributions. The matter power spectrum emulation error is less than $0.4\%$ in the wavenumber range $k \in [10^{-5}, 10] \, \mathrm{Mpc}^{-1}$, for redshift $z \in [0, 5]$. $\it{CosmoPower}$ emulates CMB temperature, polarisation and lensing potential power spectra in the $5\sigma$ region of parameter space around the $\it{Planck}$ best-fit values with an error $\lesssim 20\%$ of the expected shot noise for the forthcoming Simons Observatory. $\it{CosmoPower}$ is showcased on a joint cosmic shear and galaxy clustering analysis from the Kilo-Degree Survey, as well as on a Stage IV $\it{Euclid}$-like simulated cosmic shear analysis. For the CMB case, $\it{CosmoPower}$ is tested on a $\it{Planck}$ 2018 CMB temperature and polarisation analysis. The emulators always recover the fiducial cosmological constraints with differences in the posteriors smaller than sampling noise, while providing a speed-up factor up to $O(10^4)$ to the complete inference pipeline. This acceleration allows posterior distributions to be recovered in just a few seconds, as we demonstrate in the $\it{Planck}$ likelihood case. $\it{CosmoPower}$ is written entirely in Python, can be interfaced with all commonly used cosmological samplers and is publicly available at https://github.com/alessiospuriomancini/cosmopower .
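As a schematic of the emulation idea only (this is not the $\it{CosmoPower}$ API; see the repository above for the real interface), a regression network learns the map from cosmological parameters to $\log_{10} P(k)$, after which each likelihood evaluation costs a forward pass instead of a Boltzmann-code call:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n_train, n_params, n_k = 2000, 5, 200          # placeholder sizes
theta = rng.uniform(size=(n_train, n_params))  # normalised parameters
log_pk = rng.normal(size=(n_train, n_k))       # stand-in for Boltzmann-code outputs

# Train once, offline, on (parameters -> log10 power spectrum) pairs.
emulator = MLPRegressor(hidden_layer_sizes=(128, 128), max_iter=50)
emulator.fit(theta, log_pk)

# Inside an MCMC loop: a cheap forward pass replaces the Boltzmann code.
pk_fast = 10.0 ** emulator.predict(theta[:1])
```

Training in log-space keeps the dynamic range of $P(k)$ manageable for the network; the real package uses purpose-built TensorFlow emulators rather than this generic regressor.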
We present a comparison of simulation-based inference to full, field-based analytical inference in cosmological data analysis. To do so, we explore parameter inference for two cases where the information content is calculable analytically: Gaussian random fields whose covariance depends on parameters through the power spectrum; and correlated lognormal fields with cosmological power spectra. We compare two inference techniques: i) explicit field-level inference using the known likelihood and ii) implicit likelihood inference with maximally informative summary statistics compressed via Information Maximising Neural Networks (IMNNs). We find that a) summaries obtained from convolutional neural network compression do not lose information and therefore saturate the known field information content, both for the Gaussian covariance and the lognormal cases, b) simulation-based inference using these maximally informative nonlinear summaries recovers nearly losslessly the exact posteriors of field-level inference, bypassing the need to evaluate expensive likelihoods or invert covariance matrices, and c) even for this simple example, implicit, simulation-based inference incurs a much smaller computational cost than inference with an explicit likelihood. This work uses a new IMNNs implementation in $\texttt{Jax}$ that can take advantage of a fully differentiable simulation and inference pipeline. We also demonstrate that a single retraining of the IMNN summaries effectively achieves the theoretically maximal information, enhancing the robustness to the choice of fiducial model where the IMNN is trained.
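In rough outline, the IMNN objective is to choose network weights $w$ for the summary map $x = f_w(d)$ that maximise the Fisher information of the summaries, estimated from simulations at a fiducial model:

$$ F_{\alpha\beta}(w) = \mathrm{tr}\!\left[ \mu_{,\alpha}^{\mathsf{T}} \, C_f^{-1} \, \mu_{,\beta} \right], \qquad w^{\star} = \arg\max_w \, \ln \det F(w), $$

where $\mu$ and $C_f$ are the mean and covariance of the summaries over fiducial simulations and $\mu_{,\alpha}$ their derivatives with respect to the parameters (the published construction also adds a regulariser fixing the scale of $C_f$).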
We demonstrate a measure for the effective number of parameters constrained by a posterior distribution in the context of cosmology. In the same way that the mean of the Shannon information (i.e. the Kullback-Leibler divergence) provides a measure of the strength of constraint between prior and posterior, we show that the variance of the Shannon information gives a measure of dimensionality of constraint. We examine this quantity in a cosmological context, applying it to likelihoods derived from Cosmic Microwave Background, large-scale structure and supernovae data. We show that this measure of Bayesian model dimensionality compares favourably, both analytically and numerically, with the existing measure of model complexity used in the literature.
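Concretely, using the definition described above (variance of the Shannon information over the posterior, with the conventional factor of two), the measure reduces to a one-liner on MCMC output, since $\ln(\mathcal{P}/\pi) = \ln\mathcal{L} - \ln\mathcal{Z}$ and the evidence term is constant:

```python
import numpy as np

def bayesian_model_dimensionality(loglike):
    """d = 2 * Var[ln L] over posterior samples.

    ln(P/pi) = ln L - ln Z and ln Z is a constant, so the variance of the
    Shannon information equals the variance of the log-likelihood.
    """
    return 2.0 * np.var(np.asarray(loglike))
```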
We present a proof-of-concept of a novel and fully Bayesian methodology designed to detect halos of different masses in cosmological observations subject to noise and systematic uncertainties. Our methodology combines the previously published Bayesian large-scale structure inference algorithm, HADES, and a Bayesian chain rule (the Blackwell-Rao Estimator), which we use to connect the inferred density field to the properties of dark matter halos. To demonstrate the capability of our approach we construct a realistic galaxy mock catalogue emulating the wide-area 6-degree Field Galaxy Survey, which has a median redshift of approximately 0.05. Application of HADES to the catalogue provides us with accurately inferred three-dimensional density fields and corresponding quantification of uncertainties inherent to any cosmological observation. We then use a cosmological simulation to relate the amplitude of the density field to the probability of detecting a halo with mass above a specified threshold. With this information we can sum over the HADES density field realisations to construct maps of detection probabilities and demonstrate the validity of this approach within our mock scenario. We find that the probability of successful detection of halos in the mock catalogue increases as a function of the signal-to-noise of the local galaxy observations. Our proposed methodology can easily be extended to account for more complex scientific questions and is a promising novel tool to analyse the cosmic large-scale structure in observations.
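The marginalisation step described above can be sketched in a few lines; here `p_halo_given_delta` is a hypothetical callable encoding the simulation-calibrated probability of a halo above the mass threshold given the local density amplitude, and `density_samples` are posterior field realisations (e.g. from HADES):

```python
import numpy as np

def detection_probability_map(density_samples, p_halo_given_delta):
    """Blackwell-Rao-style average: marginalise the conditional detection
    probability over posterior realisations of the density field."""
    return np.mean([p_halo_given_delta(d) for d in density_samples], axis=0)
```

Averaging the conditional probability over realisations propagates the density-field uncertainty directly into the detection map, which is why low signal-to-noise regions yield lower detection probabilities.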