
Quantifying tensions in cosmological parameters: Interpreting the DES evidence ratio

Published by W.J. Handley
Publication date: 2019
Research field: Physics
Paper language: English





We provide a new interpretation for the Bayes factor combination used in the Dark Energy Survey (DES) first year analysis to quantify the tension between the DES and Planck datasets. The ratio quantifies a Bayesian confidence in our ability to combine the datasets. This interpretation is prior-dependent, with wider prior widths boosting the confidence. We therefore propose that if there is any reasonable prior which reduces the confidence to below unity, then we cannot assert that the datasets are compatible. Computing the evidence ratios for the DES first year analysis and Planck, we find that narrower, yet still reasonable, priors drop the confidence below unity, and we conclude that DES and Planck are, in a Bayesian sense, incompatible under $\Lambda$CDM. Additionally, we compute ratios which confirm the consensus that measurements of the acoustic scale by the Baryon Oscillation Spectroscopic Survey (BOSS) are compatible with Planck, whilst direct measurements of the expansion rate of the Universe by the SH0ES collaboration are not. We propose a modification to the Bayes ratio which removes the prior dependency using Kullback-Leibler divergences, and using this statistical test we find Planck in strong tension with SH0ES, in moderate tension with DES, and in no tension with BOSS. We propose this statistic as the optimal way to compare datasets, ahead of the next DES data releases as well as future surveys. Finally, as an element of these calculations, we introduce in a cosmological setting the Bayesian model dimensionality, a parameterisation-independent measure of the number of parameters that a given dataset constrains.
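The statistics described above reduce to a few lines of arithmetic once a nested-sampling run has produced, for each dataset and their combination, a log-evidence, a Kullback-Leibler divergence and a Bayesian model dimensionality. The following is a minimal sketch (my illustration, not the authors' code), assuming the approximately Gaussian regime in which $d - 2\log S$ is $\chi^2_d$-distributed:

```python
# Minimal sketch of the Bayes ratio R, suspiciousness S and tension probability.
# Inputs for dataset A, dataset B and their combination AB:
#   logZ - log-evidence, D - KL divergence (prior -> posterior),
#   d    - Bayesian model dimensionality.
from scipy.stats import chi2, norm

def tension(logZ_A, D_A, d_A, logZ_B, D_B, d_B, logZ_AB, D_AB, d_AB):
    log_R = logZ_AB - logZ_A - logZ_B   # Bayes ratio: prior-width dependent
    log_I = D_A + D_B - D_AB            # information ratio carrying the prior dependence
    log_S = log_R - log_I               # suspiciousness: prior-independent
    d = d_A + d_B - d_AB                # dimensions constrained by both datasets
    p = chi2.sf(d - 2 * log_S, df=d)    # tension probability (Gaussian-case calibration)
    sigma = norm.isf(p / 2)             # equivalent number of sigmas
    return log_R, log_S, d, p, sigma
```

Because the prior dependence of $\log R$ enters through $\log I$, the difference $\log S$ is insensitive to broadening the prior, which is what makes it the preferred comparison statistic here.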




Read also

Marco Raveri, Cyrille Doux (2021)
We discuss how to efficiently and reliably estimate the level of agreement and disagreement on parameter determinations from different experiments, fully taking into account non-Gaussianities in the parameter posteriors. We develop two families of scalable algorithms that allow us to perform this type of calculation in increasing numbers of dimensions and for different levels of tension. One family of algorithms relies on kernel density estimates of posterior distributions, while the other relies on machine-learning modeling of the posterior distribution with normalizing flows. We showcase their effectiveness and accuracy with a set of benchmark examples and find that both methods agree with each other and with the true tension within $0.5\sigma$ in difficult cases, and generally to $0.2\sigma$ or better. This allows us to study the level of internal agreement between different measurements of the clustering of cosmological structures from the Dark Energy Survey and their agreement with measurements of the Cosmic Microwave Background from the Planck satellite.
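A minimal sketch of the KDE family of methods (my illustration, not the authors' code): build the distribution of parameter differences between two experiments, then measure how much posterior mass lies inside the iso-density contour passing through zero shift, and convert that to an effective number of sigmas.

```python
# Sketch of a KDE-based parameter-shift tension estimate.
import numpy as np
from scipy.stats import gaussian_kde, norm

def parameter_shift_sigma(samples_a, samples_b, n_diff=5000, seed=0):
    """samples_a, samples_b: (N, n_params) posterior samples from two experiments."""
    rng = np.random.default_rng(seed)
    ia = rng.integers(len(samples_a), size=n_diff)
    ib = rng.integers(len(samples_b), size=n_diff)
    diffs = samples_a[ia] - samples_b[ib]     # samples of the parameter difference
    kde = gaussian_kde(diffs.T)
    p_zero = kde(np.zeros(diffs.shape[1]))    # density at zero shift
    # posterior mass enclosed by the iso-density contour through zero
    mass = np.mean(kde(diffs.T) > p_zero)
    return norm.isf((1 - mass) / 2)           # convert to effective sigmas
```

If the two posteriors overlap strongly, zero shift sits near the peak of the difference distribution, the enclosed mass is small, and the returned sigma is low.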
Cosmological tensions can arise within the $\Lambda$CDM scenario amongst different observational windows, which may indicate new physics beyond the standard paradigm if confirmed by measurements. In this article, we report how to alleviate both the $H_0$ and $\sigma_8$ tensions simultaneously within torsional gravity, from the perspective of effective field theory (EFT). Following these observations, we construct concrete models of Lagrangians of torsional gravity. Specifically, we consider the parametrization $f(T)=-T-2\Lambda/M_P^2+\alpha T^\beta$, where two out of the three parameters are independent. This model can efficiently fit observations, solving the two tensions. To our knowledge, this is the first time that a modified gravity theory can alleviate both the $H_0$ and $\sigma_8$ tensions simultaneously, hence offering an additional argument in favor of gravitational modification.
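To unpack "two out of the three parameters are independent": in $f(T)$ cosmology the modified Friedmann equation, evaluated at the present epoch, imposes one algebraic relation among the model parameters. Schematically (my sketch of the usual bookkeeping, not the authors' explicit formula):

$$ H^2(z=0;\,\Lambda,\alpha,\beta) = H_0^2 \quad\Longrightarrow\quad \alpha = \alpha\!\left(\beta,\Lambda;\,H_0,\Omega_{m0}\right), $$

so once $H_0$ and $\Omega_{m0}$ are fixed by observations, only two of $(\Lambda,\alpha,\beta)$ remain free.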
We investigate constraints on some key cosmological parameters by confronting metastable dark energy models with different combinations of the most recent cosmological observations. Along with the standard $\Lambda$CDM model, two phenomenological metastable dark energy models are considered: (i) DE decays exponentially, (ii) DE decays into dark matter. We find that: (1) when considering the most recent supernovae and BAO data, and assuming a fiducial $\Lambda$CDM model, the inconsistency in the estimated value of the $\Omega_{\rm m,0}h^2$ parameter obtained by either including or excluding Planck CMB data becomes very substantial and points to a clear tension (Sahni et al. 2014; Zhao et al. 2017); (2) although the two metastable dark energy models that we study provide greater flexibility in fitting the data, and indeed fit the SNe Ia+BAO data substantially better than $\Lambda$CDM, they are not able to alleviate this tension significantly when CMB data are included; (3) while local measurements of the Hubble constant are significantly higher than the value of $H_0$ estimated in our models (obtained by fitting to SNe Ia and BAO data), the situation seems rather complicated, with hints of inconsistency among different observational data sets (CMB, SNe Ia+BAO and local $H_0$ measurements). Our results indicate that we might not be able to remove the current tensions among different cosmological observations by considering simple modifications of the standard model or by introducing minimal dark energy models. A complicated form of expansion history, different systematics in different data, and/or a non-conventional model of the early Universe might be responsible for these tensions.
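For concreteness, the two decay channels are typically modelled in the metastable dark energy literature with a constant decay rate $\Gamma$; a hedged sketch (my reconstruction, the paper's exact parametrization may differ):

$$ \text{(i)}\quad \dot\rho_{\rm DE} = -\Gamma\,\rho_{\rm DE} \;\Longrightarrow\; \rho_{\rm DE}(t) = \rho_{\rm DE}(t_0)\,e^{-\Gamma\,(t-t_0)}, $$
$$ \text{(ii)}\quad \dot\rho_{\rm DE} = -\Gamma\,\rho_{\rm DE}, \qquad \dot\rho_{\rm dm} + 3H\rho_{\rm dm} = +\Gamma\,\rho_{\rm DE}, $$

so that in case (ii) the energy lost by dark energy is gained by dark matter, preserving total energy conservation.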
We demonstrate a measure for the effective number of parameters constrained by a posterior distribution in the context of cosmology. In the same way that the mean of the Shannon information (i.e. the Kullback-Leibler divergence) provides a measure of the strength of constraint between prior and posterior, we show that the variance of the Shannon information gives a measure of the dimensionality of constraint. We examine this quantity in a cosmological context, applying it to likelihoods derived from Cosmic Microwave Background, large-scale structure and supernovae data. We show that this measure of Bayesian model dimensionality compares favourably, both analytically and numerically, with the existing measure of model complexity used in the literature.
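The variance of the Shannon information is straightforward to estimate from weighted posterior samples. Since the Shannon information is $\log(\mathcal{P}/\pi) = \log\mathcal{L} - \log\mathcal{Z}$, the constant $\log\mathcal{Z}$ drops out of the variance and only log-likelihood values are needed; a minimal sketch (my illustration, assuming samples with weights and log-likelihoods, e.g. from nested sampling):

```python
# Bayesian model dimensionality: twice the posterior variance of log L.
import numpy as np

def bayesian_model_dimensionality(logL, weights):
    w = np.asarray(weights, dtype=float)
    w /= w.sum()
    mean = np.sum(w * logL)                   # posterior mean of log L
    var = np.sum(w * (logL - mean) ** 2)      # posterior variance of log L
    return 2.0 * var                          # d-tilde = 2 Var[log(P/pi)]
```

As a sanity check, for a $d$-dimensional Gaussian posterior $-2\log\mathcal{L}$ is $\chi^2_d$-distributed with variance $2d$, so this estimator returns $d$ as expected.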
The late-time modifications of the standard $\Lambda$ Cold Dark Matter ($\Lambda$CDM) cosmological model can be parameterized by three time-dependent functions describing the expansion history of the Universe and gravitational effects on light and matter in the large-scale structure. In this Letter, we present the first joint reconstruction of these three functions, performed in a non-parametric way from a combination of recent cosmological observations. The reconstruction is performed with a theory-informed prior, built on the general Horndeski class of scalar-tensor theories. We find that current data can constrain 15 combined modes of these three functions with respect to the prior. Our methodology enables us to identify the phenomenological features that alternative theories would need to have in order to ease some of the tensions between datasets within $\Lambda$CDM, and to deduce important constraints on broad classes of modified gravity models.
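The three functions are commonly taken to be the expansion history $H(a)$ together with phenomenological functions $\mu$ and $\Sigma$ modifying the Poisson and lensing equations; a standard-convention sketch (my reconstruction, the Letter's notation may differ):

$$ k^2\Psi = -4\pi G\,a^2\,\mu(a,k)\,\rho\Delta, \qquad k^2(\Phi+\Psi) = -8\pi G\,a^2\,\Sigma(a,k)\,\rho\Delta, $$

with $\mu=\Sigma=1$ recovering $\Lambda$CDM; $\mu$ governs the clustering of matter while $\Sigma$ governs the deflection of light.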