Even in a universe that is homogeneous on large scales, local density fluctuations can imprint a systematic signature on the cosmological inferences we make from distant sources. One example is the effect of a local under-density, also known as a Hubble bubble, on supernova cosmology. It has been suggested that a large enough under-density could account for the supernova magnitude-redshift relation without the need for dark energy or acceleration. Although an under-density of the size and depth required for such an extreme result is extremely unlikely to arise as a random fluctuation in an on-average homogeneous universe, even a small under-density can leave residual effects on our cosmological inferences. In this paper we show that systematic shifts remain in our cosmological parameter measurements even after excluding local supernovae that are likely to lie within any small Hubble bubble. We study theoretically the low-redshift cutoff typically imposed by supernova cosmology analyses, and show that a low-redshift cut of z₀ ∼ 0.02 may be too low given the observed inhomogeneity in our local universe. Neglecting to impose any low-redshift cutoff can have a significant effect on the cosmological parameters derived from supernova data. A slight local under-density, just 30% under-dense with a scale of 70 h⁻¹ Mpc, causes an error of ∼4% in the inferred cosmological-constant density Ω_Λ. Imposing a low-redshift cutoff reduces this systematic error but does not remove it entirely: a residual systematic shift of 0.99% remains in the inferred value of Ω_Λ even when all data within the currently preferred low-redshift cutoff of z₀ = 0.02 are excluded. Given current measurement uncertainties this shift is not negligible, and it will need to be accounted for when future measurements yield higher precision.
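To make the mechanism concrete, the sketch below builds a toy Hubble diagram in which supernovae inside a 30% under-dense bubble of radius 70 h⁻¹ Mpc pick up an extra outflow redshift, then fits a flat ΛCDM magnitude-redshift relation with different low-redshift cuts. This is not the paper's actual pipeline: it assumes the linear-theory relation δH/H ≈ −f(Ω_m) δ/3 for the local expansion-rate boost, a simple redshift perturbation for all supernovae inside the bubble, noiseless mock magnitudes, and illustrative fiducial values (H₀ = 70 km/s/Mpc, Ω_Λ = 0.7); all helper names are invented for this example.

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import minimize_scalar

C = 299792.458   # speed of light [km/s]
H0 = 70.0        # assumed fiducial Hubble constant [km/s/Mpc]
OL_TRUE = 0.7    # true Omega_Lambda of the mock flat-LCDM universe

def E(z, ol):
    """Dimensionless expansion rate for flat LCDM (radiation neglected)."""
    return np.sqrt((1.0 - ol) * (1.0 + z) ** 3 + ol)

def mu_model(z, ol):
    """Distance modulus mu = 5 log10(d_L/Mpc) + 25 for flat LCDM."""
    dc = quad(lambda zp: 1.0 / E(zp, ol), 0.0, z)[0]
    dl = (1.0 + z) * (C / H0) * dc  # luminosity distance [Mpc]
    return 5.0 * np.log10(dl) + 25.0

# Local bubble: 30% under-dense with radius 70/h Mpc. Linear theory gives
# a boosted local expansion rate, dH/H ~ -f(Omega_m) * delta / 3.
delta = -0.30
f = (1.0 - OL_TRUE) ** 0.55        # growth rate approximation f ~ Omega_m^0.55
dH = -f * delta / 3.0              # ~ +5% faster expansion inside the void
z_edge = 100.0 * 70.0 / C          # cz = (100h)*(70/h) km/s  ->  z ~ 0.023

# Mock Hubble diagram: magnitudes follow the true cosmology, but redshifts
# inside the bubble carry the extra outflow velocity. (At these redshifts
# the redshift of the bubble edge is a fair proxy for its comoving radius.)
z_true = np.linspace(0.005, 0.5, 400)
mu_obs = np.array([mu_model(z, OL_TRUE) for z in z_true])
z_obs = np.where(z_true < z_edge, z_true * (1.0 + dH), z_true)

def fit_omega_lambda(z_cut):
    """Best-fit Omega_Lambda using only supernovae with z_obs > z_cut."""
    sel = z_obs > z_cut
    def chi2(ol):
        m = np.array([mu_model(z, ol) for z in z_obs[sel]])
        r = mu_obs[sel] - m
        r -= r.mean()  # analytic marginalization over the magnitude offset
        return np.sum(r ** 2)
    return minimize_scalar(chi2, bounds=(0.3, 0.95), method="bounded").x

for z_cut in (0.0, 0.02, z_edge):
    print(f"z_cut = {z_cut:.3f}:  best-fit Omega_Lambda = {fit_omega_lambda(z_cut):.4f}")
```

With no cut, the perturbed low-redshift points drag the marginalized magnitude normalization and bias the recovered Ω_Λ; a cut at z = 0.02 shrinks but does not remove the bias, because the bubble edge in this toy model sits at z ≈ 0.023, just beyond the cut. Only a cut outside the bubble recovers Ω_Λ = 0.7 exactly. The exact sizes of the shifts here are artifacts of the toy assumptions and should not be read as the paper's quoted 4% and 0.99% figures.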