We report novel cosmological constraints obtained from cosmic voids in the final BOSS DR12 dataset. They arise from the joint analysis of geometric and dynamic distortions of average void shapes (i.e., the stacked void-galaxy cross-correlation function) in redshift space. Our model uses tomographic deprojection to infer real-space void profiles and self-consistently accounts for the Alcock-Paczynski (AP) effect and redshift-space distortions (RSD) without any prior assumptions on cosmology or structure formation. It is derived from first physical principles and provides an extremely good description of the data at linear perturbation order. We validate this model with the help of mock catalogs and apply it to the final BOSS data to constrain the RSD and AP parameters $f/b$ and $D_AH/c$, where $f$ is the linear growth rate, $b$ the linear galaxy bias, $D_A$ the comoving angular diameter distance, $H$ the Hubble rate, and $c$ the speed of light. In addition, we include two nuisance parameters in our analysis to marginalize over potential systematics. We obtain $f/b=0.540\pm0.091$ and $D_AH/c=0.588\pm0.004$ from the full void sample at a mean redshift of $z=0.51$. In a flat $\Lambda$CDM cosmology, this implies $\Omega_\mathrm{m}=0.312\pm0.020$ for the present-day matter density parameter. When we use additional information from the survey mocks to calibrate our model, these constraints improve to $f/b=0.347\pm0.023$, $D_AH/c=0.588\pm0.003$, and $\Omega_\mathrm{m}=0.310\pm0.017$. However, we emphasize that the calibration depends on the specific model of cosmology and structure formation assumed in the mocks, so the calibrated results should be considered less robust. Nevertheless, our calibration-independent constraints are among the tightest of their kind to date, demonstrating the immense potential of using cosmic voids for cosmology in current and future data.
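As a rough illustration of why the AP measurement above constrains $\Omega_\mathrm{m}$: in flat $\Lambda$CDM the product $D_AH/c$ is dimensionless and independent of $H_0$, so a single measurement of it at the sample redshift pins down the matter density. The sketch below (not the authors' pipeline) evaluates this observable for a few values of $\Omega_\mathrm{m}$ at $z=0.51$, using the comoving distance convention stated in the abstract; the published $\Omega_\mathrm{m}$ follows from the joint $f/b$ and $D_AH/c$ posterior, which this toy calculation does not reproduce.

```python
# Sketch: the AP observable D_A*H/c in flat LCDM, as a function of Omega_m.
import numpy as np
from scipy.integrate import quad

def E(z, Om):
    """Normalized expansion rate H(z)/H0 in flat LCDM."""
    return np.sqrt(Om * (1.0 + z)**3 + (1.0 - Om))

def DA_H_over_c(z, Om):
    """D_A(z)*H(z)/c = E(z) * int_0^z dz'/E(z'); H0 and c cancel out."""
    chi, _ = quad(lambda zp: 1.0 / E(zp, Om), 0.0, z)
    return E(z, Om) * chi

# Dependence on Omega_m at the sample's mean redshift z = 0.51:
for Om in (0.25, 0.31, 0.35):
    print(f"Omega_m = {Om:.2f}: D_A H / c = {DA_H_over_c(0.51, Om):.3f}")
```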
Cosmic voids in the large-scale structure of the Universe affect the peculiar motions of objects in their vicinity. Although these motions are difficult to observe directly, they influence the clustering pattern of the surrounding tracers in redshift space in a unique way. This allows us to investigate the interplay between densities and velocities around voids, which is solely dictated by the laws of gravity. With the help of $N$-body simulations and derived mock-galaxy catalogs, we calculate the average density fluctuations around voids identified with a watershed algorithm in redshift space and compare the results with the expectation from general relativity and the $\Lambda$CDM model. We find linear theory to work remarkably well in describing the dynamics of voids. Adopting a Bayesian inference framework, we explore the full posterior of our model parameters and forecast the achievable accuracy on measurements of the growth rate of structure and the geometric distortion through the Alcock-Paczynski effect. Systematic errors in the latter are reduced from $\sim15\%$ to $\sim5\%$ when peculiar velocities are taken into account. The relative parameter uncertainties in galaxy surveys with number densities comparable to the SDSS MAIN (CMASS) sample probing a volume of $1h^{-3}{\rm Gpc}^3$ are $\sigma_{f/b}\left/(f/b)\right.\sim2\%$ ($20\%$) and $\sigma_{D_AH}/D_AH\sim0.2\%$ ($2\%$), respectively. At this level of precision the linear-theory model becomes systematics dominated, with parameter biases that exceed these uncertainties. Nevertheless, the presented method is highly model independent, as its validity relies only on the underlying assumption of statistical isotropy of the Universe.
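The Bayesian posterior exploration described above can be sketched with an ensemble sampler. The following toy (assuming the emcee package is available; the template function is a placeholder, not the paper's linear RSD model) shows the structure of such an inference: a Gaussian likelihood over a data vector, flat priors on $f/b$ and the AP distortion, and a walk through the two-dimensional posterior.

```python
# Toy posterior exploration for (f/b, epsilon); the model is a placeholder.
import numpy as np
import emcee

rng = np.random.default_rng(42)
true_fb, true_eps = 0.5, 1.0
s = np.linspace(20, 100, 20)                     # void-centric separations [Mpc/h]

def model(fb, eps):
    # hypothetical smooth template, NOT the paper's linear RSD model
    return fb * np.exp(-s / 60.0) + 3.0 * (eps - 1.0)

sigma = 0.01 * np.ones_like(s)
data = model(true_fb, true_eps) + rng.normal(0.0, sigma)

def log_post(theta):
    fb, eps = theta
    if not (0.0 < fb < 2.0 and 0.8 < eps < 1.2):  # flat priors
        return -np.inf
    return -0.5 * np.sum(((data - model(fb, eps)) / sigma) ** 2)

nwalkers, ndim = 16, 2
p0 = np.array([true_fb, true_eps]) + 1e-3 * rng.standard_normal((nwalkers, ndim))
sampler = emcee.EnsembleSampler(nwalkers, ndim, log_post)
sampler.run_mcmc(p0, 2000)
chain = sampler.get_chain(discard=500, flat=True)
print("f/b = %.3f +/- %.3f" % (chain[:, 0].mean(), chain[:, 0].std()))
print("eps = %.3f +/- %.3f" % (chain[:, 1].mean(), chain[:, 1].std()))
```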
The Universe is mostly composed of large and relatively empty domains known as cosmic voids, whereas its matter content is predominantly distributed along their boundaries. The remaining material inside them, either dark or luminous matter, is attracted to these boundaries and causes voids to expand faster and to grow emptier over time. Using the distribution of galaxies centered on voids identified in the Sloan Digital Sky Survey and adopting minimal assumptions on the statistical motion of these galaxies, we constrain the average matter content of the Universe today, $\Omega_\mathrm{m}=0.281\pm0.031$, as well as the linear growth rate of structure, $f/b=0.417\pm0.089$ at median redshift $\bar{z}=0.57$, where $b$ is the galaxy bias ($68\%$ C.L.). These values originate from a percent-level measurement of the anisotropic distortion of the void-galaxy cross-correlation function, $\varepsilon = 1.003\pm0.012$, and are robust to consistency tests with bootstraps of the data and simulated mock catalogs within an additional systematic uncertainty of half that size. They surpass (and are complementary to) existing constraints by unlocking cosmological information on smaller scales through an accurate model of nonlinear clustering and dynamics in void environments. As such, our analysis furnishes a powerful probe of deviations from Einstein's general relativity in the low-density regime, which has largely remained untested so far. We find no evidence for such deviations in the data at hand.
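The bootstrap consistency tests mentioned above can be illustrated schematically: resample the individual estimates with replacement and examine the spread of the recomputed mean. In the toy below, the per-void scatter is invented for illustration, chosen so that the error on the mean lands near the quoted $\pm0.012$.

```python
# Schematic bootstrap error estimate for the distortion parameter epsilon.
import numpy as np

rng = np.random.default_rng(0)
# toy per-void estimates: scatter tuned so the mean error is ~0.012
eps_per_void = rng.normal(1.003, 0.012 * np.sqrt(500), size=500)

def bootstrap_error(samples, n_boot=10_000):
    """Mean and standard deviation of the mean over bootstrap resamples."""
    idx = rng.integers(0, len(samples), size=(n_boot, len(samples)))
    means = samples[idx].mean(axis=1)
    return means.mean(), means.std()

mean_eps, err_eps = bootstrap_error(eps_per_void)
print(f"epsilon = {mean_eps:.3f} +/- {err_eps:.3f}")
```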
We reexamine big bang nucleosynthesis with large-scale baryon density inhomogeneities when the length scale of the density fluctuations exceeds the neutron diffusion length ($\sim 10^7$--$10^8$ cm at BBN) and the amplitude of the fluctuations is sufficiently small to prevent gravitational collapse. In this limit, the final light-element abundances can be determined by simply mixing the abundances from regions with different baryon/photon ratios without interactions. We examine Gaussian, lognormal, and gamma distributions for the baryon/photon ratio, $\eta$. We find that the deuterium and lithium-7 abundances increase with the RMS fluctuation in $\eta$, while the effect on helium-4 is much smaller. We show that these increases in the deuterium and lithium-7 abundances are a consequence of Jensen's inequality, and we derive analytic approximations for these abundances in the limit of small RMS fluctuations. Observational upper limits on the primordial deuterium abundance constrain the RMS fluctuation in $\eta$ to be less than $17\%$ of the mean value of $\eta$. This provides us with a new limit on the graininess of the early universe.
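The Jensen's-inequality argument admits a compact numerical illustration. For an abundance law $X(\eta)$ that is convex in $\eta$, averaging over any distribution of $\eta$ raises $\langle X\rangle$ above $X(\bar\eta)$; for a power law $X\propto\eta^{-\alpha}$ the small-fluctuation expansion gives $\langle X\rangle/X(\bar\eta)\approx 1+\tfrac{1}{2}\alpha(\alpha+1)\,\sigma^2/\bar\eta^2$. The sketch below uses a rough assumed scaling $\mathrm{D/H}\propto\eta^{-1.6}$ (a stand-in for the full BBN network) and a Gaussian distribution for $\eta$.

```python
# Toy mixing of deuterium abundances over a distribution of eta.
import numpy as np

rng = np.random.default_rng(1)
eta0 = 6.1e-10                        # mean baryon/photon ratio (illustrative)

def DH(eta):
    # assumed power-law scaling of D/H with eta; convex in eta, so
    # Jensen's inequality guarantees <DH(eta)> >= DH(eta0)
    return 2.5e-5 * (eta / eta0) ** -1.6

for frac in (0.05, 0.10, 0.17):       # RMS fluctuation as a fraction of eta0
    eta = rng.normal(eta0, frac * eta0, size=1_000_000)
    eta = eta[eta > 0]                # keep the distribution physical
    boost = DH(eta).mean() / DH(eta0)
    # small-sigma expansion: 1 + alpha*(alpha+1)/2 * frac^2, with alpha = 1.6
    print(f"RMS = {frac:.0%}: <D/H>/D/H(eta0) = {boost:.3f} "
          f"(analytic: {1 + 0.5 * 1.6 * 2.6 * frac**2:.3f})")
```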
We present GIGANTES, the most extensive and realistic void catalog suite ever released: it contains over 1 billion cosmic voids covering a volume larger than the observable Universe, comprises more than 20 TB of data, and was created by running the void finder VIDE on QUIJOTE's halo simulations. The expansive and detailed GIGANTES suite, spanning thousands of cosmological models, opens up the study of voids to answer compelling questions: Do voids carry unique cosmological information? How is this information correlated with galaxy information? Leveraging the large number of voids in the GIGANTES suite, our Fisher constraints demonstrate that voids contain additional information, critically tightening constraints on cosmological parameters. We use traditional void summary statistics (the void size function and void density profile) and the void auto-correlation function, which independently yields an error of $0.13\,\mathrm{eV}$ on $\sum m_\nu$ for a $1\,h^{-3}\mathrm{Gpc}^3$ simulation, without CMB priors. Combining halos and voids, we forecast an error of $0.09\,\mathrm{eV}$ from the same volume. Extrapolating to next-generation multi-Gpc$^3$ surveys such as DESI, Euclid, SPHEREx, and the Roman Space Telescope, we expect voids to yield an independent determination of the neutrino mass. Crucially, GIGANTES is the first void catalog suite expressly built for intensive machine learning exploration. We illustrate this by training a neural network to perform likelihood-free inference on the void size function. Cosmology problems provide an impetus to develop novel deep learning techniques that leverage the symmetries imposed on the universe by physical laws, interpret models, and accurately predict errors. With GIGANTES, machine learning gains an impressive dataset, offering unique problems that will stimulate new techniques.
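The Fisher forecasts quoted above follow a standard recipe: numerical derivatives of a summary statistic with respect to the cosmological parameters, combined with a covariance matrix estimated from fiducial mock realizations. A minimal sketch with placeholder arrays (not GIGANTES data) is:

```python
# Schematic Fisher forecast from a binned summary statistic (e.g., the
# void size function) for two parameters (e.g., Omega_m and sum m_nu).
import numpy as np

def fisher_matrix(derivs, cov):
    """F_ij = d_i^T C^-1 d_j for a Gaussian likelihood with fixed covariance."""
    icov = np.linalg.inv(cov)  # a Hartlap factor would debias this for finite mocks
    return derivs @ icov @ derivs.T

rng = np.random.default_rng(2)
n_bins, n_params = 10, 2
derivs = rng.normal(size=(n_params, n_bins))   # placeholder dS/dtheta_i per bin
mocks = rng.normal(size=(500, n_bins))         # placeholder fiducial realizations
cov = np.cov(mocks, rowvar=False)
F = fisher_matrix(derivs, cov)
errors = np.sqrt(np.diag(np.linalg.inv(F)))    # marginalized 1-sigma errors
print("marginalized errors:", errors)
```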
We show that the Big Bang Observer (BBO), a proposed space-based gravitational-wave (GW) detector, would provide ultra-precise measurements of cosmological parameters. By detecting ~300,000 compact-star binaries and utilizing them as standard sirens, BBO would determine the Hubble constant to 0.1%, and the dark energy parameters $w_0$ and $w_a$ to ~0.01 and ~0.1, respectively. BBO's dark-energy figure of merit would be approximately an order of magnitude better than that of all other proposed dark energy missions. To date, BBO has been designed with the primary goal of searching for gravitational waves from inflation. To observe this inflationary background, BBO would first have to detect and subtract out ~300,000 merging compact-star binaries, out to z~5. It is precisely this foreground which would enable high-precision cosmology. BBO would determine the luminosity distance to each binary to ~percent accuracy. BBO's angular resolution would be sufficient to uniquely identify the host galaxy for most binaries; a coordinated optical/infrared observing campaign could obtain the redshifts. Combining the GW-derived distances and EM-derived redshifts for such a large sample of objects leads to extraordinarily tight constraints on cosmological parameters. Such ``standard siren'' measurements of cosmology avoid many of the systematic errors associated with other techniques. We also show that BBO would be an exceptionally powerful gravitational lensing mission, and we briefly discuss other astronomical uses of BBO.
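The statistical reach of such a siren sample can be illustrated with a toy fit: simulate binaries with ~1% GW distance errors and known EM redshifts, then fit $H_0$ in a flat $\Lambda$CDM model with fixed matter density. Everything below is a scaled-down assumption; the lensing noise, the $w_0$-$w_a$ degrees of freedom, and the parameter degeneracies that shape the actual BBO forecast are ignored, and the error on $H_0$ then shrinks roughly as $1/\sqrt{N}$.

```python
# Toy standard-siren fit of H0 from (redshift, luminosity distance) pairs.
import numpy as np
from scipy.integrate import quad
from scipy.optimize import curve_fit

C_KMS, OM = 299792.458, 0.3           # fix a flat LCDM with known Omega_m

def dL(z, H0):
    """Luminosity distance [Mpc] in flat LCDM, d_L = (1+z) * chi(z)."""
    E = lambda zp: np.sqrt(OM * (1.0 + zp)**3 + 1.0 - OM)
    chi = np.array([quad(lambda zp: 1.0 / E(zp), 0.0, zi)[0]
                    for zi in np.atleast_1d(z)])
    return (1.0 + np.atleast_1d(z)) * (C_KMS / H0) * chi

rng = np.random.default_rng(3)
N = 1000                              # scaled-down sample (BBO: ~300,000)
z = rng.uniform(0.1, 5.0, N)
d_obs = dL(z, 70.0) * (1.0 + 0.01 * rng.standard_normal(N))  # ~1% GW distances

popt, pcov = curve_fit(dL, z, d_obs, p0=[65.0],
                       sigma=0.01 * d_obs, absolute_sigma=True)
print(f"H0 = {popt[0]:.3f} +/- {np.sqrt(pcov[0, 0]):.4f} km/s/Mpc")
# The error scales roughly as 1/sqrt(N): ~300,000 sirens would do ~17x
# better, before lensing noise and degeneracies enter the real forecast.
```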