
The impact of braiding covariance and in-survey covariance on next-generation galaxy surveys

Added by: Fabien Lacasa
Publication date: 2019
Fields: Physics
Language: English
Authors: Fabien Lacasa





As galaxy surveys become more precise and push to smaller scales, the need for accurate covariances beyond the classical Gaussian formula becomes more acute. Here, I investigate the analytical implementation and impact of the non-Gaussian covariance terms that I previously derived for galaxy clustering. Braiding covariance is one such class of terms, receiving contributions from both in-survey and super-survey modes. I present an approximation for braiding covariance which speeds up its numerical computation. I show that including braiding covariance is a necessary condition for including the other non-Gaussian terms: the in-survey 2-, 3- and 4-halo covariances yield covariance matrices with negative eigenvalues if considered on their own. I then quantify the impact on parameter constraints, with forecasts for a Euclid-like survey. Compared to the Gaussian case, braiding and in-survey covariances significantly increase the error bars on cosmological parameters, in particular by 50% for w. The Halo Occupation Distribution (HOD) error bars are also affected, by between 12% and 39%. Accounting for super-sample covariance (SSC) further increases parameter errors, by 90% for w and between 7% and 64% for HOD. In total, non-Gaussianity increases the error bar on w by 120% (and between 15% and 80% for the other cosmological parameters), and the error bars on HOD parameters by between 17% and 85%. Accounting for the 1-halo trispectrum term on top of SSC is not sufficient to capture the full non-Gaussian impact: braiding and the rest of the in-survey covariance must be accounted for. Finally, I discuss why the inclusion of non-Gaussianity generally eases parameter degeneracies, making cosmological constraints more robust to astrophysical uncertainties. The data and a Python notebook reproducing the results and plots of the article are available at https://github.com/fabienlacasa/BraidingArticle. [Abridged]
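The abstract's point that the in-survey 2-, 3- and 4-halo terms produce matrices with negative eigenvalues on their own can be checked numerically: a valid covariance matrix must be positive semi-definite. The sketch below is purely illustrative (toy matrices with invented numbers, not the article's terms or code); it only demonstrates the eigenvalue test and how an off-diagonal term can be indefinite in isolation.

```python
import numpy as np

rng = np.random.default_rng(0)

def is_positive_semidefinite(cov, tol=1e-10):
    """A covariance matrix is valid only if all its eigenvalues are >= 0."""
    eigvals = np.linalg.eigvalsh(cov)  # eigenvalues of a symmetric matrix
    return bool(eigvals.min() >= -tol)

# Toy "Gaussian" term: diagonal, hence always positive definite.
gauss = np.diag(rng.uniform(1.0, 2.0, size=5))

# Toy off-diagonal term mimicking an in-survey contribution considered
# alone: a symmetrized outer product of two non-parallel vectors is
# indefinite (one positive and one negative eigenvalue).
v = rng.normal(size=5)
w = rng.normal(size=5)
in_survey = 0.5 * (np.outer(v, w) + np.outer(w, v))

print(is_positive_semidefinite(gauss))      # diagonal term alone is fine
print(is_positive_semidefinite(in_survey))  # indefinite when taken alone
```

In the same spirit, the article's finding is that the in-survey halo terms only combine into a valid covariance once braiding covariance is also included; the test above is the basic sanity check one would run on each combination.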



Related research

Photometric galaxy surveys probe the late-time Universe, where the density field is highly non-Gaussian. A consequence is the emergence of the super-sample covariance (SSC), a non-Gaussian covariance term that is sensitive to fluctuations on scales larger than the survey window. In this work, we study the impact of the survey geometry on the SSC and, subsequently, on cosmological parameter inference. We devise a fast SSC approximation that accounts for the survey geometry and compare its performance to the common approximation of rescaling the results by the fraction of the sky covered by the survey, $f_\mathrm{SKY}$, dubbed the full-sky approximation. To gauge the impact of our new SSC recipe, dubbed partial-sky, we perform Fisher forecasts on the parameters of the $(w_0,w_a)$-CDM model in a 3x2pt analysis, varying the survey area, the geometry of the mask and the galaxy distribution inside our redshift bins. The differences in the marginalised forecast errors, with the full-sky approximation performing poorly for small survey areas but excellently for stage-IV-like areas, are found to be absorbed by the marginalisation over galaxy bias nuisance parameters. For large survey areas, the unmarginalised errors are underestimated by about 10% for all probes considered. This is a hint that, even for stage-IV-like surveys, the partial-sky method introduced in this work will be necessary if tight priors are applied on these nuisance parameters.
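The full-sky shortcut discussed above can be sketched schematically. A common way to write the SSC between two data points is as a rank-1 term built from their responses to the background density, scaled by the variance of that background over the survey window; the full-sky approximation replaces the mask-dependent variance by a reference value rescaled by $1/f_\mathrm{SKY}$. Everything below (responses, variances, $f_\mathrm{SKY}$) is an invented toy illustration of that structure, not either paper's actual recipe.

```python
import numpy as np

def ssc_covariance(responses, sigma_b2):
    """Rank-1 SSC block: Cov[A, B] = (dC_A/d delta_b)(dC_B/d delta_b) sigma_b^2."""
    r = np.asarray(responses, dtype=float)
    return sigma_b2 * np.outer(r, r)

# Toy responses dC/d(delta_b) of three band powers (made-up numbers).
responses = np.array([2.0, 1.5, 1.0])

# "Full-sky" approximation: rescale a reference background variance by 1/f_sky.
sigma_b2_ref, f_sky = 1.0e-6, 0.36
cov_fullsky = ssc_covariance(responses, sigma_b2_ref / f_sky)

# A "partial-sky" recipe would instead compute sigma_b^2 from the actual
# mask; the value below is invented to show that the two can differ,
# especially for small or ragged footprints.
cov_partialsky = ssc_covariance(responses, 3.1e-6)
```

The rank-1 structure is why the two recipes differ only by the scalar $\sigma_b^2$ here; for multiple redshift bins the scalar becomes a matrix of window-mode covariances and the geometry dependence is harder to rescale away.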
We describe and test the fiducial covariance matrix model for the combined 2-point function analysis of the Dark Energy Survey Year 3 (DES-Y3) dataset. Using a variety of new ansatzes for covariance modelling and testing, we validate the assumptions and approximations of this model. These include the assumption of a Gaussian likelihood, the trispectrum contribution to the covariance, the impact of evaluating the model at a wrong set of parameters, the impact of masking and survey geometry, deviations from Poissonian shot noise, galaxy weighting schemes and other, sub-dominant effects. We find that our covariance model is robust and that its approximations have little impact on goodness-of-fit and parameter estimation. The largest impact on the best-fit figure-of-merit arises from the so-called $f_{\mathrm{sky}}$ approximation for dealing with finite survey area, which on average increases the $\chi^2$ between maximum posterior model and measurement by 3.7% ($\Delta \chi^2 \approx 18.9$). Standard methods to go beyond this approximation fail for DES-Y3, but we derive an approximate scheme to deal with these features. For parameter estimation, our ignorance of the exact parameters at which to evaluate our covariance model causes the dominant effect. We find that it increases the scatter of maximum posterior values for $\Omega_m$ and $\sigma_8$ by about 3% and for the dark energy equation of state parameter by about 5%.
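The two numbers quoted above fix the scale of the DES-Y3 data vector: a 3.7% relative increase corresponding to $\Delta \chi^2 \approx 18.9$ implies a baseline $\chi^2$ of roughly $18.9/0.037 \approx 511$, i.e. several hundred effective degrees of freedom. A trivial check of that arithmetic:

```python
# Arithmetic implied by the quoted 3.7% increase and Delta chi^2 ~ 18.9;
# the inferred baseline is our deduction, not a number stated by the paper.
delta_chi2 = 18.9
relative_increase = 0.037
baseline_chi2 = delta_chi2 / relative_increase  # ~ 511
```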
Upcoming weak lensing surveys will probe large fractions of the sky with unprecedented accuracy. To infer cosmological constraints, a large ensemble of survey simulations is required to accurately model cosmological observables and their covariances. We develop a parallelized multi-lens-plane pipeline called UFalcon, designed to generate full-sky weak lensing maps from lightcones within a minimal runtime. It makes use of L-PICOLA, an approximate numerical code which provides a fast and accurate alternative to cosmological $N$-body simulations. The UFalcon maps are constructed by nesting 2 simulations covering a redshift range from $z=0.1$ to $1.5$, without replicating the simulation volume. We compute the convergence and projected overdensity maps for L-PICOLA in the lightcone or snapshot mode. The generation of such a map, including the L-PICOLA simulation, takes about 3 hours of walltime on 220 cores. We use the maps to calculate the spherical harmonic power spectra, which we compare to theoretical predictions and to UFalcon results generated using the full $N$-body code GADGET-2. We then compute the covariance matrix of the full-sky spherical harmonic power spectra using 150 UFalcon maps based on L-PICOLA in lightcone mode. We consider the PDF, the higher-order moments and the variance of the smoothed field variance to quantify the accuracy of the covariance matrix, which we find to be a few percent for scales $\ell \sim 10^2$ to $10^3$. We test the impact of this level of accuracy on cosmological constraints using an optimistic survey configuration, and find that the final results are robust to this level of uncertainty. The speed and accuracy of our pipeline provide a basis to also include further important features such as masking and varying noise, and will allow us to compute covariance matrices for models beyond $\Lambda$CDM. [abridged]
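The last step described above, estimating a covariance matrix from an ensemble of simulated maps, boils down to a sample covariance over realisations of the measured spectra. The sketch below uses synthetic stand-in spectra (the UFalcon/L-PICOLA data are not reproduced here); only the ensemble size of 150 is taken from the abstract.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in: each row is the power spectrum measured on one of
# 150 simulated maps, in 20 multipole bins, scattered around a toy mean.
n_sims, n_ell = 150, 20
true_cl = 1.0 / (1.0 + np.arange(n_ell))
spectra = true_cl * (1.0 + 0.05 * rng.standard_normal((n_sims, n_ell)))

# Sample covariance across realisations; rowvar=False treats columns
# (multipole bins) as the variables and rows as observations.
cov_hat = np.cov(spectra, rowvar=False)
```

With only 150 realisations the estimate is itself noisy (per-element scatter of order $\sqrt{2/n_\mathrm{sims}}$, i.e. percent level), which is consistent with the few-percent accuracy the authors quote and why they test its downstream impact on constraints.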
Aims. We investigate the contribution of shot noise and sample variance to the uncertainty of cosmological parameter constraints inferred from cluster number counts, in the context of the Euclid survey. Methods. By analysing 1000 Euclid-like light-cones, produced with the PINOCCHIO approximate method, we validate the analytical model of Hu & Kravtsov (2003) for the covariance matrix, which takes into account both sources of statistical error. We then use this covariance to define the likelihood function that best extracts cosmological information from cluster number counts at the level of precision that will be reached by the future Euclid photometric catalogues of galaxy clusters. We also study the impact of the cosmology dependence of the covariance matrix on the parameter constraints. Results. The analytical covariance matrix reproduces the variance measured from simulations at the 10 per cent level; this difference has no sizeable effect on the errors of cosmological parameter constraints at this level of statistics. We also find that the Gaussian likelihood with cosmology-dependent covariance is the only model that provides an unbiased inference of cosmological parameters without underestimating the errors.
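The Hu & Kravtsov-style covariance validated above combines two statistical errors: Poisson shot noise on the diagonal and a sample-variance term correlating bins through the matter fluctuations in their volumes. A minimal sketch of that structure, with invented counts, biases and window variances (only the functional form is from the literature):

```python
import numpy as np

def counts_covariance(mean_counts, bias, S):
    """C_ij = delta_ij <N_i> + b_i b_j <N_i> <N_j> S_ij, where S_ij is the
    covariance of the window-averaged matter density between bins i and j."""
    n = np.asarray(mean_counts, dtype=float)
    b = np.asarray(bias, dtype=float)
    shot_noise = np.diag(n)                           # Poisson term
    sample_var = np.outer(b * n, b * n) * np.asarray(S)
    return shot_noise + sample_var

mean_counts = np.array([1000.0, 400.0, 80.0])  # toy counts per redshift bin
bias = np.array([3.0, 3.5, 4.0])               # toy effective cluster biases
S = 1e-6 * np.array([[2.0, 0.5, 0.1],
                     [0.5, 1.5, 0.3],
                     [0.1, 0.3, 1.0]])         # toy window-mode covariances

cov = counts_covariance(mean_counts, bias, S)
```

The shot-noise term dominates sparse high-mass bins, while the sample-variance term grows with the counts, which is why both must be modelled to reach Euclid-level precision.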
The thermal Sunyaev-Zeldovich (tSZ) effect is one of the primary tools for finding and characterizing galaxy clusters. Several ground-based experiments are either underway or being planned for mapping wide areas of the sky at $\sim 150$ GHz with large-aperture telescopes. We present cosmological forecasts for a straw-man tSZ survey that will observe a sky area between $200$ and $10^4$ deg$^2$ to an rms noise level between 2.8 and 20.2 $\mu$K-arcmin. The probes we consider are the cluster number counts (as a function of the integrated Compton-$Y$ parameter and redshift) and their angular clustering (as a function of redshift). At fixed observing time, we find that wider surveys constrain cosmology slightly better than deeper ones, due to their increased ability to detect rare high-mass clusters. In all cases, we notice that adding the clustering information does not practically improve the constraints derived from the number counts. We compare forecasts obtained by sampling the posterior distribution with the Markov chain Monte Carlo method against those derived using the Fisher-matrix formalism. We find that the latter produces slightly optimistic constraints, with errors underestimated at the 10 per cent level. Most importantly, we use an analytic method to estimate the selection function of the survey and account for its response to variations of the cosmological parameters in the likelihood function. Our analysis demonstrates that neglecting this effect (as routinely done in the literature) yields artificially tighter constraints, by a factor of 2.2 and 1.7 for $\sigma_8$ and $\Omega_\mathrm{M}$, respectively.
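The Fisher-matrix forecasts compared to MCMC above follow a standard recipe: for a Gaussian likelihood with parameter-independent covariance $C$, the Fisher matrix is $F_{ab} = \partial\mu/\partial p_a \cdot C^{-1} \cdot \partial\mu/\partial p_b$ and the marginalised $1\sigma$ error on $p_a$ is $\sqrt{(F^{-1})_{aa}}$. The sketch below implements only that generic formula with invented derivatives and covariance, not any of the surveys' actual data vectors.

```python
import numpy as np

def fisher_matrix(derivs, cov):
    """Fisher matrix F_ab = d(mu)/dp_a . C^{-1} . d(mu)/dp_b.

    derivs: (n_params, n_data) array of model derivatives d(mu)/dp_a.
    cov:    (n_data, n_data) data covariance matrix.
    """
    cinv = np.linalg.inv(cov)
    return derivs @ cinv @ derivs.T

# Toy two-parameter model (think sigma_8 and Omega_M) with 4 data points;
# all numbers are made up for illustration.
derivs = np.array([[1.0, 0.8, 0.5, 0.2],
                   [0.3, 0.6, 0.9, 1.1]])
cov = np.diag([0.04, 0.05, 0.06, 0.08])

F = fisher_matrix(derivs, cov)
marg_errors = np.sqrt(np.diag(np.linalg.inv(F)))  # marginalised 1-sigma errors
```

Because the Fisher matrix assumes a Gaussian posterior around the fiducial point, it tends to be optimistic for skewed or curved posteriors, consistent with the ~10 per cent underestimate the authors report against MCMC.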
