
Constraining Neutrino Mass with the Tomographic Weak Lensing Bispectrum

 Added by William Coulton
 Publication date 2018
  fields Physics
 language English





We explore the effect of massive neutrinos on the weak lensing shear bispectrum using the Cosmological Massive Neutrino Simulations. We find that the primary effect of massive neutrinos is to suppress the amplitude of the bispectrum, with limited effect on the bispectrum shape. The suppression of the bispectrum amplitude is a factor of two greater than the suppression of the small-scale power spectrum. For an LSST-like weak lensing survey that observes half of the sky with five tomographic redshift bins, we explore the constraining power of the bispectrum on three cosmological parameters: the sum of the neutrino masses $\sum m_\nu$, the matter density $\Omega_m$ and the amplitude of primordial fluctuations $A_s$. Bispectrum measurements alone provide constraints similar to the power spectrum measurements, and combining the two probes leads to significant improvements over using the latter alone. We find that the joint constraints tighten the power spectrum $95\%$ constraints by $\sim 32\%$ for $\sum m_\nu$, $13\%$ for $\Omega_m$ and $57\%$ for $A_s$.
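To illustrate how combining two probes tightens constraints in this kind of forecast, the following Python sketch adds two Fisher matrices and computes the fractional improvement of the marginalized errors. The matrices and numbers are placeholders for illustration, not values from the paper:

```python
import numpy as np

# Hypothetical Fisher matrices for (sum m_nu, Omega_m, A_s):
# illustrative placeholders, not taken from the paper.
F_ps = np.diag([1.0, 4.0, 0.5])   # power-spectrum-only Fisher matrix (toy)
F_bs = np.diag([1.2, 3.5, 0.6])   # bispectrum-only Fisher matrix (toy)

def marginalized_sigma(F):
    """1-sigma marginalized errors: sqrt of the diagonal of F^{-1}."""
    return np.sqrt(np.diag(np.linalg.inv(F)))

sigma_ps = marginalized_sigma(F_ps)
# For independent probes, Fisher information adds before inversion.
sigma_joint = marginalized_sigma(F_ps + F_bs)

# Fractional tightening of each parameter's constraint from adding the bispectrum
improvement = 1.0 - sigma_joint / sigma_ps
print(improvement)
```

Because the information matrices add before inversion, the joint marginalized error is always at least as tight as either probe alone.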



Related research

We forecast and optimize the cosmological power of various weak-lensing aperture mass ($M_{\rm ap}$) map statistics for future cosmic shear surveys, including peaks, voids, and the full distribution of pixels (1D $M_{\rm ap}$). These alternative methods probe the non-Gaussian regime of the matter distribution, adding complementary cosmological information to the classical two-point estimators. Based on the SLICS and cosmo-SLICS $N$-body simulations, we build Euclid-like mocks to explore the $S_8 - \Omega_{\rm m} - w_0$ parameter space. We develop a new tomographic formalism which exploits the cross-information between redshift slices (cross-$M_{\rm ap}$) in addition to the information from individual slices (auto-$M_{\rm ap}$) probed in the standard approach. Our auto-$M_{\rm ap}$ forecast precision is in good agreement with the recent literature on weak-lensing peak statistics, and is improved by $\sim 50\%$ when including cross-$M_{\rm ap}$. It is further boosted by the use of 1D $M_{\rm ap}$, which outperforms all other estimators, including the shear two-point correlation function ($\gamma$-2PCF). When considering all tomographic terms, our uncertainty on the structure growth parameter $S_8$ is tightened by $\sim 45\%$ (almost a factor of two) when combining 1D $M_{\rm ap}$ and the $\gamma$-2PCF compared to the $\gamma$-2PCF alone. We additionally measure the first combined forecasts on the dark energy equation of state $w_0$, finding a factor of three reduction of the statistical error compared to the $\gamma$-2PCF alone. This demonstrates that the complementary cosmological information explored by non-Gaussian $M_{\rm ap}$ map statistics not only offers the potential to improve the constraints on the recent $\sigma_8$ - $\Omega_{\rm m}$ tension, but also constitutes an avenue to understand the accelerated expansion of our Universe.
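The three map statistics named above can be sketched in a few lines. This is a minimal illustration on a Gaussian mock map (a real analysis would build $M_{\rm ap}$ maps from shear catalogues); the map size, neighborhood size, and bin count are arbitrary choices:

```python
import numpy as np
from scipy.ndimage import maximum_filter, minimum_filter

rng = np.random.default_rng(0)
m_ap = rng.normal(size=(128, 128))  # mock aperture-mass map (stand-in for real data)

# Peaks: pixels that are local maxima in a 3x3 neighborhood
peaks = m_ap == maximum_filter(m_ap, size=3)
# Voids: local minima, probing underdense lines of sight
voids = m_ap == minimum_filter(m_ap, size=3)
# 1D M_ap: the full distribution of pixel values, here as a histogram
pdf, edges = np.histogram(m_ap, bins=30)

print(peaks.sum(), voids.sum(), pdf.sum())
```

The 1D $M_{\rm ap}$ histogram contains the peak and void pixels as part of the full pixel distribution, which is one intuition for why it can outperform either statistic alone.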
Massive neutrinos influence the background evolution of the Universe as well as the growth of structure. Being able to model this effect and constrain the sum of their masses is one of the key challenges in modern cosmology. Weak-lensing cosmological constraints will also soon reach higher levels of precision with next-generation surveys like LSST, WFIRST and Euclid. We use the MassiveNuS simulations to derive constraints on the sum of neutrino masses $M_\nu$, the present-day total matter density $\Omega_{\rm m}$, and the primordial power spectrum normalization $A_{\rm s}$ in a tomographic setting. We measure the lensing power spectrum as second-order statistics along with peak counts as higher-order statistics on lensing convergence maps generated from the simulations. We investigate the impact of multiscale filtering approaches on cosmological parameters by employing a starlet (wavelet) filter and a concatenation of Gaussian filters. In both cases peak counts perform better than the power spectrum on the set of parameters [$M_\nu$, $\Omega_{\rm m}$, $A_{\rm s}$], respectively by $63\%$, $40\%$ and $72\%$ when using a starlet filter and by $70\%$, $40\%$ and $77\%$ when using a multiscale Gaussian. More importantly, we show that when using a multiscale approach, joining power spectrum and peaks does not add any relevant information over considering just the peaks alone. While both multiscale filters behave similarly, we find that with the starlet filter the majority of the information in the data covariance matrix is encoded in the diagonal elements; this can be an advantage when inverting the matrix, speeding up the numerical implementation.
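The multiscale-Gaussian peak-count data vector described above can be sketched as follows. This is a toy version on a Gaussian random map: the smoothing scales, neighborhood size, and bin count are illustrative choices, not the paper's settings:

```python
import numpy as np
from scipy.ndimage import gaussian_filter, maximum_filter

rng = np.random.default_rng(1)
kappa = rng.normal(size=(256, 256))  # mock convergence map (stand-in for a simulation output)

def peak_counts(field, nbins=20):
    """Histogram of local-maximum heights in a 3x3 neighborhood."""
    is_peak = field == maximum_filter(field, size=3)
    counts, _ = np.histogram(field[is_peak], bins=nbins)
    return counts

# Multiscale Gaussian approach: concatenate peak counts measured at
# several smoothing scales into a single data vector.
scales_pix = [1, 2, 4, 8]  # smoothing scales in pixels (illustrative)
data_vector = np.concatenate([peak_counts(gaussian_filter(kappa, s)) for s in scales_pix])
print(data_vector.shape)
```

A starlet filter would replace `gaussian_filter` with a wavelet decomposition, but the concatenation into one data vector is the same idea.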
We address key points for an efficient implementation of likelihood codes for modern weak lensing large-scale structure surveys. Specifically, we focus on the joint weak lensing convergence power spectrum-bispectrum probe and we tackle the numerical challenges required by a realistic analysis. Under the assumption of (multivariate) Gaussian likelihoods, we have developed a high performance code that allows highly parallelised prediction of the binned tomographic observables and of their joint non-Gaussian covariance matrix, accounting for terms up to the 6-point correlation function and super-sample effects. This performance allows us to qualitatively address several interesting scientific questions. We find that the bispectrum provides an improvement in terms of signal-to-noise ratio (S/N) of about 10% on top of the power spectrum, making it a non-negligible source of information for future surveys. Furthermore, we are able to test the impact of theoretical uncertainties in the halo model used to build our observables; with presently allowed variations we conclude that the impact is negligible on the S/N. Finally, we consider data compression possibilities to optimise future analyses of the weak lensing bispectrum. We find that, ignoring systematics, 5 equipopulated redshift bins are enough to recover the information content of a Euclid-like survey, with negligible improvement when increasing to 10 bins. We also explore principal component analysis and dependence on the triangle shapes as ways to reduce the numerical complexity of the problem.
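The multivariate Gaussian likelihood assumed above has a simple core, sketched here for a generic joint data vector. The data vector, model, and covariance are synthetic placeholders; a real pipeline would fill them with binned power spectrum and bispectrum predictions:

```python
import numpy as np

def gaussian_loglike(data, model, cov):
    """Multivariate Gaussian log-likelihood (up to an additive constant)
    for a joint data vector, e.g. binned power spectrum + bispectrum."""
    r = data - model
    # Solve a linear system instead of explicitly inverting the covariance:
    # numerically more stable for large, non-diagonal matrices.
    return -0.5 * r @ np.linalg.solve(cov, r)

# Toy joint data vector with a non-diagonal covariance (stand-in values)
rng = np.random.default_rng(2)
n = 12
cov = np.eye(n) + 0.1 * np.ones((n, n))
model = np.ones(n)
data = model + rng.multivariate_normal(np.zeros(n), cov)
print(gaussian_loglike(data, model, cov))
```

In a realistic analysis the covariance itself depends on higher-order terms (up to the 6-point function and super-sample effects, as the abstract notes), which is where the numerical cost concentrates.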
Recent studies have demonstrated that secondary non-Gaussianity induced by gravity will be detected with a high signal-to-noise (S/N) by future and even by ongoing weak lensing surveys. One way to characterise such non-Gaussianity is through the detection of a non-zero three-point correlation function of the lensing convergence field, or of its harmonic transform, the bispectrum. A recent study analysed the properties of the squeezed configuration of the bispectrum, in which two wavenumbers are much larger than the third. We extend this work by estimating the amplitude of the (reduced) bispectrum in four generic configurations, i.e., squeezed, equilateral, isosceles and folded, and for four different source redshifts $z_s=0.5, 1.0, 1.5, 2.0$, using an ensemble of all-sky high-resolution simulations. We compare these results against theoretical predictions. We find that, while the theoretical expectations based on widely used fitting functions can predict the general trends of the reduced bispectra, a more accurate theoretical modelling will be required to analyse the next generation of all-sky weak lensing surveys. The disagreement is particularly pronounced in the squeezed limit.
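The four triangle configurations named above can be distinguished by simple geometric conditions on the multipoles $(\ell_1, \ell_2, \ell_3)$. The thresholds below are illustrative choices, not the paper's definitions:

```python
def classify_triangle(l1, l2, l3, squeeze_ratio=0.1, tol=0.05):
    """Rough classifier for bispectrum triangle configurations.
    Thresholds are illustrative and not taken from the paper."""
    lo, mid, hi = sorted([l1, l2, l3])
    if lo < squeeze_ratio * hi:
        return "squeezed"      # one side much shorter than the other two
    if abs(hi - lo) < tol * hi:
        return "equilateral"   # all sides comparable
    if abs(lo + mid - hi) < tol * hi:
        return "folded"        # degenerate triangle: largest side ~ sum of the others
    if abs(lo - mid) < tol * hi or abs(mid - hi) < tol * hi:
        return "isosceles"     # exactly two sides comparable
    return "generic"

print(classify_triangle(50, 1000, 1000))   # squeezed
print(classify_triangle(500, 500, 500))    # equilateral
print(classify_triangle(500, 500, 1000))   # folded
```

The squeezed limit, where the fitting functions disagree most, couples a long-wavelength mode to two short-wavelength modes.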
Upcoming surveys such as LSST and Euclid will significantly improve the power of weak lensing as a cosmological probe. To maximise the information that can be extracted from these surveys, it is important to explore novel statistics that complement standard weak lensing statistics such as the shear-shear correlation function and peak counts. In this work, we use a recently proposed weak lensing observable -- weak lensing voids -- to make parameter constraint forecasts for an LSST-like survey. We use the cosmo-SLICS $w$CDM simulation suite to measure void statistics as a function of cosmological parameters. The simulation data is used to train a Gaussian process regression emulator that we use to generate likelihood contours and provide parameter constraints from mock observations. We find that the void abundance is more constraining than the tangential shear profiles, though the combination of the two gives additional constraining power. We forecast that without tomographic decomposition, these void statistics can constrain the matter fluctuation amplitude $S_8$ within 0.3% (68% confidence interval), while offering 1.5, 1.5 and 2.7% precision on the matter density parameter $\Omega_{\rm m}$, the reduced Hubble constant $h$, and the dark energy equation of state parameter $w_0$, respectively. These results are tighter than the constraints from the shear-shear correlation function with the same observational specifications for $\Omega_{\rm m}$, $S_8$ and $w_0$. The constraints from the WL voids also have complementary parameter degeneracy directions to the shear 2PCF for all combinations of parameters that include $h$, making weak lensing void statistics a promising cosmological probe.
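The Gaussian process emulation step described above can be sketched with a hand-rolled radial-basis-function regression. Everything here is a toy: a single parameter standing in for $S_8$, a synthetic scalar summary in place of measured void statistics, and an arbitrary kernel length scale:

```python
import numpy as np

def rbf_kernel(a, b, length=0.1):
    """Squared-exponential kernel between 1D parameter samples."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length ** 2)

# Mock training set: a toy void-abundance summary evaluated at a few
# parameter nodes, mimicking an emulator trained on simulation outputs.
theta_train = np.linspace(0.7, 0.9, 6)     # hypothetical S_8-like nodes
y_train = 100.0 * theta_train ** 2         # synthetic summary statistic

# GP mean prediction: y* = K(*,X) [K(X,X) + jitter*I]^{-1} y
K = rbf_kernel(theta_train, theta_train) + 1e-8 * np.eye(theta_train.size)
alpha = np.linalg.solve(K, y_train)

theta_star = np.array([0.8])
y_star = rbf_kernel(theta_star, theta_train) @ alpha
print(y_star)  # expect a value near 100 * 0.8**2 = 64
```

Once trained, such an emulator replaces expensive simulations inside the likelihood evaluation, which is what makes the parameter-space exploration tractable.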