
Probing Cosmology with Weak Lensing Peak Counts

Added by Zoltan Haiman
Publication date: 2009
Field: Physics
Language: English





We propose counting peaks in weak lensing (WL) maps, as a function of their height, to probe models of dark energy and to constrain cosmological parameters. Because peaks can be identified in two-dimensional WL maps directly, they can provide constraints that are free from potential selection effects and biases involved in identifying and determining the masses of galaxy clusters. We have run cosmological N-body simulations to produce WL convergence maps in three models with different constant values of the dark energy equation of state parameter, w=-0.8, -1, and -1.2, with a fixed normalization of the primordial power spectrum (corresponding to present-day normalizations of sigma8=0.742, 0.798, and 0.839, respectively). By comparing the number of WL peaks in 8 convergence bins in the range -0.1 < kappa < 0.2, in multiple realizations of a single simulated 3x3 degree field, we show that the first (last) pair of models can be distinguished at the 95% (85%) confidence level. A survey with a depth and area (20,000 sq. degrees) comparable to those expected from LSST should have a factor of approx. 50 better parameter sensitivity. We find that relatively low-amplitude peaks (kappa = 0.03), which typically do not correspond to a single collapsed halo along the line of sight, account for most of this sensitivity. We study a range of smoothing scales and source galaxy redshifts (z_s). With a fixed source galaxy density of 15/arcmin^2, the best results are provided by the smallest scale we can reliably simulate, 1 arcminute, and z_s=2 provides substantially better sensitivity than z_s < 1.5.
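
A minimal sketch of the counting procedure described above, not the paper's actual pipeline: locate local maxima in a smoothed convergence map and histogram them by height into 8 bins over -0.1 < kappa < 0.2. The function name, pixel scale, smoothing scale, and the noise-only test map are illustrative assumptions.

```python
# Sketch only: peak counts of a 2D convergence (kappa) map as a function of height.
import numpy as np
from scipy.ndimage import gaussian_filter, maximum_filter

def count_peaks(kappa_map, pixel_arcmin=0.2, smoothing_arcmin=1.0,
                bins=np.linspace(-0.1, 0.2, 9)):
    """Return peak counts per convergence bin for a 2D kappa map."""
    # Smooth with a Gaussian kernel of the requested angular scale.
    smoothed = gaussian_filter(kappa_map, sigma=smoothing_arcmin / pixel_arcmin)
    # A pixel is a peak if it equals the maximum over its 3x3 neighbourhood.
    is_peak = smoothed == maximum_filter(smoothed, size=3)
    counts, _ = np.histogram(smoothed[is_peak], bins=bins)
    return counts

# Example: a 3x3 degree field sampled at 0.2 arcmin per pixel (900x900 pixels),
# here filled with Gaussian noise as a stand-in for a ray-traced map.
rng = np.random.default_rng(0)
print(count_peaks(rng.normal(scale=0.02, size=(900, 900))))
```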




Related research

Massive neutrinos influence the background evolution of the Universe as well as the growth of structure. Being able to model this effect and constrain the sum of their masses is one of the key challenges in modern cosmology. Weak-lensing cosmological constraints will also soon reach higher levels of precision with next-generation surveys like LSST, WFIRST and Euclid. We use the MassiveNus simulations to derive constraints on the sum of neutrino masses $M_{\nu}$, the present-day total matter density $\Omega_{\rm m}$, and the primordial power spectrum normalization $A_{\rm s}$ in a tomographic setting. We measure the lensing power spectrum as a second-order statistic along with peak counts as a higher-order statistic on lensing convergence maps generated from the simulations. We investigate the impact of multiscale filtering approaches on cosmological parameters by employing a starlet (wavelet) filter and a concatenation of Gaussian filters. In both cases peak counts perform better than the power spectrum on the set of parameters [$M_{\nu}$, $\Omega_{\rm m}$, $A_{\rm s}$], by 63%, 40% and 72% respectively when using a starlet filter, and by 70%, 40% and 77% when using a multiscale Gaussian. More importantly, we show that when using a multiscale approach, joining the power spectrum and peaks does not add any relevant information over considering just the peaks alone. While both multiscale filters behave similarly, we find that with the starlet filter the majority of the information in the data covariance matrix is encoded in the diagonal elements; this can be an advantage when inverting the matrix, speeding up the numerical implementation.
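
A minimal sketch of the multiscale idea just described: smooth the same convergence map with Gaussian kernels of several widths, count peaks at each scale, and concatenate the histograms into one data vector (a starlet decomposition would replace the Gaussian kernels). The scales and bin edges below are placeholders, not the values used in the analysis.

```python
# Sketch only: concatenated peak-count vector over several Gaussian smoothing scales.
import numpy as np
from scipy.ndimage import gaussian_filter, maximum_filter

def multiscale_peak_vector(kappa_map, pixel_arcmin,
                           scales_arcmin=(2.0, 4.0, 8.0, 16.0),
                           bins=np.linspace(-0.05, 0.15, 21)):
    """Peak-count histograms at each smoothing scale, concatenated."""
    pieces = []
    for scale in scales_arcmin:
        smoothed = gaussian_filter(kappa_map, sigma=scale / pixel_arcmin)
        peaks = smoothed[smoothed == maximum_filter(smoothed, size=3)]
        counts, _ = np.histogram(peaks, bins=bins)
        pieces.append(counts)
    return np.concatenate(pieces)
```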
216 - Jan M. Kratochvil 2011
In this paper, we show that Minkowski Functionals (MFs) of weak gravitational lensing (WL) convergence maps contain significant non-Gaussian, cosmology-dependent information. To do this, we use a large suite of cosmological ray-tracing N-body simulations to create mock WL convergence maps, and study the cosmological information content of MFs derived from these maps. Our suite consists of 80 independent 512^3 N-body runs, covering seven different cosmologies, varying three cosmological parameters Omega_m, w, and sigma_8 one at a time, around a fiducial LambdaCDM model. In each cosmology, we use ray-tracing to create a thousand pseudo-independent 12 deg^2 convergence maps, and use these in a Monte Carlo procedure to estimate the joint confidence contours on the above three parameters. We include redshift tomography at three different source redshifts z_s=1, 1.5, 2, explore five different smoothing scales theta_G=1, 2, 3, 5, 10 arcmin, and explicitly compare and combine the MFs with the WL power spectrum. We find that the MFs capture a substantial amount of information from non-Gaussian features of convergence maps, i.e. beyond the power spectrum. The MFs are particularly well suited to break degeneracies and to constrain the dark energy equation of state parameter w (by a factor of ~ three better than from the power spectrum alone). The non-Gaussian information derives partly from the one-point function of the convergence (through V_0, the area MF), and partly through non-linear spatial information (through combining different smoothing scales for V_0, and through V_1 and V_2, the boundary length and genus MFs, respectively). In contrast to the power spectrum, the best constraints from the MFs are obtained only when multiple smoothing scales are combined.
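
The excursion-set picture behind the Minkowski functionals can be sketched as follows: threshold the convergence map at a series of levels and measure, for each level, the area fraction (V_0), the boundary length (V_1) and the Euler characteristic (V_2) of the excursion set. The snippet assumes scikit-image is available; the normalisations are schematic rather than the paper's conventions.

```python
# Sketch only: 2D Minkowski functionals V0, V1, V2 of a convergence map.
import numpy as np
from skimage.measure import perimeter, euler_number

def minkowski_functionals(kappa_map, thresholds, pixel_arcmin=1.0):
    total_area = kappa_map.size * pixel_arcmin**2   # map area in arcmin^2
    v0, v1, v2 = [], [], []
    for nu in thresholds:
        excursion = kappa_map >= nu                 # excursion set at this level
        v0.append(excursion.mean())                                      # area fraction
        v1.append(perimeter(excursion) * pixel_arcmin / total_area)      # boundary length per area
        v2.append(euler_number(excursion, connectivity=2) / total_area)  # Euler characteristic per area
    return np.array(v0), np.array(v1), np.array(v2)
```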
The statistics of peaks in weak lensing convergence maps is a promising tool to investigate both the properties of dark matter haloes and to constrain the cosmological parameters. We study how the number of detectable peaks and its scaling with redshift depend upon the cluster dark matter halo profiles, and use peak statistics to constrain the parameters of the mass-concentration (MC) relation. We investigate which constraints the Euclid mission can set on the MC coefficients, also taking into account degeneracies with the cosmological parameters. To this end, we first estimate the number of peaks and its redshift distribution for different MC relations. We find that the steeper the mass dependence and the larger the normalisation, the higher is the number of detectable clusters, with the total number of peaks changing by up to $40\%$ depending on the MC relation. We then perform a Fisher matrix forecast of the errors on the MC relation parameters as well as the cosmological parameters. We find that peak number counts detected by Euclid can determine the normalization $A_v$, the mass slope $B_v$, the redshift slope $C_v$, and the intrinsic scatter $\sigma_v$ of the MC relation to unprecedented accuracy: $\sigma(A_v)/A_v = 1\%$, $\sigma(B_v)/B_v = 4\%$, $\sigma(C_v)/C_v = 9\%$, and $\sigma(\sigma_v)/\sigma_v = 1\%$ if all cosmological parameters are assumed to be known. Should we relax this severe assumption, the constraints are degraded, but remarkably good results can be restored by fixing only some of the parameters or by combining peak counts with Planck data. This precision can give insight into competing scenarios of structure formation and evolution and into the role of baryons in cluster assembly. Alternatively, for a fixed MC relation, future peak counts can perform as well as current BAO and SNeIa when combined with Planck.
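
A minimal sketch of the Fisher-matrix forecast step described above, assuming a Gaussian likelihood with a parameter-independent covariance. The `model_counts` callable is a hypothetical stand-in for the theoretical prediction of the binned peak counts as a function of the MC-relation (and cosmological) parameters, which is not reproduced here.

```python
# Sketch only: Fisher matrix from binned peak counts via central differences.
import numpy as np

def fisher_matrix(model_counts, theta0, cov, rel_step=1e-3):
    """F_ij = (d mu / d theta_i) C^-1 (d mu / d theta_j)."""
    theta0 = np.asarray(theta0, dtype=float)
    inv_cov = np.linalg.inv(cov)
    derivs = []
    for i in range(theta0.size):
        dp = np.zeros_like(theta0)
        dp[i] = rel_step * max(abs(theta0[i]), 1.0)
        derivs.append((model_counts(theta0 + dp) - model_counts(theta0 - dp)) / (2 * dp[i]))
    derivs = np.asarray(derivs)              # shape (n_params, n_bins)
    return derivs @ inv_cov @ derivs.T       # shape (n_params, n_params)

# Marginalised 1-sigma errors are sqrt(diag(F^-1)); fixing parameters amounts
# to deleting the corresponding rows/columns of F before inverting.
```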
505 - Jia Liu 2014
Lensing peaks have been proposed as a useful statistic, containing cosmological information from non-Gaussianities that is inaccessible to traditional two-point statistics such as the power spectrum or two-point correlation functions. Here we examine constraints on cosmological parameters from weak lensing peak counts, using the publicly available data from the 154 deg$^2$ CFHTLenS survey. We utilize a new suite of ray-tracing N-body simulations on a grid of 91 cosmological models, covering broad ranges of the three parameters $\Omega_m$, $\sigma_8$, and $w$, and replicating the galaxy sky positions, redshifts, and shape noise in the CFHTLenS observations. We then build an emulator that interpolates the power spectrum and the peak counts to an accuracy of $\leq 5\%$, and compute the likelihood in the three-dimensional parameter space ($\Omega_m$, $\sigma_8$, $w$) from both observables. We find that constraints from peak counts are comparable to those from the power spectrum, and somewhat tighter when different smoothing scales are combined. Neither observable can constrain $w$ without external data. When the power spectrum and peak counts are combined, the area of the error banana in the ($\Omega_m$, $\sigma_8$) plane is reduced by a factor of $\approx 2$ compared to using the power spectrum alone. For a flat $\Lambda$ cold dark matter model, combining both statistics, we obtain the constraint $\sigma_8(\Omega_m/0.27)^{0.63}=0.85^{+0.03}_{-0.03}$.
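
A minimal sketch of the emulator-plus-likelihood step: interpolate the simulated peak-count vectors across the grid of cosmological models and evaluate a Gaussian likelihood against the observed counts. The radial-basis interpolator and the argument names are illustrative assumptions, not the emulator actually built in the paper.

```python
# Sketch only: interpolate peak counts over (Omega_m, sigma_8, w) and evaluate a likelihood.
import numpy as np
from scipy.interpolate import RBFInterpolator

def build_emulator(params_grid, peak_counts_grid):
    """params_grid: (n_models, 3) values of (Omega_m, sigma_8, w);
    peak_counts_grid: (n_models, n_bins) simulated peak-count vectors."""
    return RBFInterpolator(params_grid, peak_counts_grid)

def log_likelihood(theta, emulator, observed_counts, inv_cov):
    """Gaussian log-likelihood of the observed peak counts at parameters theta."""
    model = emulator(np.atleast_2d(theta))[0]
    resid = observed_counts - model
    return -0.5 * resid @ inv_cov @ resid
```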
We develop and apply an analytic method to predict peak counts in weak-lensing surveys. It is based on the theory of Gaussian random fields and suitable to quantify the level of spurious detections caused by chance projections of large-scale structures as well as the shape and shot noise contributed by the background galaxies. We compare our method to peak counts obtained from numerical ray-tracing simulations and find good agreement at the expected level. The number of peak detections depends substantially on the shape and size of the filter applied to the gravitational shear field. Our main results are that weak-lensing peak counts are dominated by spurious detections up to signal-to-noise ratios of 3--5 and that most filters yield only a few detections per square degree above this level, while a filter optimised for suppressing large-scale structure noise returns up to an order of magnitude more.
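
In the spirit of the spurious-detection estimate discussed above (though as a Monte Carlo stand-in rather than the analytic Gaussian-random-field calculation), the sketch below generates a shape-noise-only map, applies a Gaussian filter, and counts local maxima per square degree above each signal-to-noise threshold. Survey parameters and the per-pixel noise normalisation are illustrative assumptions, and large-scale structure noise is not included.

```python
# Sketch only: spurious peak density from pure shape noise, per square degree.
import numpy as np
from scipy.ndimage import gaussian_filter, maximum_filter

def spurious_peak_density(field_deg=10.0, pixel_arcmin=0.5, n_gal_arcmin2=30.0,
                          sigma_eps=0.3, smoothing_arcmin=1.0,
                          snr_thresholds=(3.0, 4.0, 5.0), seed=0):
    n_pix = int(field_deg * 60 / pixel_arcmin)
    # Per-pixel shape noise from the number of galaxies per pixel (assumed convention).
    sigma_pix = sigma_eps / np.sqrt(n_gal_arcmin2 * pixel_arcmin**2)
    noise = np.random.default_rng(seed).normal(scale=sigma_pix, size=(n_pix, n_pix))
    snr_map = gaussian_filter(noise, sigma=smoothing_arcmin / pixel_arcmin)
    snr_map /= snr_map.std()                      # express the map in units of its rms
    peaks = snr_map[snr_map == maximum_filter(snr_map, size=3)]
    return {nu: float((peaks > nu).sum()) / field_deg**2 for nu in snr_thresholds}

print(spurious_peak_density())
```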