
Cosmological inference from standard sirens without redshift measurements

Added by Xuheng Ding
Publication date: 2018
Field: Physics
Language: English





The purpose of this work is to investigate the prospects of using future standard siren data without redshift measurements to constrain cosmological parameters. With the successful detections of gravitational-wave (GW) signals, an era of GW astronomy has begun. Unlike the electromagnetic domain, GW signals allow direct measurements of luminosity distances to the sources, while their redshifts must still be obtained by identifying electromagnetic counterparts. This poses significant technical problems for almost all possible BH-BH systems and is the major obstacle to cosmological applications of GW standard sirens. In this paper, we introduce a general framework for using luminosity distances alone for cosmological inference. The idea is to use prior knowledge of the redshift probability distribution of coalescing sources, derived from the intrinsic merger rates assessed with population synthesis codes; posterior probability distributions for the cosmological parameters can then be calculated. We demonstrate the performance of our method on simulated mock data and show that luminosity distance measurements alone would enable a determination of cosmological parameters at up to the $20\%$ uncertainty level. We also find that, in order to infer $H_0$ to the $1\%$ level in the flat $\Lambda$CDM model, about $10^5$ events are needed.
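The idea above (replace per-event redshifts with a population-level redshift prior and marginalize over it) can be illustrated with a toy calculation. Everything below, the Hubble-law distance, the discretized redshift prior, the $10\%$ distance errors and the grids, is an invented stand-in for the paper's population-synthesis prior and full cosmological model:

```python
import math
import random

random.seed(1)

C = 299792.458  # speed of light in km/s

def dl(z, h0):
    # Low-redshift Hubble-law approximation d_L ~ c z / H0 (in Mpc);
    # a stand-in for the full cosmological distance used in the paper.
    return C * z / h0

# Invented, discretized redshift prior for coalescing sources (a stand-in
# for the population-synthesis merger-rate prior), peaking near z ~ 0.05.
z_grid = [0.01 * (i + 1) for i in range(15)]
pz = [z * math.exp(-z / 0.05) for z in z_grid]
norm = sum(pz)
pz = [p / norm for p in pz]

def draw_z():
    u, acc = random.random(), 0.0
    for z, p in zip(z_grid, pz):
        acc += p
        if u <= acc:
            return z
    return z_grid[-1]

# Mock catalogue: redshifts from the prior, distances scattered by 10%.
TRUE_H0 = 70.0
events = []
for _ in range(500):
    z = draw_z()
    d_true = dl(z, TRUE_H0)
    sigma = 0.1 * d_true  # assumed known to the analyst in this toy
    events.append((random.gauss(d_true, sigma), sigma))

# Posterior over H0: per event, marginalize the Gaussian distance
# likelihood over the redshift prior, then combine events in log space.
h0_grid = [60.0 + 0.5 * i for i in range(41)]
logpost = []
for h0 in h0_grid:
    lp = 0.0
    for d_obs, sigma in events:
        like = sum(p * math.exp(-0.5 * ((d_obs - dl(z, h0)) / sigma) ** 2)
                   for z, p in zip(z_grid, pz))
        lp += math.log(like + 1e-300)
    logpost.append(lp)

best = h0_grid[logpost.index(max(logpost))]
print(f"recovered H0 = {best:.1f} km/s/Mpc (true {TRUE_H0})")
```

The point of the marginalization is that although no single event pins down its own redshift, the known shape of the redshift distribution fixes the overall distance scale, so $H_0$ becomes identifiable as the number of events grows.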



Related research

Multi-messenger observations of binary neutron star mergers offer a promising path towards resolution of the Hubble constant ($H_0$) tension, provided their constraints are shown to be free from systematics such as the Malmquist bias. In the traditional Bayesian framework, accounting for selection effects in the likelihood requires calculation of the expected number (or fraction) of detections as a function of the parameters describing the population and cosmology; a potentially costly and/or inaccurate process. This calculation can, however, be bypassed completely by performing the inference in a framework in which the likelihood is never explicitly calculated, but instead fit using forward simulations of the data, which naturally include the selection. This is Likelihood-Free Inference (LFI). Here, we use density-estimation LFI, coupled to neural-network-based data compression, to infer $H_0$ from mock catalogues of binary neutron star mergers, given noisy redshift, distance and peculiar velocity estimates for each object. We demonstrate that LFI yields statistically unbiased estimates of $H_0$ in the presence of selection effects, with precision matching that of sampling the full Bayesian hierarchical model. Marginalizing over the bias increases the $H_0$ uncertainty by only $6\%$ for training sets consisting of $O(10^4)$ populations. The resulting LFI framework is applicable to population-level inference problems with selection effects across astrophysics.
The observation of the binary neutron star merger GW170817, along with its optical counterpart, provided the first constraint on the Hubble constant $H_0$ using gravitational-wave standard sirens. When no counterpart is identified, a galaxy catalog can be used to provide the necessary redshift information. However, the true host might not be contained in the catalog, which may be incomplete out to the limit of gravitational-wave detectability. These electromagnetic and gravitational-wave selection effects must be accounted for. We describe and implement a method to estimate $H_0$ using both the counterpart and the galaxy catalog standard siren methods. We perform a series of mock data analyses using binary neutron star mergers to confirm our ability to recover an unbiased estimate of $H_0$. Our simulations use a simplified universe with no redshift uncertainties or galaxy clustering, but with different magnitude-limited catalogs and assumed host galaxy properties, to test our treatment of both selection effects. We explore how the incompleteness of catalogs affects the final measurement of $H_0$, as well as the effect of weighting each galaxy's likelihood of being a host by its luminosity. In our most realistic simulation, where the simulated catalog is about three times denser than the density of galaxies in the local universe, we find that a $4.4\%$ measurement precision can be reached using galaxy catalogs with $50\%$ completeness and $\sim 250$ binary neutron star detections with sensitivity similar to that of Advanced LIGO's second observing run.
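A toy version of the galaxy-catalog branch of such a method can be written in a few lines. The catalogue entries, the Hubble-law distance and the Gaussian distance likelihood are invented stand-ins; a real analysis must also fold in catalog incompleteness and the GW/EM selection effects discussed in the abstract:

```python
import math

C = 299792.458  # km/s

# Hypothetical galaxy catalogue: (redshift, luminosity in arbitrary units).
catalogue = [(0.010, 1.0), (0.012, 3.0), (0.015, 0.5), (0.011, 2.0)]

def h0_posterior(d_obs, sigma_d, h0_grid, lum_weight=True):
    """Catalogue-method posterior: marginalize the unknown host over the
    catalogue entries, optionally weighting each galaxy by luminosity."""
    post = []
    for h0 in h0_grid:
        total = 0.0
        for z, lum in catalogue:
            w = lum if lum_weight else 1.0
            d_model = C * z / h0   # low-z Hubble-law distance (toy)
            total += w * math.exp(-0.5 * ((d_obs - d_model) / sigma_d) ** 2)
        post.append(total)
    norm = sum(post)
    return [p / norm for p in post]

h0_grid = [50.0 + i for i in range(41)]
post = h0_posterior(d_obs=47.0, sigma_d=5.0, h0_grid=h0_grid)
map_h0 = h0_grid[post.index(max(post))]
print(f"MAP H0 = {map_h0:.0f} km/s/Mpc")
```

Each candidate host contributes a Gaussian bump in $H_0$ centred where its redshift matches the measured distance; luminosity weighting simply sharpens the posterior toward the brighter candidates.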
Modifications of General Relativity leave their imprint both on the cosmic expansion history, through a non-trivial dark energy equation of state, and on the evolution of cosmological perturbations in the scalar and tensor sectors. In particular, the modification in the tensor sector gives rise to a notion of gravitational-wave (GW) luminosity distance, different from the standard electromagnetic luminosity distance, that can be studied with standard sirens at GW detectors such as LISA or third-generation ground-based experiments. We discuss the predictions for modified GW propagation from some of the best-studied theories of modified gravity, such as Horndeski or the more general degenerate higher-order scalar-tensor (DHOST) theories, non-local infrared modifications of gravity, bigravity theories and the corresponding phenomenon of GW oscillation, as well as theories with extra or varying dimensions. We show that modified GW propagation is a completely generic phenomenon in modified gravity. We then use a simple parametrization of the effect in terms of two parameters $(\Xi_0, n)$, shown to fit well the results from a large class of models, to study the prospects of observing modified GW propagation using supermassive black hole binaries as standard sirens with LISA. We construct mock source catalogs and perform detailed Markov Chain Monte Carlo studies of the likelihood obtained from LISA standard sirens alone, as well as by combining them with CMB, BAO and SNe data to reduce the degeneracies between cosmological parameters. We find that combining LISA with the other cosmological datasets allows one to measure the parameter $\Xi_0$ that characterizes modified GW propagation to percent-level accuracy, sufficient to test several modified gravity theories. [Abridged]
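The two-parameter fit referred to here is, in the notation of Belgacem et al., the ratio of GW to electromagnetic luminosity distance $d_L^{\rm gw}(z)/d_L^{\rm em}(z) = \Xi_0 + (1-\Xi_0)/(1+z)^n$, which can be coded directly (the parameter values below are illustrative only):

```python
def gw_over_em(z, xi0, n):
    """Ratio d_L^gw(z) / d_L^em(z) in the two-parameter (Xi_0, n)
    parametrization of modified GW propagation:
        Xi_0 + (1 - Xi_0) / (1 + z)**n
    It equals 1 at z = 0, tends to Xi_0 at large z, and reduces to the
    GR value of 1 at all redshifts when Xi_0 = 1."""
    return xi0 + (1.0 - xi0) / (1.0 + z) ** n

# GR limit: the ratio is exactly 1 at every redshift.
assert gw_over_em(2.0, xi0=1.0, n=2.5) == 1.0

# A percent-level deviation, roughly the sensitivity quoted for LISA:
for z in (0.1, 1.0, 10.0):
    print(z, gw_over_em(z, xi0=0.99, n=2.5))
```

The interpolating form is what makes $\Xi_0$ measurable with high-redshift sirens: any deviation of the ratio from unity grows monotonically with $z$ and saturates at $\Xi_0$.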
Raul Jimenez (2017)
An interesting test of the nature of the Universe is to measure the global spatial curvature of the metric in a model-independent way, at the level of $|\Omega_k| < 10^{-4}$ or, if possible, at the cosmic-variance level of the amplitude of the CMB fluctuations, $|\Omega_k| \approx 10^{-5}$. A limit of $|\Omega_k| < 10^{-4}$ would yield stringent tests of several models of inflation. Further, improving the constraint by an order of magnitude would help reduce model confusion in standard parameter estimation. Moreover, if the curvature were measured to be at the value of the amplitude of the CMB fluctuations, it would offer a powerful test of the inflationary paradigm and would indicate that our Universe must be significantly larger than the current horizon. On the contrary, in the context of standard inflation, measuring a value above the CMB fluctuations would lead us to conclude that the Universe is not much larger than the currently observed horizon; this can also be interpreted as the presence of large fluctuations outside the horizon. However, it has so far proven difficult to find observables that can achieve such a level of accuracy and, above all, be model-independent. Here we propose a method that can in principle achieve this by making minimal assumptions and using distance probes that are cosmology-independent: gravitational waves, redshift drift and cosmic chronometers. We discuss what kind of observations are needed in principle to achieve the desired accuracy.
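One standard model-independent curvature estimator in this spirit (the Clarkson-Bassett-Lu consistency relation, built from exactly the ingredients listed: an expansion-rate probe and a cosmology-independent distance, though not necessarily the exact estimator of this paper) is $\Omega_k = [h^2(z)\,D'^2(z) - 1]/D^2(z)$ in dimensionless units, with $h = H/H_0$ and $D = (H_0/c)\,d_C$. The toy universe below is invented purely to check that the relation recovers the input curvature:

```python
import math

def h(z, om=0.3, ok=0.01):
    # Dimensionless H(z)/H0 for a toy universe: matter + curvature + Lambda.
    ol = 1.0 - om - ok
    return math.sqrt(om * (1 + z) ** 3 + ok * (1 + z) ** 2 + ol)

def chi(z, ok=0.01, steps=20000):
    # Dimensionless line-of-sight comoving distance: integral of dz'/h(z')
    # (midpoint rule), the quantity cosmic chronometers give access to.
    dz = z / steps
    return sum(dz / h((i + 0.5) * dz, ok=ok) for i in range(steps))

def dc(z, ok=0.01):
    # Transverse comoving distance (dimensionless), curvature-dependent;
    # this is what standard sirens measure (up to the H0/c rescaling).
    x = chi(z, ok=ok)
    if ok > 0:
        s = math.sqrt(ok)
        return math.sinh(s * x) / s
    return x

def curvature_test(z, ok=0.01, eps=1e-4):
    # Clarkson-Bassett-Lu relation: Omega_k = (h^2 D'^2 - 1) / D^2,
    # with D' estimated by a central finite difference.
    d = dc(z, ok=ok)
    dprime = (dc(z + eps, ok=ok) - dc(z - eps, ok=ok)) / (2 * eps)
    return (h(z, ok=ok) ** 2 * dprime ** 2 - 1.0) / d ** 2

ok_rec = curvature_test(1.0, ok=0.01)
print(f"input Omega_k = 0.01, recovered = {ok_rec:.4f}")
```

The relation holds at every redshift in any FLRW metric, which is what makes it a consistency test: a $z$-dependent result would falsify FLRW itself rather than a particular cosmological model.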
Interacting dark matter (DM)-dark energy (DE) models have been intensively investigated in the literature for their ability to fit various data sets as well as to explain some observational tensions persisting within the $\Lambda$CDM cosmology. In this work, we employ a Gaussian process (GP) algorithm to perform a joint analysis using geometrical cosmological probes, namely cosmic chronometers, Type Ia supernovae, baryon acoustic oscillations and the H0LiCOW lens sample, to infer a reconstruction of the coupling function between the dark components in a general framework, where the DE can assume a dynamical character via its equation of state. In addition to the joint analysis with these data, we simulate a catalog of standard siren events from binary neutron star mergers, within the sensitivity predicted by the Einstein Telescope, to reconstruct the dark-sector coupling more accurately and robustly. We find that the particular case where $w = -1$ is fixed for the DE shows a statistical preference for an interaction in the dark sector at late times. In the general case, where $w(z)$ is analyzed, we find no evidence for such a dark coupling, and the predictions are compatible with the $\Lambda$CDM paradigm. When the mock standard siren events are included to improve the kernel in the GP predictions, we find a preference for an interaction in the dark sector at late times.
