
Improved time-delay lens modelling and $H_0$ inference with transient sources

Added by Xuheng Ding
Publication date: 2021
Field: Physics
Language: English





Strongly lensed explosive transients such as supernovae, gamma-ray bursts, fast radio bursts, and gravitational waves are very promising tools to determine the Hubble constant ($H_0$) in the near future, in addition to strongly lensed quasars. In this work, we show that the transient nature of the point source provides an advantage over quasars: the lensed host galaxy can be observed before or after the transient's appearance. Therefore, the lens model can be derived from images free of contamination from bright point sources. We quantify this advantage by comparing the precision of a lens model obtained from the same lenses with and without point sources. Based on Hubble Space Telescope (HST) Wide Field Camera 3 (WFC3) observations with the same sets of lensing parameters, we simulate realistic mock datasets of 48 quasar lensing systems (i.e., adding an AGN at the galaxy center) and 48 galaxy-galaxy lensing systems (assuming the transient source is not visible, but the time delay and image positions have been or will be measured). We then model the images and compare the inferences of the lens model parameters and $H_0$. We find that the precision of the lens models (in terms of the deflector mass slope) is better by a factor of 4.1 for the sample without lensed point sources, resulting in an increase in $H_0$ precision by a factor of 2.9. The opportunity to observe the lens systems without the transient point sources provides an additional advantage for time-delay cosmography over lensed quasars. It facilitates the determination of higher signal-to-noise stellar kinematics of the main deflector, and thus its mass density profile, which in turn plays a key role in breaking the mass-sheet degeneracy and constraining $H_0$.
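To see concretely how lens-model precision propagates to $H_0$, recall the standard time-delay cosmography relation: a measured delay $\Delta t$ and a modelled Fermat potential difference $\Delta\phi$ yield the time-delay distance $D_{\Delta t} = c\,\Delta t/\Delta\phi$, which scales as $1/H_0$. The sketch below (Python with astropy) inverts this relation for $H_0$; all redshifts and measured values are hypothetical, and this is the textbook relation rather than the paper's actual modelling pipeline.

```python
# Minimal sketch: infer H0 from one lensed transient's time delay.
# The relation dt = D_dt * dphi / c is standard; every number here is
# hypothetical and chosen only to make the example run.
import astropy.units as u
from astropy.constants import c
from astropy.cosmology import FlatLambdaCDM

z_d, z_s = 0.5, 2.0                          # hypothetical lens/source redshifts
dt = 55.0 * u.day                            # hypothetical measured time delay
dphi = (0.8 * u.arcsec).to_value(u.rad)**2   # hypothetical Fermat potential difference (rad^2)

def time_delay_distance(H0):
    """D_dt = (1 + z_d) * D_d * D_s / D_ds in flat LCDM with Om0 = 0.3."""
    cosmo = FlatLambdaCDM(H0=H0, Om0=0.3)
    D_d = cosmo.angular_diameter_distance(z_d)
    D_s = cosmo.angular_diameter_distance(z_s)
    D_ds = cosmo.angular_diameter_distance_z1z2(z_d, z_s)
    return (1 + z_d) * D_d * D_s / D_ds

# "Measured" time-delay distance from dt = D_dt * dphi / c:
D_dt_meas = (c * dt / dphi).to(u.Mpc)

# At fixed Om0, every distance scales as 1/H0, so D_dt does too;
# rescale a fiducial cosmology instead of searching a grid.
H0_fid = 70.0
H0_inferred = H0_fid * (time_delay_distance(H0_fid) / D_dt_meas).decompose().value
print(f"D_dt = {D_dt_meas:.0f}, inferred H0 = {H0_inferred:.1f} km/s/Mpc")
```

Because $H_0 \propto \Delta\phi/\Delta t$ at fixed cosmology, a fractional error on the modelled Fermat potential difference maps directly onto $H_0$, which is why cleaner, point-source-free imaging tightens the constraint.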




Related research

X. Ding, T. Treu, S. Birrer (2020)
In recent years, breakthroughs in methods and data have enabled gravitational time delays to emerge as a very powerful tool to measure the Hubble constant $H_0$. However, published state-of-the-art analyses require of order 1 year of expert investigator time and up to a million hours of computing time per system. Furthermore, as precision improves, it is crucial to identify and mitigate systematic uncertainties. With this time-delay lens modelling challenge we aim to assess the level of precision and accuracy of the modelling techniques that are currently fast enough to handle of order 50 lenses, via the blind analysis of simulated datasets. The results in Rung 1 and Rung 2 show that methods that use only the point source positions tend to have lower precision ($10-20\%$) while remaining accurate. In Rung 2, the methods that exploit the full information of the imaging and kinematic datasets can recover $H_0$ within the target accuracy ($|A| < 2\%$) and precision ($< 6\%$ per system), even in the presence of a poorly known point spread function and complex source morphology. A post-unblinding analysis of Rung 3 showed the numerical precision of the ray-traced cosmological simulations to be insufficient to test lens modelling methodology at the percent level, making the results difficult to interpret. A new challenge with improved simulations is needed to make further progress in the investigation of systematic uncertainties. For completeness, we present the Rung 3 results in an appendix, and use them to discuss various approaches to mitigating similar subtle data-generation effects in future blind challenges.
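For a rough sense of how such blind submissions are scored, the snippet below computes fractional accuracy and precision statistics of the general form quoted above ($A$ and per-system precision). The exact metric definitions are given in the challenge papers; every number here is hypothetical.

```python
# Schematic scoring of blind H0 submissions: accuracy A (mean fractional
# offset from the truth) and precision P (mean fractional reported error).
# Illustrative form only; see the TDLMC papers for the exact definitions.
import numpy as np

def accuracy(h0_est, h0_true):
    """Mean fractional offset of the point estimates from the input truth."""
    return np.mean((h0_est - h0_true) / h0_true)

def precision(h0_sigma, h0_true):
    """Mean reported 1-sigma uncertainty as a fraction of the input truth."""
    return np.mean(h0_sigma / h0_true)

h0_true = 74.0                               # hypothetical input truth
h0_est = np.array([73.1, 75.2, 72.8, 74.9])  # hypothetical blind estimates
h0_sig = np.array([3.5, 4.1, 3.0, 3.8])      # hypothetical reported 1-sigma errors

print(f"A = {accuracy(h0_est, h0_true):+.2%}, P = {precision(h0_sig, h0_true):.2%}")
```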
Strong gravitational lenses with measured time delays are a powerful tool for measuring cosmological parameters, especially the Hubble constant ($H_0$). Recent studies show that by combining just three multiply-imaged AGN systems, one can determine $H_0$ to 2.4% precision. Furthermore, the number of time-delay lens systems is growing rapidly, enabling the determination of $H_0$ to 1% precision in the near future. However, as the precision increases, it is important to ensure that systematic errors and biases remain subdominant, and challenges based on simulated datasets are a key component of this effort. Following the experience of the past challenge on time-delay measurement, where it was shown that time delays can indeed be measured precisely and accurately at the sub-percent level, we now present the Time Delay Lens Modeling Challenge (TDLMC). The goal of this challenge is to assess the present capabilities of lens modeling codes and assumptions, and to test the level of accuracy of inferred cosmological parameters given realistic mock datasets. We invite scientists to model a set of simulated HST observations of 50 mock lens systems. The systems are organized in rungs, with the complexity and realism increasing up the ladder. The goal is to infer $H_0$ for each rung, given the HST images, the time delays, and the stellar velocity dispersion of the deflector, for a fixed background cosmology. The TDLMC challenge starts with the mock data release on 2018 January 8. The deadline for blind submission differs by rung: 2018 September 8 for Rungs 0-1, 2019 April 8 for Rung 2, and 2019 September 8 for Rung 3. This first paper gives an overview of the challenge, including the data design, a set of metrics to quantify modeling performance, and the challenge details.
We present the lens mass model of the quadruply-imaged gravitationally lensed quasar WFI2033-4723, and perform a blind cosmographical analysis based on this system. Our analysis combines (1) time-delay measurements from 14 years of data obtained by the COSmological MOnitoring of GRAvItational Lenses (COSMOGRAIL) collaboration, (2) high-resolution Hubble Space Telescope imaging, (3) a measurement of the velocity dispersion of the lens galaxy based on ESO-MUSE data, and (4) multi-band, wide-field imaging and spectroscopy characterizing the lens environment. We account for all known sources of systematics, including the influence of nearby perturbers and complex line-of-sight structure, as well as the parametrization of the light and mass profiles of the lensing galaxy. After unblinding, we determine the effective time-delay distance to be $4784_{-248}^{+399}~\mathrm{Mpc}$, an average precision of $6.6\%$. This translates to a Hubble constant $H_{0} = 71.6_{-4.9}^{+3.8}~\mathrm{km~s^{-1}~Mpc^{-1}}$, assuming a flat $\Lambda$CDM cosmology with a uniform prior on $\Omega_\mathrm{m}$ in the range [0.05, 0.5]. This work is part of the $H_0$ Lenses in COSMOGRAIL's Wellspring (H0LiCOW) collaboration, and the full time-delay cosmography results from a total of six strongly lensed systems are presented in a companion paper (H0LiCOW XIII).
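For reference, the step from the effective time-delay distance to $H_0$ uses the standard identity (not specific to this analysis)

$$\Delta t_{ij} = \frac{D_{\Delta t}}{c}\,\Delta\phi_{ij}, \qquad D_{\Delta t} \equiv (1+z_{\rm d})\,\frac{D_{\rm d}\,D_{\rm s}}{D_{\rm ds}} \propto \frac{1}{H_0},$$

so, to first order, the $6.6\%$ precision on $D_{\Delta t}$ carries over to the fractional uncertainty on $H_{0}$.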
S. Rathna Kumar, C. S. Stalin, et al. (2014)
In this work, we present a homogeneous curve-shifting analysis using the difference-smoothing technique of the publicly available light curves of 24 gravitationally lensed quasars, for which time delays have been reported in the literature. The uncertainty of each measured time delay was estimated using realistic simulated light curves. The recipe for generating such simulated light curves, with known time delays in a plausible range around the measured time delay, is introduced here. We identified 14 gravitationally lensed quasars that have light curves of sufficiently good quality to enable the measurement of at least one time delay between the images, adjacent to each other in terms of arrival-time order, to a precision of better than 20% (including systematic errors). We modeled the mass distribution of ten of those systems that have known lens redshifts, accurate astrometric data, and sufficiently simple mass distributions, using the publicly available PixeLens code to infer a value of $H_0$ of $68.1 \pm 5.9$ km s$^{-1}$ Mpc$^{-1}$ (1$\sigma$ uncertainty, 8.7% precision) for a spatially flat universe having $\Omega_m = 0.3$ and $\Omega_\Lambda = 0.7$. We note here that the lens modeling approach followed in this work is a relatively simple one and does not account for subtle systematics such as those resulting from line-of-sight effects; hence, our $H_0$ estimate should be considered indicative.
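For intuition on the quoted 8.7% joint precision, the toy snippet below combines independent per-lens $H_0$ estimates by inverse-variance weighting. Note that PixeLens actually infers $H_0$ jointly from pixelated mass models, so this is only a schematic of how precision scales with sample size; all numbers are hypothetical.

```python
# Toy inverse-variance combination of independent per-lens H0 estimates.
# PixeLens performs a joint pixelated-mass inference instead; this only
# illustrates the ~1/sqrt(N) scaling of the combined uncertainty.
import numpy as np

h0 = np.array([66.0, 71.0, 64.0, 70.0, 69.0])   # hypothetical per-lens values
sig = np.array([12.0, 15.0, 10.0, 14.0, 11.0])  # hypothetical 1-sigma errors

w = 1.0 / sig**2
h0_comb = np.sum(w * h0) / np.sum(w)
sig_comb = np.sqrt(1.0 / np.sum(w))
print(f"H0 = {h0_comb:.1f} +/- {sig_comb:.1f} km/s/Mpc")
```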
Strongly lensed quasars can provide measurements of the Hubble constant ($H_{0}$) independent of any other methods. One of the key ingredients is exquisite high-resolution imaging data, such as Hubble Space Telescope (HST) imaging and adaptive-optics (AO) imaging from ground-based telescopes, which provide strong constraints on the mass distribution of the lensing galaxy. In this work, we expand on the previous analysis of three time-delay lenses with AO imaging (RXJ1131-1231, HE0435-1223, and PG1115+080), and perform a joint analysis of J0924+0219 by using AO imaging from the Keck Telescope, obtained as part of the SHARP (Strong lensing at High Angular Resolution Program) AO effort, with HST imaging to constrain the mass distribution of the lensing galaxy. Under the assumption of a flat $\Lambda$CDM model with fixed $\Omega_{\rm m}=0.3$, we show that by marginalizing over two different kinds of mass models (power-law and composite models) and their transformed mass profiles via a mass-sheet transformation, we obtain $\Delta t_{\rm BA}\,h\,\hat{\sigma}_{v}^{-2}=6.89_{-0.7}^{+0.8}$ days, $\Delta t_{\rm CA}\,h\,\hat{\sigma}_{v}^{-2}=10.7_{-1.2}^{+1.6}$ days, and $\Delta t_{\rm DA}\,h\,\hat{\sigma}_{v}^{-2}=7.70_{-0.9}^{+1.0}$ days, where $h=H_{0}/100~\mathrm{km\,s^{-1}\,Mpc^{-1}}$ is the dimensionless Hubble constant and $\hat{\sigma}_{v}=\sigma^{\rm ob}_{v}/(280~\mathrm{km\,s^{-1}})$ is the scaled dimensionless velocity dispersion. Future measurements of time delays with 10% uncertainty and velocity dispersion with 5% uncertainty would yield an $H_0$ constraint of $\sim 15\%$ precision.
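The closing forecast follows from simple error propagation (our illustration, neglecting the model-marginalization scatter): since the data fix the combination $\Delta t\,h\,\hat{\sigma}_{v}^{-2}$, the Hubble parameter scales as $h \propto \hat{\sigma}_{v}^{2}/\Delta t$, and

$$\frac{\delta h}{h} \simeq \sqrt{\left(\frac{\delta \Delta t}{\Delta t}\right)^{2} + \left(2\,\frac{\delta \hat{\sigma}_{v}}{\hat{\sigma}_{v}}\right)^{2}} = \sqrt{(0.10)^{2} + (2 \times 0.05)^{2}} \approx 14\%,$$

consistent with the quoted $\sim 15\%$.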
