In this work, we present a homogeneous curve-shifting analysis, using the difference-smoothing technique, of the publicly available light curves of 24 gravitationally lensed quasars for which time delays have been reported in the literature. The uncertainty of each measured time delay was estimated using realistic simulated light curves. The recipe for generating such simulated light curves, with known time delays in a plausible range around the measured time delay, is introduced here. We identified 14 gravitationally lensed quasars that have light curves of sufficiently good quality to enable the measurement of at least one time delay between images adjacent to each other in arrival-time order to a precision of better than 20% (including systematic errors). We modeled the mass distributions of ten of those systems that have known lens redshifts, accurate astrometric data, and sufficiently simple mass distributions, using the publicly available PixeLens code, to infer a value of $H_0$ of $68.1 \pm 5.9$ km s$^{-1}$ Mpc$^{-1}$ (1$\sigma$ uncertainty, 8.7% precision) for a spatially flat universe having $\Omega_m$ = 0.3 and $\Omega_\Lambda$ = 0.7. We note that the lens modeling approach followed in this work is a relatively simple one and does not account for subtle systematics, such as those resulting from line-of-sight effects, and hence our $H_0$ estimate should be considered indicative.
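For reference, and not stated in the abstract itself, the standard relation underlying such measurements connects each delay to $H_0$: for images at angular positions $\theta_i$, $\theta_j$ of a source at $\beta$, with $z_d$ the lens redshift, $\psi$ the lensing potential, and $D_d$, $D_s$, $D_{ds}$ the angular-diameter distances to the lens, to the source, and between them,
$$\Delta t_{ij} = \frac{D_{\Delta t}}{c}\left[\phi(\theta_i,\beta)-\phi(\theta_j,\beta)\right], \qquad \phi(\theta,\beta)=\frac{(\theta-\beta)^2}{2}-\psi(\theta), \qquad D_{\Delta t}\equiv(1+z_d)\,\frac{D_d D_s}{D_{ds}} \propto \frac{1}{H_0},$$
so a measured delay combined with a mass model (here from PixeLens) fixes $H_0$, up to the modeling and line-of-sight caveats noted above; the quoted 8.7% precision is simply $5.9/68.1 \simeq 0.087$.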
Observed time delays between images of a lensed QSO lead to the determination of the Hubble constant by Refsdal's method, provided the mass distribution in the lensing galaxy is reasonably well known. Since the two or four QSO images usually observed are woefully inadequate by themselves to provide a unique reconstruction of the galaxy mass, most previous reconstructions have been limited to simple parameterized models, which may lead to large systematic errors in the derived $H_0$ by failing to consider enough possibilities for the mass distribution of the lens. We use non-parametric modeling of galaxy lenses to better explore physically plausible but not overly constrained galaxy mass maps, all of which reproduce the lensing observables exactly, and derive the corresponding distribution of $H_0$ values. Blind tests, in which one of us simulated galaxy lenses, lensing observables, and a value for $H_0$, and the other applied our modeling technique to estimate $H_0$, indicate that our procedure is reliable. For four simulated lensed QSOs, the distribution of inferred $H_0$ has an uncertainty of $\simeq$10% at 90% confidence. Application to published observations of the two best-constrained time-delay lenses, PG1115+080 and B1608+656, yields $H_0 = 61 \pm 11$ km/s/Mpc at 68% confidence and $61 \pm 18$ km/s/Mpc at 90% confidence.
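To spell out the Refsdal scaling behind this distribution of $H_0$ values (our gloss, assuming the standard argument): for a fixed mass map the predicted delays scale as $1/H_0$, so each acceptable map $i$, evaluated at a fiducial $H_{0,\rm fid}$, yields
$$H_0^{(i)} = H_{0,\rm fid}\,\frac{\Delta t^{(i)}_{\rm pred}(H_{0,\rm fid})}{\Delta t_{\rm obs}},$$
and the ensemble of mass maps that reproduce the observables translates directly into an ensemble of $H_0$ estimates.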
The use of time-delay gravitational lenses to examine the cosmological expansion introduces a new standard ruler with which to test theoretical models. The sample suitable for this kind of work now includes 12 lens systems, which have thus far been used solely for optimizing the parameters of $\Lambda$CDM. In this paper, we broaden the base of support for this new, important cosmic probe by using these observations to carry out a one-on-one comparison between {\it competing} models. The currently available sample indicates a likelihood of $\sim$70-80% that the $R_{\rm h}=ct$ Universe is the correct cosmology versus $\sim$20-30% for the standard model. This possibly interesting result reinforces the need to greatly expand the sample of time-delay lenses, e.g., with the successful implementation of the Dark Energy Survey, the VST ATLAS survey, and the Large Synoptic Survey Telescope. In anticipation of a greatly expanded catalog of time-delay lenses identified with these surveys, we have produced synthetic samples to estimate how large they would have to be in order to rule out either model at a $\sim$99.7% confidence level. We find that if the real cosmology is $\Lambda$CDM, a sample of $\sim$150 time-delay lenses would be sufficient to rule out $R_{\rm h}=ct$ at this level of accuracy, while $\sim$1,000 time-delay lenses would be required to rule out $\Lambda$CDM if the real Universe is instead $R_{\rm h}=ct$. This difference in required sample size reflects the greater number of free parameters available to fit the data with $\Lambda$CDM.
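For reference (our addition; the abstract does not write these out), the two models enter this standard ruler through their predicted angular-diameter distances, which for a spatially flat universe read
$$D_A^{\Lambda{\rm CDM}}(z)=\frac{c}{H_0(1+z)}\int_0^z\frac{dz'}{\sqrt{\Omega_m(1+z')^3+\Omega_\Lambda}}, \qquad D_A^{R_{\rm h}=ct}(z)=\frac{c}{H_0(1+z)}\,\ln(1+z);$$
with $H_0$ left free in each model, it is chiefly the different redshift dependence of the resulting time-delay distance $D_{\Delta t}=(1+z_d)D_dD_s/D_{ds}$ that discriminates between them.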
We present a simultaneous analysis of 10 galaxy lenses having time-delay measurements. For each lens we derive a detailed free-form mass map, with uncertainties, and with the additional requirement of a shared value of the Hubble parameter across all the lenses. We test the prior involved in the lens reconstruction against a galaxy-formation simulation. Assuming a concordance cosmology, we obtain $1/H_0 = 13.5^{+2.5}_{-1.3}$ Gyr.
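Converting the quoted value into more familiar units (our arithmetic, not from the paper), a minimal sketch using astropy:

# Quick unit conversion (our check, not from the paper): 1/H_0 = 13.5 (+2.5/-1.3) Gyr.
from astropy import units as u

for inv_H0_gyr in (13.5, 13.5 + 2.5, 13.5 - 1.3):  # central value and quoted range
    H0 = (1.0 / (inv_H0_gyr * u.Gyr)).to(u.km / u.s / u.Mpc)
    print(f"1/H0 = {inv_H0_gyr:5.1f} Gyr  ->  H0 = {H0:.1f}")
# Central value ~72.4 km/s/Mpc, with the quoted range spanning roughly 61-80 km/s/Mpc.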
Strongly lensed explosive transients such as supernovae, gamma-ray bursts, fast radio bursts, and gravitational waves are, in addition to strongly lensed quasars, very promising tools for determining the Hubble constant ($H_0$) in the near future. In this work, we show that the transient nature of the point source provides an advantage over quasars: the lensed host galaxy can be observed before or after the transient's appearance. Therefore, the lens model can be derived from images free of contamination from bright point sources. We quantify this advantage by comparing the precision of a lens model obtained from the same lenses with and without point sources. Based on Hubble Space Telescope (HST) Wide Field Camera 3 (WFC3) observations with the same sets of lensing parameters, we simulate realistic mock datasets of 48 quasar lensing systems (i.e., adding an AGN in the galaxy center) and 48 galaxy-galaxy lensing systems (assuming the transient source is not visible but the time delay and image positions have been or will be measured). We then model the images and compare the inferences of the lens model parameters and $H_0$. We find that the precision of the lens models (in terms of the deflector mass slope) is better by a factor of 4.1 for the sample without lensed point sources, resulting in an increase of $H_0$ precision by a factor of 2.9. The opportunity to observe the lens systems without the transient point sources provides an additional advantage of lensed transients over lensed quasars for time-delay cosmography. It facilitates the determination of higher signal-to-noise stellar kinematics of the main deflector, and thus its mass density profile, which in turn plays a key role in breaking the mass-sheet degeneracy and constraining $H_0$.
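For completeness (our summary of a standard result, not a statement from the abstract), the mass-sheet degeneracy referred to here is the transformation
$$\kappa(\theta)\;\to\;\lambda\,\kappa(\theta)+(1-\lambda),$$
which leaves image positions and flux ratios unchanged but rescales the predicted time delays by $\lambda$, so that a fixed measured delay yields $H_0 \to \lambda H_0$; independent constraints such as the stellar kinematics of the deflector pin down $\lambda$, which is why higher signal-to-noise kinematics help tighten $H_0$.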
The construction of the cosmic distance-duality relation (CDDR) has been widely studied. However, its consistency with various new observables remains a topic of interest. We present a new way to constrain the CDDR $\eta(z)$ using different dynamic and geometric properties of strong gravitational lenses (SGL) along with SNe Ia observations. We use a sample of $102$ SGL with measurements of the corresponding velocity dispersion $\sigma_0$ and Einstein radius $\theta_E$. In addition, we use a dataset of $12$ two-image lensing systems containing measurements of the time delay $\Delta t$ between the source images. Jointly, these two datasets give us the angular diameter distance $D_{A_{ol}}$ of the lens. Further, for the luminosity distance, we use the $740$ observations from the JLA compilation of SNe Ia. To study the combined behavior of these datasets we use a model-independent method, Gaussian Process (GP) regression. We also check the efficiency of the GP by applying it to simulated datasets, which are generated in a phenomenological way using realistic cosmological error bars. Finally, we conclude that the combined bounds from the SGL and SNe Ia observations do not favor any deviation from the CDDR and are in concordance with the standard value ($\eta=1$) within the $2\sigma$ confidence region, which further strengthens the theoretical standing of the CDDR.
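To make the GP step concrete, here is a minimal sketch of a model-independent reconstruction of $\eta(z)\equiv D_L(z)/[(1+z)^2 D_A(z)]$ using a generic Gaussian-process regressor (scikit-learn); the paper's actual pipeline, kernel choices, and data are not specified in the abstract, and the arrays below are placeholders:

# Minimal sketch (our illustration, not the authors' code): reconstruct eta(z)
# from binned eta estimates with a Gaussian process and check consistency with 1.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Placeholder data: eta_i = D_L / [(1+z)^2 D_A] at the lens redshifts, with
# propagated errors; real values would come from the SGL + SNe Ia analysis.
z = np.array([0.26, 0.42, 0.58, 0.83, 1.01])
eta = np.array([0.97, 1.04, 0.95, 1.08, 0.92])
eta_err = np.array([0.08, 0.10, 0.09, 0.12, 0.15])

kernel = ConstantKernel(1.0) * RBF(length_scale=0.5)
gp = GaussianProcessRegressor(kernel=kernel, alpha=eta_err**2, normalize_y=True)
gp.fit(z.reshape(-1, 1), eta)

z_grid = np.linspace(z.min(), z.max(), 100).reshape(-1, 1)
eta_mean, eta_std = gp.predict(z_grid, return_std=True)

# Deviation from eta = 1 in units of the GP uncertainty; values below 2
# everywhere correspond to consistency with the CDDR at the 2-sigma level.
dev = np.abs(eta_mean - 1.0) / eta_std
print(f"max |eta - 1| / sigma over the grid: {dev.max():.2f}")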