
Generalized multi-plane gravitational lensing: time delays, recursive lens equation, and the mass-sheet transformation

Posted by: Peter Schneider
Publication date: 2014
Research field: Physics
Paper language: English
Author: Peter Schneider





We consider several aspects of the generalized multi-plane gravitational lens theory, in which light rays from a distant source are affected by several main deflectors, and in addition by the tidal gravitational field of the large-scale matter distribution in the Universe when propagating between the main deflectors. Specifically, we derive a simple expression for the time-delay function in this case, making use of the general formalism for treating light propagation in inhomogeneous spacetimes, which leads to the characterization of distance matrices between main lens planes. Applying Fermat's principle, an alternative form of the corresponding lens equation is derived, which connects the impact vectors in three consecutive main lens planes, and we show that this form of the lens equation is equivalent to the more standard one. For this, some general relations for cosmological distance matrices are derived. The generalized multi-plane lens situation admits a generalized mass-sheet transformation, which corresponds to uniform isotropic scaling in each lens plane, a corresponding scaling of the deflection angle, and the addition of a tidal matrix (mass sheet plus external shear) to each main lens. We show that the time delays for sources in all lens planes scale with the same factor under this generalized mass-sheet transformation, thus precluding the use of time-delay ratios to break the mass-sheet transformation.
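For context, a minimal sketch of the familiar single-plane mass-sheet transformation (a standard textbook relation, quoted here only for orientation and not a result of this paper) is: the convergence and scaled deflection transform as $\kappa(\theta) \rightarrow \lambda\,\kappa(\theta) + (1-\lambda)$ and $\alpha(\theta) \rightarrow \lambda\,\alpha(\theta) + (1-\lambda)\,\theta$, which leaves all image positions and flux ratios unchanged while rescaling every time delay as $\Delta t \rightarrow \lambda\,\Delta t$. The generalized transformation described in the abstract applies an analogous isotropic scaling, deflection scaling, and tidal term in each main lens plane, and the stated result is that the time delays of sources in all planes nevertheless scale by one common factor.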


Read also

Strong lensing gravitational time delays are a powerful and cost-effective probe of dark energy. Recent studies have shown that a single lens can provide a distance measurement with 6-7% accuracy (including random and systematic uncertainties), provided sufficient data are available to determine the time delay and reconstruct the gravitational potential of the deflector. Gravitational time delays are a low-redshift (z ~ 0-2) probe and thus allow one to break degeneracies in the interpretation of data from higher-redshift probes like the cosmic microwave background in terms of the dark energy equation of state. Current studies are limited by the size of the sample of known lensed quasars, but this situation is about to change. Even in this decade, wide-field imaging surveys are likely to discover thousands of lensed quasars, enabling the targeted study of ~100 of these systems and resulting in substantial gains in the dark energy figure of merit. In the next decade, a further order of magnitude improvement will be possible with the 10000 systems expected to be detected and measured with LSST and Euclid. To fully exploit these gains, we identify three priorities. First, support for the development of software required for the analysis of the data. Second, in this decade, small robotic telescopes (1-4m in diameter) dedicated to monitoring of lensed quasars will transform the field by delivering accurate time delays for ~100 systems. Third, in the 2020s, LSST will deliver thousands of time delays; the bottleneck will instead be the acquisition and analysis of high-resolution imaging follow-up. Thus, the top priority for the next decade is to support fast high-resolution imaging capabilities, such as those enabled by the James Webb Space Telescope and next-generation adaptive optics systems on large ground-based telescopes.
Time-delay cosmography with gravitationally lensed quasars plays an important role in anchoring the absolute distance scale and hence measuring the Hubble constant, $H_{0}$, independent of traditional distance ladder methodology. A current potential limitation of time-delay distance measurements is the mass-sheet transformation (MST), which leaves the lensed imaging unchanged but changes the distance measurements and the derived value of $H_{0}$. In this work we show that the standard method of addressing the MST in time-delay cosmography, through a combination of high-resolution imaging and the measurement of the stellar velocity dispersion of the lensing galaxy, depends on the assumption that the ratio, $D_{\rm s}/D_{\rm ds}$, of angular diameter distances to the background quasar and between the lensing galaxy and the quasar can be constrained. This is typically achieved through the assumption of a particular cosmological model. Previous work (TDCOSMO IV) addressed the mass-sheet degeneracy and derived $H_{0}$ under the assumption of the $\Lambda$CDM model. In this paper we show that the mass-sheet degeneracy can be broken without relying on a specific cosmological model by combining lensing with relative distance indicators such as type Ia supernovae and baryon acoustic oscillations, which constrain the shape of the expansion history and hence $D_{\rm s}/D_{\rm ds}$. With this approach, we demonstrate that the mass-sheet degeneracy can be constrained in a cosmological-model-independent way, and hence model-independent distance measurements in time-delay cosmography under mass-sheet transformations can be obtained.
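Schematically, and in standard single-plane notation rather than anything specific to this paper: under an MST with parameter $\lambda$ the model time delays scale as $\Delta t \rightarrow \lambda\,\Delta t$, so matching the observed delays rescales the inferred time-delay distance as $D_{\Delta t} \rightarrow D_{\Delta t}/\lambda$ and hence the inferred Hubble constant as $H_{0} \rightarrow \lambda\,H_{0}$. The model-predicted stellar velocity dispersion scales as $\sigma_{v}^{2} \rightarrow \lambda\,\sigma_{v}^{2}$ and, in the usual kinematic modeling, takes the form $\sigma_{v}^{2} = \lambda\,(D_{\rm s}/D_{\rm ds})\,c^{2}\,J$, where $J$ collects the lens-model and orbital-anisotropy dependence. A measured $\sigma_{v}$ therefore constrains $\lambda$ only as well as $D_{\rm s}/D_{\rm ds}$ is known, which is why a cosmology-independent constraint on the shape of the expansion history, such as the SNe Ia and BAO combination described above, is what breaks the degeneracy.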
Accurate and precise measurements of the Hubble constant are critical for testing our current standard cosmological model and revealing possibly new physics. With Hubble Space Telescope (HST) imaging, each strong gravitational lens system with measured time delays can allow one to determine the Hubble constant with an uncertainty of $\sim 7\%$. Since HST will not last forever, we explore adaptive-optics (AO) imaging as an alternative that can provide higher angular resolution than HST imaging but has a less stable point spread function (PSF) due to atmospheric distortion. To make AO imaging useful for time-delay-lens cosmography, we develop a method to extract the unknown PSF directly from the imaging of strongly lensed quasars. In a blind test with two mock data sets created with different PSFs, we are able to recover the important cosmological parameters (time-delay distance, external shear, lens mass profile slope, and total Einstein radius). Our analysis of the Keck AO image of the strong lens system RXJ1131-1231 shows that the important parameters for cosmography agree with those based on HST imaging and modeling within 1-$\sigma$ uncertainties. Most importantly, the constraint on the model time-delay distance by using AO imaging with $0.045''$ resolution is tighter by $\sim 50\%$ than the constraint from HST imaging with $0.09''$ resolution when a power-law mass distribution for the lens system is adopted. Our PSF reconstruction technique is generic and applicable to data sets that have multiple nearby point sources, enabling scientific studies that require high-precision models of the PSF.
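To illustrate only the basic idea of building a PSF model from the point-like quasar images themselves (the paper's actual iterative PSF reconstruction is more sophisticated and is not reproduced here), a toy stacking sketch in Python could look as follows; the function name, the stamp size, and the assumption of isolated point sources with known pixel positions are all hypothetical.

import numpy as np

def estimate_psf(image, positions, size=21):
    # Toy PSF estimate: cut square stamps around the known point-source
    # (quasar image) positions, normalize each to unit flux, and median-stack.
    # positions is a list of (row, col) pixel coordinates; stamps that would
    # fall off the edge of the image are skipped.
    half = size // 2
    stamps = []
    for r, c in positions:
        r, c = int(round(r)), int(round(c))
        if (r - half < 0 or c - half < 0 or
                r + half + 1 > image.shape[0] or c + half + 1 > image.shape[1]):
            continue
        stamp = image[r - half:r + half + 1, c - half:c + half + 1].astype(float)
        if stamp.sum() > 0:
            stamps.append(stamp / stamp.sum())      # normalize to unit flux
    psf = np.median(np.stack(stamps), axis=0)       # median suppresses lens and arc light
    return psf / psf.sum()                          # renormalize to unit flux

In practice the quasar images sit on top of the lensing galaxy and the lensed host-galaxy arc, so a real analysis iterates between modelling that extended light and updating the PSF; the stack above only conveys why having several nearby point sources in the same exposure makes a PSF reconstruction possible.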
Obtaining lensing time delay measurements requires long-term monitoring campaigns with a high enough resolution (< 1 arcsec) to separate the multiple images. In the radio, a limited number of high-resolution interferometer arrays make these observations difficult to schedule. To overcome this problem, we propose a technique for measuring gravitational time delays which relies on monitoring the total flux density with low-resolution but high-sensitivity radio telescopes to follow the variation of the brighter image. This is then used to trigger high-resolution observations in optimal numbers, which then reveal the variation in the fainter image. We present simulations to assess the efficiency of this method, together with a pilot project observing radio lens systems with the Westerbork Synthesis Radio Telescope (WSRT) to trigger Very Large Array (VLA) observations. This new method is promising for measuring time delays because it uses relatively small amounts of time on high-resolution telescopes. This will be important because instruments with high sensitivity but limited resolution, combined with optimal use of follow-up high-resolution observations from appropriate radio telescopes, may in the future make this method a useful route to gravitational lensing time delay measurements.
The Hubble constant value is currently known to 10% accuracy unless assumptions are made for the cosmology (Sandage et al. 2006). Gravitational lens systems provide another probe of the Hubble constant using time delay measurements. However, current investigations of ~20 time delay lenses, albeit of varying levels of sophistication, have resulted in different values of the Hubble constant ranging from 50-80 km/s/Mpc. In order to reduce uncertainties, more time delay measurements are essential, together with better-determined mass models (Oguri 2007, Saha et al. 2006). We propose a more efficient technique for measuring time delays which does not require regular monitoring with a high-resolution interferometer array. The method uses double-image and long-axis quadruple lens systems in which the brighter component varies first and dominates the total flux density. Monitoring the total flux density with low-resolution but high-sensitivity radio telescopes provides the variation of the brighter image and is used to trigger high-resolution observations, which can then be used to see the variation in the fainter image. We present simulations of this method together with a pilot project using the WSRT (Westerbork Synthesis Radio Telescope) to trigger VLA (Very Large Array) observations. This new method is promising for measuring time delays because it uses relatively small amounts of time on high-resolution telescopes. This will be important because many SKA pathfinder telescopes, such as MeerKAT (Karoo Array Telescope) and ASKAP (Australian Square Kilometre Array Pathfinder), have high sensitivity but limited resolution.
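As a toy illustration of the triggering strategy described in the two abstracts above (the flare shape, noise level, threshold, and all numbers are invented for the example and are not taken from the papers), a short Python sketch could look like this:

import numpy as np

rng = np.random.default_rng(0)

# Toy two-image light curve: the brighter image A varies first and the fainter
# image B repeats the variation after the gravitational time delay.
t = np.arange(0.0, 200.0, 1.0)          # days of low-resolution monitoring
delay, ratio = 30.0, 0.4                # illustrative delay [days] and flux ratio B/A

def flare(t0):
    return np.exp(-0.5 * ((t - t0) / 5.0) ** 2)

flux_a = 10.0 + 2.0 * flare(60.0)                         # brighter image
flux_b = ratio * (10.0 + 2.0 * flare(60.0 + delay))       # delayed, fainter image
total = flux_a + flux_b + rng.normal(0.0, 0.05, t.size)   # unresolved WSRT-like total flux

# Trigger when the total flux rises well above the quiescent level, then schedule
# high-resolution (VLA-like) epochs over a window set by a rough prior on the delay.
quiescent, noise = np.median(total[:30]), np.std(total[:30])
triggered = t[total > quiescent + 5.0 * noise]
if triggered.size:
    t0 = triggered[0]
    window = 2.0 * delay                # here `delay` stands in for a prior estimate
    print(f"variation detected at day {t0:.0f}: request high-resolution epochs "
          f"from day {t0:.0f} to day {t0 + window:.0f}")

The point of the sketch is simply that the expensive high-resolution array is only used inside the window opened by the cheap total-flux monitoring, which is the economy the abstracts above argue for.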