Strong gravitational lensing enables a wide range of science: probing cosmography; testing dark matter models; understanding galaxy evolution; and magnifying the faint, small and distant Universe. However, to date, exploiting strong lensing as a tool for these numerous cosmological and astrophysical applications has been severely hampered by limited sample sizes. LSST will drive studies of strongly lensed galaxies, galaxy groups and galaxy clusters into the statistical age. Time-variable lensing events, e.g. measuring cosmological time delays from strongly lensed supernovae and quasars, place the strongest constraints on LSST's observing strategy and have been considered in the DESC observing strategy white papers. Here we focus on aspects of 'static' lens discovery that will be affected by the observing strategy. In summary, we advocate (1) ensuring comparable (sub-arcsecond) seeing in the g band as in r and i to facilitate discovery of gravitational lenses, and (2) initially surveying the entire observable extragalactic sky as rapidly as possible to enable early science spanning a broad range of static and transient interests.
The LSST survey will provide unprecedented statistical power for measurements of dark energy. Consequently, controlling systematic uncertainties is becoming more important than ever. The LSST observing strategy will affect the statistical uncertainty and systematics control for many science cases; here, we focus on weak lensing systematics. The fact that the LSST observing strategy involves hundreds of visits to the same sky area provides new opportunities for systematics mitigation. We explore these opportunities by testing how different dithering strategies (pointing offsets and rotational angle of the camera in different exposures) affect additive weak lensing shear systematics on a baseline operational simulation, using the $\rho$-statistics formalism. Some dithering strategies improve systematics control at the end of the survey by a factor of up to $\sim 3$-$4$ relative to others. We find that a random translational dithering strategy, applied with random rotational dithering at every filter change, is the most effective of those strategies tested in this work at averaging down systematics. Adopting this dithering algorithm, we explore the effect of varying the area of the survey footprint, exposure time, number of exposures in a visit, and coverage of the Galactic plane. We find that any change that increases the average number of exposures (in filters relevant to weak lensing) reduces the additive shear systematics. Some ways to achieve this increase may not be favorable for the weak lensing statistical constraining power or for other probes, and we explore the relative trade-offs between these options given constraints on the overall survey parameters.
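The favored strategy above can be sketched in a few lines: draw a fresh translational offset for every visit, and re-draw the camera rotator angle only when the filter changes. This is a minimal illustrative sketch, not the simulation code used in the paper; the maximum offset radius and function names are assumptions for illustration (the offset scale is typically tied to the field-of-view radius).

```python
import math
import random

def random_translational_dither(max_offset_deg=1.75):
    """Draw a pointing offset uniformly over a disc of radius
    max_offset_deg (illustrative value, of order the LSST
    field-of-view radius). sqrt() makes the draw uniform in area."""
    r = max_offset_deg * math.sqrt(random.random())
    theta = random.uniform(0.0, 2.0 * math.pi)
    return r * math.cos(theta), r * math.sin(theta)

def random_rotational_dither():
    """Draw a new camera rotator angle in degrees."""
    return random.uniform(0.0, 360.0)

def dither_sequence(visit_filters):
    """Apply a translational dither every visit and a rotational
    dither at every filter change, per the strategy described above."""
    pointings = []
    rot = random_rotational_dither()
    prev_filter = None
    for f in visit_filters:
        if f != prev_filter:          # filter change: new rotator angle
            rot = random_rotational_dither()
        dx, dy = random_translational_dither()
        pointings.append((f, dx, dy, rot))
        prev_filter = f
    return pointings
```

Repeated visits in the same filter keep a fixed rotator angle, so only the translational offsets vary between them; the rotation is randomized across filter blocks, which is what averages down camera-frame additive systematics over the survey.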
The Large Synoptic Survey Telescope is designed to provide an unprecedented optical imaging dataset that will support investigations of our Solar System, Galaxy and Universe, across half the sky and over ten years of repeated observation. However, exactly how the LSST observations will be taken (the observing strategy or cadence) is not yet finalized. In this dynamically-evolving community white paper, we explore how the detailed performance of the anticipated science investigations is expected to depend on small changes to the LSST observing strategy. Using realistic simulations of the LSST schedule and observation properties, we design and compute diagnostic metrics and Figures of Merit that provide quantitative evaluations of different observing strategies, analyzing their impact on a wide range of proposed science projects. This is work in progress: we are using this white paper to communicate to each other the relative merits of the observing strategy choices that could be made, in an effort to maximize the scientific value of the survey. The investigation of some science cases leads to suggestions for new strategies that could be simulated and potentially adopted. Notably, we find motivation for exploring departures from a spatially uniform annual tiling of the sky: focusing instead on different parts of the survey area in different years in a rolling cadence is likely to have significant benefits for a number of time domain and moving object astronomy projects. The communal assembly of a suite of quantified and homogeneously coded metrics is the vital first step towards an automated, systematic, science-based assessment of any given cadence simulation, that will enable the scheduling of the LSST to be as well-informed as possible.
The Wide-Field Infrared Survey Telescope (WFIRST) is expected to launch in the mid-2020s. With its wide-field near-infrared (NIR) camera, it will survey the sky to unprecedented detail. As part of normal operations and as the result of multiple expected dedicated surveys, WFIRST will produce several relatively wide-field (tens of square degrees) deep (limiting magnitude of 28 or fainter) fields. In particular, a planned supernova survey is expected to image 3 deep fields in the LSST footprint roughly every 5 days over 2 years. Stacking all data, this survey will produce, over all WFIRST supernova fields in the LSST footprint, ~12-25 deg^2 and ~5-15 deg^2 regions to depths of ~28 mag and ~29 mag, respectively. We suggest LSST undertake mini-surveys that will match the WFIRST cadence and simultaneously observe the supernova survey fields during the 2-year WFIRST supernova survey, achieving a stacked depth similar to that of the WFIRST data. We also suggest additional observations of these same regions throughout the LSST survey to get deep images earlier, have long-term monitoring in the fields, and produce deeper images overall. These fields will provide a legacy for cosmology, extragalactic, and transient/variable science.
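The stacked-depth claims above follow from the standard background-limited coadd relation: S/N grows as $\sqrt{N}$ over $N$ equal-depth exposures, so the limiting magnitude deepens by $2.5\log_{10}\sqrt{N} = 1.25\log_{10}N$. A minimal sketch of this arithmetic (the function name and example numbers are illustrative, not taken from the WFIRST/LSST survey plans):

```python
import math

def coadded_depth(single_visit_depth_mag, n_visits):
    """Background-limited coadd depth: stacking N equal exposures
    improves S/N by sqrt(N), i.e. the limiting magnitude deepens
    by 2.5*log10(sqrt(N)) = 1.25*log10(N)."""
    return single_visit_depth_mag + 1.25 * math.log10(n_visits)

# Illustrative: ~100 visits at a 24.7 mag single-visit depth
# reach a coadd depth of 24.7 + 1.25*2 = 27.2 mag.
```

This is why matching the WFIRST cadence with repeated LSST mini-survey visits can build up a stacked depth comparable to the WFIRST data over the 2-year survey window.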
The generation-defining Vera C. Rubin Observatory will make state-of-the-art measurements of both the static and transient universe through its Legacy Survey of Space and Time (LSST). With such capabilities, it is immensely challenging to optimize the LSST observing strategy across the survey's wide range of science drivers. Many aspects of the LSST observing strategy relevant to the LSST Dark Energy Science Collaboration, such as survey footprint definition, single-visit exposure time and the cadence of repeat visits in different filters, are yet to be finalized. Here, we present metrics used to assess the impact of observing strategy on the cosmological probes considered most sensitive to survey design; these are large-scale structure, weak lensing, type Ia supernovae, kilonovae and strong lens systems (as well as photometric redshifts, which enable many of these probes). We evaluate these metrics for over 100 different simulated potential survey designs. Our results show that multiple observing strategy decisions can profoundly impact cosmological constraints with LSST; these include adjusting the survey footprint, ensuring repeat nightly visits are taken in different filters and enforcing regular cadence. We provide public code for our metrics, which makes them readily available for evaluating further modifications to the survey design. We conclude with a set of recommendations and highlight observing strategy factors that require further research.
Cosmology is one of the four science pillars of LSST, which promises to be transformative for our understanding of dark energy and dark matter. The LSST Dark Energy Science Collaboration (DESC) has been tasked with deriving constraints on cosmological parameters from LSST data. Each of the cosmological probes for LSST is heavily impacted by the choice of observing strategy. This white paper is written by the LSST DESC Observing Strategy Task Force (OSTF), which represents the entire collaboration, and aims to make recommendations on observing strategy that will benefit all cosmological analyses with LSST. It is accompanied by the DESC DDF (Deep Drilling Fields) white paper (Scolnic et al.). We use a variety of metrics to understand the effects of the observing strategy on measurements of weak lensing, large-scale structure, clusters, photometric redshifts, supernovae, strong lensing and kilonovae. In order to reduce systematic uncertainties, we conclude that the current baseline observing strategy needs to be significantly modified to result in the best possible cosmological constraints. We provide some key recommendations: moving the WFD (Wide-Fast-Deep) footprint to avoid regions of high extinction, taking visit pairs in different filters, changing the 2x15s snaps to a single exposure to improve efficiency, focusing on strategies that reduce long gaps (>15 days) between observations, and prioritizing spatial uniformity at several intervals during the 10-year survey.