Weak lensing by large-scale structure is a powerful technique to probe the dark components of the universe. To understand the measurement process of weak lensing and the associated systematic effects, image simulations are becoming increasingly important. For this purpose we present a first implementation of the $\textit{Monte Carlo Control Loops}$ ($\textit{MCCL}$; Refregier & Amara 2014), a coherent framework for studying systematic effects in weak lensing. It allows us to model and calibrate the shear measurement process using image simulations from the Ultra Fast Image Generator (UFig; Bergé et al. 2013). We apply this framework to a subset of the data taken during the Science Verification (SV) period of the Dark Energy Survey (DES). We calibrate the UFig simulations to be statistically consistent with DES images. We then perform tolerance analyses by perturbing the simulation parameters and study their impact on the shear measurement at the one-point level. This allows us to determine the relative importance of the different input parameters to the simulations. For spatially constant systematic errors and six simulation parameters, the calibration of the simulation reaches the weak lensing precision needed for the DES SV survey area. Furthermore, we find a sensitivity of the shear measurement to the intrinsic ellipticity distribution, and an interplay between the magnitude-size and the pixel-value diagnostics in constraining the noise model. This work is the first application of the $\textit{MCCL}$ framework to data and shows how it can be used to methodically study the impact of systematics on the cosmic shear measurement.
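As a schematic illustration of the tolerance analysis described above (not the actual MCCL/UFig pipeline), the following Python sketch perturbs a few hypothetical simulation parameters one at a time and checks the induced shift in the mean measured shear against an illustrative one-point tolerance; the parameter names, the toy response model, and the threshold are assumptions for demonstration only.

```python
import numpy as np

# Schematic tolerance loop in the spirit of MCCL: perturb one simulation
# parameter at a time and record the shift it induces in the mean measured
# shear (a one-point statistic). The parameters, the toy linear response,
# and the tolerance value are illustrative placeholders, not the UFig/DES
# implementation.

def measure_mean_shear(params, rng):
    """Stand-in for: render an image with these parameters, run the shear
    pipeline, and return the mean measured shear. Here: a toy linear
    response plus measurement noise."""
    response = {"sky_level": 1e-4, "psf_fwhm": 5e-3, "gain": 2e-4}
    bias = sum(response[name] * (value - 1.0) for name, value in params.items())
    return bias + 1e-4 * rng.standard_normal()

rng = np.random.default_rng(42)
fiducial = {"sky_level": 1.0, "psf_fwhm": 1.0, "gain": 1.0}  # normalized units
tolerance = 3e-3                                             # illustrative shear-bias budget

baseline = measure_mean_shear(fiducial, rng)
for name in fiducial:
    for delta in (-0.05, +0.05):                             # +/- 5% perturbations
        perturbed = dict(fiducial, **{name: 1.0 + delta})
        shift = measure_mean_shear(perturbed, rng) - baseline
        status = "within tolerance" if abs(shift) < tolerance else "exceeds tolerance"
        print(f"{name:>10s} {delta:+.2f}: d<shear> = {shift:+.2e}  ({status})")
```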
We present an analysis of supernova light curves simulated for the upcoming Dark Energy Survey (DES) supernova search. The simulations employ a code suite that generates and fits realistic light curves in order to obtain distance-modulus/redshift pairs that are passed to a cosmology fitter. We investigated several different survey strategies, including field selection, supernova selection biases, and photometric redshift measurements. Using the results of this study, we chose a 30 square degree search area in the griz filter set. We forecast 1) that this survey will provide a homogeneous sample of up to 4000 Type Ia supernovae in the redshift range 0.05 < z < 1.2, and 2) that the increased red efficiency of the DES camera will significantly improve high-redshift color measurements. The redshift of each supernova with an identified host galaxy will be obtained from spectroscopic observations of the host. Supernova spectra will be obtained for a subset of the sample and used for control studies. In addition, we have investigated the use of combined photometric redshifts that take into account data from both the host and the supernova. We have estimated the likely contamination from core-collapse supernovae based on photometric identification, and find that a Type Ia supernova sample purity of up to 98% is obtainable under specific assumptions. Furthermore, we present systematic uncertainties due to sample purity, photometric calibration, dust extinction priors, filter-centroid shifts, and inter-calibration. We conclude by estimating the uncertainty on the cosmological parameters that will be measured from the DES supernova data.
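To make the final step concrete, here is a minimal sketch (not the code suite used in the paper) of fitting cosmological parameters to distance-modulus/redshift pairs, assuming a flat wCDM model; the toy data generated below stand in for the simulated, light-curve-fitted DES sample.

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import minimize

# Minimal sketch of the cosmology-fitting step: a chi-square fit of a flat
# wCDM model to distance-modulus/redshift pairs. The (z, mu, sigma) values
# below are toy data, not the simulated DES sample.

C_KM_S, H0 = 299792.458, 70.0        # speed of light [km/s], Hubble constant [km/s/Mpc]

def mu_model(z, omega_m, w):
    """Distance modulus mu(z) for a flat wCDM cosmology."""
    E = lambda zp: np.sqrt(omega_m * (1 + zp) ** 3 +
                           (1 - omega_m) * (1 + zp) ** (3 * (1 + w)))
    z = np.atleast_1d(z)
    d_c = np.array([quad(lambda zp: 1.0 / E(zp), 0.0, zi)[0] for zi in z])  # comoving, units of c/H0
    d_l = (1 + z) * (C_KM_S / H0) * d_c                                     # luminosity distance [Mpc]
    return 5.0 * np.log10(d_l) + 25.0

# Toy "observed" sample (in practice: simulated light curves fitted for mu and z).
rng = np.random.default_rng(1)
z_obs = np.sort(rng.uniform(0.05, 1.2, 200))
sigma_mu = 0.15
mu_obs = mu_model(z_obs, 0.3, -1.0) + sigma_mu * rng.standard_normal(z_obs.size)

def chi2(params):
    omega_m, w = params
    if not 0.0 < omega_m < 1.0:      # keep the minimizer in a physical region
        return 1e10
    return np.sum(((mu_obs - mu_model(z_obs, omega_m, w)) / sigma_mu) ** 2)

fit = minimize(chi2, x0=[0.25, -0.9], method="Nelder-Mead")
print("best-fit (Omega_m, w):", fit.x)
```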
We present simulations for the Dark Energy Survey (DES) using a new code suite (SNANA) that generates realistic supernova light curves, accounting for atmospheric seeing conditions and intrinsic supernova luminosity variations with the MLCS2k2 or SALT2 models. Simulated uncertainties include statistical noise from photon statistics and sky noise. We applied SNANA to simulate DES supernova observations and employed an MLCS-based fitter to obtain the distance modulus for each simulated light curve. We used the light curves to study selection biases for high-redshift supernovae and to constrain the optimal DES observing strategy using the Dark Energy Task Force figure of merit.
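As a pointer to how observing strategies can be ranked, the snippet below evaluates a commonly used form of the Dark Energy Task Force figure of merit, FoM = 1/sqrt(det C), from a made-up Fisher matrix for the dark-energy parameters (w0, wa); the numbers are purely illustrative and not a DES forecast.

```python
import numpy as np

# Illustrative DETF figure of merit: FoM = 1/sqrt(det C), where C is the
# marginalized 2x2 covariance of (w0, wa). The Fisher matrix below is a
# made-up example used only to show the arithmetic.

fisher = np.array([[ 40.0, -12.0],    # F[w0,w0], F[w0,wa]
                   [-12.0,   6.0]])   # F[wa,w0], F[wa,wa]

cov = np.linalg.inv(fisher)           # marginalized covariance of (w0, wa)
fom = 1.0 / np.sqrt(np.linalg.det(cov))
print(f"sigma(w0) = {np.sqrt(cov[0, 0]):.3f}, "
      f"sigma(wa) = {np.sqrt(cov[1, 1]):.3f}, FoM = {fom:.1f}")
```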
The Dark Energy Survey (DES) is a five-year optical imaging campaign with the goal of understanding the origin of cosmic acceleration. DES performs a 5000 square degree survey of the southern sky in five optical bands (g,r,i,z,Y) to a depth of ~24th magnitude. Contemporaneously, DES performs a deep, time-domain survey in four optical bands (g,r,i,z) over 27 square degrees. DES exposures are processed nightly with an evolving data reduction pipeline and evaluated for image quality to determine whether they need to be retaken. Difference imaging and transient-source detection are also performed nightly for the time-domain component. On a bi-annual basis, DES exposures are reprocessed with a refined pipeline and coadded to maximize imaging depth. Here we describe the DES image processing pipeline in support of DES science, as a reference for users of archival DES data, and as a guide for future astronomical surveys.
This overview article describes the legacy prospects and discovery potential of the Dark Energy Survey (DES) beyond cosmological studies, illustrating them with examples from the DES early data. DES is using a wide-field camera (DECam) on the 4m Blanco Telescope in Chile to image 5000 sq deg of the sky in five filters (grizY). By its completion the survey is expected to have generated a catalogue of 300 million galaxies with photometric redshifts and 100 million stars. In addition, a time-domain search over 27 sq deg is expected to yield a sample of thousands of Type Ia supernovae and other transients. The main goals of DES are to characterise dark energy and dark matter, and to test alternative models of gravity; these goals will be pursued by studying large scale structure, cluster counts, weak gravitational lensing and Type Ia supernovae. However, DES also provides a rich data set which allows us to study many other aspects of astrophysics. In this paper we focus on additional science with DES, emphasizing areas where the survey makes a difference with respect to other current surveys. The paper illustrates, using early data (from `Science Verification', and from the first, second and third seasons of observations), what DES can tell us about the solar system, the Milky Way, galaxy evolution, quasars, and other topics. In addition, we show that if the cosmological model is assumed to be Lambda Cold Dark Matter (LCDM) then important astrophysics can be deduced from the primary DES probes. Highlights from DES early data include the discovery of 34 Trans-Neptunian Objects, 17 dwarf satellites of the Milky Way, one published z > 6 quasar (and more confirmed) and two published superluminous supernovae (and more confirmed).
Cosmic voids and their corresponding redshift-aggregated projections of mass densities, known as troughs, play an important role in our attempt to model the large-scale structure of the Universe. Understanding these structures leads to tests comparing the standard model with alternative cosmologies, constraints on the dark energy equation of state, and provides evidence to differentiate among gravitational theories. In this paper, we extend the subspace-constrained mean shift algorithm, a recently introduced method to estimate density ridges, and apply it to 2D weak-lensing mass density maps from the Dark Energy Survey Y1 data release to identify curvilinear filamentary structures. We compare the obtained ridges with previous approaches to extract trough structure in the same data, and apply curvelets as an alternative wavelet-based method to constrain densities. We then invoke the Wasserstein distance between noisy and noiseless simulations to validate the denoising capabilities of our method. Our results demonstrate the viability of ridge estimation as a precursor for denoising weak lensing quantities to recover the large-scale structure, paving the way for a more versatile and effective search for troughs.
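As a simplified illustration of the validation step (not the ridge or curvelet estimators themselves), the sketch below compares a toy noiseless map with denoised versions of its noisy counterpart using the one-dimensional Wasserstein distance between pixel-value distributions; the Gaussian-filter "denoiser" and the synthetic maps are assumptions for demonstration.

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from scipy.stats import wasserstein_distance

# Toy validation in the spirit of the paper: measure how close a denoised map
# gets to the noiseless truth via the 1D Wasserstein distance between their
# pixel-value distributions. The maps and the Gaussian-smoothing "denoiser"
# are stand-ins, not the subspace-constrained mean shift or curvelet methods.

rng = np.random.default_rng(0)
truth = gaussian_filter(rng.standard_normal((256, 256)), sigma=8)   # smooth "signal" map
noisy = truth + 0.5 * rng.standard_normal(truth.shape)              # add pixel noise

for sigma_smooth in (0, 2, 4, 8):
    denoised = gaussian_filter(noisy, sigma=sigma_smooth) if sigma_smooth else noisy
    dist = wasserstein_distance(truth.ravel(), denoised.ravel())
    print(f"smoothing sigma = {sigma_smooth}: W1(truth, denoised) = {dist:.4f}")
```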