We describe an algorithm for identifying point-source transients and moving objects on reference-subtracted optical images containing artifacts of processing and instrumentation. The algorithm makes use of the supervised machine learning technique known as Random Forest. We present results from its use in the Dark Energy Survey Supernova program (DES-SN), where it was trained using a sample of 898,963 signal and background events generated by the transient detection pipeline. After reprocessing the data collected during the first DES-SN observing season (Sep. 2013 through Feb. 2014) using the algorithm, the number of transient candidates eligible for human scanning decreased by a factor of 13.4, while only 1 percent of the artificial Type Ia supernovae (SNe) injected into search images to monitor survey efficiency were lost, most of which were very faint events. Here we characterize the algorithm's performance in detail, and we discuss how it can inform pipeline design decisions for future time-domain imaging surveys, such as the Large Synoptic Survey Telescope and the Zwicky Transient Facility. An implementation of the algorithm and the training data used in this paper are available at http://portal.nersc.gov/project/dessn/autoscan.
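The classification step above can be sketched with an off-the-shelf Random Forest. The snippet below is a minimal illustration using scikit-learn; the synthetic features, labels, and score threshold are placeholders, not the actual autoscan feature set or training sample.

```python
# Minimal sketch of real/bogus classification with a Random Forest.
# The synthetic features and labels stand in for the 898,963 labeled
# signal/background events; they are NOT the actual autoscan features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Stand-in for per-detection features measured on the difference image
# (e.g. shape parameters, flux ratios, distance to nearest catalog source).
n_events, n_features = 10_000, 10
X = rng.normal(size=(n_events, n_features))
y = (X[:, 0] + rng.normal(scale=1.0, size=n_events) > 0).astype(int)  # 1 = real, 0 = artifact

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0, n_jobs=-1)
clf.fit(X_train, y_train)

# Score each detection; only candidates above a tuned threshold are
# forwarded to human scanners, cutting the scanning load.
scores = clf.predict_proba(X_test)[:, 1]
print(f"fraction forwarded at threshold 0.5: {(scores > 0.5).mean():.3f}")
```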
We describe the difference imaging pipeline (DiffImg) used to detect transients in deep images from the Dark Energy Survey Supernova program (DES-SN) in its first observing season from Aug 2013 through Feb 2014. DES-SN is a search for transients in which ten 3-deg^2 fields are repeatedly observed in the g,r,i,z passbands with a cadence of about 1 week. The observing strategy has been optimized to measure high-quality light curves and redshifts for thousands of Type Ia supernovae (SNe Ia) with the goal of measuring dark energy parameters. The essential DiffImg functions are to align each search image to a deep reference image, perform a pixel-by-pixel subtraction, and then examine the subtracted image for significant positive detections of point-source objects. The vast majority of detections are subtraction artifacts, but after selection requirements and image filtering with an automated scanning program, there are ~130 detections per deg^2 per observation in each band, of which only ~25% are artifacts. Of the 7500 transients discovered by DES-SN in its first observing season, each requiring a detection on at least 2 separate nights, Monte Carlo simulations predict that 27% are expected to be supernovae. Another 30% of the transients are artifacts, and most of the remaining transients are AGN and variable stars. Fake SNe Ia are overlaid onto the images to rigorously evaluate detection efficiencies and to understand the DiffImg performance. The DiffImg efficiency measured with fake SNe agrees well with expectations from a Monte Carlo simulation that uses analytical calculations of the fluxes and their uncertainties. In our 8 shallow fields with single-epoch 50% completeness depth ~23.5, the SN Ia efficiency falls to 1/2 at redshift z ≈ 0.7; in our 2 deep fields with mag-depth ~24.5, the efficiency falls to 1/2 at z ≈ 1.1.
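The three essential DiffImg functions named above (align, subtract, detect) can be illustrated with a toy example. The sketch below assumes a known whole-pixel astrometric offset and uniform Gaussian noise; the real pipeline additionally performs subpixel resampling, PSF matching, and masking, none of which are shown.

```python
# Toy sketch of the align / subtract / detect loop of a difference-imaging
# search. All images, offsets, and noise levels are invented for illustration.
import numpy as np
from scipy.ndimage import label

rng = np.random.default_rng(1)
reference = rng.normal(100.0, 5.0, size=(256, 256))           # deep template
search = (np.roll(reference, (2, -3), axis=(0, 1))
          + rng.normal(0.0, 5.0, size=(256, 256)))            # new epoch, offset
search[120:123, 80:83] += 60.0                                # injected transient

# 1. Align the search image to the reference (here, a known integer offset).
aligned = np.roll(search, (-2, 3), axis=(0, 1))

# 2. Pixel-by-pixel subtraction.
diff = aligned - reference

# 3. Keep significant positive detections: connected pixels above 5 sigma.
candidates, n_det = label(diff > 5.0 * diff.std())
print(f"{n_det} candidate detection(s) above threshold")
```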
Cosmic voids and their corresponding redshift-aggregated projections of mass densities, known as troughs, play an important role in our attempt to model the large-scale structure of the Universe. Understanding these structures yields tests comparing the standard model with alternative cosmologies, constraints on the dark energy equation of state, and evidence to differentiate among gravitational theories. In this paper, we extend the subspace-constrained mean shift algorithm, a recently introduced method to estimate density ridges, and apply it to 2D weak-lensing mass density maps from the Dark Energy Survey Y1 data release to identify curvilinear filamentary structures. We compare the obtained ridges with previous approaches to extracting trough structure in the same data, and apply curvelets as an alternative wavelet-based method to constrain densities. We then invoke the Wasserstein distance between noisy and noiseless simulations to validate the denoising capabilities of our method. Our results demonstrate the viability of ridge estimation as a precursor for denoising weak-lensing quantities to recover the large-scale structure, paving the way for a more versatile and effective search for troughs.
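For reference, the core subspace-constrained mean shift (SCMS) update for a one-dimensional ridge in 2D data can be written compactly. The sketch below implements the basic iteration with a Gaussian kernel, projecting the mean-shift vector onto the local Hessian eigendirection across the ridge; bandwidth selection, convergence testing, and the extensions developed in the paper are omitted.

```python
# Bare-bones SCMS iteration for a 1D ridge in 2D; an illustrative sketch,
# not the extended method used in the paper.
import numpy as np

def scms_step(walkers, data, h):
    """One SCMS update: move each walker along its mean-shift vector,
    projected onto the local eigendirection orthogonal to the ridge."""
    out = np.empty_like(walkers)
    for k, x in enumerate(walkers):
        d = data - x                                        # (n, 2) offsets
        w = np.exp(-0.5 * np.sum(d * d, axis=1) / h**2)     # Gaussian kernel weights
        ms = (w[:, None] * data).sum(axis=0) / w.sum() - x  # mean-shift vector
        # Local Hessian of the kernel density estimate (up to a positive factor).
        H = (w[:, None, None]
             * (d[:, :, None] * d[:, None, :] / h**2 - np.eye(2))).sum(axis=0)
        evals, evecs = np.linalg.eigh(H)                    # ascending eigenvalues
        v = evecs[:, [0]]                                   # direction across the ridge
        out[k] = x + v @ v.T @ ms                           # constrained update
    return out

# Demo: recover a sinusoidal filament from noisy 2D points.
rng = np.random.default_rng(0)
t = rng.uniform(0.0, 2.0 * np.pi, 500)
data = np.c_[t, np.sin(t)] + rng.normal(0.0, 0.15, size=(500, 2))
walkers = data.copy()
for _ in range(30):
    walkers = scms_step(walkers, data, h=0.4)               # walkers settle on the ridge
```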
The Dark Energy Survey (DES) is a project with the goal of building, installing and exploiting a new 74-CCD camera at the Blanco telescope, in order to study the nature of cosmic acceleration. It will cover 5000 square degrees of the southern hemisphere sky and will record the positions and shapes of 300 million galaxies up to redshift 1.4. The survey will be completed using 525 nights during a 5-year period starting in 2012. About 1 TB of raw data will be produced every night, including science and calibration images. The DES data management system has been designed for the processing, calibration and archiving of these data. It is being developed by collaborating DES institutions, led by NCSA. In this contribution, we describe the basic functions of the system, which scientific codes are involved, and how the Data Challenge process works to simultaneously improve the data management algorithms and the Science Working Group analysis codes.
The Dark Energy Survey (DES) is a five-year optical imaging campaign with the goal of understanding the origin of cosmic acceleration. DES performs a 5000 square degree survey of the southern sky in five optical bands (g,r,i,z,Y) to a depth of ~24th magnitude. Contemporaneously, DES performs a deep, time-domain survey in four optical bands (g,r,i,z) over 27 square degrees. DES exposures are processed nightly with an evolving data reduction pipeline and evaluated for image quality to determine whether they need to be retaken. Difference imaging and transient-source detection for the time-domain component are also performed nightly. On a bi-annual basis, DES exposures are reprocessed with a refined pipeline and coadded to maximize imaging depth. Here we describe the DES image processing pipeline in support of DES science, as a reference for users of archival DES data, and as a guide for future astronomical surveys.
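Coaddition increases depth because independent exposures are combined with inverse-variance weights. The toy sketch below, with invented noise levels, demonstrates the principle; the actual DES coadd pipeline also handles astrometric resampling, PSF variation, and defect masking.

```python
# Inverse-variance-weighted coaddition of repeated exposures; a minimal
# sketch with made-up noise levels, not the DES coadd pipeline.
import numpy as np

rng = np.random.default_rng(7)
truth = np.full((64, 64), 50.0)                  # noiseless sky patch
sigmas = (4.0, 5.0, 6.0)                          # per-exposure noise levels
exposures = [truth + rng.normal(0.0, s, truth.shape) for s in sigmas]
weights = [1.0 / s**2 for s in sigmas]            # inverse-variance weights

coadd = sum(w * im for w, im in zip(weights, exposures)) / sum(weights)
coadd_sigma = np.sqrt(1.0 / sum(weights))         # noise of the weighted mean
print(f"single-exposure noise 4.0-6.0 -> coadd noise {coadd_sigma:.2f}")
```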
We describe the first public data release of the Dark Energy Survey, DES DR1, consisting of reduced single-epoch images, coadded images, coadded source catalogs, and associated products and services assembled over the first three years of DES science operations. DES DR1 is based on optical/near-infrared imaging from 345 distinct nights (August 2013 to February 2016) by the Dark Energy Camera mounted on the 4-m Blanco telescope at Cerro Tololo Inter-American Observatory in Chile. We release data from the DES wide-area survey covering ~5,000 sq. deg. of the southern Galactic cap in five broad photometric bands, grizY. DES DR1 has a median delivered point-spread function of g = 1.12, r = 0.96, i = 0.88, z = 0.84, and Y = 0.90 arcsec FWHM, a photometric precision of < 1% in all bands, and an astrometric precision of 151 mas. The median coadded catalog depth for a 1.95 arcsec diameter aperture at S/N = 10 is g = 24.33, r = 24.08, i = 23.44, z = 22.69, and Y = 21.44 mag. DES DR1 includes nearly 400M distinct astronomical objects detected in ~10,000 coadd tiles of size 0.534 sq. deg. produced from ~39,000 individual exposures. Benchmark galaxy and stellar samples contain ~310M and ~80M objects, respectively, following a basic object quality selection. These data are accessible through a range of interfaces, including query web clients, image cutout servers, Jupyter notebooks, and an interactive coadd image visualization tool. DES DR1 constitutes the largest photometric data set to date at the achieved depth and photometric precision.
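The S/N = 10 depths quoted above have a simple magnitude-space interpretation: a fractional flux error of 1/(S/N) propagates to a magnitude uncertainty of about (2.5/ln 10)/(S/N), i.e. roughly 0.11 mag at S/N = 10. A one-line check:

```python
# Magnitude error implied by a flux signal-to-noise ratio:
# m = -2.5 log10(F)  =>  sigma_m ≈ (2.5 / ln 10) * (sigma_F / F)
import numpy as np

snr = 10.0
sigma_m = (2.5 / np.log(10.0)) / snr
print(f"S/N = {snr:.0f}  ->  sigma_m ≈ {sigma_m:.3f} mag")   # ≈ 0.109 mag
```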