Many scientific goals for the Dark Energy Survey (DES) require calibration of optical/NIR broadband $b = grizY$ photometry that is stable in time and uniform over the celestial sky to one percent or better. It is also necessary to limit to similar accuracy systematic uncertainty in the calibrated broadband magnitudes due to uncertainty in the spectrum of the source. Here we present a Forward Global Calibration Method (FGCM) for photometric calibration of the DES, and we present results of its application to the first three years of the survey (Y3A1). The FGCM combines data taken with auxiliary instrumentation at the observatory with data from the broad-band survey imaging itself and models of the instrument and atmosphere to estimate the spatial- and time-dependence of the passbands of individual DES survey exposures. Standard passbands are chosen that are typical of the passbands encountered during the survey. The passband of any individual observation is combined with an estimate of the source spectral shape to yield a magnitude $m_b^{\mathrm{std}}$ in the standard system. This chromatic correction to the standard system is necessary to achieve sub-percent calibrations. The FGCM achieves reproducible and stable photometric calibration of standard magnitudes $m_b^{\mathrm{std}}$ of stellar sources over the multi-year Y3A1 data sample with residual random calibration errors of $\sigma = 5$-$6\,\mathrm{mmag}$ per exposure. The accuracy of the calibration is uniform across the $5000\,\mathrm{deg}^2$ DES footprint to within $\sigma = 7\,\mathrm{mmag}$. The systematic uncertainties of magnitudes in the standard system due to the spectra of sources are less than $5\,\mathrm{mmag}$ for main-sequence stars with $0.5 < g-i < 3.0$.
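As a rough illustration of the chromatic correction described above, the Python sketch below computes a photon-counting synthetic AB magnitude of a toy stellar spectrum through an exposure-specific passband and through a standard passband, and reports the offset between the two. The passbands, the spectrum, and all function and variable names are invented for illustration and are not part of the FGCM code.

```python
import numpy as np

def synth_mag(wavelength, throughput, flux_lambda):
    """Synthetic AB magnitude of an SED (F_lambda units) through a passband.

    Photon-counting definition: integrate F_lambda * S(lambda) * lambda and
    compare against the AB reference spectrum (3631 Jy) through the same passband.
    """
    c_aa = 2.998e18                               # speed of light in Angstrom/s
    f_ab = 3631e-23 * c_aa / wavelength**2        # AB reference as F_lambda
    num = np.trapz(flux_lambda * throughput * wavelength, wavelength)
    den = np.trapz(f_ab * throughput * wavelength, wavelength)
    return -2.5 * np.log10(num / den)

# Hypothetical inputs: a standard passband, an exposure-specific passband with a
# mild wavelength-dependent tilt, and a power-law stellar spectrum (toy g band).
wl = np.linspace(3800.0, 5500.0, 500)                              # Angstrom
std_passband = np.exp(-0.5 * ((wl - 4750.0) / 400.0) ** 2)
obs_passband = std_passband * (1.0 - 2e-4 * (wl - 4750.0) / 100.0)  # toy atmospheric tilt
sed = (wl / 4750.0) ** -1.5                                         # arbitrary normalization

# Chromatic offset between the observed and the standard system for this source.
m_obs = synth_mag(wl, obs_passband, sed)
m_std = synth_mag(wl, std_passband, sed)
print(f"chromatic offset m_std - m_obs = {1e3 * (m_std - m_obs):.2f} mmag")
```

The offset depends on both the passband tilt and the source spectral shape, which is why a single gray zero-point per exposure is not sufficient at the few-mmag level.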
The Dark Energy Survey (DES) is a 5000 deg^2 grizY survey reaching characteristic photometric depths of 24th magnitude (10 sigma) and enabling accurate photometry and morphology of objects ten times fainter than in SDSS. Preparations for DES have included building a dedicated 3 deg^2 CCD camera (DECam), upgrading the existing CTIO Blanco 4-m telescope, and developing a new high-performance-computing (HPC) enabled data management system (DESDM). The DESDM system will be used for processing, calibrating, and serving the DES data. The total data volumes are high (~2 PB), and so considerable effort has gone into designing an automated processing and quality-control system. Special-purpose image detrending and photometric calibration codes have been developed to meet the data quality requirements, while survey astrometric calibration, coaddition, and cataloging rely on new extensions of the AstrOmatic codes, which now include tools for PSF modeling, PSF homogenization, PSF-corrected model-fitting cataloging, and joint model fitting across multiple input images. The DESDM system has been deployed on dedicated development clusters and HPC systems in the US and Germany. An extensive program of testing with small rapid-turnaround simulated datasets and larger campaign-scale simulated datasets has been carried out. The system has also been tested on large real datasets, including Blanco Cosmology Survey data from the Mosaic2 camera. In Fall 2012 the DESDM system will be used for DECam commissioning, and thereafter the system will go into full science operations.
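Purely as an illustration of the stage ordering listed in this abstract, the trivial pipeline driver below strings the named processing steps together; every function name and the orchestration itself are assumptions and do not correspond to the actual DESDM software.

```python
# Illustrative only: the processing stages named in the abstract as a minimal driver.
from typing import Callable, List

def detrend(images):            # crosstalk, bias, and flat-field removal
    return images

def photometric_cal(images):    # special-purpose photometric calibration
    return images

def astrometric_cal(images):    # astrometric solution (AstrOmatic-based in DESDM)
    return images

def coadd(images):              # coaddition of overlapping, PSF-homogenized exposures
    return images

def catalog(images):            # PSF modeling and model-fitting cataloging
    return images

PIPELINE: List[Callable] = [detrend, photometric_cal, astrometric_cal, coadd, catalog]

def process_night(raw_images):
    data = raw_images
    for stage in PIPELINE:
        data = stage(data)      # a real system would also record quality-control metrics here
    return data
```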
We present the photometric calibration of the Supernova Legacy Survey (SNLS) fields. The SNLS aims at measuring the distances to SNe Ia at 0.3 < z < 1 using MegaCam, the 1 deg^2 imager on the Canada-France-Hawaii Telescope (CFHT). The uncertainty affecting the photometric calibration of the survey dominates the systematic uncertainty of the key measurement of the survey, namely the dark energy equation of state. The photometric calibration of the SNLS requires obtaining a uniform response across the imager, calibrating the science field stars in each survey band (SDSS-like ugriz bands) with respect to standards with known flux in the same bands, and binding the calibration to the UBVRI Landolt standards used to calibrate the nearby SNe from the literature that are necessary to produce cosmological constraints. The spatial non-uniformities of the imager photometric response are mapped using dithered observations of dense stellar fields. Photometric zero-points against Landolt standards are obtained. The linearity of the instrument is studied. We show that the imager filters and photometric response are not uniform and publish correction maps. We present models of the effective passbands of the instrument as a function of the position on the focal plane. We define a natural magnitude system for MegaCam. We show that the systematics affecting the magnitude-to-flux relations can be reduced if we use the spectrophotometric standard star BD +17 4708 instead of Vega as a fundamental flux standard. We publish ugriz catalogs of tertiary standards for all the SNLS fields.
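A minimal sketch of the calibration chain described here, assuming a per-position uniformity correction map plus a Landolt-anchored zero-point, is given below; the map, its binning, and all numerical values are placeholders rather than SNLS products.

```python
import numpy as np

# Toy illustration: an instrumental magnitude is corrected for focal-plane
# non-uniformity (a coarse "grid" correction map, in mmag) and tied to a
# zero-point derived from standard-star observations.
uniformity_map = np.zeros((10, 10))          # mmag correction per focal-plane cell
uniformity_map[7, 3] = 12.0                  # pretend this cell is 12 mmag off

def calibrated_mag(m_instr, x, y, zero_point, nx=10, ny=10, fov_deg=1.0):
    """Instrumental -> calibrated magnitude at focal-plane position (x, y) in degrees."""
    ix = min(int(x / fov_deg * nx), nx - 1)
    iy = min(int(y / fov_deg * ny), ny - 1)
    return m_instr + zero_point + uniformity_map[iy, ix] / 1e3

# Hypothetical usage: a star measured at (0.35, 0.78) deg on the focal plane.
print(calibrated_mag(m_instr=-8.25, x=0.35, y=0.78, zero_point=26.5))
```

In practice the correction is measured from dithered observations of dense stellar fields, as described above, rather than assumed.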
PreCam, a precursor observational campaign supporting the Dark Energy Survey (DES), is designed to produce a photometric and astrometric catalog of nearly a hundred thousand standard stars within the DES footprint, while the PreCam instrument also serves as a prototype testbed for the Dark Energy Camera (DECam)'s hardware and software. This catalog represents a potential 100-fold increase in Southern Hemisphere photometric standard stars, and therefore will be an important component in the calibration of the Dark Energy Survey. We provide details on the PreCam instrument's design, construction, and testing, as well as results from a subset of the 51 nights of PreCam survey observations on the University of Michigan Department of Astronomy's Curtis-Schmidt telescope at Cerro Tololo Inter-American Observatory. We briefly describe the preliminary data processing pipeline that has been developed for PreCam data and the preliminary results of the instrument performance, as well as astrometry and photometry of a sample of stars previously included in other southern-sky surveys.
We characterize the ability of the Dark Energy Camera (DECam) to perform relative astrometry across its 500 Mpix, 3 deg^2 science field of view, and across 4 years of operation. This is done using internal comparisons of ~4x10^7 measurements of high-S/N stellar images obtained in repeat visits to fields of moderate stellar density, with the telescope dithered to move the sources around the array. An empirical astrometric model includes terms for: optical distortions; stray electric fields in the CCD detectors; chromatic terms in the instrumental and atmospheric optics; shifts in CCD relative positions of up to ~10 μm when the DECam temperature cycles; and low-order distortions to each exposure from changes in atmospheric refraction and telescope alignment. Errors in this astrometric model are dominated by stochastic variations with typical amplitudes of 10-30 mas (in a 30 s exposure) and 5-10 arcmin coherence length, plausibly attributed to Kolmogorov-spectrum atmospheric turbulence. The size of these atmospheric distortions is not closely related to the seeing. Given an astrometric reference catalog at density ~0.7 arcmin^{-2}, e.g. from Gaia, the typical atmospheric distortions can be interpolated to 7 mas RMS accuracy (for 30 s exposures) with 1 arcmin coherence length for residual errors. Remaining detectable error contributors are 2-4 mas RMS from unmodelled stray electric fields in the devices, and another 2-4 mas RMS from focal-plane shifts between camera thermal cycles. Thus the astrometric solution for a single DECam exposure is accurate to 3-6 mas (0.02 pixels, or 300 nm) on the focal plane, plus the stochastic atmospheric distortion.
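As a back-of-the-envelope check of the quoted single-exposure accuracy, the short script below combines the two quoted residual error terms in quadrature (an assumption; the abstract does not state how the terms combine):

```python
import math

# Assuming the quoted residuals add in quadrature: 2-4 mas from stray electric
# fields plus 2-4 mas from thermal-cycle focal-plane shifts give roughly the
# quoted 3-6 mas single-exposure accuracy, before the stochastic atmospheric term.
for label, stray_mas, thermal_mas in [("low end", 2.0, 2.0), ("high end", 4.0, 4.0)]:
    total = math.hypot(stray_mas, thermal_mas)
    print(f"{label}: sqrt({stray_mas}^2 + {thermal_mas}^2) = {total:.1f} mas")
# -> low end: 2.8 mas, high end: 5.7 mas, consistent with the quoted 3-6 mas range
```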
The Dark Energy Survey (DES) is a project with the goal of building, installing, and exploiting a new 74-CCD camera at the Blanco telescope, in order to study the nature of cosmic acceleration. It will cover 5000 square degrees of the southern-hemisphere sky and will record the positions and shapes of 300 million galaxies up to redshift 1.4. The survey will be completed using 525 nights during a 5-year period starting in 2012. About 1 TB of raw data will be produced every night, including science and calibration images. The DES data management system has been designed for the processing, calibration, and archiving of these data. It is being developed by collaborating DES institutions, led by NCSA. In this contribution, we describe the basic functions of the system, what kinds of scientific codes are involved, and how the Data Challenge process works to simultaneously improve the Data Management system algorithms and the Science Working Group analysis codes.