
The Dark Energy Survey (DES) is a 5000 deg2 grizY survey reaching characteristic photometric depths of 24th magnitude (10 sigma) and enabling accurate photometry and morphology of objects ten times fainter than in SDSS. Preparations for DES have included building a dedicated 3 deg2 CCD camera (DECam), upgrading the existing CTIO Blanco 4m telescope and developing a new high performance computing (HPC) enabled data management system (DESDM). The DESDM system will be used for processing, calibrating and serving the DES data. The total data volumes are high (~2PB), and so considerable effort has gone into designing an automated processing and quality control system. Special purpose image detrending and photometric calibration codes have been developed to meet the data quality requirements, while survey astrometric calibration, coaddition and cataloging rely on new extensions of the AstrOmatic codes which now include tools for PSF modeling, PSF homogenization, PSF corrected model fitting cataloging and joint model fitting across multiple input images. The DESDM system has been deployed on dedicated development clusters and HPC systems in the US and Germany. An extensive program of testing with small rapid turn-around and larger campaign simulated datasets has been carried out. The system has also been tested on large real datasets, including Blanco Cosmology Survey data from the Mosaic2 camera. In Fall 2012 the DESDM system will be used for DECam commissioning, and, thereafter, the system will go into full science operations.
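The PSF homogenization mentioned above can be illustrated with a short sketch: a matching kernel is computed in Fourier space that convolves each image's PSF to a common target PSF. The grid size, Gaussian PSF widths, and regularization constant below are illustrative assumptions, not DESDM or AstrOmatic parameters.

```python
import numpy as np

# Illustrative sketch of PSF homogenization via Fourier-space kernel matching.
# All parameters (grid size, Gaussian widths, regularization) are assumptions,
# not values from the actual DESDM/AstrOmatic implementation.
n = 64
x = np.fft.fftfreq(n) * n           # wrapped pixel coordinates, origin at (0, 0)
xx, yy = np.meshgrid(x, x)

def gaussian_psf(sigma):
    """Unit-flux Gaussian PSF centered on the (wrapped) origin."""
    g = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))
    return g / g.sum()

psf = gaussian_psf(1.5)             # sharper per-exposure PSF
target = gaussian_psf(2.5)          # common target PSF for the coadd

# Matching kernel K such that psf (convolved with) K = target, solved in
# Fourier space with a small regularization term to avoid dividing by the
# near-zero high-frequency power of the input PSF.
P, T = np.fft.fft2(psf), np.fft.fft2(target)
K = T * np.conj(P) / (np.abs(P)**2 + 1e-6)

# Homogenize: circular convolution of the PSF with the matching kernel
# recovers the target PSF to within the regularization error.
homogenized = np.real(np.fft.ifft2(P * K))
```

Applying the same kernel to the full image brings its PSF to the common target at the cost of correlating the pixel noise, which is the usual trade-off when homogenized coadds are produced.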
We present a galaxy catalog simulator which turns N-body simulations with subhalos into multiband photometric mocks. The simulator assigns galaxy properties to each subhalo to reproduce the observed cluster galaxy halo occupation distribution, the radial and mass dependent variation in fractions of blue galaxies, the luminosity functions in clusters and the field, and the red-sequence in clusters. Moreover, the evolution of these parameters is tuned to match existing observational constraints. Field galaxies are sampled from existing multiband photometric surveys using derived galaxy photometric redshifts. Parametrizing an ensemble of cluster galaxy properties enables us to create mock catalogs with variations in those properties, which in turn allows us to quantify the sensitivity of cluster finding to current observational uncertainties in these properties. We present an application of the catalog simulator to characterize the selection function of a galaxy cluster finder that utilizes the cluster red-sequence galaxy clustering on the sky, in terms of completeness and contamination. We estimate systematic uncertainties due to the observational uncertainties on our simulator parameters in determining the selection function using five different sets of modified catalogs. Our estimates indicate that these uncertainties are at the $\le 15$% level with current observational constraints on cluster galaxy populations and their evolution. In addition, we examine the $B_{gc}$ parameter as an optical mass indicator and measure the intrinsic scatter of the $B_{gc}$--mass relation to be approximately log normal with $\sigma_{\log_{10}M}\sim0.25$. Finally, we present tests of a red sequence overdensity redshift estimator using both simulated and real data, showing that it delivers redshifts for massive clusters with $\sim$2% accuracy out to redshifts $z\sim0.5$ with SDSS-like datasets.
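The quoted intrinsic scatter can be illustrated with a toy Monte Carlo: masses are drawn, a hypothetical power-law $B_{gc}$--mass relation with log-normal scatter is applied, and the mass scatter at fixed $B_{gc}$ is recovered by inverting the mean relation. The slope and normalization below are made-up placeholders, not fitted values from the paper.

```python
import numpy as np

# Toy sketch of log-normal intrinsic scatter in a B_gc--mass relation.
# The slope/normalization (b, a) are hypothetical placeholders.
rng = np.random.default_rng(42)
sigma_log10_m = 0.25                 # intrinsic scatter in log10(M) at fixed B_gc
a, b = -11.0, 1.0                    # assumed log10(B_gc) = a + b * log10(M)

log10_m = rng.uniform(14.0, 15.0, 50_000)          # true cluster masses (dex)
log10_bgc = a + b * log10_m + rng.normal(0.0, b * sigma_log10_m, log10_m.size)

# Invert the mean relation to estimate mass from B_gc, then measure the
# scatter of the mass estimates about the true masses.
log10_m_est = (log10_bgc - a) / b
recovered_scatter = np.std(log10_m_est - log10_m)
```

With enough mock clusters the recovered scatter converges to the input 0.25 dex, which is the sense in which the abstract quotes a log-normal $\sigma_{\log_{10}M}$.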
The Dark Energy Survey Data Management (DESDM) system will process and archive the data from the Dark Energy Survey (DES) over the five year period of operation. This paper focuses on a new adaptable processing framework developed to perform highly automated, high performance data parallel processing. The new processing framework has been used to process 45 nights of simulated DECam supernova imaging data, and was extensively used in the DES Data Challenge 4, where it was used to process thousands of square degrees of simulated DES data.
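The data-parallel pattern described here can be sketched generically: independent units of work (e.g. exposures) are fanned out to a pool of workers, each running the same pipeline of stages, and the results are gathered for ingestion. The stage functions and exposure identifiers below are invented for illustration; the real framework orchestrates far more elaborate pipelines on HPC schedulers.

```python
from concurrent.futures import ThreadPoolExecutor

# Minimal sketch of data-parallel pipeline processing. The stages and
# exposure names are illustrative stand-ins, not the actual DESDM codes.
def detrend(exposure):
    """Stand-in for image detrending (bias subtraction, flat-fielding, ...)."""
    return {"exposure": exposure, "detrended": True}

def calibrate(record):
    """Stand-in for astrometric and photometric calibration."""
    record["calibrated"] = True
    return record

def process(exposure):
    """One unit of work: the full per-exposure pipeline."""
    return calibrate(detrend(exposure))

# Fan independent exposures out to a worker pool and gather the results.
exposures = [f"night01_exp{i:04d}" for i in range(8)]
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(process, exposures))
```

Because each exposure is processed independently, throughput scales with the number of workers, which is what makes this style of pipeline a natural fit for HPC batch systems.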
Joseph J. Mohr (2008)
The Dark Energy Survey collaboration will study cosmic acceleration with a 5000 deg2 grizY survey in the southern sky over 525 nights from 2011-2016. The DES data management (DESDM) system will be used to process and archive these data and the resulting science ready data products. The DESDM system consists of an integrated archive, a processing framework, an ensemble of astronomy codes and a data access framework. We are developing the DESDM system for operation in the high performance computing (HPC) environments at NCSA and Fermilab. Operating the DESDM system in an HPC environment offers both speed and flexibility. We will employ it for our regular nightly processing needs, and for more compute-intensive tasks such as large scale image coaddition campaigns, extraction of weak lensing shear from the full survey dataset, and massive seasonal reprocessing of the DES data. Data products will be available to the Collaboration and later to the public through a virtual-observatory compatible web portal. Our approach leverages investments in publicly available HPC systems, greatly reducing hardware and maintenance costs to the project, which must deploy and maintain only the storage, database platforms and orchestration and web portal nodes that are specific to DESDM. In Fall 2007, we tested the current DESDM system on both simulated and real survey data. We used Teragrid to process 10 simulated DES nights (3TB of raw data), ingesting and calibrating approximately 250 million objects into the DES Archive database. We also used DESDM to process and calibrate over 50 nights of survey data acquired with the Mosaic2 camera. Comparison to truth tables in the case of the simulated data and internal crosschecks in the case of the real data indicate that astrometric and photometric data quality is excellent.