
DESCQA: An Automated Validation Framework for Synthetic Sky Catalogs

Submitted by: Yao-Yuan Mao
Publication date: 2017
Research field: Physics
Paper language: English





The use of high-quality simulated sky catalogs is essential for the success of cosmological surveys. The catalogs have diverse applications, such as investigating signatures of fundamental physics in cosmological observables, understanding the effect of systematic uncertainties on measured signals and testing mitigation strategies for reducing these uncertainties, aiding analysis pipeline development and testing, and optimizing survey strategies. The list of applications is growing with improvements in the quality of the catalogs and the details that they can provide. Given the importance of simulated catalogs, it is critical to provide rigorous validation protocols that enable both catalog providers and users to assess the quality of the catalogs in a straightforward and comprehensive way. For this purpose, we have developed the DESCQA framework for the Large Synoptic Survey Telescope Dark Energy Science Collaboration as well as for the broader community. The goal of DESCQA is to enable the inspection, validation, and comparison of an inhomogeneous set of synthetic catalogs via the provision of a common interface within an automated framework. In this paper, we present the design concept and first implementation of DESCQA. In order to establish and demonstrate its full functionality we use a set of interim catalogs and validation tests. We highlight several important aspects, both technical and scientific, that require thoughtful consideration when designing a validation framework, including validation metrics and how these metrics impose requirements on the synthetic sky catalogs.
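To make the "common interface within an automated framework" concrete, here is a minimal Python sketch of how a DESCQA-style system could expose validation tests to an inhomogeneous set of catalogs. The class and method names (ValidationTest, run_on_single_catalog, TestResult), the catalog column name, and the pass/fail criterion are illustrative assumptions, not the actual DESCQA API.

    from abc import ABC, abstractmethod
    from dataclasses import dataclass


    @dataclass
    class TestResult:
        passed: bool    # did the catalog meet the validation criterion?
        score: float    # quantitative summary statistic
        summary: str    # human-readable note for the report


    class ValidationTest(ABC):
        """Every test implements the same entry point, so the framework can run
        an inhomogeneous set of catalogs through one identical pipeline."""

        @abstractmethod
        def run_on_single_catalog(self, catalog) -> TestResult:
            ...


    class MassiveGalaxyCountTest(ValidationTest):
        """Toy criterion: require a minimum number of galaxies above a mass cut."""

        def __init__(self, min_count=100, mass_cut=1e10):
            self.min_count = min_count
            self.mass_cut = mass_cut

        def run_on_single_catalog(self, catalog) -> TestResult:
            masses = catalog["stellar_mass"]        # assumed column name
            n_massive = sum(1 for m in masses if m > self.mass_cut)
            return TestResult(passed=n_massive >= self.min_count,
                              score=float(n_massive),
                              summary=f"{n_massive} galaxies above {self.mass_cut:g} Msun")


    def run_all(catalogs, tests):
        """Driver loop: each (catalog, test) pair yields one cell of the
        comparison matrix that a web interface could display."""
        for cat_name, catalog in catalogs.items():
            for test in tests:
                result = test.run_on_single_catalog(catalog)
                status = "PASS" if result.passed else "FAIL"
                print(f"{cat_name:20s} {type(test).__name__:25s} {status}  {result.summary}")


    if __name__ == "__main__":
        # A tiny fake catalog standing in for a real synthetic sky catalog.
        fake_catalog = {"stellar_mass": [10 ** (9.5 + 0.01 * i) for i in range(400)]}
        run_all({"interim_catalog_v1": fake_catalog}, [MassiveGalaxyCountTest()])

The value of such a design is that adding a new catalog or a new test only requires conforming to the single entry point; the driver then fills in one cell of the catalog-versus-test comparison matrix per run.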


Read also

We present a suite of 18 synthetic sky catalogs designed to support science analysis of galaxies in the Dark Energy Survey Year 1 (DES Y1) data. For each catalog, we use a computationally efficient empirical approach, ADDGALS, to embed galaxies within light-cone outputs of three dark matter simulations that resolve halos with masses above ~5x10^12 h^-1 M_sun at z <= 0.32 and 10^13 h^-1 M_sun at z ~ 2. The embedding method is tuned to match the observed evolution of galaxy counts at different luminosities as well as the spatial clustering of the galaxy population. Galaxies are lensed by matter along the line of sight (including magnification, shear, and multiple images) using CALCLENS, an algorithm that calculates shear with 0.42 arcmin resolution at galaxy positions in the full catalog. The catalogs presented here, each with the same LCDM cosmology (denoted Buzzard), contain on average 820 million galaxies over an area of 1120 square degrees with positions, magnitudes, shapes, photometric errors, and photometric redshift estimates. We show that the weak-lensing shear catalog, redMaGiC galaxy catalogs, and redMaPPer cluster catalogs provide plausible realizations of the same catalogs in the DES Y1 data by comparing their magnitude, color, and redshift distributions, angular clustering, and mass-observable relations, making them useful for testing analyses that use these samples. We make public the galaxy samples appropriate for the DES Y1 data, as well as the data vectors used for cosmology analyses on these simulations.
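As a rough illustration of the kind of consistency check described above (comparing a mock's magnitude, color, and redshift distributions against the corresponding data sample), here is a small Python sketch that scores each distribution with a two-sample Kolmogorov-Smirnov test. The column names, the synthetic stand-in data, and the p-value threshold are assumptions for illustration only, not the validation procedure actually used for the Buzzard catalogs.

    import numpy as np
    from scipy.stats import ks_2samp


    def compare_distributions(mock, data, columns, alpha=0.01):
        """Per-column verdict on whether the mock is a plausible realization
        of the data, based on a two-sample KS test."""
        report = {}
        for col in columns:
            stat, p_value = ks_2samp(mock[col], data[col])
            report[col] = {"ks_stat": stat, "p_value": p_value,
                           "plausible": p_value > alpha}
        return report


    if __name__ == "__main__":
        rng = np.random.default_rng(42)
        # Fake stand-ins for a Buzzard-like mock and a DES Y1-like sample.
        mock = {"i_mag": rng.normal(21.5, 1.0, 50_000),
                "z_photo": rng.gamma(2.0, 0.3, 50_000)}
        data = {"i_mag": rng.normal(21.5, 1.0, 50_000),
                "z_photo": rng.gamma(2.0, 0.3, 50_000)}
        for col, res in compare_distributions(mock, data, ["i_mag", "z_photo"]).items():
            print(f"{col:8s}  KS={res['ks_stat']:.4f}  p={res['p_value']:.3f}  "
                  f"plausible={res['plausible']}")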
We describe the development of automated emission line detection software for the Fiber Multi-Object Spectrograph (FMOS), which is a near-infrared spectrograph fed by 400 fibers from the 0.2 deg^2 prime focus field of view of the Subaru Telescope. The software, FIELD (FMOS software for Image-based Emission Line Detection), is developed and tested mainly for the FastSound survey, which is targeting Hα emitting galaxies at z ~ 1.3 to measure the redshift-space distortion as a test of general relativity beyond z ~ 1. The basic algorithm is to calculate the line signal-to-noise ratio (S/N) along the wavelength direction, given by a 2-D convolution of the spectral image and a detection kernel representing a typical emission line profile. A unique feature of FMOS is its use of OH airglow suppression masks, requiring the use of flat-field images to suppress noise around the mask regions. Bad pixels on the detectors and pixels affected by cosmic rays are efficiently removed by using the information obtained from the FMOS analysis pipeline. We limit the range of acceptable line-shape parameters for the detected candidates to further improve the reliability of line detection. The final performance of line detection is tested using a subset of the FastSound data; the false detection rate of spurious objects is examined by using inverted frames obtained by exchanging object and sky frames. The false detection rate is < 1% at S/N > 5, allowing an efficient and objective emission line search for FMOS data at line flux levels ≳ 1.0x10^-16 erg cm^-2 s^-1.
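The core of the detection algorithm described above (a line S/N map obtained by 2-D convolution of the spectral image with a kernel shaped like a typical emission line) can be sketched in a few lines of Python. The Gaussian kernel widths and the simple noise-propagation model below are illustrative assumptions, not the actual FIELD implementation.

    import numpy as np
    from scipy.signal import fftconvolve


    def line_snr_map(image, noise_var, sigma_wave=2.0, sigma_spatial=1.5):
        """image     : 2-D spectral image (rows = spatial/fibre, cols = wavelength)
           noise_var : per-pixel noise variance, same shape as image
           returns   : matched-filter line S/N at every pixel."""
        # Small normalized 2-D Gaussian detection kernel.
        y = np.arange(-4, 5)[:, None]
        x = np.arange(-8, 9)[None, :]
        kernel = np.exp(-0.5 * (y / sigma_spatial) ** 2 - 0.5 * (x / sigma_wave) ** 2)
        kernel /= kernel.sum()

        # Signal: image convolved with the kernel.
        signal = fftconvolve(image, kernel, mode="same")
        # Noise of the convolved signal: variances add with squared kernel weights.
        var = fftconvolve(noise_var, kernel ** 2, mode="same")
        return signal / np.sqrt(np.clip(var, 1e-12, None))


    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        img = rng.normal(0.0, 1.0, (64, 512))      # pure-noise spectral image
        img[32, 250:256] += 4.0                    # inject a fake emission line
        snr = line_snr_map(img, np.ones_like(img))
        print("peak S/N:", snr.max(), "at", np.unravel_index(snr.argmax(), snr.shape))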
We present a series of new, publicly available mock catalogs of X-ray selected active galactic nuclei (AGNs), non-active galaxies, and clusters of galaxies. They are based on up-to-date observational results on the demographics of extragalactic X-ray sources and their extrapolations. These mocks reach fluxes below 10^-20 erg s^-1 cm^-2 in the 0.5-2 keV band, i.e., more than an order of magnitude below the predicted limits of future deep fields, and therefore represent an important tool for simulating extragalactic X-ray surveys with both current and future telescopes. We use our mocks to perform a set of end-to-end simulations of X-ray surveys with the forthcoming Athena mission and with the AXIS probe, a sub-arcsecond resolution X-ray mission concept proposed to the Astro 2020 Decadal Survey. We find that these proposed, next-generation surveys may transform our knowledge of the deep X-ray Universe. As an example, in a total observing time of 15 Ms, AXIS would detect ~225,000 AGNs and ~50,000 non-active galaxies, reaching a flux limit f_{0.5-2} ~ 5x10^-19 erg s^-1 cm^-2 in the 0.5-2 keV band, an improvement of over an order of magnitude with respect to surveys with current X-ray facilities. Consequently, 90% of these sources would be detected for the first time in X-rays. Furthermore, we show that deep and wide X-ray surveys with instruments like AXIS and Athena are expected to detect ~20,000 AGNs at z > 3 and ~250 sources at z > 6, thus opening a new window on the evolution of AGNs over cosmic time and putting strong constraints on the predictions of theoretical models of black hole seed accretion in the early universe.
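As a toy illustration of how mock source fluxes can be drawn from extrapolated source counts down to a faint limit, the Python sketch below samples a power-law cumulative distribution N(>S) proportional to S^-beta by inverse-transform sampling. The slope, normalization, and flux range are made-up numbers, not the values underlying the mocks described above.

    import numpy as np


    def sample_fluxes(n_sources, s_min=1e-20, s_max=1e-12, beta=1.5, rng=None):
        """Inverse-transform sampling of a power-law flux distribution with
        cumulative counts N(>S) ~ S^-beta between s_min and s_max (cgs units)."""
        if rng is None:
            rng = np.random.default_rng()
        u = rng.random(n_sources)
        a = s_min ** -beta
        b = s_max ** -beta
        return (a - u * (a - b)) ** (-1.0 / beta)


    if __name__ == "__main__":
        fluxes = sample_fluxes(100_000, rng=np.random.default_rng(1))
        for lim in (1e-19, 1e-17, 1e-15):
            frac = np.mean(fluxes > lim)
            print(f"fraction of sources brighter than {lim:.0e} erg/s/cm^2: {frac:.3f}")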
The all-sky 408 MHz map of Haslam et al. is one of the most important total-power radio surveys. It has been widely used to study diffuse synchrotron radiation from our Galaxy and as a template to remove foregrounds in cosmic microwave background data. However, there are a number of issues associated with it that must be dealt with, including large-scale striations and contamination from extragalactic radio sources. We have re-evaluated and re-processed the rawest data available to produce a new and improved 408 MHz all-sky map. We first quantify the positional accuracy (≈7 arcmin) and effective beam (56.0 ± 1.0 arcmin) of the four individual surveys from which it was assembled. Large-scale striations associated with 1/f noise in the scan direction are reduced to a level ≪ 1 K using a Fourier-based filtering technique. The most important improvement results from the removal of extragalactic sources. We have used an iterative combination of two techniques, two-dimensional Gaussian fitting and minimum curvature spline surface inpainting, to remove the brightest sources (≳ 2 Jy), which provides a significant improvement over previous versions of the map.
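The Fourier-based destriping idea mentioned above can be sketched as follows: modes with low spatial frequency along the scan direction but appreciable frequency across it carry the 1/f striations, while genuine large-scale sky emission is smooth in both directions and is left untouched. The cutoff frequencies, the hard (unapodized) filter, and the simulated row drifts are illustrative assumptions, not the processing applied to the 408 MHz data.

    import numpy as np


    def destripe(map2d, k_cut_along=0.02, k_keep_across=0.01):
        """Zero Fourier modes that look like scan stripes: slowly varying along
        the scan (rows) but changing from row to row, i.e. low along-scan
        frequency combined with non-negligible across-scan frequency."""
        f = np.fft.fft2(map2d)
        k_across = np.abs(np.fft.fftfreq(map2d.shape[0]))[:, None]  # cycles/pixel across scans
        k_along = np.abs(np.fft.fftfreq(map2d.shape[1]))[None, :]   # cycles/pixel along the scan
        stripe_modes = (k_along < k_cut_along) & (k_across > k_keep_across)
        f[stripe_modes] = 0.0
        return np.fft.ifft2(f).real


    if __name__ == "__main__":
        rng = np.random.default_rng(3)
        ny, nx = 256, 256
        # Smooth "sky" plus a random-walk drift per row standing in for 1/f noise.
        sky = 10.0 * np.tile(np.sin(np.linspace(0.0, 2.0 * np.pi, nx)), (ny, 1))
        drifts = np.cumsum(rng.normal(0.0, 0.1, (ny, nx)), axis=1)
        drifts -= drifts.mean(axis=1, keepdims=True)
        cleaned = destripe(sky + drifts)
        print(f"stripe residual RMS: before = {np.std(drifts):.3f}, "
              f"after = {np.std(cleaned - sky):.3f}")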
T. A. Perera (2013)
A new technique for reliably identifying point sources in millimeter/sub-millimeter wavelength maps is presented. This method accounts for the frequency dependence of noise in the Fourier domain as well as non-uniformities in the coverage of a field. This optimal filter is an improvement over commonly-used matched filters that ignore coverage gradients. Treating noise variations in the Fourier domain as well as map space is traditionally viewed as a computationally intensive problem. We show that the penalty incurred in terms of computing time is quite small due to casting many of the calculations in terms of FFTs and exploiting the absence of sharp features in the noise spectra of observations. Practical aspects of implementing the optimal filter are presented in the context of data from the AzTEC bolometer camera. The advantages of using the new filter over the standard matched filter are also addressed in terms of a typical AzTEC map.
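A sketch of a coverage-aware, noise-weighted matched filter of the general kind discussed above: the map is weighted by its coverage, whitened by a smoothed estimate of the noise power spectrum in the Fourier domain, and matched to the beam shape. The Gaussian beam model, the periodogram smoothing, and the normalization are simplified assumptions, not the AzTEC pipeline implementation.

    import numpy as np
    from scipy.ndimage import uniform_filter


    def gaussian_beam(shape, fwhm_pix):
        """Unit-peak Gaussian beam kernel centred on the map grid."""
        ny, nx = shape
        y = np.arange(ny) - ny // 2
        x = np.arange(nx) - nx // 2
        sigma = fwhm_pix / 2.355
        return np.exp(-0.5 * (y[:, None] ** 2 + x[None, :] ** 2) / sigma ** 2)


    def matched_filter(map2d, coverage, fwhm_pix=3.0):
        """Return a map of point-source amplitude estimates; peaks mark candidates."""
        weighted = map2d * coverage                     # down-weight poorly covered pixels
        f_map = np.fft.fft2(weighted)
        f_beam = np.fft.fft2(np.fft.ifftshift(gaussian_beam(map2d.shape, fwhm_pix)))

        # Smoothed periodogram as a frequency-dependent noise power estimate.
        p_noise = uniform_filter(np.abs(f_map) ** 2, size=7, mode="wrap")
        p_noise = np.maximum(p_noise, p_noise.max() * 1e-8)

        f_filt = np.conj(f_beam) / p_noise              # whiten, then match the beam
        norm = np.sum(np.abs(f_beam) ** 2 / p_noise)
        return np.fft.ifft2(f_filt * f_map).real * map2d.size / norm


    if __name__ == "__main__":
        rng = np.random.default_rng(7)
        ny, nx = 128, 128
        # Non-uniform coverage: deeper in the centre, shallower at the edges.
        coverage = np.tile(1.0 - 0.5 * np.abs(np.linspace(-1.0, 1.0, nx)), (ny, 1))
        noise = rng.normal(0.0, 1.0, (ny, nx)) / np.sqrt(coverage)
        source = 5.0 * gaussian_beam((ny, nx), 3.0)     # one source at the map centre
        filtered = matched_filter(noise + source, coverage)
        print("filtered-map peak at", np.unravel_index(filtered.argmax(), filtered.shape),
              "; injected source centred at", (ny // 2, nx // 2))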