131 - E. Rozo, E. S. Rykoff, A. Abate (2015)
We introduce redMaGiC, an automated algorithm for selecting Luminous Red Galaxies (LRGs). The algorithm was specifically developed to minimize photometric redshift uncertainties in photometric large-scale structure studies. redMaGiC achieves this by self-training the color cuts necessary to produce a luminosity-thresholded LRG sample of constant comoving density. We demonstrate that redMaGiC photo-zs are very nearly as accurate as the best machine-learning-based methods, yet they require minimal spectroscopic training, do not suffer from extrapolation biases, and are very nearly Gaussian. We apply our algorithm to Dark Energy Survey (DES) Science Verification (SV) data to produce a redMaGiC catalog sampling the redshift range $z \in [0.2, 0.8]$. Our fiducial sample has a comoving space density of $10^{-3}\,(h^{-1}\,\mathrm{Mpc})^{-3}$, and a median photo-z bias ($z_{\rm spec} - z_{\rm photo}$) and scatter ($\sigma_z/(1+z)$) of 0.005 and 0.017 respectively. The corresponding $5\sigma$ outlier fraction is 1.4%. We also test our algorithm with Sloan Digital Sky Survey (SDSS) Data Release 8 (DR8) and Stripe 82 data, and discuss how spectroscopic training can be used to control photo-z biases at the 0.1% level.
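The quality figures quoted in the abstract (median photo-z bias, normalized scatter $\sigma_z/(1+z)$, and $5\sigma$ outlier fraction) can be computed from any matched set of spectroscopic and photometric redshifts. A minimal sketch in Python, using invented array names, simulated data, and a MAD-based robust scatter estimate (the paper's exact estimator may differ):

```python
import numpy as np

def photoz_metrics(z_spec, z_photo):
    """Median bias, robust normalized scatter, and 5-sigma outlier fraction.

    Hypothetical helper; the metric definitions follow the abstract above,
    with a MAD-based robust estimate of the scatter.
    """
    dz = (z_spec - z_photo) / (1.0 + z_spec)          # normalized residuals
    bias = np.median(z_spec - z_photo)                # median photo-z bias
    # robust scatter: 1.4826 * MAD approximates the Gaussian sigma
    sigma = 1.4826 * np.median(np.abs(dz - np.median(dz)))
    f_out = np.mean(np.abs(dz) > 5.0 * sigma)         # 5-sigma outlier fraction
    return bias, sigma, f_out

# toy usage with simulated redshifts; the 0.017 scatter mirrors the
# fiducial sample's quoted value
rng = np.random.default_rng(0)
z_spec = rng.uniform(0.2, 0.8, 10_000)
z_photo = z_spec + 0.017 * (1 + z_spec) * rng.standard_normal(10_000)
bias, sigma, f_out = photoz_metrics(z_spec, z_photo)
```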
FAUST$^2$ is a software tool that generates formal abstractions of (possibly non-deterministic) discrete-time Markov processes (dtMP) defined over uncountable (continuous) state spaces. A dtMP model is specified in MATLAB and abstracted as a finite-state Markov chain or Markov decision process. The abstraction procedure runs in MATLAB and employs parallel computations and fast manipulations based on vector calculus. The abstract model is formally related to the concrete dtMP via a user-defined maximum threshold on the approximation error introduced by the abstraction procedure. FAUST$^2$ allows exporting the abstract model to well-known probabilistic model checkers, such as PRISM or MRMC. Alternatively, it can internally handle the computation of PCTL properties (e.g. safety or reach-avoid) over the abstract model, and refine the outcomes over the concrete dtMP via a quantified error that depends on the abstraction procedure and the given formula. The toolbox is available at http://sourceforge.net/projects/faust2/
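To illustrate the kind of abstraction described above, the following Python sketch grids a toy one-dimensional dtMP with a Gaussian transition kernel into a finite-state Markov chain. The kernel, state space, and bin count are invented for the example, and the formal approximation-error bound that FAUST$^2$ computes (and its MATLAB implementation) are omitted:

```python
import math

def normal_cdf(x, mu, sigma):
    """CDF of N(mu, sigma^2), via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def abstract_dtmp(n_bins=50, drift=0.9, noise=0.05):
    """Grid abstraction of a toy 1-D dtMP on [0, 1] with kernel
    x' ~ N(drift * x, noise^2).

    Each continuous state is represented by its bin's center; the
    finite-state transition matrix collects the kernel mass falling
    in each destination bin (illustrative sketch only).
    """
    edges = [i / n_bins for i in range(n_bins + 1)]
    centers = [(edges[i] + edges[i + 1]) / 2.0 for i in range(n_bins)]
    P = []
    for c in centers:
        mu = drift * c  # mean of the transition kernel from this bin's center
        row = [normal_cdf(edges[j + 1], mu, noise) - normal_cdf(edges[j], mu, noise)
               for j in range(n_bins)]
        mass = sum(row)                     # probability mass kept inside [0, 1]
        P.append([p / mass for p in row])   # renormalize the truncated kernel
    return P

P = abstract_dtmp()
```

The resulting row-stochastic matrix `P` could then be exported to a probabilistic model checker; refining the truncation and bin width is what drives the abstraction error down in tools of this kind.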
99 - J. Newman, A. Abate, F. Abdalla (2013)
Large sets of objects with spectroscopic redshift measurements will be needed for imaging dark energy experiments to achieve their full potential, serving two goals: _training_, i.e., the use of objects with known redshift to develop and optimize photometric redshift algorithms; and _calibration_, i.e., the characterization of moments of redshift (or photo-z error) distributions. Better training makes cosmological constraints from a given experiment stronger, while highly accurate calibration is needed for photo-z systematics not to dominate errors. In this white paper, we investigate the required scope of spectroscopic datasets which can serve both these purposes for ongoing and next-generation dark energy experiments, as well as the time required to obtain such data with instruments available in the next decade. Large time allocations on kilo-object spectrographs will be necessary, ideally augmented by infrared spectroscopy from space. Alternatively, precision calibrations could be obtained by measuring cross-correlation statistics using samples of bright objects from a large baryon acoustic oscillation experiment such as DESI. We also summarize the additional work on photometric redshift methods needed to prepare for ongoing and future dark energy experiments.
