
SPOKES: an End-to-End Simulation Facility for Spectroscopic Cosmological Surveys

Added by Brian Nord
Publication date: 2016
Field: Physics
Language: English





The nature of dark matter, dark energy and large-scale gravity poses some of the most pressing questions in cosmology today. These fundamental questions require highly precise measurements, and a number of wide-field spectroscopic survey instruments are being designed to meet this requirement. A key component in these experiments is the development of a simulation tool to forecast science performance, define requirement flow-downs, optimize implementation, demonstrate feasibility, and prepare for exploitation. We present SPOKES (SPectrOscopic KEn Simulation), an end-to-end simulation facility for spectroscopic cosmological surveys designed to address this challenge. SPOKES is based on an integrated infrastructure, modular function organization, coherent data handling and fast data access. These key features allow reproducibility of pipeline runs, enable ease of use and provide the flexibility to update functions within the pipeline. The cyclic nature of the pipeline makes it possible to use the science output as an efficient metric for design optimization and feasibility testing. We present the architecture, first science results, and computational performance of the simulation pipeline. The framework is general, but for the benchmark tests we use the Dark Energy Spectrometer (DESpec), one of the early concepts for the upcoming Dark Energy Spectroscopic Instrument (DESI). We discuss how the SPOKES framework enables a rigorous process to optimize and exploit spectroscopic survey experiments in order to derive high-precision cosmological measurements.
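As a rough illustration of the architecture described above, in which modular functions exchange data through a shared store and can be run cyclically, here is a minimal Python sketch. The module names and the DataBank interface are assumptions for the illustration, not the actual SPOKES code:

    # Minimal sketch of a data-bank-centred, modular pipeline. Every stage
    # reads from and writes to one shared store, which is what makes runs
    # reproducible and individual modules easy to swap out or update.

    class DataBank(dict):
        """Shared store passed through every pipeline stage (illustrative)."""
        pass

    def generate_galaxy_catalog(bank):
        # Stand-in for a mock-catalog module.
        bank["galaxies"] = [{"redshift": 0.1 + 0.002 * i} for i in range(1000)]

    def select_targets(bank):
        # Stand-in for target selection against a survey cut.
        bank["targets"] = [g for g in bank["galaxies"] if g["redshift"] < 1.2]

    def measure_redshifts(bank):
        # Stand-in for throughput, exposure and redshift-fitting modules.
        bank["measured_z"] = [g["redshift"] for g in bank["targets"]]

    PIPELINE = [generate_galaxy_catalog, select_targets, measure_redshifts]

    def run(n_cycles=1):
        bank = DataBank()
        for _ in range(n_cycles):          # cyclic runs reuse the same bank
            for stage in PIPELINE:
                stage(bank)
        return bank

    if __name__ == "__main__":
        result = run()
        print(len(result["measured_z"]), "redshifts in the data bank")

In this pattern, updating or swapping a single stage only changes what that stage writes into the shared bank, which is the property that keeps the pipeline modular and its runs reproducible.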



Related research

Massive stars can explode as supernovae at the end of their life cycle, releasing neutrinos whose total energy reaches $10^{53}$ erg. Moreover, neutrinos play key roles in supernovae, heating and reviving the shock wave as well as cooling the resulting protoneutron star. Therefore, neutrino detectors are waiting to observe the next galactic supernova, and several theoretical simulations of supernova neutrinos are underway. While these simulations concentrate mainly on the first second after the supernova bounce, the only observation of a supernova with neutrinos, SN 1987A, revealed that neutrino emission lasts for more than 10 seconds. For this reason, long-time simulations and analysis tools are needed to compare theories with the next observation. We develop an integrated supernova analysis framework that prepares an analysis pipeline for treating galactic supernova observations in the near future. This framework deals with the core collapse, bounce and proto-neutron star cooling processes, as well as with neutrino detection on Earth, in a consistent manner. We have developed a new long-time supernova simulation in one dimension that explodes successfully and computes the neutrino emission for up to 20 seconds. Using this model, we estimate the resulting neutrino signal in the Super-Kamiokande detector to be about 1,800 events for an explosion at 10 kpc and discuss its implications in this paper. We compare this result with the SN 1987A observation to test its reliability.
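As a rough illustration of how the quoted yield translates to other distances, the expected event count falls with the inverse square of the distance. The 1,800-event figure at 10 kpc is taken from the abstract; the rescaling below is plain geometry, not the authors' detector simulation:

    # Inverse-square rescaling of the quoted Super-Kamiokande yield.
    REFERENCE_EVENTS = 1800.0        # events at the reference distance (from the abstract)
    REFERENCE_DISTANCE_KPC = 10.0

    def expected_events(distance_kpc):
        """Scale the reference yield by (d_ref / d)^2."""
        return REFERENCE_EVENTS * (REFERENCE_DISTANCE_KPC / distance_kpc) ** 2

    for d in (5.0, 10.0, 25.0, 50.0):
        print(f"{d:5.1f} kpc -> ~{expected_events(d):6.0f} events")
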
We present end-to-end simulations of SCALES, the third-generation thermal-infrared, diffraction-limited imager and low/medium-resolution integral field spectrograph (IFS) being designed for Keck. The 2-5 micron sensitivity of SCALES enables detection and characterization of a wide variety of exoplanets, including exoplanets detected through long-baseline astrometry, radial-velocity planets on wide orbits, accreting protoplanets in nearby star-forming regions, and reflected-light planets around the nearest stars. The goal of the simulation is to generate high-fidelity mock data to assess the scientific capabilities of the SCALES instrument at current and future design stages. The simulation processes arbitrary-resolution input intensity fields, together with a proposed observation pattern, into a full mock dataset of raw detector read-out frames from the lenslet-based IFS, complete with calibrations and metadata; these frames are then reduced by the IFS data reduction pipeline and analyzed by the user.
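The mock-data step described above, binning an input intensity field onto a lenslet grid and laying each lenslet spectrum onto a raw frame, can be sketched as follows. The array shapes, dispersion layout and noise model are illustrative assumptions, not the SCALES design:

    import numpy as np

    def mock_ifs_frame(scene_cube, n_lenslets=16, n_spec=32):
        """scene_cube: (ny, nx, n_spec) intensity cube sampled more finely
        than the lenslet grid; returns a toy raw frame."""
        ny, nx, _ = scene_cube.shape
        by, bx = ny // n_lenslets, nx // n_lenslets
        frame = np.zeros((n_lenslets, n_lenslets * n_spec))
        for j in range(n_lenslets):
            for i in range(n_lenslets):
                # integrate the scene over one lenslet aperture
                spec = scene_cube[j*by:(j+1)*by, i*bx:(i+1)*bx, :].sum(axis=(0, 1))
                # lay that lenslet's spectrum along the detector row
                frame[j, i*n_spec:(i+1)*n_spec] = spec
        # simple photon noise so the output resembles a raw read-out
        return np.random.poisson(np.clip(frame, 0, None)).astype(float)

    scene = np.random.rand(128, 128, 32)   # toy input intensity cube
    print(mock_ifs_frame(scene).shape)     # -> (16, 512)
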
Realistic synthetic observations of theoretical source models are essential for our understanding of real observational data. In using synthetic data, one can verify the extent to which source parameters can be recovered and evaluate how various data corruption effects can be calibrated. These studies are important when proposing observations of new sources, in the characterization of the capabilities of new or upgraded instruments, and when verifying model-based theoretical predictions in a comparison with observational data. We present the SYnthetic Measurement creator for long Baseline Arrays (SYMBA), a novel synthetic data generation pipeline for Very Long Baseline Interferometry (VLBI) observations. SYMBA takes into account several realistic atmospheric, instrumental, and calibration effects. We used SYMBA to create synthetic observations for the Event Horizon Telescope (EHT), a mm VLBI array, which has recently captured the first image of a black hole shadow. After testing SYMBA with simple source and corruption models, we study the importance of including all corruption and calibration effects. Based on two example general relativistic magnetohydrodynamics (GRMHD) model images of M87, we performed case studies to assess the attainable image quality with the current and future EHT array for different weather conditions. The results show that the effects of atmospheric and instrumental corruptions on the measured visibilities are significant. Despite these effects, we demonstrate how the overall structure of the input models can be recovered robustly after performing calibration steps. With the planned addition of new stations to the EHT array, images could be reconstructed with higher angular resolution and dynamic range. In our case study, these improvements allowed for a distinction between a thermal and a non-thermal GRMHD model based on salient features in reconstructed images.
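A core ingredient of this kind of synthetic pipeline is corrupting the ideal visibilities before any calibration is attempted. The toy step below applies station-based complex gains and thermal noise; the gain and noise values are made-up assumptions for illustration, not SYMBA's atmospheric or instrumental modules:

    import numpy as np

    rng = np.random.default_rng(42)

    def corrupt_visibility(v_true, gain_i, gain_j, sigma):
        """Apply station gains g_i g_j* and additive complex Gaussian noise."""
        noise = sigma * (rng.normal() + 1j * rng.normal()) / np.sqrt(2)
        return gain_i * np.conj(gain_j) * v_true + noise

    # toy example: one baseline, unit-amplitude true visibility
    g1 = 1.05 * np.exp(1j * 0.3)    # amplitude and phase error at station 1
    g2 = 0.95 * np.exp(-1j * 0.1)   # amplitude and phase error at station 2
    v_obs = corrupt_visibility(1.0 + 0.0j, g1, g2, sigma=0.05)
    print(abs(v_obs), np.angle(v_obs))
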
For nearly a century, imaging and spectroscopic surveys of galaxies have given us information about the contents of the universe. We attempt to define the logical endpoint of such surveys by defining not the next galaxy survey, but the final galaxy survey at NIR wavelengths; this would be the galaxy survey that exhausts the information content useful for addressing extant questions. Such a survey would require incredible advances in a number of technologies, and the survey details will depend on the as-yet poorly constrained properties of the earliest galaxies. Using an exposure time calculator, we define nominal surveys for extracting the useful information for three science cases: dark energy cosmology, galaxy evolution, and supernovae. We define scaling relations that trade off sky background, telescope aperture, and focal plane size to allow for a survey of a given depth over a given area. For optimistic assumptions, a 280 m telescope with a marginally resolved focal plane of 20 deg$^2$ operating at L2 could potentially exhaust the cosmological information content of galaxies in a 10-year survey. For galaxy evolution (making use of gravitational lensing to magnify the earliest galaxies) and supernovae, the same telescope would suffice. We discuss the technological advances needed to complete the last galaxy survey. While the final galaxy survey remains well outside of our technical reach today, we present scaling relations that show how we can progress toward the goal of exhausting the information content encoded in the shapes, positions, and colors of galaxies.
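The flavour of such scaling relations can be shown with a generic background-limited case, in which the per-pointing time to a fixed depth grows with the sky background and shrinks with the collecting area, and the total time multiplies in the tiling factor. The exponents and normalisation below are generic assumptions for illustration, not the relations derived in the paper:

    # Per-pointing exposure time to fixed depth, background-limited with a
    # fixed PSF: t ~ B / D^2. Total survey time multiplies in the number of
    # pointings, area / FOV. All numbers below are placeholders.

    def total_survey_time(area_deg2, fov_deg2, sky_background, aperture_m,
                          reference_time_hr=1.0):
        t_per_pointing = reference_time_hr * sky_background / aperture_m**2
        n_pointings = area_deg2 / fov_deg2
        return n_pointings * t_per_pointing

    # same depth and area with two hypothetical telescope/FOV choices
    print(total_survey_time(20000, 20, sky_background=1.0, aperture_m=280))
    print(total_survey_time(20000, 5, sky_background=1.0, aperture_m=30))
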
Large multi-object spectroscopic surveys require automated algorithms to optimise their observing strategy. One of the most ambitious upcoming spectroscopic surveys is the 4MOST survey. The 4MOST survey facility is a fibre-fed spectroscopic instrument on the VISTA telescope with a field of view large enough to survey a large fraction of the southern sky within a few years. Several Galactic and extragalactic surveys will be carried out simultaneously, so the combined target density will vary strongly. In this paper, we describe a new tiling algorithm that can naturally deal with the large target-density variations on the sky and that automatically handles the different exposure times of targets. The tiling pattern is modelled as a marked point process, which is characterised by a probability density that integrates the requirements imposed by the 4MOST survey. The optimal tiling pattern with respect to the defined model is estimated by the tile configuration that maximises the proposed probability density. To achieve this maximisation, a simulated annealing algorithm is implemented. The algorithm automatically finds an optimal tiling pattern and assigns a tentative sky-brightness condition and exposure time to each tile, while minimising the total execution time needed to observe the list of targets in the combined input catalogue of all surveys. Hence, the algorithm maximises the long-term observing efficiency and provides an optimal tiling solution for the survey. While designed for the 4MOST survey, the algorithm is flexible and can, with simple modifications, be applied to any other multi-object spectroscopic survey.
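For flavour, here is a minimal simulated-annealing loop in the spirit of the optimisation described above: tiles are points on a toy sky, and the survey's probability density is replaced by a simple objective that rewards target coverage and penalises tile count. The objective, proposal moves and cooling schedule are illustrative assumptions, not the 4MOST algorithm:

    import math
    import random

    random.seed(1)
    targets = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(300)]
    TILE_RADIUS = 10.0

    def covered(tiles):
        # number of targets falling inside at least one tile
        return sum(any(math.dist(t, c) < TILE_RADIUS for c in tiles) for t in targets)

    def objective(tiles):
        return covered(tiles) - 5.0 * len(tiles)   # coverage minus tile cost

    def propose(tiles):
        new = list(tiles)
        move = random.random()
        if move < 0.4 and new:                     # shift an existing tile
            i = random.randrange(len(new))
            x, y = new[i]
            new[i] = (x + random.gauss(0, 3), y + random.gauss(0, 3))
        elif move < 0.7:                           # add a tile
            new.append((random.uniform(0, 100), random.uniform(0, 100)))
        elif new:                                  # drop a tile
            new.pop(random.randrange(len(new)))
        return new

    tiles, score, temperature = [], objective([]), 5.0
    for step in range(5000):
        candidate = propose(tiles)
        cand_score = objective(candidate)
        # accept improvements always, worse moves with a temperature-dependent probability
        if cand_score > score or random.random() < math.exp((cand_score - score) / temperature):
            tiles, score = candidate, cand_score
        temperature *= 0.999                       # geometric cooling

    print(len(tiles), "tiles covering", covered(tiles), "of", len(targets), "targets")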
