
A Big Sky Approach to Cadence Diplomacy

Posted by Knut Olsen
Publication date: 2018
Research field: Physics
Paper language: English

The LSST survey was designed to deliver transformative results for four primary objectives: constraining dark energy and dark matter, taking an inventory of the Solar System, exploring the transient optical sky, and mapping the Milky Way. While the LSST Wide-Fast-Deep (WFD) survey and accompanying Deep Drilling and mini-surveys will be ground-breaking for each of these areas, there remain competing demands on the survey area, depth, and temporal coverage amid a desire to maximize all three. In this white paper, we seek to address a principal source of tension between the different LSST science collaborations: the survey area and depth that each needs in the parts of the sky it cares about. We present simple tools which can be used to explore trade-offs between the area surveyed by LSST and the number of visits available per field, and then use these tools to propose a change to the baseline survey strategy. Specifically, we propose to reconfigure the WFD footprint to consist of low-extinction regions (limited by Galactic latitude), with the number of visits per field in WFD capped at the LSST Science Requirements Document (SRD) design goal, and suggest assigning the remaining LSST visits to the full visible LSST sky. This proposal addresses concerns with the WFD footprint raised by the Dark Energy Science Collaboration (DESC), as 25 percent of the current baseline WFD region is unusable for dark energy science due to Milky Way dust extinction; eases the time required for the North Ecliptic Spur (NES) and South Celestial Pole (SCP) mini-surveys, since in our proposal they would partially fall into the modified WFD footprint; raises the number of visits previously assigned to the Galactic Plane (GP) region; and increases the overlap with DESI and other Northern-hemisphere follow-up facilities. This proposal alleviates many of the current concerns of the Science Collaborations that represent the four scientific pillars of LSST and provides a Big Sky approach to cadence diplomacy.
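The trade these tools explore is, at bottom, simple bookkeeping: a fixed ten-year visit budget divided between footprint area and visits per field. The sketch below illustrates the arithmetic; the visit budget, field of view, and footprint area are rough LSST-like assumptions for illustration, not numbers taken from the white paper.

    # Illustrative area-vs-depth bookkeeping; all constants are rough
    # LSST-like assumptions, not values from the white paper.
    TOTAL_VISITS = 2_600_000  # assumed 10-year visit budget
    FOV_DEG2 = 9.6            # assumed camera field of view (deg^2)
    SRD_DESIGN_VISITS = 825   # assumed SRD design goal per field

    def visits_per_field(footprint_deg2: float) -> float:
        """Mean visits per field if the whole budget is spread
        uniformly over a footprint of the given area."""
        n_fields = footprint_deg2 / FOV_DEG2
        return TOTAL_VISITS / n_fields

    wfd_area = 18_000  # deg^2, assumed low-extinction WFD footprint
    v = visits_per_field(wfd_area)
    print(f"{v:.0f} visits/field vs SRD design goal {SRD_DESIGN_VISITS}")

    # Visits beyond the design goal could be reassigned to the rest
    # of the visible sky, as the proposal suggests.
    surplus = max(0.0, 1.0 - SRD_DESIGN_VISITS / v)
    print(f"surplus fraction: {surplus:.0%}")

Under these assumptions a low-extinction WFD footprint overshoots the design goal by a large margin, which is the surplus the proposal would spread over the full visible sky.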

Read also

Technology has advanced to the point that it is possible to image the entire sky every night and process the data in real time. The sky is hardly static: many interesting phenomena occur, including variable stationary objects such as stars or QSOs, transient stationary objects such as supernovae or M dwarf flares, and moving objects such as asteroids and the stars themselves. Funded by NASA, we have designed and built a sky survey system for the purpose of finding dangerous near-Earth asteroids (NEAs). This system, the Asteroid Terrestrial-impact Last Alert System (ATLAS), has been optimized to produce the best survey capability per unit cost, and therefore is an efficient and competitive system for finding potentially hazardous asteroids (PHAs) but also for tracking variables and finding transients. While carrying out its NASA mission, ATLAS now discovers more bright ($m < 19$) supernova candidates than any ground-based survey, frequently detecting very young explosions due to its 2 day cadence. ATLAS discovered the afterglow of a gamma-ray burst independently of the high-energy trigger and has released a variable star catalogue of $5\times10^{6}$ sources. This, the first of a series of articles describing ATLAS, is devoted to the design and performance of the ATLAS system. Subsequent articles will describe in more detail the software, the survey strategy, ATLAS-derived NEA population statistics, transient detections, and the first data release of variable stars and transient lightcurves.
The increasing volumes of astronomical data require practical methods for data exploration, access and visualisation. The Hierarchical Progressive Survey (HiPS) is a HEALPix based scheme that enables a multi-resolution approach to astronomy data from the individual pixels up to the whole sky. We highlight the decisions and approaches that have been taken to make this scheme a practical solution for managing large volumes of heterogeneous data. Early implementors of this system have formed a network of HiPS nodes, with some 250 diverse data sets currently available, with multiple mirror implementations for important data sets. This hierarchical approach can be adapted to expose Big Data in different ways. We describe how the ease of implementation, and local customisation, of the Aladin Lite embeddable HiPS visualiser have been key to promoting collaboration on HiPS.
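The multi-resolution behaviour comes from HEALPix's NESTED indexing, in which every pixel at one order splits into exactly four pixels at the next, so a tile's parent is a single integer division. A minimal sketch using the healpy library (the coordinates are arbitrary examples, not anything from the paper):

    # HEALPix NESTED hierarchy underlying HiPS tiles: a pixel at
    # order k contains exactly 4 pixels at order k+1.
    import healpy as hp

    ra, dec = 266.4, -28.9  # example coordinates in degrees

    for order in range(3, 7):
        nside = 2 ** order
        ipix = hp.ang2pix(nside, ra, dec, nest=True, lonlat=True)
        parent = ipix // 4  # index of the containing pixel one order up
        print(f"order {order}: pixel {ipix}, parent {parent}")

This exact nesting is what lets a HiPS client start from coarse all-sky tiles and fetch progressively finer tiles only for the region in view.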
New mass-produced, wide-field, small-aperture telescopes have the potential to revolutionize ground-based astronomy by greatly reducing the cost of collecting area. In this paper, we introduce a new class of large telescope based on these advances: an all-sky, arcsecond-resolution, 1000-telescope array which builds a simultaneously high-cadence and deep survey by observing the entire sky all night. As a concrete example, we describe the Argus Array, a 5m-class telescope with an all-sky field of view and the ability to reach extremely high cadences using low-noise CMOS detectors. Each 55 GPix Argus exposure covers 20% of the entire sky to g=19.6 each minute and g=21.9 each hour; a high-speed mode will allow sub-second survey cadences for short times. Deep coadds will reach g=23.6 every five nights over 47% of the sky; a larger-aperture array telescope, with an etendue close to the Rubin Observatory, could reach g=24.3 in five nights. These arrays can build two-color, million-epoch movies of the sky, enabling sensitive and rapid searches for high-speed transients, fast-radio-burst counterparts, gravitational-wave counterparts, exoplanet microlensing events, occultations by distant solar system bodies, and myriad other phenomena. An array of O(1,000) telescopes, however, would be one of the most complex astronomical instruments yet built. Standard arrays with hundreds of tracking mounts entail thousands of moving parts and exposed optics, and maintenance costs would rapidly outpace the mass-produced-hardware cost savings compared to a monolithic large telescope. We discuss how to greatly reduce operations costs by placing all optics in a thermally controlled, sealed dome with a single moving part. Coupled with careful software scope control and use of existing pipelines, we show that the Argus Array could become the deepest and fastest Northern sky survey, with total costs below $20M.
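The quoted depths are consistent with the standard sky-limited coadd scaling, in which signal-to-noise grows as the square root of the number of exposures. A quick check, treating the one-minute depth as the single-epoch depth and the exposure counts as my own assumptions:

    # Sky-limited coadd scaling: m_coadd = m_single + 2.5*log10(sqrt(N)).
    import math

    def coadd_depth(m_single: float, n_exposures: int) -> float:
        return m_single + 1.25 * math.log10(n_exposures)

    m_minute = 19.6                        # quoted 1-minute g-band depth
    print(coadd_depth(m_minute, 60))       # 1 hour of 1-min frames -> ~21.8
    print(coadd_depth(m_minute, 23 * 60))  # ~23 stacked hours -> ~23.5

which lands within about 0.1 mag of the quoted hourly (21.9) and five-night (23.6) depths.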
The millihertz gravitational-wave frequency band is expected to contain a rich symphony of signals with sources ranging from galactic white dwarf binaries to extreme mass ratio inspirals. Many of these gravitational-wave signals will not be individually resolvable. Instead, they will incoherently add to produce stochastic gravitational-wave confusion noise whose frequency content will be governed by the dynamics of the sources. The angular structure of the power of the confusion noise will be modulated by the distribution of the sources across the sky. Measurement of this structure can yield important information about the distribution of sources on galactic and extra-galactic scales, their astrophysics and their evolution over cosmic timescales. Moreover, since the confusion noise is part of the noise budget of LISA, mapping it will also be essential for studying resolvable signals. In this paper, we present a Bayesian algorithm to probe the angular distribution of the stochastic gravitational-wave confusion noise with LISA using a spherical harmonic basis. We develop a technique based on Clebsch-Gordan coefficients to mathematically constrain the spherical harmonics to yield a non-negative distribution, making them optimal for expanding the gravitational-wave power and amenable to Bayesian inference. We demonstrate these techniques using a series of simulations and analyses, including recovery of simulated distributed and localized sources of gravitational-wave power. We also apply this method to map the gravitational-wave foreground from galactic white-dwarfs using a simplified model of the galactic white dwarf distribution.
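A sketch of the non-negativity construction the abstract alludes to, in my own notation (not necessarily the paper's): expand the amplitude rather than the power, so that non-negativity holds by construction, and let products of spherical harmonics recombine through Clebsch-Gordan coefficients via the Gaunt expansion.

    a(\hat n) = \sum_{\ell m} b_{\ell m} \, Y_{\ell m}(\hat n),
    \qquad
    P(\hat n) = \left| a(\hat n) \right|^2 \;\ge\; 0,

    Y_{\ell_1 m_1} Y_{\ell_2 m_2}
      = \sum_{L M}
        \sqrt{\frac{(2\ell_1+1)(2\ell_2+1)}{4\pi (2L+1)}}
        \, \langle \ell_1 0 \, \ell_2 0 \mid L 0 \rangle
        \, \langle \ell_1 m_1 \, \ell_2 m_2 \mid L M \rangle
        \, Y_{L M}

The multipole moments of P are therefore quadratic in the b coefficients, so sampling the b coefficients in a Bayesian analysis automatically keeps the inferred sky map non-negative.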
As current- and next-generation astronomical instruments come online, they will generate an unprecedented deluge of data. Analyzing these data in real time presents unique conceptual and computational challenges, and their long-term storage and archiving is scientifically essential for generating reliable, reproducible results. We present here the real-time processing (RTP) system for the Hydrogen Epoch of Reionization Array (HERA), a radio interferometer endeavoring to provide the first detection of the highly redshifted 21 cm signal from Cosmic Dawn and the Epoch of Reionization by an interferometer. The RTP system consists of analysis routines run on raw data shortly after they are acquired, such as calibration and detection of radio-frequency interference (RFI) events. RTP works closely with the Librarian, the HERA data storage and transfer manager which automatically ingests data and transfers copies to other clusters for post-processing analysis. Both the RTP system and the Librarian are public and open source software, which allows for them to be modified for use in other scientific collaborations. When fully constructed, HERA is projected to generate over 50 terabytes (TB) of data each night, and the RTP system enables the successful scientific analysis of these data.
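As a sketch of the division of labour described here (analysis steps run as data arrive, then a hand-off to the storage manager), the toy below chains calibration and RFI flagging per file; every name in it is a hypothetical illustration, not the actual HERA software API.

    # Toy real-time pipeline in the spirit of the RTP/Librarian split.
    # All names are hypothetical illustrations, not HERA's real API.
    from typing import Callable, List

    def calibrate(path: str) -> str:
        print(f"calibrating {path}")
        return path + ".cal"

    def flag_rfi(path: str) -> str:
        print(f"flagging RFI in {path}")
        return path + ".flagged"

    def librarian_ingest(path: str) -> None:
        # Stand-in for the Librarian: catalogue the file and queue
        # copies for transfer to post-processing clusters.
        print(f"ingesting {path} and queuing off-site copies")

    STEPS: List[Callable[[str], str]] = [calibrate, flag_rfi]

    def rtp(raw_file: str) -> None:
        # Run each analysis step shortly after acquisition.
        for step in STEPS:
            raw_file = step(raw_file)
        librarian_ingest(raw_file)

    rtp("2458043_raw.uvh5")  # example raw-data filename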