
A strategy for LSST to unveil a population of kilonovae without gravitational-wave triggers

Posted by: Igor Andreoni
Publication date: 2018
Research field: Physics
Paper language: English





We present a cadence optimization strategy to unveil a large population of kilonovae using optical imaging alone. These transients are generated during binary neutron star and potentially neutron star-black hole mergers, and are electromagnetic counterparts to gravitational-wave signals detectable in nearby events with Advanced LIGO, Advanced Virgo, and other interferometers that will come online in the near future. Discovering a large population of kilonovae will allow us to determine how heavy element production varies with the intrinsic parameters of the merger and across cosmic time. The rate of binary neutron star mergers is still uncertain, but only a few (fewer than 15) events with associated kilonovae may be detectable per year within the horizon of next-generation ground-based interferometers. The rapid evolution (hours to days) at optical/infrared wavelengths, the relatively low luminosity, and the low volumetric rate of kilonovae make their discovery difficult, especially during blind surveys of the sky. We propose that future large surveys adopt a rolling cadence in which g- and i-band observations are taken nightly for blocks of 10 consecutive nights. With the current baseline2018a cadence designed for the Large Synoptic Survey Telescope (LSST), fewer than 7.5 poorly sampled kilonovae are expected to be detected in both the Wide Fast Deep (WFD) and Deep Drilling Fields (DDF) surveys per year, under optimistic assumptions on their rate, duration, and luminosity. We estimate that the proposed strategy would return up to about 272 GW170817-like kilonovae over the course of the LSST WFD survey, discovered independently of gravitational-wave triggers.
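As a rough illustration of the kind of yield estimate described above, the following Monte Carlo sketch places GW170817-like kilonovae uniformly in volume, fades them at a constant rate, and counts how many are caught at least twice during a rolling cadence of nightly visits in 10-night blocks. All numbers (merger rate, peak magnitude, fade rate, limiting magnitude, block spacing, detection criterion) are illustrative assumptions, not the values adopted in the paper.

    # Minimal sketch of a serendipitous kilonova yield estimate under a rolling cadence.
    # Every numerical value below is an assumption chosen for illustration only.
    import numpy as np

    rng = np.random.default_rng(42)

    RATE = 300e-9              # BNS merger rate [Mpc^-3 yr^-1], assumed
    D_MAX = 300.0              # maximum distance considered [Mpc]
    AREA_FRAC = 18000 / 41253  # WFD-like footprint as a fraction of the sky
    YEARS = 10.0               # survey duration
    M_PEAK_I = -16.0           # assumed peak absolute magnitude (i band)
    FADE = 1.0                 # assumed fade rate [mag/day]
    LIM_MAG_I = 23.4           # assumed single-visit 5-sigma depth (i band)
    MIN_DETECTIONS = 2         # epochs above threshold required for a discovery

    # Expected number of mergers inside the surveyed volume over the survey.
    volume = (4.0 / 3.0) * np.pi * D_MAX**3 * AREA_FRAC        # [Mpc^3]
    n_events = rng.poisson(RATE * volume * YEARS)

    # Distances uniform in (Euclidean) volume, merger times uniform over the survey.
    d = D_MAX * rng.uniform(size=n_events) ** (1.0 / 3.0)      # [Mpc]
    t0 = rng.uniform(0.0, YEARS * 365.25, size=n_events)       # [days]

    # Rolling cadence: nightly visits during 10-night blocks, one block per month.
    nights = np.arange(0.0, YEARS * 365.25)
    obs_nights = nights[(nights % 30.0) < 10.0]

    detected = 0
    for di, ti in zip(d, t0):
        dt = obs_nights - ti
        dt = dt[(dt >= 0.0) & (dt < 10.0)]                     # visible ~10 days at most
        if dt.size == 0:
            continue
        app_mag = M_PEAK_I + FADE * dt + 5.0 * np.log10(di * 1e6 / 10.0)
        if np.sum(app_mag < LIM_MAG_I) >= MIN_DETECTIONS:
            detected += 1

    print(f"simulated mergers in volume: {n_events}, recovered as kilonovae: {detected}")

The point of the sketch is the structure of the calculation (rate x volume x cadence-dependent recovery), not the output numbers, which depend entirely on the assumed inputs.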




Read also

With the detection of a binary neutron star system and its corresponding electromagnetic counterparts, a new window of transient astronomy has opened. Due to the size of the error regions, which can span hundreds to thousands of square degrees, there are significant benefits to optimizing tilings for these large sky areas. The rich science promised by gravitational-wave astronomy has led to the proposal for a variety of tiling and time allocation schemes, and for the first time, we make a systematic comparison of some of these methods. We find that differences of a factor of 2 or more in efficiency are possible, depending on the algorithm employed. For this reason, for future surveys searching for electromagnetic counterparts, care should be taken when selecting tiling, time allocation, and scheduling algorithms to maximize the probability of counterpart detection.
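To make the tiling comparison concrete, the sketch below implements the simplest "ranked tiles" strategy: sort tiles by enclosed localisation probability, observe them in that order until a target credible level is reached, and split the available time among them. The skymap here is synthetic; real GW alerts release HEALPix probability maps, and the paper compares several tiling and time-allocation schemes, of which this is only one illustrative example.

    # Greedy ranked-tiles sketch on a synthetic localisation map (illustrative only).
    import numpy as np

    rng = np.random.default_rng(0)

    n_tiles = 2000                                  # hypothetical tiling of the visible sky
    prob = rng.dirichlet(np.full(n_tiles, 0.05))    # synthetic, concentrated probability map

    order = np.argsort(prob)[::-1]                  # highest-probability tiles first
    cum = np.cumsum(prob[order])
    n_needed = int(np.searchsorted(cum, 0.9)) + 1   # tiles covering the 90% credible region

    print(f"{n_needed} tiles cover 90% of the localisation probability")

    # Time allocation: a simple proportional scheme, giving each selected tile exposure
    # time proportional to its probability. Other schemes weight by achievable depth,
    # airmass, or setting time, which is where the factor-of-2 efficiency differences arise.
    t_total = 8.0 * 3600.0                          # one night of observing time [s]
    sel = order[:n_needed]
    t_tile = t_total * prob[sel] / prob[sel].sum()
    print(f"largest single-tile allocation: {t_tile.max():.0f} s")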
We investigate the ability of the Large Synoptic Survey Telescope (LSST) to discover kilonovae (kNe) from binary neutron star (BNS) and neutron star-black hole (NSBH) mergers, focusing on serendipitous detections in the Wide-Fast-Deep (WFD) survey. We simulate observations of kNe with proposed LSST survey strategies, paying particular attention to cadence choices that are compatible with the broader LSST cosmology programme. We find that if all kNe are identical to GW170817, the baseline survey strategy will yield 58 kNe over the survey lifetime. If we instead assume a representative population model of BNS kNe, we expect to detect only 27 kNe. However, we find the choice of survey strategy significantly impacts these numbers and can increase them to 254 kNe and 82 kNe over the survey lifetime, respectively. This improvement arises from an increased cadence of observations between different filters with respect to the baseline. We then consider the ability of the Advanced LIGO/Virgo (ALV) detector network to detect these BNS mergers. If the optimal survey strategy is adopted, 202 of the GW170817-like kNe and 56 of the BNS population model kNe are detected with LSST but are below the threshold for detection by the ALV network. This represents, for both models, an increase by a factor greater than 4.5 in the number of detected sub-threshold events over the baseline survey strategy. Such a population of sub-threshold events would provide an opportunity to conduct electromagnetic-triggered searches for signals in gravitational-wave detector data and assess selection effects in measurements of the Hubble constant from standard sirens, e.g., related to viewing angle effects.
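The notion of a "sub-threshold" event can be illustrated with a toy calculation: draw mergers uniformly in volume, score each against a crude GW signal-to-noise scaling and against an optical depth limit, and count those that are optically detectable but below the GW network threshold. The SNR scaling, reference distance, thresholds, and peak magnitude below are simplified assumptions for illustration, not the detector or population models used in the paper.

    # Toy classification of EM-detected but GW sub-threshold mergers (illustrative only).
    import numpy as np

    rng = np.random.default_rng(1)

    N = 10000
    d = 400.0 * rng.uniform(size=N) ** (1.0 / 3.0)   # distance [Mpc], uniform in volume
    w = rng.uniform(size=N)                          # crude proxy for orientation/antenna factor

    # GW side: assumed optimal-orientation SNR of 8 at 190 Mpc for a GW170817-like BNS,
    # scaled as 1/distance and degraded by the orientation proxy.
    snr = 8.0 * (190.0 / d) * (0.3 + 0.7 * w)
    gw_detected = snr > 12.0                         # assumed network threshold

    # EM side: peak apparent magnitude vs an assumed LSST single-visit depth.
    m_peak = -16.0 + 5.0 * np.log10(d * 1e6 / 10.0)  # assumed peak absolute magnitude -16
    em_detected = m_peak < 23.4

    sub_threshold = em_detected & ~gw_detected
    print(f"fraction EM-detected but GW sub-threshold: {sub_threshold.mean():.2f}")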
The Wide-Field Infrared Survey Telescope (WFIRST) is expected to launch in the mid-2020s. With its wide-field near-infrared (NIR) camera, it will survey the sky to unprecedented detail. As part of normal operations and as the result of multiple expected dedicated surveys, WFIRST will produce several relatively wide-field (tens of square degrees) deep (limiting magnitude of 28 or fainter) fields. In particular, a planned supernova survey is expected to image 3 deep fields in the LSST footprint roughly every 5 days over 2 years. Stacking all data, this survey will produce, over all WFIRST supernova fields in the LSST footprint, ~12-25 deg^2 and ~5-15 deg^2 regions to depths of ~28 mag and ~29 mag, respectively. We suggest LSST undertake mini-surveys that will match the WFIRST cadence and simultaneously observe the supernova survey fields during the 2-year WFIRST supernova survey, achieving a stacked depth similar to that of the WFIRST data. We also suggest additional observations of these same regions throughout the LSST survey to get deep images earlier, have long-term monitoring in the fields, and produce deeper images overall. These fields will provide a legacy for cosmology, extragalactic, and transient/variable science.
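The stacked-depth argument rests on the standard background-limited scaling: coadding N exposures of equal depth improves the point-source limit by roughly 2.5*log10(sqrt(N)) magnitudes. The sketch below works this through with an assumed single-visit depth; the actual LSST per-band depths and visit counts differ.

    # Back-of-the-envelope coadd depth vs number of stacked visits (assumed inputs).
    import numpy as np

    def coadd_depth(single_visit_depth, n_visits):
        """5-sigma point-source depth of a stack of n_visits equal, background-limited exposures."""
        return single_visit_depth + 2.5 * np.log10(np.sqrt(n_visits))

    m1 = 24.0                                   # assumed single-visit 5-sigma depth
    for n in (1, 50, 150, 500, 1600):
        print(f"{n:5d} visits -> coadd depth ~ {coadd_depth(m1, n):.1f} mag")
    # Gaining ~4 mag over a ~24 mag single visit takes of order 10^3 stacked visits,
    # which is why concentrating repeated LSST visits on the WFIRST supernova fields matters.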
We combine hierarchical Bayesian modeling with a flow-based deep generative network, in order to demonstrate that one can efficiently constrain numerical gravitational wave (GW) population models at a previously intractable complexity. Existing techniques for comparing data to simulation, such as discrete model selection and Gaussian process regression, can only be applied efficiently to moderate-dimension data. This limits the number of observables (e.g., chirp mass, spins) and hyper-parameters (e.g., common envelope efficiency) one can use in a population inference. In this study, we train a network to emulate a phenomenological model with 6 observables and 4 hyper-parameters, use it to infer the properties of a simulated catalogue, and compare the results to the phenomenological model. We find that a 10-layer network can emulate the phenomenological model accurately and efficiently. Our machine enables simulation-based GW population inferences to take on data at a new complexity level.
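The hierarchical step being emulated can be sketched as follows: the likelihood of the catalogue under hyper-parameters lambda is approximated, event by event, by reweighting per-event posterior samples with the population density p(theta | lambda). In the paper that density is supplied by a trained normalizing flow; in the sketch below it is replaced by a simple Gaussian stand-in, selection effects are ignored, and all names and numbers are illustrative.

    # Minimal sketch of hierarchical population inference by posterior-sample reweighting.
    import numpy as np

    def pop_density(theta, lam):
        """Stand-in for the flow emulator: p(theta | lambda).
        theta: (n_samples, n_dim) event-parameter samples; lam: hyper-parameters.
        Here: an isotropic Gaussian whose mean and width are set by lam (assumed form)."""
        mu, sigma = lam
        return np.prod(
            np.exp(-0.5 * ((theta - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi)),
            axis=1,
        )

    def log_hyper_likelihood(event_posterior_samples, prior_density, lam):
        """ln p({d_i} | lambda) up to a constant: for each event, average
        p_pop(theta | lambda) / pi(theta) over that event's posterior samples."""
        logL = 0.0
        for theta in event_posterior_samples:        # theta: (n_samples, n_dim)
            w = pop_density(theta, lam) / prior_density(theta)
            logL += np.log(np.mean(w))
        return logL

    # Toy usage: 3 "events" with 2 observables each, flat per-event sampling prior.
    rng = np.random.default_rng(3)
    events = [rng.normal(1.0, 0.3, size=(500, 2)) for _ in range(3)]
    flat_prior = lambda theta: np.ones(theta.shape[0])
    print(log_hyper_likelihood(events, flat_prior, lam=(1.0, 0.5)))

Replacing pop_density with a learned flow is what lets the inference scale to the 6 observables and 4 hyper-parameters quoted in the abstract, since the density no longer has to be written down or interpolated by hand.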
Fink is a broker designed to enable science with large time-domain alert streams such as the one from the upcoming Vera C. Rubin Observatory Legacy Survey of Space and Time (LSST). It exhibits traditional astronomy broker features such as automated ingestion, annotation, selection and redistribution of promising alerts for transient science. It is also designed to go beyond traditional broker features by providing real-time transient classification which is continuously improved by using state-of-the-art Deep Learning and Adaptive Learning techniques. These evolving added values will enable more accurate scientific output from LSST photometric data for diverse science cases while also leading to a higher incidence of new discoveries which will accompany the evolution of the survey. In this paper we introduce Fink, its science motivation, architecture and current status including first science verification cases using the Zwicky Transient Facility alert stream.
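For readers unfamiliar with broker terminology, "selection and redistribution of promising alerts" amounts to running user-defined filters over the incoming stream. The generic sketch below shows that idea only; it is not Fink's actual interface, and the alert fields, thresholds, and object names are invented for illustration.

    # Generic alert-filter sketch (not Fink's API); fields and cuts are illustrative.
    def is_promising(alert: dict) -> bool:
        """Keep alerts that look like real, fast-evolving transients."""
        return (
            alert.get("rb_score", 0.0) > 0.9           # real/bogus classifier score
            and alert.get("n_detections", 0) >= 2      # seen at more than one epoch
            and not alert.get("known_solar_system", False)
            and alert.get("rise_rate_mag_per_day", 0.0) > 0.3
        )

    stream = [
        {"objectId": "alert-001", "rb_score": 0.97, "n_detections": 3,
         "known_solar_system": False, "rise_rate_mag_per_day": 0.6},
        {"objectId": "alert-002", "rb_score": 0.55, "n_detections": 1,
         "known_solar_system": False, "rise_rate_mag_per_day": 0.1},
    ]
    print([a["objectId"] for a in stream if is_promising(a)])   # -> ['alert-001']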