
Real-Time Value-Driven Data Augmentation in the Era of LSST

Added by Niharika Sravan
Publication date: 2020
Fields: Physics
Language: English





The deluge of data from time-domain surveys is rendering traditional human-guided data collection and inference techniques impractical. We propose a novel approach for conducting data collection for science inference in the era of massive large-scale surveys that uses value-based metrics to autonomously strategize and coordinate follow-up in real time. We demonstrate the underlying principles in the Recommender Engine For Intelligent Transient Tracking (REFITT), which ingests live alerts from surveys and value-added inputs from data brokers to predict the future behavior of transients and design optimal data augmentation strategies for a given set of scientific objectives. The prototype presented in this paper is validated on simulated Rubin Observatory Legacy Survey of Space and Time (LSST) core-collapse supernova (CC SN) light curves from the PLAsTiCC dataset. CC SNe were selected for the initial development phase because they are known to be difficult to classify, with the expectation that any learning techniques developed for them should be at least as effective for other transients. We demonstrate the behavior of REFITT on a random LSST night given ~32,000 live CC SNe of interest. The system makes good predictions for the photometric behavior of the events and uses them to plan follow-up using a simple data-driven metric. We argue that machine-directed follow-up maximizes the scientific potential of surveys and follow-up resources by reducing downtime and bias in data collection.
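The abstract describes ranking live transients by a value metric and filling follow-up slots accordingly. The sketch below illustrates that idea in the simplest possible form; the field names, the uncertainty-times-brightness value function, and the assumed limiting magnitude are all illustrative, not REFITT's actual metric or schema.

```python
# Hypothetical sketch of value-driven follow-up prioritization, in the
# spirit of REFITT: each live transient carries a predicted brightness
# and a per-epoch model uncertainty, and the "value" of observing it is
# taken here to be that uncertainty (observe where the model is least
# certain), down-weighted for faint targets. All names are illustrative.

def follow_up_value(transient, limiting_mag=24.0):
    """Value of observing this transient tonight: predicted flux
    uncertainty, weighted by margin above the assumed survey depth."""
    sigma = transient["predicted_sigma"]  # model uncertainty (mag)
    mag = transient["predicted_mag"]      # predicted brightness (mag)
    if mag >= limiting_mag:               # too faint to observe at all
        return 0.0
    return sigma * (limiting_mag - mag)

def plan_follow_up(transients, n_slots):
    """Fill n_slots observation slots with the highest-value targets."""
    ranked = sorted(transients, key=follow_up_value, reverse=True)
    return [t["id"] for t in ranked[:n_slots]]

candidates = [
    {"id": "SN-a", "predicted_mag": 21.0, "predicted_sigma": 0.40},
    {"id": "SN-b", "predicted_mag": 19.5, "predicted_sigma": 0.05},
    {"id": "SN-c", "predicted_mag": 25.1, "predicted_sigma": 0.90},
]
print(plan_follow_up(candidates, 2))  # ['SN-a', 'SN-b']
```

A real system would derive the value function from the scientific objectives (the paper's "set of scientific objectives"), but the ranking-and-allocation skeleton is the same.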



Related research

A community meeting on the topic of Radio Astronomy in the LSST Era was hosted by the National Radio Astronomy Observatory in Charlottesville, VA (2013 May 6--8). The focus of the workshop was on time-domain radio astronomy and sky surveys. For the time domain, the extent to which radio and visible-wavelength observations are required to understand several classes of transients was stressed, but there are also classes of radio transients for which no visible-wavelength counterpart is yet known, providing an opportunity for discovery. From the LSST perspective, the LSST is expected to generate as many as 1 million alerts nightly, which will require even more selective specification and identification of the classes and characteristics of transients that warrant follow-up, at radio or any other wavelength. The LSST will also conduct a deep survey of the sky, producing a catalog expected to contain over 38 billion objects. Deep radio-wavelength sky surveys will also be conducted on a comparable time scale, and radio and visible-wavelength observations are part of the multi-wavelength approach needed to classify and understand these objects. Radio wavelengths are valuable because they are unaffected by dust obscuration and, for galaxies, contain contributions both from star formation and from active galactic nuclei. The workshop touched on several other topics on which there was consensus, including the placement of other LSST Deep Drilling Fields, the interoperability of software tools, and the challenge of filtering and exploiting the LSST data stream. There were also topics for which there was insufficient time for full discussion or for which no consensus was reached, including the procedures for following up on LSST observations and the nature of future support for researchers wishing to use LSST data products.
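With ~1 million nightly alerts, follow-up at any wavelength requires selective cuts on transient class and characteristics, as the workshop summary stresses. A minimal sketch of such a cut is below; the alert schema, class names, and probability threshold are illustrative assumptions, not a real broker format.

```python
# Hypothetical alert-stream filter: keep an alert only if its most
# probable class is one we can follow up and the classifier is
# confident enough. Field names and thresholds are illustrative.

def warrants_followup(alert, min_prob=0.9, classes=("radio_transient",)):
    """True if the alert's best class is follow-up-worthy and confident."""
    probs = alert["class_probs"]
    best_class = max(probs, key=probs.get)
    return best_class in classes and probs[best_class] >= min_prob

stream = [
    {"id": 1, "class_probs": {"radio_transient": 0.95, "variable_star": 0.05}},
    {"id": 2, "class_probs": {"variable_star": 0.80, "radio_transient": 0.20}},
    {"id": 3, "class_probs": {"radio_transient": 0.60, "agn": 0.40}},
]
selected = [a["id"] for a in stream if warrants_followup(a)]
print(selected)  # [1] -- only the confident radio-transient candidate
```

Production brokers apply many such cuts in sequence (class, brightness, sky position, history), but each stage reduces to a predicate like this one applied to the full stream.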
Astrophysical observations currently provide the only robust, empirical measurements of dark matter. In the coming decade, astrophysical observations will guide other experimental efforts, while simultaneously probing unique regions of dark matter parameter space. This white paper summarizes astrophysical observations that can constrain the fundamental physics of dark matter in the era of LSST. We describe how astrophysical observations will inform our understanding of the fundamental properties of dark matter, such as particle mass, self-interaction strength, non-gravitational interactions with the Standard Model, and compact object abundances. Additionally, we highlight theoretical work and experimental/observational facilities that will complement LSST to strengthen our understanding of the fundamental characteristics of dark matter.
In the multi-messenger era, space- and ground-based observatories usually develop real-time analysis (RTA) pipelines to rapidly detect transient events and promptly share information with the scientific community to enable follow-up observations. These pipelines can also react to science alerts shared by other observatories through networks such as the Gamma-ray Coordinates Network (GCN) and the Astronomer's Telegram (ATel). AGILE is a space mission launched in 2007 to study X-ray and gamma-ray phenomena. This contribution presents the technologies used to develop two types of AGILE pipelines with the RTApipe framework, along with an overview of the main scientific results. The first type performs automated analyses on new AGILE data to detect transient events and automatically sends AGILE notices to the GCN network. Since May 2019, this pipeline has sent more than 50 automated notices, with a delay of only a few minutes after data arrival. The second type reacts to external multi-messenger alerts (neutrinos, gravitational waves, GRBs, and other transients) received through the GCN network and performs hundreds of analyses searching for counterparts in the data of all AGILE instruments. The AGILE Team uses these pipelines to perform fast follow-up of science alerts reported by other facilities, which has resulted in the publication of several ATels and GCN circulars.
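The second pipeline type described above fans one incoming external alert out to many counterpart searches. The skeleton below sketches that dispatch pattern only; the instrument list, function names, and alert fields are invented for illustration and are not the RTApipe API.

```python
# Hypothetical sketch of a reactive RTA pipeline: an external science
# alert arriving over a network such as GCN triggers a counterpart
# search in each instrument's data. A real pipeline would launch
# hundreds of analyses per alert; names here are illustrative.

INSTRUMENTS = ["GRID", "MCAL", "SuperAGILE"]

def counterpart_search(instrument, alert):
    """Placeholder analysis: search one instrument's data around the
    alert's time and position for a counterpart."""
    return {"instrument": instrument, "alert_id": alert["id"], "detected": False}

def handle_external_alert(alert):
    """Fan an incoming multi-messenger alert out to every instrument."""
    return [counterpart_search(inst, alert) for inst in INSTRUMENTS]

results = handle_external_alert({"id": "GCN-12345", "type": "neutrino"})
print(len(results), "analyses launched")  # 3 analyses launched
```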
Experience suggests that structural issues in how institutional Astrophysics approaches data-driven science and the development of discovery technology may be hampering the community's ability to respond effectively to a rapidly changing environment in which increasingly complex, heterogeneous datasets are challenging our existing information infrastructure and traditional approaches to analysis. We stand at the confluence of a new epoch of multi-messenger science, remote co-location of data and processing power, and new observing strategies based on miniaturized spacecraft. Significant effort will be required by the community to adapt to this rapidly evolving range of possible modes of discovery. In the suggested creation of a new Astrophysics element, Advanced Astrophysics Discovery Technology, we offer an affirmative solution that places the visibility of discovery technologies at a level we suggest is fully commensurate with their importance to the future of the field.
The Large Synoptic Survey Telescope is designed to provide an unprecedented optical imaging dataset that will support investigations of our Solar System, Galaxy, and Universe, across half the sky and over ten years of repeated observation. However, exactly how the LSST observations will be taken (the observing strategy or cadence) is not yet finalized. In this dynamically evolving community white paper, we explore how the detailed performance of the anticipated science investigations is expected to depend on small changes to the LSST observing strategy. Using realistic simulations of the LSST schedule and observation properties, we design and compute diagnostic metrics and Figures of Merit that provide quantitative evaluations of different observing strategies, analyzing their impact on a wide range of proposed science projects. This is work in progress: we are using this white paper to communicate to each other the relative merits of the observing strategy choices that could be made, in an effort to maximize the scientific value of the survey. The investigation of some science cases leads to suggestions for new strategies that could be simulated and potentially adopted. Notably, we find motivation for exploring departures from a spatially uniform annual tiling of the sky: focusing instead on different parts of the survey area in different years in a rolling cadence is likely to have significant benefits for a number of time-domain and moving-object astronomy projects. The communal assembly of a suite of quantified and homogeneously coded metrics is the vital first step towards an automated, systematic, science-based assessment of any given cadence simulation, which will enable the scheduling of the LSST to be as well-informed as possible.
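The white paper above builds diagnostic metrics over simulated schedules to compare cadences. As a toy illustration of what such a metric looks like, the sketch below computes the median revisit gap per field for two made-up schedules; the schedules, field names, and the choice of metric are assumptions, far simpler than real Figures of Merit.

```python
# Hypothetical cadence diagnostic: given a simulated schedule (visit
# times per field, in days), compute the median gap between consecutive
# visits to each field. Smaller gaps benefit time-domain science; a
# rolling cadence concentrates visits on the active fields in a given
# year, shrinking their gaps. Data and metric are illustrative.

from statistics import median

def median_revisit_gap(visit_times):
    """Median gap (days) between consecutive visits to one field."""
    times = sorted(visit_times)
    gaps = [b - a for a, b in zip(times, times[1:])]
    return median(gaps)

uniform = {"field_1": [0, 30, 60, 90], "field_2": [15, 45, 75, 105]}
rolling = {"field_1": [0, 3, 6, 9], "field_2": [60, 90]}  # field_1 active

for name, schedule in (("uniform", uniform), ("rolling", rolling)):
    gaps = {f: median_revisit_gap(t) for f, t in schedule.items()}
    print(name, gaps)
```

Averaging such per-field values over the whole sky, per filter and per science case, is what turns a single number like this into a Figure of Merit that can rank cadence simulations.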
