
Historical astronomical data: urgent need for preservation, digitization enabling scientific exploration

Added by Alexei Pevtsov
Publication date: 2019
Field: Physics
Language: English





Over the past decades and even centuries, the astronomical community has accumulated a significant heritage of recorded observations of a great many astronomical objects. Those records contain irreplaceable information about long-term evolutionary and non-evolutionary changes in our Universe, and their preservation and digitization are vital. Unfortunately, most of those data risk becoming degraded and hence totally lost. We hereby call upon the astronomical community and US funding agencies to recognize the gravity of the situation, and to commit to an international preservation and digitization effort through comprehensive long-term planning supported by adequate resources, prioritizing where the expected scientific gains, the vulnerability of the originals, and the availability of relevant infrastructure so dictate. The importance and urgency of this issue were recognized recently by General Assembly XXX of the International Astronomical Union (IAU) in its Resolution B3 on the preservation, digitization and scientific exploration of historical astronomical data. We outline the rationale for this effort, provide examples of new science enabled by successful recovery efforts, and review the potential losses to science if nothing is done.



Related research

We present CosmoHub (https://cosmohub.pic.es), a web application based on Hadoop to perform interactive exploration and distribution of massive cosmological datasets. Modern cosmology seeks to unveil the nature of both dark matter and dark energy by mapping the large-scale structure of the Universe, through the analysis of massive amounts of astronomical data that have grown steadily over recent decades (and will continue to grow) with the digitization and automation of experimental techniques. CosmoHub, hosted and developed at the Port d'Informació Científica (PIC), provides support to a worldwide community of scientists, without requiring the end user to know any Structured Query Language (SQL). It serves data from several large international collaborations such as the Euclid space mission, the Dark Energy Survey (DES), the Physics of the Accelerating Universe Survey (PAUS) and the Marenostrum Institut de Ciències de l'Espai (MICE) numerical simulations. While originally developed as a PostgreSQL relational database web frontend, this work describes the current version of CosmoHub, built on top of Apache Hive, which facilitates scalable reading, writing and managing of huge datasets. As CosmoHub's datasets are seldom modified, Hive is a better fit. Over 60 TiB of catalogued information and $50 \times 10^9$ astronomical objects can be interactively explored using an integrated visualization tool which includes 1D histogram and 2D heatmap plots. In our current implementation, online exploration of datasets of $10^9$ objects can be done on a timescale of tens of seconds. Users can also download customized subsets of data in standard formats, generated in a few minutes.
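The interactive histograms described above work because the binning is pushed down into the query engine as an aggregation, so only per-bin counts travel to the browser. A minimal sketch of that pattern, using an in-memory SQLite table as a stand-in for a Hive table (the table and column names here are hypothetical, not CosmoHub's actual schema):

```python
import sqlite3
import random

# Illustrative only: a tiny in-memory catalogue standing in for a Hive table.
# The table name ("catalogue") and column ("mag_i") are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE catalogue (mag_i REAL)")
random.seed(42)
conn.executemany(
    "INSERT INTO catalogue VALUES (?)",
    [(random.gauss(22.0, 1.5),) for _ in range(10_000)],
)

# A 1D histogram expressed as a single aggregation query: bin the magnitude
# into 0.5-mag buckets and count rows per bucket. A Hive backend can run the
# same GROUP BY pattern across the whole cluster, so only the per-bin counts,
# never the underlying 10^9 rows, reach the visualization client.
hist = conn.execute(
    "SELECT CAST(mag_i / 0.5 AS INT) AS bin, COUNT(*) AS n "
    "FROM catalogue GROUP BY bin ORDER BY bin"
).fetchall()

total = sum(n for _, n in hist)
print(total)  # every row is accounted for across the bins
```

The same idea extends to the 2D heatmaps by grouping on two binned columns at once.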
In the multi-messenger era, astronomical projects share information about transient phenomena by issuing science alerts to the scientific community through different communication networks. This coordination is mandatory to understand the nature of these physical phenomena. For this reason, astrophysical projects rely on real-time analysis software pipelines to identify transients (e.g. GRBs) as soon as possible, and to speed up the reaction time to external alerts. These pipelines can share and receive science alerts through the Gamma-ray Coordinates Network. This work presents a framework designed to simplify the development of real-time scientific analysis pipelines. The framework provides the architecture and the required automatisms to develop a real-time analysis pipeline, allowing researchers to focus on the scientific aspects. The framework has been successfully used to develop real-time pipelines for the scientific analysis of AGILE space mission data. It is planned to reuse this framework for the Super-GRAWITA and AFISS projects. A possible future use for the Cherenkov Telescope Array (CTA) project is under evaluation.
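The architecture such a framework provides can be pictured as a queue of incoming alerts fed through a chain of registered analysis stages. The sketch below illustrates that pattern only; it is not the actual framework used for AGILE, and all class, field and alert names are hypothetical:

```python
import queue

class AlertPipeline:
    """Minimal sketch of a real-time analysis pipeline: alerts arrive on a
    queue and registered analysis stages run on each one in order. This is
    an illustration of the architecture described above, not the AGILE
    framework itself; every name here is hypothetical."""

    def __init__(self):
        self._queue = queue.Queue()
        self._stages = []          # callables applied to every alert in order
        self._results = []

    def register(self, stage):
        self._stages.append(stage)

    def submit(self, alert):       # e.g. a GCN notice parsed into a dict
        self._queue.put(alert)

    def run(self):
        # Consume alerts until a None sentinel; a real pipeline would instead
        # listen continuously on a network connection to the GCN.
        while True:
            alert = self._queue.get()
            if alert is None:
                break
            for stage in self._stages:
                alert = stage(alert)
            self._results.append(alert)
        return self._results

pipeline = AlertPipeline()
# Stage 1: compute the latency between event time and alert time.
pipeline.register(lambda a: {**a, "dt_s": a["t_alert"] - a["t_event"]})
# Stage 2: decide whether the alert is fresh enough for rapid follow-up.
pipeline.register(lambda a: {**a, "followup": a["dt_s"] < 30.0})

pipeline.submit({"id": "GRB-1", "t_event": 100.0, "t_alert": 112.0})
pipeline.submit({"id": "GRB-2", "t_event": 200.0, "t_alert": 260.0})
pipeline.submit(None)

results = pipeline.run()
print([r["followup"] for r in results])  # [True, False]
```

Keeping the stages as plain callables is what lets researchers contribute only the scientific logic while the framework owns the queueing and dispatch.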
The fields of Astronomy and Astrophysics are technology-limited, where the advent and application of new technologies to astronomy usher in a flood of discoveries altering our understanding of the Universe (recent cases include LIGO and the GRAVITY instrument at the VLTI). Currently, the field of astronomical spectroscopy is rapidly approaching an impasse: the size and cost of instruments, especially multi-object and integral field spectrographs for extremely large telescopes (ELTs), are pushing the limits of what is feasible, requiring optical components at the very edge of achievable size and performance. For these reasons, astronomers are increasingly looking to innovative solutions like photonic technologies that promote instrument miniaturization and simplification, while providing superior performance. Astronomers have long been aware of the potential of photonic technologies. The goal of this white paper is to draw attention to key photonic technologies and developments over the past two decades and demonstrate that there is new momentum in this arena. We outline where the most critical efforts should be focused over the coming decade in order to move towards realizing a fully photonic instrument. A relatively small investment in this technology will advance astronomical photonics to a level where it can reliably be used to solve challenging instrument design limitations. For the benefit of both ground-based and space-borne instruments alike, an endorsement from the National Academy of Sciences decadal survey will ensure that such solutions are set on a path to their full scientific exploitation, which may one day address a broad range of science cases outlined in the KSPs.
We present a new framework to detect various types of variable objects within massive astronomical time-series data. Assuming that the dominant population of objects is non-variable, we find outliers from this population by using a non-parametric Bayesian clustering algorithm based on an infinite Gaussian Mixture Model (GMM) and the Dirichlet Process. The algorithm extracts information from a given dataset, which is described by six variability indices. The GMM uses those variability indices to recover clusters that are described by six-dimensional multivariate Gaussian distributions, allowing our approach to consider the sampling pattern of time-series data, systematic biases, the number of data points for each light curve, and photometric quality. Using the Northern Sky Variability Survey data, we test our approach and show that the infinite GMM is useful for detecting variable objects, while providing statistical inference that suppresses false detections. The proposed approach will be effective in the exploration of future surveys such as Gaia, Pan-STARRS, and LSST, which will produce massive time-series data.
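The core idea, that variables are outliers from a dominant non-variable Gaussian cluster in the six-dimensional space of variability indices, can be sketched without the full Dirichlet-process machinery. The simplified stand-in below fits a single six-dimensional Gaussian to the data and flags points by Mahalanobis distance; the indices are synthetic, and the full method instead lets an infinite GMM infer the number of clusters from the data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic variability indices: six indices per light curve, as in the text.
# The dominant non-variable population forms one tight Gaussian cluster,
# while a handful of variables sit far from it. Values are for illustration.
n_quiet, n_var = 2000, 20
quiet = rng.normal(0.0, 1.0, size=(n_quiet, 6))
variables = rng.normal(6.0, 1.0, size=(n_var, 6))
X = np.vstack([quiet, variables])

# Simplified stand-in for the infinite GMM: model the dominant population
# with a single six-dimensional Gaussian (mean + covariance) and flag as
# variable candidates the points whose squared Mahalanobis distance exceeds
# a chi-square-motivated threshold.
mu = np.mean(X, axis=0)
cov = np.cov(X, rowvar=False)
inv_cov = np.linalg.inv(cov)
diff = X - mu
d2 = np.einsum("ij,jk,ik->i", diff, inv_cov, diff)

threshold = 22.46  # ~ chi-square quantile for 6 dof at p = 0.001
candidates = np.flatnonzero(d2 > threshold)
print(len(candidates))
```

The single-Gaussian assumption is what the infinite GMM relaxes: with a Dirichlet-process prior, the number of Gaussian components is inferred from the data, so multiple quiescent sub-populations do not get lumped into one inflated covariance.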
R. J. Hanisch (2015)
The U.S. Virtual Astronomical Observatory was a software infrastructure and development project designed both to begin the establishment of an operational Virtual Observatory (VO) and to provide the U.S. coordination with the international VO effort. The concept of the VO is to provide the means by which an astronomer is able to discover, access, and process data seamlessly, regardless of its physical location. This paper describes the origins of the VAO, including the predecessor efforts within the U.S. National Virtual Observatory, and summarizes its main accomplishments. These accomplishments include the development of both scripting toolkits that allow scientists to incorporate VO data directly into their reduction and analysis environments and high-level science applications for data discovery, integration, analysis, and catalog cross-comparison. Working with the international community, and based on the experience from the software development, the VAO was a major contributor to international standards within the International Virtual Observatory Alliance. The VAO also demonstrated how an operational virtual observatory could be deployed, providing a robust operational environment in which VO services worldwide were routinely checked for aliveness and compliance with international standards. Finally, the VAO engaged in community outreach, developing a comprehensive web site with on-line tutorials, announcements, links to both U.S. and internationally developed tools and services, and exhibits and hands-on training .... All digital products of the VAO Project, including software, documentation, and tutorials, are stored in a repository for community access. The enduring legacy of the VAO is an increasing expectation that new telescopes and facilities incorporate VO capabilities during the design of their data management systems.