The field of astronomy has arrived at a turning point in terms of the size and complexity of both datasets and scientific collaborations. Commensurately, algorithms and statistical models have begun to adapt --- e.g., via the onset of artificial intelligence --- which itself presents new challenges and opportunities for growth. This white paper aims to offer guidance and ideas for how we can evolve our technical and collaborative frameworks to promote efficient algorithmic development and take advantage of opportunities for scientific discovery in the petabyte era. We discuss challenges for discovery in large and complex datasets; challenges and requirements for the next stage of development of statistical methodologies and algorithmic tool sets; how we might change our paradigms of collaboration and education; and the ethical implications of scientists' contributions to widely applicable algorithms and computational modeling. We begin with six distinct recommendations, which are supported by the commentary that follows them. This white paper is related to a larger corpus of effort that has taken place within and around the Petabytes to Science Workshops (https://petabytestoscience.github.io/).
We outline the challenges faced by the planetary science community in the era of next-generation large-scale astronomical surveys, and highlight needs that must be addressed in order for the community to maximize the quality and quantity of scientific output from archival, existing, and future surveys, while satisfying NASA's and NSF's goals.
Machine learning algorithms are effective tools for both classification and prediction, and they can also be used to drive scientific discovery from the enormous volumes of data being collected in our era. We present ways of discovering and understanding astronomical phenomena by applying machine learning algorithms to data collected with radio telescopes. We discuss the use of supervised machine learning algorithms to predict the free parameters of star formation histories and to better understand the relations between the different input and output parameters. We use deep learning to capture the non-linearity in the parameters. Our models predict with low error rates and, once trained, offer the advantage of predicting in real time. The other class of machine learning algorithms, unsupervised learning, can prove very useful for finding patterns in the data. We explore how such unsupervised techniques can be applied to solar radio data to identify patterns and variations, and how these findings can be linked to theory to better understand the nature of the system being studied. We highlight the challenges faced in terms of data size, availability, features, processing ability and, importantly, the interpretability of results. As our ability to capture and store data increases, increased use of machine learning to understand the underlying physics in the information captured seems inevitable.
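As a concrete but hedged illustration of the supervised approach described in the abstract above, the sketch below trains a small feed-forward neural network to map simulated observables to star-formation-history parameters. The synthetic data, the number of features and parameters, and the network size are illustrative assumptions; they are not the data or models used in the work itself.

```python
# Minimal sketch (not the authors' pipeline): a small feed-forward network
# regressing star-formation-history parameters from simulated observables.
# All names, shapes, and the synthetic data below are illustrative assumptions.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)

# Hypothetical training set: rows are simulated galaxies, columns are
# observables (e.g. broad-band fluxes); targets are SFH free parameters
# such as a characteristic timescale and a normalisation.
n_galaxies, n_features, n_params = 5000, 12, 3
X = rng.normal(size=(n_galaxies, n_features))
true_weights = rng.normal(size=(n_features, n_params))
y = np.tanh(X @ true_weights) + 0.05 * rng.normal(size=(n_galaxies, n_params))

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
scaler = StandardScaler().fit(X_train)

# Two hidden layers let the model capture a non-linear mapping
# between observables and SFH parameters.
model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
model.fit(scaler.transform(X_train), y_train)

pred = model.predict(scaler.transform(X_test))
print("mean absolute error per parameter:",
      mean_absolute_error(y_test, pred, multioutput="raw_values"))
```

Once such a model is trained, predictions for new objects are effectively instantaneous, which is the real-time advantage noted above.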
Astronomy is entering a new era of discovery, coincident with the establishment of new facilities for observation and simulation that will routinely generate petabytes of data. While an increasing reliance on automated data analysis is anticipated, a critical role will remain for visualization-based knowledge discovery. We have investigated scientific visualization applications in astronomy through an examination of the literature published during the last two decades. We identify the two most active fields for progress - visualization of large-N particle data and spectral data cubes - discuss open areas of research, and introduce a mapping between astronomical sources of data and the data representations used in general-purpose visualization tools. We discuss contributions using high-performance computing architectures (e.g., distributed processing and GPUs), collaborative astronomy visualization, the use of workflow systems to store metadata about visualization parameters, and the use of advanced interaction devices. We examine a number of issues that may be limiting the spread of scientific visualization research in astronomy and identify six grand challenges for scientific visualization research in the Petascale Astronomy Era.
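As a hedged example of one of the two data types identified above, the following sketch collapses a synthetic spectral data cube along its spectral axis into a zeroth-moment (integrated intensity) map, one of the most common cube representations; the toy cube and all numerical choices are illustrative and do not correspond to any particular survey or visualization tool discussed in the review.

```python
# Minimal sketch of one common spectral-cube representation: collapsing a
# (velocity, y, x) cube along the spectral axis into a moment-0 map.
# The synthetic cube below is an illustrative stand-in for real survey data.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
n_chan, ny, nx = 64, 128, 128

# Build a toy cube: a Gaussian source whose line emission peaks mid-band,
# on top of per-channel noise.
yy, xx = np.mgrid[0:ny, 0:nx]
source = np.exp(-(((yy - 64) ** 2 + (xx - 64) ** 2) / (2 * 10.0 ** 2)))
line_profile = np.exp(-((np.arange(n_chan) - 32) ** 2) / (2 * 6.0 ** 2))
cube = line_profile[:, None, None] * source[None, :, :]
cube += 0.05 * rng.normal(size=cube.shape)

# Moment 0: integrate (sum) over the spectral axis.
channel_width = 1.0  # km/s per channel (arbitrary units here)
moment0 = cube.sum(axis=0) * channel_width

plt.imshow(moment0, origin="lower", cmap="viridis")
plt.colorbar(label="integrated intensity (arbitrary units)")
plt.title("Moment-0 map of a synthetic spectral cube")
plt.show()
```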
This report provides an overview of recent work that harnesses the Big Data Revolution and Large Scale Computing to address grand computational challenges in Multi-Messenger Astrophysics, with a particular emphasis on real-time discovery campaigns. Acknowledging the transdisciplinary nature of Multi-Messenger Astrophysics, this document has been prepared by members of the physics, astronomy, computer science, data science, software, and cyberinfrastructure communities who attended the NSF-, DOE-, and NVIDIA-funded Deep Learning for Multi-Messenger Astrophysics: Real-time Discovery at Scale workshop, hosted at the National Center for Supercomputing Applications, October 17-19, 2018. Highlights of this report include unanimous agreement that it is critical to accelerate the development and deployment of novel signal-processing algorithms that exploit the synergy between artificial intelligence (AI) and high performance computing to maximize the potential for scientific discovery with Multi-Messenger Astrophysics. We discuss key aspects of realizing this endeavor, namely: (i) the design and exploitation of scalable and computationally efficient AI algorithms for Multi-Messenger Astrophysics; (ii) cyberinfrastructure requirements to numerically simulate astrophysical sources, and to process and interpret Multi-Messenger Astrophysics data; (iii) management of gravitational wave detections and triggers to enable electromagnetic and astro-particle follow-ups; (iv) a vision to harness future developments of machine and deep learning and cyberinfrastructure resources to cope with the scale of discovery in the Big Data Era; and (v) the need to build a community that brings domain experts together with data scientists on equal footing to maximize and accelerate discovery in the nascent field of Multi-Messenger Astrophysics.
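To make item (i) above more tangible, the hedged sketch below trains a small one-dimensional convolutional network to separate noise-only time-series segments from segments containing a toy chirp-like signal. The synthetic data, the architecture, and the choice of framework are illustrative assumptions only and do not represent any pipeline discussed at the workshop.

```python
# Minimal sketch (illustrative only): a small 1-D CNN classifying fixed-length
# time-series segments as "noise" or "noise + toy chirp". Real multi-messenger
# pipelines differ in data, scale, and architecture.
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(2)
n_samples, seg_len = 4000, 512
t = np.linspace(0.0, 1.0, seg_len)

def toy_chirp(t, f0=20.0, f1=200.0):
    # Linearly increasing instantaneous frequency, crudely mimicking an inspiral.
    return np.sin(2.0 * np.pi * (f0 + (f1 - f0) * t / 2.0) * t)

X = rng.normal(size=(n_samples, seg_len))
y = rng.integers(0, 2, size=n_samples)
X[y == 1] += 0.5 * toy_chirp(t)           # inject the toy signal into half the set
X = X[..., None]                          # Conv1D expects (batch, steps, channels)

model = tf.keras.Sequential([
    tf.keras.layers.Conv1D(16, 16, activation="relu", input_shape=(seg_len, 1)),
    tf.keras.layers.MaxPooling1D(4),
    tf.keras.layers.Conv1D(32, 8, activation="relu"),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=5, batch_size=64, validation_split=0.2, verbose=2)
```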
To take advantage of the astrophysical potential of Gamma-Ray Bursts (GRBs), Chinese and French astrophysicists have undertaken the SVOM mission (Space-based multi-band astronomical Variable Objects Monitor). Because major advances in GRB studies result from the synergy between space- and ground-based observations, the SVOM mission implements both space and ground instrumentation. The scientific objectives of the mission put a special emphasis on two categories of GRBs: very distant GRBs at z$>$5, which constitute exceptional cosmological probes, and faint/soft nearby GRBs, which allow probing the nature of the progenitors and the physics at work in the explosion. These goals have a major impact on the design of the mission: the on-board hard X-ray imager is sensitive down to 4 keV and computes on-line image and rate triggers, and the follow-up telescopes on the ground are sensitive in the NIR. At the beginning of the next decade, SVOM will be the main provider of GRB positions and spectral parameters on very short time scales. The SVOM instruments will operate simultaneously with a wide range of powerful astronomical facilities. This rare instrumental conjunction, combined with the relevance of the scientific topics connected with GRB studies, warrants a remarkable scientific return for SVOM. In addition, the SVOM instrumentation, primarily designed for GRB studies, composes a unique multi-wavelength observatory with rapid slew capability that will find multiple applications for the whole astronomy community beyond the specific objectives linked to GRBs. This report lists the scientific themes that will benefit from observations made with SVOM, whether they are specific GRB topics or, more generally, all the issues that can take advantage of the multi-wavelength capabilities of SVOM.
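As a hedged illustration of what a count-rate trigger does in general, the short sketch below flags time bins whose counts rise significantly above a running background estimate. The bin counts, window length, and threshold are toy assumptions and are unrelated to the actual SVOM on-board algorithms.

```python
# Minimal sketch of a generic count-rate trigger (illustrative, not the SVOM
# on-board algorithm): flag time bins whose counts significantly exceed the
# estimated background, using a simple Gaussian approximation to Poisson noise.
import numpy as np

rng = np.random.default_rng(3)
n_bins, background_rate = 2000, 50.0              # counts per bin, hypothetical
counts = rng.poisson(background_rate, size=n_bins)
counts[1200:1210] += rng.poisson(40.0, size=10)   # inject a toy burst

window = 100           # trailing bins used to estimate the background
threshold_sigma = 5.0  # significance required to raise a trigger

for i in range(window, n_bins):
    bkg = counts[i - window:i].mean()
    sigma = (counts[i] - bkg) / np.sqrt(bkg)      # Gaussian approx. to Poisson
    if sigma > threshold_sigma:
        print(f"trigger at bin {i}: {counts[i]} counts, {sigma:.1f} sigma above background")
```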