Astrophysical explosions such as supernovae are fascinating events that require sophisticated algorithms and substantial computational power to model. Castro and MAESTROeX are nuclear astrophysics codes that simulate thermonuclear fusion in the context of supernovae and X-ray bursts. Examining these nuclear burning processes using high resolution simulations is critical for understanding how these astrophysical explosions occur. In this paper we describe the changes that have been made to these codes to transform them from standard MPI + OpenMP codes targeted at petascale CPU-based systems into a form compatible with the pre-exascale systems now online and the exascale systems coming soon. We then discuss what new science can now be carried out on systems such as Summit and Perlmutter that could not have been achieved on the previous generation of supercomputers.
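As a loose illustration of the general porting pattern summarized above (per-rank work offloaded to a GPU while MPI continues to handle inter-rank communication), the following Python sketch uses mpi4py and CuPy. It is not code from Castro or MAESTROeX, which are C++/AMReX applications; the zone arrays, the toy burning rate, and all numerical values are hypothetical.

```python
# Hypothetical sketch: MPI handles domain decomposition across ranks,
# while each rank offloads its local zone updates to a GPU via CuPy.
from mpi4py import MPI
import cupy as cp

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

n_local = 1_000_000                            # zones owned by this rank (toy number)
temp = cp.random.uniform(1e8, 1e9, n_local)    # zone temperatures [K] on the GPU
rho = cp.random.uniform(1e5, 1e7, n_local)     # zone densities [g/cm^3] on the GPU

def burn(temp, rho, dt):
    """Toy stand-in for a nuclear burning update, evaluated element-wise on the GPU."""
    rate = 1e-18 * rho * cp.exp(-1.5e9 / temp)  # illustrative temperature-sensitive rate
    return rate * dt                            # energy released per zone (arbitrary units)

energy = burn(temp, rho, dt=1e-6)

# Reduce the total energy release across all MPI ranks on the host.
local_sum = float(energy.sum().get())
total = comm.reduce(local_sum, op=MPI.SUM, root=0)
if rank == 0:
    print(f"total energy released: {total:.3e}")
```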
Performance tools for forthcoming heterogeneous exascale platforms must address two principal challenges when analyzing execution measurements. First, measurement of extreme-scale executions generates large volumes of performance data. Second, performance metrics for heterogeneous applications are highly sparse across code regions. To address these challenges, we developed a novel streaming aggregation approach to post-mortem analysis that employs both shared and distributed memory parallelism to aggregate sparse performance measurements from every rank, thread and GPU stream of a large-scale application execution. Analysis results are stored in a pair of sparse formats designed for efficient access to related data elements, supporting responsive interactive presentation and scalable data analytics. Empirical analysis shows that our implementation of this approach in HPCToolkit effectively processes measurement data from thousands of threads using a fraction of the compute resources employed by the application itself. Our approach performs analysis up to 9.4 times faster and produces analysis results 23 times smaller than HPCToolkit's previous analysis implementation, providing a key building block for scalable exascale performance tools.
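The sparse-aggregation idea can be sketched as follows; this is a minimal illustration under assumed (context, metric) keys, not HPCToolkit's actual data format. Each thread reports metric values only for the code regions it actually executed, and the analysis merges these sparse contributions incrementally rather than materializing a dense regions-by-threads matrix.

```python
# Minimal sketch of streaming aggregation of sparse performance metrics.
# Each profile maps a (context_id, metric_name) pair to a value only where the
# metric is nonzero; the aggregator merges profiles one at a time.
from collections import defaultdict

def aggregate(profiles):
    """Merge sparse per-thread profiles into summary statistics per context/metric."""
    total = defaultdict(float)   # sum of metric values
    count = defaultdict(int)     # number of threads contributing a nonzero value
    for profile in profiles:     # profiles can be consumed as a stream
        for key, value in profile.items():
            total[key] += value
            count[key] += 1
    return total, count

# Toy profiles: thread 0 spent time in contexts 3 and 7, thread 1 only in context 7.
thread0 = {(3, "cycles"): 1.2e9, (7, "cycles"): 4.0e8}
thread1 = {(7, "cycles"): 9.5e8, (7, "gpu_ops"): 2.1e6}

totals, counts = aggregate([thread0, thread1])
print(totals[(7, "cycles")], counts[(7, "cycles")])   # 1.35e9 across 2 threads
```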
Numerical simulations on high performance computing systems are today among the most effective instruments for implementing and studying new theoretical models, and they are mandatory during both the preparatory and operational phases of any scientific experiment. New challenges in Cosmology and Astrophysics will require a large number of new, extremely computationally intensive simulations to investigate physical processes at different scales. Moreover, the size and complexity of the new generation of observational facilities also imply a new generation of high performance data reduction and analysis tools, pushing toward the use of Exascale computing capabilities. Exascale supercomputers, however, cannot yet be built. We discuss the major technological challenges in the design, development and use of such computing capabilities, and we report on the progress that has been made in recent years in Europe, in particular within the framework of the European-funded ExaNeSt project. We also discuss the impact of these new computing resources on numerical codes in Astronomy and Astrophysics.
The origin of the elements is a fascinating question that scientists have been trying to answer for the last seven decades. The formation of light elements in the primordial universe and of heavier elements in astrophysical sources occurs through nuclear reactions. Nuclear processes are thus responsible for the production of energy and the synthesis of elements in the various astrophysical sites. Nuclear reactions therefore play a determining role in the existence and evolution of many astrophysical environments, from the Sun to the spectacular explosions of supernovae. Nuclear astrophysics attempts to address the most basic and important questions of our existence and future. Many issues remain unresolved, such as how stars and our Galaxy formed and how they evolve, how and where the heaviest elements are made, what the abundances of nuclei in the universe are, what the nucleosynthesis output of the various production processes is, and why the amount of lithium-7 observed is less than predicted. In this paper, we review our current understanding of the different astrophysical nuclear processes leading to the formation of chemical elements, paying particular attention to the formation of heavy elements during high-energy astrophysical events. Thanks to the recent multi-messenger observation of a binary neutron star merger, which also confirmed the production of heavy elements, explosive scenarios such as short gamma-ray bursts and the subsequent kilonovae are now strongly supported as nucleosynthesis sites.
In this review, we emphasize the interplay between astrophysical observations, modeling, and nuclear physics laboratory experiments. Several nuclear cross sections important for astrophysics have long been identified, e.g. 12C(alpha,gamma)16O for stellar evolution, or 13C(alpha,n)16O and 22Ne(alpha,n)25Mg as neutron sources for the s-process. More recently, observations of lithium abundances in the oldest stars, or of nuclear gamma-ray lines from space, have required new laboratory experiments. New evaluations of thermonuclear reaction rates now include the associated rate uncertainties, which are used in astrophysical models to i) estimate the final uncertainties on nucleosynthesis yields and ii) identify those reactions that require further experimental investigation. Sometimes direct cross section measurements are possible, but more generally the use of indirect methods is compulsory in view of the very low cross sections. Non-thermal processes are often overlooked but are also important for nuclear astrophysics, e.g. in gamma-ray emission from solar flares or in the interaction of cosmic rays with matter, and they also motivate laboratory experiments. Finally, we show that beyond the historical motivations of nuclear astrophysics, understanding i) the energy sources that drive stellar evolution and ii) the origin of the elements can also provide new insights into physics beyond the standard model.
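The way rate uncertainties feed into astrophysical models can be illustrated with a schematic Monte Carlo sketch. The lognormal parameterization of a factor uncertainty is a common convention in rate evaluations, but the single-reaction "yield" below is purely a toy stand-in for a real nucleosynthesis network, and all numerical values are invented for illustration.

```python
# Schematic Monte Carlo propagation of a reaction-rate uncertainty to a "yield".
# Illustrative only: a toy exponential burn of a seed abundance by one reaction
# stands in for a full network calculation.
import numpy as np

rng = np.random.default_rng(42)

median_rate = 1.0e-3        # adopted median reaction rate (arbitrary units)
factor_unc = 1.5            # factor uncertainty (rate uncertain by a factor of 1.5)
n_samples = 10_000

# Lognormal sampling: ln(rate) ~ Normal(ln(median), ln(factor_unc))
rates = median_rate * np.exp(rng.normal(0.0, np.log(factor_unc), n_samples))

def toy_yield(rate, abundance0=1.0, time=1.0e3):
    """Toy 'model': exponential destruction of a seed abundance by one reaction."""
    return abundance0 * np.exp(-rate * time)

yields = toy_yield(rates)
lo, med, hi = np.percentile(yields, [16, 50, 84])
print(f"yield = {med:.3f} (+{hi - med:.3f} / -{med - lo:.3f})")
```

The spread of the sampled yields estimates the final abundance uncertainty, and repeating the exercise with one rate varied at a time indicates which reactions dominate that uncertainty and thus deserve further measurement.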
Astro-COLIBRI is a novel tool that evaluates alerts of transient observations in real time, filters them by user-specified criteria, and puts them into their multiwavelength and multimessenger context. Through fast generation of an overview of persistent sources as well as transient events in the relevant phase space, Astro-COLIBRI contributes to an enhanced discovery potential of both serendipitous and follow-up observations of the transient sky. The software's architecture comprises a Representational State Transfer (REST) Application Programming Interface, both a static and a real-time database, a cloud-based alert system, and a website and apps for iOS and Android as clients for users. The latter provide a graphical representation with a summary of the relevant data to allow for fast identification of interesting phenomena, along with an assessment of observing conditions at a large selection of observatories around the world.
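As a hedged illustration of how a client might interact with such a REST interface, a script could poll for recent transient alerts and filter them by user-specified criteria before display. The base URL, endpoint, and JSON fields below are hypothetical placeholders, not Astro-COLIBRI's documented API.

```python
# Hypothetical client sketch for a transient-alert REST API.
# The base URL, endpoint, and JSON fields are placeholders and would need to be
# replaced with the service's documented interface.
import requests

BASE_URL = "https://example.org/api"          # placeholder, not the real service URL

def fetch_recent_events(max_events=20, timeout=10):
    """Request the most recent transient events from the (hypothetical) endpoint."""
    response = requests.get(
        f"{BASE_URL}/recent_events",
        params={"limit": max_events},
        timeout=timeout,
    )
    response.raise_for_status()
    return response.json()

def filter_events(events, event_type="GRB", min_significance=5.0):
    """Apply simple user-specified criteria, mirroring client-side alert filtering."""
    return [
        e for e in events
        if e.get("type") == event_type and e.get("significance", 0.0) >= min_significance
    ]

if __name__ == "__main__":
    events = fetch_recent_events()
    for e in filter_events(events):
        print(e.get("name"), e.get("ra"), e.get("dec"), e.get("time"))
```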