Neural networks and standard cosmography with newly calibrated high redshift GRB observations


Abstract

Gamma-ray bursts (GRBs) detected at high redshift can be used to trace the cosmic expansion history. However, calibrating their luminosity distances is not as straightforward as for Type Ia Supernovae (SNeIa). To calibrate these data, correlations between the luminosity of GRBs and their other observed properties must be identified, and we must consider the validity of our assumptions about these correlations over their entire observed redshift range. In this work, we propose a new method to calibrate GRBs as cosmological distance indicators using SNeIa observations with a completely model-independent deep learning architecture. An overview of this machine learning technique was developed in [1] to study the evolution of dark energy models at high redshift. The method developed in this work combines two networks: a Recurrent Neural Network (RNN) and a Bayesian Neural Network (BNN). Using this computational approach, denoted RNN+BNN, we extend the network's efficacy by adding the computation of covariance matrices to the Bayesian process. Once this is done, the SNeIa distance-redshift relation can be tested on the full GRB sample and then used to implement a cosmographic reconstruction of the distance-redshift relation in different regimes. Thus, our newly trained neural network is used to constrain the parameters describing the kinematical state of the Universe via a cosmographic approach at high redshifts (up to $z \approx 10$), requiring only a minimal set of assumptions that do not rely on the dynamical equations of any specific theory of gravity.
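For reference, the standard cosmographic expansion of the luminosity distance alluded to in the abstract can be written, to third order in redshift, in terms of the Hubble constant $H_0$, the deceleration parameter $q_0$, and the jerk parameter $j_0$ (a sketch of the conventional expansion; the truncation order actually used in the paper is not stated here):

$$
d_L(z) = \frac{cz}{H_0}\left[1 + \frac{1}{2}\left(1 - q_0\right)z - \frac{1}{6}\left(1 - q_0 - 3q_0^2 + j_0\right)z^2 + \mathcal{O}(z^3)\right].
$$

Fitting such an expansion to calibrated distances constrains the kinematical parameters $(H_0, q_0, j_0, \ldots)$ without assuming the dynamical equations of any specific gravity theory.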
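The Bayesian step described above can be illustrated with a minimal Monte Carlo dropout sketch: repeated stochastic forward passes through a network with dropout active at test time yield a distribution of predictions, from which a predictive mean and covariance matrix over the distance-redshift relation can be computed. This is only a toy illustration under assumed placeholder weights, not the paper's trained RNN+BNN architecture.

```python
# Hypothetical sketch: Monte Carlo dropout as an approximate Bayesian
# neural network, producing a predictive covariance matrix over a
# distance-like quantity mu(z). Weights are random placeholders.
import numpy as np

rng = np.random.default_rng(0)

# Toy single-hidden-layer network mapping redshift z -> mu(z)
W1 = rng.normal(size=(1, 64))
b1 = rng.normal(size=64)
W2 = rng.normal(size=(64, 1))
b2 = rng.normal(size=1)

def forward(z, keep_prob=0.8):
    """One stochastic forward pass with dropout kept on at inference."""
    h = np.tanh(z[:, None] @ W1 + b1)
    mask = rng.random(h.shape) < keep_prob   # random dropout mask
    h = h * mask / keep_prob                 # inverted-dropout scaling
    return (h @ W2 + b2).ravel()

z_grid = np.linspace(0.1, 10.0, 20)          # GRB-like redshift range
draws = np.stack([forward(z_grid) for _ in range(500)])

mu_mean = draws.mean(axis=0)                 # predictive mean over z_grid
cov = np.cov(draws, rowvar=False)            # predictive covariance matrix

print(mu_mean.shape, cov.shape)              # (20,) (20, 20)
```

The covariance matrix estimated this way is what would feed into the subsequent cosmographic fit, propagating the network's predictive uncertainty rather than only pointwise error bars.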
