GeV emission from Gamma Ray Bursts: a radiative fireball?


Abstract

We study the emission observed at energies above 100 MeV from the 11 Gamma-Ray Bursts (GRBs) detected by the Fermi Large Area Telescope (LAT) up to October 2009. The GeV emission has three main properties: (i) its duration is often longer than that of the softer emission detected by the Gamma-ray Burst Monitor (GBM) onboard Fermi, confirming earlier results from the Energetic Gamma-Ray Experiment Telescope (EGRET); (ii) its spectrum is consistent with $F(\nu) \propto \nu^{-1}$ and shows no strong spectral evolution; (iii) for the brightest bursts, the flux detected by the LAT decays as a power law with a typical slope $t^{-1.5}$. We argue that the flux observed above 0.1 GeV can be interpreted as afterglow emission that begins shortly after the onset of the prompt emission seen at lower frequencies. The decay slope is what is expected if the fireball emission is produced in the radiative regime, i.e. if all the dissipated energy is radiated away. We also argue that detectability in the GeV energy range depends on the bulk Lorentz factor $\Gamma$ of the burst, being strongly favoured for large $\Gamma$. This implies that the fraction of bursts detected at high energies corresponds to the fraction of bursts with the largest $\Gamma$. The radiative interpretation can help to explain why the observed X-ray and optical afterglow energetics are much smaller than those of the prompt phase, despite the fact that the collision with the external medium should be more efficient than internal shocks in producing the radiation we see.
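
For orientation, the quoted $t^{-1.5}$ slope can be compared with the textbook radiative-fireball scaling. The sketch below is a simplified order-of-magnitude argument, not a derivation taken from the paper: it assumes a fully radiative blast wave decelerating in a uniform external medium, with $M_0$ denoting the initial fireball mass and $R$ the blast-wave radius. Momentum conservation with all shocked internal energy radiated gives $\Gamma \propto R^{-3}$, and the observer time satisfies

$$t_{\rm obs} \sim \frac{R}{2\Gamma^{2}c} \propto R^{7} \;\;\Longrightarrow\;\; \Gamma \propto t_{\rm obs}^{-3/7},$$

so the radiated luminosity, drawn from the remaining kinetic energy $E_{\rm kin} \sim \Gamma M_0 c^{2}$, decays as

$$L \sim -\frac{{\rm d}E_{\rm kin}}{{\rm d}t_{\rm obs}} \propto -\frac{{\rm d}\Gamma}{{\rm d}t_{\rm obs}} \propto t_{\rm obs}^{-10/7} \simeq t_{\rm obs}^{-1.43},$$

close to the observed $t^{-1.5}$. In the adiabatic regime one has $\Gamma \propto R^{-3/2}$ instead, and only a fraction of the dissipated energy is radiated, so the predicted decay is shallower.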
