Luminosity Selection for Gamma Ray Bursts


Abstract in English

There is an inevitable scatter in the intrinsic luminosity of Gamma Ray Bursts (GRBs). If the source emission is relativistically beamed, variation in viewing angle necessarily introduces scatter into the intrinsic luminosity function (ILF). Scatter in the ILF can cause a selection bias in which the distant sources that are detected have a larger median luminosity than those detected nearby. The median luminosity, by definition, divides any given population into two equal halves; when the functional form of a distribution is unknown, it can be a more robust diagnostic than tests that rely on trial functional forms. In this work we employ a statistical test based on the median luminosity and apply it to a class of GRB models. We assume that the GRB jet has a finite opening angle and that its orientation relative to the observer is random, and we parameterize the jet by a constant Lorentz factor $\Gamma$ and opening angle $\theta_0$. We calculate $L_{median}$ as a function of redshift, with an average of 17 GRBs per redshift bin ($dz=0.01$), both empirically and theoretically, using Fermi GBM data; Swift data are problematic because they are biased, especially at high redshifts. We find that $L_{median}$ is close to $L_{max}$ for a sufficiently extended GRB jet and does not fit the data. We find an acceptable fit to the data when $\Gamma$ is between $100$ and $200$ and $\theta_0 \leq 0.1$, provided that the jet material along the line of sight to an on-axis observer is optically thick, so that the shielded maximum luminosity is well below the bare $L_{max}$. If we associate on-axis observers with classically predicted, monotonically decreasing afterglows, we find that their ILF is similar to that of off-jet observers, whom we associate with flat-phase afterglows.
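In practice, the median-luminosity test described above amounts to binning the detected bursts by redshift and taking the median of their inferred luminosities in each bin. The following is a minimal sketch of that bookkeeping, written under stated assumptions: the function name, the input arrays z and L, and the synthetic sample at the end are placeholders introduced here for illustration and are not part of the original analysis; only the bin width $dz=0.01$ is taken from the abstract.

    import numpy as np

    def median_luminosity_vs_redshift(z, L, dz=0.01, min_per_bin=1):
        """Median luminosity of detected bursts in redshift bins of width dz.

        z, L        : arrays of redshifts and luminosities of detected GRBs
        dz          : redshift bin width (0.01 in the abstract)
        min_per_bin : skip bins containing fewer bursts than this
        Returns bin centres and the median luminosity of each populated bin.
        """
        z = np.asarray(z, dtype=float)
        L = np.asarray(L, dtype=float)
        edges = np.arange(z.min(), z.max() + dz, dz)
        centres, medians = [], []
        for lo, hi in zip(edges[:-1], edges[1:]):
            in_bin = (z >= lo) & (z < hi)
            if in_bin.sum() >= min_per_bin:
                centres.append(0.5 * (lo + hi))
                medians.append(np.median(L[in_bin]))
        return np.array(centres), np.array(medians)

    # Illustrative usage with synthetic placeholder values (not real Fermi GBM data):
    rng = np.random.default_rng(0)
    z_sample = rng.uniform(0.1, 3.0, 5000)            # redshifts
    L_sample = 10.0 ** rng.normal(51.0, 1.0, 5000)    # luminosities in erg/s, purely illustrative
    z_centres, L_median = median_luminosity_vs_redshift(z_sample, L_sample)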