X-ray absorption of $\gamma$-ray burst (GRB) afterglows is prevalent yet poorly understood. Neutral hydrogen column densities ($N_{\rm H}$) derived from GRB X-ray afterglows increase with redshift, which may provide a clue to the origin of this absorption. We use more than 350 X-ray afterglows with spectroscopic redshift ($z$) from the Swift XRT repository, as well as over 100 Ly$\alpha$ absorption measurements in $z>1.6$ sources. The observed trend of the average optical depth $\tau$ at 0.5 keV is consistent both with a sharp increase of the host $N_{\rm H}(z)$ and with an absorbing diffuse intergalactic medium accompanied by a decreasing host contribution to $\tau$. We analyze a sub-sample of high-$z$ GRBs with $N_{\rm H}$ derived both from the X-ray afterglow and from the Ly$\alpha$ line. The increase of the X-ray-derived $N_{\rm H}(z)$ is not matched by any such increase in the Ly$\alpha$-derived column density. We argue that this discrepancy implies a lack of association between the X-ray and Ly$\alpha$ absorbers at high $z$. This points towards the X-ray absorption at high $z$ being dominated by an intervening absorber, which lends credibility to an absorbing intergalactic medium contribution.
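For orientation (a schematic relation, not taken from the paper), approximating the photoelectric cross-section in the soft X-ray band as a single power law, $\sigma(E) \propto E^{-8/3}$, the observed-frame optical depth produced by a host column at redshift $z$ is
$$ \tau_{\rm host}(0.5\,{\rm keV}) \simeq N_{\rm H}\,\sigma\!\left[0.5\,(1+z)\,{\rm keV}\right] \propto N_{\rm H}\,(1+z)^{-8/3}, $$
so a non-declining observed $\tau(0.5\,{\rm keV})$ requires either a host $N_{\rm H}(z)$ that grows steeply with redshift or additional absorption along the line of sight outside the host, which is the alternative weighed here.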
Spectropolarimetric measurements of gamma-ray burst (GRB) optical afterglows contain polarization information for both the continuum and the absorption lines. Owing to the Zeeman effect, an absorption line in a strong magnetic field is polarized and split into a triplet. In this paper, we solve the polarization radiative transfer equations of the absorption lines and obtain the degree of linear polarization of the lines as a function of the optical depth. In order to effectively measure the degree of linear polarization of the absorption lines, a magnetic field strength of at least $10^3$ G is required. The metal elements that produce the polarized absorption lines should be sufficiently abundant and have large oscillator strengths or Einstein absorption coefficients. We encourage both polarization measurements and high-dispersion observations of the absorption lines in order to detect the triplet structure in early GRB optical afterglows.
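As an illustrative estimate (not from the paper), the normal Zeeman triplet has its $\sigma$ components shifted from line center by
$$ \Delta\nu = \frac{eB}{4\pi m_e c} \approx 1.4\,{\rm MHz}\left(\frac{B}{1\,{\rm G}}\right), \qquad \Delta\lambda = \frac{\lambda^{2}\Delta\nu}{c} \approx 0.012\,{\rm \AA}\left(\frac{\lambda}{5000\,{\rm \AA}}\right)^{2}\left(\frac{B}{10^{3}\,{\rm G}}\right), $$
so even at $B \sim 10^{3}$ G the splitting of an optical line is only $\sim 0.01$ Å, corresponding to a resolving power of a few $\times 10^{5}$, which is why high-dispersion spectroscopy is needed to see the triplet directly.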
We derive basic analytical results for the timing and decay of the GRB-counterpart and delayed-afterglow light curves for a brief emission episode from a relativistic surface endowed with angular structure, consisting of a uniform Core of size $\theta_c$ (with angle-independent Lorentz factor $\Gamma_c$ and surface emissivity $i_\nu$) and an axially symmetric power-law Envelope ($\Gamma \sim \theta^{-g}$). In this Large-Angle Emission (LAE) model, radiation produced during the prompt emission phase (GRB) at angles $\theta > \theta_c$ arrives at the observer well after the burst (delayed emission). The dynamical time range of the very fast-decaying GRB tail and of the flat afterglow plateau, and the morphology of the GRB counterpart/afterglow, are all determined by two parameters: the Core's parameter $\Gamma_c \theta_c$ and the Envelope's Lorentz-factor index $g$, leading to three types of light curves that display three post-GRB phases (type 1: tail, plateau/slow decay, post-plateau/normal decay), two post-GRB phases (type 2: tail and fast decay), or just one (type 3: normal decay). We show how X-ray light-curve features can be used to determine Core and Envelope dynamical and spectral parameters. Testing of the LAE model is done using the Swift/XRT X-ray emission of two afterglows of type 1 (060607A, 061121), one of type 2 (061110A), and one of type 3 (061007). We find that the X-ray afterglows with plateaus require an Envelope Lorentz factor $\Gamma \sim \theta^{-2}$ and a comoving-frame emissivity $i_\nu \sim \theta^{2}$; thus, for a typical afterglow spectrum $F_\nu \sim \nu^{-1}$, the lab-frame energy release is uniform over the emitting surface.
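A standard kinematic relation (not specific to this paper) underlies the delayed emission: a photon emitted at radius $R$ from a direction making angle $\theta \gg 1/\Gamma$ with the line of sight arrives later than one emitted on-axis by
$$ t(\theta) \simeq (1+z)\,\frac{R}{c}\,(1-\cos\theta) \approx (1+z)\,\frac{R\,\theta^{2}}{2c}, $$
which is why prompt radiation from the Envelope at $\theta > \theta_c$ is seen as the post-GRB tail and plateau in the LAE model.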
The afterglow emission from gamma-ray bursts (GRBs) is believed to originate from a relativistic blast wave driven into the circumburst medium. Although the afterglow emission from radio up to X-ray frequencies is thought to arise from synchrotron radiation emitted by relativistic, non-thermal electrons accelerated by the blast wave, the origin of the emission at high energies (HE; $\gtrsim$~GeV) remains uncertain. The recent detection of sub-TeV emission from GRB~190114C by MAGIC raises further debate on what powers the very-high-energy (VHE; $\gtrsim 300$~GeV) emission. Here, we explore the inverse Compton scenario as a candidate for the HE and VHE emission, considering two sources of seed photons for scattering: synchrotron photons from the blast wave (synchrotron self-Compton, or SSC) and isotropic photon fields external to the blast wave (external Compton). For each case, we compute the multi-wavelength afterglow spectra and light curves. We find that SSC will dominate particle cooling and the GeV emission, unless a dense ambient infrared photon field, typical of star-forming regions, is present. Additionally, taking into account attenuation by the extragalactic background light, we discuss the detectability of VHE afterglows by existing and future gamma-ray instruments for a wide range of model parameters. Studying GRB~190114C, we find that its afterglow emission in the Fermi-LAT band is synchrotron-dominated. The late-time Fermi-LAT measurement (i.e., $t\sim 10^4$~s) and the MAGIC observation also set an upper limit on the energy density of a putative external infrared photon field (i.e., $\lesssim 3\times 10^{-9}\,{\rm erg\,cm^{-3}}$), making inverse Compton emission dominant at sub-TeV energies.
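Schematically (an order-of-magnitude criterion rather than the paper's full calculation), in the Thomson regime the inverse-Compton to synchrotron power ratio per electron is set by the comoving photon-to-magnetic energy-density ratio,
$$ \frac{P_{\rm IC}}{P_{\rm syn}} \simeq \frac{u'_{\rm syn}+u'_{\rm ext}}{u'_{B}}, \qquad u'_{\rm ext} \approx \Gamma^{2}\,u_{\rm ext}, $$
so an external (e.g. infrared) field competes with SSC only once its boosted energy density $\Gamma^{2} u_{\rm ext}$ becomes comparable to the synchrotron photon energy density behind the shock; this is the sense in which a dense ambient infrared field is needed for external Compton to matter.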
We study thermal emission from circumstellar structures heated by gamma-ray burst (GRB) radiation and ejecta, and calculate its contribution to GRB optical and X-ray afterglows using the modified radiation hydrodynamics code STELLA. It is shown that thermal emission originating in heated dense shells around the GRB progenitor star can reproduce X-ray plateaus (like those observed in GRBs 050904 and 070110) as well as the deviations from power-law fading observed in the optical afterglows of some GRBs (e.g. 020124, 030328, 030429X, 050904). Thermal radiation pressure in the heated circumburst shell dominates the gas pressure, producing rapid, supernova-like expansion of matter near jumps in opacity or radiation flux density in the circumburst medium. This phenomenon can be responsible for the so-called supernova bumps in the optical afterglows of several GRBs. Such a `quasi-supernova' suggests an interpretation of the GRB-SN connection that does not directly involve the explosion of the GRB progenitor star.
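As a rough criterion (illustrative only; the actual shell conditions follow from the radiation-hydrodynamics models, and this assumes the radiation field in the shell is thermalized), radiation pressure exceeds gas pressure in a heated shell when
$$ \frac{aT^{4}/3}{n\,k_{\rm B}T} > 1 \;\Longleftrightarrow\; n \lesssim \frac{aT^{3}}{3k_{\rm B}} \approx 2\times 10^{13}\left(\frac{T}{10^{4}\,{\rm K}}\right)^{3}\,{\rm cm^{-3}}, $$
so even rather dense circumstellar shells become radiation-pressure dominated once the burst heats them, driving the rapid, supernova-like expansion described above.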
For gamma-ray bursts (GRBs) with a plateau phase in the X-ray afterglow, a so-called $L-T-E$ correlation has been found which tightly connects the isotropic energy of the prompt GRB ($E_{\gamma,\rm iso}$) with the end time of the X-ray plateau ($T_{a}$) and the corresponding X-ray luminosity at that time ($L_{X}$). Here we show that there is a clear redshift evolution in the correlation. Furthermore, since the power-law indices of $L_{X}$ and $E_{\gamma,\rm iso}$ in the correlation function are almost identical, the $L-T-E$ correlation is insensitive to the cosmological parameters and cannot be used as a satisfactory standard candle. On the other hand, based on a sample of 121 long GRBs, we establish a new three-parameter correlation that connects $L_{X}$, $T_{a}$ and the spectral peak energy $E_{\rm p}$, i.e. the $L-T-E_{\rm p}$ correlation. This correlation strongly supports the so-called Combo-relation established by Izzo et al. (2015). After correcting for the redshift evolution, we show that the de-evolved $L-T-E_{\rm p}$ correlation can be used as a standard candle. By using this correlation alone, we are able to constrain the cosmological parameters as $\Omega_{m}=0.389^{+0.202}_{-0.141}$ ($1\sigma$) for the flat $\Lambda$CDM model, or $\Omega_{m}=0.369^{+0.217}_{-0.191}$, $w=-0.966^{+0.513}_{-0.678}$ ($1\sigma$) for the flat $w$CDM model. Combined with other cosmological probes, more accurate constraints on the cosmological models are obtained.
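To make the standard-candle step concrete, the following is a minimal sketch (with toy data, an assumed fixed $H_0$, and none of the paper's sample, calibration, or Monte Carlo machinery) of how distance moduli inferred from a de-evolved $L-T-E_{\rm p}$ relation could constrain $\Omega_m$ in a flat $\Lambda$CDM model:

```python
# Illustrative sketch (not the authors' pipeline): constraining Omega_m in
# flat LambdaCDM with GRB distance moduli from a calibrated correlation.
# All data arrays below are hypothetical placeholders.
import numpy as np
from scipy.integrate import quad

C_KM_S = 2.99792458e5   # speed of light [km/s]
H0 = 70.0               # Hubble constant [km/s/Mpc], assumed fixed here

def lum_dist_mpc(z, omega_m):
    """Luminosity distance in Mpc for a flat LambdaCDM cosmology."""
    integrand = lambda zp: 1.0 / np.sqrt(omega_m * (1 + zp)**3 + 1 - omega_m)
    dc, _ = quad(integrand, 0.0, z)
    return (1 + z) * (C_KM_S / H0) * dc

# Hypothetical GRB sample: redshift, distance modulus inferred from the
# de-evolved correlation, and its 1-sigma uncertainty.
z_obs  = np.array([0.5, 1.0, 2.0, 3.5, 5.0])
mu_obs = np.array([42.3, 44.1, 45.9, 47.3, 48.2])
mu_err = np.array([0.4, 0.4, 0.5, 0.5, 0.6])

def chi2(omega_m):
    """Chi-square of the toy sample against the model distance moduli."""
    mu_model = np.array([5 * np.log10(lum_dist_mpc(z, omega_m)) + 25
                         for z in z_obs])
    return np.sum(((mu_obs - mu_model) / mu_err)**2)

# Brute-force 1-D grid over Omega_m.
grid = np.linspace(0.05, 1.0, 96)
chi2_vals = np.array([chi2(om) for om in grid])
best = grid[np.argmin(chi2_vals)]
print(f"best-fit Omega_m (toy data): {best:.2f}")
```

A full analysis would fit the correlation coefficients, the intrinsic scatter, and the cosmological parameters simultaneously (or calibrate the relation externally), rather than fixing them as in this toy example.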