The detection of X-ray emission lines from gamma-ray bursts (GRBs) is important for studying GRB physics and constraining GRB redshifts. Since a line-like feature in a GRB X-ray spectrum was first reported in 1999, several line-search studies have been published over the past two decades. Although some observations of X-ray line-like features have been performed, their significance remains controversial to date. In this paper, we utilize the down-Comptonization mechanism and present the time evolution of the Fe K$\alpha$ line emitted near the GRB central engine. The line intensity decreases with time, and the evolution depends on the electron density and the electron temperature. In addition, an initial line with a larger broadening decreases less over time. For instance, when a 1 keV emission line penetrates material with an electron density above $10^{12}$ cm$^{-3}$, it generally becomes too weak to be detected after about 100 s. The line profile deviates from the Gaussian form and finally approaches a blackbody shape once the line photons reach thermal equilibrium with the surrounding material.
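An order-of-magnitude sketch of the scattering rates behind the statement above: the mean number of Thomson scatterings a line photon suffers in a given time, and the fractional Doppler smearing per scattering for electrons at kT_e ~ 1 keV. The density, time, and photon energy are the illustrative values quoted in the abstract; the constants are standard, and the single-scattering smearing estimate is a rough assumption.

```python
# Rough estimate of why a narrow line is washed out in dense material.
SIGMA_T = 6.652e-25      # Thomson cross-section, cm^2
C = 2.998e10             # speed of light, cm/s
ME_C2_KEV = 511.0        # electron rest energy, keV

def n_scatterings(n_e, t_s):
    """Mean number of electron scatterings in t_s seconds at density n_e (cm^-3)."""
    return n_e * SIGMA_T * C * t_s

n_scat = n_scatterings(1e12, 100.0)     # ~2 scatterings within 100 s
doppler = (2.0 / ME_C2_KEV) ** 0.5      # ~6% energy smearing per scattering
                                        # for thermal electrons at kT_e = 1 keV
print(f"scatterings in 100 s: {n_scat:.1f}")
print(f"fractional Doppler width per scattering: {doppler:.3f}")
```

Even a few scatterings thus smear a narrow line by tens of percent in energy, consistent with the line becoming undetectable on the ~100 s timescale quoted above.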
We derive basic analytical results for the timing and decay of the GRB-counterpart and delayed-afterglow light-curves for a brief emission episode from a relativistic surface endowed with angular structure, consisting of a uniform Core of size theta_c (with angle-independent Lorentz factor Gamma_c and surface emissivity i_nu) and an axially symmetric power-law Envelope (Gamma ~ theta^{-g}). In this Large-Angle Emission (LAE) model, radiation produced during the prompt emission phase (GRB) at angles theta > theta_c arrives at the observer well after the burst (delayed emission). The dynamical time-range of the very fast-decaying GRB tail and of the flat afterglow plateau, and the morphology of the GRB counterpart/afterglow, are all determined by two parameters: the Core's parameter Gamma_c*theta_c and the Envelope's Lorentz-factor index g, leading to three types of light-curves that display three post-GRB phases (type 1: tail, plateau/slow-decay, post-plateau/normal-decay), two post-GRB phases (type 2: tail and fast-decay), or just one (type 3: normal decay). We show how X-ray light-curve features can be used to determine Core and Envelope dynamical and spectral parameters. The LAE model is tested using the Swift/XRT X-ray emission of two afterglows of type 1 (060607A, 061121), one of type 2 (061110A), and one of type 3 (061007). We find that X-ray afterglows with plateaus require an Envelope Lorentz factor Gamma ~ theta^{-2} and a comoving-frame emissivity i_nu ~ theta^2; thus, for a typical afterglow spectrum F_nu ~ nu^{-1}, the lab-frame energy release is uniform over the emitting surface.
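A minimal sketch of the large-angle-emission delay at the heart of this model: photons emitted at angle theta on a surface at radius R arrive later by roughly t(theta) ~ (R/2c)(Gamma(theta)^-2 + theta^2). The numerical values of R, Gamma_c, theta_c, and g below are illustrative placeholders, not the fitted parameters of the paper.

```python
# Core-Envelope structure: uniform Core inside theta_c, power-law Envelope outside.
C = 3e10  # speed of light, cm/s

def gamma(theta, gamma_c=300.0, theta_c=0.01, g=2.0):
    """Angular Lorentz-factor profile: constant Core, Gamma ~ theta^-g Envelope."""
    return gamma_c if theta <= theta_c else gamma_c * (theta / theta_c) ** (-g)

def delay(theta, R=1e15):
    """Approximate observer-frame arrival delay (s) for emission at angle theta."""
    return (R / (2.0 * C)) * (gamma(theta) ** -2 + theta ** 2)

for th in (0.0, 0.01, 0.03, 0.1):
    print(f"theta = {th:5.2f} rad -> delay ~ {delay(th):8.1f} s")
```

Emission from angles well outside the Core arrives hundreds to thousands of seconds after the burst, which is how a brief emission episode can power a long-lived tail and plateau.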
Plateaus are common in the X-ray afterglows of gamma-ray bursts. Among the several scenarios proposed for their origin, the leading one invokes a magnetar inside the remnant that persistently injects its spin-down energy into the afterglow. Previous studies assumed a constant radiation efficiency $\gtrsim 0.1$ for this process, which is a simple but strong assumption. In this work we derive the efficiency from a physical point of view and find that it depends strongly on the injected luminosity. One implication of this result is that X-ray afterglow light curves showing a temporal decay steeper than $t^{-2}$ after the plateau phase can now be naturally understood. In addition, the braking indices deduced from afterglow fitting are larger than those found in previous studies, which is more reasonable for newborn magnetars.
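The steepening effect can be illustrated with the standard dipole spin-down law, L_sd(t) = L_0 (1 + t/tau)^-2: a constant efficiency yields an asymptotic t^-2 flux decay, while an efficiency that decreases with the injected luminosity steepens it. The toy power-law efficiency eta ∝ L^q below is an illustrative assumption, not the paper's derived efficiency.

```python
import math

def l_sd(t, L0=1e48, tau=1e3):
    """Dipole spin-down luminosity (erg/s): L0 * (1 + t/tau)^-2."""
    return L0 * (1.0 + t / tau) ** -2

def flux_const_eff(t, eta=0.1):
    """Constant radiative efficiency (the assumption of previous studies)."""
    return eta * l_sd(t)

def flux_lum_dep_eff(t, q=0.5, L_ref=1e48):
    """Toy luminosity-dependent efficiency, eta ~ L^q (illustrative only)."""
    eta = 0.1 * (l_sd(t) / L_ref) ** q
    return eta * l_sd(t)

def slope(f, t1=1e4, t2=1e5):
    """Logarithmic decay slope d(log F)/d(log t) well after the plateau."""
    return math.log10(f(t2) / f(t1)) / math.log10(t2 / t1)

print(slope(flux_const_eff))    # approaches -2
print(slope(flux_lum_dep_eff))  # steeper than -2
```

With q = 0.5 the asymptotic flux scales as t^-3, showing how a luminosity-dependent efficiency naturally accommodates post-plateau decays steeper than t^-2.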
We report on a detailed spectral characterization of the non-thermal X-ray emission for a large sample of gamma-ray pulsars in the second Fermi-LAT catalogue. We outline the criteria adopted for the selection of our sample and its completeness, and critically describe different approaches to estimating the spectral shape and flux of pulsars. We perform a systematic modelling of the pulsars' X-ray spectra using archival observations with XMM-Newton, Chandra, and NuSTAR and extract the corresponding non-thermal X-ray spectral distributions. This data set is made available online and can be confronted with the predictions of theoretical models.
We aim to obtain a measure of the curvature of time-resolved spectra that can be compared directly to theory. This tests the ability of models such as synchrotron emission to explain the peaks or breaks of GBM prompt-emission spectra. We take the burst sample from the official Fermi GBM GRB time-resolved spectral catalogue and re-fit all spectra that have a measured peak or break energy with the catalogue best-fit models in various energy ranges covering the curvature around the spectral peak or break, resulting in a total of 1,113 spectra being analysed. We compute the sharpness angle at the peak or break of the triangle constructed under the model fit curves and compare it to the values obtained from several representative emission models: a blackbody, single-electron synchrotron, and synchrotron emission from a Maxwellian or a power-law electron distribution. We find that 35% of the time-resolved spectra are inconsistent with the single-electron synchrotron function, and 91% are inconsistent with the Maxwellian synchrotron function. The blackbody function, with a single temperature and a single emission time and location, is found to be sharper than all the spectra. No general evolutionary trend of the sharpness angle is observed, either per burst or for the whole population. We find that the limiting case, a single-temperature Maxwellian synchrotron function, can contribute at most $58^{+23}_{-18}$% of the peak flux. Our results show that even the sharpest, albeit unrealistic, case, the single-electron synchrotron function, cannot explain a large fraction of the observed GRB prompt spectra. Because any combination of physically possible synchrotron spectra added together will only further broaden the spectrum, emission mechanisms other than optically thin synchrotron radiation are likely required to fully explain the spectral peaks or breaks of the GRB prompt emission phase.
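A simplified numerical illustration of a sharpness-angle measurement: the angle at the nuF_nu peak of the triangle whose other two vertices lie one decade below the peak, computed in log-log space with equal scaling on both axes. This normalization, the one-decade criterion, and the two toy spectral shapes are illustrative assumptions; the catalogue's exact construction differs in detail.

```python
import math

def sharpness_angle(nuFnu, e_lo=0.1, e_hi=1e4, n=4000):
    """Angle (deg) at the nuFnu peak between the lines to the points one
    decade below the peak, measured in log10(E)-log10(nuFnu) space."""
    xs = [math.log10(e_lo) + i * math.log10(e_hi / e_lo) / n for i in range(n + 1)]
    ys = [math.log10(nuFnu(10 ** x)) for x in xs]
    ip = max(range(n + 1), key=lambda i: ys[i])          # peak index
    yp = ys[ip]
    il = next(i for i in range(ip, -1, -1) if ys[i] <= yp - 1)  # left vertex
    ir = next(i for i in range(ip, n + 1) if ys[i] <= yp - 1)   # right vertex
    v1 = (xs[il] - xs[ip], ys[il] - yp)
    v2 = (xs[ir] - xs[ip], ys[ir] - yp)
    cos_a = (v1[0] * v2[0] + v1[1] * v2[1]) / (math.hypot(*v1) * math.hypot(*v2))
    return math.degrees(math.acos(cos_a))

bb = lambda E, kT=100.0: E ** 4 / math.expm1(E / kT)       # blackbody nuFnu
broad = lambda E, E0=100.0: E * math.exp(-E / E0)          # broader toy spectrum

print(sharpness_angle(bb), sharpness_angle(broad))
```

The blackbody returns a markedly smaller (sharper) angle than the broader toy spectrum, mirroring the ordering of sharpness found in the analysis: sharper model curves leave less room for broadened emission mechanisms.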
X-ray absorption of $\gamma$-ray burst (GRB) afterglows is prevalent yet poorly understood. The neutral hydrogen column densities ($N_{\rm H}$) derived from GRB X-ray afterglows increase with redshift, which might give a clue to the origin of this absorption. We use more than 350 X-ray afterglows with spectroscopic redshifts ($z$) from the Swift XRT repository as well as over 100 Ly$\alpha$ absorption measurements in $z>1.6$ sources. The observed trend of the average optical depth $\tau$ at 0.5 keV is consistent both with a sharp increase of the host $N_{\rm H}(z)$ and with an absorbing diffuse intergalactic medium accompanied by a decreasing host contribution to $\tau$. We analyze a sub-sample of high-$z$ GRBs with $N_{\rm H}$ derived both from the X-ray afterglow and from the Ly$\alpha$ line. The increase of the X-ray derived $N_{\rm H}(z)$ is contrasted by the absence of any such increase in the Ly$\alpha$-derived column density. We argue that this discrepancy implies a lack of association between the X-ray and Ly$\alpha$ absorbers at high $z$, pointing towards the X-ray absorption at high $z$ being dominated by an intervening absorber and lending credibility to an absorbing intergalactic-medium contribution.
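A toy estimate connecting the two quantities used above, $\tau = N_{\rm H}\,\sigma_{\rm eff}(E)$ at 0.5 keV: the approximate photoabsorption cross-section per hydrogen atom, $\sigma \sim 2.3\times10^{-22}\,(E/{\rm keV})^{-8/3}$ cm$^2$, is a rough solar-abundance fit (normalization and slope are assumptions, in the spirit of Morrison & McCammon-type cross-sections), not the model used in the analysis.

```python
def sigma_eff(E_keV):
    """Approximate effective photoabsorption cross-section per H atom, cm^2."""
    return 2.3e-22 * E_keV ** (-8.0 / 3.0)

def tau(N_H, E_keV=0.5):
    """Optical depth through a column N_H (cm^-2) at energy E_keV."""
    return N_H * sigma_eff(E_keV)

for NH in (1e20, 1e21, 1e22):
    print(f"N_H = {NH:.0e} cm^-2 -> tau(0.5 keV) ~ {tau(NH):.2f}")
```

Under these assumptions, a column of $N_{\rm H} \sim 10^{21}$ cm$^{-2}$ already gives $\tau \sim 1$ at 0.5 keV, which is why modest intervening columns can dominate the soft X-ray absorption at high $z$.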