The observed delay in the arrival times between high- and low-energy photons in gamma-ray bursts (GRBs) has been shown by Norris et al. to be correlated with the absolute luminosity of a GRB. Despite the apparent importance of this spectral lag, there has yet to be a full explanation of its origin. We propose that the lag is directly due to the evolution of the GRB spectrum. In particular, as the energy at which the GRB $\nu F_{\nu}$ spectrum is a maximum ($E_{pk}$) decays through the four BATSE channels, the photon-flux peak in each individual channel is inevitably offset, producing what we measure as lag. We test this hypothesis by measuring the rate of $E_{pk}$ decay ($\Phi_{o}$) for a sample of clean, single-peaked bursts with measured lag. We find a direct correlation between the decay timescale and the spectral lag, demonstrating the relationship between the time delay of the low-energy photons and the decay of $E_{pk}$. This implies that the luminosity of a GRB is directly related to the burst's rate of spectral evolution, which we believe begins to reveal the underlying physics behind the lag-luminosity correlation. We discuss several possible mechanisms that could cause the observed evolution and its connection to the luminosity of the burst.
Using a pulse-fit method, we investigate the spectral lags between the traditional gamma-ray band (50-400 keV) and the X-ray band (6-25 keV) for 8 GRBs with known redshifts (GRB 010921, GRB 020124, GRB 020127, GRB 021211, GRB 030528, GRB 040924, GRB
Violations of Lorentz invariance can lead to an energy-dependent vacuum dispersion of light, which results in arrival-time differences between photons of different energies emitted by a given transient source. In this work, direction-dependent dispers
The spectral lags of gamma-ray bursts (GRBs) have been viewed as the most promising probes of the possible violations of Lorentz invariance (LIV). However, these constraints usually depend on the assumption of the unknown intrinsic time lag in differ
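To make the caveat above concrete, a standard parametrization (a sketch of the usual framework, not text from the abstract) separates the observed lag into a redshifted intrinsic term and an LIV term; for subluminal dispersion of order $n$ in flat $\Lambda$CDM,

```latex
\begin{align}
  \Delta t_{\rm obs} &= (1+z)\,\Delta t_{\rm int} + \Delta t_{\rm LIV},\\
  \Delta t_{\rm LIV} &\simeq \frac{1+n}{2H_{0}}\,
      \frac{E_{\rm h}^{\,n}-E_{\rm l}^{\,n}}{E_{\rm QG}^{\,n}}
      \int_{0}^{z}\frac{(1+z')^{\,n}\,dz'}
                       {\sqrt{\Omega_{m}(1+z')^{3}+\Omega_{\Lambda}}},
\end{align}
```

where $E_{\rm h}$ and $E_{\rm l}$ are the two photon energies and $E_{\rm QG}$ the quantum-gravity energy scale. Any limit on $E_{\rm QG}$ therefore inherits whatever assumption is made for the unknown intrinsic lag $\Delta t_{\rm int}$, which is precisely the degeneracy these constraints must address.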
As starburst galaxies show a star formation rate up to several hundred times larger than that of a typical galaxy, the expected supernova rate is higher than average. This in turn implies a high rate of long gamma-ray bursts (GRBs), which are extr
We present the first systematic investigation of spectral properties of 17 Type Ic Supernovae (SNe Ic), 10 broad-lined SNe Ic (SNe Ic-bl) without observed Gamma-Ray Bursts (GRBs) and 11 SNe Ic-bl with GRBs (SN-GRBs) as a function of time in order to