We consider a model in which the ultra-relativistic jet in a gamma-ray burst (GRB) is cold and magnetically accelerated. We assume that the energy flux in the outflowing material is partially thermalized via internal shocks or a reverse shock, and we estimate the maximum amount of radiation that could be produced in such magnetized shocks. We compare this estimate with the available observational data on prompt gamma-ray emission in GRBs. We find that, even with highly optimistic assumptions, the magnetized jet model is radiatively too inefficient to be consistent with observations. One way out is to assume that much of the magnetic energy in the post-shock, or even pre-shock, jet material is converted to particle thermal energy by some unspecified process, and then radiated. This can increase the radiative efficiency sufficiently to fit observations. Alternatively, jet acceleration may be driven by thermal pressure rather than magnetic fields. In this case, which corresponds to the traditional fireball model, sufficient prompt GRB emission could be produced either from shocks at a large radius or from the jet photosphere closer to the center.
Some Quantum Gravity (QG) theories allow for a violation of Lorentz invariance (LIV), manifesting as a dependence of the velocity of light in vacuum on its energy. If such a dependence exists, then photons of different energies emitted together by a distant source will arrive at the Earth at different times. High-energy (GeV) transient emission from distant astrophysical sources such as Gamma-ray Bursts (GRBs) and Active Galactic Nuclei can be used to search for and constrain LIV. The Fermi collaboration has previously analyzed two GRBs in order to place constraints on the vacuum dispersion parameter and on the energy scale at which QG effects causing LIV may arise. We applied three different methods to four bright GRBs observed by the Fermi-LAT to obtain more stringent and robust constraints. No delays have been detected and strong limits on the QG energy scale are derived: for linear dispersion we set tight constraints placing the QG energy scale above the Planck mass; a quadratic leading-order LIV effect is also constrained.
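As a guide to the quantity being constrained, the arrival-time delay between two photons of energies $E_h > E_l$ emitted together at redshift $z$ is commonly parametrised at leading order $n$ (a standard sketch in the Jacob & Piran convention; the analyses may adopt slightly different conventions) as
$$
\Delta t \simeq \frac{1+n}{2 H_0}\,\frac{E_h^{\,n}-E_l^{\,n}}{E_{\mathrm{QG},n}^{\,n}}\int_0^z \frac{(1+z')^{\,n}}{\sqrt{\Omega_m (1+z')^3+\Omega_\Lambda}}\,\mathrm{d}z',
$$
with $n=1$ (linear) or $n=2$ (quadratic) and $\Delta t > 0$ in the subluminal case; a non-detection of dispersion thus translates directly into a lower limit on $E_{\mathrm{QG},n}$.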
Recent analytical and numerical work argues that successful relativistic Fermi acceleration requires a weak magnetization of the unshocked plasma, all the more so at high Lorentz factors. The present paper tests this conclusion by computing the afterglow of a gamma-ray burst outflow propagating in a magnetized stellar wind, using ab initio principles regarding the microphysics of relativistic Fermi acceleration. It is shown that in magnetized environments one expects a drop-out in the X-ray band on sub-day timescales, as the synchrotron emission of the shock-heated electrons exits the frequency band. At later times, Fermi acceleration becomes operative once the blast Lorentz factor drops below a certain critical value, leading to the recovery of the standard afterglow light curve. Interestingly, the observed drop-out bears resemblance to the fast decay seen in the early X-ray afterglows of gamma-ray bursts.
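As an illustrative scaling only (a sketch assuming an adiabatic blast wave decelerating in a wind medium with $\rho \propto r^{-2}$, and leaving the critical Lorentz factor $\Gamma_c$ symbolic), the blast Lorentz factor evolves as
$$
\Gamma \propto r^{-1/2} \propto t_{\mathrm{obs}}^{-1/4},
$$
so the condition $\Gamma < \Gamma_c$ for Fermi acceleration to become operative is met at an observer time $t_{\mathrm{rec}} \propto \Gamma_c^{-4}$, after which the standard afterglow light curve is recovered.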
It is now more than 40 years since the discovery of gamma-ray bursts (GRBs), and in the last two decades there has been major progress in observations of the bursts, their afterglows and their host galaxies. This recent progress has been fueled by the ability of gamma-ray telescopes to quickly localise GRBs and by rapid follow-up observations with multi-wavelength instruments in space and on the ground. A total of 674 GRBs have been localised to date using the coded aperture masks of the four gamma-ray missions BeppoSAX, HETE II, INTEGRAL and Swift. As a result there are now high-quality observations of more than 100 GRBs, including afterglows and host galaxies, revealing the richness of this field and the progress within it. The observations of GRBs cover more than 20 orders of magnitude in energy, from 10^-5 eV to 10^15 eV, as well as two non-electromagnetic channels, neutrinos and gravitational waves. However, continued progress relies on space-based instruments to detect GRBs, rapidly localise them and distribute their coordinates.
We analyze the MeV/GeV emission from four bright Gamma-Ray Bursts (GRBs) observed by the Fermi Large Area Telescope to produce robust, stringent constraints on a dependence of the speed of light in vacuo on the photon energy (vacuum dispersion), a form of Lorentz invariance violation (LIV) allowed by some Quantum Gravity (QG) theories. First, we use three different and complementary techniques to constrain the total degree of dispersion observed in the data. Additionally, using a maximally conservative set of assumptions on possible source-intrinsic spectral-evolution effects, we constrain any vacuum dispersion solely attributed to LIV. We then derive limits on the QG energy scale (the energy scale at which LIV-inducing QG effects become important, E_QG) and on the coefficients of the Standard Model Extension. For the subluminal case (where high-energy photons propagate more slowly than lower-energy photons) and without taking into account any source-intrinsic dispersion, our most stringent limits (at 95% CL) are obtained from GRB 090510 and are E_{QG,1} > 7.6 times the Planck energy (E_Pl) and E_{QG,2} > 1.3 x 10^11 GeV for linear and quadratic leading-order LIV-induced vacuum dispersion, respectively. These limits improve the latest constraints by Fermi and H.E.S.S. by a factor of ~2. Our results disfavor any class of models requiring E_{QG,1} ≲ E_Pl.
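To make the scaling of such limits concrete, the sketch below inverts the standard linear-dispersion (n = 1) time-delay formula to turn an upper limit on the total dispersion into a lower limit on E_QG,1. All inputs (redshift, photon energy, |Δt| bound, cosmological parameters) are illustrative placeholders rather than the values adopted in the analysis.

# Sketch: lower limit on E_QG,1 from an upper limit on the total time dispersion,
# using the standard (Jacob & Piran) linear-dispersion formula. The numbers below
# (redshift, photon energy, |dt| limit, cosmology) are illustrative placeholders,
# not the values adopted in the published analysis.
import numpy as np
from scipy.integrate import quad

H0 = 70.0 * 1.0e5 / 3.086e24      # Hubble constant in s^-1 (70 km/s/Mpc, assumed)
Om, OL = 0.3, 0.7                  # flat LCDM density parameters (assumed)

def K1(z):
    """Redshift integral for linear (n=1) vacuum dispersion."""
    integrand = lambda zp: (1.0 + zp) / np.sqrt(Om * (1.0 + zp)**3 + OL)
    return quad(integrand, 0.0, z)[0]

def eqg1_lower_limit(z, E_high_GeV, dt_limit_s):
    """Lower limit on E_QG,1 (GeV) implied by |Delta t| < dt_limit_s for a
    photon of energy E_high_GeV from redshift z (low-energy photon ~ 0)."""
    return E_high_GeV * K1(z) / (H0 * dt_limit_s)

if __name__ == "__main__":
    z, E_high, dt_max = 0.903, 31.0, 0.03   # placeholder inputs
    E_Pl = 1.22e19                           # Planck energy in GeV
    limit = eqg1_lower_limit(z, E_high, dt_max)
    print(f"E_QG,1 > {limit:.2e} GeV  (= {limit / E_Pl:.1f} E_Pl)")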
Theories of gravity that obey the Weak Equivalence Principle have the same Parametrised Post-Newtonian parameter $\gamma$ for all particles at all energies. The large Shapiro time delays of extragalactic sources allow us to place tight constraints on differences in $\gamma$ between photons of different frequencies from spectral-lag data, since a non-zero $\Delta\gamma$ would result in a frequency-dependent arrival time. The majority of previous constraints have assumed that the Shapiro time delay is dominated by a few local massive objects, although this is a poor approximation for distant sources. In this work we consider the cosmological context of these sources by developing a source-by-source, Monte Carlo-based forward model for the Shapiro time delays, combining constrained realisations of the local density field from the BORG algorithm with unconstrained large-scale modes. Propagating uncertainties in the density-field reconstruction and marginalising over an empirical model describing other contributions to the time delay, we use spectral-lag data of Gamma-Ray Bursts from BATSE to constrain $\Delta\gamma < 3.4 \times 10^{-15}$ at $1\sigma$ confidence between photon energies of $25\,\mathrm{keV}$ and $325\,\mathrm{keV}$.
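For context, the parametrised post-Newtonian Shapiro delay accumulated along the line of sight through the (negative) gravitational potential $U$, and the differential delay that a non-zero $\Delta\gamma$ between two photon energies would induce, take the schematic form
$$
t_{\mathrm{Shapiro}} = -\frac{1+\gamma}{c^3}\int_{\mathrm{source}}^{\mathrm{observer}} U\big(r(l)\big)\,\mathrm{d}l,
\qquad
\Delta t \simeq \frac{|\Delta\gamma|}{c^3}\left|\int U\,\mathrm{d}l\right|,
$$
so constraining $\Delta\gamma$ requires an estimate of the line-of-sight potential integral, which is what the forward model of the density field provides.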