Gamma-ray emission at energies >100 MeV has been detected from nine novae using the Fermi-LAT, and it can be explained by particle acceleration at shocks in these systems. Eight of these nine objects are classical novae, in which interaction of the ejecta with tenuous circumbinary material is not expected to generate detectable gamma-ray emission. We examine whether particle acceleration at internal shocks can account for the gamma-ray emission from these novae. The shocks result from the interaction of a fast wind, radiatively driven by nuclear burning on the white dwarf, with material ejected in the initial runaway stage of the nova outburst. We present a one-dimensional model for the dynamics of a forward and reverse shock system in the nova ejecta, and for the associated time-dependent particle acceleration and high-energy gamma-ray emission. Non-thermal proton and electron spectra are calculated by solving a time-dependent transport equation for particle injection, acceleration, losses, and escape from the shock region. The predicted emission is compared to LAT observations of V407 Cyg, V1324 Sco, V959 Mon, V339 Del, V1369 Cen, and V5668 Sgr. The >100 MeV gamma-ray emission arises predominantly from particles accelerated up to ~100 GeV at the reverse shock and undergoing hadronic interactions in the dense cooling layer downstream of the shock. The internal shock model can account for the gamma-ray emission of the novae detected by Fermi-LAT, including the main features in the observations of the recent gamma-ray nova ASASSN-16ma. Gamma-ray observations hold potential for probing the mechanism of mass ejection in novae, but should be combined with diagnostics of the thermal emission at lower energies to be more constraining. (abridged version)
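The kind of time-dependent transport equation mentioned above can be illustrated with a minimal numerical sketch. The Python snippet below is not the authors' code: the timescales, the toy loss law, and the injection energy are assumptions chosen only for illustration. It evolves a particle spectrum under injection, acceleration, energy losses, and escape, and shows how a cutoff forms where the loss time drops below the acceleration time.

    import numpy as np

    # Toy version of a time-dependent transport equation,
    #   dN/dt = Q(E) - d/dE[ Edot(E) N ] - N / t_esc ,
    # with Edot = E/t_acc - E/t_loss(E) combining acceleration and losses.
    # All numbers below are illustrative assumptions, not the paper's values.
    E = np.logspace(-3, 3, 300)              # particle energy grid [GeV]
    dE = np.gradient(E)                      # bin widths [GeV]
    N = np.zeros_like(E)                     # particle spectrum N(E, t)

    t_acc = 1.0e4                            # acceleration timescale [s] (assumed)
    t_esc = 1.0e5                            # escape timescale [s] (assumed)
    t_loss = 1.0e6 / E                       # toy loss time ~ 1/E [s] (assumed)

    Q = np.zeros_like(E)                     # injection of suprathermal particles
    Q[np.argmin(np.abs(E - 1.0e-2))] = 1.0   # injected near 10 MeV (assumed)

    Edot = E / t_acc - E / t_loss            # net energy-change rate [GeV/s]
    dt = 0.5 * np.min(dE / (np.abs(Edot) + 1e-30))   # CFL-limited time step [s]

    for _ in range(int(5 * t_esc / dt)):
        flux = Edot * N
        # Upwind differencing of d/dE (Edot N): backward difference where
        # particles gain energy (Edot > 0), forward where they lose it (Edot < 0).
        back = np.zeros_like(E)
        back[1:] = (flux[1:] - flux[:-1]) / dE[1:]
        fwd = np.zeros_like(E)
        fwd[:-1] = (flux[1:] - flux[:-1]) / dE[:-1]
        N = N + dt * (Q - np.where(Edot > 0, back, fwd) - N / t_esc)
        N = np.maximum(N, 0.0)               # guard against round-off negatives

    # The spectrum cuts off where t_loss ~ t_acc (here ~100 GeV by construction).
    print("E^2 N(E) peaks near %.0f GeV" % E[np.argmax(E**2 * N)])

In this toy setup the steady-state slope is set by the ratio t_acc/t_esc and the maximum energy by the balance between acceleration and losses; the resulting spectrum would then have to be folded with proton-proton interaction cross sections to predict the gamma-ray output.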
Gamma-ray bursts are short-lived, luminous explosions at cosmological distances, thought to originate from relativistic jets launched at the deaths of massive stars. They are among the prime candidates for producing the observed cosmic rays at the highest energies. Recent neutrino data have, however, started to constrain this possibility in the simplest models with only one emission zone. In the classical theory of gamma-ray bursts, particles are expected to be accelerated at mildly relativistic shocks generated by collisions of material ejected from a central engine. Since these internal collisions must occur at very different radii, from below the photosphere all the way out to the circumburst medium, as a consequence of the efficient dissipation of kinetic energy, we consider neutrino and cosmic-ray emission from multiple emission regions. We demonstrate that the different messengers originate from different collision radii, which means that multi-messenger observations open a window for revealing the evolving GRB outflows.
The prompt emission of most gamma-ray bursts (GRBs) typically exhibits a non-thermal Band component. Synchrotron radiation in the popular internal shock model is generally put forward to explain such a non-thermal component. However, the low-energy photon index $\alpha \sim -1.5$ predicted by synchrotron radiation is inconsistent with the observed value $\alpha \sim -1$. Here, we investigate the evolution of the magnetic field during the propagation of internal shocks within an ultrarelativistic outflow, and revisit the fast cooling of shock-accelerated electrons via synchrotron radiation in this evolving magnetic field. We find that the magnetic field is first nearly constant and then decays as $B \propto t^{-1}$, which leads to a reasonable range of the low-energy photon index, $-3/2 < \alpha < -2/3$. In addition, if a rising electron injection rate during a GRB is introduced, we find that $\alpha$ reaches $-2/3$ more easily. We thus fit the prompt emission spectra of GRB 080916c and GRB 080825c.
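For reference, the two limiting values quoted above follow from standard synchrotron conventions (Band photon index $\alpha$ defined by the photon number spectrum $N(E_\gamma) \propto E_\gamma^{\alpha}$, so $F_\nu \propto \nu^{\alpha+1}$). The short LaTeX block below is added here only as a reminder of those limits; it is not taken from the paper itself.

    % Standard synchrotron limits for the Band low-energy photon index alpha,
    % defined by N(E_gamma) ~ E_gamma^alpha (so F_nu ~ nu^{alpha+1}):
    \[
      \text{fast cooling (constant } B\text{):}\quad
        F_\nu \propto \nu^{-1/2} \;\Rightarrow\; \alpha = -\tfrac{3}{2}, \qquad
      \text{uncooled low-energy segment:}\quad
        F_\nu \propto \nu^{1/3} \;\Rightarrow\; \alpha = -\tfrac{2}{3}.
    \]
    % A magnetic field decaying as B ~ t^{-1} moves the effective low-energy
    % slope between these two limits, giving -3/2 < alpha < -2/3 as quoted above.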
In the neutron-rich internal shocks model for gamma-ray bursts (GRBs), the Lorentz factors (LFs) of ion shells are variable, and so are the LFs of the accompanying neutron shells. For slow neutron shells with a typical LF of a few tens, the typical $\beta$-decay radius is $\sim 10^{14}-10^{15}$ cm. If a GRB lasts long enough [$T_{90}>14(1+z)$ s], one earlier but slower ejected neutron shell will be swept up successively by later ejected ion shells in the range $\sim 10^{13}-10^{15}$ cm, where the slow neutrons have decayed significantly. Part of the thermal energy released in the interaction will be given to electrons. These accelerated electrons will be cooled mainly by the prompt soft $\gamma$-rays and give rise to GeV emission. This GeV emission is particularly important for some very long GRBs and is detectable by the upcoming satellite, the {\it Gamma-Ray Large Area Space Telescope} (GLAST).
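The quoted decay radius can be checked with one line of arithmetic: in the lab frame, a neutron shell with Lorentz factor gamma_n survives out to R_beta ~ gamma_n * c * tau_n, with tau_n ~ 880 s the neutron rest-frame mean lifetime. A minimal sketch (the Lorentz factors below are illustrative choices, not values from the paper):

    # Quick check of the beta-decay radius quoted above (illustrative only):
    # R_beta ~ gamma_n * c * tau_n for a slow neutron shell.
    c = 3.0e10            # speed of light [cm/s]
    tau_n = 880.0         # neutron mean lifetime in its rest frame [s]
    for gamma_n in (10, 30, 100):          # assumed Lorentz factors ("tens")
        print(f"gamma_n = {gamma_n:3d}:  R_beta ~ {gamma_n * c * tau_n:.1e} cm")
    # -> a few 1e14 to a few 1e15 cm, consistent with the 1e14-1e15 cm range above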
The Fermi LAT discovery that classical novae produce >100 MeV gamma-rays establishes that shocks and relativistic particle acceleration are key features of these events. These shocks are likely to be radiative due to the high densities of the nova ejecta at the early times coincident with the gamma-ray emission. Thermal X-rays radiated behind the shock are absorbed by neutral gas and reprocessed into optical emission, similar to Type IIn (interacting) supernovae. Gamma-rays are produced by collisions of relativistic protons with the nova ejecta (hadronic scenario) or by inverse Compton/bremsstrahlung emission from relativistic electrons (leptonic scenario); in both scenarios the efficiency for converting relativistic particle energy into LAT gamma-rays is at most a few tens of per cent. The ratio of gamma-ray to optical luminosity, L_gam/L_opt, thus sets a lower limit on the fraction of the shock power used to accelerate relativistic particles, e_nth. The measured values of L_gam/L_opt for two classical novae, V1324 Sco and V339 Del, constrain e_nth > 1e-2 and > 1e-3, respectively. Inverse Compton models for the gamma-ray emission are disfavored given the low electron acceleration efficiency, e_nth ~ 1e-4-1e-3, inferred from observations of Galactic cosmic rays and from particle-in-cell (PIC) numerical simulations. A fraction > 100(0.01/e_nth) and > 10(0.01/e_nth) per cent of the optical luminosity is powered by shocks in V1324 Sco and V339 Del, respectively. Such high fractions challenge standard models that instead attribute all nova optical emission to the direct outward transport of thermal energy released near the white dwarf surface.
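The luminosity-ratio argument above can be made explicit with a few lines of arithmetic. In the sketch below, the L_gam/L_opt ratios are placeholders chosen only so that the printed limits match those quoted in the abstract, and the 20 per cent gamma-ray production efficiency is an assumed upper value; the bound e_nth >= (L_gam/L_opt)/eff_gam follows from L_gam = eff_gam * e_nth * L_shock together with L_shock <= L_opt.

    # Illustrative version of the efficiency argument above (not the paper's code).
    # With L_gam = eff_gam * e_nth * L_shock and L_shock <= L_opt, the observed
    # ratio L_gam/L_opt bounds the non-thermal efficiency: e_nth >= ratio / eff_gam.
    eff_gam = 0.2                          # assumed max efficiency of turning
                                           # particle energy into LAT gamma rays
    cases = [("V1324 Sco-like", 2.0e-3),   # placeholder L_gam/L_opt ratios, chosen
             ("V339 Del-like", 2.0e-4)]    # only to reproduce the quoted limits
    for name, ratio in cases:
        e_nth_min = ratio / eff_gam                # lower limit on e_nth
        shock_frac = ratio / (eff_gam * 0.01)      # L_shock/L_opt if e_nth = 0.01
        print(f"{name}: e_nth > {e_nth_min:.0e}; "
              f"shock-powered optical fraction > {shock_frac:.0%} for e_nth = 0.01")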
Context. It has been suggested that the bow shocks of runaway stars are sources of high-energy gamma rays (E > 100 MeV). Theoretical models predicting high-energy gamma-ray emission from these sources were followed by the first detections of non-thermal radio emission from the bow shock of BD+43$^\circ$ 3654 and of non-thermal X-ray emission from the bow shock of AE Aurigae. Aims. We perform the first systematic search for MeV and GeV emission from 27 bow shocks of runaway stars using data collected by the Large Area Telescope (LAT) onboard the Fermi Gamma-ray Space Telescope (Fermi). Methods. We analysed 57 months of Fermi-LAT data at the positions of 27 bow shocks of runaway stars extracted from the Extensive stellar BOw Shock Survey catalogue (E-BOSS). A likelihood analysis was performed to search for gamma-ray emission that is not compatible with diffuse background or emission from neighbouring sources and that could be associated with the bow shocks. Results. None of the bow shock candidates is detected significantly in the Fermi-LAT energy range. We therefore present upper limits on the high-energy emission between 100 MeV and 300 GeV, in four energy bands, for the 27 bow shocks of runaway stars. For the three cases with published models of the high-energy emission, we compare our upper limits to the modelled spectra. Our limits exclude the model predictions for Zeta Ophiuchi by a factor of $\approx 5$.
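The final comparison step amounts to a ratio per energy band: a published model spectrum is excluded at the level by which it overshoots the corresponding LAT upper limit. The sketch below uses entirely hypothetical bands and fluxes (none of them are the E-BOSS or LAT values) just to show the bookkeeping.

    # Hypothetical numbers only: compare a predicted model flux with the derived
    # Fermi-LAT upper limit in each band; the model is excluded where the ratio > 1.
    model_flux = {"0.1-1 GeV": 5.0e-9, "1-10 GeV": 1.0e-9}    # [ph cm^-2 s^-1], assumed
    upper_lim = {"0.1-1 GeV": 1.0e-9, "1-10 GeV": 4.0e-10}    # [ph cm^-2 s^-1], assumed
    for band in model_flux:
        factor = model_flux[band] / upper_lim[band]
        print(f"{band}: model/UL = {factor:.1f} -> " +
              ("excluded" if factor > 1.0 else "allowed"))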