Internal shocks occurring in blazars may accelerate both thermal and non-thermal electrons. In this paper we examine the consequences that such a hybrid (thermal/non-thermal) electron energy distribution (EED) has on the spectrum of blazars. Since the thermal component of the EED may extend to very low energies, we replace the standard synchrotron process with the more general magneto-bremsstrahlung (MBS). Significant differences in the energy flux appear at low radio frequencies when considering MBS instead of standard synchrotron emission. When a hybrid EED is considered instead of a power-law EED, a drop in the spectrum appears across the entire radio band, together with a prominent valley between the infrared and soft X-ray bands. In the $\gamma$-ray band, an EED of mostly thermal particles displays significant differences with respect to one dominated by non-thermal particles. A thermally dominated EED produces a synchrotron self-Compton (SSC) peak extending only up to a few MeV, and the valley separating the MBS and SSC peaks is much deeper than if the EED is dominated by non-thermal particles. The combination of these effects modifies the Compton dominance of a blazar, suggesting that the vertical scatter in the distribution of FSRQs and BL Lac objects in the peak synchrotron frequency versus Compton dominance parameter space could be attributed to different proportions of thermal and non-thermal particles in the EED of blazars. Finally, the temperature of the electrons in the shocked plasma is shown to be a degenerate quantity for different magnetizations of the ejected material.
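A hybrid EED of the kind described above can be sketched numerically as a relativistic Maxwell–Jüttner thermal core plus a power-law tail. The sketch below is illustrative only: the temperature, tail fraction, spectral index, and the choice of matching point are hypothetical parameters, not values from the paper.

```python
import numpy as np

def trapz(y, x):
    """Simple trapezoidal integration (avoids NumPy-version API differences)."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

def hybrid_eed(gamma, theta=10.0, f_nth=0.1, p=2.5, g_max=1e5):
    """Hybrid EED: Maxwell-Juettner thermal core plus a power-law tail.
    theta = kT / (m_e c^2); f_nth = number fraction in the tail; p = tail index.
    All parameter values are illustrative, not fits to any source."""
    beta = np.sqrt(1.0 - 1.0 / gamma**2)
    # Unnormalized Maxwell-Juettner shape: gamma^2 * beta * exp(-gamma/theta)
    thermal = gamma**2 * beta * np.exp(-gamma / theta)
    thermal /= trapz(thermal, gamma)
    g_min = 3.0 * theta  # hypothetical choice: tail joins near the thermal peak
    tail = np.where((gamma >= g_min) & (gamma <= g_max), gamma**(-p), 0.0)
    tail /= trapz(tail, gamma)
    # Each component is normalized, so the mixture integrates to unity.
    return (1.0 - f_nth) * thermal + f_nth * tail

gamma = np.logspace(0.001, 5.0, 4000)  # Lorentz factors just above 1 up to 1e5
n = hybrid_eed(gamma)
print(f"normalization: {trapz(n, gamma):.3f}")  # ~1 by construction
```

Varying `f_nth` between 0 (purely thermal) and 1 (purely non-thermal) moves the distribution between the two regimes whose distinct SSC signatures the abstract describes.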
When two galaxy clusters encounter each other, the interaction results in a collisionless shock that is characterized by a low (1-4) sonic Mach number and a high Alfvénic Mach number. Our goal is to determine if, and to what extent, such shocks can accelerate particles to sufficient velocities that they can contribute to the cosmic ray spectrum. We combine two different computational methods, magnetohydrodynamics (MHD) and particle-in-cell (PIC), into a single code that allows us to take advantage of the high computational efficiency of MHD while maintaining the ability to model the behaviour of individual non-thermal particles. Using this method, we perform a series of simulations covering the expected parameter space of galaxy cluster collision shocks. Our results show that for shocks with a sonic Mach number below 2.25 no diffusive shock acceleration can take place because of a lack of instabilities in the magnetic field, whereas for shocks with a sonic Mach number $\geq 3$ the acceleration is efficient and can accelerate particles to relativistic speeds. In the regime between these two extremes, diffusive shock acceleration can occur but is relatively inefficient because of the time- and space-dependent nature of the instabilities. For those shocks that show efficient acceleration, the instabilities in the upstream gas grow to the point where they change the nature of the shock, which, in turn, will influence the particle injection process.
The development of instabilities leading to the formation of internal shocks is expected in the relativistic outflows of both gamma-ray bursts and blazars. The shocks heat the expanding ejecta, generate a tangled magnetic field and accelerate leptons to relativistic energies. While this scenario has been widely considered for the origin of the spectrum and the fast variability in gamma-ray bursts, here we consider it in the context of relativistic jets of blazars. We calculate the expected spectra, light curves and time correlations between emission at different wavelengths. The dynamical evolution of the wind explains the minimum distance for dissipation (~10^{17} cm) required to avoid $\gamma$--$\gamma$ collisions, and the low radiative efficiency required to transport most of the kinetic energy to the extended radio structures. The internal shock model allows us to follow the evolution of changes, both dynamical and radiative, along the entire jet, from the inner part, where the jet becomes radiative and emits at high energies ($\gamma$-jet), to the parsec scale, where the emission is mostly in the radio band (radio-jet). We have produced some animations, available at http://www.merate.mi.astro.it/~lazzati/3C279/, in which the temporal and spectral information is shown together.
Fermi acceleration can develop efficiently at relativistic collisionless shock waves provided the upstream (unshocked) plasma is weakly magnetized. At low magnetization, the large size of the shock precursor indeed provides enough time for electromagnetic micro-instabilities to grow, and such micro-instabilities generate small-scale turbulence that in turn provides the scattering required. The present paper extends our previous analysis on the development of these micro-instabilities to account for the finite angular dispersion of the beam of reflected and accelerated particles and for the expected heating of the upstream electrons in the shock precursor. We show that the oblique two-stream instability may operate down to values of the shock Lorentz factor $\gamma_{\rm sh} \sim 10$ as long as the electrons of the upstream plasma remain cold, while the filamentation instability is strongly inhibited in this limit; however, as electrons get heated to relativistic temperatures, the situation reverses: the two-stream instability becomes inhibited while the filamentation mode becomes efficient, even at moderate values of the shock Lorentz factor. The peak wavelength of these instabilities migrates from the electron inertial scale towards the proton inertial scale as the background electrons get progressively heated during the crossing of the shock precursor. We also discuss the role of current-driven instabilities upstream of the shock. In particular, we show that the returning/accelerated particles give rise to a transverse current through their rotation in the background magnetic field. We find that the compensating current in the background plasma can lead to a Buneman instability which provides an efficient source of electron heating. [Abridged]
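The migration of the peak wavelength from the electron to the proton inertial scale follows from the electron inertia being boosted by the mean thermal Lorentz factor, so that the effective electron skin depth grows as $\sqrt{\langle\gamma\rangle}$ and merges with the (cold) proton inertial length when $\langle\gamma\rangle \sim m_p/m_e$. A minimal numerical sketch, with a hypothetical upstream density and ignoring distribution-dependent prefactors:

```python
import numpy as np

# cgs constants
c = 2.998e10       # speed of light [cm/s]
e = 4.803e-10      # electron charge [esu]
m_e = 9.109e-28    # electron mass [g]
m_p = 1.673e-24    # proton mass [g]

def electron_skin_depth(n_e, mean_gamma=1.0):
    """Electron inertial length c/omega_pe, with the electron inertia boosted
    by the mean Lorentz factor of the thermal motion (a common approximation;
    exact prefactors depend on the shape of the distribution)."""
    omega_pe = np.sqrt(4 * np.pi * n_e * e**2 / (mean_gamma * m_e))
    return c / omega_pe

n_e = 1.0  # cm^-3, hypothetical upstream density
d_cold = electron_skin_depth(n_e)                       # cold electrons
d_hot = electron_skin_depth(n_e, mean_gamma=m_p / m_e)  # heated to ~ m_p c^2
d_ion = np.sqrt(m_p / m_e) * d_cold                     # proton inertial length

print(f"cold electron scale: {d_cold:.3e} cm")
print(f"hot electron scale:  {d_hot:.3e} cm (equals the proton scale)")
```

This is just the scaling argument behind the abstract's statement; the actual peak wavelengths of the two-stream and filamentation modes carry order-unity factors that require the kinetic calculation.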
(Abridged) The main purpose of this paper is to consider the contribution of all three non-thermal components to total mass measurements of galaxy clusters: cosmic ray, turbulent and magnetic pressures. To estimate the thermal pressure we used public XMM-\textit{Newton} archival data of 5 Abell clusters. To describe the magnetic pressure, we assume a radial distribution for the magnetic field, $B(r) \propto \rho_{g}^{\alpha}$; to seek generality we allow $\alpha$ to vary within the range 0.5 to 0.9, as indicated by observations and numerical simulations. For the turbulent component, we assumed an isotropic pressure, $P_{\rm turb} = \frac{1}{3}\rho_{\rm g}(\sigma_{r}^{2}+\sigma_{t}^{2})$. We also consider the contribution of the cosmic ray pressure, $P_{\rm cr} \propto r^{-0.5}$. It follows that a consistent description of the non-thermal components yields variations in mass estimates that range from 10% up to $\sim$30%. We verified that in the inner parts of cool-core clusters the cosmic ray component is comparable to the magnetic pressure, while in non-cool-core clusters the cosmic ray component is dominant. For cool-core clusters the magnetic pressure is the dominant component, contributing more than 50% of the total mass variation due to non-thermal pressure components. However, for non-cool-core clusters, the major influence comes from the cosmic ray pressure, which accounts for more than 80% of the total mass variation due to non-thermal pressure effects. For our sample, the maximum influence of the turbulent component on the total mass variation can be almost 20%. We show that this analysis can be regarded as a starting point for a more detailed and refined exploration of the influence of non-thermal pressure in the intra-cluster medium (ICM).
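The mass correction described above comes from adding the non-thermal pressure gradients to the hydrostatic equilibrium estimate, $M(<r) = -r^{2}\,(dP/dr)/(G\rho_{g})$. The sketch below illustrates the mechanics with a hypothetical beta-model density profile and hypothetical normalizations for the magnetic and cosmic ray terms; none of these numbers come from the paper's XMM-Newton sample.

```python
import numpy as np

G = 6.674e-8  # gravitational constant [cgs]

def rho_g(r, rho0=1e-26, rc=3e23, beta=0.6):
    """Hypothetical gas density: isothermal beta-model [g/cm^3]."""
    return rho0 * (1.0 + (r / rc)**2) ** (-1.5 * beta)

def p_thermal(r, kT_keV=5.0):
    """Ideal-gas thermal pressure for an isothermal ICM (illustrative kT)."""
    keV = 1.602e-9           # erg per keV
    mu_mp = 0.6 * 1.673e-24  # mean molecular weight times proton mass
    return rho_g(r) / mu_mp * kT_keV * keV

R0 = 3e23  # reference radius for the hypothetical normalizations [cm]

def p_magnetic(r, alpha=0.7, frac0=0.05):
    """B ~ rho^alpha  =>  P_B ~ rho^(2*alpha); frac0 of P_th at R0 (assumed)."""
    return frac0 * p_thermal(R0) * (rho_g(r) / rho_g(R0)) ** (2 * alpha)

def p_cr(r, frac0=0.05):
    """Cosmic ray pressure, P_cr ~ r^-0.5 as in the text (normalization assumed)."""
    return frac0 * p_thermal(R0) * (r / R0) ** (-0.5)

def hydrostatic_mass(r, pressure, dr=1e20):
    """M(<r) = -r^2 (dP/dr) / (G rho_g), central-difference gradient."""
    dpdr = (pressure(r + dr) - pressure(r - dr)) / (2.0 * dr)
    return -r**2 * dpdr / (G * rho_g(r))

r = 5e23  # ~160 kpc, illustrative radius
m_th = hydrostatic_mass(r, p_thermal)
m_tot = hydrostatic_mass(r, lambda x: p_thermal(x) + p_magnetic(x) + p_cr(x))
print(f"fractional mass correction: {(m_tot - m_th) / m_th:.2%}")
```

Because both non-thermal pressures decline outward, their gradients add to the thermal one and the corrected mass exceeds the thermal-only estimate; the size of the correction depends entirely on the assumed normalizations and on $\alpha$.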
Evidence for shocks in nova outflows includes (1) multiple velocity components in the optical spectra; (2) keV X-ray emission weeks to months after the outburst; (3) an early radio flare on timescales of months, in excess of that predicted from the freely expanding photo-ionized gas; and (4) ~ GeV gamma-rays. We present a 1D model for the shock interaction between the fast nova outflow and a dense external shell (DES) and its associated thermal X-ray, optical, and radio emission. The forward shock is radiative initially, when the density of shocked gas is highest, at which times radio emission originates from the dense cooling layer immediately downstream of the shock. The radio light curve is characterized by sharper rises to maximum and later peak times at progressively lower frequencies, with a peak brightness temperature that is approximately independent of frequency. We apply our model to the recent gamma-ray classical nova V1324 Sco, obtaining an adequate fit to the early radio maximum for reasonable assumptions about the fast nova outflow and assuming the DES possesses a velocity ~1e3 km/s and mass ~ 2e-4 M_sun; the former is consistent with the velocities of narrow-line absorption systems observed previously in nova spectra, while the total ejecta mass of the DES and fast outflow is consistent with that inferred independently by modeling the late radio peak. The rapid evolution of the early radio light curves requires that the DES possess a steep outer density profile, which may indicate that the onset of mass loss from the white dwarf was rapid, providing indirect evidence that the DES was expelled by the thermonuclear runaway event. Reprocessed X-rays from the shock absorbed by the DES at early times may contribute significantly to the optical/UV emission, which we speculate is responsible for the previously unexplained `plateaus' and secondary maxima in nova optical light curves.