Anisotropic thermal conduction plays an important role in various astrophysical systems, and one of the most stringent tests of thermal conduction can be found in supernova remnants. In this paper we study anisotropic thermal conduction and examine the physical nature of the conductive flux in the classical and saturated limits. We also present a temporally second-order accurate implicit-explicit scheme for the time-update of thermal conduction terms within a numerical MHD scheme. Several simulations of supernova remnants are presented for a range of ISM parameters in order to assess the role of thermal conduction in their evolution. We find that thermal conduction produces cooler temperatures and higher densities in the hot gas bubbles that form in the remnants, and its effect on the thermal characteristics of the bubble increases as the remnant propagates through a denser ISM. Remnants evolving in denser environments are shown to make a faster transition to a centre-bright X-ray morphology, with the trend emerging earlier in hard X-rays than in soft X-rays.
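For reference, the classical and saturated conduction fluxes are usually written in the standard Spitzer and saturated (Cowie & McKee) forms; this is a schematic sketch, and the paper's exact coefficients and notation may differ:
\[
\mathbf{q}_{\rm class} = -\kappa_\parallel\,\hat{\mathbf{b}}\,(\hat{\mathbf{b}}\cdot\nabla T), \qquad \kappa_\parallel \propto T^{5/2}, \qquad q_{\rm sat} \simeq 5\,\phi\,\rho\,c_{\rm s}^3,
\]
where $\hat{\mathbf{b}}$ is the unit vector along the magnetic field, $c_{\rm s}$ is the isothermal sound speed, and $\phi \lesssim 1$ parametrizes the uncertainty in the saturated limit. The saturated flux caps the classical flux wherever the temperature scale length becomes comparable to the electron mean free path.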
We model the hydrodynamic interaction of a shock wave of an evolved supernova remnant with a small interstellar gas cloud like the ones observed in the Cygnus Loop and in the Vela SNR. We investigate the interplay between radiative cooling and thermal conduction during cloud evolution and their effect on the mass and energy exchange between the cloud and the surrounding medium. Through the study of two cases characterized by different Mach numbers of the primary shock (M = 30 and 50, corresponding to post-shock temperatures $T \approx 1.7\times 10^6$ K and $\approx 4.7\times 10^6$ K, respectively), we explore two very different physical regimes: for M = 30, radiative losses dominate the evolution of the shocked cloud, which fragments into cold, dense, and compact filaments surrounded by a hot corona that is ablated by the thermal conduction; for M = 50, instead, the thermal conduction dominates the evolution of the shocked cloud, which evaporates within a few dynamical time-scales. In both cases we find that the thermal conduction is very effective in suppressing the hydrodynamic instabilities that would develop at the cloud boundaries.
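The quoted post-shock temperatures follow the standard strong-shock scaling; as a consistency check (using the Rankine-Hugoniot temperature jump for $\gamma = 5/3$, not necessarily the exact setup of the paper):
\[
\frac{T_2}{T_1} = \frac{(5M^2-1)(M^2+3)}{16\,M^2} \;\longrightarrow\; \frac{5M^2}{16} \qquad (M \gg 1),
\]
so $T_2 \propto M^2$ for a fixed ambient temperature $T_1$, in agreement with the quoted ratio $4.7\times 10^6 / 1.7\times 10^6 \approx (50/30)^2$; the absolute values depend on the assumed ambient temperature and composition.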
(Abridged) Heating of the interstellar medium by the explosions of multiple supernovae (SNe) is at the heart of producing galaxy-scale outflows. We use hydrodynamical simulations to study the efficiency of multiple SNe in heating the interstellar medium (ISM) and filling the volume with gas of high temperatures. We argue that it is important for SN remnants to have a large filling factor {\it and} a large heating efficiency. For this, they have to be clustered in space and time, and keep exploding until the hot gas percolates through the whole region, in order to compensate for the radiative loss. In the case of a limited number of SNe, we find that although the filling factor can be large, the heating efficiency declines after reaching a large value. In the case of a continuous series of SNe, the hot gas ($T \ge 3 \times 10^6$ K) can percolate through the whole region after the total volume filling factor reaches a threshold of $\sim 0.3$. The efficiency of heating the gas to X-ray temperatures can be $\ge 0.1$ after this percolation epoch, which occurs after a period of $\approx 10$ Myr for a typical starburst SN rate density of $\nu_{\rm SN} \approx 10^{-9}$ pc$^{-3}$ yr$^{-1}$ and gas density of $n \approx 10$ cm$^{-3}$ in starburst nuclei regions. This matches the recent observations of a time delay of similar order between the onset of star formation and galactic outflows. The efficiency to heat gas up to X-ray temperatures ($\ge 10^{6.5}$ K) roughly scales as $\nu_{\rm SN}^{0.2}\, n^{-0.6}$. For a typical SN rate density and gas density in starburst nuclei, the heating efficiency is $\sim 0.15$, also consistent with previous interpretations from X-ray observations. We discuss the implications of our results with regard to observational diagnostics of ionic ratios and emission measures in starburst nuclei regions.
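Combining the quoted scaling with the quoted fiducial values suggests, as a rough interpolation (an assumption; the original normalization may differ),
\[
\eta_{\rm heat} \sim 0.15 \left(\frac{\nu_{\rm SN}}{10^{-9}\ {\rm pc^{-3}\,yr^{-1}}}\right)^{0.2} \left(\frac{n}{10\ {\rm cm^{-3}}}\right)^{-0.6},
\]
i.e. the heating efficiency is only weakly sensitive to the SN rate density but declines appreciably in denser ambient gas.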
Supernova remnants (SNRs) are now widely believed to be a source of cosmic rays (CRs) up to an energy of 1 PeV. The magnetic fields required to accelerate CRs to sufficiently high energies need to be much stronger than can result from compression of the circumstellar medium (CSM) by a factor of 4, as is the case in strong shocks. Non-thermal synchrotron maps of these regions indicate that the magnetic field is indeed much stronger, and that for young SNRs it has a dominant radial component while for old SNRs it is mainly toroidal. How these magnetic fields get enhanced, or why the field orientation is mainly radial for young remnants, is not yet fully understood. We use an adaptive mesh refinement MHD code, AMRVAC, to simulate the evolution of supernova remnants and to see whether we can reproduce a mainly radial magnetic field in the early stages of evolution. We follow the evolution of the SNR with three different configurations of the initial magnetic field in the CSM: an initially mainly toroidal field, a turbulent magnetic field, and a field parallel to the symmetry axis. Although for the latter two topologies a significant radial field component arises at the contact discontinuity due to the Rayleigh-Taylor instability, no radial component is seen out to the forward shock. Ideal MHD alone appears insufficient to explain the observations. Possibly a higher compression ratio and additional turbulence due to the dominant presence of CRs can help us to better reproduce the observations in future studies.
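The factor of 4 invoked above is the standard strong-shock compression for a monatomic gas (a textbook relation, stated here for context):
\[
\frac{\rho_2}{\rho_1} = \frac{(\gamma+1)M^2}{(\gamma-1)M^2+2} \;\longrightarrow\; \frac{\gamma+1}{\gamma-1} = 4 \qquad (\gamma = 5/3,\ M \gg 1),
\]
and since only the field component tangential to the shock surface is compressed, ideal-MHD compression alone can neither amplify the field much beyond this factor nor explain a preferentially radial orientation.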
I outline the dynamical evolution of the shell remnants of supernovae (SNRs), from initial interaction of supernova ejecta with circumstellar material (CSM) through to the final dissolution of the remnant into the interstellar medium (ISM). Supernova ejecta drive a blast wave through any CSM from the progenitor system; as material is swept up, a reverse shock forms in the ejecta, reheating them. This ejecta-driven phase lasts until ten or more times the ejected mass is swept up, and the remnant approaches the Sedov or self-similar evolutionary phase. The evolution up to this time is approximately adiabatic. Eventually, as the blast wave slows, the remnant age approaches the cooling time for immediate post-shock gas, and the shock becomes radiative and highly compressive. Eventually the shock speed drops below the local ISM sound speed and the remnant dissipates. I then review the various processes by which remnants radiate. At early times, during the adiabatic phases, thermal X-rays and nonthermal radio, X-ray, and gamma-ray emission dominate, while optical emission is faint and confined to a few strong lines of hydrogen and perhaps helium. Once the shock is radiative, prominent optical and infrared emission is produced. Young remnants are profoundly affected by interaction with often anisotropic CSM, while even mature remnants can still show evidence of ejecta.
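The transition times sketched above follow standard order-of-magnitude scalings (quoted here schematically, not as specific numbers from this review): the ejecta-driven phase ends once the swept-up mass $\frac{4\pi}{3}\rho_0 R^3$ exceeds the ejecta mass by a factor of order ten, after which the blast wave approaches the Sedov-Taylor solution,
\[
R(t) \simeq 1.15 \left(\frac{E\,t^2}{\rho_0}\right)^{1/5} \qquad (\gamma = 5/3),
\]
and the radiative phase begins when the remnant age becomes comparable to the cooling time of the immediate post-shock gas.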
A systematic study of the synchrotron X-ray emission from supernova remnants (SNRs) has been conducted. We selected a total of 12 SNRs whose synchrotron X-ray spectral parameters are available in the literature with reasonable accuracy, and studied how their luminosities change as a function of radius. It is found that the synchrotron X-ray luminosity tends to drop, especially once the SNRs become larger than $\sim 5$ pc, albeit with large scatter. This may be explained by a change of spectral shape caused by the decrease of the synchrotron roll-off energy. A simple evolutionary model of the X-ray luminosity is proposed and is found to reproduce the observed data approximately, with reasonable model parameters. According to the model, the total energy of accelerated electrons is estimated to be $10^{47}$-$10^{48}$ erg, which is well below the supernova explosion energy. The maximum energies of accelerated electrons and protons are also discussed.
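The roll-off argument rests on the standard synchrotron scaling for the characteristic photon energy radiated by the highest-energy electrons (stated schematically):
\[
h\nu_{\rm roll} \propto B\,E_{\rm max}^2,
\]
so a decline of $E_{\rm max}$ and/or of the magnetic field strength $B$ as the remnant expands shifts the spectral cut-off below the X-ray band and suppresses the synchrotron X-ray luminosity.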