We present a detailed comparison of the ionizing spectral energy distributions (SEDs) predicted by four modern stellar atmosphere codes: TLUSTY, CMFGEN, WMbasic, and FASTWIND. We consider three sets of stellar parameters representing a late O-type dwarf (O9.5 V), a mid O-type dwarf (O7 V), and an early O-type dwarf (O5.5 V). We explore two possibilities for such a comparison, following what we call the evolutionary and observational approaches: in the evolutionary approach, one compares the SEDs of stars defined by the same values of Teff and logg; in the observational approach, the models to be compared do not necessarily share the same Teff and logg, but produce similar H and He I-II optical lines. We find better agreement among the four codes, in terms of Q(H0), the ratio Q(He0)/Q(H0), and the shape of the SEDs in the spectral range between 13 and 30 eV, when models are compared following the observational approach. Even in this case, however, large differences remain at higher energies. We then discuss how the differences in the SEDs may affect the overall properties of the surrounding nebulae in terms of their temperature and ionization structure. We find that the effect on the nebular temperature is no larger than 300-350 K. By contrast, the different SEDs produce significantly different nebular ionization structures. This has important consequences for the ionization correction factors used in abundance determinations of HII regions, as well as for the characterization of the ionizing stellar population from nebular line ratios.
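The quantities compared above, Q(H0) and Q(He0)/Q(H0), are the photon rates obtained by integrating an SED above the H and He ionization edges (13.6 and 24.6 eV). The following is a minimal sketch of that integral, using a blackbody as a stand-in SED (the actual comparison uses the TLUSTY, CMFGEN, WMbasic, and FASTWIND model atmospheres, which differ from a blackbody precisely in the ionizing continuum); the Teff and radius chosen here are illustrative values, not parameters from the paper.

```python
import numpy as np

# Physical constants in CGS units
H_PLANCK = 6.626e-27   # erg s
K_BOLTZ = 1.381e-16    # erg/K
C_LIGHT = 2.998e10     # cm/s
EV = 1.602e-12         # erg per eV
RSUN = 6.96e10         # cm

def planck_bnu(nu, teff):
    """Blackbody specific intensity B_nu [erg s^-1 cm^-2 Hz^-1 sr^-1]."""
    return 2.0 * H_PLANCK * nu**3 / C_LIGHT**2 / np.expm1(H_PLANCK * nu / (K_BOLTZ * teff))

def photon_rate(teff, radius_cm, e_min_ev, e_max_ev=150.0):
    """Photons s^-1 emitted between e_min_ev and e_max_ev:
    Q = integral of L_nu / (h nu) d(nu), with L_nu = 4 pi^2 R^2 B_nu."""
    nu = np.linspace(e_min_ev * EV / H_PLANCK, e_max_ev * EV / H_PLANCK, 20000)
    integrand = 4.0 * np.pi**2 * radius_cm**2 * planck_bnu(nu, teff) / (H_PLANCK * nu)
    # Trapezoidal integration over frequency
    return float(np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(nu)))

# Illustrative O-dwarf-like parameters (assumed, not from the paper)
teff, radius = 40000.0, 10.0 * RSUN
q_h = photon_rate(teff, radius, 13.6)    # H-ionizing photon rate Q(H0)
q_he = photon_rate(teff, radius, 24.6)   # He-ionizing photon rate Q(He0)
print(f"Q(H0)  ~ {q_h:.2e} s^-1")
print(f"Q(He0)/Q(H0) ~ {q_he / q_h:.3f}")
```

Because the 13-30 eV range sits on the Wien tail for O-star temperatures, Q(He0)/Q(H0) is extremely sensitive to the SED shape there, which is why the ratio is a useful diagnostic of differences between the codes.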
We present some results of an on-going project aimed at studying a sample of Galactic HII regions, each ionized by a single massive star, to test the predictions of modern-generation stellar atmosphere codes in the H Lyman continuum. The observations collected for this study comprise optical spectra of the corresponding ionizing stars, along with imaging and long-slit spatially resolved nebular observations. The analysis of the stellar spectra yields the parameters of the ionizing star, while the nebular observations provide constraints on the nebular abundances and gas distribution. All this information is then used to construct tailored photoionization models of the HII regions. The reliability of the stellar ionizing fluxes is then tested by comparing the photoionization model results with the observations, in terms of the spatial variation of an appropriate set of nebular line ratios across the nebula.
The part played by stars in the ionization of the intergalactic medium remains an open question. A key issue is the proportion of the stellar ionizing radiation that escapes the galaxies in which it is produced. Spectroscopy of gamma-ray burst (GRB) afterglows can be used to determine the neutral hydrogen column density in their host galaxies and hence the opacity to extreme-ultraviolet (EUV) radiation along the lines of sight to the bursts. Thus, making the reasonable assumption that long-duration GRB locations are representative of the sites of the massive stars that dominate EUV production, one can calculate an average escape fraction of ionizing radiation in a way that is independent of galaxy size, luminosity or underlying spectrum. Here we present a sample of NH measurements for 138 GRBs in the range 1.6<z<6.7 and use it to establish an average escape fraction at the Lyman limit of <fesc>~0.005, with a 98% confidence upper limit of ~0.015. This analysis suggests that stars provide a small contribution to the ionizing radiation budget of the IGM at z<5, where the bulk of the bursts lie. At higher redshifts, z>5, firm conclusions are limited by the small size of the GRB sample, but any decline in average HI column density appears to be modest. We also find no indication of a significant correlation of NH with galaxy UV luminosity or host stellar mass, for the subset of events for which these are available. We discuss in some detail a number of selection effects and potential biases. Drawing on a range of evidence, we argue that such effects, while not negligible, are unlikely to produce systematic errors of more than a factor of ~2, and so would not affect the primary conclusions. Given that many GRB hosts are low-metallicity, high specific star-formation rate dwarf galaxies, these results present a particular problem for the hypothesis that such galaxies dominated the reionization of the universe.
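The link between a measured HI column and the escape fraction is the Lyman-limit transmission T = exp(-NH * sigma_LL), with a photoionization cross section of sigma_LL ~ 6.30e-18 cm^2 at 13.6 eV. A minimal sketch of averaging sightline transmissions follows; the column densities in the sample array are made-up illustrative values, not measurements from the paper (which averages over 138 GRB sightlines).

```python
import numpy as np

# H I photoionization cross section at the Lyman limit (912 A)
SIGMA_LL = 6.30e-18  # cm^2

def lyman_limit_transmission(log_nh):
    """Escape probability along one sightline, given log10(N_H / cm^-2)."""
    return np.exp(-(10.0 ** np.asarray(log_nh)) * SIGMA_LL)

# Illustrative (made-up) sightline columns: one optically thin,
# three optically thick. A column of ~10^17.2 cm^-2 gives optical depth 1.
log_nh_sample = np.array([16.5, 19.0, 21.0, 22.0])
fesc_mean = float(lyman_limit_transmission(log_nh_sample).mean())
print(f"<f_esc> ~ {fesc_mean:.4f}")
```

The exponential makes the average completely dominated by the rare low-column sightlines: any column above ~10^19 cm^-2 is effectively opaque, which is why a sample dominated by high-NH bursts drives <fesc> toward values as small as ~0.005.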
We discuss two important effects for the astrospheres of runaway stars: the propagation of ionizing photons far beyond the astropause, and the rapid evolution of massive stars (and their winds) near the end of their lives. Hot stars emit ionizing photons with associated photoheating that has a significant dynamical effect on their surroundings. 3D simulations show that HII regions around runaway O stars drive expanding conical shells and leave underdense wakes in the medium they pass through. For late O stars this feedback to the interstellar medium is more important than that from stellar winds. Late in life, O stars evolve to cool red supergiants more rapidly than their environment can react, producing transient circumstellar structures such as double bow shocks. This provides an explanation for the bow shock and linear bar-shaped structure observed around Betelgeuse.
Using results from high-resolution galaxy formation simulations in a standard Lambda-CDM cosmology and a fully conservative multi-resolution radiative transfer code for point sources, we compute the energy-dependent escape fraction of ionizing photons from a large number of star-forming regions in two galaxies at five different redshifts from z=3.8 to 2.39. All escape fractions show a monotonic decline with time, from (at the Lyman limit) ~6-10% at z=3.6 to ~1-2% at z=2.39, due to higher gas clumping at lower redshifts. It appears that increased feedback can lead to higher f_esc at z>3.4, via evacuation of gas from the vicinity of star-forming regions, and to lower f_esc at z<2.39, through accumulation of swept-up shells in denser environments. Our results agree well with the observational findings of Inoue et al. (2006) on the redshift evolution of f_esc in the interval z=2-3.6.
The practical viability of any qubit technology rests on long coherence times and high-fidelity operations, with the superconducting qubit modality being a leading example. However, superconducting qubit coherence is impacted by broken Cooper pairs, referred to as quasiparticles, whose density is empirically observed to be orders of magnitude greater than the value predicted for thermal equilibrium by the Bardeen-Cooper-Schrieffer (BCS) theory of superconductivity. Previous work has shown that infrared photons significantly increase the quasiparticle density, yet even in the best-isolated systems it remains higher than expected, suggesting that another generation mechanism exists. In this Letter, we provide evidence that ionizing radiation from environmental radioactive materials and cosmic rays contributes to this observed difference, leading to an elevated quasiparticle density that would ultimately limit superconducting qubits of the type measured here to coherence times in the millisecond regime. We further demonstrate that introducing radiation shielding reduces the flux of ionizing radiation and correlates positively with increased coherence time. Although the effect is small for today's qubits, reducing or otherwise mitigating the impact of ionizing radiation will be critical for realizing fault-tolerant superconducting quantum computers.