The Evolution of Supernovae in Circumstellar Wind-Blown Bubbles. I. Introduction and One-Dimensional Calculations


Abstract

Mass loss from massive stars ($\ga 8 \, M_\odot$) can result in the formation of circumstellar wind-blown cavities surrounding the star, bordered by a thin, dense, cold shell. When the star explodes as a core-collapse supernova (SN), the resulting shock wave interacts with this modified medium around the star rather than with the interstellar medium. In this work we first explore the nature of the circumstellar medium around massive stars in various evolutionary stages. This is followed by a study of the evolution of SNe within these wind-blown bubbles. The evolution depends primarily on a single parameter $\Lambda$, the ratio of the mass of the dense shell to that of the ejected material. We investigate the evolution for different values of this parameter, and also present approximate X-ray surface brightness maps computed from the simulations. Our results show that in many cases the SN remnant spends a significant amount of time within the bubble. The low density within the bubble can delay the onset of the Sedov stage and may reduce the time spent in it. The complicated density profile within the bubble makes it difficult to infer the mass-loss properties of the pre-SN star from the evolution of the resulting supernova remnant.
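As an illustrative aside (not taken from the paper), the sketch below shows how the parameter $\Lambda$ defined in the abstract, the ratio of the dense circumstellar shell mass to the SN ejecta mass, might be estimated. It assumes the thin shell contains roughly the ambient gas swept up by the bubble out to its radius; the function names and all numerical values (bubble radius, ambient density, ejecta mass) are hypothetical placeholders.

```python
import numpy as np

# Illustrative estimate of Lambda = M_shell / M_ejecta (see abstract).
# Assumption: the thin, dense shell bounding the wind-blown bubble holds
# approximately all of the ambient gas swept up out to the bubble radius.
# All numbers below are placeholders, not values from the paper.

M_SUN = 1.989e33      # solar mass [g]
PC = 3.086e18         # parsec [cm]
M_H = 1.673e-24       # hydrogen atom mass [g]

def swept_up_shell_mass(r_bubble_pc, n_ambient_cm3=1.0, mu=1.4):
    """Mass of ambient gas swept into the thin shell bounding the bubble [g]."""
    volume = (4.0 / 3.0) * np.pi * (r_bubble_pc * PC) ** 3
    return mu * M_H * n_ambient_cm3 * volume

def big_lambda(m_shell_g, m_ejecta_msun=5.0):
    """Lambda = shell mass / ejecta mass (dimensionless)."""
    return m_shell_g / (m_ejecta_msun * M_SUN)

if __name__ == "__main__":
    m_shell = swept_up_shell_mass(r_bubble_pc=10.0, n_ambient_cm3=0.5)
    print(f"Shell mass ~ {m_shell / M_SUN:.1f} Msun")
    print(f"Lambda ~ {big_lambda(m_shell, m_ejecta_msun=5.0):.1f}")
```

Since the SN remnant's evolution is stated to depend primarily on $\Lambda$, an order-of-magnitude estimate of this kind indicates which evolutionary regime a given progenitor environment falls into.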
