Light and color curve properties of type Ia supernovae: Theory vs. Observations


Abstract

We study optical light-curve (LC) relations of type Ia supernovae (SNe~Ia) for their use in cosmology, using high-quality photometry published by the Carnegie Supernova Project (CSP-I). We revisit the classical luminosity-decline-rate ($\Delta m_{15}$) relation and the Lira relation, and investigate the time evolution of the ($B-V$) color and of $B(B-V)$, which serves as the basis of the color-stretch relation and the Color-MAGnitude Intercept Calibration (CMAGIC). Our analysis is based on explosion and radiation-transport simulations of spherically symmetric delayed-detonation (DDT) models producing normal-bright and subluminous SNe~Ia. The empirical LC relations can be understood as having the same physical underpinnings: the opacities, the ionization balance in the photosphere, and the radioactive energy deposition, which shifts with time from below to above the photosphere. Some 3-4 weeks past maximum, the photosphere recedes to ${}^{56}$Ni-rich layers of similar density structure, leading to a similar color evolution. An important secondary parameter is the central density $\rho_c$ of the white dwarf (WD), because at higher densities more electron-capture elements are produced at the expense of ${}^{56}$Ni. This results in a $\Delta m_{15}$ spread of 0.1 mag for normal-bright and 0.7 mag for subluminous SNe~Ia, and a spread of $\approx 0.2$ mag in the Lira relation. We show why color-magnitude diagrams emphasize the transitions between physical regimes and allow the construction of templates that depend mostly on $\Delta m_{15}$, with little dispersion in both the CSP-I sample and our DDT models. This makes it possible to separate intrinsic SN~Ia variations from interstellar reddening characterized by $E(B-V)$ and $R_{B}$. Because mixing different explosion scenarios would cause a wide spread in the empirical relations, the observed tightness may suggest one dominant scenario.
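For orientation, the relations referred to above can be summarized by their conventional functional forms; these are standard textbook definitions rather than expressions taken from this work, and the coefficients $a$, $b$, and $\beta_{BV}$ are left symbolic:

$$ \Delta m_{15}(B) \equiv m_B(t_{\max} + 15\,\mathrm{d}) - m_B(t_{\max}), \qquad M_{B,\max} \approx a + b\,\bigl[\Delta m_{15}(B) - 1.1\bigr], $$

$$ B \approx B_{BV} + \beta_{BV}\,(B-V) \quad \text{(CMAGIC, linear regime)}, \qquad A_B = R_B\,E(B-V), \quad E(B-V) = (B-V)_{\mathrm{obs}} - (B-V)_0. $$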
