We present a study exploring a systematic effect on the brightness of type Ia supernovae using numerical models that assume the single-degenerate paradigm. Our investigation varied the central density of the progenitor white dwarf at flame ignition and considered its impact on the explosion yield, particularly the production and distribution of radioactive Ni-56, which powers the light curve. We performed a suite of two-dimensional simulations with randomized initial conditions, allowing us to characterize statistical trends in the explosion outcomes. The simulations indicate that the total production of Fe-group material is statistically independent of progenitor central density, but the mass of stable Fe-group isotopes is tightly correlated with central density, increasing as the central density rises; because the total Fe-group yield is essentially unchanged, this increase comes at the expense of Ni-56, whose production decreases at higher central densities. These results imply that progenitors with higher central densities produce dimmer events. We provide details of the post-explosion distribution of Ni-56 in the models, including the lack of a consistent centrally located deficit of Ni-56, which may be compared with observed remnants. By performing a self-consistent extrapolation of our model yields, and by considering both the main-sequence lifetime of the progenitor star and the elapsed time between the formation of the white dwarf and the onset of accretion, we develop a brightness-age relation that improves our prediction of the expected trend for single degenerates, and we compare this relation with observations.
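The dimming trend follows from a simple mass budget. As a schematic illustration (our own notation, not drawn from the models, and assuming Ni-56 dominates the radioactive Fe-group yield), the Ni-56 mass is the total Fe-group mass less its stable component, so a stable component that grows with central density at fixed total yield forces the Ni-56 mass down:

% Schematic mass budget (illustrative; symbols are ours):
%   M_FeG    -- total Fe-group mass, roughly independent of rho_c
%   M_stable -- stable Fe-group mass, increasing with rho_c
\begin{equation}
  M_{^{56}\mathrm{Ni}} \simeq M_{\mathrm{FeG}} - M_{\mathrm{stable}},
  \qquad
  \frac{\partial M_{\mathrm{FeG}}}{\partial \rho_c} \approx 0,
  \quad
  \frac{\partial M_{\mathrm{stable}}}{\partial \rho_c} > 0
  \;\Longrightarrow\;
  \frac{\partial M_{^{56}\mathrm{Ni}}}{\partial \rho_c} < 0 .
\end{equation}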