Type Ia supernova magnitudes are used to fit cosmological parameters under the assumption that the model fits the observed redshift dependence. We test this assumption with the Union 2.1 compilation of 580 sources. Several independent tests find that the existing model fails to account for a significant correlation of supernova color and redshift. The correlation of magnitude residuals relative to the $\Lambda$CDM model with color $\times$ redshift has a significance equivalent to 13 standard deviations, as evaluated by randomly shuffling the data. Extending the existing $B-V$ color correction to a relation linear in redshift improves the goodness-of-fit $\chi^{2}$ by more than 50 units, an equivalent significance of $7\sigma$, while adding only one parameter. The color-redshift correlation is quite robust, cannot be attributed to outliers, and passes several consistency tests. We review previous hints of redshift dependence in color parameters found in bin-by-bin fits and interpreted as parameter bias. We show that neither the bias nor the change $\Delta\chi^{2}$ found in our study can be explained by those effects. The previously known relation that bluer supernovae have larger absolute luminosity empirically tends to flatten out with increasing redshift. Assuming a flat universe, the best-fit dark energy density parameter is revised from $\Omega_{\Lambda} = 0.71 \pm 0.02$ to $\Omega_{\Lambda} = 0.74 \pm 0.02$. One possible physical interpretation is that supernovae or their environments evolve significantly with increasing redshift.
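
The 13-standard-deviation figure quoted above comes from comparing the observed residual versus color $\times$ redshift correlation against a null distribution built by random shuffling. The following is a minimal sketch of such a shuffle (permutation) test; the array names, the placeholder data, and the choice of a Pearson-correlation statistic are illustrative assumptions and not the actual Union 2.1 analysis pipeline.

\begin{verbatim}
# Sketch of a shuffle (permutation) significance test for the correlation
# between Hubble-diagram magnitude residuals and color * redshift.
# All data arrays below are hypothetical placeholders, not Union 2.1 data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

n = 580                                  # size matching the compilation
z = rng.uniform(0.02, 1.4, n)            # placeholder redshifts
color = rng.normal(0.0, 0.1, n)          # placeholder SN colors
residuals = rng.normal(0.0, 0.15, n)     # placeholder residuals vs. LCDM

x = color * z                            # the color x redshift combination
r_obs = np.corrcoef(x, residuals)[0, 1]  # observed correlation

# Null distribution: shuffling the residuals destroys any real association
# with x while preserving both marginal distributions.
n_shuffles = 10000
r_null = np.empty(n_shuffles)
for i in range(n_shuffles):
    r_null[i] = np.corrcoef(x, rng.permutation(residuals))[0, 1]

# Two-sided p-value and its equivalent Gaussian significance.
p = np.mean(np.abs(r_null) >= np.abs(r_obs))
sigma_equiv = stats.norm.isf(p / 2) if p > 0 else np.inf
print(f"r = {r_obs:.3f}, p = {p:.4f}, ~{sigma_equiv:.1f} sigma")
\end{verbatim}

With only $10^{4}$ shuffles a 13-standard-deviation effect would yield $p = 0$ in this sketch, so in practice such extreme significances are reported as Gaussian equivalents of the tail probability rather than read directly off the shuffled sample.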