(Abridged) We examine the evolution of the black hole mass–stellar velocity dispersion (M-sigma) relation over cosmic time using simulations of galaxy mergers that include feedback from supermassive black hole growth. We consider galaxy mergers in which the progenitor properties are varied to match those expected at redshifts z=0-6. We find that the slope of the resulting M-sigma relation is the same at all redshifts considered. Using the same feedback efficiency that reproduces the observed amplitude of the M-sigma relation at z=0, we find a weak redshift dependence in the normalization, which results from an increase in velocity dispersion at fixed galactic stellar mass. We develop a formalism that connects redshift evolution in the M-sigma relation to the scatter in the local (z=0) relation. We show that the scatter in the local relation places severe constraints on the redshift evolution of both the normalization and the slope of the M-sigma relation. Furthermore, we demonstrate that cosmic downsizing introduces a black hole mass-dependent dispersion in the M-sigma relation, and that the skewness of the distribution about the locally observed M-sigma relation is sensitive to redshift evolution in the normalization and slope. In principle, these diagnostics provide a method for differentiating between theories for producing the M-sigma relation. In agreement with existing constraints, our simulations imply that hierarchical structure formation should produce the relation with small intrinsic scatter.
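For context, the M-sigma relation discussed above is conventionally parametrized in the literature as

  \log(M_{\rm BH}/M_\odot) = \alpha(z) + \beta(z)\,\log\!\left(\sigma/200\ {\rm km\,s^{-1}}\right),

where \alpha(z) is the normalization and \beta(z) the slope; the explicit redshift dependence written here is an illustrative assumption for this sketch, not the specific formalism developed in the paper. In this schematic picture, black holes assembled over a range of redshifts sample a range of \alpha(z) and \beta(z), so any evolution imprints a spread (and, if assembly epoch depends on mass, a skew) in \log M_{\rm BH} at fixed \sigma; requiring that this spread not exceed the small intrinsic scatter measured at z=0 is the sense in which the local relation constrains the allowed evolution.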