The recent development of phase-grating moiré neutron interferometry promises a wide range of impactful experiments, from dark-field imaging of material microstructure to precise measurements of fundamental constants. However, the contrast of 3 % obtained with this moiré interferometer was well below the theoretical prediction of 30 % for ideal gratings. It is suspected that non-ideal aspects of the phase gratings were a leading contributor to this deficiency and that the phase gratings need to be quantitatively assessed and optimized. Here we characterize neutron diffraction from phase gratings using Bragg diffraction crystals to determine the optimal phase-grating orientations. We observe well-defined diffraction peaks and explore perturbations to these peaks, as well as their effect on interferometer contrast, as a function of grating alignment. This technique promises to improve the contrast of grating interferometers by providing an in situ aid to grating alignment.
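As a point of reference (not stated explicitly in the abstract), the contrast figures quoted above correspond to the standard fringe visibility of the measured moiré intensity pattern,

\[
  C \;=\; \frac{I_{\max} - I_{\min}}{I_{\max} + I_{\min}},
\]

where \(I_{\max}\) and \(I_{\min}\) denote the maximum and minimum of the intensity oscillation recorded at the detector.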