Employing two samples of 56 and 59 well-separated FRED (fast rise and exponential decay) gamma-ray burst (GRB) pulses, whose spectra are fitted by the Band spectrum and the Compton model, respectively, we investigate the evolutionary slope of $E_{p}$ (where $E_{p}$ is the peak energy in the $\nu F_{\nu}$ spectrum) with time during the pulse decay phase. The bursts in the samples were observed by the Burst and Transient Source Experiment (BATSE) on the Compton Gamma-Ray Observatory. We first test the $E_{p}$ evolutionary slope during the pulse decay phase predicted by Lu et al. (2007) on the basis of a model of highly symmetric expanding fireballs, in which the curvature effect of the expanding fireball surface is the key factor. We find that the evolutionary slopes are normally distributed for both samples and concentrated around 0.73 and 0.76 for the Band and Compton models, respectively, in good agreement with the theoretical expectation of Lu et al. (2007). In contrast with their results, however, the intrinsic spectra of most bursts may bear a Comptonized or thermal synchrotron spectrum rather than the Band spectrum. We also examine the relationships between the evolutionary slope and the spectral parameters. We show that the slope is correlated with the $E_{p}$ of the time-integrated spectrum as well as with the photon flux, but anticorrelated with the low-energy index $\alpha$. In addition, we identify a correlation between the slope and the intrinsic $E_{p}$ derived using pseudo-redshifts. The mechanisms behind these correlations are currently unclear and call for theoretical interpretation.
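For reference, a brief sketch of the two spectral forms named above, written in the conventional $E_{p}$ parameterization with a 100 keV pivot (the standard BATSE choice; the normalization $A$ is illustrative). The Band function (Band et al. 1993) is a smoothly broken power law,
\[
N(E) = A \times
\begin{cases}
\left(\dfrac{E}{100\,\mathrm{keV}}\right)^{\alpha}
\exp\!\left[-\dfrac{(2+\alpha)E}{E_{p}}\right],
& E < \dfrac{(\alpha-\beta)E_{p}}{2+\alpha},\\[2ex]
\left[\dfrac{(\alpha-\beta)E_{p}}{(2+\alpha)\,100\,\mathrm{keV}}\right]^{\alpha-\beta}
e^{\beta-\alpha}
\left(\dfrac{E}{100\,\mathrm{keV}}\right)^{\beta},
& \text{otherwise},
\end{cases}
\]
while the Comptonized ("Compton") model is a power law with an exponential cutoff,
\[
N(E) = A \left(\frac{E}{100\,\mathrm{keV}}\right)^{\alpha}
\exp\!\left[-\frac{(2+\alpha)E}{E_{p}}\right],
\]
where $\alpha$ and $\beta$ are the low- and high-energy photon indices. In both forms, $E_{p}$ marks the peak of the $\nu F_{\nu}$ spectrum provided $\alpha > -2$; it is this quantity whose decay-phase evolutionary slope is measured here.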