Slow magnetoacoustic waves are omnipresent in both natural and laboratory plasma systems. The wave-induced misbalance between plasma cooling and heating processes causes amplification or attenuation, as well as dispersion, of slow magnetoacoustic waves. The dispersion can be attributed to characteristic time scales introduced into the system by the competition between the heating and cooling processes in the vicinity of the thermal equilibrium. We analysed linear slow magnetoacoustic waves in a plasma whose thermal equilibrium is formed by a balance between optically thin radiative losses, field-aligned thermal conduction, and an unspecified heating. The dispersion manifests itself through the dependence of the effective adiabatic index of the wave on the wave frequency, making the phase and group speeds frequency-dependent. The combined effect of the wave amplification and dispersion is shown to result in the formation of an oscillatory pattern in an initially broadband slow wave, with a characteristic period determined by the thermal misbalance time scales, i.e. by the derivatives of the combined radiative loss and heating function with respect to density and temperature, evaluated at the equilibrium. This effect is illustrated by estimating the characteristic period of the oscillatory pattern arising from the thermal misbalance in the plasma of the solar corona. The estimated period is found to be of the same order of magnitude as the typical periods of slow magnetoacoustic oscillations detected in the corona.
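For concreteness, the misbalance time scales can be written schematically as follows. This is a minimal sketch using a convention common in related studies; the notation $Q$, $Q_T$, $Q_\rho$, $c_V$ and the exact normalisation and labelling of $\tau_1$ and $\tau_2$ are assumptions, not prescriptions of the analysis summarised above.
\[
Q(\rho, T) = L(\rho, T) - H(\rho, T), \qquad
Q_T \equiv \left.\frac{\partial Q}{\partial T}\right|_{\rho_0, T_0}, \qquad
Q_\rho \equiv \left.\frac{\partial Q}{\partial \rho}\right|_{\rho_0, T_0},
\]
\[
\tau_1 \sim \frac{c_V}{Q_T - (\rho_0/T_0)\, Q_\rho}, \qquad
\tau_2 \sim \frac{c_V}{Q_T},
\]
where $L$ and $H$ are the radiative loss and heating rates per unit mass, $c_V$ is the specific heat at constant volume, and the subscript 0 denotes equilibrium values. The characteristic period of the induced oscillatory pattern is expected to be of the order of these time scales.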
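As an illustration of the final order-of-magnitude estimate, the following Python sketch evaluates $\tau \sim c_V/Q_T$ for typical coronal conditions. It assumes a single power-law radiative loss $\Lambda(T) \propto T^\alpha$ and a parametrised heating $H \propto \rho^a T^b$; the specific exponents and plasma parameters below are illustrative assumptions, not values taken from the analysis.

# Order-of-magnitude estimate of a thermal-misbalance time scale,
# tau ~ c_V / Q_T, for coronal conditions. All parameter values and the
# heating model H ~ rho^a * T^b are illustrative assumptions.

K_B = 1.380649e-16   # Boltzmann constant [erg/K]
M_P = 1.6726e-24     # proton mass [g]
GAMMA = 5.0 / 3.0    # adiabatic index
MU = 0.5             # mean particle mass in units of m_p (ionised hydrogen)

def misbalance_time(n0, T0, lambda0, alpha, b):
    """tau ~ c_V / Q_T for Q = L - H per unit mass, with
    L = n^2 Lambda(T) / rho and H ~ rho^a * T^b; at equilibrium
    only the temperature exponents enter Q_T via (alpha - b)."""
    c_v = K_B / ((GAMMA - 1.0) * MU * M_P)           # erg g^-1 K^-1
    q_t = (alpha - b) * n0 * lambda0 / (M_P * T0)    # erg g^-1 s^-1 K^-1
    return c_v / q_t                                  # seconds

# Assumed loop parameters: n0 = 1e10 cm^-3, T0 = 1 MK,
# Lambda(1 MK) ~ 1e-22 erg cm^3 s^-1, alpha - b of order unity.
tau = misbalance_time(n0=1e10, T0=1e6, lambda0=1e-22, alpha=0.0, b=-1.0)
print(f"misbalance time scale ~ {tau / 60:.0f} min")  # a few minutes

With these assumed values the time scale comes out at roughly 7 minutes, consistent with the statement above that the period is of the order of the typical periods of coronal slow magnetoacoustic oscillations.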