We compute the impact of the running of higher-order density correlation functions on the two-point functions of CMB spectral distortions (SDs). We show that allowing for some running enhances all of the SD correlators by a few orders of magnitude, which may make them easier to detect. Taking the reasonable range $|n_{f_{NL}}| \lesssim 1.1$ with $f_{NL} = 5$, we show that for a PIXIE-like experiment the signal-to-noise ratio, $(S/N)_{i}$, is enhanced up to $\lesssim 4000$ and $\lesssim 10$ for $\mu T$ and $yT$, respectively, toward the upper limit of $n_{f_{NL}}$. In addition, assuming $|n_{\tau_{NL}}| < 1$ and $\tau_{NL} = 10^3$, $(S/N)_{i}$ increases to $\lesssim 8\times 10^{6}$, $\lesssim 10^{4}$ and $\lesssim 18$ for $\mu\mu$, $\mu y$ and $yy$, respectively. Therefore, CMB spectral distortions can serve as a direct probe of the running of higher-order correlation functions in the near future.