Lorentzian distributions have been widely employed in statistical mechanics to obtain exact results for heterogeneous systems. Analytic continuation of these results is impossible even for slightly deformed Lorentzian distributions, due to the divergence of all the moments (cumulants). We have solved this problem by introducing a `pseudo-cumulants' expansion. This allows us to develop a reduction methodology for heterogeneous spiking neural networks subject to extrinsic and endogenous noise sources, thus generalizing the mean-field formulation introduced in [E. Montbrio et al., Phys. Rev. X 5, 021028 (2015)].
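As a standard illustration of why ordinary cumulants fail here (a textbook fact, not a result of this work): for a Lorentzian distribution all integer moments diverge, yet its characteristic function is elementary; the non-analyticity sits only at $k=0$, so an expansion restricted to $k\ge 0$ remains well defined, which is the kind of one-sided structure a pseudo-cumulants expansion can exploit.
\[
g(v) = \frac{1}{\pi}\,\frac{\Delta}{(v-\bar{v})^{2}+\Delta^{2}},
\qquad
\int_{-\infty}^{\infty} v^{n}\, g(v)\, dv \ \text{diverges for all integers } n \ge 1,
\]
\[
F(k) = \int_{-\infty}^{\infty} e^{ikv}\, g(v)\, dv = e^{i\bar{v}k - \Delta |k|},
\qquad
\ln F(k) = i\,(\bar{v} + i\Delta)\,k \ \ \text{for } k \ge 0 .
\]
Because of the $|k|$ term, $\ln F(k)$ admits no Taylor expansion around $k=0$ (the cumulant expansion does not exist), while its restriction to the half-line $k\ge 0$ is analytic and governed by the single complex parameter $\bar{v}+i\Delta$.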