We reconsider the time-dependent Schrödinger--Newton equation as a model for the self-gravitational interaction of a quantum system. We numerically locate the onset of gravitationally induced inhibition of dispersion of Gaussian wave packets and find it to occur at mass values more than six orders of magnitude higher than reported by Salzman and Carlip (2006, 2008), namely at about $10^{10}\,\mathrm{u}$. This agrees much better with simple analytical estimates, but it unfortunately also calls into question the experimental realisability of the proposed laboratory test of quantum gravity in the foreseeable future, not only because of the large masses required but also because of the need to maintain sufficiently long coherence times.