Many authors have studied the phenomenon of typically Gaussian marginals of high-dimensional random vectors: for a probability measure on $\mathbb{R}^d$, under mild conditions, most one-dimensional marginals are approximately Gaussian if $d$ is large. In earlier work, the author used entropy techniques and Stein's method to show that this phenomenon persists in the bounded-Lipschitz distance for $k$-dimensional marginals of $d$-dimensional distributions, if $k=o(\sqrt{\log d})$. In this paper, a somewhat different approach is used to show that the phenomenon persists if $k<\frac{2\log d}{\log\log d}$, and that this estimate is best possible.