Prior asymptotic performance analyses are based on series expansions of the moment-generating function (MGF) or the probability density function (PDF) of the channel coefficients. However, these techniques fail for lognormal fading channels: the lognormal PDF and all of its derivatives vanish at the origin, so its Taylor series there is identically zero, and the MGF admits no closed-form expression. Although the lognormal fading model has been widely applied in wireless communications and free-space optical communications, few analytical tools are available to provide elegant performance expressions for correlated lognormal channels. In this work, we propose a novel framework to analyze the asymptotic outage probabilities of selection combining (SC), equal-gain combining (EGC), and maximum-ratio combining (MRC) over equally correlated lognormal fading channels. Based on these closed-form results, we reveal the following: i) at large signal-to-noise ratio (SNR), the outage probability of EGC or MRC becomes infinitesimally small relative to that of SC; ii) channel correlation can result in an infinite performance loss at large SNR. More importantly, the analysis sheds light on the long-standing problem of high-SNR performance analysis over correlated lognormal channels and circumvents time-consuming Monte Carlo simulation and numerical integration.
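As a point of reference for the simulation cost that closed-form results eliminate, the sketch below estimates the three outage probabilities by Monte Carlo under one common parameterization of equally correlated lognormal branches: per-branch SNR gamma_i = gamma_bar * exp(sigma * G_i), where the G_i are equally correlated standard normal exponents built from a shared component. The function name, parameter values, and this SNR parameterization are illustrative assumptions, not the paper's notation or method.

```python
import numpy as np

# Minimal Monte Carlo baseline (an assumed setup, not the paper's framework)
# for outage probabilities of SC, EGC, and MRC over equally correlated
# lognormal branches.
def outage_mc(L=2, rho=0.5, sigma=1.0, gamma_bar=100.0, gamma_th=1.0,
              n_trials=1_000_000, seed=0):
    rng = np.random.default_rng(seed)
    # Equal correlation via a shared Gaussian component:
    # G_i = sqrt(rho)*Z0 + sqrt(1-rho)*Z_i gives corr(G_i, G_j) = rho, i != j.
    z0 = rng.standard_normal((n_trials, 1))
    zi = rng.standard_normal((n_trials, L))
    g = np.sqrt(rho) * z0 + np.sqrt(1.0 - rho) * zi
    # Assumed lognormal model: per-branch instantaneous SNR.
    gamma = gamma_bar * np.exp(sigma * g)
    # Standard combiner output SNRs.
    gamma_sc = gamma.max(axis=1)                     # selection combining
    gamma_mrc = gamma.sum(axis=1)                    # maximum-ratio combining
    gamma_egc = np.sqrt(gamma).sum(axis=1)**2 / L    # equal-gain combining
    # Outage probability: fraction of trials with output SNR below threshold.
    return {name: np.mean(out < gamma_th)
            for name, out in [("SC", gamma_sc),
                              ("EGC", gamma_egc),
                              ("MRC", gamma_mrc)]}

print(outage_mc())
```

Because deep outage events become rare at high SNR, such a simulation needs a very large number of trials for a stable estimate, which is precisely the cost the closed-form asymptotic expressions avoid.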