Let $X$ and $Y$ be dependent random variables. This paper considers the problem of designing a scalar quantizer for $Y$ that maximizes the mutual information between the quantizer's output and $X$, and develops fundamental properties of and bounds for this form of quantization, which is connected to the log-loss distortion criterion. The main focus is the regime of low $I(X;Y)$, where it is shown that, if $X$ is binary, a constant fraction of the mutual information can always be preserved using $\mathcal{O}(\log(1/I(X;Y)))$ quantization levels, and that there exist distributions for which this many quantization levels are necessary. Furthermore, for larger finite alphabets $2 < |\mathcal{X}| < \infty$, it is established that an $\eta$-fraction of the mutual information can be preserved using roughly $(\log(|\mathcal{X}|/I(X;Y)))^{\eta \cdot (|\mathcal{X}|-1)}$ quantization levels.
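For a rough sense of scale in the binary case (an illustrative plug-in only, not a result stated above; the implied constant in the $\mathcal{O}(\cdot)$ bound is left unspecified and the choice $I(X;Y) = 10^{-3}$ bits is an assumption for the example): if $I(X;Y) = 10^{-3}$ bits, then
\[
\log_2 \frac{1}{I(X;Y)} = \log_2 10^{3} \approx 10,
\]
so on the order of ten quantization levels already suffice to retain a constant fraction of the mutual information, even though $Y$ itself may take arbitrarily many values.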