A lower bound on the differential entropy of log-concave random vectors with applications


Abstract

We derive a lower bound on the differential entropy of a log-concave random variable $X$ in terms of the $p$-th absolute moment of $X$. The new bound leads to a reverse entropy power inequality with an explicit constant, and to new bounds on the rate-distortion function and the channel capacity. Specifically, we study the rate-distortion function for log-concave sources and the distortion measure $|x - \hat{x}|^r$, and we establish that the difference between the rate-distortion function and the Shannon lower bound is at most $\log(\sqrt{\pi e}) \approx 1.5$ bits, independently of $r$ and the target distortion $d$. For mean-square error distortion, the difference is at most $\log(\sqrt{\frac{\pi e}{2}}) \approx 1$ bit, regardless of $d$. We also provide bounds on the capacity of memoryless additive noise channels when the noise is log-concave. We show that the difference between the capacity of such channels and the capacity of the Gaussian channel with the same noise power is at most $\log(\sqrt{\frac{\pi e}{2}}) \approx 1$ bit. Our results generalize to the case of a vector $X$ with possibly dependent coordinates, and to $\gamma$-concave random variables. Our proof technique leverages tools from convex geometry.
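As an illustrative numerical check (not part of the paper), the minimal Python sketch below evaluates the two universal gap constants stated above in bits, taking $\log$ to be base 2 as the bit units imply. For context, for mean-square error distortion the Shannon lower bound takes the familiar form $R_{\mathrm{SLB}}(d) = h(X) - \frac{1}{2}\log(2\pi e d)$.

```python
import math

# Illustrative check of the constants quoted in the abstract (in bits).
# Gap between R(d) and the Shannon lower bound for |x - x_hat|^r distortion,
# uniformly in r and in the target distortion d:
gap_general = math.log2(math.sqrt(math.pi * math.e))      # ~1.55 bits

# Gap for mean-square error distortion; the same constant bounds the gap
# between the log-concave-noise channel capacity and the capacity of the
# Gaussian channel with the same noise power:
gap_mse = math.log2(math.sqrt(math.pi * math.e / 2))      # ~1.05 bits

print(f"general r-th power distortion gap: {gap_general:.3f} bits")
print(f"MSE distortion / capacity gap:     {gap_mse:.3f} bits")
```

Running the sketch confirms the rounded values quoted in the abstract: roughly 1.5 bits for general $r$-th power distortion and roughly 1 bit for the mean-square error and capacity results.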
