Approximation by log-concave distributions, with applications to regression


Abstract

We study the approximation of arbitrary distributions $P$ on $d$-dimensional space by distributions with log-concave density. Approximation means minimizing a Kullback--Leibler-type functional. We show that such an approximation exists if and only if $P$ has finite first moments and is not supported by a hyperplane. Furthermore, we show that this approximation depends continuously on $P$ with respect to the Mallows distance $D_1(\cdot,\cdot)$. This result implies consistency of the maximum likelihood estimator of a log-concave density under fairly general conditions. It also allows us to prove existence and consistency of estimators in regression models with response $Y = \mu(X) + \epsilon$, where $X$ and $\epsilon$ are independent, $\mu(\cdot)$ belongs to a certain class of regression functions, and $\epsilon$ is a random error with log-concave density and mean zero.
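For concreteness, one natural way to formalize the Kullback--Leibler-type criterion mentioned above (the exact functional is given in the paper, so the following display should be read as a sketch of its likely form): the log-concave approximation $f^*$ of $P$ maximizes a log-likelihood-type functional over all log-concave densities,
\[
  f^{*} \;\in\; \operatorname*{arg\,max}_{f \ \text{log-concave density}} \ \int_{\mathbb{R}^d} \log f(x)\, P(dx).
\]
When $P$ itself has a density $p$, maximizing this functional is equivalent to minimizing the Kullback--Leibler divergence $\int p(x) \log\bigl(p(x)/f(x)\bigr)\,dx$ over log-concave $f$, which explains the "Kullback--Leibler-type" terminology for general $P$.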
