We study the approximation of arbitrary distributions $P$ on $d$-dimensional space by distributions with log-concave density. Approximation means minimizing a Kullback--Leibler-type functional. We show that such an approximation exists if and only if $P$ has finite first moments and is not supported by a hyperplane. Furthermore, we show that this approximation depends continuously on $P$ with respect to the Mallows distance $D_1(\cdot,\cdot)$. This result implies consistency of the maximum likelihood estimator of a log-concave density under fairly general conditions. It also allows us to prove existence and consistency of estimators in regression models with a response $Y=\mu(X)+\epsilon$, where $X$ and $\epsilon$ are independent, $\mu(\cdot)$ belongs to a certain class of regression functions, and $\epsilon$ is a random error with log-concave density and mean zero.
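For orientation, the Kullback--Leibler-type functional can be sketched as follows; the notation here is ours, not the abstract's, and follows the standard formulation of the log-concave projection:

```latex
% Log-concave approximation of P as maximization of a log-likelihood
% functional over all log-concave densities f on R^d:
\psi^*(P) \;=\; \operatorname*{arg\,max}_{f \ \text{log-concave}}
  \int_{\mathbb{R}^d} \log f \, dP .
% When P itself has a density p, maximizing this functional is
% equivalent to minimizing the Kullback--Leibler divergence
%   D(p \,\|\, f) = \int p \log (p/f),
% which is why the approximation is called a KL-type projection.
```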
The log-concave projection is an operator that maps a $d$-dimensional distribution $P$ to an approximating log-concave density. Prior work by Dümbgen et al. (2011) establishes that, with suitable metrics on the underlying spaces, this projection is continuous, but not uniformly continuous. In this work we prove a local uniform continuity result for the log-concave projection; in particular, we establish that this map is locally Hölder-(1/4) continuous. A matching lower bound verifies that this exponent cannot be improved. We also examine the implications of this continuity result for the empirical setting: given a sample drawn from a distribution $P$, we bound the squared Hellinger distance between the log-concave projection of the empirical distribution of the sample and the log-concave projection of $P$. In particular, this yields interesting statistical results for the misspecified setting, where $P$ is not itself log-concave.
Log-concave distributions include important distributions such as the normal and exponential distributions. In this note, we show inequalities between two $L^p$-norms for log-concave distributions on Euclidean space. These inequalities generalize the upper and lower bounds on the differential entropy and can also be interpreted as an extension of the inequality between two $L^p$-norms on a measurable set of finite measure.
In this paper, we develop a new class of sampling schemes for estimating the parameters of binomial and Poisson distributions. Without any information about the unknown parameters, our sampling schemes rigorously guarantee prescribed levels of precision and confidence.
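The abstract's schemes are not specified here; as a rough point of comparison for what "prescribed precision and confidence" means, the following is a minimal fixed-sample-size sketch for the binomial case based on Hoeffding's inequality. The function name and default tolerances are ours, not the paper's, and the paper's schemes are presumably tighter than this crude bound.

```python
import math
import random

def fixed_size_binomial_estimate(draw, eps=0.05, delta=0.05):
    """Estimate a Bernoulli success probability p to within +/- eps,
    with confidence at least 1 - delta.

    `draw` is a zero-argument callable returning 0 or 1.
    Returns (p_hat, n), the estimate and the sample size used.
    """
    # Hoeffding's inequality: P(|p_hat - p| >= eps) <= 2 exp(-2 n eps^2).
    # Solving 2 exp(-2 n eps^2) <= delta gives the sufficient sample size:
    n = math.ceil(math.log(2.0 / delta) / (2.0 * eps ** 2))
    successes = sum(draw() for _ in range(n))
    return successes / n, n

if __name__ == "__main__":
    rng = random.Random(0)
    # Simulate coin flips with true p = 0.3 and estimate p.
    p_hat, n = fixed_size_binomial_estimate(lambda: int(rng.random() < 0.3))
    print(f"n = {n}, p_hat = {p_hat:.3f}")
```

Note that this sample size depends only on `eps` and `delta`, not on the unknown `p`, which mirrors the abstract's requirement that the guarantee hold without prior information about the parameter.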
We introduce new shape-constrained classes of distribution functions on $\mathbb{R}$, the bi-$s^*$-concave classes. In parallel to results of Dümbgen, Kolesnyk, and Wilke (2017) for what they called the class of bi-log-concave distribution functions, we show that every $s$-concave density $f$ has a bi-$s^*$-concave distribution function $F$ for $s^* \leq s/(s+1)$. Confidence bands building on existing nonparametric bands, but accounting for the shape constraint of bi-$s^*$-concavity, are also considered. The new bands extend those developed by Dümbgen et al. (2017) for the constraint of bi-log-concavity. We also make connections between bi-$s^*$-concavity and finiteness of the Csörgő--Révész constant of $F$, which plays an important role in the theory of quantile processes.
We introduce a new shape-constrained class of distribution functions on $\mathbb{R}$, the bi-$s^*$-concave class. In parallel to results of Dümbgen, Kolesnyk, and Wilke (2017) for what they called the class of bi-log-concave distribution functions, we show that every $s$-concave density $f$ has a bi-$s^*$-concave distribution function $F$, and that every bi-$s^*$-concave distribution function satisfies $\gamma(F) \le 1/(1+s)$, where finiteness of $$\gamma(F) \equiv \sup_{x} F(x)\,(1-F(x))\,\frac{|f'(x)|}{f^2(x)},$$ the Csörgő--Révész constant of $F$, plays an important role in the theory of quantile processes on $\mathbb{R}$.
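For readers unfamiliar with the shape constraint assumed in the two abstracts above, the standard (Borell-type) definition of $s$-concavity on $\mathbb{R}$ can be sketched as follows; the phrasing is ours, not the abstracts':

```latex
% A density f is s-concave if, for all x, y and lambda in (0,1),
f\bigl((1-\lambda)x + \lambda y\bigr) \;\ge\;
\begin{cases}
\bigl((1-\lambda)\, f(x)^s + \lambda\, f(y)^s\bigr)^{1/s}, & s \neq 0,\\[4pt]
f(x)^{1-\lambda}\, f(y)^{\lambda}, & s = 0,
\end{cases}
% i.e. f dominates the generalized s-mean of its values;
% the case s = 0 recovers log-concavity as the limit s -> 0.
```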