We study functional inequalities (Poincaré, Cheeger, log-Sobolev) for probability measures obtained as perturbations. Several explicit results are given, both for general measures and for log-concave distributions. The initial goal of this work was to obtain explicit bounds on the constants, with statistical applications in view. These results are then applied to the Langevin Monte Carlo method, used in statistics to compute Bayesian estimators.
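The Langevin Monte Carlo method mentioned above can be sketched in a few lines. The following is a minimal illustration of the unadjusted Langevin algorithm, assuming a standard Gaussian target (so the log-density gradient is simply -x); the function names and parameters are chosen here for illustration, not taken from the paper.

```python
import numpy as np

def grad_log_density(x):
    # Gradient of the log-density of a standard Gaussian target
    # (assumed for this sketch: V(x) = x^2/2, so grad log pi(x) = -x).
    return -x

def langevin_monte_carlo(grad_log_pi, x0, step, n_steps, rng):
    """Unadjusted Langevin iteration:
    x_{k+1} = x_k + step * grad_log_pi(x_k) + sqrt(2*step) * N(0, 1)."""
    x = x0
    samples = np.empty(n_steps)
    for k in range(n_steps):
        x = x + step * grad_log_pi(x) + np.sqrt(2.0 * step) * rng.standard_normal()
        samples[k] = x
    return samples

rng = np.random.default_rng(0)
samples = langevin_monte_carlo(grad_log_density, x0=3.0, step=0.05,
                               n_steps=20000, rng=rng)
# A Bayesian point estimate (posterior mean) from the chain, after burn-in.
posterior_mean = samples[5000:].mean()
```

For a log-concave target, functional inequalities of the kind studied here (Poincaré, log-Sobolev) are exactly what controls how fast such a chain equilibrates and hence how many iterations a given accuracy requires.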
We investigate various geometric and functional inequalities for the class of log-concave probability sequences. We prove dilation inequalities for log-concave probability measures on the integers. A functional analog of this geometric inequality is derived, giving large and small deviation inequalities from a median in terms of a modulus of regularity. Our methods are of independent interest: we find that log-affine sequences are the extreme points of the set of log-concave sequences belonging to a half-space slice of the simplex. This amounts to a discrete analog of the localization lemma of Lovász and Simonovits. Further applications of this lemma produce a discrete version of the Prékopa-Leindler inequality and large deviation inequalities for log-concave measures about their mean, and provide insight into the stability of generalized log-concavity under convolution.
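The discrete log-concavity condition underlying the abstract above, and its claimed stability under convolution, are easy to check numerically. The following is a small sketch (helper names are our own, not from the paper), using the standard condition a_k^2 >= a_{k-1} a_{k+1} and a binomial row as the test sequence.

```python
import numpy as np
from math import comb

def is_log_concave(seq, tol=1e-9):
    """Check the discrete log-concavity condition
    a_k^2 >= a_{k-1} * a_{k+1} for a nonnegative sequence
    ('tol' absorbs floating-point rounding)."""
    a = np.asarray(seq, dtype=float)
    return all(a[k] ** 2 + tol >= a[k - 1] * a[k + 1]
               for k in range(1, len(a) - 1))

# Binomial coefficients form a log-concave sequence.
binom_row = [comb(6, k) for k in range(7)]
assert is_log_concave(binom_row)

# Convolving two log-concave sequences (here two binomial rows, giving a
# rescaled binomial row of order 12) preserves log-concavity, illustrating
# the stability-under-convolution phenomenon discussed in the abstract.
conv = np.convolve(binom_row, binom_row)
assert is_log_concave(conv)
```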
The goal of this paper is to push forward the study of those properties of log-concave measures that help to estimate their Poincaré constant. First we revisit E. Milman's result [40] on the link between weak (Poincaré or concentration) inequalities and Cheeger's inequality in the log-concave case, in particular extending localization ideas and a result of Latała, as well as providing a simpler proof of the nice Poincaré (dimensional) bound in the unconditional case. Then we prove alternative transference principles, either by concentration or via various distances (total variation, Wasserstein). A mollification procedure is also introduced, allowing one, in the log-concave case, to reduce to the Poincaré inequality for the mollified measure. We finally complete the transference section with a comparison of various probability metrics (Fortet-Mourier, bounded-Lipschitz, ...).
In this paper we show that the family P_d of probability distributions on R^d with log-concave densities satisfies a strong continuity condition. In particular, it turns out that weak convergence within this family entails (i) convergence in total variation distance, (ii) convergence of arbitrary moments, and (iii) pointwise convergence of Laplace transforms. Hence the nonparametric model P_d has properties similar to those of parametric models such as, for instance, the family of all d-variate Gaussian distributions.
We prove that moderate deviations for empirical measures of countable nonhomogeneous Markov chains hold under the assumption that the transition probability matrices converge uniformly in the Cesàro sense.
We establish concentration inequalities in the class of ultra log-concave distributions. In particular, we show that ultra log-concave distributions satisfy Poisson concentration bounds. As an application, we derive concentration bounds for the intrinsic volumes of a convex body, which generalizes and improves a result of Lotz, McCoy, Nourdin, Peccati, and Tropp (2019).
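The ultra log-concavity condition in the last abstract can also be verified directly. A standard formulation is k p(k)^2 >= (k+1) p(k-1) p(k+1), i.e. the pmf divided by the Poisson weights 1/k! is log-concave, with Poisson itself as the equality case. The sketch below (helper name ours) checks this for a binomial pmf and a truncated Poisson pmf.

```python
from math import comb, exp, factorial

def is_ultra_log_concave(p, tol=1e-15):
    """Check k * p[k]^2 >= (k+1) * p[k-1] * p[k+1], the discrete
    ultra log-concavity condition ('tol' absorbs rounding)."""
    return all(k * p[k] ** 2 + tol >= (k + 1) * p[k - 1] * p[k + 1]
               for k in range(1, len(p) - 1))

# Binomial distributions are ultra log-concave.
n, q = 10, 0.3
binom_pmf = [comb(n, k) * q ** k * (1 - q) ** (n - k) for k in range(n + 1)]
assert is_ultra_log_concave(binom_pmf)

# The Poisson pmf attains equality in the condition, consistent with the
# Poisson concentration bounds established for this class.
lam = 2.0
poisson_pmf = [exp(-lam) * lam ** k / factorial(k) for k in range(15)]
assert is_ultra_log_concave(poisson_pmf)
```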