
Concentration Inequalities for Ultra Log-Concave Distributions

Added by Arnaud Marsiglietti
Publication date: 2021
Language: English





We establish concentration inequalities in the class of ultra log-concave distributions. In particular, we show that ultra log-concave distributions satisfy Poisson concentration bounds. As an application, we derive concentration bounds for the intrinsic volumes of a convex body, which generalizes and improves a result of Lotz, McCoy, Nourdin, Peccati, and Tropp (2019).
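For context, recall the standard definition (our gloss, not quoted from the paper): a probability sequence $(p_k)_{k \ge 0}$ on the nonnegative integers is ultra log-concave when the sequence $(k!\, p_k)_{k \ge 0}$ is log-concave, that is,
\[
  p_k^2 \;\ge\; \frac{k+1}{k}\, p_{k-1}\, p_{k+1}, \qquad k \ge 1.
\]
"Poisson concentration bounds" refers to tail bounds of the shape enjoyed by a Poisson variable: for $Z \sim \mathrm{Poisson}(\lambda)$ and $t \ge 0$, the classical Chernoff argument gives
\[
  \mathbb{P}(Z \ge \lambda + t) \;\le\; e^{-\lambda}\!\left(\frac{e\lambda}{\lambda+t}\right)^{\lambda+t} \;\le\; \exp\!\left(-\frac{t^2}{2(\lambda+t)}\right).
\]
The exact constants and the two-sided form proved in the paper may differ from this classical benchmark.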

Related Research

Two-sided bounds are explored for concentration functions and Rényi entropies in the class of discrete log-concave probability distributions. They are used to derive certain variants of the entropy power inequalities.
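For reference, the standard notions involved (stated here as our gloss): a sequence $(p_k)$ is log-concave if
\[
  p_k^2 \;\ge\; p_{k-1}\, p_{k+1}, \qquad k \ge 1,
\]
the Rényi entropy of order $\alpha \ne 1$ is $H_\alpha(p) = \frac{1}{1-\alpha} \log \sum_k p_k^\alpha$ (with $H_1$ the Shannon entropy $-\sum_k p_k \log p_k$), and the concentration function of a random variable $X$ is $Q(X; h) = \sup_x \mathbb{P}(x \le X \le x + h)$.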
Tomohiro Nishiyama (2019)
Log-concave distributions include important distributions such as the normal and exponential distributions. In this note, we show inequalities between two Lp-norms for log-concave distributions on Euclidean space. These inequalities generalize the upper and lower bounds of the differential entropy and can also be interpreted as an extension of the inequality between two Lp-norms on a measurable set with finite measure.
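As a concrete illustration (our example, not the note's): for the log-concave exponential density $f(x) = \lambda e^{-\lambda x}$ on $[0, \infty)$, the Lp-norms are available in closed form,
\[
  \|f\|_p = \left(\int_0^\infty \lambda^p e^{-p\lambda x}\, \mathrm{d}x\right)^{1/p} = \lambda^{1 - 1/p}\, p^{-1/p},
\]
so $\|f\|_\infty = \lambda$ and any two norms $\|f\|_p$, $\|f\|_q$ are comparable up to an explicit factor depending only on $p$ and $q$ once the scale $\lambda$ is fixed. The note's inequalities quantify this kind of comparability for general log-concave densities.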
We investigate various geometric and functional inequalities for the class of log-concave probability sequences. We prove dilation inequalities for log-concave probability measures on the integers. A functional analog of this geometric inequality is derived, giving large and small deviation inequalities from a median in terms of a modulus of regularity. Our methods are of independent interest: we find that log-affine sequences are the extreme points of the set of log-concave sequences belonging to a half-space slice of the simplex. This amounts to a discrete analog of the localization lemma of Lovász and Simonovits. Further applications of this lemma are used to produce a discrete version of the Prékopa-Leindler inequality, to derive large deviation inequalities for log-concave measures about their mean, and to provide insight into the stability of generalized log-concavity under convolution.
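To fix terminology (standard usage, offered as our gloss): a log-affine sequence is a geometric progression supported on an integer interval,
\[
  p_k = C\, r^k, \qquad k \in \{a, a+1, \dots, b\},
\]
for constants $C > 0$ and $r > 0$; in the discrete localization lemma these play the role that exponential densities on segments play in the continuous lemma of Lovász and Simonovits.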
We prove that if $(P_x)_{x \in \mathscr{X}}$ is a family of probability measures which satisfy the log-Sobolev inequality and whose pairwise chi-squared divergences are uniformly bounded, and $\mu$ is any mixing distribution on $\mathscr{X}$, then the mixture $\int P_x \, \mathrm{d}\mu(x)$ satisfies a log-Sobolev inequality. In various settings of interest, the resulting log-Sobolev constant is dimension-free. In particular, our result implies a conjecture of Zimmermann and of Bardet et al. that Gaussian convolutions of measures with bounded support enjoy dimension-free log-Sobolev inequalities.
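For convenience, the two standard objects in this statement are (usual conventions; constants may be normalized differently elsewhere): a measure $\mu$ satisfies a log-Sobolev inequality with constant $C$ if for all smooth $f$,
\[
  \mathrm{Ent}_\mu(f^2) \;\le\; 2C \int |\nabla f|^2 \, \mathrm{d}\mu,
  \qquad \mathrm{Ent}_\mu(g) = \int g \log g \, \mathrm{d}\mu - \left(\int g \, \mathrm{d}\mu\right) \log \int g \, \mathrm{d}\mu,
\]
and the chi-squared divergence between $P$ and $Q$ is
\[
  \chi^2(P \,\|\, Q) = \int \left(\frac{\mathrm{d}P}{\mathrm{d}Q}\right)^2 \mathrm{d}Q \;-\; 1.
\]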
In this paper we show that the family $P_d$ of probability distributions on $\mathbb{R}^d$ with log-concave densities satisfies a strong continuity condition. In particular, it turns out that weak convergence within this family entails (i) convergence in total variation distance, (ii) convergence of arbitrary moments, and (iii) pointwise convergence of Laplace transforms. Hence the nonparametric model $P_d$ has properties similar to those of parametric models such as, for instance, the family of all $d$-variate Gaussian distributions.
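In symbols (our paraphrase of the stated implications): if $f_n, f$ are densities in $P_d$ and $\int g f_n \to \int g f$ for every bounded continuous $g$, then
\[
  \int_{\mathbb{R}^d} |f_n - f| \, \mathrm{d}x \to 0, \qquad
  \int |x|^k f_n \, \mathrm{d}x \to \int |x|^k f \, \mathrm{d}x \ \text{ for all } k,
\]
and the Laplace transforms $\int e^{\langle t, x \rangle} f_n(x) \, \mathrm{d}x$ converge pointwise to $\int e^{\langle t, x \rangle} f(x) \, \mathrm{d}x$ wherever the limit is finite.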