We prove that if $(P_x)_{x \in \mathscr{X}}$ is a family of probability measures which satisfy the log-Sobolev inequality and whose pairwise chi-squared divergences are uniformly bounded, and $\mu$ is any mixing distribution on $\mathscr{X}$, then the mixture $\int P_x \, \mathrm{d}\mu(x)$ satisfies a log-Sobolev inequality. In various settings of interest, the resulting log-Sobolev constant is dimension-free. In particular, our result implies a conjecture of Zimmermann and of Bardet et al. that Gaussian convolutions of measures with bounded support enjoy dimension-free log-Sobolev inequalities.
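For reference, the log-Sobolev inequality invoked above can be stated as follows (a standard formulation; constant conventions vary across the literature, so the factor of $2$ here is one common normalization rather than the one necessarily used in the abstract): a probability measure $P$ satisfies a log-Sobolev inequality with constant $C$ if, for all sufficiently smooth $f$,
\[
\operatorname{Ent}_P(f^2) \;\le\; 2C \int |\nabla f|^2 \, \mathrm{d}P,
\qquad
\operatorname{Ent}_P(g) := \int g \log g \, \mathrm{d}P - \left(\int g \, \mathrm{d}P\right) \log \left(\int g \, \mathrm{d}P\right).
\]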
We establish concentration inequalities in the class of ultra log-concave distributions. In particular, we show that ultra log-concave distributions satisfy Poisson concentration bounds. As an application, we derive concentration bounds for the intri
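As context for the class appearing above (a standard definition, not taken from the abstract itself): a probability sequence $(p_k)_{k \ge 0}$ on the nonnegative integers is ultra log-concave if it is log-concave relative to the Poisson distribution; equivalently,
\[
k \, p_k^2 \;\ge\; (k+1) \, p_{k-1} \, p_{k+1} \qquad \text{for all } k \ge 1.
\]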
In his work on hypocoercivity, Villani [18] considers in particular convergence to equilibrium for the kinetic Langevin process. While his convergence results in $L^2$ are given in a quite general setting, convergence in entropy requires some bounded
A central tool in the study of nonhomogeneous random matrices, the noncommutative Khintchine inequality of Lust-Piquard and Pisier, yields a nonasymptotic bound on the spectral norm of general Gaussian random matrices $X=\sum_i g_i A_i$ where $g_i$ ar
Firstly, we derive in dimension one a new covariance inequality of $L_{1}-L_{\infty}$ type that characterizes the isoperimetric constant as the best constant achieving the inequality. Secondly, we generalize our result to $L_{p}-L_{q}$ bounds for the
Mixtures are convex combinations of laws. Despite this simple definition, a mixture can be far more subtle than its mixed components. For instance, mixing Gaussian laws may produce a potential with multiple deep wells. We study in the present work fi