We establish a sharp moment comparison inequality between an arbitrary negative moment and the second moment for sums of independent uniform random variables, which extends Ball's cube slicing inequality.
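For orientation, a negative-to-second moment comparison of the kind described can be stated schematically as follows (the notation and the range of exponents here are ours, not taken from the abstract): for $U_1,\ldots,U_n$ i.i.d. uniform on $[-1,1]$ and a unit vector $a\in\mathbb{R}^n$,

```latex
\[
\Big(\mathbb{E}\Big|\sum_{i=1}^n a_i U_i\Big|^{-q}\Big)^{-1/q}
\;\ge\; c_q\,\Big(\mathbb{E}\Big|\sum_{i=1}^n a_i U_i\Big|^{2}\Big)^{1/2},
\qquad 0<q<1,
\]
```

with a sharp constant $c_q$ independent of $n$ and $a$; Ball's cube slicing inequality, which bounds hyperplane sections of the cube, arises from a limiting case of such negative-moment bounds.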
We prove Khinchin-type inequalities with sharp constants for type L random variables and all even moments. Our main tool is Hadamard's factorisation theorem from complex analysis, combined with Newton's inequalities for elementary symmetric functions. Besides the case of independent summands, we also treat ferromagnetic dependencies in a nonnegative external magnetic field (thanks to Newman's generalisation of the Lee-Yang theorem). Lastly, we compare the notions of type L, ultra sub-Gaussianity (introduced by Nayar and Oleszkiewicz) and strong log-concavity (introduced by Gurvits), with the latter two being equivalent.
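An even-moment Khinchin-type inequality of the kind described has the shape (in our notation, not the abstract's): for a sum $S$ of weighted summands as above,

```latex
\[
\mathbb{E}\,S^{2m} \;\le\; C_{2m}\,\big(\mathbb{E}\,S^{2}\big)^{m},
\qquad m=1,2,\ldots,
\]
```

where the benchmark for the sharp constant is the Gaussian value $\mathbb{E}\,Z^{2m}=(2m-1)!!$ for a standard Gaussian $Z$ with $\mathbb{E}\,Z^2=1$; whether this Gaussian value is the extremal constant in each regime treated by the paper is not specified in the abstract.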
We prove a quantitative form of the celebrated Ball's theorem on cube slicing in $\mathbb{R}^n$ and obtain, as a consequence, equality cases in the min-entropy power inequality. Independently, we also give a quantitative form of Khintchine's inequality in the special case $p=1$.
We establish several optimal moment comparison inequalities (Khinchin-type inequalities) for weighted sums of independent identically distributed symmetric discrete random variables which are uniform on sets of consecutive integers. Specifically, we obtain sharp constants for the second moment and any moment of order at least 3 (using convex dominance by Gaussian random variables). In the case of only 3 atoms, we also establish a Schur-convexity result. For moments of order less than 2, we get sharp constants in two cases by exploiting Haagerup's arguments for random signs.
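Schematically, the sharp comparison for moments of order $p\ge 3$ described above takes the following form (notation ours): for $X_1,\ldots,X_n$ i.i.d. symmetric and uniform on a set of consecutive integers, and weights $a\in\mathbb{R}^n$,

```latex
\[
\Big(\mathbb{E}\Big|\sum_{i=1}^n a_i X_i\Big|^{p}\Big)^{1/p}
\;\le\; c_p\,\Big(\mathbb{E}\Big|\sum_{i=1}^n a_i X_i\Big|^{2}\Big)^{1/2},
\qquad p\ge 3,
\]
```

where, via the Gaussian convex-dominance argument mentioned in the abstract, the natural candidate for the sharp constant is the Gaussian moment ratio $c_p=(\mathbb{E}|Z|^p)^{1/p}/(\mathbb{E}|Z|^2)^{1/2}$ for a standard Gaussian $Z$.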
In [A dozen de Finetti-style results in search of a theory, Ann. Inst. H. Poincaré Probab. Statist. 23(2) (1987), 397--423], Diaconis and Freedman studied low-dimensional projections of random vectors from the Euclidean unit sphere and the simplex in high dimensions, noting that the individual coordinates of these random vectors look like Gaussian and exponential random variables, respectively. In subsequent works, Rachev and Rüschendorf and Naor and Romik unified these results by establishing a connection between $\ell_p^N$ balls and a $p$-generalized Gaussian distribution. In this paper, we study similar questions in a significantly generalized and unifying setting, looking at low-dimensional projections of random vectors uniformly distributed on sets of the form \[B_{\phi,t}^N := \Big\{(s_1,\ldots,s_N)\in\mathbb{R}^N : \sum_{i=1}^N \phi(s_i)\leq t N\Big\},\] where $\phi:\mathbb{R}\to[0,\infty]$ is a potential (including the case of Orlicz functions). Our method is different from both Rachev-Rüschendorf and Naor-Romik, based on a large deviation perspective in the form of quantitati
Mixtures are convex combinations of laws. Despite this simple definition, a mixture can be far more subtle than its mixed components. For instance, mixing Gaussian laws may produce a potential with multiple deep wells. We study in the present work fine properties of mixtures with respect to concentration of measure and Sobolev-type functional inequalities. We provide sharp Laplace bounds for Lipschitz functions in the case of generic mixtures, involving a transportation cost diameter of the mixed family. Additionally, our analysis of Sobolev-type inequalities for two-component mixtures reveals natural relations with a kind of band isoperimetry and support-constrained interpolation via mass transportation. We show that the Poincaré constant of a two-component mixture may remain bounded as the mixture proportion goes to 0 or 1, while the logarithmic Sobolev constant may surprisingly blow up. This counter-intuitive result is not reducible to support disconnections, and appears to be reminiscent of the variance-entropy comparison on the two-point space. As far as mixtures are concerned, the logarithmic Sobolev inequality is less stable than the Poincaré inequality and the sub-Gaussian concentration for Lipschitz functions. We illustrate our results on a gallery of concrete two-component mixtures. This work leads to many open questions.