
Dimension-free log-Sobolev inequalities for mixture distributions

 Added by Sinho Chewi
 Publication date 2021
Language: English





We prove that if $(P_x)_{x \in \mathscr{X}}$ is a family of probability measures which satisfy the log-Sobolev inequality and whose pairwise chi-squared divergences are uniformly bounded, and $\mu$ is any mixing distribution on $\mathscr{X}$, then the mixture $\int P_x \, \mathrm{d}\mu(x)$ satisfies a log-Sobolev inequality. In various settings of interest, the resulting log-Sobolev constant is dimension-free. In particular, our result implies a conjecture of Zimmermann and of Bardet et al. that Gaussian convolutions of measures with bounded support enjoy dimension-free log-Sobolev inequalities.
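As a concrete illustration of the bounded chi-squared condition (a numerical sketch, not taken from the paper): for Gaussians with a common covariance, the chi-squared divergence has the closed form $\chi^2(N(a, \sigma^2 I_d) \,\|\, N(b, \sigma^2 I_d)) = e^{\|a-b\|^2/\sigma^2} - 1$, which depends on the dimension only through $\|a - b\|$. So for a Gaussian convolution of a measure with support in a fixed ball, the pairwise divergences are uniformly bounded independently of $d$:

```python
import numpy as np

def chi2_gaussians(a, b, sigma=1.0):
    """Closed-form chi-squared divergence between N(a, sigma^2 I) and N(b, sigma^2 I)."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return np.exp(np.sum((a - b) ** 2) / sigma**2) - 1.0

def chi2_monte_carlo(a, b, sigma=1.0, n=200_000, seed=0):
    """Monte Carlo check: chi^2(P || Q) = E_P[p(X)/q(X)] - 1 with X ~ P = N(a, sigma^2 I)."""
    rng = np.random.default_rng(seed)
    a, b = np.asarray(a, float), np.asarray(b, float)
    x = a + sigma * rng.standard_normal((n, a.size))
    # log density ratio log p(x) - log q(x) for equal-covariance Gaussians
    log_ratio = (np.sum((x - b) ** 2, axis=1) - np.sum((x - a) ** 2, axis=1)) / (2 * sigma**2)
    return np.exp(log_ratio).mean() - 1.0

# The divergence depends on the centers only through |a - b|, not the ambient dimension:
for d in (1, 10, 100):
    a = np.zeros(d)
    b = np.zeros(d); b[0] = 0.5   # |a - b| = 0.5 in every dimension
    print(d, chi2_gaussians(a, b))  # identical value for all d
```

The Monte Carlo estimator is included only as a cross-check of the closed form; the dimension-free behavior is exactly what makes the uniform chi-squared hypothesis verifiable in the Gaussian-convolution setting.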



Related research

We establish concentration inequalities in the class of ultra log-concave distributions. In particular, we show that ultra log-concave distributions satisfy Poisson concentration bounds. As an application, we derive concentration bounds for the intrinsic volumes of a convex body, which generalizes and improves a result of Lotz, McCoy, Nourdin, Peccati, and Tropp (2019).
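The Poisson concentration phenomenon can be sanity-checked numerically (a sketch using the standard Bernstein-form Poisson tail bound, not necessarily the paper's exact statement): binomial laws are ultra log-concave, and the binomial moment generating function is dominated by that of the Poisson with the same mean, so the bound $P(X \geq \lambda + t) \leq \exp(-t^2 / (2(\lambda + t/3)))$ applies to them as well.

```python
import math

def binom_upper_tail(n, p, k):
    """Exact P(X >= k) for X ~ Binomial(n, p)."""
    return sum(math.comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k, n + 1))

def poisson_bernstein_bound(lam, t):
    """Bernstein-form Poisson tail bound: P(X >= lam + t) <= exp(-t^2 / (2(lam + t/3)))."""
    return math.exp(-t * t / (2 * (lam + t / 3)))

n, p = 100, 0.2
lam = n * p  # mean 20; the Binomial(n, p) MGF is dominated by the Poisson(lam) MGF
for t in (5, 10, 15, 20):
    tail = binom_upper_tail(n, p, math.ceil(lam + t))
    bound = poisson_bernstein_bound(lam, t)
    print(t, tail, bound, tail <= bound)
```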
In his work on hypocoercivity, Villani [18] considers in particular convergence to equilibrium for the kinetic Langevin process. While his convergence results in $L^2$ are given in a quite general setting, convergence in entropy requires a boundedness condition on the Hessian of the Hamiltonian. We show here how to remove this assumption in the study of the hypocoercive entropic relaxation to equilibrium for the Langevin diffusion. Our method relies on a generalization to entropy of the multipliers method and an adequate functional inequality. As a byproduct, we also give tractable conditions for this functional inequality, a particular instance of a weighted logarithmic Sobolev inequality, to hold.
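As a minimal numerical companion (an Euler-Maruyama sketch with step-size bias, not the entropy machinery of the paper): simulating the kinetic Langevin diffusion $\mathrm{d}X = V\,\mathrm{d}t$, $\mathrm{d}V = -\nabla H(X)\,\mathrm{d}t - \gamma V\,\mathrm{d}t + \sqrt{2\gamma}\,\mathrm{d}W$ for the quadratic Hamiltonian $H(x) = x^2/2$ shows the empirical law relaxing to the Gibbs equilibrium, under which $X$ and $V$ are independent standard Gaussians.

```python
import numpy as np

def kinetic_langevin(n_particles=20_000, dt=0.01, n_steps=2_000, gamma=1.0, seed=0):
    """Euler-Maruyama discretization of the kinetic Langevin diffusion with H(x) = x^2/2.
    At equilibrium, X and V are independent standard Gaussians."""
    rng = np.random.default_rng(seed)
    x = np.full(n_particles, 3.0)   # start far from equilibrium
    v = np.zeros(n_particles)
    for _ in range(n_steps):
        noise = rng.standard_normal(n_particles)
        # simultaneous update: both right-hand sides use the current (x, v)
        x, v = (x + dt * v,
                v + dt * (-x - gamma * v) + np.sqrt(2 * gamma * dt) * noise)
    return x, v

x, v = kinetic_langevin()
print(x.mean(), x.var(), v.mean(), v.var())  # means near 0, variances near 1
```

The initial condition is deterministic and far from equilibrium; the printed moments confirm relaxation to the Gibbs measure up to Monte Carlo and discretization error.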
A central tool in the study of nonhomogeneous random matrices, the noncommutative Khintchine inequality of Lust-Piquard and Pisier, yields a nonasymptotic bound on the spectral norm of general Gaussian random matrices $X = \sum_i g_i A_i$ where $g_i$ are independent standard Gaussian variables and $A_i$ are matrix coefficients. This bound exhibits a logarithmic dependence on dimension that is sharp when the matrices $A_i$ commute, but often proves to be suboptimal in the presence of noncommutativity. In this paper, we develop nonasymptotic bounds on the spectrum of arbitrary Gaussian random matrices that can capture noncommutativity. These bounds quantify the degree to which the deterministic matrices $A_i$ behave as though they are freely independent. This intrinsic freeness phenomenon provides a powerful tool for the study of various questions that are outside the reach of classical methods of random matrix theory. Our nonasymptotic bounds are easily applicable in concrete situations, and yield sharp results in examples where the noncommutative Khintchine inequality is suboptimal. When combined with a linearization argument, our bounds imply strong asymptotic freeness (in the sense of Haagerup-Thorbjørnsen) for a remarkably general class of Gaussian random matrix models, including matrices that may be very sparse and that lack any special symmetries. Beyond the Gaussian setting, we develop matrix concentration inequalities that capture noncommutativity for general sums of independent random matrices, which arise in many problems of pure and applied mathematics.
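The gap between the logarithmic-in-dimension bound and the free-probability prediction is easy to see numerically (a sketch; the $\sigma\sqrt{2\log 2d}$ expression below is the standard matrix-Gaussian tail bound, used here as a stand-in for the noncommutative Khintchine bound): a $d \times d$ GOE-type Wigner matrix is a Gaussian series $X = \sum_i g_i A_i$ with elementary symmetric coefficient matrices and $\sigma = \| \sum_i A_i^2 \|^{1/2} = \sqrt{d}$, and its observed spectral norm sits near the free prediction $2\sigma$, well below $\sigma\sqrt{2\log 2d}$.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 300

# A GOE-type Wigner matrix is a Gaussian series X = sum_i g_i A_i with the A_i
# elementary symmetric matrices; here we sample X directly.
g = rng.standard_normal((d, d))
x = np.triu(g) + np.triu(g, 1).T          # symmetric, i.i.d. N(0,1) entries up to symmetry
norm = np.linalg.norm(x, 2)               # spectral norm (largest singular value)

# sigma^2 = || sum_i A_i^2 || = d for this model: each diagonal entry of E[X^2] is d
sigma = np.sqrt(d)
nck_bound = sigma * np.sqrt(2 * np.log(2 * d))  # standard matrix-Gaussian norm bound
free_prediction = 2 * sigma                     # free-probability prediction for Wigner

print(norm, free_prediction, nck_bound)
```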
First, we derive in dimension one a new covariance inequality of $L_1$-$L_\infty$ type that characterizes the isoperimetric constant as the best constant achieving the inequality. Second, we generalize our result to $L_p$-$L_q$ bounds for the covariance. Consequently, we recover Cheeger's inequality without using the co-area formula. We also prove a generalized weighted Hardy-type inequality that is needed to derive our covariance inequalities and that is of independent interest. Finally, we explore some consequences of our covariance inequalities for $L_p$-Poincaré inequalities and moment bounds. In particular, we obtain optimal constants in general $L_p$-Poincaré inequalities for measures with finite isoperimetric constant, thus generalizing in dimension one Cheeger's inequality, which is an $L_p$-Poincaré inequality for $p = 2$, to any real $p \geq 1$.
Djalil Chafaï, 2009
Mixtures are convex combinations of laws. Despite this simple definition, a mixture can be far more subtle than its mixed components. For instance, mixing Gaussian laws may produce a potential with multiple deep wells. We study in the present work fine properties of mixtures with respect to concentration of measure and Sobolev type functional inequalities. We provide sharp Laplace bounds for Lipschitz functions in the case of generic mixtures, involving a transportation cost diameter of the mixed family. Additionally, our analysis of Sobolev type inequalities for two-component mixtures reveals natural relations with some kind of band isoperimetry and support constrained interpolation via mass transportation. We show that the Poincaré constant of a two-component mixture may remain bounded as the mixture proportion goes to 0 or 1 while the logarithmic Sobolev constant may surprisingly blow up. This counter-intuitive result is not reducible to support disconnections, and appears as a reminiscence of the variance-entropy comparison on the two-point space. As far as mixtures are concerned, the logarithmic Sobolev inequality is less stable than the Poincaré inequality and the sub-Gaussian concentration for Lipschitz functions. We illustrate our results on a gallery of concrete two-component mixtures. This work leads to many open questions.
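The variance-entropy comparison on the two-point space can be reproduced directly (a sketch under standard conventions, not necessarily the paper's normalization): for the Bernoulli measure $\mu = (p, 1-p)$ with Dirichlet form $\mathcal{E}(f,f) = p(1-p)(f(1)-f(0))^2$, the Poincaré ratio $\mathrm{Var}_\mu(f)/\mathcal{E}(f,f)$ equals $1$ for every $f$, while the entropy ratio $\mathrm{Ent}_\mu(f^2)/\mathcal{E}(f,f)$ blows up like $\log(1/p)$ as $p \to 0$.

```python
import numpy as np

def entropy_ratio_sup(p, ts=np.logspace(-3, 3, 4001)):
    """sup_f Ent_mu(f^2) / E(f,f) on the two-point space mu = (p, 1-p),
    with Dirichlet form E(f,f) = p(1-p)(f(1)-f(0))^2.
    By homogeneity it suffices to scan f = (1, t)."""
    t2 = ts**2
    m = p + (1 - p) * t2                              # E_mu[f^2]
    ent = (1 - p) * t2 * np.log(t2) - m * np.log(m)   # the p * 1 * log(1) term vanishes
    dirichlet = p * (1 - p) * (ts - 1) ** 2
    ok = dirichlet > 0                                # exclude constant f (t = 1)
    return np.max(ent[ok] / dirichlet[ok])

# Var_mu(f) = p(1-p)(f(1)-f(0))^2, so the Poincare ratio is identically 1,
# but the entropy ratio grows without bound as p -> 0:
for p in (0.1, 0.01, 0.001):
    print(p, entropy_ratio_sup(p))
```

The maximizer concentrates near $f = (1, 0)$, where the ratio approaches $\log(1/p)/(1-p)$; this is exactly the mechanism behind the blow-up of the logarithmic Sobolev constant for vanishing mixture proportions.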