
Some probability inequalities for multivariate gamma and normal distributions

Posted by: Thomas Royen
Publication date: 2015
Language: English
Author: Thomas Royen





The Gaussian correlation inequality for multivariate zero-mean normal probabilities of symmetrical n-rectangles can be considered as an inequality for multivariate gamma distributions (in the sense of Krishnamoorthy and Parthasarathy [5]) with one degree of freedom. Its generalization to all integer degrees of freedom and sufficiently large non-integer degrees of freedom was recently proved in [10]. Here, this inequality is partly extended to smaller non-integer degrees of freedom and in particular - in a weaker form - to all infinitely divisible multivariate gamma distributions. A further monotonicity property - sometimes called more PLOD (positively lower orthant dependent) - for increasing correlations is proved for multivariate gamma distributions with integer or sufficiently large degrees of freedom.
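For context, the rectangle form of the inequality and its gamma reformulation can be sketched as follows (a standard restatement, not quoted from the abstract): for a centered normal vector $X=(X_1,\dots,X_n)$, thresholds $t_i>0$ and any split $1\le k<n$,

$$P(|X_1|\le t_1,\dots,|X_n|\le t_n) \;\ge\; P(|X_1|\le t_1,\dots,|X_k|\le t_k)\, P(|X_{k+1}|\le t_{k+1},\dots,|X_n|\le t_n).$$

Since $\{|X_i|\le t_i\}=\{X_i^2\le t_i^2\}$ and $(X_1^2,\dots,X_n^2)$ has a Krishnamoorthy-Parthasarathy multivariate gamma distribution with one degree of freedom, the inequality says that the joint gamma distribution function of the two sub-vectors is bounded below by the product of the corresponding marginal distribution functions; the question addressed here is for which further degrees of freedom this factorization inequality persists.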




Read also

We employ stabilization methods and second order Poincaré inequalities to establish rates of multivariate normal convergence for a large class of vectors $(H_s^{(1)},\dots,H_s^{(m)})$, $s \geq 1$, of statistics of marked Poisson processes on $\mathbb{R}^d$, $d \geq 2$, as the intensity parameter $s$ tends to infinity. Our results are applicable whenever the constituent functionals $H_s^{(i)}$, $i\in\{1,\dots,m\}$, are expressible as sums of exponentially stabilizing score functions satisfying a moment condition. The rates are for the $d_2$-, $d_3$-, and $d_{convex}$-distances. When we compare with a centered Gaussian random vector, whose covariance matrix is given by the asymptotic covariances, the rates are in general unimprovable and are governed by the rate of convergence of $s^{-1} \mathrm{Cov}(H_s^{(i)}, H_s^{(j)})$, $i,j\in\{1,\dots,m\}$, to the limiting covariance, shown to be of order $s^{-1/d}$. We use the general results to deduce rates of multivariate normal convergence for statistics arising in random graphs and topological data analysis as well as for multivariate statistics used to test equality of distributions. Some of our results hold for stabilizing functionals of Poisson input on suitable metric spaces.
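For reference, the $d_{convex}$-distance mentioned here is commonly defined (our formulation, not the paper's wording) as
$$d_{convex}(X,N) \;=\; \sup_{K\subset\mathbb{R}^m\ \text{convex}} \bigl|P(X\in K)-P(N\in K)\bigr|,$$
while $d_2$ and $d_3$ take the supremum over suitably smooth test functions with bounded derivatives rather than over indicators of convex sets.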
A central tool in the study of nonhomogeneous random matrices, the noncommutative Khintchine inequality of Lust-Piquard and Pisier, yields a nonasymptotic bound on the spectral norm of general Gaussian random matrices $X=\sum_i g_i A_i$ where $g_i$ are independent standard Gaussian variables and $A_i$ are matrix coefficients. This bound exhibits a logarithmic dependence on dimension that is sharp when the matrices $A_i$ commute, but often proves to be suboptimal in the presence of noncommutativity. In this paper, we develop nonasymptotic bounds on the spectrum of arbitrary Gaussian random matrices that can capture noncommutativity. These bounds quantify the degree to which the deterministic matrices $A_i$ behave as though they are freely independent. This intrinsic freeness phenomenon provides a powerful tool for the study of various questions that are outside the reach of classical methods of random matrix theory. Our nonasymptotic bounds are easily applicable in concrete situations, and yield sharp results in examples where the noncommutative Khintchine inequality is suboptimal. When combined with a linearization argument, our bounds imply strong asymptotic freeness (in the sense of Haagerup-Thorbjørnsen) for a remarkably general class of Gaussian random matrix models, including matrices that may be very sparse and that lack any special symmetries. Beyond the Gaussian setting, we develop matrix concentration inequalities that capture noncommutativity for general sums of independent random matrices, which arise in many problems of pure and applied mathematics.
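The logarithmic dependence referred to above is the $\sqrt{\log d}$ factor in the classical noncommutative Khintchine bound, which for a self-adjoint $d$-dimensional Gaussian matrix $X=\sum_i g_i A_i$ reads (up to a universal constant, stated here for orientation rather than taken from the abstract)
$$\mathbb{E}\,\|X\| \;\lesssim\; \sigma\sqrt{\log d},\qquad \sigma=\Bigl\|\sum_i A_i^2\Bigr\|^{1/2};$$
it is this factor that the intrinsic-freeness bounds aim to remove when the $A_i$ are far from commuting.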
We investigate various geometric and functional inequalities for the class of log-concave probability sequences. We prove dilation inequalities for log-concave probability measures on the integers. A functional analog of this geometric inequality is derived, giving large and small deviation inequalities from a median in terms of a modulus of regularity. Our methods are of independent interest: we find that log-affine sequences are the extreme points of the set of log-concave sequences belonging to a half-space slice of the simplex. This amounts to a discrete analog of the localization lemma of Lovász and Simonovits. Further applications of this lemma are used to produce a discrete version of the Prékopa-Leindler inequality, to derive large deviation inequalities for log-concave measures about their mean, and to provide insight into the stability of generalized log-concavity under convolution.
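For concreteness, a probability sequence $(p_k)_{k\ge 0}$ on the integers is log-concave if $p_k^2 \ge p_{k-1}p_{k+1}$ for all $k$, and it is log-affine when equality holds throughout its support (i.e., the sequence is geometric there); the localization statement above identifies these log-affine sequences as the extreme points of the half-space slices considered.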
Given a vector $F=(F_1,\dots,F_m)$ of Poisson functionals $F_1,\dots,F_m$, we investigate the proximity between $F$ and an $m$-dimensional centered Gaussian random vector $N_\Sigma$ with covariance matrix $\Sigma\in\mathbb{R}^{m\times m}$. Apart from finding proximity bounds for the $d_2$- and $d_3$-distances, based on classes of smooth test functions, we obtain proximity bounds for the $d_{convex}$-distance, based on the less tractable test functions comprised of indicators of convex sets. The bounds for all three distances are shown to be of the same order, which is presumably optimal. The bounds are multivariate counterparts of the univariate second order Poincaré inequalities and, as such, are expressed in terms of integrated moments of first and second order difference operators. The derived second order Poincaré inequalities for indicators of convex sets are made possible by a new bound on the second derivatives of the solution to the Stein equation for the multivariate normal distribution. We present applications to the multivariate normal approximation of first order Poisson integrals and of statistics of Boolean models.
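The difference operators entering these bounds are, in the usual Poisson Malliavin calculus notation (stated here for orientation), the add-one costs
$$D_xF(\eta)=F(\eta+\delta_x)-F(\eta),\qquad D^2_{x,y}F(\eta)=F(\eta+\delta_x+\delta_y)-F(\eta+\delta_x)-F(\eta+\delta_y)+F(\eta),$$
whose integrated moments control the distance between $F$ and $N_\Sigma$.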
We establish concentration inequalities in the class of ultra log-concave distributions. In particular, we show that ultra log-concave distributions satisfy Poisson concentration bounds. As an application, we derive concentration bounds for the intrinsic volumes of a convex body, which generalizes and improves a result of Lotz, McCoy, Nourdin, Peccati, and Tropp (2019).
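Here a probability sequence $(p_k)_{k\ge 0}$ on the non-negative integers is ultra log-concave if $k\,p_k^2 \ge (k+1)\,p_{k+1}p_{k-1}$ for all $k\ge 1$; equivalently, $p_k$ divided by the Poisson weight $e^{-\lambda}\lambda^k/k!$ forms a log-concave sequence for any fixed $\lambda>0$, which is why the Poisson distribution is the natural benchmark for this class.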
