
New error bounds in multivariate normal approximations via exchangeable pairs with applications to Wishart matrices and fourth moment theorems

Published by: Xiao Fang
Publication date: 2020
Language: English





We extend Stein's celebrated Wasserstein bound for normal approximation via exchangeable pairs to the multi-dimensional setting. As an intermediate step, we exploit the symmetry of exchangeable pairs to obtain an error bound for smooth test functions. We also obtain a continuous version of the multi-dimensional Wasserstein bound in terms of fourth moments. We apply the main results to multivariate normal approximations to Wishart matrices of size $n$ and degree $d$, where we obtain the optimal convergence rate $\sqrt{n^3/d}$ under only moment assumptions, and to quadratic forms and Poisson functionals, where we strengthen a few of the fourth moment bounds in the literature on the Wasserstein distance.
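
As a rough numerical illustration of the Wishart regime above (a minimal numpy sketch, not the paper's method; the choices of $n$, $d$ and the number of repetitions are illustrative assumptions), one can compare a normalized Wishart matrix $(XX^\top - dI)/\sqrt{d}$ of size $n$ and degree $d$ with a GOE matrix as $d/n^3$ grows:

import numpy as np

rng = np.random.default_rng(0)

def normalized_wishart(n, d):
    # centered and scaled Wishart matrix (X X^T - d I) / sqrt(d), with X an n-by-d matrix of i.i.d. N(0,1) entries
    X = rng.standard_normal((n, d))
    return (X @ X.T - d * np.eye(n)) / np.sqrt(d)

def goe(n):
    # GOE matrix with off-diagonal variance 1 and diagonal variance 2: the d -> infinity limit
    A = rng.standard_normal((n, n))
    return (A + A.T) / np.sqrt(2)

n = 10
goe_top = np.mean([np.linalg.eigvalsh(goe(n))[-1] for _ in range(200)])
for d in [10**3, 10**4, 10**5]:  # d / n^3 = 1, 10, 100
    wis_top = np.mean([np.linalg.eigvalsh(normalized_wishart(n, d))[-1] for _ in range(200)])
    print(d, wis_top, goe_top)   # the largest-eigenvalue statistics agree as d / n^3 grows
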




Read also

Given a graph sequence $\{G_n\}_{n \geq 1}$, denote by $T_3(G_n)$ the number of monochromatic triangles in a uniformly random coloring of the vertices of $G_n$ with $c \geq 2$ colors. This arises as a generalization of the birthday paradox, where $G_n$ corresponds to a friendship network and $T_3(G_n)$ counts the number of triples of friends with matching birthdays. In this paper we prove a central limit theorem (CLT) for $T_3(G_n)$ with explicit error rates. The proof involves constructing a martingale difference sequence by carefully ordering the vertices of $G_n$, based on a certain combinatorial score function, and using a quantitative version of the martingale CLT. We then relate this error term to the well-known fourth moment phenomenon, which, interestingly, holds only when the number of colors $c \geq 5$. We also show that the convergence of the fourth moment is necessary to obtain a Gaussian limit for any $c \geq 2$, which, together with the above result, implies that the fourth-moment condition characterizes the limiting normal distribution of $T_3(G_n)$ whenever $c \geq 5$. Finally, to illustrate the promise of our approach, we include an alternative proof of the CLT for the number of monochromatic edges, which provides quantitative rates for the results obtained in Bhattacharya et al. (2017).
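
As a small empirical companion to this abstract (a minimal numpy sketch; the Erdős–Rényi test graph, the number of colors and the sample sizes are illustrative assumptions, not choices made in the paper), the statistic $T_3(G_n)$ can be simulated directly:

import numpy as np

rng = np.random.default_rng(1)

def monochromatic_triangles(A, colors):
    # count triangles whose three vertices share a color; A is a 0/1 symmetric adjacency matrix
    total = 0.0
    for c in np.unique(colors):
        idx = np.where(colors == c)[0]
        B = A[np.ix_(idx, idx)]
        total += np.trace(B @ B @ B) / 6.0   # tr(B^3)/6 counts triangles within the color class
    return int(round(total))

n, p, c = 200, 0.1, 5
A = (rng.random((n, n)) < p).astype(float)
A = np.triu(A, 1)
A = A + A.T                                   # simple graph: symmetric, no self-loops

samples = [monochromatic_triangles(A, rng.integers(0, c, size=n)) for _ in range(2000)]
print(np.mean(samples), np.std(samples))      # after centering and scaling, T_3 should look Gaussian
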
84 - Xiao Fang, Yuta Koike 2020
We prove the large-dimensional Gaussian approximation of a sum of $n$ independent random vectors in $\mathbb{R}^d$ together with fourth-moment error bounds on convex sets and Euclidean balls. We show that compared with classical third-moment bounds, our bounds have near-optimal dependence on $n$ and can achieve improved dependence on the dimension $d$. For centered balls, we obtain an additional error bound that has a sub-optimal dependence on $n$, but recovers the known result of the validity of the Gaussian approximation if and only if $d=o(n)$. We discuss an application to the bootstrap. We prove our main results using Stein's method.
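
A minimal numerical sketch of the setting above (illustrative only: the coordinate distribution, $n$, $d$, the radius and the number of repetitions are assumptions, and this is a Monte Carlo comparison rather than anything from the paper) compares the probability that a standardized sum lands in a Euclidean ball with its Gaussian counterpart:

import numpy as np

rng = np.random.default_rng(2)
n, d, reps = 200, 20, 1000
r = np.sqrt(d)                                # a ball radius of typical scale

def standardized_sum():
    # centered exponential coordinates: variance 1, nonzero third and fourth cumulants
    X = rng.exponential(size=(n, d)) - 1.0
    return X.sum(axis=0) / np.sqrt(n)

p_sum = np.mean([np.linalg.norm(standardized_sum()) <= r for _ in range(reps)])
p_gauss = np.mean(np.linalg.norm(rng.standard_normal((reps, d)), axis=1) <= r)
print(p_sum, p_gauss, abs(p_sum - p_gauss))   # a small gap indicates an accurate Gaussian approximation on the ball
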
We consider ensembles of real symmetric band matrices with entries drawn from an infinite sequence of exchangeable random variables, as far as the symmetry of the matrices permits. In general the entries of the upper triangular parts of these matrices are correlated and no smallness or sparseness of these correlations is assumed. It is shown that the eigenvalue distribution measures still converge to a semicircle but with random scaling. We also investigate the asymptotic behavior of the corresponding $\ell_2$-operator norms. The key to our analysis is a generalisation of a classic result by de Finetti that allows one to represent the underlying probability spaces as averages of Wigner band ensembles with entries that are not necessarily centred. Some of our results appear to be new even for such Wigner band matrices.
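
A minimal simulation sketch of this setting (illustrative assumptions throughout: the latent scale, the Gaussian conditional law, and the size/bandwidth are not taken from the paper) builds a symmetric band matrix whose entries are exchangeable via a de Finetti-type mixture and inspects its spectrum:

import numpy as np

rng = np.random.default_rng(3)
n, bandwidth = 2000, 100

sigma = rng.uniform(0.5, 2.0)                 # latent scale: entries are exchangeable but not independent
M = np.triu(rng.standard_normal((n, n))) * sigma
idx = np.arange(n)
M[np.abs(idx[:, None] - idx[None, :]) > bandwidth] = 0.0   # keep only the band
M = M + M.T - np.diag(np.diag(M))                           # symmetrize

eigs = np.linalg.eigvalsh(M) / np.sqrt(2 * bandwidth + 1)   # standard band-matrix scaling
print(sigma, eigs.min(), eigs.max())          # spectrum roughly a semicircle with a random, sigma-dependent radius
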
119 - R. Sharma, R. Kumar, R. Saini 2019
We derive some inequalities involving the first four central moments of discrete and continuous distributions. Bounds for the eigenvalues and spread of a matrix are obtained when all its eigenvalues are real. Likewise, we discuss bounds for the roots and span of a polynomial equation.
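
For context, one classical moment-based bound of this type (a quick numerical check; this is Popoviciu's inequality applied to the spectral distribution, not necessarily one of the new inequalities of the paper) states that the spread of a real symmetric matrix satisfies spread$(A) \geq 2\sqrt{m_2 - m_1^2}$ with $m_k = \mathrm{tr}(A^k)/n$:

import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((6, 6))
A = (A + A.T) / 2                             # real symmetric test matrix

n = A.shape[0]
m1 = np.trace(A) / n                          # first moment of the spectral distribution
m2 = np.trace(A @ A) / n                      # second moment of the spectral distribution
lower = 2 * np.sqrt(m2 - m1**2)               # Popoviciu-type lower bound on the spread

eigs = np.linalg.eigvalsh(A)
spread = eigs.max() - eigs.min()
print(lower, spread, lower <= spread + 1e-12) # the bound holds
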
Due to their importance in both data analysis and numerical algorithms, low rank approximations have recently been widely studied. They enable the handling of very large matrices. Tight error bounds for the computationally efficient Gaussian elimination based methods (skeleton approximations) are available. In practice, these bounds are useful for matrices with singular values which decrease quickly. Using the Chebyshev norm, this paper provides improved bounds for the errors of the matrix elements. These bounds are substantially better in the practically relevant cases where the eigenvalues decrease polynomially. Results are proven for general real rectangular matrices. Even stronger bounds are obtained for symmetric positive definite matrices. A simple example is given, comparing these new bounds to earlier ones.
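
Below is a minimal sketch of a Gaussian-elimination-based skeleton (cross) approximation with complete pivoting, the family of methods whose Chebyshev-norm (maximum-entry) errors are discussed above (the rank and the Hilbert test matrix are illustrative assumptions):

import numpy as np

def cross_approximation(A, rank):
    # adaptive cross approximation with complete pivoting; returns the skeleton approximation and the residual
    R = A.astype(float).copy()
    approx = np.zeros_like(R)
    for _ in range(rank):
        i, j = np.unravel_index(np.argmax(np.abs(R)), R.shape)   # largest remaining entry as pivot
        if R[i, j] == 0.0:
            break
        step = np.outer(R[:, j], R[i, :]) / R[i, j]              # rank-1 Gaussian elimination step
        approx += step
        R -= step
    return approx, R

# Hilbert matrix: its singular values decay quickly, the favorable case mentioned above.
n = 50
H = 1.0 / (np.arange(n)[:, None] + np.arange(n)[None, :] + 1.0)
approx, R = cross_approximation(H, rank=8)
print(np.max(np.abs(R)))                      # Chebyshev-norm error of the rank-8 skeleton approximation
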