We prove the large-dimensional Gaussian approximation of a sum of $n$ independent random vectors in $\mathbb{R}^d$ together with fourth-moment error bounds on convex sets and Euclidean balls. We show that, compared with classical third-moment bounds, our bounds have near-optimal dependence on $n$ and can achieve improved dependence on the dimension $d$. For centered balls, we obtain an additional error bound that has a sub-optimal dependence on $n$ but recovers the known result that the Gaussian approximation is valid if and only if $d=o(n)$. We discuss an application to the bootstrap. We prove our main results using Stein's method.
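For orientation, the quantity such approximation results typically control can be sketched as follows (illustrative notation, not taken from the abstract above; $S_n$ denotes the normalized sum, $Z$ the approximating Gaussian vector, and $\mathcal{A}$ the class of convex sets or Euclidean balls): $$\sup_{A\in\mathcal{A}}\Big|\mathbb{P}(S_n\in A)-\mathbb{P}(Z\in A)\Big|,\qquad S_n=\frac{1}{\sqrt{n}}\sum_{i=1}^{n}X_i,\quad Z\sim N(0,\Sigma),$$ with $\Sigma$ the covariance matrix of $S_n$; the stated fourth-moment bounds control how this supremum depends on $n$ and $d$.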
For probability measures on a complete separable metric space, we present sufficient conditions for the existence of a solution to the Kantorovich transportation problem. We also obtain sufficient conditions (which sometimes also become necessary) fo
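For reference, the Kantorovich transportation problem between probability measures $\mu$ and $\nu$ on a complete separable metric space $S$ with cost function $c$ is the minimization (standard formulation, stated here only to fix notation): $$\inf_{\pi\in\Pi(\mu,\nu)}\int_{S\times S}c(x,y)\,\mathrm{d}\pi(x,y),$$ where $\Pi(\mu,\nu)$ denotes the set of couplings of $\mu$ and $\nu$, i.e. probability measures on $S\times S$ whose marginals are $\mu$ and $\nu$; a solution is a coupling attaining the infimum.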
We describe a new framework of a sublinear expectation space and the related notions and results of distributions and independence. A new notion of G-distributions is introduced, which generalizes our G-normal-distribution in the sense that mean-uncertainty can also be described.
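As background (a standard definition in this framework, not quoted from the abstract), a sublinear expectation $\hat{\mathbb{E}}$ is a real-valued functional on a linear space of random variables satisfying, for all $X$, $Y$ in that space: monotonicity, $X\ge Y\Rightarrow\hat{\mathbb{E}}[X]\ge\hat{\mathbb{E}}[Y]$; constant preservation, $\hat{\mathbb{E}}[c]=c$ for constants $c$; sub-additivity, $\hat{\mathbb{E}}[X+Y]\le\hat{\mathbb{E}}[X]+\hat{\mathbb{E}}[Y]$; and positive homogeneity, $\hat{\mathbb{E}}[\lambda X]=\lambda\hat{\mathbb{E}}[X]$ for $\lambda\ge 0$.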
Our purpose is to prove a central limit theorem for countable nonhomogeneous Markov chains under the condition of uniform convergence of the transition probability matrices in the Cesàro sense. Furthermore, we obtain a
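One common way to formalize uniform convergence of the transition matrices in the Cesàro sense (illustrative only; the exact condition used in the result may differ) is $$\lim_{n\to\infty}\Big\|\frac{1}{n}\sum_{k=1}^{n}P_k-P\Big\|=0,$$ where $P_k$ is the transition probability matrix at step $k$, $P$ is a limiting stochastic matrix, and $\|\cdot\|$ is a suitable norm on countably indexed matrices.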
Let $\{X_k\}$ be a martingale difference sequence, and let $\{Y_k\}$ be another sequence with dependence within it. Assuming that $\{X_k\}$ is independent of $\{Y_k\}$, we study the properties of the sum of products of the two sequences, $\sum_{k=1}^{n} X_k Y_k$
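For clarity (a standard definition, not part of the abstract): $\{X_k\}$ being a martingale difference sequence with respect to a filtration $\{\mathcal{F}_k\}$ means that each $X_k$ is $\mathcal{F}_k$-measurable, integrable, and satisfies $$\mathbb{E}\big[X_k\mid\mathcal{F}_{k-1}\big]=0\quad\text{for all }k\ge 1,$$ which is the property exploited when analyzing the partial sums $\sum_{k=1}^{n}X_kY_k$.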
We give a new proof of the classical Central Limit Theorem, in the Mallows ($L^r$-Wasserstein) distance. Our proof is elementary in the sense that it does not require complex analysis, but rather makes use of a simple subadditive inequality related t
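For reference, the Mallows ($L^r$-Wasserstein) distance between two distributions $F$ and $G$ with finite $r$-th moments is (standard definition, stated only to fix notation): $$d_r(F,G)=\inf\big(\mathbb{E}\,|X-Y|^r\big)^{1/r},$$ where the infimum runs over all couplings $(X,Y)$ with $X\sim F$ and $Y\sim G$.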