
Large-dimensional Central Limit Theorem with Fourth-moment Error Bounds on Convex Sets and Balls

Submitted by Xiao Fang
Publication date: 2020
Paper language: English





We prove the large-dimensional Gaussian approximation of a sum of $n$ independent random vectors in $\mathbb{R}^d$ together with fourth-moment error bounds on convex sets and Euclidean balls. We show that compared with classical third-moment bounds, our bounds have near-optimal dependence on $n$ and can achieve improved dependence on the dimension $d$. For centered balls, we obtain an additional error bound that has a sub-optimal dependence on $n$, but recovers the known result of the validity of the Gaussian approximation if and only if $d=o(n)$. We discuss an application to the bootstrap. We prove our main results using Stein's method.
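As a rough numerical illustration of the quantity being bounded (not taken from the paper), the following Python sketch compares the probability that a standardized sum of $n$ independent centered random vectors lies in a centered Euclidean ball with the corresponding Gaussian probability; the dimension, sample size, radius, and exponential coordinates are arbitrary assumptions made for the demonstration.

```python
# Sketch: Monte Carlo comparison of P(S_n in a centered ball) with the Gaussian
# probability of the same ball, where S_n = n^{-1/2} * sum of i.i.d. centered
# vectors in R^d. Illustrative assumptions only, not the paper's construction.
import numpy as np

rng = np.random.default_rng(0)
n, d, reps = 100, 30, 2000        # sample size, dimension, Monte Carlo repetitions
radius = np.sqrt(d)               # a centered ball of radius sqrt(d)

# Standardized sums of centered exponential coordinates (mean 0, variance 1).
X = rng.exponential(size=(reps, n, d)) - 1.0
S = X.sum(axis=1) / np.sqrt(n)                     # shape (reps, d)
p_sum = np.mean(np.linalg.norm(S, axis=1) <= radius)

# Gaussian reference with identity covariance on the same ball.
Z = rng.standard_normal((reps, d))
p_gauss = np.mean(np.linalg.norm(Z, axis=1) <= radius)

print(f"P(sum in ball)      ~ {p_sum:.3f}")
print(f"P(Gaussian in ball) ~ {p_gauss:.3f}")
print(f"absolute difference ~ {abs(p_sum - p_gauss):.3f}")
```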




Read also

S. Ekisheva, C. Houdré (2006)
For probability measures on a complete separable metric space, we present sufficient conditions for the existence of a solution to the Kantorovich transportation problem. We also obtain sufficient conditions (which sometimes also become necessary) for the convergence, in transportation, of probability measures when the cost function is continuous, non-decreasing and depends on the distance. As an application, the CLT in the transportation distance is proved for independent and some dependent stationary sequences.
Shige Peng (2008)
We describe a new framework of a sublinear expectation space and the related notions and results of distributions and independence. A new notion of G-distributions is introduced which generalizes our G-normal distribution in the sense that mean-uncertainty can also be described. We present our new result of a central limit theorem under sublinear expectation. This theorem can also be regarded as a generalization of the law of large numbers in the case of mean-uncertainty.
Our purpose is to prove a central limit theorem for countable nonhomogeneous Markov chains under the condition of uniform convergence of the transition probability matrices in the Cesàro sense. Furthermore, we obtain a corresponding moderate deviation theorem for countable nonhomogeneous Markov chains via the Gärtner-Ellis theorem and the exponential equivalence method.
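As a much simplified illustration of a Markov-chain CLT (a two-state homogeneous chain rather than the countable nonhomogeneous setting treated above; all parameters are assumptions made for the demo), the sketch below checks that $\sqrt{n}$ times the centered occupation frequency of a state looks approximately normal.

```python
# Sketch: sqrt(n) * (sample frequency of state 1 - stationary probability) for a
# two-state homogeneous Markov chain started in stationarity. Simplified special
# case for illustration only; not the countable nonhomogeneous setting of the paper.
import numpy as np

rng = np.random.default_rng(3)
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])            # transition matrix
pi = np.array([2/3, 1/3])             # its stationary distribution (pi @ P == pi)
n, reps = 1000, 500

vals = np.empty(reps)
for r in range(reps):
    s = rng.choice(2, p=pi)           # start from the stationary distribution
    visits = 0
    for _ in range(n):
        visits += s                   # counts visits to state 1
        s = rng.choice(2, p=P[s])
    vals[r] = np.sqrt(n) * (visits / n - pi[1])

# Mean should be near 0; the histogram of vals is approximately Gaussian.
print("mean ~", round(vals.mean(), 3), " std ~", round(vals.std(), 3))
```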
Let $\{X_k\}$ be a martingale difference sequence and let $\{Y_k\}$ be another sequence with dependence within the sequence. Assuming $\{X_k\}$ is independent of $\{Y_k\}$, we study the properties of the sum of products $\sum_{k=1}^{n} X_k Y_k$. We obtain a product-CLT, a modification of the classical central limit theorem, which can be useful in the study of random projections. We also obtain a rate of convergence similar to the Berry-Esseen theorem in the classical CLT.
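A rough simulation of this product setting (with the illustrative assumptions that the $X_k$ are i.i.d. Rademacher signs, a simple martingale difference sequence, and $\{Y_k\}$ is an AR(1) sequence independent of $\{X_k\}$) suggests how the normalized sums can be checked against normal quantiles:

```python
# Sketch of the product-CLT setting: X_k a martingale difference sequence (here
# i.i.d. +/-1 signs), Y_k a dependent sequence (here AR(1)), independent of {X_k}.
# Illustrative assumptions only.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
n, reps, phi = 500, 2000, 0.7

sums = np.empty(reps)
for r in range(reps):
    X = rng.choice([-1.0, 1.0], size=n)      # martingale differences
    eps = rng.standard_normal(n)
    Y = np.empty(n)
    Y[0] = eps[0]
    for k in range(1, n):                    # AR(1) dependence within {Y_k}
        Y[k] = phi * Y[k - 1] + eps[k]
    sums[r] = np.sum(X * Y) / np.sqrt(n)

standardized = sums / sums.std()
for q in (0.1, 0.5, 0.9):                    # compare empirical and normal quantiles
    print(f"q={q}: empirical {np.quantile(standardized, q):+.3f}, normal {norm.ppf(q):+.3f}")
```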
We give a new proof of the classical Central Limit Theorem, in the Mallows ($L^r$-Wasserstein) distance. Our proof is elementary in the sense that it does not require complex analysis, but rather makes use of a simple subadditive inequality related to this metric. The key is to analyse the case where equality holds. We provide some results concerning rates of convergence. We also consider convergence to stable distributions, and obtain a bound on the rate of such convergence.
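One can observe this kind of convergence empirically with SciPy's one-dimensional 1-Wasserstein distance (the Mallows distance with $r=1$) between samples; the distributions and sample sizes below are arbitrary assumptions for the illustration.

```python
# Sketch: empirical 1-Wasserstein (Mallows, r = 1) distance between standardized
# sums of i.i.d. centered exponentials and a standard normal sample, for growing n.
# Illustrative assumptions only.
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(2)
reps = 20000

for n in (5, 20, 80, 320):
    X = rng.exponential(size=(reps, n)) - 1.0    # mean 0, variance 1 summands
    S = X.sum(axis=1) / np.sqrt(n)               # standardized sums
    Z = rng.standard_normal(reps)
    print(f"n={n:4d}  W1 ~ {wasserstein_distance(S, Z):.4f}")
```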