
Limit theorems with rate of convergence under sublinear expectations

Added by Xiao Fang
Publication date: 2017
Language: English





Under the sublinear expectation $\mathbb{E}[\cdot] := \sup_{\theta\in\Theta} E_\theta[\cdot]$ for a given set of linear expectations $\{E_\theta : \theta\in\Theta\}$, we establish a new law of large numbers and a new central limit theorem with rate of convergence. We present some interesting special cases and discuss a related statistical inference problem. We also give an approximation and a representation, in a probability space, of the $G$-normal distribution, which was used as the limit in Peng (2007)'s central limit theorem.
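As a purely illustrative reading of the definition above (not from the paper), the following sketch estimates an upper expectation $\sup_{\theta\in\Theta} E_\theta[\varphi(X)]$ by Monte Carlo over a small, hypothetical parameter set $\Theta$ of normal models; all parameters below are assumptions chosen for illustration.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical family Theta: (mean, std) pairs encoding mean/variance uncertainty.
Theta = [(-0.5, 1.0), (0.0, 1.5), (0.5, 2.0)]

def linear_expectation(phi, mu, sigma, n=200_000):
    # Monte Carlo estimate of E_theta[phi(X)] with X ~ N(mu, sigma^2).
    x = rng.normal(mu, sigma, size=n)
    return phi(x).mean()

def upper_expectation(phi):
    # Sublinear (upper) expectation: sup over the finite family of linear expectations.
    return max(linear_expectation(phi, mu, s) for mu, s in Theta)

phi = lambda x: x                                # test function phi(x) = x
upper = upper_expectation(phi)                   # approx sup_theta E_theta[X] = 0.5
lower = -upper_expectation(lambda x: -phi(x))    # -E[-X] = inf_theta E_theta[X], approx -0.5
print(f"upper expectation ~ {upper:.3f}, lower expectation ~ {lower:.3f}")

In this setting, Peng's law of large numbers states that sample averages of i.i.d. (in the sublinear sense) variables converge in distribution to the maximal distribution supported on the interval between the lower and upper means; the rate of that convergence is what the paper quantifies.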



Related research


Shige Peng (2008)
We describe a new framework of a sublinear expectation space and the related notions and results on distributions and independence. A new notion of $G$-distributions is introduced, which generalizes our $G$-normal distribution in the sense that mean-uncertainty can also be described. We present our new central limit theorem under sublinear expectation. This theorem can also be regarded as a generalization of the law of large numbers in the case of mean-uncertainty.
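For orientation, the $G$-normal distribution referenced here is characterized, in Peng's framework, by a fully nonlinear heat equation (this is a standard formulation, not a claim specific to this abstract): a random variable $X$ with variance uncertainty $[\underline\sigma^2, \bar\sigma^2]$ is $G$-normally distributed if, for every bounded Lipschitz test function $\varphi$, the function $u(t,x) := \mathbb{E}[\varphi(x + \sqrt{t}\,X)]$ solves

$$\partial_t u - G(\partial^2_{xx} u) = 0, \qquad u(0,x) = \varphi(x), \qquad G(a) := \tfrac{1}{2}\left(\bar\sigma^2 a^+ - \underline\sigma^2 a^-\right).$$

The $G$-distributions introduced in this abstract extend this picture by allowing a first-order term in $G$, which is how mean-uncertainty is encoded.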
This short note provides a new and simple proof of the convergence rate for Peng's law of large numbers under sublinear expectations, which improves the corresponding results in Song [15] and Fang et al. [3].
Mingzhou Xu, Kun Cheng (2021)
In this paper, we prove equivalent conditions for complete moment convergence of the maximum of partial weighted sums of independent, identically distributed random variables in a sublinear expectation space. As applications, Baum-Katz type results for the maximum of partial weighted sums of independent, identically distributed random variables are established in the same setting. These results extend the equivalent conditions of complete moment convergence of the maximum known in the classical linear expectation setting.
In this paper, we propose a monotone approximation scheme for a class of fully nonlinear partial integro-differential equations (PIDEs) which characterize nonlinear $\alpha$-stable L\'{e}vy processes under a sublinear expectation space, with $\alpha \in (1,2)$. Two main results are obtained: (i) error bounds for the monotone approximation scheme of nonlinear PIDEs, and (ii) convergence rates for a generalized central limit theorem of Bayraktar-Munk for $\alpha$-stable random variables under sublinear expectation. Our proofs use and extend techniques introduced by Krylov and Barles-Jakobsen.
Mingzhou Xu, Kun Cheng (2021)
We study complete convergence for weighted sums of sequences of independent, identically distributed random variables in a sublinear expectation space. Using a moment inequality and truncation methods, we establish equivalent conditions for complete convergence of such weighted sums. The results extend the corresponding results of Guo (2012) to sequences of independent, identically distributed random variables under sublinear expectations.