
Convergence for sums of i.i.d. random variables under sublinear expectations

 Added by Mingzhou Xu
 Publication date 2021
Language: English





In this paper, we establish equivalent conditions for complete moment convergence of the maximum of partial weighted sums of independent, identically distributed random variables in a sublinear expectation space. As applications, Baum-Katz type results for the maximum of partial weighted sums of independent, identically distributed random variables are obtained in the same setting. The results extend the equivalent conditions for complete moment convergence of the maximum known in the classical linear expectation space.
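For orientation, the classical linear-expectation statements being extended can be sketched as follows; this is only an illustrative form, and the exact exponents, weights and moment conditions used in the paper may differ. With $S_k=\sum_{i=1}^{k}X_i$, $E X_1=0$, $\alpha>1/2$ and $\alpha p\ge 1$, the Baum-Katz theorem asserts, roughly, that $E|X_1|^{p}<\infty$ is equivalent to

$$\sum_{n\ge 1} n^{\alpha p-2}\, P\Big(\max_{1\le k\le n}|S_k|>\varepsilon n^{\alpha}\Big)<\infty \quad\text{for every } \varepsilon>0,$$

while complete moment convergence of the maximum refers to the stronger series condition

$$\sum_{n\ge 1} n^{\alpha p-2-\alpha}\, E\Big(\max_{1\le k\le n}|S_k|-\varepsilon n^{\alpha}\Big)^{+}<\infty \quad\text{for every } \varepsilon>0.$$

In the sublinear setting, the probability $P$ and expectation $E$ above are replaced by a capacity and the sublinear expectation, respectively.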



Related research

Mingzhou Xu, Kun Cheng (2021)
We study complete convergence for weighted sums of sequences of independent, identically distributed random variables in a sublinear expectation space. Using moment inequalities and truncation methods, we establish equivalent conditions for complete convergence of such weighted sums. The results extend the corresponding results of Guo (2012) to sequences of independent, identically distributed random variables in a sublinear expectation space.
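As a reminder of the classical notion referred to in the abstract above (stated here as background, not as the paper's exact formulation), complete convergence in the sense of Hsu and Robbins means that a sequence $(Y_n)$ converges completely to $\theta$ if

$$\sum_{n\ge 1} P\big(|Y_n-\theta|>\varepsilon\big)<\infty \quad\text{for every } \varepsilon>0;$$

in the sublinear expectation framework the probability $P$ is typically replaced by a capacity generated by the sublinear expectation.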
Under the sublinear expectation $\mathbb{E}[\cdot]:=\sup_{\theta\in\Theta}E_\theta[\cdot]$ for a given set of linear expectations $\{E_\theta:\theta\in\Theta\}$, we establish a new law of large numbers and a new central limit theorem with rate of convergence. We present some interesting special cases and discuss a related statistical inference problem. We also give an approximation and a representation, in a probability space, of the $G$-normal distribution, which was used as the limit in Peng's (2007) central limit theorem.
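For context, Peng-type limit theorems under a sublinear expectation $\mathbb{E}$ are usually stated for continuous test functions $\varphi$ with suitable growth and under suitable moment conditions; the following is a sketch of the standard forms, not the specific rate results of the paper above. The law of large numbers reads

$$\lim_{n\to\infty}\mathbb{E}\Big[\varphi\Big(\frac{S_n}{n}\Big)\Big]=\max_{\underline{\mu}\le \mu\le \overline{\mu}}\varphi(\mu),\qquad \overline{\mu}:=\mathbb{E}[X_1],\ \ \underline{\mu}:=-\mathbb{E}[-X_1],$$

and, when $\overline{\mu}=\underline{\mu}=0$, the central limit theorem reads

$$\lim_{n\to\infty}\mathbb{E}\Big[\varphi\Big(\frac{S_n}{\sqrt{n}}\Big)\Big]=\widehat{\mathbb{E}}\big[\varphi(\xi)\big],\qquad \xi\sim N\big(0,[\underline{\sigma}^2,\overline{\sigma}^2]\big),$$

with $\overline{\sigma}^2:=\mathbb{E}[X_1^2]$ and $\underline{\sigma}^2:=-\mathbb{E}[-X_1^2]$, where $N(0,[\underline{\sigma}^2,\overline{\sigma}^2])$ denotes the $G$-normal distribution under the sublinear expectation $\widehat{\mathbb{E}}$.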
This short note provides a new and simple proof of the convergence rate for Peng's law of large numbers under sublinear expectations, which improves the corresponding results in Song [15] and Fang et al. [3].
Shige Peng (2008)
We describe a new framework of sublinear expectation spaces and the related notions and results on distributions and independence. A new notion of $G$-distributions is introduced, which generalizes our $G$-normal distribution in the sense that mean-uncertainty can also be described. We present our new central limit theorem under sublinear expectation. This theorem can also be regarded as a generalization of the law of large numbers in the case of mean-uncertainty.
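Informally, and as a sketch rather than the paper's exact formulation, the one-dimensional $G$-normal distribution $N(0,[\underline{\sigma}^2,\overline{\sigma}^2])$ can be characterized through the $G$-heat equation: for a test function $\varphi$, the function

$$u(t,x):=\widehat{\mathbb{E}}\big[\varphi(x+\sqrt{t}\,\xi)\big],\qquad \xi\sim N\big(0,[\underline{\sigma}^2,\overline{\sigma}^2]\big),$$

solves

$$\partial_t u - G\big(\partial_{xx}^2 u\big)=0,\qquad u(0,x)=\varphi(x),\qquad G(a):=\tfrac{1}{2}\big(\overline{\sigma}^2 a^{+}-\underline{\sigma}^2 a^{-}\big),$$

which reduces to the classical heat equation, and hence to the classical normal distribution, when $\underline{\sigma}^2=\overline{\sigma}^2$.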
We consider the problem of bounding large deviations for non-i.i.d. random variables that are allowed to have arbitrary dependencies. Previous works typically assumed a specific dependence structure, namely the existence of independent components. Bounds that depend on the degree of dependence between the observations have only been studied in the theory of mixing processes, where the variables are time-ordered. Here, we introduce a new way of measuring dependencies within an unordered set of variables. We prove concentration inequalities that apply to any set of random variables but benefit from the presence of weak dependencies. We also discuss applications and extensions of our results to related problems in machine learning and large deviations.
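For comparison only, the independent, bounded baseline against which such dependence-aware concentration bounds are usually measured is Hoeffding's inequality: for independent $X_i\in[a_i,b_i]$,

$$P\Big(\Big|\sum_{i=1}^{n}\big(X_i-E X_i\big)\Big|\ge t\Big)\le 2\exp\Big(-\frac{2t^{2}}{\sum_{i=1}^{n}(b_i-a_i)^{2}}\Big),\qquad t>0.$$

The bounds described in the abstract above replace independence with a quantitative measure of dependence; their precise form is not reproduced here.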