The discounted central limit theorem concerns the convergence of an infinite discounted sum of i.i.d. random variables to normality as the discount factor approaches $1$. We show that, using the Fourier metric on probability distributions, one can obtain the discounted central limit theorem, as well as a quantitative version of it, in a simple and natural way, and under weak assumptions.
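The limit described above is easy to probe numerically. The following is a minimal sketch, assuming the classical normalisation $\sqrt{1-\beta^2}\sum_{j\ge 0}\beta^j X_j$ with mean-zero, unit-variance steps; the uniform step distribution and the parameter values are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def discounted_sum(beta, n_terms, rng):
    # i.i.d. steps with mean 0 and variance 1; uniform on [-sqrt(3), sqrt(3)]
    # is an assumed choice, any distribution with these moments would do
    x = rng.uniform(-np.sqrt(3.0), np.sqrt(3.0), size=n_terms)
    weights = beta ** np.arange(n_terms)
    # normalise so the (truncated) discounted sum has variance close to 1
    return np.sqrt(1.0 - beta ** 2) * np.sum(weights * x)

beta = 0.99  # discount factor close to 1
samples = np.array([discounted_sum(beta, 2000, rng) for _ in range(5000)])
# samples should be approximately N(0, 1) when beta is near 1
```

With $\beta=0.99$ the effective number of contributing terms is of order $1/(1-\beta^2)\approx 50$, already enough for the sample to look close to standard normal.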
We consider the probability distributions of values in the complex plane attained by Fourier sums of the form $\sum_{j=1}^{n} a_j e^{-2\pi i j \nu}/\sqrt{n}$ when the frequency $\nu$ is drawn uniformly at random from an interval of length $1$. If the coefficients $a_j$ are drawn i.i.d. with finite third moment, the distance of these distributions to an isotropic two-dimensional Gaussian on $\mathbb{C}$ converges in probability to zero for any pseudometric on the set of distributions for which the distance between empirical distributions and the underlying distribution converges to zero in probability.
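The setup above can be simulated directly: fix one i.i.d. coefficient sequence and sample the sum at random frequencies. A minimal sketch, assuming standard normal coefficients (an illustrative choice satisfying the finite-third-moment hypothesis):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000
a = rng.standard_normal(n)   # one fixed i.i.d. coefficient sequence (assumed choice)
j_idx = np.arange(1, n + 1)

# draw frequencies uniformly from [0, 1) and evaluate
# Z(nu) = sum_{j=1}^n a_j exp(-2 pi i j nu) / sqrt(n)
nu = rng.uniform(0.0, 1.0, size=4000)
z = np.exp(-2j * np.pi * nu[:, None] * j_idx[None, :]) @ a / np.sqrt(n)

# By Parseval, the average of |Z|^2 over nu equals (1/n) sum_j a_j^2 ~ 1,
# split evenly between real and imaginary parts and with negligible
# correlation between them, consistent with an isotropic Gaussian on C.
```

The empirical distribution of `z` should look like an isotropic two-dimensional Gaussian with each component of variance about $1/2$.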
We consider a Moran model with two allelic types, mutation and selection. In this work, we study the behaviour of the proportion of fit individuals when the size of the population tends to infinity, without any rescaling of parameters or time. We first prove that this proportion converges, uniformly on compact sets in probability, to the solution of an ordinary differential equation, which is explicitly solved. Next, we study the stability properties of its equilibrium points. Moreover, we show that the fluctuations of the proportion of fit individuals, after a proper normalization, satisfy a uniform central limit theorem on $[0,\infty)$. As a consequence, we deduce the convergence of the corresponding stationary distributions.
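The abstract does not state the limiting ODE explicitly; a standard large-population limit for a two-type Moran model with selection $s$ and mutation rates $\theta_0$ (towards the fit type) and $\theta_1$ (away from it) has the form $x'(t)=s\,x(1-x)+\theta_0(1-x)-\theta_1 x$. The sketch below assumes that form with hypothetical parameter values and checks convergence to the stable equilibrium:

```python
import math

# Assumed ODE: x'(t) = s*x*(1-x) + theta0*(1-x) - theta1*x
s, theta0, theta1 = 1.0, 0.3, 0.2  # hypothetical parameter values

def drift(x):
    return s * x * (1 - x) + theta0 * (1 - x) - theta1 * x

# forward Euler integration from x(0) = 0.1 up to time T = 20
x, dt = 0.1, 1e-3
for _ in range(20000):
    x += dt * drift(x)

# the stable equilibrium is the root in (0, 1) of the quadratic
# -s*x^2 + (s - theta0 - theta1)*x + theta0 = 0
a, b, c = -s, s - theta0 - theta1, theta0
x_star = (-b - math.sqrt(b * b - 4 * a * c)) / (2 * a)
```

Since the drift is positive at $0$ and negative at $1$, the quadratic has exactly one root in $(0,1)$, and the numerical trajectory settles there.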
A strengthened version of the central limit theorem for discrete random variables is established, relying only on information-theoretic tools and elementary arguments. It is shown that the relative entropy between the standardised sum of $n$ independent and identically distributed lattice random variables and an appropriately discretised Gaussian vanishes as $n\to\infty$.
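The vanishing relative entropy can be observed numerically in the simplest lattice case. A minimal sketch, assuming Rademacher ($\pm 1$) summands as an illustrative lattice distribution, comparing the exact law of the standardised sum with a renormalised discretised Gaussian on the same lattice:

```python
import math

def kl_to_discretised_gaussian(n):
    # standardised sum of n i.i.d. Rademacher variables lives on the
    # lattice points x_k = (2k - n)/sqrt(n), k = 0, ..., n, with
    # binomial probabilities
    pts = [(2 * k - n) / math.sqrt(n) for k in range(n + 1)]
    p = [math.comb(n, k) / 2 ** n for k in range(n + 1)]
    # standard Gaussian density discretised onto the same lattice, renormalised
    q = [math.exp(-x * x / 2.0) for x in pts]
    z = sum(q)
    q = [v / z for v in q]
    # relative entropy D(P || Q)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

kl25, kl100, kl400 = (kl_to_discretised_gaussian(n) for n in (25, 100, 400))
# the relative entropy is positive and shrinks as n grows
```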
For probability measures on a complete separable metric space, we present sufficient conditions for the existence of a solution to the Kantorovich transportation problem. We also obtain sufficient conditions (which sometimes also become necessary) for the convergence, in the transportation distance, of probability measures when the cost function is continuous, non-decreasing and depends on the distance. As an application, the CLT in the transportation distance is proved for independent and some dependent stationary sequences.
We describe a new framework of a sublinear expectation space and the related notions and results on distributions and independence. A new notion of G-distributions is introduced, which generalizes our G-normal distribution in the sense that mean-uncertainty can also be described. We present our new result: a central limit theorem under sublinear expectation. This theorem can also be regarded as a generalization of the law of large numbers to the case of mean-uncertainty.