A strengthened version of the central limit theorem for discrete random variables is established, relying only on information-theoretic tools and elementary arguments. It is shown that the relative entropy between the standardised sum of $n$ independent and identically distributed lattice random variables and an appropriately discretised Gaussian vanishes as $n\to\infty$.
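As a numerical illustration of this statement (a sketch, not the paper's argument), the following code takes Bernoulli$(p)$ summands, so the sum is Binomial$(n,p)$ on a lattice, and computes the relative entropy against a standard Gaussian discretised onto the same standardised lattice. The choice $p=0.3$ and the function name are illustrative.

```python
import math

def kl_to_discretized_gaussian(n, p=0.3):
    """Relative entropy D(P_n || G_n) between the Binomial(n, p) law of a sum
    of n i.i.d. Bernoulli(p) lattice variables and a standard Gaussian
    discretised (renormalised) onto the same standardised lattice."""
    mu = n * p
    sigma = math.sqrt(n * p * (1 - p))
    zs = [(k - mu) / sigma for k in range(n + 1)]   # standardised lattice points
    # log of the discretised Gaussian's normalising constant
    log_z = math.log(sum(math.exp(-z * z / 2) for z in zs))
    d = 0.0
    for k, z in enumerate(zs):
        pk = math.comb(n, k) * p**k * (1 - p)**(n - k)
        if pk > 0:
            # log q_k = -z^2/2 - log_z; staying in log space avoids underflow
            d += pk * (math.log(pk) + z * z / 2 + log_z)
    return d

for n in (10, 100, 1000):
    print(n, kl_to_discretized_gaussian(n))
```

The printed divergences decrease toward zero as $n$ grows, consistent with the theorem.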
We explore the asymptotic behavior of the Rényi entropy along convolutions in the central limit theorem, with respect to the increasing number of i.i.d. summands. In particular, the problem of monotonicity is addressed under suitable moment hypotheses.
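For a concrete (and purely illustrative) instance, the sketch below tracks the Rényi entropy of order 2, $h_2(f) = -\log \int f^2$, of the standardised sum of $n$ i.i.d. uniform variables, approximating each density by grid convolution; the uniform summands and the grid step are assumptions of this demo, and the point is only that monotonicity along the sequence is delicate.

```python
import math

def convolve(a, b, step):
    """Grid approximation of the convolution of two densities sampled
    with spacing `step` (discrete convolution times the step)."""
    out = [0.0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] += ai * bj
    return [v * step for v in out]

def renyi2_standardized(max_n=4, step=0.002):
    """h_2 of the standardised sum of n i.i.d. Uniform(0,1) variables,
    n = 1..max_n.  Standardising by s = sqrt(n/12) shifts h_2 by -log s."""
    base = [1.0] * int(round(1 / step))   # Uniform(0,1) density on the grid
    dens, vals = base, []
    for n in range(1, max_n + 1):
        int_f2 = sum(v * v for v in dens) * step
        vals.append(-math.log(int_f2) - math.log(math.sqrt(n / 12)))
        if n < max_n:
            dens = convolve(dens, base, step)
    return vals

vals = renyi2_standardized()
gauss = math.log(2 * math.sqrt(math.pi))   # h_2 of the standard Gaussian
print(vals, gauss)
```

On this example the sequence overshoots the Gaussian value $\log(2\sqrt{\pi})$ at $n=2$ before descending toward it, so Rényi monotonicity is not automatic and genuinely requires hypotheses.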
For probability measures on a complete separable metric space, we present sufficient conditions for the existence of a solution to the Kantorovich transportation problem. We also obtain sufficient conditions (which sometimes become necessary as well) for the convergence, in the transportation distance, of probability measures when the cost function is continuous, non-decreasing and depends on the distance. As an application, the CLT in the transportation distance is proved for independent sequences and for some dependent stationary sequences.
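As an illustrative check of CLT convergence in a transportation metric (here with cost equal to the distance, i.e. the Kantorovich–Rubinstein distance $W_1$), the sketch below uses the one-dimensional identity $W_1(P,Q) = \int |F_P - F_Q|\,dx$ for Bernoulli summands; the parameter $p=0.3$ and the integration grid are assumptions of the demo.

```python
import math
from bisect import bisect_right

def w1_binomial_vs_normal(n, p=0.3, lo=-8.0, hi=8.0, step=0.001):
    """W_1 distance between the standardised Binomial(n, p) and the standard
    normal, via the 1-d formula W_1 = integral of |F_n(x) - Phi(x)| dx."""
    mu, sigma = n * p, math.sqrt(n * p * (1 - p))
    zs = [(k - mu) / sigma for k in range(n + 1)]   # jump points of F_n
    cdf, acc = [], 0.0
    for k in range(n + 1):
        acc += math.comb(n, k) * p**k * (1 - p)**(n - k)
        cdf.append(acc)
    total, x = 0.0, lo
    while x < hi:
        i = bisect_right(zs, x)                     # F_n(x) = cdf[i-1]
        fn = cdf[i - 1] if i > 0 else 0.0
        phi = 0.5 * (1 + math.erf(x / math.sqrt(2)))
        total += abs(fn - phi) * step
        x += step
    return total

for n in (10, 100, 1000):
    print(n, w1_binomial_vs_normal(n))
```

The distances shrink with $n$; note that for lattice variables the spacing of the standardised lattice itself contributes a term of order $1/\sqrt{n}$.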
We prove propagation of polynomial, exponential and Gaussian localization for the rescaled convolution map $f\to f\circledast f$. The Gaussian localization is then used to prove an optimal bound on the rate of entropy production by this map. As an application, we prove that convergence in the CLT occurs at the optimal rate $1/\sqrt{n}$ in the entropy (and $L^1$) sense, for distributions with finite fourth moment.
We consider a Moran model with two allelic types, mutation and selection. In this work, we study the behaviour of the proportion of fit individuals when the size of the population tends to infinity, without any rescaling of parameters or time. We first prove that this proportion converges, uniformly on compact intervals in probability, to the solution of an ordinary differential equation, which is explicitly solved. Next, we study the stability properties of its equilibrium points. Moreover, we show that the fluctuations of the proportion of fit individuals, after a proper normalization, satisfy a uniform central limit theorem on $[0,\infty)$. As a consequence, we deduce the convergence of the corresponding stationary distributions.
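The fluid-limit statement can be illustrated with a toy simulation; the Moran dynamics below (fitness-biased reproduction with symmetric mutation, uniform death) and all parameter values are hypothetical stand-ins, not the paper's exact model. The long-run proportion of fit individuals is compared with the equilibrium of the limiting drift $g(x) - x$.

```python
import math
import random

def moran_step(n_fit, N, s, u, rng):
    """One Moran event: reproducer chosen with fitness bias (fit has 1+s),
    offspring mutates with probability u, a uniform individual dies.
    Returns the updated number of fit individuals."""
    x = n_fit / N
    r_fit = (1 + s) * x / (1 + s * x)              # reproducer is fit
    child_fit = r_fit * (1 - u) + (1 - r_fit) * u  # offspring type after mutation
    up = child_fit * (1 - x)      # fit child replaces an unfit individual
    down = (1 - child_fit) * x    # unfit child replaces a fit individual
    r = rng.random()
    if r < up:
        return n_fit + 1
    if r < up + down:
        return n_fit - 1
    return n_fit

def equilibrium(s, u):
    """Root of the drift g(x) - x = (1+s)x(1-2u)/(1+sx) + u - x, by bisection."""
    f = lambda x: (1 + s) * x * (1 - 2 * u) / (1 + s * x) + u - x
    lo, hi = 0.0, 1.0
    for _ in range(60):
        mid = (lo + hi) / 2
        if f(mid) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

rng = random.Random(1)
N, s, u = 1000, 0.5, 0.1
n_fit = N // 2
for _ in range(50_000):                      # burn-in
    n_fit = moran_step(n_fit, N, s, u, rng)
total, steps = 0.0, 200_000
for _ in range(steps):
    n_fit = moran_step(n_fit, N, s, u, rng)
    total += n_fit / N
print(total / steps, equilibrium(s, u))      # simulated mean vs ODE equilibrium
```

For large $N$ the time-averaged proportion sits close to the ODE equilibrium, with fluctuations of order $1/\sqrt{N}$, matching the scaling of the central limit theorem in the abstract.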
We describe a new framework of sublinear expectation spaces and the related notions and results on distributions and independence. A new notion of G-distributions is introduced, which generalizes our G-normal distribution in the sense that mean-uncertainty can also be described. We present our new result of the central limit theorem under sublinear expectation. This theorem can also be regarded as a generalization of the law of large numbers in the case of mean-uncertainty.