A strengthened version of the central limit theorem for discrete random variables is established, relying only on information-theoretic tools and elementary arguments. It is shown that the relative entropy between the standardised sum of $n$ independent and identically distributed lattice random variables and an appropriately discretised Gaussian vanishes as $n \to \infty$.
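As a numerical illustration of the statement (not of the paper's proof), the following sketch computes the relative entropy between the standardised Binomial$(n, 1/2)$ sum and a Gaussian discretised on the same lattice; the function name and the Bernoulli choice are illustrative assumptions.

```python
import math

def relative_entropy_binomial_vs_gaussian(n, p=0.5):
    """D(P_n || Q_n), where P_n is the standardised Binomial(n, p)
    distribution on its lattice and Q_n is a standard Gaussian
    discretised (and renormalised) on the same lattice points."""
    mu, sigma = n * p, math.sqrt(n * p * (1 - p))
    # log-factorials for a numerically stable binomial pmf
    logfact = [0.0]
    for k in range(1, n + 1):
        logfact.append(logfact[-1] + math.log(k))
    pts = [(k - mu) / sigma for k in range(n + 1)]  # standardised lattice
    P = [math.exp(logfact[n] - logfact[k] - logfact[n - k]
                  + k * math.log(p) + (n - k) * math.log(1 - p))
         for k in range(n + 1)]
    # discretised standard Gaussian on the same lattice, renormalised
    Q = [math.exp(-x * x / 2) for x in pts]
    Z = sum(Q)
    Q = [q / Z for q in Q]
    return sum(pk * math.log(pk / qk) for pk, qk in zip(P, Q) if pk > 0)

# the relative entropy should shrink as n grows
for n in (4, 16, 64, 256):
    print(n, relative_entropy_binomial_vs_gaussian(n))
```

The printed values decrease toward zero, consistent with the entropic convergence asserted in the abstract.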
We explore the asymptotic behavior of Rényi entropy along convolutions in the central limit theorem, as the number of i.i.d. summands increases. In particular, the problem of monotonicity is addressed under suitable moment hypotheses.
For probability measures on a complete separable metric space, we present sufficient conditions for the existence of a solution to the Kantorovich transportation problem. We also obtain sufficient conditions (which sometimes also become necessary) fo
We prove for the rescaled convolution map $f \to f \circledast f$ propagation of polynomial, exponential and Gaussian localization. The Gaussian localization is then used to prove an optimal bound on the rate of entropy production by this map. As an appl
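A minimal grid-based sketch of the entropy production referred to above, under our own illustrative assumptions: starting from a centred, unit-variance uniform density, each application of the rescaled self-convolution $f \mapsto \sqrt{2}\,(f * f)(\sqrt{2}\,x)$ raises the differential entropy toward the Gaussian maximum $\tfrac12\log(2\pi e)$. The grid, step count, and helper names are ours, not the paper's.

```python
import numpy as np

# symmetric grid with an odd number of points so that x = 0 is a grid point
x = np.linspace(-20.0, 20.0, 4001)
dx = x[1] - x[0]
# centred uniform density with unit variance: support [-sqrt(3), sqrt(3)]
f = np.where(np.abs(x) <= np.sqrt(3), 1.0 / (2.0 * np.sqrt(3)), 0.0)

def rescaled_self_convolution(f, x, dx):
    """One step of the map f -> sqrt(2) * (f * f)(sqrt(2) x) on the grid."""
    conv = np.convolve(f, f, mode="same") * dx       # (f * f) on the same grid
    g = np.sqrt(2.0) * np.interp(np.sqrt(2.0) * x, x, conv)
    return g / (g.sum() * dx)                         # renormalise numerics

def entropy(f, dx):
    """Differential entropy -int f log f approximated on the grid."""
    p = f[f > 1e-300]
    return -np.sum(p * np.log(p)) * dx

h = [entropy(f, dx)]
for _ in range(5):
    f = rescaled_self_convolution(f, x, dx)
    h.append(entropy(f, dx))

gaussian_h = 0.5 * np.log(2.0 * np.pi * np.e)  # entropy of N(0, 1)
print(h, gaussian_h)
```

The map preserves variance, so the entropies climb monotonically (by the entropy power inequality) from $\log(2\sqrt{3}) \approx 1.24$ toward $\tfrac12\log(2\pi e) \approx 1.42$.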
We consider a Moran model with two allelic types, mutation and selection. In this work, we study the behaviour of the proportion of fit individuals when the size of the population tends to infinity, without any rescaling of parameters or time. We fir
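A toy simulation sketch of such a model, with illustrative parameters of our own choosing (not the paper's setup): a two-type Moran process where the fit type has fitness $1+s$ and mutation acts at reproduction. For large population size the fit-type proportion settles near its deterministic equilibrium.

```python
import random

def moran_fit_fraction(N, s=0.5, u01=0.05, u10=0.05, steps=20000, seed=0):
    """Simulate a two-type Moran model with k fit individuals out of N.
    The fit type has fitness 1 + s; at reproduction a fit parent's
    offspring mutates to unfit with prob. u10, an unfit parent's
    offspring mutates to fit with prob. u01. Returns the final
    proportion of fit individuals."""
    rng = random.Random(seed)
    k = N // 2
    for _ in range(steps):
        # choose a parent proportionally to fitness
        fit_weight = k * (1.0 + s)
        parent_fit = rng.random() < fit_weight / (fit_weight + (N - k))
        # the offspring may mutate at birth
        if parent_fit:
            child_fit = rng.random() >= u10
        else:
            child_fit = rng.random() < u01
        # the offspring replaces a uniformly chosen individual
        dies_fit = rng.random() < k / N
        k += (1 if child_fit else 0) - (1 if dies_fit else 0)
        k = max(0, min(N, k))
    return k / N

print([moran_fit_fraction(N) for N in (50, 200, 800)])
```

As $N$ grows, the fluctuations around the equilibrium proportion shrink, illustrating (but of course not proving) the large-population concentration studied in the abstract.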
We describe a new framework of a sublinear expectation space and the related notions and results on distributions and independence. A new notion of G-distributions is introduced, which generalizes our G-normal-distribution in the sense that mean-uncertai