
Entropy and the Discrete Central Limit Theorem

Posted by Lampros Gavalakis
Publication date: 2021
Research field: Information engineering
Paper language: English





A strengthened version of the central limit theorem for discrete random variables is established, relying only on information-theoretic tools and elementary arguments. It is shown that the relative entropy between the standardised sum of $n$ independent and identically distributed lattice random variables and an appropriately discretised Gaussian vanishes as $n \to \infty$.
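As an informal numerical illustration of this statement (not the paper's own argument), the following Python sketch estimates the relative entropy D(P_n || Q_n) between the standardised sum of n i.i.d. Bernoulli(1/2) variables, the simplest lattice example, and one natural discretisation of the Gaussian: the standard normal density evaluated on the same lattice and renormalised. The choice of summand distribution, the discretisation, and the sample values of n are assumptions made here purely for illustration.

    import numpy as np
    from scipy.stats import binom, norm

    def discrete_clt_kl(n, p=0.5):
        # Relative entropy D(P_n || Q_n) for the standardised Binomial(n, p) sum.
        mu, sigma = p, np.sqrt(p * (1 - p))
        k = np.arange(n + 1)
        pn = binom.pmf(k, n, p)                    # law of X_1 + ... + X_n
        z = (k - n * mu) / (sigma * np.sqrt(n))    # standardised lattice points
        spacing = 1.0 / (sigma * np.sqrt(n))       # lattice span after standardisation
        qn = norm.pdf(z) * spacing                 # Gaussian mass assigned to each lattice point
        qn /= qn.sum()                             # renormalise the discretised Gaussian
        mask = pn > 0
        return float(np.sum(pn[mask] * np.log(pn[mask] / qn[mask])))

    for n in (10, 100, 1000, 10000):
        print(n, discrete_clt_kl(n))

The printed values shrink towards zero as n grows, consistent with the vanishing relative entropy asserted in the abstract.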



Read also

We explore the asymptotic behavior of Rényi entropy along convolutions in the central limit theorem as the number of i.i.d. summands increases. In particular, the problem of monotonicity is addressed under suitable moment hypotheses.
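As a hedged illustration of the quantity studied in the abstract above (not the authors' analysis), the sketch below computes the Rényi entropy of order 2 of the standardised sum of n i.i.d. two-component Gaussian mixtures, for which the n-fold convolution has a closed form, and compares it with the Rényi entropy of the standard normal limit. The mixture parameters, the Rényi order and the integration grid are assumptions chosen for the example.

    import numpy as np
    from scipy.stats import binom, norm

    ALPHA, S = 2.0, 0.5                   # Renyi order and component std dev (assumed)
    VAR = 1.0 + S**2                      # Var(X) for X ~ 0.5*N(-1, S^2) + 0.5*N(+1, S^2)
    GRID = np.linspace(-8.0, 8.0, 20001)  # integration grid

    def renyi_entropy_standardised_sum(n, alpha=ALPHA):
        # Renyi entropy h_alpha(f) = log(int f^alpha dx) / (1 - alpha) of the standardised sum.
        j = np.arange(n + 1)
        w = binom.pmf(j, n, 0.5)                 # mixture weights after n-fold convolution
        means = (2 * j - n) / np.sqrt(n * VAR)   # standardised component means
        sd = S / np.sqrt(VAR)                    # common component std dev after standardisation
        f = (w[:, None] * norm.pdf(GRID[None, :], means[:, None], sd)).sum(axis=0)
        return np.log(np.trapz(f**alpha, GRID)) / (1.0 - alpha)

    h_limit = np.log(np.trapz(norm.pdf(GRID)**ALPHA, GRID)) / (1.0 - ALPHA)  # N(0,1) value
    for n in (1, 2, 4, 8, 16, 32):
        print(n, renyi_entropy_standardised_sum(n), "limit:", h_limit)

The computed values approach the Gaussian limit as n grows; whether they do so monotonically in general is the question the paper addresses under moment hypotheses.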
129 - S. Ekisheva, C. Houdré 2006
For probability measures on a complete separable metric space, we present sufficient conditions for the existence of a solution to the Kantorovich transportation problem. We also obtain sufficient conditions (which sometimes also become necessary) for the convergence, in transportation, of probability measures when the cost function is continuous, non-decreasing and depends on the distance. As an application, the CLT in the transportation distance is proved for independent and some dependent stationary sequences.
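The following Monte Carlo sketch (a hypothetical illustration, not the paper's construction) conveys the flavour of the CLT in transportation distance: it estimates the 1-Wasserstein distance between the empirical law of the standardised sum of n i.i.d. centred Exp(1) variables and a standard-normal sample, using the fact that the sum of n Exp(1) variables is Gamma(n, 1). The summand distribution, sample size and values of n are assumptions for the example.

    import numpy as np
    from scipy.stats import wasserstein_distance

    rng = np.random.default_rng(0)
    m = 200_000                                 # Monte Carlo sample size (assumed)
    z = rng.standard_normal(m)                  # reference standard-normal sample
    for n in (1, 10, 100, 1000, 10000):
        # Sum of n i.i.d. Exp(1) variables is Gamma(n, 1); centre and standardise it.
        s = (rng.gamma(n, 1.0, size=m) - n) / np.sqrt(n)
        print(n, wasserstein_distance(s, z))

The estimated distances decrease towards the Monte Carlo noise floor as n grows, in line with convergence in the transportation distance.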
208 - E. Carlen, A. Soffer 2011
We prove for the rescaled convolution map $f \to f \circledast f$ propagation of polynomial, exponential and Gaussian localization. The Gaussian localization is then used to prove an optimal bound on the rate of entropy production by this map. As an application we prove the convergence of the CLT to be at the optimal rate $1/\sqrt{n}$ in the entropy (and $L^1$) sense, for distributions with finite 4th moment.
138 - Fernando Cordero 2015
We consider a Moran model with two allelic types, mutation and selection. In this work, we study the behaviour of the proportion of fit individuals when the size of the population tends to infinity, without any rescaling of parameters or time. We first prove that the latter converges, uniformly on compact sets in probability, to the solution of an ordinary differential equation, which is explicitly solved. Next, we study the stability properties of its equilibrium points. Moreover, we show that the fluctuations of the proportion of fit individuals, after a proper normalization, satisfy a uniform central limit theorem in $[0, \infty)$. As a consequence, we deduce the convergence of the corresponding stationary distributions.
173 - Shige Peng 2008
We describe a new framework of a sublinear expectation space and the related notions and results of distributions and independence. A new notion of G-distributions is introduced, which generalizes our G-normal distribution in the sense that mean-uncertainty can also be described. We present our new result of a central limit theorem under sublinear expectation. This theorem can also be regarded as a generalization of the law of large numbers in the case of mean-uncertainty.