
Tight Bounds on the Renyi Entropy via Majorization with Applications to Guessing and Compression

Posted by: Igal Sason
Publication date: 2018
Research field: Information engineering
Paper language: English
Author: Igal Sason





This paper provides tight bounds on the Renyi entropy of a function of a discrete random variable with a finite number of possible values, where the considered function is not one-to-one. To that end, a tight lower bound on the Renyi entropy of a discrete random variable with a finite support is derived as a function of the size of the support, and the ratio of the maximal to minimal probability masses. This work was inspired by the recently published paper by Cicalese et al., which is focused on the Shannon entropy, and it strengthens and generalizes the results of that paper to Renyi entropies of arbitrary positive orders. In view of these generalized bounds and the works by Arikan and Campbell, non-asymptotic bounds are derived for guessing moments and lossless data compression of discrete memoryless sources.
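
For concreteness, and as a standard definition rather than the paper's own statement: the Renyi entropy of order $\alpha \in (0,1) \cup (1,\infty)$ of a discrete random variable $X$ with probability mass function $P$ on a finite set $\mathcal{X}$ is

$$ H_\alpha(X) = \frac{1}{1-\alpha} \, \log \sum_{x \in \mathcal{X}} P^\alpha(x), $$

which recovers the Shannon entropy $H(X)$ in the limit $\alpha \to 1$. The lower bound described above is parametrized by $|\mathcal{X}|$ and by the ratio of the maximal to the minimal probability mass of $P$.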




Read also

Igal Sason and Sergio Verdu, 2018
This paper provides upper and lower bounds on the optimal guessing moments of a random variable taking values on a finite set when side information may be available. These moments quantify the number of guesses required for correctly identifying the unknown object and, similarly to Arikan's bounds, they are expressed in terms of the Arimoto-Renyi conditional entropy. Although Arikan's bounds are asymptotically tight, the improvement of the bounds in this paper is significant in the non-asymptotic regime. Relationships between moments of the optimal guessing function and the MAP error probability are also established, characterizing the exact locus of their attainable values. The bounds on optimal guessing moments serve to improve non-asymptotic bounds on the cumulant generating function of the codeword lengths for fixed-to-variable optimal lossless source coding without prefix constraints. Non-asymptotic bounds on the reliability function of discrete memoryless sources are derived as well. Relying on these techniques, lower bounds on the cumulant generating function of the codeword lengths are derived, by means of the smooth Renyi entropy, for source codes that allow decoding errors.
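As a brief reminder of the setting (standard definitions, not this paper's notation): a guessing function for $X$ taking values in a finite set $\mathcal{X}$ is a one-to-one map $G: \mathcal{X} \to \{1, \dots, |\mathcal{X}|\}$ giving the position of each value in the guesser's list, and the $\rho$-th guessing moment is $\mathbb{E}[G(X)^\rho]$. Arikan's bounds yield, for a discrete memoryless source, the asymptotic exponent

$$ \lim_{n \to \infty} \frac{1}{n} \log \mathbb{E}\big[ G^*(X^n)^\rho \big] = \rho \, H_{\frac{1}{1+\rho}}(X), \qquad \rho > 0, $$

where $G^*$ orders the source sequences by decreasing probability; with side information $Y$, the exponent is usually stated with the Arimoto-Renyi conditional entropy $H_{\frac{1}{1+\rho}}(X|Y)$ in place of $H_{\frac{1}{1+\rho}}(X)$.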
Eshed Ram and Igal Sason, 2016
This paper gives improved Renyi entropy power inequalities (R-EPIs). Consider a sum $S_n = \sum_{k=1}^n X_k$ of $n$ independent continuous random vectors taking values on $\mathbb{R}^d$, and let $\alpha \in [1, \infty]$. An R-EPI provides a lower bound on the order-$\alpha$ Renyi entropy power of $S_n$ that, up to a multiplicative constant (which may depend in general on $n$, $\alpha$, and $d$), is equal to the sum of the order-$\alpha$ Renyi entropy powers of the $n$ random vectors $\{X_k\}_{k=1}^n$. For $\alpha = 1$, the R-EPI coincides with the well-known entropy power inequality by Shannon. The first improved R-EPI is obtained by tightening the recent R-EPI by Bobkov and Chistyakov, which relies on the sharpened Young's inequality. A further improvement of the R-EPI also relies on convex optimization and results on the rank-one modification of a real-valued diagonal matrix.
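For orientation (standard definitions, with $N_\alpha$ and the constant $c$ used here only as convenient labels): if $X$ has a density $f$ on $\mathbb{R}^d$, its order-$\alpha$ differential Renyi entropy and Renyi entropy power are

$$ h_\alpha(X) = \frac{1}{1-\alpha} \log \int_{\mathbb{R}^d} f^\alpha(x) \, dx, \qquad N_\alpha(X) = \exp\!\Big( \tfrac{2}{d} \, h_\alpha(X) \Big), $$

with $h_1$ understood as the differential (Shannon) entropy. An R-EPI of the kind described above is a bound of the form $N_\alpha(S_n) \ge c \sum_{k=1}^n N_\alpha(X_k)$ for independent summands; Shannon's EPI is the case $\alpha = 1$ with $c = 1$.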
Igal Sason, 2019
This paper is focused on derivations of data-processing and majorization inequalities for $f$-divergences, and their applications in information theory and statistics. For the accessibility of the material, the main results are first introduced without proofs, followed by exemplifications of the theorems with further related analytical results, interpretations, and information-theoretic applications. One application refers to the performance analysis of list decoding with either fixed or variable list sizes; some earlier bounds on the list decoding error probability are reproduced in a unified way, and new bounds are obtained and exemplified numerically. Another application is related to a study of the quality of approximating a probability mass function, induced by the leaves of a Tunstall tree, by an equiprobable distribution. The compression rates of finite-length Tunstall codes are further analyzed for asserting their closeness to the Shannon entropy of a memoryless and stationary discrete source. Almost all the analysis is relegated to the appendices, which form a major part of this manuscript.
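For reference, the standard definition: given a convex function $f: (0, \infty) \to \mathbb{R}$ with $f(1) = 0$, the $f$-divergence between probability mass functions $P$ and $Q$ on a common alphabet is

$$ D_f(P \| Q) = \sum_x Q(x) \, f\!\left( \frac{P(x)}{Q(x)} \right), $$

and the data-processing inequality states that $D_f(P_Y \| Q_Y) \le D_f(P_X \| Q_X)$ whenever $P_Y$ and $Q_Y$ are obtained from $P_X$ and $Q_X$ through the same stochastic transformation. This family includes the relative entropy ($f(t) = t \log t$) and the total variation distance ($f(t) = |t - 1|$) as special cases.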
Using a sharp version of the reverse Young inequality, and a Renyi entropy comparison result due to Fradelizi, Madiman, and Wang, the authors are able to derive Renyi entropy power inequalities for log-concave random vectors when the Renyi parameters belong to $(0,1)$. Furthermore, the estimates are shown to be sharp up to absolute constants.
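Here, as a standard definition, a density $f$ on $\mathbb{R}^d$ is log-concave if

$$ f\big( \lambda x + (1-\lambda) y \big) \ge f(x)^\lambda \, f(y)^{1-\lambda} \qquad \text{for all } x, y \in \mathbb{R}^d, \ \lambda \in [0,1], $$

a class that includes, for instance, Gaussian densities and uniform densities on convex bodies.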
Igal Sason and Sergio Verdu, 2015
A new upper bound on the relative entropy is derived as a function of the total variation distance for probability measures defined on a common finite alphabet. The bound improves a previously reported bound by Csiszar and Talata. It is further extended to an upper bound on the Renyi divergence of an arbitrary non-negative order (including $\infty$) as a function of the total variation distance.
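As a pointer to the quantities involved (standard definitions): for probability mass functions $P$ and $Q$ on a common finite alphabet, the Renyi divergence of order $\alpha \in (0,1) \cup (1,\infty)$ is

$$ D_\alpha(P \| Q) = \frac{1}{\alpha - 1} \log \sum_x P^\alpha(x) \, Q^{1-\alpha}(x), $$

which tends to the relative entropy $D(P \| Q)$ as $\alpha \to 1$, and to $D_\infty(P \| Q) = \log \max_x \frac{P(x)}{Q(x)}$ as $\alpha \to \infty$.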