
Maximum Rényi Entropy Rate

Posted by: Amos Lapidoth
Publication date: 2015
Research field: Information Engineering
Language: English





Two maximization problems of Rényi entropy rate are investigated: the maximization over all stochastic processes whose marginals satisfy a linear constraint, and the Burg-like maximization over all stochastic processes whose autocovariance function begins with some given values. The solutions are related to the solutions to the analogous maximization problems of Shannon entropy rate.
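The abstract does not spell out the quantity being maximized; as background (a standard formulation, not quoted from the paper), the order-$\alpha$ differential Rényi entropy of a density $f$ and the corresponding entropy rate of a stochastic process $\{X_k\}$ are usually defined as

$$ h_\alpha(f) = \frac{1}{1-\alpha} \log \int f^{\alpha}(x)\, \mathrm{d}x, \qquad h_\alpha(\{X_k\}) = \lim_{n \to \infty} \frac{1}{n}\, h_\alpha(X_1, \dots, X_n), $$

for $\alpha \in (0,1) \cup (1,\infty)$, whenever the limit exists; letting $\alpha \to 1$ recovers the Shannon differential entropy rate mentioned in the last sentence of the abstract.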




Read also

77 - Eshed Ram, Igal Sason 2016
This paper gives improved Rényi entropy power inequalities (R-EPIs). Consider a sum $S_n = \sum_{k=1}^n X_k$ of $n$ independent continuous random vectors taking values on $\mathbb{R}^d$, and let $\alpha \in [1, \infty]$. An R-EPI provides a lower bound on the order-$\alpha$ Rényi entropy power of $S_n$ that, up to a multiplicative constant (which may depend in general on $n, \alpha, d$), is equal to the sum of the order-$\alpha$ Rényi entropy powers of the $n$ random vectors $\{X_k\}_{k=1}^n$. For $\alpha = 1$, the R-EPI coincides with the well-known entropy power inequality by Shannon. The first improved R-EPI is obtained by tightening the recent R-EPI by Bobkov and Chistyakov, which relies on the sharpened Young's inequality. A further improvement of the R-EPI also relies on convex optimization and results on rank-one modification of a real-valued diagonal matrix.
Using a sharp version of the reverse Young inequality, and a Rényi entropy comparison result due to Fradelizi, Madiman, and Wang, the authors are able to derive Rényi entropy power inequalities for log-concave random vectors when the Rényi parameters belong to $(0,1)$. Furthermore, the estimates are shown to be sharp up to absolute constants.
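Both of the entries above deal with Rényi entropy power inequalities; for orientation (standard definitions, not taken from either abstract), the order-$\alpha$ Rényi entropy power of a random vector $X$ on $\mathbb{R}^d$ and the generic shape of an R-EPI can be written as

$$ N_\alpha(X) = \exp\!\left( \frac{2}{d}\, h_\alpha(X) \right), \qquad N_\alpha\!\left( \sum_{k=1}^{n} X_k \right) \ge c(n, \alpha, d) \sum_{k=1}^{n} N_\alpha(X_k), $$

where successive works tighten the constant $c(n, \alpha, d)$; for $\alpha = 1$ and $c = 1$ this is Shannon's entropy power inequality.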
63 - Igal Sason, Sergio Verdú 2017
This paper gives upper and lower bounds on the minimum error probability of Bayesian $M$-ary hypothesis testing in terms of the Arimoto-Rényi conditional entropy of an arbitrary order $\alpha$. The improved tightness of these bounds over their specialized versions with the Shannon conditional entropy ($\alpha = 1$) is demonstrated.
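The bounds above are expressed through the Arimoto-Rényi conditional entropy; a commonly used form of that quantity (given here as background, not taken from the abstract above) is, for a discrete $X$, an arbitrary $Y$, and $\alpha \in (0,1) \cup (1,\infty)$,

$$ H_\alpha(X \mid Y) = \frac{\alpha}{1-\alpha}\, \log\, \mathbb{E}_Y\!\left[ \left( \sum_{x} P_{X \mid Y}(x \mid Y)^{\alpha} \right)^{1/\alpha} \right], $$

which tends to the Shannon conditional entropy $H(X \mid Y)$ as $\alpha \to 1$.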
63 - Igal Sason 2018
This paper provides tight bounds on the Rényi entropy of a function of a discrete random variable with a finite number of possible values, where the considered function is not one-to-one. To that end, a tight lower bound on the Rényi entropy of a discrete random variable with a finite support is derived as a function of the size of the support, and the ratio of the maximal to minimal probability masses. This work was inspired by the recently published paper by Cicalese et al., which is focused on the Shannon entropy, and it strengthens and generalizes the results of that paper to Rényi entropies of arbitrary positive orders. In view of these generalized bounds and the works by Arikan and Campbell, non-asymptotic bounds are derived for guessing moments and lossless data compression of discrete memoryless sources.
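For reference (a standard definition, not part of the abstract), the order-$\alpha$ Rényi entropy of a discrete random variable $X$ with probability mass function $P_X$ on a finite support $\mathcal{X}$ is

$$ H_\alpha(X) = \frac{1}{1-\alpha} \log \sum_{x \in \mathcal{X}} P_X(x)^{\alpha}, \qquad \alpha \in (0,1) \cup (1,\infty), $$

and the lower bound described above is parametrized by the support size $|\mathcal{X}|$ and the ratio $\max_x P_X(x) / \min_x P_X(x)$.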
It is shown that adding hair, such as electric charge or angular momentum, to a black hole decreases the amount of entropy emission. This motivates us to study the emission rate of entropy from black holes and to conjecture a maximum limit (upper bound) on the rate of local entropy emission ($\dot{S}$) for thermal systems in four-dimensional spacetime, arguing that this upper bound is $\dot{S} \simeq k_{B} \sqrt{\frac{c^5}{\hbar G}}$. Also, by considering Rényi entropy, it is shown that the Bekenstein-Hawking entropy leads to a maximum limit for the rate of entropy emission. We also suggest an upper bound on the surface gravity of black holes, which is called the Planck surface gravity. Finally, we obtain a relation between the maximum rate of entropy emission, the Planck surface gravity, and the Planck temperature of black holes.