We establish a discrete analog of the Rényi entropy comparison due to Bobkov and Madiman. For log-concave variables on the integers, the min entropy is within log e of the usual Shannon entropy. Additionally, we investigate the entropic Rogers-Shephard inequality studied by Madiman and Kontoyiannis, and establish a sharp Rényi version for certain parameters in both the continuous and discrete cases.
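The comparison claimed above can be sketched as follows; this is our paraphrase of the stated result, and the precise hypotheses are those of the paper, not reproduced here:

```latex
% Sketch (our notation): for a log-concave random variable $X$ on $\mathbb{Z}$,
% with min entropy $H_\infty(X) = -\log \max_{k} \mathbb{P}(X = k)$,
% the abstract asserts that the gap to the Shannon entropy $H(X)$ is at most $\log e$:
H_\infty(X) \;\le\; H(X) \;\le\; H_\infty(X) + \log e .
```

The left inequality holds for any discrete random variable; the content of the result is the right inequality under log-concavity.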
We investigate the Rényi entropy of independent sums of integer-valued random variables through Fourier-theoretic means, and give sharp comparisons between the variance and the Rényi entropy for Poisson-Bernoulli variables. As applications we prove
This paper gives improved Rényi entropy power inequalities (R-EPIs). Consider a sum $S_n = \sum_{k=1}^n X_k$ of $n$ independent continuous random vectors taking values in $\mathbb{R}^d$, and let $\alpha \in [1, \infty]$. An R-EPI provides a lower bound
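For orientation, an R-EPI of the kind described typically takes the generic shape below; the constant $c$ and its dependence on $n$ and $\alpha$ are the paper's contribution and are not reproduced here, so this is only a schematic:

```latex
% Rényi entropy power (standard definition, assumed here):
%   N_\alpha(X) = \exp\!\big( \tfrac{2}{d}\, h_\alpha(X) \big),
% where $h_\alpha$ is the Rényi differential entropy of order $\alpha$.
% Generic R-EPI lower bound, with $c = c(n, \alpha)$ a constant:
N_\alpha\!\left( \sum_{k=1}^{n} X_k \right) \;\ge\; c \sum_{k=1}^{n} N_\alpha(X_k).
```

For $\alpha = 1$ and $c = 1$ this reduces to the classical Shannon-Stam entropy power inequality.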
Feature selection, in the context of machine learning, is the process of separating highly predictive features from those that might be irrelevant or redundant. Information theory has been recognized as a useful framework for this task, as the predi
Recently, a new quantum generalization of the Rényi divergence and the corresponding conditional Rényi entropies was proposed. Here we report on a surprising relation between conditional Rényi entropies based on this new generalization and conditional
We show that the sequence of moments of order less than 1 of averages of i.i.d. positive random variables is log-concave. For moments of order at least 1, we conjecture that the sequence is log-convex and show that this holds eventually for integer m
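The log-concavity/log-convexity dichotomy for the moment sequence can be made concrete as follows; the notation $m_p$ and $\bar{X}_n$ is ours, introduced only for illustration:

```latex
% Writing $\bar{X}_n$ for the average of the i.i.d. positive random variables
% and $m_p = \mathbb{E}\,\bar{X}_n^{\,p}$ for its $p$-th moment, log-concavity
% of the sequence $(m_p)$ means
m_p^2 \;\ge\; m_{p-1}\, m_{p+1} \qquad \text{for all admissible } p,
% while log-convexity is the reversed inequality,
m_p^2 \;\le\; m_{p-1}\, m_{p+1}.
```

The abstract asserts the first inequality for orders below 1 and conjectures the second for orders at least 1.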