
Further investigations of Rényi entropy power inequalities and an entropic characterization of s-concave densities

Added by Arnaud Marsiglietti
Publication date: 2019
Language: English





We investigate the role of convexity in Rényi entropy power inequalities. After proving that a general Rényi entropy power inequality in the style of Bobkov-Chistyakov (2015) fails when the Rényi parameter $r \in (0,1)$, we show that random vectors with $s$-concave densities do satisfy such a Rényi entropy power inequality. Along the way, we establish the convergence in the Central Limit Theorem for Rényi entropies of order $r \in (0,1)$ for log-concave densities and for compactly supported, spherically symmetric and unimodal densities, complementing a celebrated result of Barron (1986). Additionally, we give an entropic characterization of the class of $s$-concave densities, which extends a classical result of Cover and Zhang (1994).
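For context, the quantities in the abstract can be written as follows; this is the standard definition of the Rényi entropy and entropy power, not a formula quoted from the paper itself:

```latex
% Rényi entropy of order r of a random vector X with density f on R^d:
h_r(X) = \frac{1}{1-r} \log \int_{\mathbb{R}^d} f(x)^r \, dx,
\qquad r \in (0,1) \cup (1,\infty),
% with h_1(X) the Shannon differential entropy (the limit as r -> 1),
% and the associated Rényi entropy power
N_r(X) = e^{2 h_r(X)/d}.
```

An entropy power inequality of Bobkov-Chistyakov type then bounds $N_r$ of a sum of independent vectors below by a constant multiple of the sum of the individual entropy powers.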





Eshed Ram, Igal Sason (2016)
This paper gives improved Rényi entropy power inequalities (R-EPIs). Consider a sum $S_n = \sum_{k=1}^n X_k$ of $n$ independent continuous random vectors taking values on $\mathbb{R}^d$, and let $\alpha \in [1, \infty]$. An R-EPI provides a lower bound on the order-$\alpha$ Rényi entropy power of $S_n$ that, up to a multiplicative constant (which may depend in general on $n, \alpha, d$), is equal to the sum of the order-$\alpha$ Rényi entropy powers of the $n$ random vectors $\{X_k\}_{k=1}^n$. For $\alpha = 1$, the R-EPI coincides with the well-known entropy power inequality by Shannon. The first improved R-EPI is obtained by tightening the recent R-EPI by Bobkov and Chistyakov, which relies on the sharpened Young's inequality. A further improvement of the R-EPI also relies on convex optimization and results on rank-one modification of a real-valued diagonal matrix.
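In the notation of this abstract, an R-EPI is an inequality of the following schematic form, where $c(n, \alpha, d)$ is the multiplicative constant described above (this display is a paraphrase of the abstract, not a formula taken from the paper):

```latex
N_\alpha\!\left( \sum_{k=1}^{n} X_k \right) \;\ge\; c(n, \alpha, d) \sum_{k=1}^{n} N_\alpha(X_k),
\qquad N_\alpha(X) = e^{2 h_\alpha(X)/d}.
```

Taking $\alpha = 1$ and $c \equiv 1$ recovers Shannon's classical entropy power inequality.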
We investigate the Rényi entropy of independent sums of integer-valued random variables through Fourier-theoretic means, and give sharp comparisons between the variance and the Rényi entropy for Poisson-Bernoulli variables. As applications we prove that a discrete ``min-entropy power'' is superadditive on independent variables up to a universal constant, and give new bounds on an entropic generalization of the Littlewood-Offord problem that are sharp in the ``Poisson regime''.
We establish a discrete analog of the Rényi entropy comparison due to Bobkov and Madiman. For log-concave variables on the integers, the min-entropy is within $\log e$ of the usual Shannon entropy. Additionally, we investigate the entropic Rogers-Shephard inequality studied by Madiman and Kontoyiannis, and establish a sharp Rényi version for certain parameters in both the continuous and discrete cases.
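The min-entropy versus Shannon entropy gap in this abstract can be illustrated numerically. The sketch below (our own illustration, not code from the paper) uses the geometric distribution, which is log-concave on the integers, and checks that the gap stays below $\log e$ (i.e. below 1 when entropies are measured in nats):

```python
import math

def shannon_entropy_geometric(p):
    """Shannon entropy (in nats) of Geometric(p) on {0, 1, 2, ...},
    with P(X = k) = (1 - p)^k * p.
    Closed form: H = -((1 - p) ln(1 - p) + p ln p) / p."""
    q = 1.0 - p
    return -(q * math.log(q) + p * math.log(p)) / p

def min_entropy_geometric(p):
    """Min-entropy -log(max_k P(X = k)); the mode of Geometric(p) is k = 0,
    so the largest probability mass is p."""
    return -math.log(p)

# The geometric distribution is log-concave on the integers, so by the
# comparison stated in the abstract the gap H(X) - H_min(X) should not
# exceed log e = 1 nat; it approaches 1 nat as p -> 0.
for p in (0.9, 0.5, 0.1, 0.01):
    gap = shannon_entropy_geometric(p) - min_entropy_geometric(p)
    print(f"p = {p:5.2f}:  H - H_min = {gap:.4f} nat")
```

For $p = 0.5$ the gap is $\ln 2 \approx 0.693$ nat, and as $p \to 0$ it increases toward the stated bound of 1 nat without reaching it.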
Yiming Ding, Xuyan Xiang (2016)
Long memory, or long-range dependency, is an important phenomenon that may arise in the analysis of time series or spatial data. Most definitions of long memory of a stationary process $X = \{X_1, X_2, \cdots\}$ are based on the second-order properties of the process. The excess entropy of a stationary process is the summation of redundancies, which relates to the rate of convergence of the conditional entropy $H(X_n | X_{n-1}, \cdots, X_1)$ to the entropy rate. It is proved that the excess entropy is identical to the mutual information between the past and the future when the entropy $H(X_1)$ is finite. We suggest the definition that a stationary process is long memory if its excess entropy is infinite. Since the definition of excess entropy requires only a very weak moment condition on the distribution of the process, it can be applied to processes whose distributions lack a bounded second moment. A significant property of excess entropy is that it is invariant under invertible transformations, which enables us to obtain the excess entropy of a stationary process from that of another process. For stationary Gaussian processes, the excess entropy characterization of long memory relates well to the popular characterizations. It is proved that the excess entropy of fractional Gaussian noise is infinite if the Hurst parameter $H \in (1/2, 1)$.
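The excess entropy mentioned in this abstract admits the following standard formulation (a common textbook definition consistent with the abstract, not a display quoted from the paper), where $h$ denotes the entropy rate:

```latex
E \;=\; \sum_{n=1}^{\infty} \bigl( H(X_n \mid X_{n-1}, \cdots, X_1) - h \bigr),
\qquad h = \lim_{n \to \infty} H(X_n \mid X_{n-1}, \cdots, X_1),
```

where each summand is a nonnegative redundancy decreasing to zero; when $H(X_1) < \infty$, $E$ equals the mutual information between the past $(\cdots, X_{-1}, X_0)$ and the future $(X_1, X_2, \cdots)$, as stated in the abstract.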
Nathael Gozlan (2015)
We study an optimal weak transport cost related to the notion of convex order between probability measures. On the real line, we show that this weak transport cost is reached for a coupling that does not depend on the underlying cost function. As an application, we give a necessary and sufficient condition for weak transport-entropy inequalities in dimension one. In particular, we obtain a weak transport-entropy form of the convex Poincaré inequality in dimension one.
