An inequality is derived for the correlation of two univariate functions operating on symmetric bivariate normal random variables. The inequality is a simple consequence of the Cauchy-Schwarz inequality.
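The inequality itself is not stated in this abstract; as a hedged illustration of the type of statement involved, the sketch below numerically checks the classical Gebelein-type bound $|\mathrm{corr}(f(X), g(Y))| \le |\rho|$ for a symmetric bivariate normal pair with correlation coefficient $\rho$ and arbitrary square-integrable test functions $f$ and $g$. This is an assumed example of such a correlation inequality, not necessarily the paper's exact statement.

# Monte Carlo sketch (illustrative only): for (X, Y) symmetric bivariate normal with
# correlation rho, the classical Gebelein/maximal-correlation bound gives
# |corr(f(X), g(Y))| <= |rho| for any square-integrable f, g.
import numpy as np

rng = np.random.default_rng(0)
rho = 0.6
n = 1_000_000

x = rng.standard_normal(n)
y = rho * x + np.sqrt(1.0 - rho**2) * rng.standard_normal(n)

# Arbitrary (hypothetical) test functions f and g.
f = np.tanh(2.0 * x) + 0.3 * x**3
g = np.abs(y) - np.exp(-y)

corr = np.corrcoef(f, g)[0, 1]
print(f"corr(f(X), g(Y)) = {corr:+.4f}  (bound |rho| = {rho})")
assert abs(corr) <= rho + 1e-2  # small slack for Monte Carlo noise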
We study the Han-Kobayashi (HK) achievable sum rate for the two-user symmetric Gaussian interference channel. We find the optimal power split ratio between the common and private messages (assuming no time-sharing), and derive a closed-form expression for the corresponding sum rate. This provides a finer understanding of the achievable HK sum rate, and allows for precise comparisons between this sum rate and that of orthogonal signaling. One surprising finding is that, despite the channel being symmetric, allowing asymmetric power split ratios at the two users (i.e., asymmetric rates) can improve the sum rate significantly. In the high-SNR regime, we specify the interference level above which the sum rate achieved using asymmetric power splitting outperforms that of the symmetric case.
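The paper's closed-form optimal split and the resulting HK sum-rate expression are not reproduced here; as a minimal sketch of the quantities being compared, the snippet below evaluates two standard reference points for the symmetric Gaussian interference channel: the all-private endpoint of the HK power split (each receiver treats the interference as noise, which gives a simple lower bound on the optimized HK sum rate) and the orthogonal-signaling sum rate (time division with power boost). SNR and INR denote the direct-link and cross-link signal- and interference-to-noise ratios; rates are in bits per real channel use.

# Hedged sketch: two baseline sum rates for the symmetric two-user Gaussian
# interference channel (standard textbook expressions, not the paper's HK result).
import numpy as np

def tin_sum_rate(snr: float, inr: float) -> float:
    # All-private HK endpoint: interference treated as noise at each receiver.
    return 2 * 0.5 * np.log2(1.0 + snr / (1.0 + inr))

def orthogonal_sum_rate(snr: float) -> float:
    # Time division: each user active half the time with power 2P (same average power).
    return 0.5 * np.log2(1.0 + 2.0 * snr)

snr = 10 ** (20 / 10)  # assumed 20 dB direct link
for inr_db in (0, 5, 10, 15, 20):
    inr = 10 ** (inr_db / 10)
    print(f"INR = {inr_db:2d} dB: TIN sum rate = {tin_sum_rate(snr, inr):.2f}, "
          f"orthogonal sum rate = {orthogonal_sum_rate(snr):.2f} bit/use")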
Tight bounds for several symmetric divergence measures are introduced, given in terms of the total variation distance. Each of these bounds is attained by a pair of 2- or 3-element probability distributions. An application of these bounds to lossless source coding is provided, refining and improving a certain bound by Csiszar. A new inequality relating $f$-divergences is derived, and its use is exemplified. The last section of this conference paper does not appear in the recent journal paper published in the February 2015 issue of the IEEE Trans. on Information Theory (see arXiv:1403.7164); the conference paper also contains some new paragraphs throughout, linked to new references.
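The paper's closed-form bounds are not reproduced here; as a minimal sketch of the setting, the snippet below numerically explores the relation between two common symmetric divergences (Jeffreys and Jensen-Shannon) and the total variation distance over pairs of 2-element (Bernoulli) distributions, the kind of distributions at which the abstract says the tight bounds are attained.

# Hedged sketch: empirical minima of two symmetric divergences at fixed total
# variation, restricted to pairs of Bernoulli distributions. No claim is made
# that these empirical values coincide with the paper's bounds.
import numpy as np

def tv(p, q):               # total variation distance
    return 0.5 * np.abs(p - q).sum()

def kl(p, q):               # Kullback-Leibler divergence (natural log)
    return float(np.sum(p * np.log(p / q)))

def jeffreys(p, q):         # Jeffreys divergence: D(P||Q) + D(Q||P)
    return kl(p, q) + kl(q, p)

def jensen_shannon(p, q):   # Jensen-Shannon divergence
    m = 0.5 * (p + q)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

rng = np.random.default_rng(1)
min_j, min_js = {}, {}
for _ in range(200_000):
    a, b = rng.uniform(1e-3, 1 - 1e-3, size=2)
    p, q = np.array([a, 1 - a]), np.array([b, 1 - b])
    eps = round(tv(p, q), 2)                      # bucket by total variation level
    min_j[eps] = min(min_j.get(eps, np.inf), jeffreys(p, q))
    min_js[eps] = min(min_js.get(eps, np.inf), jensen_shannon(p, q))

for eps in (0.1, 0.3, 0.5):
    print(f"TV = {eps}: min Jeffreys ~ {min_j[eps]:.4f}, min JS ~ {min_js[eps]:.4f}")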
Using a sharp version of the reverse Young inequality, and a Renyi entropy comparison result due to Fradelizi, Madiman, and Wang, the authors derive Renyi entropy power inequalities for log-concave random vectors when the Renyi parameters belong to $(0,1)$. Furthermore, the estimates are shown to be sharp up to absolute constants.
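For reference, with the standard definitions (not taken from the paper): for a random vector $X$ in $\mathbb{R}^n$ with density $f$ and order $\alpha \in (0,1)$, the Renyi entropy is $h_\alpha(X) = \frac{1}{1-\alpha} \log \int_{\mathbb{R}^n} f(x)^\alpha \, dx$ and the Renyi entropy power is $N_\alpha(X) = e^{2 h_\alpha(X)/n}$; a Renyi entropy power inequality in this setting is a bound of the form $N_\alpha(X+Y) \ge c \, (N_\alpha(X) + N_\alpha(Y))$ for independent log-concave $X$ and $Y$, with $c$ an absolute constant.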
It was recently proposed to encode the one-sided exponential source $X$ via $K$ parallel channels, $Y_1, \ldots, Y_K$, such that the error signals $X - Y_i$, $i = 1, \ldots, K$, are one-sided exponential and mutually independent given $X$. Moreover, it was shown that the optimal estimator $\hat{Y}$ of the source $X$ with respect to the one-sided error criterion is simply given by the maximum of the outputs, i.e., $\hat{Y} = \max\{Y_1, \ldots, Y_K\}$. In this paper, we show that the distribution of the resulting estimation error $X - \hat{Y}$ is equivalent to that of the optimum noise in the backward test-channel of the one-sided exponential source, i.e., it is one-sided exponentially distributed and statistically independent of the joint output $Y_1, \ldots, Y_K$.
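A minimal Monte Carlo sketch of the elementary fact behind the estimator, under the simple additive construction $Y_i = X - E_i$ with $E_i \sim \mathrm{Exp}(\lambda_i)$ independent of each other and of $X$ (an assumed construction, which need not coincide with the paper's channels): the error $X - \hat{Y} = \min_i E_i$ is itself one-sided exponential with rate $\lambda_1 + \cdots + \lambda_K$. The paper's stronger claim, independence of this error from the joint output, is not checked here.

# Hedged sketch: with Y_i = X - E_i and independent one-sided exponential errors
# E_i ~ Exp(lambda_i), the estimation error X - max_i Y_i equals min_i E_i,
# which is exponential with rate sum(lambda_i).
import numpy as np

rng = np.random.default_rng(0)
n, K = 1_000_000, 3
lam = np.array([1.0, 2.0, 0.5])          # assumed per-channel error rates

x = rng.exponential(scale=1.0, size=n)   # one-sided exponential source (assumed unit rate)
e = rng.exponential(scale=1.0 / lam, size=(n, K))
y = x[:, None] - e                        # channel outputs Y_i = X - E_i
err = x - y.max(axis=1)                   # X - max_i Y_i = min_i E_i

print("mean error        :", err.mean())
print("1 / sum(lambda_i) :", 1.0 / lam.sum())   # exponential with the summed rate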
Two new proofs of the Fisher information inequality (FII) using data processing inequalities for mutual information and conditional variance are presented.
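The FII states that for independent random variables $X$ and $Y$ with differentiable densities, $1/J(X+Y) \ge 1/J(X) + 1/J(Y)$, where $J$ denotes Fisher information. The proofs themselves are not sketched here; the snippet below is only a hedged numerical check of the statement for one non-Gaussian example, with Fisher information computed by numerical convolution and differentiation on a grid.

# Hedged sketch: numerical check of 1/J(X+Y) >= 1/J(X) + 1/J(Y) for
# X ~ Laplace(0, 1) (J = 1) and Y ~ Normal(0, 1) (J = 1), independent.
import numpy as np

dx = 0.01
x = np.arange(-30, 30, dx)
f_lap = 0.5 * np.exp(-np.abs(x))                  # Laplace(0, 1) density
f_gau = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)     # Normal(0, 1) density

# Density of the sum via numerical convolution (Riemann-sum approximation).
f_sum = np.convolve(f_lap, f_gau, mode="same") * dx

def fisher_information(f, dx):
    # J = integral of (f')^2 / f over the region where f is non-negligible.
    df = np.gradient(f, dx)
    mask = f > 1e-12
    return np.sum(df[mask] ** 2 / f[mask]) * dx

J_sum = fisher_information(f_sum, dx)
print("1/J(X+Y)        :", 1.0 / J_sum)
print("1/J(X) + 1/J(Y) :", 1.0 + 1.0)   # should be the smaller of the two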