
An Inequality for the Correlation of Two Functions Operating on Symmetric Bivariate Normal Variables

Posted by: Uri Erez
Publication date: 2017
Research field: Information Engineering
Paper language: English





An inequality is derived for the correlation of two univariate functions operating on symmetric bivariate normal random variables. The inequality is a simple consequence of the Cauchy-Schwarz inequality.
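As a quick numerical illustration of the setting (not the paper's derivation), the following minimal Python sketch draws a symmetric bivariate normal pair with correlation rho and estimates the correlation of f(X) and g(Y) for two example univariate functions; the particular choices of f and g, and the comparison against |rho| (the classical Gebelein-Lancaster maximal-correlation bound for jointly Gaussian pairs), are illustrative assumptions rather than the paper's statement.

```python
# Monte Carlo sketch (illustrative assumptions, not the paper's derivation):
# estimate corr(f(X), g(Y)) for a symmetric bivariate normal pair (X, Y).
import numpy as np

rng = np.random.default_rng(0)
rho = 0.6          # correlation of the symmetric bivariate normal pair
n = 200_000        # number of Monte Carlo samples

# Zero mean, unit variances, correlation rho.
cov = [[1.0, rho], [rho, 1.0]]
x, y = rng.multivariate_normal([0.0, 0.0], cov, size=n).T

# Example univariate functions (arbitrary illustrative choices).
f = np.tanh
g = np.sign

corr = np.corrcoef(f(x), g(y))[0, 1]
print(f"estimated corr(f(X), g(Y)) = {corr:.4f}   |rho| = {abs(rho):.4f}")
```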




Read also

We study the Han-Kobayashi (HK) achievable sum rate for the two-user symmetric Gaussian interference channel. We find the optimal power split ratio between the common and private messages (assuming no time-sharing), and derive a closed-form expression for the corresponding sum rate. This provides a finer understanding of the achievable HK sum rate and allows for precise comparisons between this sum rate and that of orthogonal signaling. One surprising finding is that, despite the channel being symmetric, allowing an asymmetric power split ratio at the two users (i.e., asymmetric rates) can improve the sum rate significantly. Considering the high-SNR regime, we specify the interference channel gain above which the sum rate achieved using asymmetric power splitting outperforms the symmetric case.
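For context, a standard formulation of the two-user symmetric Gaussian interference channel and of the Han-Kobayashi private/common power split is sketched below; the notation ($a$ for the cross gain, $\alpha_i$ for the private power fraction) is a common textbook convention assumed here, not quoted from the paper.

```latex
% Symmetric Gaussian interference channel (assumed standard notation):
% direct gain 1, cross gain a, unit-variance noise, per-user power P.
\begin{align}
  y_1 &= x_1 + a\,x_2 + z_1, & y_2 &= x_2 + a\,x_1 + z_2, & z_i &\sim \mathcal{N}(0,1),\\
  x_i &= u_i + v_i, & \mathbb{E}[v_i^2] &= \alpha_i P, & \mathbb{E}[u_i^2] &= (1-\alpha_i) P,
\end{align}
% where v_i is the private (treated-as-noise) codeword, u_i the common codeword,
% and the split ratios alpha_1, alpha_2 may be chosen asymmetrically.
```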
Igal Sason, 2015
Tight bounds for several symmetric divergence measures are introduced, given in terms of the total variation distance. Each of these bounds is attained by a pair of 2- or 3-element probability distributions. An application of these bounds to lossless source coding is provided, refining and improving a certain bound by Csiszár. A new inequality relating $f$-divergences is derived, and its use is exemplified. The last section of this conference paper, as well as some new paragraphs throughout that are linked to new references, is not included in the journal paper published in the February 2015 issue of the IEEE Trans. on Information Theory (see arXiv:1403.7164).
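For reference, the two quantities discussed above admit the following standard definitions on a countable alphabet (standard textbook definitions, not quoted from the paper); here $f$ is convex with $f(1) = 0$.

```latex
% Total variation distance and f-divergence (standard definitions):
\begin{align}
  d_{\mathrm{TV}}(P, Q) &= \frac{1}{2} \sum_{x} \bigl| P(x) - Q(x) \bigr|, \\
  D_f(P \,\|\, Q)       &= \sum_{x} Q(x) \, f\!\left( \frac{P(x)}{Q(x)} \right).
\end{align}
```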
Using a sharp version of the reverse Young inequality, and a Rényi entropy comparison result due to Fradelizi, Madiman, and Wang, the authors are able to derive Rényi entropy power inequalities for log-concave random vectors when the Rényi parameters belong to $(0,1)$. Furthermore, the estimates are shown to be sharp up to absolute constants.
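For reference, the Rényi entropy of order $\alpha \neq 1$ of a random vector $X$ in $\mathbb{R}^n$ with density $f$, and the associated Rényi entropy power, are standardly defined as follows (the normalization of $N_\alpha$ varies slightly across papers; this is one common convention, not necessarily the one used above).

```latex
% Renyi entropy and Renyi entropy power of order alpha (one common convention):
\begin{align}
  h_\alpha(X) &= \frac{1}{1-\alpha} \log \int_{\mathbb{R}^n} f(x)^{\alpha} \, dx, \\
  N_\alpha(X) &= \exp\!\left( \frac{2}{n} \, h_\alpha(X) \right).
\end{align}
```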
Uri Erez, Jan Østergaard, 2020
It was recently proposed to encode the one-sided exponential source $X$ via $K$ parallel channels, $Y_1, \ldots, Y_K$, such that the error signals $X - Y_i$, $i = 1, \ldots, K$, are one-sided exponential and mutually independent given $X$. Moreover, it was shown that the optimal estimator $\hat{Y}$ of the source $X$ with respect to the one-sided error criterion is simply given by the maximum of the outputs, i.e., $\hat{Y} = \max\{Y_1, \ldots, Y_K\}$. In this paper, we show that the distribution of the resulting estimation error $X - \hat{Y}$ is equivalent to that of the optimum noise in the backward test-channel of the one-sided exponential source, i.e., it is one-sided exponentially distributed and statistically independent of the joint output $Y_1, \ldots, Y_K$.
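A minimal simulation sketch of this construction is given below, under the simplifying assumption that the per-channel error signals $X - Y_i$ are exponential with rates mu_i and drawn independently of $X$ (the rates and variable names are illustrative, not taken from the paper). It checks numerically that the error of the maximum-based estimator, $X - \max_i Y_i = \min_i (X - Y_i)$, behaves like a one-sided exponential with rate equal to the sum of the per-channel rates.

```python
# Simulation sketch (illustrative assumptions, not the paper's exact construction):
# one-sided exponential source X, outputs Y_i = X - E_i with independent
# exponential error signals E_i; the estimator is max_i Y_i.
import numpy as np

rng = np.random.default_rng(1)
n = 500_000
lam = 1.0                        # source rate (illustrative)
mu = np.array([0.5, 1.0, 2.0])   # per-channel error rates (illustrative)

x = rng.exponential(1.0 / lam, size=n)                      # source samples
e = rng.exponential(1.0 / mu[:, None], size=(len(mu), n))   # error signals E_i
y = x - e                                                   # channel outputs Y_i
err = x - y.max(axis=0)                                     # X - max_i Y_i = min_i E_i

print("empirical mean of estimation error:", err.mean())
print("1 / sum(mu) (exponential prediction):", 1.0 / mu.sum())
```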
Two new proofs of the Fisher information inequality (FII) using data processing inequalities for mutual information and conditional variance are presented.
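For reference, the Fisher information inequality in question states that, for independent random vectors $X$ and $Y$ with sufficiently smooth densities,

```latex
% Fisher information inequality (Stam/Blachman form), X and Y independent:
\begin{equation}
  \frac{1}{J(X+Y)} \;\ge\; \frac{1}{J(X)} + \frac{1}{J(Y)},
\end{equation}
```

with equality when $X$ and $Y$ are Gaussian with proportional covariance matrices.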