
Equality in the Matrix Entropy-Power Inequality and Blind Separation of Real and Complex sources

Published by: Olivier Rioul
Publication date: 2019
Research field: Information engineering
Paper language: English





The matrix version of the entropy-power inequality for real or complex coefficients and variables is proved using a transportation argument that easily settles the equality case. An application to blind source extraction is given.
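
For context, the classical entropy-power inequality for independent random vectors $X$ and $Y$ in $\mathbb{R}^n$ reads

$$ N(X+Y) \;\geq\; N(X) + N(Y), \qquad N(X) = \frac{1}{2\pi e}\, e^{\frac{2}{n} h(X)}, $$

with equality if and only if $X$ and $Y$ are Gaussian with proportional covariances. The matrix version at issue is, presumably, the Zamir–Feder formulation: for a matrix $A$ and a random vector $X$ with independent components,

$$ h(AX) \;\geq\; h(AX^{*}), $$

where $X^{*}$ is Gaussian with independent components satisfying $h(X^{*}_i) = h(X_i)$ for every $i$.
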


Read also

The paper establishes the equality condition in the I-MMSE proof of the entropy power inequality (EPI). This is done by deriving an exact expression for the deficit between the two sides of the EPI. Interestingly, a necessary condition for equality is obtained by making a connection to the famous Cauchy functional equation.
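
For reference, the I-MMSE relation of Guo, Shamai, and Verdú underlying this proof reads, for a scalar Gaussian channel with $Z \sim \mathcal{N}(0,1)$ independent of $X$,

$$ \frac{d}{d\gamma}\, I\big(X;\, \sqrt{\gamma}\, X + Z\big) \;=\; \frac{1}{2}\, \mathrm{mmse}(X, \gamma), $$

and the Cauchy functional equation in question is $f(x+y) = f(x) + f(y)$, whose only measurable solutions are the linear functions $f(x) = cx$.
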
An extension of the entropy power inequality to the form $N_r^\alpha(X+Y) \geq N_r^\alpha(X) + N_r^\alpha(Y)$ with arbitrary independent summands $X$ and $Y$ in $\mathbb{R}^n$ is obtained for the Rényi entropy and powers $\alpha \geq (r+1)/2$.
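
Here $N_r$ denotes the Rényi entropy power, which (up to a normalizing constant that some authors include) is defined by

$$ N_r(X) \;=\; \exp\!\Big(\frac{2}{n}\, h_r(X)\Big), \qquad h_r(X) \;=\; \frac{1}{1-r}\, \log \int_{\mathbb{R}^n} f_X(x)^r \, dx, $$

recovering the Shannon entropy power in the limit $r \to 1$.
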
Using a sharp version of the reverse Young inequality, and a Rényi entropy comparison result due to Fradelizi, Madiman, and Wang, the authors derive Rényi entropy power inequalities for log-concave random vectors when the Rényi parameters belong to $(0,1)$. Furthermore, the estimates are shown to be sharp up to absolute constants.
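
For orientation, the reverse Young inequality invoked here states, for nonnegative $f, g$ and exponents $0 < p, q, r \leq 1$ with $1/p + 1/q = 1 + 1/r$, that

$$ \|f * g\|_{r} \;\geq\; C_{p,q,r}^{\,n}\, \|f\|_{p}\, \|g\|_{q}, $$

with a sharp constant $C_{p,q,r}$ depending only on the exponents; the inequality sign is reversed relative to the classical range $p, q, r \geq 1$.
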
This paper considers an entropy-power inequality (EPI) of Costa and presents a natural vector generalization with a real positive semidefinite matrix parameter. This new inequality is proved using a perturbation approach via a fundamental relationship between the derivative of mutual information and the minimum mean-square error (MMSE) estimate in linear vector Gaussian channels. As an application, a new extremal entropy inequality is derived from the generalized Costa EPI and then used to establish the secrecy capacity regions of the degraded vector Gaussian broadcast channel with layered confidential messages.
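
For reference, Costa's EPI asserts that the entropy power is concave along the addition of Gaussian noise: with $Z \sim \mathcal{N}(0, I_n)$ independent of $X$,

$$ t \;\mapsto\; N\big(X + \sqrt{t}\, Z\big) \quad \text{is concave on } t \geq 0. $$

The generalization described above replaces the scalar parameter $t$ by a real positive semidefinite matrix.
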
The distributed remote source coding (so-called CEO) problem is studied in the case where the underlying source, not necessarily Gaussian, has finite differential entropy and the observation noise is Gaussian. The main result is a new lower bound on the sum-rate-distortion function under arbitrary distortion measures. When specialized to mean-squared error, the bound exactly mirrors a corresponding upper bound, except that the upper bound involves the source power (variance) where the lower bound has the source entropy power. Bounds exhibiting this pleasing duality of power and entropy power have been well known for direct and centralized source coding since Shannon's work. While the bounds hold generally, their value is most pronounced when interpreted as a function of the number of agents in the CEO problem.
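
The duality alluded to is the classical sandwich on the mean-squared-error rate-distortion function of a scalar source $X$ with variance $\sigma^2$:

$$ \frac{1}{2}\, \log^{+}\! \frac{N(X)}{D} \;\leq\; R(D) \;\leq\; \frac{1}{2}\, \log^{+}\! \frac{\sigma^{2}}{D}, $$

where $N(X) = \frac{1}{2\pi e}\, e^{2 h(X)}$ is the entropy power and the lower bound is Shannon's lower bound; the CEO bounds mirror this pattern, with the source power in the upper bound and the source entropy power in the lower bound.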