The distributed remote source coding (so-called CEO) problem is studied in the case where the underlying source, not necessarily Gaussian, has finite differential entropy and the observation noise is Gaussian. The main result is a new lower bound for the sum-rate-distortion function under arbitrary distortion measures. When specialized to the case of mean-squared error, it is shown that the bound exactly mirrors a corresponding upper bound, except that the upper bound has the source power (variance) whereas the lower bound has the source entropy power. Bounds exhibiting this pleasing duality of power and entropy power have been well known for direct and centralized source coding since Shannon's work. While the bounds hold generally, their value is most pronounced when interpreted as a function of the number of agents in the CEO problem.
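For reference (standard definitions, not restated in the abstract): the entropy power of a source $X$ with differential entropy $h(X)$ is $N(X) = \frac{1}{2\pi e}\, e^{2h(X)}$. Because the Gaussian maximizes differential entropy for a given variance, $N(X) \leq \sigma_X^2$ with equality if and only if $X$ is Gaussian, so a lower bound phrased in terms of the entropy power $N(X)$ naturally sits below an upper bound phrased in terms of the power $\sigma_X^2$.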
An extension of the entropy power inequality to the form $N_r^\alpha(X+Y) \geq N_r^\alpha(X) + N_r^\alpha(Y)$ with arbitrary independent summands $X$ and $Y$ in $\mathbb{R}^n$ is obtained for the Rényi entropy and powers $\alpha \geq (r+1)/2$.
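For orientation, the Rényi entropy of order $r$ of a density $f$ on $\mathbb{R}^n$ is $h_r(X) = \frac{1}{1-r} \log \int f^r$ (with $r \to 1$ recovering the Shannon differential entropy), and the associated Rényi entropy power is $N_r(X) = e^{2 h_r(X)/n}$, up to a normalization constant that varies between authors. Taking $r = 1$ and $\alpha = 1$, for which the threshold $(r+1)/2$ equals $1$, recovers the classical Shannon entropy power inequality $N(X+Y) \geq N(X) + N(Y)$.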
This paper gives improved Rényi entropy power inequalities (R-EPIs). Consider a sum $S_n = \sum_{k=1}^n X_k$ of $n$ independent continuous random vectors taking values on $\mathbb{R}^d$, and let $\alpha \in [1, \infty]$. An R-EPI provides a lower bound on the order-$\alpha$ Rényi entropy power of $S_n$ in terms of the Rényi entropy powers of the individual summands $X_k$.
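Schematically, in the notation of the preceding abstracts, an R-EPI of order $\alpha$ is an inequality of the form $N_\alpha(S_n) \geq c \sum_{k=1}^n N_\alpha(X_k)$, where $N_\alpha$ is the order-$\alpha$ Rényi entropy power and $c > 0$ is a constant that may depend on $\alpha$ and $n$; the exact constants are the subject of the improvements announced here.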
Source-channel coding for an energy-limited wireless sensor node is investigated. The sensor node observes independent Gaussian source samples with variances changing over time slots and transmits to a destination over a flat fading channel.
This note contributes to the understanding of generalized entropy power inequalities. Our main goal is to construct a counter-example regarding monotonicity and entropy comparison of weighted sums of independent identically distributed log-concave random variables.
We determine the rate region of the vector Gaussian one-helper source-coding problem under a covariance matrix distortion constraint. The rate region is achieved by a simple scheme that separates the lossy vector quantization from the lossless spatial compression.