
Information-Distilling Quantizers

Posted by Or Ordentlich
Date published: 2018
Research field: Information engineering
Paper language: English





Let $X$ and $Y$ be dependent random variables. This paper considers the problem of designing a scalar quantizer for $Y$ that maximizes the mutual information between the quantizer's output and $X$, and develops fundamental properties and bounds for this form of quantization, which is connected to the log-loss distortion criterion. The main focus is the regime of low $I(X;Y)$, where it is shown that, if $X$ is binary, a constant fraction of the mutual information can always be preserved using $\mathcal{O}(\log(1/I(X;Y)))$ quantization levels, and there exist distributions for which this many quantization levels are necessary. Furthermore, for larger finite alphabets $2 < |\mathcal{X}| < \infty$, it is established that an $\eta$-fraction of the mutual information can be preserved using roughly $(\log(|\mathcal{X}|/I(X;Y)))^{\eta\cdot(|\mathcal{X}|-1)}$ quantization levels.
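To make the objective concrete, here is a minimal brute-force sketch in Python (ours, not the paper's construction), assuming binary $X$ and a small finite alphabet for $Y$. It uses the standard fact that, for binary $X$, an optimal mutual-information-maximizing quantizer is contiguous in the posterior $P(X=1 \mid Y=y)$, so only contiguous partitions of the posterior-sorted symbols need to be searched.

```python
import itertools
import numpy as np

def mutual_information(pxy):
    """I(X;Y) in bits for a joint pmf given as a 2-D array pxy[x, y]."""
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    mask = pxy > 0
    return float((pxy[mask] * np.log2(pxy[mask] / (px @ py)[mask])).sum())

def best_quantizer(pxy, k):
    """Exhaustive search for a k-level quantizer of Y maximizing I(X; Q(Y)).

    For binary X an optimal quantizer is contiguous in the posterior
    P(X=1 | Y=y), so it suffices to sort the Y-symbols by posterior and
    try every contiguous partition into k cells.
    """
    post = pxy[1] / pxy.sum(axis=0)    # posterior P(X=1 | Y=y)
    order = np.argsort(post)           # Y-symbols sorted by posterior
    n = pxy.shape[1]
    best_val, best_bounds = -1.0, None
    for cuts in itertools.combinations(range(1, n), k - 1):
        bounds = (0,) + cuts + (n,)
        # joint pmf of (X, Q(Y)) induced by this partition
        pxq = np.stack([pxy[:, order[a:b]].sum(axis=1)
                        for a, b in zip(bounds, bounds[1:])], axis=1)
        val = mutual_information(pxq)
        if val > best_val:
            best_val, best_bounds = val, bounds
    return best_val, best_bounds

# Example: how much of I(X;Y) survives 2, 3, or 4 quantization levels?
rng = np.random.default_rng(0)
pxy = rng.random((2, 8)); pxy /= pxy.sum()
print("I(X;Y) =", mutual_information(pxy))
for k in (2, 3, 4):
    print(k, "levels:", best_quantizer(pxy, k)[0])
```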




Read also

Tobias Koch, Amos Lapidoth (2011)
We study channel capacity when a one-bit quantizer is employed at the output of the discrete-time average-power-limited Rayleigh-fading channel. We focus on the low signal-to-noise ratio regime, where communication at very low spectral efficiencies takes place, as in Spread Spectrum and Ultra-Wideband communications. We demonstrate that, in this regime, the best one-bit quantizer does not reduce the asymptotic capacity of the coherent channel, but it does reduce that of the noncoherent channel.
Tobias Koch, Amos Lapidoth (2012)
We study the capacity of the discrete-time Gaussian channel when its output is quantized with a one-bit quantizer. We focus on the low signal-to-noise ratio (SNR) regime, where communication at very low spectral efficiencies takes place. In this regime a symmetric threshold quantizer is known to reduce channel capacity by a factor of $2/\pi$, i.e., to cause an asymptotic power loss of approximately two decibels. Here it is shown that this power loss can be avoided by using asymmetric threshold quantizers and asymmetric signaling constellations. To avoid this power loss, flash-signaling input distributions are essential; since such inputs have poor spectral efficiency, one-bit output quantization of the Gaussian channel reduces spectral efficiency. Threshold quantizers are not only asymptotically optimal: at every fixed SNR a threshold quantizer maximizes capacity among all one-bit output quantizers. The picture changes on the Rayleigh-fading channel. In the noncoherent case a one-bit output quantizer causes an unavoidable low-SNR asymptotic power loss. In the coherent case, however, this power loss is avoidable provided that we allow the quantizer to depend on the fading level.
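As a quick numeric sanity check of the $2/\pi$ factor (ours, not from the paper), assume BPSK signaling $X = \pm\sqrt{\mathrm{SNR}}$ over unit-variance real Gaussian noise with a symmetric sign quantizer: the resulting channel is a BSC with crossover probability $Q(\sqrt{\mathrm{SNR}})$, and its rate divided by the unquantized low-SNR capacity of $\mathrm{SNR}/2$ nats approaches $2/\pi \approx 0.6366$ as the SNR vanishes.

```python
import math

def one_bit_bpsk_rate_nats(snr):
    """I(X; sign(X+N)) in nats for BPSK X = +/-sqrt(snr), N ~ N(0, 1):
    a BSC with crossover probability p = Q(sqrt(snr))."""
    p = 0.5 * math.erfc(math.sqrt(snr / 2))          # Q(sqrt(snr))
    h = -p * math.log(p) - (1 - p) * math.log(1 - p)  # binary entropy, nats
    return math.log(2) - h                            # BSC capacity at p

# Ratio to the unquantized low-SNR capacity SNR/2 nats -> 2/pi as SNR -> 0
for snr in (1.0, 0.1, 0.01, 0.001):
    print(snr, one_bit_bpsk_rate_nats(snr) / (snr / 2), "vs", 2 / math.pi)
```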
John Kieffer, Yu Liao (2010)
Let G be a finite strongly connected aperiodic directed graph in which each edge carries a label from a finite alphabet A. Then G induces a trellis coded quantizer for encoding a memoryless source with alphabet A. A source sequence of long finite length is encoded by finding a path in G of that length whose sequence of labels is closest in Hamming distance to the source sequence; finding the minimum-distance path is a dynamic programming problem that is solved using the Viterbi algorithm. We show how a Markov chain can be used to obtain a closed-form expression for the asymptotic expected Hamming distortion per sample as the number of encoded source samples increases without bound.
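The encoding step lends itself to a compact dynamic program. Below is a minimal Python sketch (ours; names are hypothetical), assuming states are numbered 0..n_states-1, any state may start or end the path, and distortion is per-symbol Hamming distance; the paper's Markov-chain analysis of the asymptotic distortion is not reproduced here.

```python
def viterbi_encode(edges, source, n_states):
    """Find a path whose label sequence is closest in Hamming distance
    to `source`.  edges: list of (from_state, to_state, label) triples."""
    INF = float("inf")
    cost = [0.0] * n_states          # any state may start the path
    back = []                        # back-pointers, one list per time step
    for sym in source:
        new_cost = [INF] * n_states
        step = [None] * n_states
        for u, v, lab in edges:
            c = cost[u] + (lab != sym)      # Hamming distortion of this edge
            if c < new_cost[v]:
                new_cost[v], step[v] = c, (u, lab)
        back.append(step)
        cost = new_cost
    v = min(range(n_states), key=cost.__getitem__)
    best, labels = cost[v], []
    for step in reversed(back):             # trace the optimal path backwards
        u, lab = step[v]
        labels.append(lab)
        v = u
    return best, labels[::-1]

# Two-state graph over the binary alphabet (strongly connected, aperiodic)
edges = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]
print(viterbi_encode(edges, [1, 1, 0, 1, 0, 0], n_states=2))
```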
While Kolmogorov complexity is the accepted absolute measure of information content in an individual finite object, a similarly absolute notion is needed for the information distance between two individual objects, for example, two pictures. We give several natural definitions of a universal information metric, based on length of shortest programs for either ordinary computations or reversible (dissipationless) computations. It turns out that these definitions are equivalent up to an additive logarithmic term. We show that the information distance is a universal cognitive similarity distance. We investigate the maximal correlation of the shortest programs involved, the maximal uncorrelation of programs (a generalization of the Slepian-Wolf theorem of classical information theory), and the density properties of the discrete metric spaces induced by the information distances. A related distance measures the amount of nonreversibility of a computation. Using the physical theory of reversible computation, we give an appropriate (universal, anti-symmetric, and transitive) measure of the thermodynamic work required to transform one object into another by the most efficient process. Information distance between individual objects is needed in pattern recognition, where one wants to express effective notions of pattern similarity or cognitive similarity between individual objects, and in thermodynamics of computation, where one wants to analyse the energy dissipation of a computation from a particular input to a particular output.
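Because Kolmogorov complexity is uncomputable, these distances cannot be evaluated exactly in practice. A standard computable stand-in (the normalized compression distance of later work by Cilibrasi and Vitanyi, not something defined in this paper) replaces shortest-program length with the output length of a real compressor. A minimal sketch using zlib:

```python
import zlib

def c(data: bytes) -> int:
    """Compressed length, standing in for the Kolmogorov complexity K."""
    return len(zlib.compress(data, 9))

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance: a computable proxy for the
    normalized information distance max(K(x|y), K(y|x)) / max(K(x), K(y))."""
    cx, cy = c(x), c(y)
    return (c(x + y) - min(cx, cy)) / max(cx, cy)

s1 = b"the quick brown fox jumps over the lazy dog" * 20
s2 = b"the quick brown fox jumps over the lazy cat" * 20
junk = bytes(range(256)) * 8
print(ncd(s1, s2))    # small: the strings share nearly all their information
print(ncd(s1, junk))  # near 1: essentially nothing in common
```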
The information that two random variables $Y$, $Z$ contain about a third random variable $X$ can have aspects of shared information (contained in both $Y$ and $Z$), of complementary information (only available from $(Y,Z)$ together), and of unique information (contained exclusively in either $Y$ or $Z$). Here, we study the measures $\widetilde{SI}$ of shared, $\widetilde{UI}$ of unique, and $\widetilde{CI}$ of complementary information introduced by Bertschinger et al., which are motivated from a decision-theoretic perspective. We find that in most cases the intuitive rule that more variables contain more information applies, with the exception that $\widetilde{SI}$ and $\widetilde{CI}$ are not monotone in the target variable $X$. Additionally, we show that it is not possible to extend the bivariate information decomposition into $\widetilde{SI}$, $\widetilde{UI}$, and $\widetilde{CI}$ to a non-negative decomposition on the partial information lattice of Williams and Beer. Nevertheless, the quantities $\widetilde{UI}$, $\widetilde{SI}$, and $\widetilde{CI}$ have a well-defined interpretation, even in the multivariate setting.
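For intuition: Bertschinger et al. define the unique information as $\widetilde{UI}(X; Y \setminus Z) = \min_Q I_Q(X;Y \mid Z)$, minimizing over all joint distributions $Q$ that agree with $P$ on the $(X,Y)$- and $(X,Z)$-marginals; $\widetilde{SI}$ then follows from $I(X;Y) = \widetilde{SI} + \widetilde{UI}(X; Y \setminus Z)$. Below is a brute-force grid-search sketch for binary variables (our approximation of their convex program; function names are hypothetical), exploiting the fact that with fixed marginals the conditional $Q(y,z \mid x)$ has a single free coupling parameter per value of $x$.

```python
import itertools
import numpy as np

def mi_cond(q):
    """I_Q(X;Y|Z) in bits for a joint pmf q indexed as q[x, y, z]."""
    qz, qxz, qyz = q.sum(axis=(0, 1)), q.sum(axis=1), q.sum(axis=0)
    val = 0.0
    for x, y, z in itertools.product(range(2), repeat=3):
        if q[x, y, z] > 0:
            val += q[x, y, z] * np.log2(q[x, y, z] * qz[z]
                                        / (qxz[x, z] * qyz[y, z]))
    return val

def unique_information(p, grid=201):
    """Approximate UI(X; Y\\Z) = min_Q I_Q(X;Y|Z) over joints Q matching
    p's (X,Y)- and (X,Z)-marginals, by grid search (binary variables)."""
    px = p.sum(axis=(1, 2))
    a = p.sum(axis=2)[:, 1] / px   # P(Y=1 | X=x)
    b = p.sum(axis=1)[:, 1] / px   # P(Z=1 | X=x)
    t0s = np.linspace(max(0, a[0] + b[0] - 1), min(a[0], b[0]), grid)
    t1s = np.linspace(max(0, a[1] + b[1] - 1), min(a[1], b[1]), grid)
    best = np.inf
    for t0, t1 in itertools.product(t0s, t1s):
        q = np.empty((2, 2, 2))
        for x, t in ((0, t0), (1, t1)):   # t = Q(Y=1, Z=1 | X=x)
            q[x] = px[x] * np.array([[1 - a[x] - b[x] + t, b[x] - t],
                                     [a[x] - t,            t       ]])
        best = min(best, mi_cond(q))
    return best

# X = Y XOR Z with (Y, Z) uniform: all information is complementary, UI ~ 0
p = np.zeros((2, 2, 2))
for y, z in itertools.product(range(2), repeat=2):
    p[y ^ z, y, z] = 0.25
print(unique_information(p))   # prints ~0.0
```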