The Shannon lower bound is one of the few lower bounds on the rate-distortion function that hold for a large class of sources. In this paper, it is demonstrated that its gap to the rate-distortion function vanishes as the allowed distortion tends to zero for all sources having finite differential entropy and whose integer part has finite entropy. Conversely, it is demonstrated that if the integer part of the source has infinite entropy, then its rate-distortion function is infinite for every finite distortion. Consequently, the Shannon lower bound provides an asymptotically tight bound on the rate-distortion function if, and only if, the integer part of the source has finite entropy.
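For concreteness, the Shannon lower bound referred to above can be stated in its standard form (this formula is textbook material, not quoted from the abstract): for a real-valued source $X$ with finite differential entropy $h(X)$ under mean-squared-error distortion $D$,
$$
R(D) \;\ge\; R_{\mathrm{SLB}}(D) \;=\; h(X) - \tfrac{1}{2}\log\!\left(2\pi e D\right),
$$
and the abstract's result is that $R(D) - R_{\mathrm{SLB}}(D) \to 0$ as $D \to 0$ precisely when the entropy $H(\lfloor X \rfloor)$ of the integer part is finite.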
Shannon gave a lower bound in 1959 on the binary rate of spherical codes of given minimum Euclidean distance $\rho$. Using nonconstructive codes over a finite alphabet, we give a lower bound that is weaker but very close for small values of $\rho$. …
We prove a Bernstein-type bound for the difference between the average of negative log-likelihoods of independent discrete random variables and the Shannon entropy, both defined on a countably infinite alphabet. The result holds for the class of discrete …
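The quantity the abstract concentrates is easy to illustrate numerically. The sketch below (a toy illustration, not the paper's construction; the distribution and sample size are arbitrary choices) draws i.i.d. samples from a finite discrete distribution and compares the average negative log-likelihood with the Shannon entropy, around which it concentrates:

```python
import math
import random

# Toy discrete distribution p on a finite alphabet (the paper treats
# countably infinite alphabets; a finite one suffices for illustration).
p = {0: 0.5, 1: 0.25, 2: 0.125, 3: 0.125}

# Shannon entropy H(p) = -sum_x p(x) log p(x), in nats.
entropy = -sum(q * math.log(q) for q in p.values())

# Average negative log-likelihood (1/n) sum_i -log p(X_i) over n i.i.d. draws.
random.seed(0)
n = 200_000
outcomes = random.choices(list(p.keys()), weights=list(p.values()), k=n)
avg_nll = sum(-math.log(p[x]) for x in outcomes) / n

print(f"H(p)    = {entropy:.4f} nats")
print(f"avg NLL = {avg_nll:.4f} nats")  # close to H(p) for large n
```

Bernstein-type bounds quantify exactly how fast the gap `avg_nll - entropy` shrinks with `n`, combining a variance term with a worst-case range term.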
A correlated phase-and-additive-noise (CPAN) mismatched model is developed for wavelength-division multiplexing over optical fiber channels governed by the nonlinear Schrödinger equation. Both the phase and additive noise processes of the CPAN model …
We consider the classic joint source-channel coding problem of transmitting a memoryless source over a memoryless channel. The focus of this work is on the long-standing open problem of finding the rate of convergence of the smallest attainable expected …
Regular perturbation is applied to the Manakov equation and motivates a generalized correlated phase-and-additive-noise model for wavelength-division multiplexing over dual-polarization optical fiber channels. The model includes three hidden Gauss-Markov …