On Strong Data-Processing and Majorization Inequalities with Applications to Coding Problems


Abstract

This work provides data-processing and majorization inequalities for $f$-divergences and considers some of their applications to coding problems. It also provides tight bounds on the Rényi entropy of a function of a discrete random variable with a finite number of possible values, where the function is not one-to-one; the derivation of these bounds relies on majorization and the Schur-concavity of the Rényi entropy. One application of the $f$-divergence inequalities is the performance analysis of list decoding with either fixed or variable list sizes: some earlier bounds on the list-decoding error probability are reproduced in a unified way, and new bounds are obtained and exemplified numerically. Another application studies the quality of approximating a probability mass function, induced by the leaves of a Tunstall tree, by an equiprobable distribution. The compression rates of finite-length Tunstall codes are further analyzed to assert their closeness to the Shannon entropy of a stationary, memoryless discrete source. In view of the tight bounds on the Rényi entropy and the work by Campbell, non-asymptotic bounds are derived for lossless data compression of discrete memoryless sources.
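For orientation, here is a minimal sketch of the two central tools named above, stated as standard definitions (with generic notation $P$, $\mathcal{X}$, $\alpha$) rather than as formulas taken from the paper itself. The Rényi entropy of order $\alpha$ of a probability mass function $P$ on a finite alphabet $\mathcal{X}$ is
\[
H_\alpha(P) = \frac{1}{1-\alpha} \, \log \sum_{x \in \mathcal{X}} P(x)^{\alpha},
\qquad \alpha \in (0,1) \cup (1,\infty),
\]
with $H_1(P)$ defined by continuity as the Shannon entropy. If $P$ majorizes $Q$, i.e., the partial sums of the components of $P$ sorted in decreasing order dominate those of $Q$, then $H_\alpha(P) \le H_\alpha(Q)$ for every $\alpha > 0$. This Schur-concavity is what turns majorization relations between distributions into the entropy bounds described above.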
