
On Relations Between the Relative Entropy and $\chi^2$-Divergence, Generalizations and Applications

Posted by: Igal Sason
Publication date: 2020
Research field: Information engineering
Paper language: English





The relative entropy and chi-squared divergence are fundamental divergence measures in information theory and statistics. This paper is focused on a study of integral relations between the two divergences, the implications of these relations, their information-theoretic applications, and some generalizations pertaining to the rich class of $f$-divergences. Applications that are studied in this paper refer to lossless compression, the method of types and large deviations, strong data-processing inequalities, bounds on contraction coefficients and maximal correlation, and the convergence rate to stationarity of a type of discrete-time Markov chains.
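As an illustration of the kind of integral relation studied (a representative identity stated in nats, not necessarily in the exact form or generality used in the paper): with $D(P\|Q) = \sum_x P(x) \ln \frac{P(x)}{Q(x)}$ and $\chi^2(P\|Q) = \sum_x \frac{(P(x)-Q(x))^2}{Q(x)}$, one has
$$ D(P\|Q) = \int_0^1 \frac{1}{s} \, \chi^2\bigl(P \,\big\|\, (1-s)P + sQ\bigr) \, \mathrm{d}s, $$
which follows from applying the elementary representation $\ln x = \int_0^1 \frac{x-1}{1+t(x-1)} \, \mathrm{d}t$ termwise to $D(P\|Q)$ and identifying the inner sum as a chi-squared divergence.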


Read also

81 - Igal Sason, Sergio Verdu, 2015
A new upper bound on the relative entropy is derived as a function of the total variation distance for probability measures defined on a common finite alphabet. The bound improves a previously reported bound by Csiszar and Talata. It is further extended to an upper bound on the Renyi divergence of an arbitrary non-negative order (including $\infty$) as a function of the total variation distance.
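A minimal numerical sketch of the two quantities this bound relates, assuming probability vectors on a finite alphabet; the inequality checked below is the classical Pinsker lower bound (in nats), not the upper bound derived in the paper:

import numpy as np

def relative_entropy(p, q):
    # D(P||Q) in nats over a finite alphabet; terms with P(x) = 0 contribute 0.
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def total_variation(p, q):
    # Total variation distance: (1/2) * sum_x |P(x) - Q(x)|.
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return 0.5 * float(np.sum(np.abs(p - q)))

p, q = [0.5, 0.3, 0.2], [0.4, 0.4, 0.2]
print(relative_entropy(p, q) >= 2 * total_variation(p, q) ** 2)  # Pinsker's inequality: True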
96 - Igal Sason 2015
This paper starts by considering the minimization of the Renyi divergence subject to a constraint on the total variation distance. Based on the solution of this optimization problem, the exact locus of the points $\bigl( D(Q\|P_1), D(Q\|P_2) \bigr)$ is determined when $P_1, P_2, Q$ are arbitrary probability measures which are mutually absolutely continuous, and the total variation distance between $P_1$ and $P_2$ is not below a given value. It is further shown that all the points of this convex region are attained by probability measures which are defined on a binary alphabet. This characterization yields a geometric interpretation of the minimal Chernoff information subject to a constraint on the total variation distance. This paper also derives an exponential upper bound on the performance of binary linear block codes (or code ensembles) under maximum-likelihood decoding. Its derivation relies on the Gallager bounding technique, and it reproduces the Shulman-Feder bound as a special case. The bound is expressed in terms of the Renyi divergence from the normalized distance spectrum of the code (or the average distance spectrum of the ensemble) to the binomially distributed distance spectrum of the capacity-achieving ensemble of random block codes. This exponential bound provides a quantitative measure of the degradation in performance of binary linear block codes (or code ensembles) as a function of the deviation of their distance spectra from the binomial distribution. An efficient use of this bound is considered.
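For reference, a short sketch of the Renyi divergence of order $\alpha$ appearing in the two abstracts above, $D_{\alpha}(P\|Q) = \frac{1}{\alpha-1} \ln \sum_x P(x)^{\alpha} Q(x)^{1-\alpha}$ (in nats, assuming mutually absolutely continuous distributions on a finite alphabet):

import numpy as np

def renyi_divergence(p, q, alpha):
    # D_alpha(P||Q) in nats; alpha = 1 is the relative-entropy limit,
    # alpha = inf is the log of the maximal likelihood ratio.
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    if alpha == 1.0:
        mask = p > 0
        return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))
    if np.isinf(alpha):
        return float(np.log(np.max(p / q)))
    return float(np.log(np.sum(p ** alpha * q ** (1.0 - alpha))) / (alpha - 1.0))

print(renyi_divergence([0.5, 0.5], [0.9, 0.1], 2.0))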
In part I of this two-part work, certain minimization problems based on a parametric family of relative entropies (denoted $\mathscr{I}_{\alpha}$) were studied. Such minimizers were called forward $\mathscr{I}_{\alpha}$-projections. Here, a complementary class of minimization problems leading to the so-called reverse $\mathscr{I}_{\alpha}$-projections are studied. Reverse $\mathscr{I}_{\alpha}$-projections, particularly on log-convex or power-law families, are of interest in robust estimation problems ($\alpha > 1$) and in constrained compression settings ($\alpha < 1$). Orthogonality of the power-law family with an associated linear family is first established and is then exploited to turn a reverse $\mathscr{I}_{\alpha}$-projection into a forward $\mathscr{I}_{\alpha}$-projection. The transformed problem is a simpler quasiconvex minimization subject to linear constraints.
63 - Igal Sason 2018
This paper provides tight bounds on the Renyi entropy of a function of a discrete random variable with a finite number of possible values, where the considered function is not one-to-one. To that end, a tight lower bound on the Renyi entropy of a discrete random variable with a finite support is derived as a function of the size of the support, and the ratio of the maximal to minimal probability masses. This work was inspired by the recently published paper by Cicalese et al., which is focused on the Shannon entropy, and it strengthens and generalizes the results of that paper to Renyi entropies of arbitrary positive orders. In view of these generalized bounds and the works by Arikan and Campbell, non-asymptotic bounds are derived for guessing moments and lossless data compression of discrete memoryless sources.
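A minimal sketch of the quantities involved, using hypothetical pmfs: the Renyi entropy of order $\alpha$ is $H_{\alpha}(X) = \frac{1}{1-\alpha} \ln \sum_x P_X(x)^{\alpha}$, and applying a non-one-to-one function merges probability masses, which can only reduce it:

import numpy as np

def renyi_entropy(p, alpha):
    # H_alpha(X) in nats for alpha > 0, alpha != 1; zero masses are ignored.
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(np.log(np.sum(p ** alpha)) / (1.0 - alpha))

p_x  = [0.4, 0.3, 0.2, 0.1]        # hypothetical pmf of X
p_fx = [0.4 + 0.3, 0.2 + 0.1]      # pmf of f(X) for a non-injective f merging {1,2} and {3,4}
for a in (0.5, 2.0, 4.0):
    print(a, renyi_entropy(p_x, a), renyi_entropy(p_fx, a))  # H_a(f(X)) <= H_a(X)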
83 - Lei Yu 2021
In this paper, we prove that for the doubly symmetric binary distribution, the lower increasing envelope and the upper envelope of the minimum-relative-entropy region are respectively convex and concave. We also prove that another function induced by the minimum-relative-entropy region is concave. These two envelopes and this function were previously used to characterize the optimal exponents in strong small-set expansion problems and strong Brascamp--Lieb inequalities. The results in this paper, combined with the strong small-set expansion theorem derived by Yu, Anantharam, and Chen (2021), and the strong Brascamp--Lieb inequality derived by Yu (2021), positively confirm the Ordentlich--Polyanskiy--Shayevitz conjecture on the strong small-set expansion (2019) and Polyanskiy's conjecture on the strong Brascamp--Lieb inequality (2016). The proofs in this paper are based on the equivalence between the convexity of a function and the convexity of the set of minimizers of its Lagrangian dual.