
The Convexity and Concavity of Envelopes of the Minimum-Relative-Entropy Region for the DSBS

Published by: Lei Yu
Publication date: 2021
Research field: Information engineering
Paper language: English
Author: Lei Yu





In this paper, we prove that for the doubly symmetric binary distribution, the lower increasing envelope and the upper envelope of the minimum-relative-entropy region are respectively convex and concave. We also prove that another function induced by the minimum-relative-entropy region is concave. These two envelopes and this function were previously used to characterize the optimal exponents in strong small-set expansion problems and strong Brascamp--Lieb inequalities. The results in this paper, combined with the strong small-set expansion theorem derived by Yu, Anantharam, and Chen (2021) and the strong Brascamp--Lieb inequality derived by Yu (2021), positively confirm Ordentlich--Polyanskiy--Shayevitz's conjecture on the strong small-set expansion (2019) and Polyanskiy's conjecture on the strong Brascamp--Lieb inequality (2016). The proofs in this paper are based on the equivalence between the convexity of a function and the convexity of the set of minimizers of its Lagrangian dual.
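For orientation, the basic objects can be written out as follows. The doubly symmetric binary distribution with crossover probability $\epsilon$ and the relative entropy are standard; the function $\Gamma(a,b)$ below is only a sketch of the kind of constrained minimum the abstract refers to (the shorthand and the exact constraints are ours, not necessarily the paper's):

\[
P_{XY}(x,y) = \frac{1-\epsilon}{2}\,\mathbb{1}\{x=y\} + \frac{\epsilon}{2}\,\mathbb{1}\{x\neq y\}, \qquad (x,y)\in\{0,1\}^2, \quad \epsilon\in(0,1/2),
\]
\[
D(Q_{XY}\|P_{XY}) = \sum_{x,y} Q_{XY}(x,y)\,\log\frac{Q_{XY}(x,y)}{P_{XY}(x,y)}, \qquad
\Gamma(a,b) := \min\bigl\{\, D(Q_{XY}\|P_{XY}) : Q_X(1)=a,\ Q_Y(1)=b \,\bigr\}.
\]

The lower increasing envelope and the upper envelope of such a function (or of the induced region) are then the objects shown to be convex and concave, respectively.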




Read also

The relative entropy and chi-squared divergence are fundamental divergence measures in information theory and statistics. This paper is focused on a study of integral relations between the two divergences, the implications of these relations, their information-theoretic applications, and some generalizations pertaining to the rich class of $f$-divergences. Applications that are studied in this paper refer to lossless compression, the method of types and large deviations, strong data-processing inequalities, bounds on contraction coefficients and maximal correlation, and the convergence rate to stationarity of a type of discrete-time Markov chains.
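One representative integral relation of this kind, for probability measures $P, Q$ on a common finite alphabet (stated here only for illustration; the relations in the paper are more general), is

\[
D(P\|Q) = \int_0^1 \frac{\chi^2\bigl(P \,\big\|\, (1-\lambda)P + \lambda Q\bigr)}{\lambda}\, d\lambda,
\]

where $D(P\|Q) = \sum_x P(x)\log\frac{P(x)}{Q(x)}$ and $\chi^2(P\|Q) = \sum_x \frac{(P(x)-Q(x))^2}{Q(x)}$. The integrand equals $\lambda \sum_x \frac{(P(x)-Q(x))^2}{(1-\lambda)P(x)+\lambda Q(x)}$, and integrating letter by letter over $\lambda\in(0,1)$ recovers $D(P\|Q)$.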
Igal Sason, Sergio Verdú (2015)
A new upper bound on the relative entropy is derived as a function of the total variation distance for probability measures defined on a common finite alphabet. The bound improves a previously reported bound by Csiszár and Talata. It is further extended to an upper bound on the Rényi divergence of an arbitrary non-negative order (including $\infty$) as a function of the total variation distance.
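For reference, under the usual conventions (assumed here; the paper's normalization of total variation may differ by a factor of 2), the quantities involved are

\[
|P-Q| = \sum_x |P(x)-Q(x)|, \qquad
D_\alpha(P\|Q) = \frac{1}{\alpha-1}\log \sum_x P(x)^\alpha Q(x)^{1-\alpha},
\]

with $D_\alpha \to D(P\|Q)$ as $\alpha \to 1$ and $D_\infty(P\|Q) = \log\max_x \frac{P(x)}{Q(x)}$.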
Using a sharp version of the reverse Young inequality, and a Rényi entropy comparison result due to Fradelizi, Madiman, and Wang, the authors are able to derive Rényi entropy power inequalities for log-concave random vectors when Rényi parameters belong to $(0,1)$. Furthermore, the estimates are shown to be sharp up to absolute constants.
An extension of the entropy power inequality to the form $N_r^\alpha(X+Y) \geq N_r^\alpha(X) + N_r^\alpha(Y)$ with arbitrary independent summands $X$ and $Y$ in $\mathbb{R}^n$ is obtained for the Rényi entropy and powers $\alpha \geq (r+1)/2$.
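Here $N_r$ denotes the Rényi entropy power; under the standard convention (an assumption about this listing's notation), for a random vector $X$ in $\mathbb{R}^n$ with density $f_X$,

\[
h_r(X) = \frac{1}{1-r}\log \int_{\mathbb{R}^n} f_X(x)^r\, dx, \qquad N_r(X) = e^{2 h_r(X)/n},
\]

so the case $r = 1$, $\alpha = 1$ recovers the classical entropy power inequality $N(X+Y) \geq N(X) + N(Y)$.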
In part I of this two-part work, certain minimization problems based on a parametric family of relative entropies (denoted $\mathscr{I}_{\alpha}$) were studied. Such minimizers were called forward $\mathscr{I}_{\alpha}$-projections. Here, a complementary class of minimization problems leading to the so-called reverse $\mathscr{I}_{\alpha}$-projections are studied. Reverse $\mathscr{I}_{\alpha}$-projections, particularly on log-convex or power-law families, are of interest in robust estimation problems ($\alpha > 1$) and in constrained compression settings ($\alpha < 1$). Orthogonality of the power-law family with an associated linear family is first established and is then exploited to turn a reverse $\mathscr{I}_{\alpha}$-projection into a forward $\mathscr{I}_{\alpha}$-projection. The transformed problem is a simpler quasiconvex minimization subject to linear constraints.
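In this terminology, writing $\mathscr{I}_{\alpha}(P,Q)$ for the relative $\alpha$-entropy (which recovers the ordinary relative entropy $D(P\|Q)$ as $\alpha \to 1$), a forward projection minimizes over the first argument and a reverse projection over the second; schematically (the constraint families $\mathbb{E}$ and $\Pi$ are generic placeholders, not the paper's notation),

\[
\text{forward: } \min_{P\in\mathbb{E}} \mathscr{I}_{\alpha}(P,Q), \qquad
\text{reverse: } \min_{Q\in\Pi} \mathscr{I}_{\alpha}(P,Q).
\]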