The divergence minimization problem plays an important role in various fields. In this note, we focus on differentiable and strictly convex divergences. For several minimization problems, we derive conditions characterizing the minimizer and prove its uniqueness, without assuming a specific form of the divergence. Furthermore, we establish geometric properties related to these minimization problems.
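As an illustrative example (not drawn from the note itself), the Kullback-Leibler divergence is a canonical member of the class considered, being differentiable and strictly convex in its first argument:

```latex
% Kullback--Leibler divergence between discrete distributions P and Q:
% differentiable and strictly convex in P, since t \mapsto t \log t is
% strictly convex on (0, \infty).
D_{\mathrm{KL}}(P \,\|\, Q) \;=\; \sum_{x} p(x) \log \frac{p(x)}{q(x)}.
```

Strict convexity is what makes uniqueness results of the kind described above possible: a strictly convex divergence admits at most one minimizer over a convex constraint set.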
Minimization problems with respect to a one-parameter family of generalized relative entropies are studied. These relative entropies, which we term relative $\alpha$-entropies (denoted $\mathscr{I}_{\alpha}$), arise as redundancies under mismatched comp
We consider a sub-class of the $f$-divergences satisfying a stronger convexity property, which we refer to as strongly convex, or $\kappa$-convex divergences. We derive new and old relationships, based on convexity arguments, between popular $f$-divergences.
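For context, an $f$-divergence and the strengthened convexity condition can be sketched as follows (a standard formulation; the paper's precise definition may differ in detail):

```latex
% f-divergence generated by a convex f : (0,\infty) \to \mathbb{R}
% with f(1) = 0:
D_f(P \,\|\, Q) \;=\; \int f\!\left(\frac{dP}{dQ}\right) dQ.

% A common notion of \kappa-convexity of the generator: f is
% \kappa-convex on an interval if t \mapsto f(t) - \tfrac{\kappa}{2}\,t^2
% is convex there (for twice-differentiable f, equivalently
% f''(t) \geq \kappa).
```

Taking $f(t) = t \log t$ recovers the Kullback-Leibler divergence, while $f(t) = \tfrac{1}{2}(t-1)^2$, whose second derivative is constant, is $1$-convex on all of $(0,\infty)$.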
We consider the minimization problem of $\phi$-divergences between a given probability measure $P$ and subsets $\Omega$ of the vector space $\mathcal{M}_{\mathcal{F}}$ of all signed finite measures which integrate a given class $\mathcal{F}$ of bounded or u
In part I of this two-part work, certain minimization problems based on a parametric family of relative entropies (denoted $\mathscr{I}_{\alpha}$) were studied. Such minimizers were called forward $\mathscr{I}_{\alpha}$-projections. Here, a complementary
Nowadays, data compressors are applied to many problems of text analysis, but many such applications are developed outside the framework of mathematical statistics. In this paper we overcome this obstacle and show how several methods of classical m