
Minimization Problems Based on Relative $\alpha$-Entropy I: Forward Projection

Submitted by M. Ashok Kumar
Publication date: 2014
Research field: Information engineering
Paper language: English





Minimization problems with respect to a one-parameter family of generalized relative entropies are studied. These relative entropies, which we term relative $\alpha$-entropies (denoted $\mathscr{I}_{\alpha}$), arise as redundancies under mismatched compression when cumulants of compressed lengths are considered instead of expected compressed lengths. These parametric relative entropies are a generalization of the usual relative entropy (Kullback-Leibler divergence). Just like relative entropy, these relative $\alpha$-entropies behave like squared Euclidean distance and satisfy the Pythagorean property. Minimizers of these relative $\alpha$-entropies on closed and convex sets are shown to exist. Such minimizations generalize the maximum Rényi or Tsallis entropy principle. The minimizing probability distribution (termed forward $\mathscr{I}_{\alpha}$-projection) for a linear family is shown to obey a power-law. Other results in connection with statistical inference, namely subspace transitivity and iterated projections, are also established. In a companion paper, a related minimization problem of interest in robust statistics that leads to a reverse $\mathscr{I}_{\alpha}$-projection is studied.
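For orientation, the quantity being minimized has a closed form on a finite alphabet. The following is the standard expression for the relative $\alpha$-entropy from the literature, quoted here as background rather than taken from the abstract itself:

\[
\mathscr{I}_{\alpha}(P,Q) \;=\; \frac{\alpha}{1-\alpha}\log\sum_{x}P(x)\,Q(x)^{\alpha-1} \;-\; \frac{1}{1-\alpha}\log\sum_{x}P(x)^{\alpha} \;+\; \log\sum_{x}Q(x)^{\alpha}.
\]

As $\alpha \to 1$ this recovers the Kullback-Leibler divergence $D(P\|Q)$, and $\mathscr{I}_{\alpha}(P,Q)$ can be rewritten as a Rényi divergence of order $1/\alpha$ between the escort distributions $P^{(\alpha)} \propto P^{\alpha}$ and $Q^{(\alpha)} \propto Q^{\alpha}$.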




Read also

In part I of this two-part work, certain minimization problems based on a parametric family of relative entropies (denoted $\mathscr{I}_{\alpha}$) were studied. Such minimizers were called forward $\mathscr{I}_{\alpha}$-projections. Here, a complementary class of minimization problems leading to the so-called reverse $\mathscr{I}_{\alpha}$-projections is studied. Reverse $\mathscr{I}_{\alpha}$-projections, particularly on log-convex or power-law families, are of interest in robust estimation problems ($\alpha > 1$) and in constrained compression settings ($\alpha < 1$). Orthogonality of the power-law family with an associated linear family is first established and is then exploited to turn a reverse $\mathscr{I}_{\alpha}$-projection into a forward $\mathscr{I}_{\alpha}$-projection. The transformed problem is a simpler quasiconvex minimization subject to linear constraints.
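A reverse $\mathscr{I}_{\alpha}$-projection is a minimization over the second argument. The sketch below is a minimal numerical illustration of that idea, not the authors' method: it evaluates the closed-form expression quoted earlier and minimizes it over a one-parameter power-law family; the alphabet, the family Q_theta, and the starting point are all illustrative choices.

    import numpy as np
    from scipy.optimize import minimize

    def I_alpha(P, Q, alpha):
        # Relative alpha-entropy on a finite alphabet (standard closed form
        # from the literature; see the display equation quoted above).
        return (alpha / (1.0 - alpha) * np.log(np.sum(P * Q ** (alpha - 1.0)))
                - 1.0 / (1.0 - alpha) * np.log(np.sum(P ** alpha))
                + np.log(np.sum(Q ** alpha)))

    # Reverse-projection sketch: minimize I_alpha(P, Q_theta) over theta for a
    # hypothetical power-law family Q_theta(x) proportional to x**(-theta).
    x = np.arange(1, 11)
    P = np.ones(10) / 10.0                    # fixed first argument

    def Q_theta(theta):
        w = x ** (-theta)
        return w / w.sum()

    alpha = 1.5                               # robust-estimation regime (alpha > 1)
    res = minimize(lambda t: I_alpha(P, Q_theta(t[0]), alpha), x0=[1.0])
    print("optimal theta:", res.x[0])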
We study minimization of a parametric family of relative entropies, termed relative $\alpha$-entropies (denoted $\mathscr{I}_{\alpha}(P,Q)$). These arise as redundancies under mismatched compression when cumulants of compressed lengths are considered instead of expected compressed lengths. These parametric relative entropies are a generalization of the usual relative entropy (Kullback-Leibler divergence). Just like relative entropy, these relative $\alpha$-entropies behave like squared Euclidean distance and satisfy the Pythagorean property. Minimization of $\mathscr{I}_{\alpha}(P,Q)$ over the first argument on a set of probability distributions that constitutes a linear family is studied. Such a minimization generalizes the maximum Rényi or Tsallis entropy principle. The minimizing probability distribution (termed the $\mathscr{I}_{\alpha}$-projection) for a linear family is shown to obey a power-law.
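The power-law shape of such a projection can be previewed through the familiar Tsallis maximum-entropy solution. Under a single linear (mean) constraint $\sum_x P(x) f(x) = t$, the Tsallis-entropy maximizer has the $q$-exponential form below; this is the standard textbook expression, quoted only as an illustration, and the exact parameterization and normalization used in the paper may differ:

\[
P^{*}(x) \;\propto\; \bigl[\,1 - (1-\alpha)\,\beta\, f(x)\,\bigr]_{+}^{\frac{1}{1-\alpha}},
\]

where $\beta$ is a Lagrange multiplier fixed by the constraint and $[\,u\,]_{+} = \max\{u, 0\}$. As $\alpha \to 1$ this degenerates to the exponential (Gibbs) family of the classical maximum-entropy principle.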
Tomohiro Nishiyama, 2020
The divergence minimization problem plays an important role in various fields. In this note, we focus on differentiable and strictly convex divergences. For some minimization problems, we show the minimizer conditions and the uniqueness of the minimizer without assuming a specific form of the divergence. Furthermore, we show geometric properties related to the minimization problems.
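As background for such minimizer conditions: when $D(\cdot\,, Q)$ is differentiable and strictly convex and the feasible set $C$ is convex, the standard first-order characterization (a general convex-analysis fact, not specific to this note) reads

\[
P^{*} \in \arg\min_{P \in C} D(P, Q)
\quad\Longleftrightarrow\quad
\bigl\langle \nabla_{P} D(P^{*}, Q),\; P - P^{*} \bigr\rangle \;\ge\; 0 \;\; \text{for all } P \in C,
\]

and strict convexity then makes the minimizer unique whenever it exists.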
Xin Yuan, 2015
We consider the total variation (TV) minimization problem used for compressive sensing and solve it using the generalized alternating projection (GAP) algorithm. Extensive results demonstrate the high performance of the proposed algorithm on compressive sensing, including two-dimensional images, hyperspectral images, and videos. We further derive the Alternating Direction Method of Multipliers (ADMM) framework with TV minimization for video and hyperspectral image compressive sensing under the CACTI and CASSI frameworks, respectively. Connections between GAP and ADMM are also provided.
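To make the TV-plus-ADMM ingredient concrete, here is a minimal 1-D sketch, not the paper's GAP/CACTI/CASSI pipeline: it solves min_x 0.5*||y - Ax||^2 + lam*||Dx||_1 with D the first-difference operator, using the scaled-ADMM splitting z = Dx. The function name admm_tv, the problem sizes, and the parameters lam and rho are illustrative choices.

    import numpy as np

    def admm_tv(y, A, lam=0.1, rho=1.0, iters=200):
        # Scaled ADMM for 0.5*||y - A x||^2 + lam*||D x||_1 (1-D total variation).
        n = A.shape[1]
        D = np.diff(np.eye(n), axis=0)        # (n-1) x n first-difference operator
        x = np.zeros(n)
        z = np.zeros(n - 1)
        u = np.zeros(n - 1)                   # scaled dual variable
        M = A.T @ A + rho * D.T @ D           # normal-equations matrix for x-update
        for _ in range(iters):
            x = np.linalg.solve(M, A.T @ y + rho * D.T @ (z - u))
            Dx = D @ x
            v = Dx + u
            z = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)  # soft threshold
            u = u + Dx - z                    # dual ascent step
        return x

    # Usage: recover a piecewise-constant signal from random projections.
    rng = np.random.default_rng(0)
    n, m = 100, 40
    x_true = np.concatenate([np.zeros(50), np.ones(50)])
    A = rng.standard_normal((m, n)) / np.sqrt(m)
    y = A @ x_true + 0.01 * rng.standard_normal(m)
    x_hat = admm_tv(y, A, lam=0.05)
    print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))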
We introduce an axiomatic approach to entropies and relative entropies that relies only on minimal information-theoretic axioms, namely monotonicity under mixing and data-processing as well as additivity for product distributions. We find that these axioms induce sufficient structure to establish continuity in the interior of the probability simplex and meaningful upper and lower bounds; e.g., we find that every relative entropy must lie between the Rényi divergences of order $0$ and $\infty$. We further show simple conditions for positive definiteness of such relative entropies and a characterisation in terms of a variant of relative trumping. Our main result is a one-to-one correspondence between entropies and relative entropies.
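The two bounding orders mentioned are the standard limiting Rényi divergences:

\[
D_{0}(P\|Q) \;=\; -\log Q\bigl(\{x : P(x) > 0\}\bigr),
\qquad
D_{\infty}(P\|Q) \;=\; \log \max_{x:\,P(x)>0} \frac{P(x)}{Q(x)},
\]

so the result says that every relative entropy satisfying the axioms is sandwiched between these two extremes.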