In this paper, we derive a useful lower bound for the Kullback-Leibler divergence (KL-divergence) based on the Hammersley-Chapman-Robbins bound (HCRB). The HCRB states that the variance of an estimator is bounded from below by the Chi-square divergence and the expectation value of the estimator. By using the relation between the KL-divergence and the Chi-square divergence, we derive a lower bound for the KL-divergence that depends only on the expectation value and the variance of a function we choose. This lower bound can also be derived from an information-geometric approach. Furthermore, we show that equality holds for Bernoulli distributions and that the inequality converges to the Cramér-Rao bound when the two distributions are very close. We also describe application examples and numerical calculations.
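The HCRB ingredient described above can be checked numerically. The sketch below, assuming the standard form of the HCRB, verifies that the Chi-square divergence dominates the squared mean gap over the variance for a chosen test function; for two Bernoulli distributions with f(x) = x the bound is attained, consistent with the equality case mentioned in the abstract. Function names here are illustrative, not the paper's code.

```python
# Numerical check of the HCRB:
#   chi^2(P||Q) >= (E_P[f] - E_Q[f])^2 / Var_Q[f]  for any test function f.
# For Bernoulli(p) vs Bernoulli(q) with f(x) = x, both sides coincide.
import numpy as np

def chi_square(p, q):
    """Chi-square divergence between Bernoulli(p) and Bernoulli(q)."""
    P = np.array([1 - p, p])
    Q = np.array([1 - q, q])
    return np.sum((P - Q) ** 2 / Q)

p, q = 0.3, 0.6
mean_gap = p - q            # E_P[f] - E_Q[f] with f(x) = x
var_q = q * (1 - q)         # Var_Q[f]

lhs = chi_square(p, q)
rhs = mean_gap ** 2 / var_q
print(f"chi^2 = {lhs:.6f}, HCRB lower bound = {rhs:.6f}")
# Both print 0.375: the bound is tight for Bernoulli with f(x) = x.
```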
Bayesian nonparametric statistics is an area of considerable research interest. While there has recently been an extensive focus on developing Bayesian nonparametric procedures for model checking, the use of the Dirichlet process, in its simp
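The abstract is truncated, but it centers on the Dirichlet process (DP). For readers unfamiliar with it, here is a minimal stick-breaking sketch of a draw G ~ DP(alpha, G0) with a standard-normal base measure; the truncation level K is an assumption for illustration, not anything from the paper.

```python
# Truncated stick-breaking construction of a Dirichlet process draw:
#   beta_k ~ Beta(1, alpha),  pi_k = beta_k * prod_{j<k} (1 - beta_j),
#   atoms theta_k ~ G0 (here N(0, 1)).
import numpy as np

def stick_breaking_dp(alpha, K=100, rng=None):
    """Truncated draw of weights and atoms from DP(alpha, N(0, 1))."""
    rng = np.random.default_rng(rng)
    betas = rng.beta(1.0, alpha, size=K)                      # stick proportions
    remaining = np.concatenate([[1.0], np.cumprod(1.0 - betas[:-1])])
    weights = betas * remaining                               # pi_k
    atoms = rng.standard_normal(K)                            # theta_k ~ G0
    return weights, atoms

weights, atoms = stick_breaking_dp(alpha=2.0, K=100, rng=0)
print(f"total mass kept by truncation: {weights.sum():.4f}")
```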
The article presents new sup-sums principles for the integral F-divergence for an arbitrary convex function F and arbitrary (not necessarily positive or absolutely continuous) measures. As applications of these results we derive the corresponding sup-sums
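A hedged illustration of the sup-sums idea, in the much simpler setting of two probability densities rather than the article's general measures: the F-divergence equals the supremum over finite partitions {A_i} of the sums sum_i nu(A_i) F(mu(A_i)/nu(A_i)), and coarsening can only decrease the sum, so finer partitions climb toward the integral value. The choice F(t) = t log t (the KL case) and the Gaussian pair are assumptions for the demo.

```python
# Partition sums sum_i nu(A_i) * F(mu(A_i)/nu(A_i)) with F(t) = t*log(t)
# increase under refinement toward KL(N(0,1) || N(1,1)) = 0.5.
import numpy as np
from scipy.stats import norm

F = lambda t: t * np.log(t)

def partition_sum(bins):
    """Sup-sums term on a uniform interval partition of [-10, 10]."""
    edges = np.linspace(-10, 10, bins + 1)
    mu = np.diff(norm.cdf(edges, loc=0.0, scale=1.0))   # P = N(0, 1)
    nu = np.diff(norm.cdf(edges, loc=1.0, scale=1.0))   # Q = N(1, 1)
    mask = (mu > 0) & (nu > 0)   # F(0) = 0 by continuity, so skip empty cells
    return np.sum(nu[mask] * F(mu[mask] / nu[mask]))

for bins in (2, 8, 64, 1024):
    print(bins, partition_sum(bins))
# The sums are monotone in refinement and approach 0.5.
```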
Kullback-Leibler (KL) divergence is one of the most important divergence measures between probability distributions. In this paper, we investigate the properties of the KL divergence between Gaussians. Firstly, for any two $n$-dimensional Gaussians
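For reference alongside this abstract, the well-known closed form for the KL divergence between two n-dimensional Gaussians N(mu0, S0) and N(mu1, S1) is implemented below; this is the standard formula, not code from the paper.

```python
# KL(N(mu0,S0) || N(mu1,S1))
#   = 0.5 * [ tr(S1^{-1} S0) + (mu1-mu0)^T S1^{-1} (mu1-mu0)
#             - n + ln(det S1 / det S0) ]
import numpy as np

def kl_gaussians(mu0, S0, mu1, S1):
    """KL divergence between n-dimensional Gaussians."""
    n = mu0.shape[0]
    S1_inv = np.linalg.inv(S1)
    diff = mu1 - mu0
    return 0.5 * (
        np.trace(S1_inv @ S0)
        + diff @ S1_inv @ diff
        - n
        + np.log(np.linalg.det(S1) / np.linalg.det(S0))
    )

mu0, S0 = np.zeros(2), np.eye(2)
mu1, S1 = np.ones(2), 2.0 * np.eye(2)
print(kl_gaussians(mu0, S0, mu1, S1))  # 0.5*(1 + 1 - 2 + 2*ln 2) = ln 2
```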
Rényi divergence is related to Rényi entropy much like Kullback-Leibler divergence is related to Shannon's entropy, and comes up in many settings. It was introduced by Rényi as a measure of information that satisfies almost the same axioms as Kullback
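The standard definition for discrete distributions is D_alpha(P||Q) = (1/(alpha-1)) log sum_i p_i^alpha q_i^(1-alpha), which recovers the KL divergence in the limit alpha -> 1. A minimal sketch (example values are arbitrary):

```python
# Renyi divergence of order alpha for discrete distributions, and its
# convergence to KL divergence as alpha -> 1.
import numpy as np

def renyi_divergence(P, Q, alpha):
    """Renyi divergence D_alpha(P||Q), alpha > 0 and alpha != 1."""
    return np.log(np.sum(P ** alpha * Q ** (1.0 - alpha))) / (alpha - 1.0)

P = np.array([0.2, 0.5, 0.3])
Q = np.array([0.4, 0.4, 0.2])
for alpha in (0.5, 0.9, 0.999, 2.0):
    print(alpha, renyi_divergence(P, Q, alpha))
print("KL:", np.sum(P * np.log(P / Q)))  # D_alpha -> KL as alpha -> 1
```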
We propose a method to fuse posterior distributions learned from heterogeneous datasets. Our algorithm relies on a mean field assumption for both the fused model and the individual dataset posteriors and proceeds using a simple assign-and-average app
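The abstract is cut off, but one plausible reading of an assign-and-average step is sketched below: match each dataset's mean-field component means to the current fused means with the Hungarian algorithm, then average the matched components. The cost choice (squared Euclidean distance), the running-average update, and all names are illustrative assumptions, not the paper's algorithm.

```python
# Assign-and-average fusion sketch: optimally match components across
# dataset posteriors, then average matched means.
import numpy as np
from scipy.optimize import linear_sum_assignment

def fuse_posteriors(local_means):
    """local_means: list of (K, d) arrays of mean-field component means."""
    fused = local_means[0].copy()
    for t, means in enumerate(local_means[1:], start=1):
        cost = ((fused[:, None, :] - means[None, :, :]) ** 2).sum(-1)
        rows, cols = linear_sum_assignment(cost)  # optimal 1-to-1 assignment
        # Running average of the matched components over datasets seen so far.
        fused[rows] = (t * fused[rows] + means[cols]) / (t + 1)
    return fused

rng = np.random.default_rng(0)
posteriors = [rng.normal(size=(4, 3)) + shift for shift in (0.0, 0.1, -0.1)]
print(fuse_posteriors(posteriors))
```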