For gambling on horses, a one-parameter family of utility functions is proposed, which contains Kelly's logarithmic criterion and the expected-return criterion as special cases. The strategies that maximize the utility function are derived, and the connection to the Rényi divergence is shown. Optimal strategies are also derived when the gambler has some side information; this setting leads to a novel conditional Rényi divergence.
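To make the two endpoint criteria of this family concrete, here is a small sketch with made-up numbers (the probabilities, odds, and function names are illustrative, not from the paper): Kelly's log-optimal strategy for a horse race bets fractions proportional to the win probabilities, independent of the odds, while the expected-return criterion stakes the whole bankroll on the single horse with the largest expected payoff.

```python
import math

# Hypothetical three-horse race (numbers assumed for illustration):
# p[i] = win probability of horse i, o[i] = odds (o[i]-for-1 payout).
p = [0.5, 0.3, 0.2]
o = [2.0, 4.0, 8.0]

def growth_rate(b, p, o):
    """Expected log2 wealth growth (doubling rate) of betting fractions b,
    assuming the entire bankroll is wagered across the horses."""
    return sum(pi * math.log2(bi * oi) for pi, bi, oi in zip(p, b, o))

# Kelly (log-utility) optimum: proportional betting, b = p.
b_kelly = p

# Expected-return optimum: everything on the horse maximizing p[i] * o[i].
i_star = max(range(len(p)), key=lambda i: p[i] * o[i])
b_expret = [1.0 if i == i_star else 0.0 for i in range(len(p))]
```

Under these assumed numbers, proportional betting maximizes the doubling rate, whereas the expected-return bettor risks ruin whenever the favored horse loses, which is the tension the proposed utility family interpolates between.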
Rényi divergence is related to Rényi entropy much like Kullback-Leibler divergence is related to Shannon's entropy, and comes up in many settings. It was introduced by Rényi as a measure of information that satisfies almost the same axioms as Kullback-Leibler divergence.
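For reference, the standard definition of the order-$\alpha$ Rényi divergence between distributions $P$ and $Q$ on a finite alphabet (a textbook formula, not quoted from the abstract above) is

```latex
D_\alpha(P \| Q) \;=\; \frac{1}{\alpha - 1}
\log \sum_{x} P(x)^{\alpha}\, Q(x)^{1-\alpha},
\qquad \alpha \in (0, 1) \cup (1, \infty),
```

with the Kullback-Leibler divergence $D(P \| Q)$ recovered as the limit $\alpha \to 1$.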
This paper studies forward and reverse projections for the Rényi divergence of order $\alpha \in (0, \infty)$ on $\alpha$-convex sets. The forward projection on such a set is motivated by some works of Tsallis et al. in statistical physics, and the …
This paper starts by considering the minimization of the Rényi divergence subject to a constraint on the total variation distance. Based on the solution of this optimization problem, the exact locus of the points $\bigl( D(Q \| P_1), D(Q \| P_2) \bigr)$ is …
A new upper bound on the relative entropy is derived as a function of the total variation distance for probability measures defined on a common finite alphabet. The bound improves a previously reported bound by Csiszár and Talata. It is further extended …
Two maximization problems of the Rényi entropy rate are investigated: the maximization over all stochastic processes whose marginals satisfy a linear constraint, and the Burg-like maximization over all stochastic processes whose autocovariance function …