Upper Bounds on the Relative Entropy and Rényi Divergence as a Function of Total Variation Distance for Finite Alphabets


Abstract

A new upper bound on the relative entropy is derived as a function of the total variation distance for probability measures defined on a common finite alphabet. The bound improves a previously reported bound by Csiszár and Talata. It is further extended to an upper bound on the Rényi divergence of an arbitrary non-negative order (including $\infty$) as a function of the total variation distance.
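For concreteness, the three quantities related by the abstract can be computed directly on a finite alphabet. The sketch below uses the standard definitions (relative entropy in nats, total variation as half the $\ell_1$ distance, and the Rényi divergence $D_\alpha(P\|Q) = \frac{1}{\alpha-1}\log\sum_i p_i^\alpha q_i^{1-\alpha}$, with the limits $\alpha \to 1$ and $\alpha \to \infty$ handled separately); it illustrates the quantities involved, not the bound derived in the paper.

```python
import math

def total_variation(p, q):
    """Total variation distance: (1/2) * sum_i |p_i - q_i|."""
    return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

def relative_entropy(p, q):
    """Relative entropy D(P||Q) in nats; terms with p_i = 0 contribute 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def renyi_divergence(p, q, alpha):
    """Rényi divergence D_alpha(P||Q) of non-negative order alpha, in nats.

    alpha = 1 is the relative entropy; alpha = inf is log max_i (p_i / q_i).
    """
    if alpha == 1:
        return relative_entropy(p, q)
    if math.isinf(alpha):
        return math.log(max(pi / qi for pi, qi in zip(p, q) if pi > 0))
    s = sum(pi**alpha * qi**(1 - alpha) for pi, qi in zip(p, q) if pi > 0)
    return math.log(s) / (alpha - 1)

# Example on a 3-letter alphabet (illustrative distributions).
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(total_variation(p, q))            # 0.1
print(relative_entropy(p, q))
print(renyi_divergence(p, q, 2))
print(renyi_divergence(p, q, math.inf)) # log(0.5 / 0.4)
```

A useful sanity check is that $D_\alpha(P\|Q)$ is non-decreasing in $\alpha$, so the order-$\infty$ value dominates both the order-2 value and the relative entropy.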