Unique Informations and Deficiencies


Abstract

Given two channels that convey information about the same random variable, we introduce two measures of the unique information of one channel with respect to the other. The two quantities are based on the notion of generalized weighted Le Cam deficiencies and differ in whether one channel can approximate the other by a randomization at its input or at its output. We relate the proposed quantities to an existing measure of unique information, which we call the minimum-synergy unique information. We give an operational interpretation of the latter in terms of an upper bound on the one-way secret key rate, and we discuss the role of the unique informations in the context of nonnegative decompositions of the mutual information into unique, redundant, and synergistic components.
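To make the deficiency notion concrete, the following is a minimal illustrative sketch, not the paper's actual construction: it computes a weighted output deficiency of one finite channel with respect to another, i.e. how well a post-processing (randomization at the output) of the first channel can approximate the second, measured by a weighted average total-variation distance over inputs. The function name, the choice of total variation, and the fixed input weighting are all assumptions made for illustration; the problem is solved as a linear program.

```python
import numpy as np
from scipy.optimize import linprog


def weighted_output_deficiency(K, M, w):
    """Illustrative weighted output deficiency of channel K w.r.t. channel M.

    Minimizes, over row-stochastic post-processings L, the w-weighted average
    total-variation distance between the composed channel K@L and M.
    K: (n_x, n_y) row-stochastic, M: (n_x, n_z) row-stochastic,
    w: (n_x,) nonnegative input weights summing to 1.
    """
    n_x, n_y = K.shape
    _, n_z = M.shape
    nL, nT = n_y * n_z, n_x * n_z  # variables: L flattened, then slacks t
    # objective: (1/2) * sum_x w[x] * sum_z t[x, z]  (t bounds |K@L - M|)
    c = np.concatenate([np.zeros(nL), 0.5 * np.repeat(w, n_z)])
    # inequality constraints encoding +-(K @ L - M)[x, z] <= t[x, z]
    A_ub, b_ub = [], []
    for x in range(n_x):
        for z in range(n_z):
            row = np.zeros(nL + nT)
            for y in range(n_y):
                row[y * n_z + z] = K[x, y]
            row[nL + x * n_z + z] = -1.0
            A_ub.append(row)
            b_ub.append(M[x, z])
            A_ub.append(np.concatenate([-row[:nL], row[nL:]]))
            b_ub.append(-M[x, z])
    # equality constraints: each row of L sums to 1 (L is a channel)
    A_eq = np.zeros((n_y, nL + nT))
    for y in range(n_y):
        A_eq[y, y * n_z:(y + 1) * n_z] = 1.0
    b_eq = np.ones(n_y)
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * (nL + nT))
    return res.fun
```

For instance, a channel can always simulate itself (deficiency 0), while a constant, uninformative binary channel cannot simulate the identity channel:

```python
w = np.array([0.5, 0.5])
weighted_output_deficiency(np.eye(2), np.eye(2), w)          # ~ 0.0
weighted_output_deficiency(np.full((2, 2), 0.5), np.eye(2), w)  # ~ 0.5
```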
