Synergy, Redundancy and Common Information


Abstract in English

We consider the problem of decomposing the total mutual information conveyed by a pair of predictor random variables about a target random variable into redundant, unique and synergistic contributions. We focus on the relationship between redundant information and the more familiar information-theoretic notions of common information. Our main contribution is an impossibility result. We show that for independent predictor random variables, no common-information-based measure of redundancy can induce a nonnegative decomposition of the total mutual information. Interestingly, this entails that no reasonable measure of redundant information can be derived by optimization over a single random variable.
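For concreteness, the decomposition referred to above is usually written via the standard partial-information-decomposition consistency equations (a sketch in conventional notation; the symbols R, U_1, U_2 and S for the redundant, unique and synergistic parts are labels chosen here, not taken from the abstract):

\begin{align}
I(Y; X_1, X_2) &= R + U_1 + U_2 + S, \\
I(Y; X_1)      &= R + U_1, \\
I(Y; X_2)      &= R + U_2,
\end{align}

where X_1 and X_2 are the predictors and Y is the target. The impossibility result says that when X_1 and X_2 are independent, no redundancy measure R obtained from a common-information optimization can keep all four parts nonnegative at once.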
