
Reconsidering unique information: Towards a multivariate information decomposition

Posted by Johannes Rauh
Publication date: 2014
Research field: Informatics engineering
Paper language: English





The information that two random variables $Y$, $Z$ contain about a third random variable $X$ can have aspects of shared information (contained in both $Y$ and $Z$), of complementary information (only available from $(Y,Z)$ together) and of unique information (contained exclusively in either $Y$ or $Z$). Here, we study the measures $\widetilde{SI}$ of shared, $\widetilde{UI}$ of unique and $\widetilde{CI}$ of complementary information introduced by Bertschinger et al., which are motivated from a decision-theoretic perspective. We find that in most cases the intuitive rule that more variables contain more information applies, with the exception that $\widetilde{SI}$ and $\widetilde{CI}$ are not monotone in the target variable $X$. Additionally, we show that it is not possible to extend the bivariate information decomposition into $\widetilde{SI}$, $\widetilde{UI}$ and $\widetilde{CI}$ to a non-negative decomposition on the partial information lattice of Williams and Beer. Nevertheless, the quantities $\widetilde{UI}$, $\widetilde{SI}$ and $\widetilde{CI}$ have a well-defined interpretation, even in the multivariate setting.
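For orientation, the bivariate decomposition of Bertschinger et al. satisfies the standard consistency relations of partial information decompositions (a textbook restatement in our notation, not a quotation from the paper):

$$
I(X ; (Y,Z)) = \widetilde{SI}(X ; Y, Z) + \widetilde{UI}(X ; Y \setminus Z) + \widetilde{UI}(X ; Z \setminus Y) + \widetilde{CI}(X ; Y, Z),
$$
$$
I(X ; Y) = \widetilde{SI}(X ; Y, Z) + \widetilde{UI}(X ; Y \setminus Z),
\qquad
I(X ; Z) = \widetilde{SI}(X ; Y, Z) + \widetilde{UI}(X ; Z \setminus Y).
$$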




Read also

We study the measure of unique information $UI(T:X\setminus Y)$ defined by Bertschinger et al. (2014) within the framework of information decompositions. We examine uniqueness and support of the solutions to the optimization problem underlying the definition of $UI$. We identify sufficient conditions for non-uniqueness of solutions with full support in terms of conditional independence constraints and in terms of the cardinalities of $T$, $X$ and $Y$. Our results are based on a reformulation of the first-order conditions on the objective function as rank constraints on a matrix of conditional probabilities. These results help to speed up the computation of $UI(T:X\setminus Y)$, most notably when $T$ is binary. In the case that all variables are binary, we obtain a complete picture of where the optimizing probability distributions lie.
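For reference, the optimization problem in question can be stated as follows (a standard formulation of the Bertschinger et al. (2014) definition, written in our notation):

$$
UI(T : X \setminus Y) \;=\; \min_{Q \in \Delta_P} I_Q(T ; X \mid Y),
\qquad
\Delta_P \;=\; \{\, Q :\; Q(T, X) = P(T, X),\; Q(T, Y) = P(T, Y) \,\},
$$

where $I_Q$ denotes conditional mutual information evaluated under the joint distribution $Q$.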
Given a pair of predictor variables and a response variable, how much information do the predictors have about the response, and how is this information distributed between unique, redundant, and synergistic components? Recent work has proposed to quantify the unique component of the decomposition as the minimum value of the conditional mutual information over a constrained set of information channels. We present an efficient iterative divergence minimization algorithm to solve this optimization problem with convergence guarantees and evaluate its performance against other techniques.
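The objective minimized in that constrained problem is a conditional mutual information. The sketch below (an illustrative Python snippet assuming finite alphabets, not the divergence-minimization algorithm described in the abstract) shows how the objective is evaluated for one candidate joint distribution q[t, x, y]:

    import numpy as np

    def conditional_mutual_information(q):
        """Return I_Q(T; X | Y) in bits for a joint distribution q[t, x, y]."""
        q = np.asarray(q, dtype=float)
        q = q / q.sum()                 # normalize to a probability distribution
        q_y = q.sum(axis=(0, 1))        # Q(y)
        q_ty = q.sum(axis=1)            # Q(t, y)
        q_xy = q.sum(axis=0)            # Q(x, y)
        cmi = 0.0
        for t in range(q.shape[0]):
            for x in range(q.shape[1]):
                for y in range(q.shape[2]):
                    p = q[t, x, y]
                    if p > 0:
                        cmi += p * np.log2(p * q_y[y] / (q_ty[t, y] * q_xy[x, y]))
        return cmi

    # Example: T is the XOR of two independent fair bits X and Y; I(T; X | Y) = 1 bit
    q = np.zeros((2, 2, 2))
    for x in (0, 1):
        for y in (0, 1):
            q[x ^ y, x, y] = 0.25
    print(conditional_mutual_information(q))   # prints 1.0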
The entropy of a pair of random variables is commonly depicted using a Venn diagram. This representation is potentially misleading, however, since the multivariate mutual information can be negative. This paper presents new measures of multivariate information content that can be accurately depicted using Venn diagrams for any number of random variables. These measures complement the existing measures of multivariate mutual information and are constructed by considering the algebraic structure of information sharing. It is shown that the distinct ways in which a set of marginal observers can share their information with a non-observing third party correspond to the elements of a free distributive lattice. The redundancy lattice from partial information decomposition is then derived independently by combining the algebraic structures of joint and shared information content.
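As a concrete aside, the redundancy lattice from partial information decomposition mentioned above has one node per antichain of nonempty subsets of the sources. The short enumeration below (an illustrative Python sketch, not code from the paper) reproduces the familiar node counts: 4 nodes for two sources and 18 for three.

    from itertools import combinations

    def redundancy_lattice_nodes(n):
        """Antichains of nonempty subsets of n sources (Williams-Beer lattice nodes)."""
        sources = range(n)
        subsets = [frozenset(c) for r in range(1, n + 1)
                   for c in combinations(sources, r)]
        nodes = []
        for r in range(1, len(subsets) + 1):
            for collection in combinations(subsets, r):
                # keep only antichains: no subset in the collection contains another
                if all(not (a < b or b < a) for a, b in combinations(collection, 2)):
                    nodes.append(collection)
        return nodes

    print(len(redundancy_lattice_nodes(2)))  # 4
    print(len(redundancy_lattice_nodes(3)))  # 18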
The authors have recently defined the Rényi information dimension rate $d(\{X_t\})$ of a stationary stochastic process $\{X_t,\, t \in \mathbb{Z}\}$ as the entropy rate of the uniformly-quantized process divided by minus the logarithm of the quantizer step size $1/m$ in the limit as $m \to \infty$ (B. Geiger and T. Koch, On the information dimension rate of stochastic processes, in Proc. IEEE Int. Symp. Inf. Theory (ISIT), Aachen, Germany, June 2017). For Gaussian processes with a given spectral distribution function $F_X$, they showed that the information dimension rate equals the Lebesgue measure of the set of harmonics where the derivative of $F_X$ is positive. This paper extends this result to multivariate Gaussian processes with a given matrix-valued spectral distribution function $F_{\mathbf{X}}$. It is demonstrated that the information dimension rate equals the average rank of the derivative of $F_{\mathbf{X}}$. As a side result, it is shown that the scale and translation invariance of information dimension carries over from random variables to stochastic processes.
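Written out (in our notation; the quantizer $\lfloor m X_t \rfloor / m$ and the normalized frequency interval $(-1/2, 1/2]$ are conventions we assume here, not taken from the paper), the definition and the multivariate result read

$$
d(\{X_t\}) \;=\; \lim_{m \to \infty} \frac{H'\!\left(\{\lfloor m X_t \rfloor / m\}\right)}{\log m},
\qquad
d(\{\mathbf{X}_t\}) \;=\; \int_{-1/2}^{1/2} \operatorname{rank}\!\big(F_{\mathbf{X}}'(\theta)\big)\, d\theta,
$$

where $H'(\cdot)$ denotes the entropy rate of the quantized process.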
We offer a new approach to the information decomposition problem in information theory: given a target random variable co-distributed with multiple source variables, how can we decompose the mutual information into a sum of non-negative terms that quantify the contributions of each random variable, not only individually but also in combination? We derive our decomposition from cooperative game theory. It can be seen as assigning a fair share of the mutual information to each combination of the source variables. Our decomposition is based on a different lattice from the usual partial information decomposition (PID) approach, and as a consequence it has a smaller number of terms: it has analogs of the synergy and unique information terms, but lacks terms corresponding to redundancy. Because of this, it is able to obey equivalents of the axioms known as local positivity and identity, which cannot be simultaneously satisfied by a PID measure.
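For background on the game-theoretic notion of a fair share invoked here: the canonical construction in cooperative game theory is the Shapley value, and one natural game assigns to each coalition $S$ of sources the value $v(S) = I(X ; (Y_i)_{i \in S})$. The formula below is the standard Shapley value of player $i$ in a game $v$ on the player set $N$ (shown only as general background; the paper's own lattice-based construction may differ):

$$
\phi_i(v) \;=\; \sum_{S \subseteq N \setminus \{i\}} \frac{|S|!\,(|N| - |S| - 1)!}{|N|!}\, \big( v(S \cup \{i\}) - v(S) \big).
$$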