Information theory provides a fundamental framework for quantifying information flows through channels, formally, Markov kernels. However, quantities such as mutual information and conditional mutual information do not necessarily reflect the causal nature of such flows. We argue that this is often the result of conditioning on sigma-algebras that are not associated with the given channels. We propose a version of the (conditional) mutual information based on families of sigma-algebras that are coupled with the underlying channel. This leads to filtrations that allow us to prove a corresponding causal chain rule as a basic requirement within the presented approach.
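For context, the classical chain rule that any causal variant would be expected to mirror is the standard identity (with $X$, $Y$, $Z$ generic random variables, not notation taken from the abstract)

\[
I(X; Y, Z) = I(X; Z) + I(X; Y \mid Z),
\]

where the conditioning in $I(X; Y \mid Z)$ is tied to the sigma-algebra generated by $Z$, which, as the abstract argues, need not be a sigma-algebra associated with the channel along which the information actually flows.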
A new method to measure nonlinear dependence between two variables is described, using mutual information to analyze the separate linear and nonlinear components of dependence. This technique gives an exact value for the proportion of linear dependence.
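The abstract is cut off before the method's details; the sketch below illustrates one common way such a split can be computed, assuming the linear component is scored by the Gaussian mutual information $-\frac{1}{2}\log(1-\rho^2)$ implied by the Pearson correlation $\rho$, with total mutual information estimated from a two-dimensional histogram. The function names and binning choice are illustrative assumptions, not the paper's method.

import numpy as np

def histogram_mutual_information(x, y, bins=32):
    # Plug-in estimate of I(X;Y) in nats from a joint histogram.
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of X (column vector)
    py = pxy.sum(axis=0, keepdims=True)   # marginal of Y (row vector)
    nz = pxy > 0                          # skip zero cells to avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

def linear_nonlinear_split(x, y):
    # Linear share: MI of the bivariate Gaussian with the same correlation.
    rho = np.corrcoef(x, y)[0, 1]
    mi_linear = -0.5 * np.log(1.0 - rho**2)
    # Nonlinear share: whatever the total estimate exceeds the linear share.
    mi_total = histogram_mutual_information(x, y)
    return mi_linear, max(mi_total - mi_linear, 0.0)

# Purely nonlinear example: y depends on x only through x**2, so the
# correlation (and hence the linear share) is near zero.
rng = np.random.default_rng(0)
x = rng.standard_normal(10_000)
y = x**2 + 0.1 * rng.standard_normal(10_000)
print(linear_nonlinear_split(x, y))

For $y$ driven by $x^2$ with symmetric $x$, essentially all of the estimated dependence is attributed to the nonlinear component.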
We consider the problem of communication over a channel with a causal jamming adversary subject to quadratic constraints. A sender Alice wishes to communicate a message to a receiver Bob by transmitting a real-valued length-$n$ codeword $\mathbf{x} = (x_1, \ldots, x_n)$.
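The abstract breaks off before the constraints are stated; in the usual formulation of quadratically constrained communication against a jammer (the power levels $P$ and $N$ and the jamming sequence $\mathbf{s}$ below are assumed notation, not quoted), the constraints take the form

\[
\|\mathbf{x}\|_2^2 = \sum_{i=1}^{n} x_i^2 \le nP, \qquad \|\mathbf{s}\|_2^2 = \sum_{i=1}^{n} s_i^2 \le nN,
\]

so that both the codeword and the adversary's interference are power-limited.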
Given two random variables $X$ and $Y$, an operational approach is undertaken to quantify the ``leakage of information'' from $X$ to $Y$. The resulting measure $\mathcal{L}(X \!\!\to\!\! Y)$ is called \emph{maximal leakage}, and is defined as the multiplicative increase, upon observing $Y$, of the probability of correctly guessing a randomized function of $X$.
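In the finite-alphabet case, maximal leakage is known to admit the simple closed form $\mathcal{L}(X \!\!\to\!\! Y) = \log \sum_{y} \max_{x : P_X(x) > 0} P_{Y|X}(y|x)$. A minimal sketch of that computation follows; the channel matrix is an illustrative example, not one from the paper, and $X$ is assumed to have full support.

import numpy as np

def maximal_leakage(channel):
    # channel[i, j] = P(Y = j | X = i); each row is a conditional pmf.
    # Closed form (in nats): log of the sum over y of max_x P(y | x).
    return float(np.log(np.max(channel, axis=0).sum()))

# Binary symmetric channel with crossover probability 0.1.
bsc = np.array([[0.9, 0.1],
                [0.1, 0.9]])
print(maximal_leakage(bsc))  # log(1.8), roughly 0.588 nats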
A communication setup is considered in which a transmitter wishes to convey a message to a receiver and simultaneously estimate the state of that receiver through a common waveform. The state is estimated at the transmitter by means of generalized feedback.
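One generic way such a joint communication-and-sensing problem is formalized (an assumed template, not quoted from the abstract; $d$, $D$, and $\hat{S}$ are placeholder notation) is a capacity-distortion tradeoff of the form

\[
C(D) = \max_{P_X \,:\; \mathbb{E}\left[ d(S, \hat{S}) \right] \le D} I(X; Y),
\]

where $\hat{S}$ denotes the transmitter's estimate of the state $S$, formed from the known transmitted waveform and the feedback.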
In the recent paper [1] it is shown, via an application example, that the Cover and Pombra [2] characterization of the $n$-block or transmission feedback capacity formula of additive Gaussian noise (AGN) channels is the subject of much confusion in the literature.
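For reference, the Cover and Pombra characterization is usually stated as follows (standard notation, not quoted from [1] or [2]): over the AGN channel $\mathbf{Y} = \mathbf{X} + \mathbf{Z}$, the feedback input is written as $\mathbf{X} = B\mathbf{Z} + \mathbf{V}$, with $B$ strictly lower triangular and $\mathbf{V}$ Gaussian and independent of the noise $\mathbf{Z}$, giving the $n$-block formula

\[
C_n^{\mathrm{FB}}(P) = \max_{(B,\, K_{\mathbf{V}})} \frac{1}{2n} \log \frac{\det\big( (B + I) K_{\mathbf{Z}} (B + I)^{\top} + K_{\mathbf{V}} \big)}{\det K_{\mathbf{Z}}}, \qquad \operatorname{tr}\big( B K_{\mathbf{Z}} B^{\top} + K_{\mathbf{V}} \big) \le nP.
\]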