
Confounding Ghost Channels and Causality: A New Approach to Causal Information Flows

Added by: Nihat Ay
Publication date: 2020
Language: English
Authors: Nihat Ay





Information theory provides a fundamental framework for the quantification of information flows through channels, formally Markov kernels. However, quantities such as mutual information and conditional mutual information do not necessarily reflect the causal nature of such flows. We argue that this is often the result of conditioning based on sigma algebras that are not associated with the given channels. We propose a version of the (conditional) mutual information based on families of sigma algebras that are coupled with the underlying channel. This leads to filtrations that allow us to prove a corresponding causal chain rule, a basic requirement within the presented approach.
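For orientation, the classical identity that the proposed causal chain rule is meant to parallel is the standard chain rule for mutual information (a textbook fact, not a statement from the paper):

$$
I(X; Y, Z) = I(X; Z) + I(X; Y \mid Z),
$$

where $I(\cdot\,;\cdot)$ denotes mutual information and $I(\cdot\,;\cdot \mid \cdot)$ conditional mutual information. The paper's contribution is a variant of these quantities in which the conditioning sigma algebras are tied to the channel itself, so that an analogous decomposition still holds.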



Related research

Reginald D. Smith (2015)
A new method to measure nonlinear dependence between two variables is described using mutual information to analyze the separate linear and nonlinear components of dependence. This technique, which gives an exact value for the proportion of linear dependence, is then compared with another common test for linearity, the Brock, Dechert and Scheinkman (BDS) test.
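A minimal sketch of the general idea, not the estimator used in the paper: for data whose linear part is well described by a bivariate Gaussian, the mutual information attributable to the Pearson correlation alone is $-\tfrac{1}{2}\ln(1-\rho^2)$ nats, and comparing it with a nonparametric estimate of the total mutual information gives a rough linear/nonlinear split. The function names and the histogram estimator below are illustrative choices.

import numpy as np

def gaussian_mi(x, y):
    # MI (nats) implied by the Pearson correlation alone, i.e. the MI of a
    # bivariate Gaussian with the same correlation coefficient.
    rho = np.corrcoef(x, y)[0, 1]
    return -0.5 * np.log(1.0 - rho**2)

def hist_mi(x, y, bins=16):
    # Crude plug-in estimate of the total MI (nats) from a 2D histogram;
    # finer estimators exist, this only keeps the sketch short.
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(0)
x = rng.normal(size=5000)
y = 0.6 * x + 0.4 * x**2 + 0.5 * rng.normal(size=5000)  # linear + nonlinear dependence

mi_total = hist_mi(x, y)
mi_linear = gaussian_mi(x, y)
print(f"total MI ~ {mi_total:.3f} nats, linear part ~ {mi_linear:.3f} nats, "
      f"linear proportion ~ {mi_linear / mi_total:.2f}")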
We consider the problem of communication over a channel with a causal jamming adversary subject to quadratic constraints. A sender Alice wishes to communicate a message to a receiver Bob by transmitting a real-valued length-$n$ codeword $\mathbf{x}=x_1,\ldots,x_n$ through a communication channel. Alice and Bob do not share common randomness. Knowing Alice's encoding strategy, an adversarial jammer James chooses a real-valued length-$n$ noise sequence $\mathbf{s}=s_1,\ldots,s_n$ in a causal manner, i.e., each $s_t$ ($1 \le t \le n$) can only depend on $x_1,\ldots,x_t$. Bob receives $\mathbf{y}$, the sum of Alice's transmission $\mathbf{x}$ and James's jamming vector $\mathbf{s}$, and is required to reliably estimate Alice's message from this sum. In addition, Alice's and James's transmission powers are restricted by quadratic constraints $P>0$ and $N>0$. In this work, we characterize the channel capacity for such a channel as the limit superior of the optimal values of a series of optimizations. Upper and lower bounds on the optimal values are provided both analytically and numerically. Interestingly, unlike many communication problems, in this causal setting Alice's optimal codebook may not have a uniform power allocation: for certain SNRs, a codebook with a two-level uniform power allocation results in a strictly higher rate than a codebook with a uniform power allocation would.
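To make the channel model concrete, here is a small simulation sketch under the stated constraints; the jamming strategy used is an arbitrary placeholder, not the adversary's optimal strategy from the paper.

import numpy as np

rng = np.random.default_rng(1)
n, P, N = 1000, 1.0, 0.5              # blocklength and quadratic (power) constraints

# Alice's codeword, scaled to satisfy the power constraint ||x||^2 <= n*P.
x = rng.normal(size=n)
x *= np.sqrt(n * P) / np.linalg.norm(x)

# A causal jammer: s_t may depend only on x_1, ..., x_t.
# Here James simply echoes a scaled version of the current symbol
# (an illustrative strategy only).
s = np.zeros(n)
for t in range(n):
    s[t] = -0.5 * x[t]                # uses only x_1..x_t (in fact only x_t)

# Enforce James's power constraint ||s||^2 <= n*N.
if np.linalg.norm(s) > np.sqrt(n * N):
    s *= np.sqrt(n * N) / np.linalg.norm(s)

y = x + s                             # what Bob receives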
Ibrahim Issa, Aaron B. Wagner (2018)
Given two random variables $X$ and $Y$, an operational approach is undertaken to quantify the "leakage" of information from $X$ to $Y$. The resulting measure $\mathcal{L}(X \to Y)$ is called maximal leakage, and is defined as the multiplicative increase, upon observing $Y$, of the probability of correctly guessing a randomized function of $X$, maximized over all such randomized functions. A closed-form expression for $\mathcal{L}(X \to Y)$ is given for discrete $X$ and $Y$, and it is subsequently generalized to handle a large class of random variables. The resulting properties are shown to be consistent with an axiomatic view of a leakage measure, and the definition is shown to be robust to variations in the setup. Moreover, a variant of the Shannon cipher system is studied, in which the performance of an encryption scheme is measured using maximal leakage. A single-letter characterization of the optimal limit of (normalized) maximal leakage is derived, and asymptotically optimal encryption schemes are demonstrated. Furthermore, the sample complexity of estimating maximal leakage from data is characterized up to subpolynomial factors. Finally, the guessing framework used to define maximal leakage is used to give operational interpretations of commonly used leakage measures, such as Shannon capacity, maximal correlation, and local differential privacy.
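For discrete alphabets, the closed-form expression is, as reported in the literature on this measure, the logarithm of the sum over outputs of the column-wise maxima of the channel matrix (over inputs with positive probability). A minimal sketch, with an illustrative function name:

import numpy as np

def maximal_leakage(p_y_given_x):
    # Maximal leakage L(X -> Y) in bits for a discrete channel given as a
    # |X| x |Y| row-stochastic matrix P(y|x); assumes every input symbol
    # has positive probability under P_X.
    p = np.asarray(p_y_given_x, dtype=float)
    return float(np.log2(np.max(p, axis=0).sum()))

# Example: a binary symmetric channel with crossover probability 0.1.
bsc = np.array([[0.9, 0.1],
                [0.1, 0.9]])
print(maximal_leakage(bsc))   # ~0.848 bits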
A communication setup is considered where a transmitter wishes to convey a message to a receiver and simultaneously estimate the state of that receiver through a common waveform. The state is estimated at the transmitter by means of generalized feedback, i.e., a strictly causal channel output, and the known waveform. The scenario at hand is motivated by joint radar and communication, which aims to co-design radar sensing and communication over shared spectrum and hardware. For the case of memoryless single receiver channels with i.i.d. time-varying state sequences, we fully characterize the capacity-distortion tradeoff, defined as the largest achievable rate below which a message can be conveyed reliably while satisfying some distortion constraints on state sensing. We propose a numerical method to compute the optimal input that achieves the capacity-distortion tradeoff. Then, we address memoryless state-dependent broadcast channels (BCs). For physically degraded BCs with i.i.d. time-varying state sequences, we characterize the capacity-distortion tradeoff region as a rather straightforward extension of single receiver channels. For general BCs, we provide inner and outer bounds on the capacity-distortion region, as well as a sufficient condition under which this capacity-distortion region is equal to the product of the capacity region and the set of achievable distortions. A number of illustrative examples demonstrate that the optimal co-design schemes outperform conventional schemes that split the resources between sensing and communication.
In the recent paper [1] it is shown, via an application example, that the Cover and Pombra [2] characterization of the $n$-block or transmission feedback capacity formula of additive Gaussian noise (AGN) channels is the subject of much confusion in the literature, with redundant incorrect results. The main objective of this paper is to clarify the main points of confusion and remove any further ambiguity. The first part of the paper applies time-domain methods to derive, for the first time, equivalent sequential characterizations of the Cover and Pombra characterization of the feedback capacity of AGN channels driven by nonstationary and nonergodic Gaussian noise. The optimal channel input processes of the new equivalent sequential characterizations are expressed as functionals of a sufficient statistic and a Gaussian orthogonal innovations process. From the new representations it follows that the Cover and Pombra $n$-block capacity formula is expressed as a functional of two generalized matrix difference Riccati equations (DREs) of the filtering theory of Gaussian systems, contrary to results that have appeared in the literature. In the second part of the paper, the existence of the asymptotic limit of the $n$-block feedback capacity formula is shown to be equivalent to the convergence properties of the solutions of the two generalized DREs. Further, necessary and/or sufficient conditions are identified for the existence of the asymptotic limits, for stable and unstable Gaussian noise, when the optimal input distributions are time-invariant but not necessarily stationary. The paper contains an in-depth analysis, with examples, of the specific technical issues that are overlooked in the past literature [3-7] that studied the AGN channel of [2] for stationary noises.
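For orientation, the Cover and Pombra $n$-block characterization referred to above is usually written (up to notation, and as we understand it) as

$$
C_{n,\mathrm{FB}} = \max_{B,\,K_V} \; \frac{1}{2n} \log \frac{\det\!\big((B+I)\,K_{Z^n}\,(B+I)^{\top} + K_V\big)}{\det K_{Z^n}},
$$

where the maximum is over strictly lower-triangular matrices $B$ and covariance matrices $K_V \succeq 0$ satisfying the power constraint $\mathrm{tr}\big(B K_{Z^n} B^{\top} + K_V\big) \le nP$, corresponding to channel inputs of the form $X^n = B Z^n + V^n$ with $V^n$ Gaussian and independent of the noise $Z^n$. The sequential characterizations and Riccati-equation representations discussed in the abstract are reformulations of this optimization.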