
A Technique for Deriving One-Shot Achievability Results in Network Information Theory

Posted by: Mohammad Hossein Yassaee
Publication date: 2013
Research field: Information Engineering
Paper language: English



This paper proposes a novel technique for proving one-shot versions of achievability results in network information theory. The technique is not based on covering and packing lemmas. Instead, we use a stochastic encoder and decoder with a particular structure that resembles both the maximum-likelihood (ML) and the joint-typicality coders. Although stochastic encoders and decoders do not usually enlarge the capacity region, their use simplifies the analysis. Jensen's inequality lies at the heart of the error analysis: it enables us to handle, at once, the expectation of the many terms arising from the stochastic encoders and decoders. The technique is illustrated via several examples: point-to-point channel coding, Gelfand-Pinsker coding, the broadcast channel (Marton), Berger-Tung, Heegard-Berger/Kaspi, multiple description coding, and joint source-channel coding over a MAC. Most of our one-shot results are new. The asymptotic forms of these expressions coincide with the classical results. Our one-shot bounds, in conjunction with the multi-dimensional Berry-Esseen CLT, imply new results in the finite-blocklength regime. In particular, applying the one-shot result for the memoryless broadcast channel in the asymptotic case yields the entire Marton inner bound without any need for time-sharing.
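To convey the flavor of the technique, the following is a minimal sketch of its point-to-point instance, assuming M equiprobable messages, codewords drawn i.i.d. from an input distribution P_X, and the stochastic likelihood decoder described above; the information-density notation i(x;y) is an assumption of this sketch rather than a quotation from the paper.

% Sketch (assumptions: M messages, i.i.d. random codebook from P_X, and a
% stochastic likelihood decoder that outputs \hat{m} with probability
% proportional to p(y \mid x(\hat{m}))). Averaging over the codebook and
% applying Jensen's inequality to the expectation over the M-1 competing
% codewords gives a one-shot bound of the form
\[
  P_e \;\le\; \mathbb{E}_{XY}\!\left[ \frac{(M-1)\,p(Y)}{p(Y \mid X) + (M-1)\,p(Y)} \right]
      \;\le\; \Pr\!\big[\, i(X;Y) \le \log M + \gamma \,\big] \;+\; 2^{-\gamma},
\]
% where i(x;y) = \log \frac{p(y \mid x)}{p(y)} is the information density,
% \gamma > 0 is arbitrary, and the second step uses
% \frac{a}{a+b} \le \min\{1,\, a/b\}. Bounds of this shape feed directly
% into the Berry-Esseen finite-blocklength analysis mentioned above.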




Read also

Elad Domanovitz, Uri Erez (2017)
Integer-forcing source coding has been proposed as a low-complexity method for compression of distributed correlated Gaussian sources. In this scheme, each encoder quantizes its observation using the same fine lattice and reduces the result modulo a coarse lattice. Rather than directly recovering the individual quantized signals, the decoder first recovers a full-rank set of judiciously chosen integer linear combinations of the quantized signals, and then inverts it. It has been observed that the method works very well for most, but not all, source covariance matrices. The present work quantifies the measure of bad covariance matrices by studying the probability that integer-forcing source coding fails as a function of the allocated rate, where the probability is with respect to a random orthonormal transformation that is applied to the sources prior to quantization. For the important case where the signals to be compressed correspond to the antenna inputs of relays in an i.i.d. Rayleigh fading environment, this orthonormal transformation can be viewed as being performed by nature. Hence, the results provide performance guarantees for distributed source coding via integer forcing in this scenario.
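As a rough sketch of the encoding chain described above (the lattice symbols \Lambda_f, \Lambda_c and the integer matrix A are illustrative assumptions, not the paper's exact notation):

% Integer-forcing source coding, schematically: encoder k quantizes its
% observation x_k to the fine lattice \Lambda_f and reduces the result
% modulo the coarse lattice \Lambda_c,
\[
  u_k \;=\; \big[\, Q_{\Lambda_f}(x_k) \,\big] \bmod \Lambda_c , \qquad k = 1,\dots,K,
\]
% the decoder first recovers a full-rank set of integer linear combinations
% A\,(q_1,\dots,q_K)^{\top}, with A \in \mathbb{Z}^{K \times K}, of the
% quantized signals q_k = Q_{\Lambda_f}(x_k), and then inverts A to obtain
% each q_k individually. The scheme fails when no good full-rank A exists
% for the (rotated) source covariance matrix, which is the event whose
% probability the paper quantifies.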
A new model of multi-party secret key agreement is proposed, in which one terminal, called the communicator, can transmit public messages to the other terminals before all terminals agree on a secret key. A single-letter characterization of the achievable region is derived in the stationary memoryless case. The new model generalizes several other (old and new) models of key agreement. In particular, key generation with an omniscient helper is the special case where the communicator knows all sources, for which we derive a zero-rate one-shot converse for the secret key per bit of communication.
In an effort to develop the foundations for a non-stochastic theory of information, the notion of $\delta$-mutual information between uncertain variables is introduced as a generalization of Nair's non-stochastic information functional. Several properties of this new quantity are illustrated and used to prove a channel coding theorem in a non-stochastic setting. Namely, it is shown that the largest $\delta$-mutual information between received and transmitted codewords over $\epsilon$-noise channels equals the $(\epsilon, \delta)$-capacity. This notion of capacity generalizes the Kolmogorov $\epsilon$-capacity to packing sets of overlap at most $\delta$, and is a variation of a previous definition proposed by one of the authors. Results are then extended to more general noise models, and to non-stochastic, memoryless, stationary channels. Finally, sufficient conditions are established for the factorization of the $\delta$-mutual information and for obtaining a single-letter capacity expression. Compared to previous non-stochastic approaches, the presented theory admits the possibility of decoding errors, as in Shannon's probabilistic setting, while retaining a worst-case, non-stochastic character.
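The main coding theorem above fits in one display; this restatement is hedged, following the abstract's wording rather than the paper's formal definitions:

% Non-stochastic channel coding theorem (sketch): over \epsilon-noise
% channels, the largest \delta-mutual information between transmitted and
% received codewords equals the (\epsilon,\delta)-capacity,
\[
  C_{\epsilon,\delta} \;=\; \sup_{X} \, I_{\delta}(X ; Y),
\]
% where I_{\delta} generalizes Nair's non-stochastic information functional
% and C_{\epsilon,\delta} generalizes the Kolmogorov \epsilon-capacity to
% packing sets whose pairwise overlap is at most \delta.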
This paper introduces a new and ubiquitous framework for establishing achievability results in network information theory (NIT) problems. The framework uses random binning arguments and is based on a duality between channel and source coding problems. Further, the framework uses pmf-approximation arguments instead of counting and typicality. This allows for proving coordination and strong secrecy problems in which certain statistical conditions on the distribution of random variables need to be satisfied. These statistical conditions include independence between messages and the eavesdropper's observations in secrecy problems, and closeness to a certain distribution (usually an i.i.d. distribution) in coordination problems. One important feature of the framework is that it enables one to add an eavesdropper and obtain a result on the secrecy rates for free. We make a case for the generality of the framework by studying examples in a variety of settings, including channel coding, lossy source coding, joint source-channel coding, coordination, strong secrecy, feedback, and relaying. In particular, by investigating the framework for the lossy source coding problem over a broadcast channel, it is shown that the new framework provides a simple alternative to the hybrid coding scheme. Also, new results on the secrecy rate region (under the strong secrecy criterion) of the wiretap broadcast channel and the wiretap relay channel are derived. In a set of accompanying papers, we have shown the usefulness of the framework for establishing achievability results for coordination problems, including interactive channel simulation, coordination via a relay, and channel simulation via another channel.
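The engine behind this framework is a pmf-approximation guarantee for random binning; the following is a minimal sketch of its typical form (the precise statement in the paper may carry additional conditions, e.g., rates of convergence):

% Output statistics of random binning (sketch): bin X^n uniformly at random
% into 2^{nR} bins and let B denote the bin index, with Z^n a correlated
% observation (e.g., an eavesdropper's). If R < H(X|Z), then the bin index
% is nearly uniform and nearly independent of Z^n:
\[
  \big\| \, P_{B, Z^n} \;-\; p^{\mathrm{U}}_{B} \times P_{Z^n} \, \big\|_{\mathrm{TV}}
  \;\longrightarrow\; 0
  \quad \text{as } n \to \infty,
\]
% where p^{U}_B is the uniform pmf on the 2^{nR} bins. Approximations of
% this kind replace counting and typicality, and are what make the strong
% secrecy and coordination claims above possible.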
A basic information-theoretic model for summarization is formulated. Here, summarization is considered as the process of taking a report of $v$ binary objects and producing from it a $j$-element subset that captures most of the important features of the original report, with importance defined via an arbitrary set function endemic to the model. The loss of information is then measured by a weighted average of variational distances, which we term the semantic loss. Our results cover both the case where the probability distribution generating the $v$-length reports is known and the case where it is unknown. When it is known, our results demonstrate how to construct summarizers that minimize the semantic loss. When the probability distribution is unknown, we show how to construct summarizers whose semantic loss, when averaged uniformly over all possible distributions, converges to the minimum.