In 2018, Renes [IEEE Trans. Inf. Theory, vol. 64, no. 1, pp. 577-592 (2018)] (arXiv:1701.05583) developed a general theory of channel duality for classical-input quantum-output (CQ) channels. That result showed that a number of well-known duality results for linear codes on the binary erasure channel could be extended to general classical channels at the expense of using dual problems which are intrinsically quantum mechanical. One special case of this duality is a connection between coding for error correction (resp. wire-tap secrecy) on the quantum pure-state channel (PSC) and coding for wire-tap secrecy (resp. error correction) on the classical binary symmetric channel (BSC). While this result has important implications for classical coding, the machinery behind the general duality result is rather challenging for researchers without a strong background in quantum information theory. In this work, we leverage prior results for linear codes on PSCs to give an alternate derivation of the aforementioned special case by computing closed-form expressions for the performance metrics. The noted prior results include optimality of the square-root measurement (SRM) for linear codes on the PSC and the Fourier duality of linear codes. We also show that the SRM forms a suboptimal measurement for channel coding on the BSC (when interpreted as a CQ problem) and secret communications on the PSC. Our proofs only require linear algebra and basic group theory, though we use the quantum Dirac notation for convenience.
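The square-root measurement mentioned above can be illustrated numerically. The sketch below (our own construction, not from the paper: it assumes a standard parameterization of the PSC signal states with pairwise overlap $\cos\theta$, and uses a toy repetition code) builds the SRM for a linear code on the pure-state channel and computes the per-codeword success probability; for two equiprobable pure states the SRM is known to achieve the Helstrom bound, which gives a sanity check.

```python
import numpy as np

def psc_states(theta):
    # Pure-state channel: bit x -> |theta_x>, with <theta_0|theta_1> = cos(theta).
    s0 = np.array([np.cos(theta / 2),  np.sin(theta / 2)])
    s1 = np.array([np.cos(theta / 2), -np.sin(theta / 2)])
    return s0, s1

def codeword_state(c, s0, s1):
    # Tensor-product state for a codeword c (tuple of bits).
    v = np.array([1.0])
    for bit in c:
        v = np.kron(v, s1 if bit else s0)
    return v

def srm_success_probs(code, theta):
    # Square-root measurement: |mu_c> = S^{-1/2} |psi_c|, S = sum_c |psi_c><psi_c|.
    s0, s1 = psc_states(theta)
    psis = [codeword_state(c, s0, s1) for c in code]
    S = sum(np.outer(p, p) for p in psis)
    w, V = np.linalg.eigh(S)
    # Pseudo-inverse square root (S is rank-deficient on the ambient space).
    inv_sqrt = np.array([1 / np.sqrt(x) if x > 1e-12 else 0.0 for x in w])
    S_inv_half = V @ np.diag(inv_sqrt) @ V.T
    # P(correct | c) = (<psi_c| S^{-1/2} |psi_c>)^2 for equal priors.
    return [float((p @ S_inv_half @ p) ** 2) for p in psis]

code = [(0, 0, 0), (1, 1, 1)]  # [3,1] repetition code, for illustration
probs = srm_success_probs(code, np.pi / 6)
```

Because the code is linear, the signal constellation is geometrically uniform and every codeword has the same SRM success probability; here it coincides with the Helstrom-optimal value $(1+\sqrt{1-\gamma^2})/2$ for overlap $\gamma=\cos^3\theta$.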
In this article, we describe various aspects of categorification of the structures appearing in information theory. These aspects include probabilistic models both of classical and quantum physics, emergence of F-manifolds, and motivic enrichments.
We provide a practical implementation of the rubber method of Ahlswede et al. for binary alphabets. The idea is to create the skeleton sequence therein via an arithmetic decoder designed for a particular $k$-th order Markov chain. For the stochastic
A renowned information-theoretic formula by Shannon expresses the mutual information rate of a white Gaussian channel with a stationary Gaussian input as an integral of a simple function of the power spectral density of the channel input. We give in
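The formula alluded to can be stated as follows (our reconstruction under common conventions, using two-sided power spectral densities; the paper's normalization may differ):

```latex
I(X;Y) \;=\; \frac{1}{2}\int_{-\infty}^{\infty}
  \log\!\left(1 + \frac{S_X(f)}{N_0/2}\right) df ,
```

where $S_X(f)$ is the power spectral density of the stationary Gaussian input and $N_0/2$ is the two-sided PSD of the white Gaussian noise. For a flat input spectrum $S_X(f) = P/(2W)$ on $|f| \le W$, this reduces to the familiar $W \log\!\left(1 + \tfrac{P}{N_0 W}\right)$.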
Building on the work of Horstein, Shayevitz and Feder, and Naghshvar \emph{et al.}, this paper presents algorithms for low-complexity sequential transmission of a $k$-bit message over the binary symmetric channel (BSC) with full, noiseless feedback. T
Consider a binary linear code of length $N$, minimum distance $d_{\text{min}}$, transmission over the binary erasure channel with parameter $0 < \epsilon < 1$ or the binary symmetric channel with parameter $0 < \epsilon < \frac{1}{2}$, and block-MAP decoding
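For a BSC with crossover probability $\epsilon < \frac{1}{2}$ and equiprobable codewords, block-MAP decoding reduces to minimum Hamming-distance decoding. A minimal exhaustive-search sketch (the $[5,2]$ generator matrix below is our own toy example, not taken from the paper):

```python
import numpy as np
from itertools import product

# Toy [5,2] binary linear code with d_min = 3 (illustrative choice).
G = np.array([[1, 0, 1, 1, 1],
              [0, 1, 1, 1, 0]])
codebook = np.array([(np.array(m) @ G) % 2 for m in product([0, 1], repeat=2)])

def map_decode_bsc(y):
    # For eps < 1/2, block-MAP = nearest codeword in Hamming distance.
    dists = (codebook != y).sum(axis=1)
    return codebook[np.argmin(dists)]
```

Since $d_{\text{min}} = 3$ here, any single bit flip leaves the transmitted codeword as the unique nearest one, so the decoder corrects it.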