Let $X^n$ be a uniformly distributed $n$-dimensional binary vector, and let $Y^n$ be the result of passing $X^n$ through a binary symmetric channel (BSC) with crossover probability $\alpha$. A recent conjecture by Courtade and Kumar states that for any Boolean function $f:\{0,1\}^n\to\{0,1\}$, $I(f(X^n);Y^n)\le 1-H(\alpha)$. Although Samorodnitsky has proved the conjecture in the dimension-free high-noise regime, here we present a calculus-based approach that yields a dimension-dependent result by examining the second derivative of $H(\alpha)-H(f(X^n)\mid Y^n)$ at $\alpha=1/2$. Along the way, we show that the dictator function is the most informative function in the high-noise regime.
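As a quick numerical illustration of the role of the dictator function (a sketch added here, not part of the abstract; the function name and parameter values are ours), the snippet below brute-forces $I(f(X^n);Y^n)$ for the dictator $f(x)=x_1$ and confirms that it equals $1-H(\alpha)$, i.e., it meets the conjectured bound with equality:

```python
import itertools
import math

def h2(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def mutual_information_dictator(n=3, alpha=0.1):
    """Brute-force I(f(X^n); Y^n) for the dictator f(x) = x_1,
    with X^n uniform on {0,1}^n and Y^n the output of a BSC(alpha)."""
    joint = {}  # (f(x), y) -> probability
    for x in itertools.product((0, 1), repeat=n):
        px = 2.0 ** (-n)
        for y in itertools.product((0, 1), repeat=n):
            d = sum(xi != yi for xi, yi in zip(x, y))  # Hamming distance
            p = px * (alpha ** d) * ((1 - alpha) ** (n - d))
            key = (x[0], y)
            joint[key] = joint.get(key, 0.0) + p
    pb = {b: sum(p for (bb, _), p in joint.items() if bb == b) for b in (0, 1)}
    py = {}
    for (_, y), p in joint.items():
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (pb[b] * py[y]))
               for (b, y), p in joint.items() if p > 0)

alpha = 0.1
print(mutual_information_dictator(3, alpha))  # ~ 0.531
print(1 - h2(alpha))                          # 1 - H(0.1) ~ 0.531
```

The agreement is expected: $f(X^n)=X_1$ depends on $Y^n$ only through $Y_1$, which is a BSC($\alpha$) observation of $X_1$, so the mutual information is exactly $1-H(\alpha)$.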
Motivated by DNA-based storage, we study the noisy shuffling channel, which can be seen as the concatenation of a standard noisy channel (such as the BSC) and a shuffling channel that breaks the data block into small pieces and shuffles them. This channel models a DNA storage system by capturing two of its key aspects: (1) the data is written onto many short DNA molecules that are stored in an unordered way, and (2) the molecules are corrupted by noise at synthesis, at sequencing, and during storage. For the BSC-shuffling channel we characterize the capacity exactly (for a large set of parameters) and show that a simple index-based coding scheme is optimal.
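To make the index-based idea concrete, here is a toy sketch (our own illustration, not the paper's construction; it ignores the BSC component): each short block is prefixed with its index before shuffling, and the decoder restores the original order by sorting on that prefix. In the noisy setting the indexed blocks would additionally be protected by a channel code.

```python
import random

def index_encode(blocks, index_bits):
    """Prepend a fixed-length binary index to each data block."""
    return [format(i, f'0{index_bits}b') + b for i, b in enumerate(blocks)]

def index_decode(pieces, index_bits):
    """Undo an unknown permutation by sorting on the index prefix."""
    ordered = sorted(pieces, key=lambda p: int(p[:index_bits], 2))
    return [p[index_bits:] for p in ordered]

data = ['1010', '0001', '1111', '0110']   # four short "molecules"
coded = index_encode(data, index_bits=2)
random.shuffle(coded)                     # the shuffling channel
assert index_decode(coded, index_bits=2) == data
```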
This work considers a communication scenario where the transmitter chooses a list of size $K$ from a total of $M$ messages to send over a noisy communication channel, the receiver generates a list of size $L$, and communication is considered successful if the intersection of the lists at the two terminals has cardinality greater than a threshold $T$. In traditional communication systems, $K=L=T=1$. The fundamental limits of this setup in terms of $K$, $L$, $T$, and the Shannon capacity of the channel between the terminals are examined. Specifically, necessary and/or sufficient conditions for asymptotically error-free communication are provided.
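A minimal sketch of the success criterion (our illustration only; the parameter values are arbitrary, and we read the threshold inclusively, i.e., success means the lists share at least $T$ messages, so that the classical $K=L=T=1$ case corresponds to exact recovery):

```python
import random

def communication_successful(tx_list, rx_list, T):
    """Success iff the transmitted and decoded lists share at least T messages
    (inclusive reading of the threshold; an assumption for this sketch)."""
    return len(set(tx_list) & set(rx_list)) >= T

M, K, L, T = 100, 4, 6, 2          # illustrative parameters, not from the paper
messages = range(M)
tx = random.sample(messages, K)    # transmitter's list of K messages
rx = random.sample(messages, L)    # receiver's list of L messages
print(communication_successful(tx, rx, T))
# The classical setting K = L = T = 1 reduces to exact message recovery.
```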
Bent functions, which are maximally nonlinear Boolean functions in an even number of variables whose Hamming distance to the set of all affine functions equals $2^{n-1}\pm 2^{\frac{n}{2}-1}$, were introduced by Rothaus in 1976 in the context of problems in combinatorics. Bent functions have been extensively studied due to their applications in cryptography, such as S-boxes, block ciphers, and stream ciphers; they have also been applied to coding theory, spread spectrum, and combinatorial design. Hyper-bent functions, a special class of bent functions with stronger properties and rarer elements, were introduced by Youssef and Gong in 2001. Much research has focused on the construction of bent and hyper-bent functions. In this paper, we consider functions defined over $\mathbb{F}_{2^n}$ by $f_{a,b}:=\mathrm{Tr}_{1}^{n}(ax^{2^m-1})+\mathrm{Tr}_{1}^{4}(bx^{\frac{2^n-1}{5}})$, where $n=2m$, $m\equiv 2\pmod 4$, $a\in \mathbb{F}_{2^m}$, and $b\in\mathbb{F}_{16}$. When $a\in \mathbb{F}_{2^m}$ and $(b+1)(b^4+b+1)=0$, with the help of Kloosterman sums and the factorization of $x^5+x+a^{-1}$, we present a characterization of the hyper-bentness of $f_{a,b}$. Further, we use generalized Ramanujan-Nagell equations to characterize the hyper-bent functions $f_{a,b}$ in the case $a\in\mathbb{F}_{2^{\frac{m}{2}}}$.
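As a small, self-contained illustration of bentness (our own sketch, checking a textbook quadratic example rather than the trace functions $f_{a,b}$ studied in the paper), recall that an $n$-variable Boolean function is bent iff every Walsh-Hadamard coefficient has magnitude $2^{n/2}$:

```python
from itertools import product

def walsh_spectrum(f, n):
    """Walsh-Hadamard coefficients W_f(a) = sum_x (-1)^{f(x) + <a,x>}."""
    points = list(product((0, 1), repeat=n))
    spectrum = []
    for a in points:
        w = sum((-1) ** (f(x) ^ (sum(ai & xi for ai, xi in zip(a, x)) & 1))
                for x in points)
        spectrum.append(w)
    return spectrum

def is_bent(f, n):
    """f is bent iff |W_f(a)| = 2^{n/2} for every a (n even)."""
    target = 2 ** (n // 2)
    return all(abs(w) == target for w in walsh_spectrum(f, n))

# The quadratic form x1*x2 + x3*x4 is a standard example of a bent function.
f = lambda x: (x[0] & x[1]) ^ (x[2] & x[3])
print(is_bent(f, 4))  # True; its nonlinearity is 2^{n-1} - 2^{n/2-1} = 6
```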
It is shown that for any binary-input discrete memoryless channel $W$ with symmetric capacity $I(W)$ and any rate $R < I(W)$, the probability of block decoding error for polar coding under successive cancellation decoding satisfies $P_e \le 2^{-N^\beta}$ for any $\beta<\frac{1}{2}$ when the block length $N$ is large enough.
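For intuition about this scaling (a sketch we add using the standard polarization recursion for the binary erasure channel, $Z\mapsto 2Z-Z^2$ and $Z\mapsto Z^2$, which is a special case rather than the general result stated above), the snippet below evolves the Bhattacharyya parameters of a BEC, forms the union bound on $P_e$ under successive cancellation decoding, and prints $2^{-N^\beta}$ for comparison:

```python
def polarize_bec(eps, levels):
    """Evolve BEC Bhattacharyya parameters Z through `levels` polarization steps:
    Z -> 2Z - Z^2 (worse synthetic channel) and Z -> Z^2 (better one)."""
    zs = [eps]
    for _ in range(levels):
        zs = [z for x in zs for z in (2 * x - x * x, x * x)]
    return zs

eps, levels = 0.5, 10          # BEC(0.5), block length N = 2^10
zs = sorted(polarize_bec(eps, levels))
N = len(zs)
R = 0.4                        # any rate below I(W) = 1 - eps = 0.5
k = int(R * N)
print(sum(zs[:k]))             # union bound on P_e under SC decoding
print(2 ** (-N ** 0.45))       # 2^{-N^beta} for a beta < 1/2, for comparison
```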
We construct a joint coordination-channel polar coding scheme for strong coordination of actions between two agents $\mathsf{X}$ and $\mathsf{Y}$, which communicate over a discrete memoryless channel (DMC) such that the joint distribution of actions follows a prescribed probability distribution. We show that polar codes are able to achieve our previously established inner bound to the strong noisy coordination capacity region and thus provide a constructive alternative to a random coding proof. Our polar coding scheme also offers a constructive solution to a channel simulation problem where a DMC and shared randomness are together employed to simulate another DMC. In particular, our proposed solution is able to utilize the randomness of the DMC to reduce the amount of local randomness required to generate the sequence of actions at agent $\mathsf{Y}$. By leveraging our earlier random coding results for this problem, we conclude that the proposed joint coordination-channel coding scheme strictly outperforms a separate scheme in terms of achievable communication rate for the same amount of randomness injected into both systems.