
Communicating Lists Over a Noisy Channel

Posted by Mustafa Kocak
Publication date: 2014
Research field: Information Engineering
Paper language: English





This work considers a communication scenario in which the transmitter chooses a list of K messages out of a total of M to send over a noisy communication channel, the receiver generates a list of size L, and communication is considered successful if the intersection of the two lists has cardinality greater than a threshold T. In traditional communication systems, K=L=T=1. The fundamental limits of this setup are examined in terms of K, L, T, and the Shannon capacity of the channel between the terminals. Specifically, necessary and/or sufficient conditions for asymptotically error-free communication are provided.
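A minimal formalization of the success criterion described above, in notation chosen here rather than taken from the paper (the rate conditions involving the Shannon capacity are the paper's contribution and are not reproduced). The "at least T" reading of the threshold is the one consistent with the classical case K = L = T = 1.

% Illustration only: \mathcal{A}, \mathcal{B} denote the transmit and receive lists.
Let $\mathcal{M} = \{1, \dots, M\}$ be the message set. The transmitter selects a list
$\mathcal{A} \subseteq \mathcal{M}$ with $|\mathcal{A}| = K$, and after observing the channel
output the receiver produces a list $\mathcal{B} \subseteq \mathcal{M}$ with $|\mathcal{B}| = L$.
Communication is declared successful when
\[
  |\mathcal{A} \cap \mathcal{B}| \;\ge\; T ,
\]
which for $K = L = T = 1$ reduces to the usual requirement that the single decoded message
equal the single transmitted one.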




Read also

We consider the problem of communicating over a channel that randomly tears the message block into small pieces of different sizes and shuffles them. For the binary torn-paper channel with block length $n$ and pieces of length ${\rm Geometric}(p_n)$, we characterize the capacity as $C = e^{-\alpha}$, where $\alpha = \lim_{n\to\infty} p_n \log n$. Our results show that the case of ${\rm Geometric}(p_n)$-length fragments and the case of deterministic length-$(1/p_n)$ fragments are qualitatively different and, surprisingly, the capacity of the former is larger. Intuitively, this is due to the fact that, in the random-fragments case, large fragments are sometimes observed, which boosts the capacity.
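As a rough illustration of the channel model (not of any capacity-achieving scheme), the sketch below tears a binary codeword into fragments whose lengths are i.i.d. Geometric(p) and returns them in random order; only the fragment-length law comes from the abstract, everything else is an assumption made for this example.

import random

def torn_paper_channel(codeword: str, p: float, rng: random.Random) -> list:
    """Tear a binary codeword into Geometric(p)-length pieces and shuffle them.

    This only simulates the channel described in the abstract; it says nothing
    about how to code for it or about the capacity C = exp(-alpha).
    """
    fragments = []
    i = 0
    while i < len(codeword):
        # Geometric(p) on {1, 2, ...}: number of Bernoulli(p) trials until first success.
        length = 1
        while rng.random() >= p:
            length += 1
        fragments.append(codeword[i:i + length])
        i += length
    rng.shuffle(fragments)  # the receiver sees the pieces in random order
    return fragments

if __name__ == "__main__":
    rng = random.Random(0)
    n = 64
    p = 0.1  # expected fragment length 1/p = 10
    codeword = "".join(rng.choice("01") for _ in range(n))
    pieces = torn_paper_channel(codeword, p, rng)
    print(len(pieces), "fragments:", pieces)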
We construct a joint coordination-channel polar coding scheme for strong coordination of actions between two agents $\mathsf{X}$ and $\mathsf{Y}$, which communicate over a discrete memoryless channel (DMC) such that the joint distribution of actions follows a prescribed probability distribution. We show that polar codes are able to achieve our previously established inner bound to the strong noisy coordination capacity region and thus provide a constructive alternative to a random coding proof. Our polar coding scheme also offers a constructive solution to a channel simulation problem where a DMC and shared randomness are together employed to simulate another DMC. In particular, our proposed solution is able to utilize the randomness of the DMC to reduce the amount of local randomness required to generate the sequence of actions at agent $\mathsf{Y}$. By leveraging our earlier random coding results for this problem, we conclude that the proposed joint coordination-channel coding scheme strictly outperforms a separate scheme in terms of achievable communication rate for the same amount of injected randomness into both systems.
We study the problem of strong coordination of the actions of two nodes $X$ and $Y$ that communicate over a discrete memoryless channel (DMC) such that the actions follow a prescribed joint probability distribution. We propose two novel random coding schemes and a polar coding scheme for this noisy strong coordination problem, and derive inner bounds for the respective strong coordination capacity region. The first scheme is a joint coordination-channel coding scheme that utilizes the randomness provided by the DMC to reduce the amount of local randomness required to generate the sequence of actions at Node $Y$. Based on this random coding scheme, we provide a characterization of the capacity region for two special cases of the noisy strong coordination setup, namely, when the actions at Node $Y$ are determined by Node $X$ and when the DMC is a deterministic channel. The second scheme exploits separate coordination and channel coding where local randomness is extracted from the channel after decoding. The third scheme is a joint coordination-channel polar coding scheme for strong coordination. We show that polar codes are able to achieve the established inner bound to the noisy strong coordination capacity region and thus provide a constructive alternative to a random coding proof. Our polar coding scheme also offers a constructive solution to a channel simulation problem where a DMC and shared randomness are employed together to simulate another DMC. Finally, by leveraging the random coding results for this problem, we present an example in which the proposed joint scheme is able to strictly outperform the separate scheme in terms of achievable communication rate for the same amount of injected randomness into both systems. Thus, we establish the sub-optimality of the separation of strong coordination and channel coding with respect to the communication rate over the DMC.
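For context, "strong coordination" is commonly formalized as vanishing total variation distance between the induced and the target action distributions; the statement below follows the standard definition from the coordination-capacity literature and is not a result quoted from the papers above.

% Standard notion of strong coordination, stated for reference.
A scheme achieves strong coordination of the action sequences $(X^n, Y^n)$ with respect to a
prescribed per-letter joint distribution $q_{XY}$ if
\[
  \lim_{n \to \infty} \Bigl\| P_{X^n Y^n} - \prod_{i=1}^{n} q_{XY} \Bigr\|_{\mathrm{TV}} = 0 ,
\]
i.e. the generated action pairs are statistically indistinguishable, in total variation,
from i.i.d. draws of $q_{XY}$.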
In this study, we generalize the problem of sampling a scalar Gauss-Markov process, namely the Ornstein-Uhlenbeck (OU) process, in which samples are sent to a remote estimator that forms a causal estimate of the observed real-time signal. In recent years, this problem has been solved for stable OU processes. We present the optimal sampling policy, which achieves a smaller estimation error, for both the stable and unstable cases of the OU process, along with the special case in which the OU process reduces to a Wiener process. The resulting optimal sampling policy is a threshold policy; however, the thresholds differ across the three cases. We then consider additive noise on the samples when the sampling decision is made beforehand, so that the estimator must use noisy samples to estimate the current signal value. The mean-square error (MSE) changes due to the noise; we solve for the additional term in the MSE, which provides a performance upper bound and motivates further investigation into an optimal sampling strategy that minimizes the estimation error when the observed samples are noisy. Numerical results show the performance degradation caused by the additive noise.
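The threshold structure mentioned above can be illustrated with a small simulation. The sketch below uses a stable OU process, the standard conditional-mean estimate between samples, and a hypothetical threshold value beta; none of the parameter values or the threshold are taken from the paper's actual derivations.

import math
import random

def simulate_threshold_sampling(theta=1.0, sigma=1.0, beta=0.5,
                                dt=1e-3, horizon=10.0, seed=0):
    """Simulate threshold-based sampling of a stable OU process.

    dX_t = -theta * X_t dt + sigma dW_t.  Between samples the remote estimator
    uses the conditional mean  X_hat(t) = X(s) * exp(-theta * (t - s)),
    and a new sample is taken whenever |X(t) - X_hat(t)| >= beta.
    (beta is a hypothetical threshold chosen for illustration only.)
    """
    rng = random.Random(seed)
    x = 0.0                      # current process value
    last_sample, last_time = x, 0.0
    sample_times, sq_err = [], 0.0
    steps = int(horizon / dt)
    for k in range(1, steps + 1):
        t = k * dt
        # Euler-Maruyama step of the OU dynamics.
        x += -theta * x * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        x_hat = last_sample * math.exp(-theta * (t - last_time))
        err = x - x_hat
        sq_err += err * err * dt
        if abs(err) >= beta:     # threshold policy: sample when the error is large
            last_sample, last_time = x, t
            sample_times.append(t)
    return len(sample_times), sq_err / horizon

if __name__ == "__main__":
    n_samples, mse = simulate_threshold_sampling()
    print(f"samples taken: {n_samples}, time-averaged squared error: {mse:.4f}")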
We establish the deterministic-code capacity region of a network with one transmitter and two receivers: an ordinary receiver and a robust receiver. The channel to the ordinary receiver is a given (known) discrete memoryless channel (DMC), whereas the channel to the robust receiver is an arbitrarily varying channel (AVC). Both receivers are required to decode the common message, whereas only the ordinary receiver is required to decode the private message.