Capacity Bounds for Discrete-Time, Amplitude-Constrained, Additive White Gaussian Noise Channels

Added by Andrew Thangaraj
Publication date: 2015
Language: English

The capacity-achieving input distribution of the discrete-time, additive white Gaussian noise (AWGN) channel with an amplitude constraint is discrete and seems difficult to characterize explicitly. A dual capacity expression is used to derive analytic capacity upper bounds for scalar and vector AWGN channels. The scalar bound improves on McKellips' bound and is within 0.1 bits of capacity for all signal-to-noise ratios (SNRs). The two-dimensional bound is within 0.15 bits of capacity provably up to 4.5 dB, and numerical evidence suggests a similar gap at all SNRs.
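For reference, McKellips' bound admits a simple closed form: for amplitude constraint $\mathsf{A}$ and noise variance $\sigma^2$, $C \le \log_2(1 + 2\mathsf{A}/\sqrt{2\pi e \sigma^2})$ bits. The sketch below compares it against the standard entropy-power-inequality (EPI) lower bound obtained from a uniform input on $[-\mathsf{A}, \mathsf{A}]$; the function names are ours, and this is an illustrative sandwich of the capacity, not the paper's dual upper bound.

```python
import numpy as np

def mckellips_upper(A, sigma=1.0):
    # McKellips' closed-form upper bound (in bits) for the
    # amplitude-constrained scalar AWGN channel.
    return np.log2(1.0 + 2.0 * A / (np.sqrt(2.0 * np.pi * np.e) * sigma))

def epi_lower(A, sigma=1.0):
    # Lower bound (in bits) from the entropy power inequality with a
    # uniform input on [-A, A], for which h(X) = log(2A).
    return 0.5 * np.log2(1.0 + 2.0 * A**2 / (np.pi * np.e * sigma**2))

for A in (0.5, 1.0, 2.0, 5.0, 10.0):
    print(f"A = {A:5.1f}: {epi_lower(A):.3f} <= C <= {mckellips_upper(A):.3f} bits")
```

Per the abstract above, the paper's dual expression sharpens this upper bound to within 0.1 bits of capacity at all SNRs.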



Related research


In this work, novel upper and lower bounds for the capacity of channels with arbitrary constraints on the support of the channel input symbols are derived. As an immediate practical application, the case of multiple-input multiple-output channels with amplitude constraints is considered. The bounds are shown to be within a constant gap if the channel matrix is invertible and are tight in the high amplitude regime for arbitrary channel matrices. Moreover, in the high amplitude regime, it is shown that the capacity scales linearly with the minimum between the number of transmit and receive antennas, similarly to the case of average power-constrained inputs.
We derive bounds on the noncoherent capacity of a very general class of multiple-input multiple-output channels that allow for selectivity in time and frequency as well as for spatial correlation. The bounds apply to peak-constrained inputs; they are explicit in the channel's scattering function, are useful for a large range of bandwidths, and allow one to coarsely identify the capacity-optimal combination of bandwidth and number of transmit antennas. Furthermore, we obtain a closed-form expression for the first-order Taylor series expansion of capacity in the limit of infinite bandwidth. From this expression, we conclude that in the wideband regime: (i) it is optimal to use only one transmit antenna when the channel is spatially uncorrelated; (ii) rank-one statistical beamforming is optimal if the channel is spatially correlated; and (iii) spatial correlation, be it at the transmitter, the receiver, or both, is beneficial.
This paper studies an $n$-dimensional additive Gaussian noise channel with a peak-power-constrained input. It is well known that, in this case, when $n=1$ the capacity-achieving input distribution is discrete with finitely many mass points, and when $n>1$ the capacity-achieving input distribution is supported on finitely many concentric shells. However, due to the previous proof technique, neither the exact number of mass points/shells of the optimal input distribution nor a bound on it was available. This paper provides an alternative proof of the finiteness of the number of mass points/shells of the capacity-achieving input distribution and produces the first firm bounds on the number of mass points and shells, paving an alternative way for approaching many such problems. Roughly, the paper consists of three parts. The first part considers the case of $n=1$. The first result, in this part, shows that the number of mass points in the capacity-achieving input distribution is within a factor of two of the number of zeros of the downward-shifted capacity-achieving output probability density function (pdf). The second result, by showing a bound on the number of zeros of the downward-shifted capacity-achieving output pdf, provides a first firm upper bound on the number of mass points. Specifically, it is shown that the number of mass points is given by $O(\mathsf{A}^2)$, where $\mathsf{A}$ is the constraint on the input amplitude. The second part generalizes the results of the first part to the case of $n>1$. In particular, for every dimension $n>1$, it is shown that the number of shells is given by $O(\mathsf{A}^2)$, where $\mathsf{A}$ is the constraint on the input amplitude. Finally, the third part provides bounds on the number of mass points for the case of $n=1$ with an additional power constraint.
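The discreteness of the optimal input for $n=1$ is easy to observe numerically: running the Blahut-Arimoto algorithm on a finely quantized version of the amplitude-constrained channel makes the input mass collect on a handful of grid points. The sketch below is a grid discretization we chose for illustration (the parameters $\mathsf{A}=2$, $\sigma=1$ are arbitrary), not the proof technique of the paper.

```python
import numpy as np

A, sigma = 2.0, 1.0
x = np.linspace(-A, A, 201)                          # quantized input alphabet
y = np.linspace(-A - 5 * sigma, A + 5 * sigma, 400)  # output grid
# Transition matrix W[i, j] ~ p(y_j | x_i), row-normalized over the grid.
W = np.exp(-(y[None, :] - x[:, None]) ** 2 / (2 * sigma**2))
W /= W.sum(axis=1, keepdims=True)

p = np.full(len(x), 1.0 / len(x))           # start from a uniform prior
for _ in range(2000):
    q = p @ W                               # induced output distribution
    D = np.sum(W * np.log(W / q), axis=1)   # D( W(.|x_i) || q ), in nats
    p *= np.exp(D)                          # Blahut-Arimoto update
    p /= p.sum()

print(f"capacity (discretized) ~ {p @ D / np.log(2):.3f} bits")
print("mass points near:", x[p > 1e-3])
```

On this grid the mass typically clusters around a few isolated points, matching the finitely-many-mass-points picture; the $O(\mathsf{A}^2)$ bound above caps how fast that number can grow with the amplitude constraint.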
This paper studies a two-user state-dependent Gaussian multiple-access channel (MAC) with state noncausally known at one encoder. Two scenarios are considered: i) each user wishes to communicate an independent message to the common receiver, and ii) the two encoders send a common message to the receiver and the non-cognitive encoder (i.e., the encoder that does not know the state) sends an independent individual message (this model is also known as the MAC with degraded message sets). For both scenarios, new outer bounds on the capacity region are derived, which improve uniformly over the best known outer bounds. In the first scenario, the two corner points of the capacity region as well as the sum rate capacity are established, and it is shown that a single-letter solution is adequate to achieve both the corner points and the sum rate capacity. Furthermore, the full capacity region is characterized in situations in which the sum rate capacity is equal to the capacity of the helper problem. The proof exploits the optimal-transportation idea of Polyanskiy and Wu (which was used previously to establish an outer bound on the capacity region of the interference channel) and the worst-case Gaussian noise result for the case in which the input and the noise are dependent.
Amir Saberi, Farhad Farokhi, 2020
It is known that for a discrete channel with correlated additive noise, the ordinary capacities with and without feedback are both equal to $\log q - \mathcal{H}(Z)$, where $\mathcal{H}(Z)$ is the entropy rate of the noise process $Z$ and $q$ is the alphabet size. In this paper, a class of finite-state additive noise channels is introduced. It is shown that the zero-error feedback capacity of such channels is either zero or $C_{0f} = \log q - h(Z)$, where $h(Z)$ is the \emph{topological entropy} of the noise process. A topological condition is given for when the zero-error capacity is zero, with or without feedback. Moreover, the zero-error capacity without feedback is lower-bounded by $\log q - 2h(Z)$. We explicitly compute the zero-error feedback capacity for several examples, including channels with isolated errors and a Gilbert-Elliott channel.
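When the noise process is a subshift of finite type, its topological entropy is the log of the Perron (largest-modulus) eigenvalue of the transition adjacency matrix, which makes $C_{0f}$ straightforward to evaluate. Below is a minimal sketch under that assumption; the adjacency matrix encodes a hypothetical isolated-errors constraint (an error must be followed by an error-free symbol), and we assume the channel falls in the nonzero case of the dichotomy.

```python
import numpy as np

def zero_error_feedback_capacity(T, q):
    # h(Z) = log2 of the Perron eigenvalue of the noise process's
    # transition adjacency matrix T (topological entropy, in bits).
    h = np.log2(max(abs(np.linalg.eigvals(T))))
    # Valid only in the nonzero case of C0f in {0, log q - h(Z)}.
    return np.log2(q) - h

# Isolated errors over a binary alphabet: allowed noise-state transitions
# are 0 -> 0, 0 -> 1, 1 -> 0 (no two consecutive errors), i.e. the
# golden-mean shift, whose entropy is log2 of the golden ratio.
T = np.array([[1.0, 1.0],
              [1.0, 0.0]])
print(f"C0f = {zero_error_feedback_capacity(T, q=2):.3f} bits")  # ~ 0.306
```

Whether a given channel falls in the zero or nonzero case is decided by the topological condition in the paper, which this sketch does not check.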
