
A Bound on the Shannon Capacity via a Linear Programming Variation

Added by Sihuang Hu
Publication date: 2018
Language: English





We prove an upper bound on the Shannon capacity of a graph via a linear programming variation. We show that our bound can outperform both the Lovász theta number and the Haemers minimum rank bound. As a by-product, we also obtain a new upper bound on the broadcast rate of Index Coding.
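The abstract does not spell out the linear program, so nothing below should be read as the paper's construction. As a rough illustration of the general approach (LP relaxations that upper-bound the Shannon capacity), here is the classical fractional clique-cover bound evaluated on the 5-cycle $C_5$; the choice of graph, the clique enumeration, and the use of scipy.optimize.linprog are ours, for the sketch only.

# A minimal sketch, assuming NumPy/SciPy are available; this is the classical
# fractional clique-cover LP, NOT the LP variation introduced in the paper above.
from itertools import combinations
import numpy as np
from scipy.optimize import linprog

n = 5
edges = {(i, (i + 1) % n) for i in range(n)}          # C5: 0-1-2-3-4-0

def is_clique(s):
    return all((a, b) in edges or (b, a) in edges for a, b in combinations(s, 2))

# C5 is triangle-free, so its cliques are single vertices and edges.
cliques = [set(s) for r in (1, 2, 3) for s in combinations(range(n), r) if is_clique(s)]

# Minimise the total clique weight subject to covering every vertex with weight >= 1.
c = np.ones(len(cliques))
A_ub = np.array([[-1.0 if v in K else 0.0 for K in cliques] for v in range(n)])
b_ub = -np.ones(n)
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(0, None))
print("fractional clique cover number of C5:", res.fun)   # 2.5

For $C_5$ this LP gives 2.5, while the Lovász theta number gives the exact capacity $\sqrt{5} \approx 2.236$; finding LP-type bounds that can beat the theta number and the minimum rank bound on other graphs is the theme of the paper summarised above.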



Related research

Motivated by the problem of zero-error broadcasting, we introduce a new notion of graph capacity, termed $\rho$-capacity, that generalizes the Shannon capacity of a graph. We derive upper and lower bounds on the $\rho$-capacity of arbitrary graphs, and provide a Lovász-type upper bound for regular graphs. We study the behavior of the $\rho$-capacity under two graph operations: the strong product and the disjoint union. Finally, we investigate the connection between the structure of a graph and its $\rho$-capacity.
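To make the object being generalised concrete (our own sketch, not code from the paper): the Shannon capacity is $\Theta(G) = \sup_n \alpha(G^{\boxtimes n})^{1/n}$, where $\boxtimes$ is the strong product and $\alpha$ the independence number. The small snippet below, assuming NetworkX is installed, checks the classical fact that $\alpha(C_5) = 2$ but $\alpha(C_5 \boxtimes C_5) = 5$, so $\Theta(C_5) \ge \sqrt{5}$.

# A minimal sketch, assuming NetworkX; alpha(G) is computed by brute force
# as the clique number of the complement graph.
import networkx as nx

C5 = nx.cycle_graph(5)
C5_2 = nx.strong_product(C5, C5)          # strong product, 25 vertices

def independence_number(G):
    return max(len(c) for c in nx.find_cliques(nx.complement(G)))

print(independence_number(C5))    # 2
print(independence_number(C5_2))  # 5  ->  Theta(C5) >= 5 ** 0.5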
In this note, we provide a description of the elements of minimum rank of a generalized Gabidulin code in terms of Grassmann coordinates. As a consequence, a characterization of linearized polynomials of rank at most $n-k$ is obtained, as well as parametric equations for MRD-codes of distance $d=n-k+1$.
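For context, a standard fact behind the parameters quoted above (not restated in the note itself): a linear rank-metric code $\mathcal{C} \subseteq GF(q^m)^n$ with $n \le m$ and dimension $k$ obeys the rank-metric Singleton bound
$$ d_{\mathrm{rk}}(\mathcal{C}) \le n - k + 1, $$
and codes meeting it with equality are exactly the MRD codes, which is why distance $d = n-k+1$ characterises the MRD case.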
Shannon gave a lower bound in 1959 on the binary rate of spherical codes of given minimum Euclidean distance $\rho$. Using nonconstructive codes over a finite alphabet, we give a lower bound that is weaker but very close for small values of $\rho$. The construction is based on the Yaglom map combined with some finite sphere packings obtained from nonconstructive codes for the Euclidean metric. Concatenating geometric codes meeting the TVZ bound with a Lee metric BCH code over $GF(p)$, we obtain spherical codes that are polynomial time constructible. Their parameters outperform those obtained by Lachaud and Stern in 1994. At very high rate they are above 98 per cent of the Shannon bound.
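For reference, the classical benchmark being compared against, in our paraphrase (the abstract does not restate it): writing the minimum Euclidean distance as $\rho = 2\sin(\theta/2)$ for minimum angle $\theta$, Shannon's 1959 cap-counting argument yields spherical codes of binary rate at least
$$ R \ge -\log_2 \sin\theta = -\log_2\!\left(\rho\sqrt{1 - \rho^2/4}\right). $$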
Tobias Koch, 2015
The Shannon lower bound is one of the few lower bounds on the rate-distortion function that hold for a large class of sources. In this paper, it is demonstrated that its gap to the rate-distortion function vanishes as the allowed distortion tends to zero for all sources having a finite differential entropy and whose integer part has finite entropy. Conversely, it is demonstrated that if the integer part of the source has infinite entropy, then its rate-distortion function is infinite for every finite distortion. Consequently, the Shannon lower bound provides an asymptotically tight bound on the rate-distortion function if, and only if, the integer part of the source has finite entropy.
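As a concrete instance of the bound discussed above (a standard form, not part of the abstract): for a real-valued source $X$ with differential entropy $h(X)$ under mean-squared-error distortion, the Shannon lower bound reads
$$ R(D) \ge h(X) - \tfrac{1}{2}\log_2(2\pi e D), $$
and the result above says this becomes tight as $D \to 0$ precisely when the integer part $\lfloor X \rfloor$ has finite entropy.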
We derive a single-letter upper bound on the mismatched-decoding capacity of discrete memoryless channels. The bound is expressed as the mutual information of a transformation of the channel, such that a maximum-likelihood decoding error on the transformed channel implies a mismatched-decoding error in the original channel. In particular, a strong converse is shown to hold for this upper bound: if the rate exceeds it, the probability of error tends to 1 exponentially as the block length tends to infinity. We also show that the underlying optimization problem is a convex-concave problem and that an efficient iterative algorithm converges to the optimal solution. In addition, we show that, unlike achievable rates in the literature, the multi-letter version of the bound does not improve. A number of examples are discussed throughout the paper.
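For orientation (standard setup in our notation, not taken from the abstract): a mismatched decoder with a fixed single-letter metric $q$ decides
$$ \hat{m} = \arg\max_{m} \sum_{i=1}^{n} \log q\big(x_i(m), y_i\big), $$
and the mismatch capacity is the supremum of rates achievable with this fixed rule; the bound above upper-bounds that quantity by the mutual information of a suitably transformed channel.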