For a (single-source) multicast network, the size of the base field is the best-known and most studied algebraic identity involved in characterizing its linear solvability over that field. In this paper, we design a new class $\mathcal{N}$ of multicast networks and obtain an explicit formula for the linear solvability of these networks, which involves the associated coset numbers of a multiplicative subgroup in a base field. The concise formula turns out to be the first that matches the topological structure of a multicast network with algebraic identities of a field other than its size. It further enables us to unveil \emph{infinitely many} new multicast networks linearly solvable over GF($q$) but not over GF($q'$) with $q < q'$, based on a subgroup order criterion. In particular, i) for every $k \geq 2$, an instance in $\mathcal{N}$ can be found that is linearly solvable over GF($2^{2k}$) but \emph{not} over GF($2^{2k+1}$), and ii) for arbitrary distinct primes $p$ and $p'$, there are infinitely many $k$ and $k'$ such that an instance in $\mathcal{N}$ can be found that is linearly solvable over GF($p^k$) but \emph{not} over GF($p'^{k'}$) with $p^k < p'^{k'}$. On the other hand, the construction of $\mathcal{N}$ also leads to a new class of multicast networks with $\Theta(q^2)$ nodes and $\Theta(q^2)$ edges, where $q \geq 5$ is the minimum field size for linear solvability of the network.
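As a small, self-contained illustration of the algebraic objects such a coset-based formula is built from (a sketch of ours, not the paper's construction), the Python snippet below enumerates the cosets of the order-$d$ multiplicative subgroup of GF($q$)$^*$ for a prime $q$; the coset count $(q-1)/d$ is the kind of quantity the abstract's formula involves.

```python
# Illustrative sketch: enumerate the cosets of a multiplicative subgroup
# of GF(q)^* for a prime q. The subgroup order d must divide q - 1.

def subgroup_cosets(q: int, d: int):
    """Return the cosets of the order-d multiplicative subgroup of GF(q)^*.

    Assumes q is prime and d divides q - 1.
    """
    assert (q - 1) % d == 0, "subgroup order must divide q - 1"

    def order(a):
        # Multiplicative order of a in GF(q)^* (brute force; q is small).
        x, n = a, 1
        while x != 1:
            x = x * a % q
            n += 1
        return n

    g = next(a for a in range(2, q) if order(a) == q - 1)  # generator of GF(q)^*
    h = pow(g, (q - 1) // d, q)           # generator of the order-d subgroup
    H = {pow(h, i, q) for i in range(d)}  # the subgroup itself
    cosets, seen = [], set()
    for a in range(1, q):
        if a not in seen:
            coset = {a * x % q for x in H}
            cosets.append(sorted(coset))
            seen |= coset
    return cosets

# Example: GF(7)^* has a unique subgroup of order 3, namely {1, 2, 4},
# with the single nontrivial coset {3, 5, 6}.
print(subgroup_cosets(7, 3))  # [[1, 2, 4], [3, 5, 6]]
```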
We consider linear network error correction (LNEC) coding when errors may occur on the edges of a communication network whose topology is known. In this paper, we first revisit and explore the framework of LNEC coding, and then unify two well-known …
Sparse random linear network coding (SRLNC), used as a class of erasure codes to ensure the reliability of multicast communications, has been widely investigated. However, an exact expression for the decoding success probability of SRLNC is still unknown.
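Given that the exact expression is stated to be open, a Monte Carlo estimate is a natural baseline. The sketch below is a hedged illustration under simplifying assumptions of ours: coding over GF(2), a receiver collecting exactly $n$ coded packets, and decoding succeeding iff the $n \times n$ coefficient matrix has full rank; `sparsity` is our name for the per-coefficient zero probability.

```python
# Monte Carlo estimate of the SRLNC decoding success probability over GF(2).
import random

def gf2_full_rank(rows, n):
    """Gaussian elimination over GF(2); rows are ints used as bit vectors."""
    rank = 0
    for col in range(n):
        pivot = next((i for i in range(rank, len(rows))
                      if rows[i] >> col & 1), None)
        if pivot is None:
            continue
        rows[rank], rows[pivot] = rows[pivot], rows[rank]
        for i in range(len(rows)):
            if i != rank and rows[i] >> col & 1:
                rows[i] ^= rows[rank]  # clear bit 'col' in every other row
        rank += 1
    return rank == n

def success_probability(n=16, sparsity=0.8, trials=20000):
    ok = 0
    for _ in range(trials):
        # Each coefficient is zero with probability 'sparsity'.
        rows = [sum((random.random() >= sparsity) << j for j in range(n))
                for _ in range(n)]
        ok += gf2_full_rank(rows, n)
    return ok / trials

print(success_probability())  # prints the estimated success probability
```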
In this paper we introduce Neural Network Coding (NNC), a data-driven approach to joint source and network coding. In NNC, the encoders at each source and intermediate node, as well as the decoder at each destination node, are neural networks which are …
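Since the abstract describes the architecture only at a high level, here is a minimal, hypothetical sketch of the idea: MLP encoders at two source nodes, one intermediate mixing node, and a decoder at a destination, all trained jointly end to end in PyTorch. The topology, layer sizes, and MSE reconstruction loss are all our assumptions for illustration, not the paper's design.

```python
# Hypothetical NNC-style sketch: per-node neural encoders and a destination
# decoder, optimized jointly end to end (all design choices are ours).
import torch
import torch.nn as nn

enc1 = nn.Sequential(nn.Linear(4, 3), nn.ReLU())  # source-node encoder 1
enc2 = nn.Sequential(nn.Linear(4, 3), nn.ReLU())  # source-node encoder 2
mix = nn.Sequential(nn.Linear(6, 3), nn.ReLU())   # intermediate-node code
dec = nn.Linear(3, 8)                             # destination decoder

params = [*enc1.parameters(), *enc2.parameters(),
          *mix.parameters(), *dec.parameters()]
opt = torch.optim.Adam(params, lr=1e-2)

for step in range(500):
    x1, x2 = torch.randn(64, 4), torch.randn(64, 4)
    # Intermediate node combines both incoming streams into one bottleneck
    # message; the decoder tries to reconstruct both sources from it.
    y = dec(mix(torch.cat([enc1(x1), enc2(x2)], dim=1)))
    loss = nn.functional.mse_loss(y, torch.cat([x1, x2], dim=1))
    opt.zero_grad()
    loss.backward()
    opt.step()
```

The 3-dimensional bottleneck makes the reconstruction lossy by design; the point of the sketch is only that every node's map is a trainable network optimized through the whole topology.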
In this paper, we study the effect of a single link on the capacity of a network of error-free bit pipes. More precisely, we study the change in network capacity that results when we remove a single link of capacity $\delta$. In a recent result, we proved …
The conventional theory of linear network coding (LNC) applies only to acyclic networks. Convolutional network coding (CNC) applies to all networks. It is also a form of LNC, but the linearity is w.r.t. the ring of rational power series rather than the base field.
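To make the notion of linearity over power series concrete: each edge carries a symbol stream viewed as a power series in the delay variable $D$, and a local encoding kernel acts by multiplication in that ring, i.e., by convolution in time. The toy GF(2) example below (our sketch, not from the paper) convolves a source stream with the hypothetical kernel $1 + D$.

```python
# Toy CNC-flavored example: polynomial coefficients over GF(2) stand in for
# (truncated) power series in the delay variable D; index = power of D.

def gf2_poly_mul(a, b):
    """Multiply two GF(2) polynomials given as 0/1 coefficient lists."""
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        if ai:
            for j, bj in enumerate(b):
                out[i + j] ^= bj  # addition in GF(2) is XOR
    return out

source = [1, 0, 1, 1]  # source stream x(D) = 1 + D^2 + D^3
kernel = [1, 1]        # local encoding kernel k(D) = 1 + D
edge_stream = gf2_poly_mul(source, kernel)
print(edge_stream)     # [1, 1, 1, 0, 1] -> 1 + D + D^2 + D^4
```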