We investigate error propagation in sliding window decoding of braided convolutional codes (BCCs). Previous studies of BCCs have focused on iterative decoding thresholds, minimum distance properties, and their bit error rate (BER) performance at small to moderate frame lengths. Here, we consider a sliding window decoder operating on large frame lengths, or one that continuously outputs blocks in a streaming fashion. In this case, decoder error propagation, due to the feedback inherent in BCCs, can be a serious problem. In order to mitigate the effects of error propagation, we propose several schemes: a window extension algorithm, in which the decoder window size can be extended adaptively; a resynchronization mechanism, in which the encoder is reset to its initial state; and a retransmission strategy, in which erroneously decoded blocks are retransmitted. In addition, we introduce a soft BER stopping rule to reduce computational complexity, and the tradeoff between performance and complexity is examined. Simulation results show that, using the proposed window extension algorithm, resynchronization mechanism, and retransmission strategy, the BER performance of BCCs can be improved by up to four orders of magnitude in the signal-to-noise ratio operating range of interest, and the soft BER stopping rule can additionally be employed to reduce computational complexity.
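The mitigation schemes above amount to a control loop wrapped around a window decoder. The sketch below is a minimal illustration under assumed interfaces, not the authors' implementation: the decoder function decode_window, the window-size limits, and the BER-estimate threshold are illustrative assumptions.

# Minimal sketch (not the authors' implementation) of a sliding window decoder
# wrapper with adaptive window extension and resynchronization. The decoder
# interface `decode_window`, the window-size limits, and the BER-estimate
# threshold are illustrative assumptions.

def stream_decode(blocks, decode_window, w_min=6, w_max=12, ber_threshold=1e-5):
    """Decode a stream of received blocks with a sliding window.

    `decode_window(blocks, start, w)` is assumed to return
    (decoded_block, estimated_ber) for the target block at position
    `start` using window size `w`.
    """
    decoded = []
    for t in range(len(blocks)):
        w = w_min
        while True:
            block_hat, ber_est = decode_window(blocks, t, w)
            if ber_est <= ber_threshold or w >= w_max:
                break
            w += 1  # window extension: enlarge the window and decode again
        decoded.append(block_hat)
        if ber_est > ber_threshold:
            # Resynchronization: in the schemes described above, the encoder
            # would be reset to its initial state here to halt error propagation.
            request_resync(t)
    return decoded

def request_resync(position):
    """Placeholder for the assumed resynchronization signal to the encoder."""
    print(f"resynchronization requested at block {position}")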
In this paper, we study sliding window decoding of braided convolutional codes (BCCs) in the context of a streaming application, where decoder error propagation can be a serious problem. A window extension algorithm and a resynchronization mechanism are introduced to mitigate the effect of error propagation. In addition, we introduce a soft bit-error-rate stopping rule to reduce computational complexity, and the tradeoff between performance and complexity is examined. Simulation results show that, using the proposed window extension algorithm and resynchronization mechanism, the error performance of BCCs can be improved by up to three orders of magnitude with reduced computational complexity.
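One common way to estimate the BER of a block from soft decoder outputs is via the magnitudes of the a posteriori LLRs; whether this matches the exact stopping rule used here is an assumption, and the target value below is illustrative.

import numpy as np

def soft_ber_estimate(llrs):
    """Estimate the block BER from a posteriori LLR magnitudes:
    each bit contributes an error probability of 1 / (1 + exp(|L|))."""
    llrs = np.asarray(llrs, dtype=float)
    return float(np.mean(1.0 / (1.0 + np.exp(np.abs(llrs)))))

def stop_iterations(llrs, target=1e-6):
    """Soft BER stopping rule: halt decoding iterations once the
    estimated BER of the current block drops below `target`."""
    return soft_ber_estimate(llrs) < target

# Highly reliable LLRs (large magnitudes) give a tiny estimated BER.
print(stop_iterations([20.0, -22.0, 25.0, -19.0]))  # True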
A Viterbi-like decoding algorithm is proposed in this paper for generalized convolutional network error correction coding. Different from the classical Viterbi algorithm, our decoding algorithm is based on the minimum error weight rather than the shortest Hamming distance between the received and transmitted sequences. Network errors may disperse or neutralize due to network transmission and convolutional network coding, so the classical decoding algorithm can no longer be employed. Source decoding has previously been performed by multiplying by the inverse of the network transmission matrix, which is hard to compute. Starting from the maximum a posteriori (MAP) decoding criterion, we find that it is equivalent to the minimum error weight under our model. Inspired by the Viterbi algorithm, we propose a Viterbi-like decoding algorithm based on the minimum error weight of combined error vectors, which can be carried out directly at sink nodes and can correct any network errors within the capability of convolutional network error correction codes (CNECC). Under certain conditions, the proposed algorithm realizes distributed decoding of CNECC.
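The following is a minimal sketch of a Viterbi-style recursion of the kind described above, with the branch metric being an accumulated error weight. The toy trellis interface, and the use of a plain XOR to form the branch error vector, are simplifying assumptions; in the CNECC setting the error vectors are combined through the network transfer matrices.

def viterbi_min_error_weight(received, branches, n_states, start_state=0):
    """Viterbi-like search minimizing accumulated error weight.

    `received` : list of received output tuples (one per trellis section)
    `branches` : branches[state][u] = (next_state, output_tuple) for input u
    Branch metric: weight of the error vector reconciling the received symbol
    with the branch output (here simply their XOR; in the CNECC setting this
    step would involve the network transfer matrices).
    """
    INF = float("inf")
    metric = [INF] * n_states
    metric[start_state] = 0
    paths = {start_state: []}
    for r in received:
        new_metric = [INF] * n_states
        new_paths = {}
        for s, m_s in enumerate(metric):
            if m_s == INF:
                continue
            for u, (s_next, out) in enumerate(branches[s]):
                err_weight = sum(a ^ b for a, b in zip(r, out))
                m = m_s + err_weight
                if m < new_metric[s_next]:
                    new_metric[s_next] = m
                    new_paths[s_next] = paths[s] + [u]
        metric, paths = new_metric, new_paths
    best = min(range(n_states), key=metric.__getitem__)
    return paths[best], metric[best]

# Example: a 2-state toy trellis with rate-1/2 outputs.
branches = [
    [(0, (0, 0)), (1, (1, 1))],  # from state 0: input 0 / input 1
    [(0, (1, 0)), (1, (0, 1))],  # from state 1
]
print(viterbi_min_error_weight([(1, 1), (0, 1), (0, 0)], branches, n_states=2))
# ([1, 1, 0], 1): estimated input sequence and total error weight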
In this paper, we perform a threshold analysis of braided convolutional codes (BCCs) on the additive white Gaussian noise (AWGN) channel. The decoding thresholds are estimated by Monte-Carlo density evolution (MC-DE) techniques and compared with approximate thresholds from an erasure channel prediction. The results show that, with spatial coupling, the predicted thresholds are very accurate and quickly approach capacity as the coupling memory is increased. For uncoupled ensembles with random puncturing, the prediction can be improved with the help of the AWGN threshold of the unpunctured ensemble.
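As an illustration of the MC-DE technique itself, applied here to a simple (3,6)-regular LDPC ensemble on the binary-input AWGN channel rather than to the BCC ensembles analyzed above, message densities can be tracked by sampling:

import numpy as np

def mc_de(sigma, dv=3, dc=6, iters=50, n=100_000, seed=0):
    """Monte-Carlo density evolution for a (dv, dc)-regular LDPC ensemble on
    the BIAWGN channel with noise standard deviation `sigma`. Message
    densities are represented by `n` samples; the all-(+1) BPSK sequence is
    assumed transmitted. Returns the error probability of the
    variable-to-check messages after `iters` iterations."""
    rng = np.random.default_rng(seed)

    def channel_llrs():
        return 2.0 * (1.0 + sigma * rng.standard_normal(n)) / sigma**2

    v2c = channel_llrs()
    for _ in range(iters):
        # Check-node update: combine (dc - 1) randomly sampled incoming messages.
        idx = rng.integers(0, n, size=(n, dc - 1))
        t = np.tanh(np.clip(v2c[idx], -30.0, 30.0) / 2.0)
        c2v = 2.0 * np.arctanh(np.clip(t.prod(axis=1), -0.999999, 0.999999))
        # Variable-node update: fresh channel LLR plus (dv - 1) incoming messages.
        idx = rng.integers(0, n, size=(n, dv - 1))
        v2c = channel_llrs() + c2v[idx].sum(axis=1)
    return float(np.mean(v2c < 0.0))

# The BP threshold of the (3,6) ensemble is near sigma ~ 0.88: below it the
# estimated error probability tends to zero, above it it does not.
print(mc_de(0.80), mc_de(0.95))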
A low-density parity-check (LDPC) code is a linear block code described by a sparse parity-check matrix, which can be efficiently represented by a bipartite Tanner graph. The standard iterative decoding algorithm, known as belief propagation, passes messages along the edges of this Tanner graph. Density evolution is an efficient method to analyze the performance of the belief propagation decoding algorithm for a particular LDPC code ensemble, enabling the determination of a decoding threshold. The basic problem addressed in this work is how to optimize the Tanner graph so that the decoding threshold is as large as possible. We introduce a new code optimization technique that restricts the search space, which can be thought of as minimizing randomness in differential evolution or limiting the search range in an exhaustive search. This technique is applied to the design of good irregular LDPC codes and multi-edge type LDPC codes.
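Density evolution is particularly simple on the binary erasure channel, and a threshold routine like the sketch below (with illustrative degree distributions, not the ensembles designed in this work) is the kind of objective function that such a restricted-range search would optimize:

import numpy as np

def bec_threshold(lam, rho, tol=1e-4):
    """BEC density-evolution threshold of an irregular LDPC ensemble.
    `lam[i]` and `rho[i]` are the coefficients of x**i in the edge-perspective
    degree distributions lambda(x) and rho(x)."""
    def converges(eps, max_iter=100_000):
        x, x_prev = eps, 2.0
        for _ in range(max_iter):
            if abs(x - x_prev) < 1e-12:
                break
            x_prev = x
            # DE recursion on the BEC: x <- eps * lambda(1 - rho(1 - x))
            x = eps * np.polyval(lam[::-1], 1.0 - np.polyval(rho[::-1], 1.0 - x))
        return x < 1e-6

    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if converges(mid) else (lo, mid)
    return lo

# (3,6)-regular ensemble: lambda(x) = x^2, rho(x) = x^5; threshold ~ 0.4294.
print(bec_threshold([0.0, 0.0, 1.0], [0.0, 0.0, 0.0, 0.0, 0.0, 1.0]))

A routine of this kind (or its AWGN density-evolution counterpart) can then serve as the objective inside a differential-evolution or exhaustive search whose range over the degree-distribution coefficients is deliberately kept narrow.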
In this work it is shown that locally repairable codes (LRCs) can be list-decoded efficiently beyond the Johnson radius for a large range of parameters by utilizing the local error-correction capabilities. The corresponding decoding radius is derived and the asymptotic behavior is analyzed. A general list-decoding algorithm for LRCs that achieves this radius is proposed, along with an explicit realization for LRCs that are subcodes of Reed-Solomon codes (such as Tamo-Barg LRCs). Further, a probabilistic algorithm of low complexity for unique decoding of LRCs is given and its success probability is analyzed. The second part of this work considers error decoding of LRCs and partial maximum distance separable (PMDS) codes through interleaved decoding. For a specific class of LRCs, the success probability of interleaved decoding is investigated. For PMDS codes, it is shown that there is a wide range of parameters for which interleaved decoding can increase their decoding radius beyond the minimum distance, such that the probability of successful decoding approaches 1 as the code length goes to infinity.
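For reference, the classical (alphabet-free) Johnson radius that the locality-aided list decoder is claimed to exceed can be computed as below; the enlarged radius derived in the work is not reproduced here, and the example parameters are illustrative.

import math

def johnson_radius(n, d):
    """Classical list-decoding (Johnson) radius n - sqrt(n * (n - d)) for a
    code of length n and minimum distance d."""
    return n - math.sqrt(n * (n - d))

# Example: length 64, minimum distance 30. Unique decoding guarantees
# floor((d - 1) / 2) = 14 errors, while the Johnson radius allows list
# decoding of up to 17 errors (radius ~ 17.35).
print(johnson_radius(64, 30))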