Homological product codes are a class of codes that can have improved distance while retaining relatively low stabilizer weight. We show how to build union-find decoders for these codes, using a union-find decoder for one of the codes in the product and a brute-force decoder for the other. We apply this construction to the product of a surface code with a small code such as a $[[4,2,2]]$ code, which we call an augmented surface code. The distance of the augmented surface code is the product of the distances of the surface code and the small code, and the union-find decoder, with slight modifications, can decode errors up to half the distance. We present numerical simulations showing that, while the threshold of these augmented codes is lower than that of the surface code, the low-noise performance is improved.
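As a rough illustration of the distance claim (notation ours, not the paper's): writing $d_{\mathrm{surf}}$ and $d_{\mathrm{small}}$ for the distances of the two factors,

$$ d_{\mathrm{aug}} = d_{\mathrm{surf}} \cdot d_{\mathrm{small}}, $$

so taking the $[[4,2,2]]$ code ($d_{\mathrm{small}}=2$) together with a distance-$d$ surface code gives $d_{\mathrm{aug}} = 2d$, and decoding up to half the distance corresponds to correcting any error of weight at most $\lfloor (2d-1)/2 \rfloor = d-1$.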
Quantum codes with low-weight stabilizers, known as LDPC codes, have been actively studied recently due to their simple syndrome-readout circuits and potential applications in fault-tolerant quantum computing. However, all families of quantum LDPC codes known to date suffer from poor distance scaling, limited by the square root of the code length. This is in sharp contrast with the classical case, where good families of LDPC codes are known that combine constant encoding rate and linear distance. Here we propose the first family of good quantum codes with low-weight stabilizers. The new codes have a constant encoding rate, linear distance, and stabilizers acting on at most $\sqrt{n}$ qubits, where $n$ is the code length. For comparison, all previously known families of good quantum codes have stabilizers of linear weight. Our proof combines two techniques: randomized constructions of good quantum codes and the homological product operation from algebraic topology. We conjecture that similar methods can produce good stabilizer codes with stabilizer weight $n^a$ for any $a>0$. Finally, we apply the homological product to construct new small codes with low-weight stabilizers.
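For orientation, a minimal sketch of the (deterministic) homological product of two length-one chain complexes over $\mathbb{F}_2$, with boundary maps $\partial_A$ and $\partial_B$ (notation ours; the randomized ingredients and the weight analysis of the paper are not captured here): the product complex has boundary maps

$$ \partial_2 = \begin{pmatrix} I \otimes \partial_B \\ \partial_A \otimes I \end{pmatrix}, \qquad \partial_1 = \begin{pmatrix} \partial_A \otimes I & I \otimes \partial_B \end{pmatrix}, \qquad \partial_1 \partial_2 = 2\,(\partial_A \otimes \partial_B) \equiv 0 \pmod{2}, $$

so identifying $H_X = \partial_1$ and $H_Z^T = \partial_2$ yields a CSS code.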
Quantum LDPC codes are a promising direction for low-overhead quantum computing. In this paper, we propose a generalization of the Union-Find decoder as a decoder for quantum LDPC codes. We prove that this decoder corrects all errors of weight up to $An^{\alpha}$ for some constants $A, \alpha > 0$ for different classes of quantum LDPC codes, such as toric codes and hyperbolic codes in any dimension $D \geq 3$, as well as quantum expander codes. To prove this result, we introduce a notion of covering radius, which measures the spread of an error from its syndrome. We believe this notion could find applications beyond the decoding problem. We also perform numerical simulations, which show that our Union-Find decoder outperforms the belief-propagation decoder in the low-error-rate regime for a quantum LDPC code of length 3600.
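As a minimal, hedged sketch of the union-find primitive underlying such decoders (the class name and toy usage below are ours; the decoder in the paper additionally grows clusters on the syndrome graph and extracts the actual correction):

    class ClusterUnionFind:
        def __init__(self, num_nodes, defects):
            defect_set = set(defects)
            self.parent = list(range(num_nodes))
            self.size = [1] * num_nodes
            # Parity of the number of syndrome defects in each cluster (indexed by root).
            self.parity = [1 if v in defect_set else 0 for v in range(num_nodes)]

        def find(self, v):
            # Path compression: point nodes (almost) directly at their cluster root.
            while self.parent[v] != v:
                self.parent[v] = self.parent[self.parent[v]]
                v = self.parent[v]
            return v

        def union(self, u, v):
            ru, rv = self.find(u), self.find(v)
            if ru == rv:
                return ru
            if self.size[ru] < self.size[rv]:  # weighted union by cluster size
                ru, rv = rv, ru
            self.parent[rv] = ru
            self.size[ru] += self.size[rv]
            self.parity[ru] ^= self.parity[rv]
            return ru

        def is_neutral(self, v):
            # A cluster containing an even number of defects can be corrected internally.
            return self.parity[self.find(v)] == 0

    # Toy usage: two defects on a path of six nodes merge into one neutral cluster.
    uf = ClusterUnionFind(num_nodes=6, defects=[1, 4])
    uf.union(1, 2); uf.union(2, 3); uf.union(3, 4)
    assert uf.is_neutral(1)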
The design of decoding algorithms is a significant technological component in the development of fault-tolerant quantum computers. The design of quantum decoders is often inspired by classical decoding algorithms, but there are no general principles for building quantum decoders from classical ones. Given any pair of classical codes, we can build a quantum code using the hypergraph product, yielding a hypergraph product code. Here we show that we can also lift the decoders for these classical codes. That is, given oracle access to a minimum-weight decoder for the relevant classical codes, the corresponding $[[n,k,d]]$ quantum code can be efficiently decoded for any error of weight smaller than $(d-1)/2$. The quantum decoder requires only $O(k)$ oracle calls to the classical decoder and $O(n^2)$ classical resources. The lift and the correctness proof of the decoder are purely algebraic in nature and draw on the discovery of some novel homological invariants of the hypergraph product codespace. While the decoder works perfectly for adversarial errors, it is not suitable for more realistic stochastic noise models and therefore cannot be used to establish an error-correcting threshold.
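A small sketch of the hypergraph product construction referenced above, assuming the standard form $H_X = (H_1 \otimes I \,|\, I \otimes H_2^T)$ and $H_Z = (I \otimes H_2 \,|\, H_1^T \otimes I)$; the function names are ours, and only the CSS commutation condition is checked here, not the lifted decoder itself:

    import numpy as np

    def repetition_check_matrix(n):
        """(n-1) x n parity-check matrix of the length-n repetition code."""
        H = np.zeros((n - 1, n), dtype=int)
        for i in range(n - 1):
            H[i, i] = H[i, i + 1] = 1
        return H

    def hypergraph_product(H1, H2):
        """Return (H_X, H_Z) of the hypergraph product of two classical codes."""
        m1, n1 = H1.shape
        m2, n2 = H2.shape
        HX = np.hstack([np.kron(H1, np.eye(n2, dtype=int)),
                        np.kron(np.eye(m1, dtype=int), H2.T)])
        HZ = np.hstack([np.kron(np.eye(n1, dtype=int), H2),
                        np.kron(H1.T, np.eye(m2, dtype=int))])
        return HX % 2, HZ % 2

    # Two length-3 repetition codes give the 13-qubit distance-3 surface code.
    H1 = H2 = repetition_check_matrix(3)
    HX, HZ = hypergraph_product(H1, H2)
    print(HX.shape, HZ.shape)            # number of X / Z stabilizers and of qubits
    assert np.all((HX @ HZ.T) % 2 == 0)  # X and Z stabilizers commute (CSS condition)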
Belief-propagation (BP) decoders play a vital role in modern coding theory, but they are not well suited to decoding quantum error-correcting codes because of a unique quantum feature called error degeneracy. Inspired by an exact mapping between BP and deep neural networks, we train neural BP decoders for quantum low-density parity-check (LDPC) codes with a loss function tailored to error degeneracy. Training substantially improves the performance of BP decoders for all families of codes we tested and may solve the degeneracy problem that plagues the decoding of quantum LDPC codes.
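As a hedged illustration of what one message-passing layer of such a decoder might look like (function name and conventions ours; the paper's training procedure and degeneracy-aware loss are not reproduced), here is a syndrome-based min-sum BP update with one weight per edge:

    import numpy as np

    def neural_bp_decode(H, syndrome, prior_llr, weights, num_iters=20):
        """Syndrome-based min-sum BP with one (trainable) weight per edge.

        H: (m, n) binary parity-check matrix; syndrome: length-m bits;
        prior_llr: length-n channel log-likelihood ratios; weights: (m, n),
        would be learned in neural BP (here they are simply supplied).
        """
        H = np.asarray(H)
        syndrome = np.asarray(syndrome)
        prior_llr = np.asarray(prior_llr, dtype=float)
        weights = np.asarray(weights, dtype=float)
        m, n = H.shape
        mask = H.astype(bool)
        v2c = np.where(mask, prior_llr[None, :], 0.0)   # variable-to-check messages
        c2v = np.zeros((m, n))                          # check-to-variable messages
        sgn_syn = (-1.0) ** syndrome                    # a violated check flips the sign
        for _ in range(num_iters):
            # Check-node update (min-sum): extrinsic sign product and minimum magnitude.
            sgn = np.where(mask, np.sign(v2c + 1e-12), 1.0)
            total_sgn = np.prod(sgn, axis=1, keepdims=True)
            mag = np.where(mask, np.abs(v2c), np.inf)
            min1 = np.min(mag, axis=1, keepdims=True)
            argmin1 = np.argmin(mag, axis=1)
            mag_wo_min = mag.copy()
            mag_wo_min[np.arange(m), argmin1] = np.inf
            min2 = np.min(mag_wo_min, axis=1, keepdims=True)
            ext_min = np.where(np.arange(n)[None, :] == argmin1[:, None], min2, min1)
            c2v = np.where(mask, sgn_syn[:, None] * total_sgn * sgn * ext_min, 0.0)
            # Variable-node update with per-edge weights (the "neural" part).
            post = prior_llr[None, :] + np.sum(weights * c2v, axis=0, keepdims=True)
            v2c = np.where(mask, post - weights * c2v, 0.0)
        marginal = prior_llr + np.sum(weights * c2v, axis=0)
        return (marginal < 0).astype(int)               # hard decision per qubit

    # Toy usage: [3, 1] repetition code with a single bit flip on the first bit.
    H = np.array([[1, 1, 0], [0, 1, 1]])
    error = np.array([1, 0, 0])
    syndrome = H @ error % 2
    guess = neural_bp_decode(H, syndrome, prior_llr=np.full(3, 2.0),
                             weights=np.ones(H.shape))
    assert np.array_equal(H @ guess % 2, syndrome)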
We study a three-fold variant of the hypergraph product code construction, differing from the standard homological product of three classical codes. When instantiated with three classical LDPC codes, this XYZ product yields a non-CSS quantum LDPC code which might display a large minimum distance. The simplest instance of this construction, corresponding to the product of three repetition codes, is a non-CSS variant of the 3-dimensional toric code known as the Chamon code. The general construction was introduced in Maurice's PhD thesis, but has remained poorly understood so far. The reason is that while hypergraph product codes can be analyzed with combinatorial tools, the XYZ product codes depend crucially on the algebraic properties of the parity-check matrices of the three classical codes, making their analysis much more involved. Our main motivation for studying XYZ product codes is that the natural representatives of logical operators are two-dimensional objects. This contrasts with standard hypergraph product codes in three dimensions, which always admit one-dimensional logical operators. In particular, specific instances of XYZ product codes might display a minimum distance as large as $\Theta(N^{2/3})$, which would beat the current record for the minimum distance of quantum LDPC codes, held by fiber bundle codes. While we do not prove this result here, we obtain the dimension of a large class of XYZ product codes, and when restricting to codes of dimension 1, we reduce the problem of computing the minimum distance to a more elementary combinatorial problem involving binary 3-tensors. We also discuss in detail some families of XYZ product codes in three dimensions with local interactions. Some of these codes seem to share properties with Haah's cubic codes and might be interesting candidates for self-correcting quantum memories with a logarithmic energy barrier.