Quantum codes with low-weight stabilizers, known as LDPC codes, have been actively studied recently due to their simple syndrome readout circuits and potential applications in fault-tolerant quantum computing. However, all families of quantum LDPC codes known to date suffer from a poor distance scaling limited by the square root of the code length. This is in sharp contrast with the classical case, where good families of LDPC codes are known that combine constant encoding rate and linear distance. Here we propose the first family of good quantum codes with low-weight stabilizers. The new codes have a constant encoding rate, linear distance, and stabilizers acting on at most $\sqrt{n}$ qubits, where $n$ is the code length. For comparison, all previously known families of good quantum codes have stabilizers of linear weight. Our proof combines two techniques: randomized constructions of good quantum codes and the homological product operation from algebraic topology. We conjecture that similar methods can produce good stabilizer codes with stabilizer weight $n^a$ for any $a>0$. Finally, we apply the homological product to construct new small codes with low-weight stabilizers.
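The homological product at the heart of this construction combines two chain complexes into one. As a hedged illustration (not the paper's randomized construction), the sketch below builds the product boundary map $\partial = \partial_1 \otimes I + I \otimes \partial_2$ over GF(2) in the single-sector convention and verifies that it squares to zero; the matrices are toy examples.

```python
import numpy as np

def boundary_product(d1, d2):
    """Boundary map of the homological product over GF(2):
    d = d1 (x) I + I (x) d2. Over GF(2) the two cross terms of
    d @ d cancel in pairs, so d @ d = 0 whenever d1 @ d1 = d2 @ d2 = 0."""
    I1 = np.eye(d1.shape[0], dtype=int)
    I2 = np.eye(d2.shape[0], dtype=int)
    return (np.kron(d1, I2) + np.kron(I1, d2)) % 2

# toy single-sector complexes: square boundary maps with d @ d = 0 (mod 2)
d1 = np.array([[0, 1], [0, 0]])
d2 = np.array([[0, 1], [0, 0]])
assert not ((d1 @ d1) % 2).any() and not ((d2 @ d2) % 2).any()

d = boundary_product(d1, d2)
assert not ((d @ d) % 2).any()  # the product is again a chain complex
```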
Homological product codes are a class of codes that can have improved distance while retaining relatively low stabilizer weight. We show how to build union-find decoders for these codes, using a union-find decoder for one of the codes in the product and a brute-force decoder for the other. We apply this construction to the product of a surface code with a small code such as a $[[4,2,2]]$ code, which we call an augmented surface code. The distance of the augmented surface code is the product of the distances of the surface code and the small code, and the union-find decoder, with slight modifications, can decode errors up to half the distance. We present numerical simulations showing that while the threshold of these augmented codes is lower than that of the surface code, the low-noise performance is improved.
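The brute-force component is feasible precisely because the second factor is small. A minimal sketch, assuming a CSS check matrix and decoding X errors against Z-type checks; the $[[4,2,2]]$ example and the function name are illustrative, not the paper's implementation:

```python
import numpy as np
from itertools import combinations

def brute_force_decode(H, syndrome, max_weight=2):
    """Exhaustive minimum-weight decoding for a small check matrix H
    over GF(2): try error supports of weight 0, 1, 2, ... until one
    reproduces the observed syndrome."""
    n = H.shape[1]
    for w in range(max_weight + 1):
        for support in combinations(range(n), w):
            e = np.zeros(n, dtype=int)
            e[list(support)] = 1
            if np.array_equal((H @ e) % 2, syndrome):
                return e
    return None  # syndrome not explainable within max_weight

# [[4,2,2]] code: its single Z-type stabilizer ZZZZ detects X errors
HZ = np.array([[1, 1, 1, 1]])
print(brute_force_decode(HZ, np.array([1])))  # -> [1 0 0 0], a weight-1 guess
```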
We study a three-fold variant of the hypergraph product code construction, differing from the standard homological product of three classical codes. When instantiated with 3 classical LDPC codes, this XYZ product yields a non-CSS quantum LDPC code which might display a large minimum distance. The simplest instance of this construction, corresponding to the product of 3 repetition codes, is a non-CSS variant of the 3-dimensional toric code known as the Chamon code. The general construction was introduced in Maurice's PhD thesis, but has remained poorly understood so far. The reason is that while hypergraph product codes can be analyzed with combinatorial tools, XYZ product codes depend crucially on the algebraic properties of the parity-check matrices of the three classical codes, making their analysis much more involved. Our main motivation for studying XYZ product codes is that the natural representatives of logical operators are two-dimensional objects. This contrasts with standard hypergraph product codes in 3 dimensions, which always admit one-dimensional logical operators. In particular, specific instances of XYZ product codes might display a minimum distance as large as $\Theta(N^{2/3})$, which would beat the current record for the minimum distance of quantum LDPC codes held by fiber bundle codes. While we do not prove this result here, we obtain the dimension of a large class of XYZ product codes, and when restricting to codes with dimension 1, we reduce the problem of computing the minimum distance to a more elementary combinatorial problem involving binary 3-tensors. We also discuss in detail some families of XYZ product codes in three dimensions with local interactions. Some of these codes seem to share properties with Haah's cubic codes and might be interesting candidates for self-correcting quantum memories with a logarithmic energy barrier.
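For orientation, the two-fold hypergraph product that the XYZ construction generalizes can be written down directly from two parity-check matrices. A hedged sketch over GF(2); the matrix conventions and the repetition-code instance are one common choice, shown for contrast with the three-fold construction rather than as part of it:

```python
import numpy as np

def hypergraph_product(H1, H2):
    """Two-fold hypergraph product of classical checks H1 (m1 x n1)
    and H2 (m2 x n2); returns the CSS check matrices HX, HZ."""
    m1, n1 = H1.shape
    m2, n2 = H2.shape
    HX = np.hstack([np.kron(H1, np.eye(n2, dtype=int)),
                    np.kron(np.eye(m1, dtype=int), H2.T)]) % 2
    HZ = np.hstack([np.kron(np.eye(n1, dtype=int), H2),
                    np.kron(H1.T, np.eye(m2, dtype=int))]) % 2
    return HX, HZ

# two length-3 cyclic repetition codes give a [[18,2,3]] toric code
H_rep = np.array([[1, 1, 0], [0, 1, 1], [1, 0, 1]])
HX, HZ = hypergraph_product(H_rep, H_rep)
assert not ((HX @ HZ.T) % 2).any()  # X and Z checks commute
```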
We provide a numerical investigation of two families of subsystem quantum codes that are related to hypergraph product codes by gauge-fixing. The first family consists of the Bravyi-Bacon-Shor (BBS) codes, which have optimal code parameters for subsystem quantum codes local in two dimensions. The second family consists of the constant-rate generalized Shor codes of Bacon and Casaccino \cite{bacon2006quantum}, which we rebrand as subsystem hypergraph product (SHP) codes. We show that any hypergraph product code can be obtained by entangling the gauge qubits of two SHP codes. To evaluate the performance of these codes, we simulate both small and large examples. For circuit noise, a $[[21,4,3]]$ BBS code and a $[[49,16,3]]$ SHP code have pseudothresholds of $2\times10^{-3}$ and $8\times10^{-4}$, respectively. Simulations for phenomenological noise show that large BBS and SHP codes start to outperform surface codes with similar encoding rate at physical error rates of $1\times 10^{-6}$ and $4\times10^{-4}$, respectively.
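The SHP construction places qubits on an $n_1 \times n_2$ grid, mirroring the Bacon-Shor pattern. A sketch, assuming the convention that X-type gauge generators come from checks of the first classical code applied within columns and Z-type generators from checks of the second applied within rows; the function name and conventions are illustrative assumptions:

```python
import numpy as np

def shp_gauge_generators(H1, H2):
    """Gauge generators of a subsystem hypergraph product (SHP) code on
    an n1 x n2 grid of qubits: one X-type generator per (check of C1,
    column) pair and one Z-type generator per (row, check of C2) pair."""
    n1, n2 = H1.shape[1], H2.shape[1]
    GX = np.kron(H1, np.eye(n2, dtype=int)) % 2  # supports h_i (x) e_j
    GZ = np.kron(np.eye(n1, dtype=int), H2) % 2  # supports e_i (x) h_j
    return GX, GZ

# two length-3 repetition codes recover the 3x3 Bacon-Shor gauge group:
# weight-2 XX operators within columns, weight-2 ZZ operators within rows
H_rep = np.array([[1, 1, 0], [0, 1, 1]])
GX, GZ = shp_gauge_generators(H_rep, H_rep)
print(GX.shape, GZ.shape)  # (6, 9) (6, 9)
```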
Product perfect codes have been proven to enhance the performance of the F5 steganographic method, whereas perfect $\mathbb{Z}_2\mathbb{Z}_4$-linear codes have recently been introduced as an efficient way to embed data, conforming to $\pm 1$-steganography. In this paper, we present two steganographic methods. On the one hand, we generalize product perfect codes; on the other hand, we apply this generalization to perfect $\mathbb{Z}_2\mathbb{Z}_4$-linear codes. Finally, the performance of the proposed methods is evaluated and compared with that of the aforementioned schemes.
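The mechanism these schemes build on is matrix embedding: the parity checks of a perfect code let the sender hide message bits as a syndrome while changing very few cover symbols. A minimal sketch with the binary $[7,4]$ Hamming code, the classic F5 ingredient, used here purely as an illustration of the idea:

```python
import numpy as np

# Parity-check matrix of the [7,4] Hamming code; column j (1-based)
# is the binary expansion of j, so a syndrome directly names a position.
H = np.array([[0, 0, 0, 1, 1, 1, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [1, 0, 1, 0, 1, 0, 1]])

def embed(cover, message):
    """Hide 3 message bits in 7 cover bits, flipping at most one bit."""
    s = (H @ cover + message) % 2          # syndrome mismatch
    pos = int(''.join(map(str, s)), 2)     # position to flip; 0 = none
    stego = cover.copy()
    if pos:
        stego[pos - 1] ^= 1
    return stego

def extract(stego):
    return (H @ stego) % 2                 # receiver computes the syndrome

cover = np.array([1, 0, 1, 1, 0, 0, 1])
message = np.array([1, 0, 1])
assert np.array_equal(extract(embed(cover, message)), message)
```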
We give a polynomial-time algorithm to decode multivariate polynomial codes of degree $d$ up to half their minimum distance, when the evaluation points are an arbitrary product set $S^m$, for every $d < |S|$. Previously known algorithms can achieve this only if the set $S$ has some very special algebraic structure, or if the degree $d$ is significantly smaller than $|S|$. We also give a near-linear-time randomized algorithm, based on tools from list decoding, to decode these codes from nearly half their minimum distance, provided $d < (1-\epsilon)|S|$ for constant $\epsilon > 0$. Our result gives an $m$-dimensional generalization of the well-known decoding algorithms for Reed-Solomon codes, and can be viewed as giving an algorithmic version of the Schwartz-Zippel lemma.
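The Schwartz-Zippel lemma bounds how often a nonzero polynomial of total degree $d$ can vanish on a product set $S^m$: at most a $d/|S|$ fraction of the points. A quick numerical check with toy parameters; the polynomial and field size are arbitrary choices for illustration:

```python
from itertools import product

p = 11                  # toy field size (assumption, for illustration)
S = range(p)
m, d = 2, 3

def f(x, y):            # an arbitrary nonzero polynomial of total degree 3
    return (x * x * y + x + 1) % p

# count zeros of f over the product set S^m and compare to the bound
zeros = sum(f(x, y) == 0 for x, y in product(S, repeat=m))
print(zeros / len(S) ** m, "<=", d / len(S))  # empirical fraction vs. d/|S|
```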