Publication: Nature

High-threshold and low-overhead fault-tolerant quantum memory

Abstract

The accumulation of physical errors prevents the execution of large-scale algorithms in current quantum computers. Quantum error correction promises a solution by encoding $k$ logical qubits onto a larger number $n$ of physical qubits, such that physical errors are suppressed enough to allow running a desired computation with tolerable fidelity. Quantum error correction becomes practically realizable once the physical error rate is below a threshold value that depends on the choice of quantum code, syndrome measurement circuit and decoding algorithm. We present an end-to-end quantum error correction protocol that implements fault-tolerant memory on the basis of a family of low-density parity-check codes. Our approach achieves an error threshold of 0.7% for the standard circuit-based noise model, on par with the surface code, which for the past 20 years has been the leading code in terms of error threshold. The syndrome measurement cycle for a length-$n$ code in our family requires $n$ ancillary qubits and a depth-8 circuit composed of CNOT gates, qubit initializations and measurements. The required qubit connectivity is a degree-6 graph composed of two edge-disjoint planar subgraphs. In particular, we show that 12 logical qubits can be preserved for nearly one million syndrome cycles using 288 physical qubits in total, assuming a physical error rate of 0.1%, whereas the surface code would require nearly 3,000 physical qubits to achieve the same level of error suppression. Our findings bring demonstrations of a low-overhead fault-tolerant quantum memory within the reach of near-term quantum processors.
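
As a rough illustration of the 288-qubit figure quoted above, the sketch below constructs the parity-check matrices of a small low-density parity-check code of the bivariate-bicycle type and counts its logical qubits. The specific construction and parameters (ℓ = 12, m = 6, A = x³ + y + y², B = y³ + x + x²) are illustrative assumptions and are not stated in this abstract; the code is a minimal sanity check of the parameters n = 144 data qubits and k = 12 logical qubits, which together with one ancillary qubit per check gives 288 physical qubits.

```python
import numpy as np

def cyclic_shift(size):
    """Cyclic shift permutation matrix of the given size over GF(2)."""
    return np.roll(np.eye(size, dtype=np.uint8), 1, axis=1)

def gf2_rank(mat):
    """Rank of a binary matrix over GF(2) via Gaussian elimination."""
    a = mat.copy() % 2
    rank = 0
    rows, cols = a.shape
    for col in range(cols):
        pivot = next((r for r in range(rank, rows) if a[r, col]), None)
        if pivot is None:
            continue
        a[[rank, pivot]] = a[[pivot, rank]]
        for r in range(rows):
            if r != rank and a[r, col]:
                a[r] ^= a[rank]
        rank += 1
    return rank

# Assumed (illustrative) parameters of a [[144,12,12]]-type bivariate bicycle code:
# l = 12, m = 6, A = x^3 + y + y^2, B = y^3 + x + x^2.
l, m = 12, 6
x = np.kron(cyclic_shift(l), np.eye(m, dtype=np.uint8))
y = np.kron(np.eye(l, dtype=np.uint8), cyclic_shift(m))

A = (np.linalg.matrix_power(x, 3) + y + np.linalg.matrix_power(y, 2)) % 2
B = (np.linalg.matrix_power(y, 3) + x + np.linalg.matrix_power(x, 2)) % 2

HX = np.hstack([A, B]) % 2       # X-type parity checks
HZ = np.hstack([B.T, A.T]) % 2   # Z-type parity checks

n = 2 * l * m                          # number of data qubits
k = n - gf2_rank(HX) - gf2_rank(HZ)    # number of encoded logical qubits

print(f"data qubits n = {n}, logical qubits k = {k}")
# Expected: n = 144, k = 12; with one ancillary qubit per check,
# the memory uses 288 physical qubits in total.
```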