
Local tensor-network codes

Bibliographic Details
Published in: New Journal of Physics, 2022-04, Vol. 24 (4), p. 43015
Main Authors: Farrelly, Terry; Tuckett, David K.; Stace, Thomas M.
Format: Article
Language: English
Summary: Tensor-network codes enable the construction of large stabilizer codes out of tensors describing smaller stabilizer codes. An application of tensor-network codes was an efficient and exact decoder for holographic codes. Here, we show how to write some topological codes, including the surface code and colour code, as simple tensor-network codes. We also show how to calculate distances of stabilizer codes by contracting a tensor network. The algorithm actually gives more information, including a histogram of all logical coset weights. We prove that this method is efficient in the case of stabilizer codes encoded via local log-depth circuits in one dimension and holographic codes. Using our tensor-network distance calculator, we find a modification of the rotated surface code that has the same distance but fewer minimum-weight logical operators by ‘doping’ the tensor network, i.e., we break the homogeneity of the tensor network by locally replacing tensors. For this example, this corresponds to an improvement in successful error correction of almost 2% against depolarizing noise (in the perfect-measurement setting), but comes at the cost of introducing three higher-weight stabilizers. Our general construction lets us pick a network geometry (e.g., a Euclidean lattice in the case of the surface code), and, using only a small set of seed codes (constituent tensors), build extensive codes with the potential for optimisation.
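
The summary above refers to calculating stabilizer-code distances, together with a histogram of all logical coset weights, by contracting a tensor network. As a rough illustration of the quantities involved (not the article's tensor-network contraction method), the Python sketch below brute-forces the weight histogram of each non-trivial logical coset for the textbook [[5,1,3]] code and reads off the distance as the minimum weight that appears. The stabilizer generators and logical representatives are standard choices assumed for this example, not taken from the article.

from itertools import product
from collections import Counter

def pauli_to_bits(pauli):
    # Binary-symplectic form (phases ignored): I -> (0,0), X -> (1,0), Z -> (0,1), Y -> (1,1)
    x = tuple(1 if c in 'XY' else 0 for c in pauli)
    z = tuple(1 if c in 'ZY' else 0 for c in pauli)
    return x, z

def multiply(a, b):
    # Pauli multiplication up to phase is bitwise XOR of the symplectic vectors
    (ax, az), (bx, bz) = a, b
    return (tuple(i ^ j for i, j in zip(ax, bx)),
            tuple(i ^ j for i, j in zip(az, bz)))

def weight(op):
    # Number of qubits on which the operator acts non-trivially
    x, z = op
    return sum(1 for i, j in zip(x, z) if i or j)

def coset_weight_histogram(stabilizer_gens, logical_rep):
    # Histogram of Pauli weights over the coset {logical_rep * s : s in the stabilizer group}
    gens = [pauli_to_bits(g) for g in stabilizer_gens]
    rep = pauli_to_bits(logical_rep)
    hist = Counter()
    for subset in product([0, 1], repeat=len(gens)):
        op = rep
        for bit, gen in zip(subset, gens):
            if bit:
                op = multiply(op, gen)
        hist[weight(op)] += 1
    return hist

# Textbook [[5,1,3]] code (an illustrative assumption, not drawn from the article)
stabilizers = ['XZZXI', 'IXZZX', 'XIXZZ', 'ZXIXZ']
logical_reps = {'X': 'XXXXX', 'Y': 'YYYYY', 'Z': 'ZZZZZ'}

histograms = {name: coset_weight_histogram(stabilizers, rep)
              for name, rep in logical_reps.items()}
distance = min(min(h) for h in histograms.values())
print('distance =', distance)  # expected: 3
for name, h in histograms.items():
    print(name, dict(sorted(h.items())))

This enumeration is exponential in the number of stabilizer generators, which is exactly what the article avoids: the same coset-weight information is obtained there by contracting a tensor network, efficiently for codes encoded via local log-depth circuits in one dimension and for holographic codes.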
ISSN: 1367-2630
DOI: 10.1088/1367-2630/ac5e87