Characterizing mid-circuit measurements on a superconducting qubit using gate set tomography


Abstract

Measurements that occur within the internal layers of a quantum circuit -- mid-circuit measurements -- are an important quantum computing primitive, most notably for quantum error correction. Mid-circuit measurements have both classical and quantum outputs, so they can be subject to error modes that do not exist for measurements that terminate quantum circuits. Here we show how to characterize mid-circuit measurements, modelled by quantum instruments, using a technique that we call quantum instrument linear gate set tomography (QILGST). We then apply this technique to characterize a dispersive measurement on a superconducting transmon qubit within a multiqubit system. By varying the delay time between the measurement pulse and subsequent gates, we explore the impact of residual cavity photon population on measurement error. QILGST can resolve different error modes and quantify the total error from a measurement; in our experiment, for delay times above 1000 ns we measured a total error rate (i.e., half diamond distance) of $\epsilon_\diamond = 8.1 \pm 1.4\%$, a readout fidelity of $97.0 \pm 0.3\%$, and output quantum state fidelities of $96.7 \pm 0.6\%$ and $93.7 \pm 0.7\%$ when measuring $0$ and $1$, respectively.
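The quantum instrument formalism mentioned above models a mid-circuit measurement as a set of completely positive maps, one per classical outcome, whose sum is trace preserving; each map yields both the outcome probability and the corresponding post-measurement state. The following minimal sketch (not the authors' code; the function name `apply_instrument` and the representation by Kraus operators are illustrative assumptions) shows an ideal Z-basis instrument acting on a single qubit in NumPy.

```python
import numpy as np

# Illustrative sketch only: an ideal Z-basis mid-circuit measurement as a
# quantum instrument {M_0, M_1}, one completely positive map per classical
# outcome, with M_0 + M_1 trace preserving.

P0 = np.array([[1, 0], [0, 0]], dtype=complex)  # projector onto |0>
P1 = np.array([[0, 0], [0, 1]], dtype=complex)  # projector onto |1>


def apply_instrument(rho, kraus_sets):
    """Apply an instrument given as {outcome: [Kraus operators]}.

    Returns {outcome: (probability, normalized post-measurement state)}.
    """
    results = {}
    for outcome, kraus_ops in kraus_sets.items():
        unnormalized = sum(K @ rho @ K.conj().T for K in kraus_ops)
        p = float(np.real(np.trace(unnormalized)))
        results[outcome] = (p, unnormalized / p if p > 1e-12 else unnormalized)
    return results


# Ideal instrument: a single projective Kraus operator per outcome.
ideal = {0: [P0], 1: [P1]}

# A |+> input gives each outcome with probability 1/2, and the quantum
# output collapses to |0><0| or |1><1| accordingly.
plus = np.full((2, 2), 0.5, dtype=complex)
for outcome, (p, post_state) in apply_instrument(plus, ideal).items():
    print(f"outcome {outcome}: p = {p:.2f}")
    print(np.round(post_state.real, 2))
```

A noisy instrument would replace the projectors with imperfect Kraus sets; metrics such as readout fidelity and the post-measurement state fidelities quoted in the abstract then compare the reconstructed maps against this ideal case.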
