
Experimental accreditation of outputs of noisy quantum computers

Published by: Samuele Ferracin
Publication date: 2021
Research field: Physics
Paper language: English





We provide and experimentally demonstrate an accreditation protocol that upper-bounds the variation distance between noisy and noiseless probability distributions of the outputs of arbitrary quantum computations. We accredit the outputs of twenty-four quantum circuits executed on programmable superconducting hardware, ranging from depth four circuits on ten qubits to depth twenty-one circuits on four qubits. Our protocol requires implementing the target quantum circuit along with a number of random Clifford circuits and subsequently post-processing the outputs of these Clifford circuits. Importantly, the number of Clifford circuits is chosen to obtain the bound with the desired confidence and accuracy, and is independent of the size and nature of the target circuit. We thus demonstrate a practical and scalable method of ascertaining the correctness of the outputs of arbitrary-sized noisy quantum computers: the ultimate arbiter of the utility of the computer itself.
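As a rough illustration of the post-processing step, the sketch below (Python; the helper names and the Hoeffding-style bound are illustrative simplifications, not the paper's exact expressions) shows how the number of Clifford "trap" circuits can be fixed from the desired accuracy and confidence alone, and how the observed trap failures are turned into an upper bound on the variation distance.

```python
import math

def num_trap_circuits(accuracy, confidence):
    """Hoeffding-style choice of the number of Clifford trap circuits.
    Depends only on the desired accuracy (epsilon) and confidence (1 - delta),
    not on the size or nature of the target circuit."""
    delta = 1.0 - confidence
    return math.ceil(math.log(2.0 / delta) / (2.0 * accuracy ** 2))

def variation_distance_bound(trap_outcomes, accuracy):
    """Illustrative post-processing: the empirical trap-failure rate plus the
    Hoeffding slack upper-bounds the trap-failure probability, which the
    accreditation framework relates to the variation distance of the target
    circuit's output distribution."""
    failures = sum(1 for passed in trap_outcomes if not passed)
    return min(1.0, failures / len(trap_outcomes) + accuracy)

# Example: 2% accuracy at 95% confidence.
n = num_trap_circuits(accuracy=0.02, confidence=0.95)
print(n)  # number of Clifford trap circuits to run alongside the target
print(variation_distance_bound([True] * (n - 10) + [False] * 10, accuracy=0.02))
```

The point mirrored here is that the trap count never depends on the target circuit, so the overhead of accreditation does not grow with the size of the computation being accredited.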




Read also

Kevin Slagle (2021)
We consider the hypothesis that quantum mechanics is not fundamental, but instead emerges from a theory with less computational power, such as classical mechanics. This hypothesis makes the prediction that quantum computers will not be capable of sufficiently complex quantum computations. Utilizing this prediction, we outline a proposal to test for such a breakdown of quantum mechanics using near-term noisy intermediate-scale quantum (NISQ) computers. Our procedure involves simulating a non-Clifford random circuit, followed by its inverse, and then checking that the resulting state is the same as the initial state. We show that quantum mechanics predicts that the fidelity of this procedure decays exponentially with circuit depth (due to noise in NISQ computers). However, if quantum mechanics emerges from a theory with significantly less computational power, then we expect the fidelity to decay significantly more rapidly than the quantum mechanics prediction for sufficiently deep circuits, which is the experimental signature that we propose to search for. Useful experiments can be performed with 80 qubits and gate infidelity $10^{-3}$, while highly informative experiments should require only 1000 qubits and gate infidelity $10^{-5}$.
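A minimal sketch of the quantum-mechanical baseline for this echo test is given below (Python; the gate-counting model is an illustrative assumption, not taken from the paper). It computes the predicted exponential fidelity decay for a random circuit followed by its inverse; the proposed experiment looks for measured fidelities that fall well below this curve at sufficient depth.

```python
def predicted_echo_fidelity(depth, qubits, gate_infidelity, gates_per_layer=None):
    """Quantum-mechanics baseline for the 'circuit followed by its inverse' test:
    fidelity decays roughly exponentially in the total number of noisy gates."""
    if gates_per_layer is None:
        gates_per_layer = qubits // 2            # assume a layer of two-qubit gates
    total_gates = 2 * depth * gates_per_layer    # forward circuit plus its inverse
    return (1.0 - gate_infidelity) ** total_gates

# Regime quoted in the abstract: ~80 qubits at gate infidelity 1e-3.
for depth in (5, 20, 50, 100):
    f = predicted_echo_fidelity(depth, qubits=80, gate_infidelity=1e-3)
    print(f"depth {depth:3d}: predicted fidelity {f:.4f}")

# The proposed breakdown signature is a measured fidelity that drops well below
# this prediction once the circuits are sufficiently deep.
```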
The combination of machine learning and quantum computing has emerged as a promising approach for addressing previously untenable problems. Reservoir computing is an efficient learning paradigm that utilizes nonlinear dynamical systems for temporal information processing, i.e., processing of input sequences to produce output sequences. Here we propose quantum reservoir computing that harnesses complex dissipative quantum dynamics. Our class of quantum reservoirs is universal, in that any nonlinear fading memory map can be approximated arbitrarily closely and uniformly over all inputs by a quantum reservoir from this class. We describe a subclass of the universal class that is readily implementable using quantum gates native to current noisy gate-model quantum computers. Proof-of-principle experiments on remotely accessed cloud-based superconducting quantum computers demonstrate that small and noisy quantum reservoirs can tackle high-order nonlinear temporal tasks. Our theoretical and experimental results pave the path for attractive temporal processing applications of near-term gate-model quantum computers of increasing fidelity but without quantum error correction, signifying the potential of these devices for wider applications including neural modeling, speech recognition and natural language processing, going beyond static classification and regression tasks.
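For readers unfamiliar with the paradigm, the sketch below is a minimal classical echo-state-style reservoir computer in NumPy: a fixed random nonlinear dynamical system processes an input sequence and only a linear readout is trained. In the paper the reservoir is instead dissipative quantum dynamics on gate-model hardware; the toy task and all parameters here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed random "reservoir": only the linear readout is trained, mirroring the
# reservoir-computing paradigm (classical here; quantum dynamics in the paper).
n_res, n_steps = 100, 2000
W_in = rng.uniform(-0.5, 0.5, size=n_res)
W = rng.normal(0, 1, size=(n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))      # scale for fading memory

u = rng.uniform(0, 0.5, size=n_steps)          # input sequence
target = np.roll(u, 3) ** 2                    # toy temporal task: delayed square

states = np.zeros((n_steps, n_res))
x = np.zeros(n_res)
for t in range(n_steps):
    x = np.tanh(W @ x + W_in * u[t])           # nonlinear fading-memory dynamics
    states[t] = x

# Train a ridge-regression readout on the first half, test on the second half.
split = n_steps // 2
A = states[:split]
w_out = np.linalg.solve(A.T @ A + 1e-6 * np.eye(n_res), A.T @ target[:split])
pred = states[split:] @ w_out
print("test NMSE:", np.mean((pred - target[split:]) ** 2) / np.var(target[split:]))
```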
Quantum simulation represents the most promising quantum application to demonstrate quantum advantage on near-term noisy intermediate-scale quantum (NISQ) computers, yet available quantum simulation algorithms are prone to errors and thus difficult to realize. Herein, we propose a novel scheme to utilize intrinsic gate errors of NISQ devices to enable controllable simulation of open quantum system dynamics without ancillary qubits or explicit bath engineering, thus turning unwanted quantum noises into useful quantum resources. Specifically, we simulate energy transfer process in a photosynthetic dimer system on IBM-Q cloud. By employing designed decoherence-inducing gates, we show that quantum dissipative dynamics can be simulated efficiently across coherent-to-incoherent regimes with results comparable to those of the numerically-exact classical method. Moreover, we demonstrate a calibration routine that enables consistent and predictive simulations of open-quantum system dynamics in the intermediate coupling regime. This work provides a new direction for quantum advantage in the NISQ era.
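The kind of classical reference such hardware results are compared against can be illustrated with a small dissipative-dimer integration, shown below (Python/NumPy). This is a simplified Lindblad pure-dephasing model, not the paper's numerically exact method; the Hamiltonian, dephasing rate, and time step are illustrative placeholders.

```python
import numpy as np

# Illustrative excitonic dimer in the single-excitation subspace:
# H = [[e1, J], [J, e2]], with pure-dephasing Lindblad operators on each site.
e1, e2, J, gamma = 1.0, 0.0, 0.3, 0.2
H = np.array([[e1, J], [J, e2]], dtype=complex)
L_ops = [np.sqrt(gamma) * np.diag([1.0, 0.0]).astype(complex),
         np.sqrt(gamma) * np.diag([0.0, 1.0]).astype(complex)]

def lindblad_rhs(rho):
    """d rho / dt = -i[H, rho] + sum_k (L rho L^+ - 1/2 {L^+ L, rho})."""
    drho = -1j * (H @ rho - rho @ H)
    for L in L_ops:
        LdL = L.conj().T @ L
        drho += L @ rho @ L.conj().T - 0.5 * (LdL @ rho + rho @ LdL)
    return drho

# Start with the excitation on site 1 and integrate with a small Euler step.
rho = np.array([[1.0, 0.0], [0.0, 0.0]], dtype=complex)
dt, steps = 0.001, 20000
for step in range(steps):
    rho = rho + dt * lindblad_rhs(rho)
    if step % 5000 == 0:
        print(f"t={step * dt:5.1f}  population of site 1 = {rho[0, 0].real:.3f}")
```

Raising or lowering the dephasing rate `gamma` moves this toy model between the coherent and incoherent transfer regimes mentioned in the abstract.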
Crosstalk is a major source of noise in Noisy Intermediate-Scale Quantum (NISQ) systems and is a fundamental challenge for hardware design. When multiple instructions are executed in parallel, crosstalk between the instructions can corrupt the quantum state and lead to incorrect program execution. Our goal is to mitigate the application impact of crosstalk noise through software techniques. This requires (i) accurate characterization of hardware crosstalk, and (ii) intelligent instruction scheduling to serialize the affected operations. Since crosstalk characterization is computationally expensive, we develop optimizations which reduce the characterization overhead. On three 20-qubit IBMQ systems, we demonstrate two orders of magnitude reduction in characterization time (compute time on the QC device) compared to all-pairs crosstalk measurements. Informed by this characterization, we develop a scheduler that judiciously serializes high crosstalk instructions, balancing the need to mitigate crosstalk against exponential decoherence errors from serialization. On real-system runs on three IBMQ systems, our scheduler improves the error rate of application circuits by up to 5.6x compared to the IBM instruction scheduler and offers near-optimal crosstalk mitigation in practice. In a broader picture, the difficulty of mitigating crosstalk has recently driven QC vendors to move towards sparser qubit connectivity or disabling nearby operations entirely in hardware, which can be detrimental to performance. Our work makes the case for software mitigation of crosstalk errors.
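The core scheduling decision described above can be sketched as a simple cost comparison (Python below): serialize a pair of simultaneous gates only when the crosstalk-inflated parallel error exceeds the nominal gate errors plus the extra idling decoherence caused by serialization. The crosstalk factor, gate time, and T2 values are illustrative placeholders, not measured values from the paper.

```python
import math

def should_serialize(err1, err2, crosstalk_factor, gate_time_ns, t2_ns):
    """Return True if serializing two simultaneous gates is expected to give a
    lower combined error than running them in parallel with crosstalk.

    Parallel: both gates see crosstalk-inflated error rates.
    Serial:   nominal gate errors, plus extra idling decoherence (simple T2
              model) on the qubits that wait for one gate duration.
    """
    parallel_err = 1 - (1 - crosstalk_factor * err1) * (1 - crosstalk_factor * err2)
    idle_err = 1 - math.exp(-gate_time_ns / t2_ns)
    serial_err = 1 - (1 - err1) * (1 - err2) * (1 - idle_err)
    return serial_err < parallel_err

# High-crosstalk pair: serialization wins despite the added idling error.
print(should_serialize(err1=0.01, err2=0.01, crosstalk_factor=3.0,
                       gate_time_ns=300, t2_ns=100_000))
# Low-crosstalk pair: parallel execution remains the better choice.
print(should_serialize(err1=0.01, err2=0.01, crosstalk_factor=1.1,
                       gate_time_ns=300, t2_ns=100_000))
```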
Trapped ions (TI) are a leading candidate for building Noisy Intermediate-Scale Quantum (NISQ) hardware. TI qubits have fundamental advantages over other technologies such as superconducting qubits, including high qubit quality, coherence and connectivity. However, current TI systems are small in size, with 5-20 qubits, and typically use a single-trap architecture which has fundamental scalability limitations. To progress towards the next major milestone of 50-100 qubits, a modular architecture termed the Quantum Charge Coupled Device (QCCD) has been proposed. In a QCCD-based TI device, small traps are connected through ion shuttling. While the basic hardware components for such devices have been demonstrated, building a 50-100 qubit system is challenging because of a wide range of design possibilities for trap sizing, communication topology and gate implementations, and the need to match diverse application resource requirements. Towards realizing QCCD systems with 50-100 qubits, we perform an extensive architectural study evaluating the key design choices of trap sizing, communication topology and operation implementation methods. We built a design toolflow which takes a QCCD architecture's parameters as input, along with a set of applications and realistic hardware performance models. Our toolflow maps the applications onto the target device and simulates their execution to compute metrics such as application run time, reliability and device noise rates. Using six applications and several hardware design points, we show that trap sizing and communication topology choices can impact application reliability by up to three orders of magnitude. Microarchitectural gate implementation choices influence reliability by another order of magnitude. From these studies, we provide concrete recommendations to tune these choices to achieve highly reliable and performant application executions.
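A toy version of the kind of estimate such a toolflow produces is sketched below (Python): given a trap capacity and gate count, it guesses the number of shuttles with a crude uniform-placement communication model and multiplies per-operation success probabilities. All rates and the communication model are illustrative assumptions, not the paper's hardware models, but they show how trap sizing alone can move reliability by orders of magnitude.

```python
import math

def estimate_reliability(num_qubits, trap_capacity, two_qubit_gates,
                         gate_error=1e-4, shuttle_error=1e-3):
    """Crude QCCD model: two-qubit gates between ions in different traps
    require shuttling, and every gate or shuttle multiplies in its error."""
    num_traps = math.ceil(num_qubits / trap_capacity)
    # Placeholder communication model: a gate partner sits in another trap with
    # probability (num_traps - 1) / num_traps, costing one shuttle per such gate.
    shuttles = int((num_traps - 1) / num_traps * two_qubit_gates)
    success = (1 - gate_error) ** two_qubit_gates * (1 - shuttle_error) ** shuttles
    return num_traps, shuttles, success

# Same 60-qubit, 5000-gate program on three candidate trap sizes.
for capacity in (15, 30, 60):
    traps, shuttles, p = estimate_reliability(60, capacity, 5000)
    print(f"trap capacity {capacity:2d}: {traps} traps, {shuttles:4d} shuttles, "
          f"success probability {p:.4f}")
```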