
Temporal Information Processing on Noisy Quantum Computers

Published by: Hendra Nurdin
Publication date: 2020
Paper language: English





The combination of machine learning and quantum computing has emerged as a promising approach for addressing previously untenable problems. Reservoir computing is an efficient learning paradigm that utilizes nonlinear dynamical systems for temporal information processing, i.e., processing of input sequences to produce output sequences. Here we propose quantum reservoir computing that harnesses complex dissipative quantum dynamics. Our class of quantum reservoirs is universal, in that any nonlinear fading memory map can be approximated arbitrarily closely and uniformly over all inputs by a quantum reservoir from this class. We describe a subclass of the universal class that is readily implementable using quantum gates native to current noisy gate-model quantum computers. Proof-of-principle experiments on remotely accessed cloud-based superconducting quantum computers demonstrate that small and noisy quantum reservoirs can tackle high-order nonlinear temporal tasks. Our theoretical and experimental results pave the path for attractive temporal processing applications of near-term gate-model quantum computers of increasing fidelity but without quantum error correction, signifying the potential of these devices for wider applications including neural modeling, speech recognition and natural language processing, going beyond static classification and regression tasks.
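To make the reservoir-computing pipeline concrete, the sketch below is a minimal classical simulation of the idea, not the circuits used in the paper: an input sequence drives a small dissipative quantum map (input-dependent rotations, a fixed entangler, then a depolarizing channel), Pauli expectation values are collected at each step, and a linear readout is fitted by least squares. The reservoir size, gate choices, depolarizing strength and the toy target y_t = u_t * u_{t-1} are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)
n_qubits = 3
dim = 2 ** n_qubits

# Toy target task (an assumption for illustration): y_t = u_t * u_{t-1},
# which requires both nonlinearity and one step of memory.
T = 300
u = rng.uniform(-1, 1, T)
y = u * np.roll(u, 1)

def rx(theta):
    """Single-qubit X rotation."""
    return np.array([[np.cos(theta / 2), -1j * np.sin(theta / 2)],
                     [-1j * np.sin(theta / 2), np.cos(theta / 2)]])

def kron_all(mats):
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

# Fixed random entangling unitary (stands in for the native two-qubit gates).
H_rand = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
H_rand = (H_rand + H_rand.conj().T) / 2
U_ent = expm(-1j * 0.5 * H_rand)

def step(rho, u_t, p_depol=0.05):
    """One reservoir step: encode the input, entangle, then depolarize.
    The depolarizing channel provides the dissipation / fading memory."""
    U = U_ent @ kron_all([rx(np.pi * u_t)] * n_qubits)
    rho = U @ rho @ U.conj().T
    return (1 - p_depol) * rho + p_depol * np.eye(dim) / dim

# Readout observables: single-qubit Z and two-qubit ZZ expectations.
Z, I2 = np.diag([1.0, -1.0]), np.eye(2)
obs = [kron_all([Z if i == q else I2 for i in range(n_qubits)]) for q in range(n_qubits)]
obs += [kron_all([Z if i in (a, b) else I2 for i in range(n_qubits)])
        for a in range(n_qubits) for b in range(a + 1, n_qubits)]

rho = np.eye(dim) / dim
features = []
for u_t in u:
    rho = step(rho, u_t)
    features.append([np.real(np.trace(O @ rho)) for O in obs])
X = np.hstack([np.array(features), np.ones((T, 1))])  # measured features + bias

# Train the linear readout on the first half of the run, test on the second half.
split = T // 2
w, *_ = np.linalg.lstsq(X[:split], y[:split], rcond=None)
pred = X[split:] @ w
print("test NMSE:", np.mean((pred - y[split:]) ** 2) / np.var(y[split:]))
```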




Read also

A reservoir computer is a temporal information processing system that exploits an artificial or physical dissipative dynamics to learn the dynamical system generating a target time series. This paper proposes the use of real superconducting quantum computing devices as the reservoir, where the dissipative property is provided by the natural noise acting on the quantum bits. The performance of this natural quantum reservoir is demonstrated on a benchmark time-series regression problem and on a practical problem of classifying different objects based on temporal sensor data. In both cases the proposed reservoir computer shows higher performance than a linear regression or classification model. The results indicate that a noisy quantum device can potentially function as a reservoir computer and, notably, that quantum noise, which is undesirable in conventional quantum computation, can be used as a rich computational resource.
Kevin Slagle (2021)
We consider the hypothesis that quantum mechanics is not fundamental, but instead emerges from a theory with less computational power, such as classical mechanics. This hypothesis makes the prediction that quantum computers will not be capable of sufficiently complex quantum computations. Utilizing this prediction, we outline a proposal to test for such a breakdown of quantum mechanics using near-term noisy intermediate-scale quantum (NISQ) computers. Our procedure involves simulating a non-Clifford random circuit, followed by its inverse, and then checking that the resulting state is the same as the initial state. We show that quantum mechanics predicts that the fidelity of this procedure decays exponentially with circuit depth (due to noise in NISQ computers). However, if quantum mechanics emerges from a theory with significantly less computational power, then we expect the fidelity to decay significantly more rapidly than the quantum mechanics prediction for sufficiently deep circuits, which is the experimental signature that we propose to search for. Useful experiments can be performed with 80 qubits and gate infidelity $10^{-3}$, while highly informative experiments should require only 1000 qubits and gate infidelity $10^{-5}$.
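The echo procedure can be illustrated with a small density-matrix simulation; the sketch below is an assumption-laden toy (3 qubits, depolarizing noise after every layer), not the proposed 80-to-1000-qubit experiment. Under this noise model the return probability decays roughly as (1 - p)^(2D) with depth D, which is the quantum-mechanical baseline against which a faster breakdown would be detected.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
dim = 2 ** n

def kron_all(mats):
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

I2 = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

def rx(t):
    return np.array([[np.cos(t / 2), -1j * np.sin(t / 2)],
                     [-1j * np.sin(t / 2), np.cos(t / 2)]])

def rz(t):
    return np.diag([np.exp(-1j * t / 2), np.exp(1j * t / 2)])

def random_layer():
    """One layer: random (generically non-Clifford) single-qubit rotations
    followed by a chain of CNOTs."""
    U = kron_all([rz(rng.uniform(0, 2 * np.pi)) @ rx(rng.uniform(0, 2 * np.pi))
                  for _ in range(n)])
    for q in range(n - 1):
        U = kron_all([np.eye(2 ** q), CNOT, np.eye(2 ** (n - q - 2))]) @ U
    return U

def depolarize(rho, p):
    return (1 - p) * rho + p * np.eye(dim) / dim

def echo_fidelity(depth, p_err=1e-2):
    """Apply `depth` random layers and then their inverses, with a depolarizing
    channel after every layer; return the overlap with the initial |0...0>."""
    layers = [random_layer() for _ in range(depth)]
    rho = np.zeros((dim, dim), dtype=complex)
    rho[0, 0] = 1.0
    for U in layers:
        rho = depolarize(U @ rho @ U.conj().T, p_err)
    for U in reversed(layers):
        Ud = U.conj().T
        rho = depolarize(Ud @ rho @ Ud.conj().T, p_err)
    return float(np.real(rho[0, 0]))

# The printed fidelities fall off roughly as (1 - p_err)^(2 * depth).
for d in (1, 5, 10, 20, 40):
    print(d, round(echo_fidelity(d), 4))
```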
Quantum simulation represents the most promising quantum application for demonstrating quantum advantage on near-term noisy intermediate-scale quantum (NISQ) computers, yet available quantum simulation algorithms are prone to errors and thus difficult to realize. Herein, we propose a novel scheme that utilizes the intrinsic gate errors of NISQ devices to enable controllable simulation of open quantum system dynamics without ancillary qubits or explicit bath engineering, thus turning unwanted quantum noise into a useful quantum resource. Specifically, we simulate the energy transfer process in a photosynthetic dimer system on the IBM-Q cloud. By employing designed decoherence-inducing gates, we show that quantum dissipative dynamics can be simulated efficiently across coherent-to-incoherent regimes, with results comparable to those of the numerically exact classical method. Moreover, we demonstrate a calibration routine that enables consistent and predictive simulations of open-quantum-system dynamics in the intermediate coupling regime. This work provides a new direction for quantum advantage in the NISQ era.
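As a rough illustration of simulating dissipative dynamics by injecting controllable decoherence, the toy model below propagates a two-site excitonic dimer with a unitary step followed by a pure-dephasing channel, which stands in for the decoherence-inducing gates; the site energies, coupling and dephasing rates are placeholder values, not those of the photosynthetic system studied on IBM-Q.

```python
import numpy as np
from scipy.linalg import expm

# Placeholder parameters for a two-site (donor/acceptor) single-exciton dimer.
eps, J = 1.0, 0.2            # site-energy gap and electronic coupling (arbitrary units)
dt, steps = 0.05, 400
H = np.array([[eps / 2, J], [J, -eps / 2]], dtype=complex)
U = expm(-1j * H * dt)       # one coherent propagation step

def dephase(rho, gamma_dt):
    """Pure-dephasing channel in the site basis; here it plays the role of the
    decoherence injected by repeated noisy gates on hardware."""
    d = np.exp(-gamma_dt)
    out = rho.copy()
    out[0, 1] *= d
    out[1, 0] *= d
    return out

for gamma in (0.0, 0.5, 5.0):     # coherent -> intermediate -> incoherent regime
    rho = np.array([[1, 0], [0, 0]], dtype=complex)   # excitation starts on the donor
    pops = []
    for _ in range(steps):
        rho = dephase(U @ rho @ U.conj().T, gamma * dt)
        pops.append(float(np.real(rho[1, 1])))
    print(f"gamma={gamma}: late-time average acceptor population = "
          f"{np.mean(pops[steps // 2:]):.3f}")
```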
Crosstalk is a major source of noise in Noisy Intermediate-Scale Quantum (NISQ) systems and a fundamental challenge for hardware design. When multiple instructions are executed in parallel, crosstalk between the instructions can corrupt the quantum state and lead to incorrect program execution. Our goal is to mitigate the application impact of crosstalk noise through software techniques. This requires (i) accurate characterization of hardware crosstalk, and (ii) intelligent instruction scheduling to serialize the affected operations. Since crosstalk characterization is computationally expensive, we develop optimizations that reduce the characterization overhead. On three 20-qubit IBMQ systems, we demonstrate a two-orders-of-magnitude reduction in characterization time (compute time on the QC device) compared to all-pairs crosstalk measurements. Informed by this characterization, we develop a scheduler that judiciously serializes high-crosstalk instructions, balancing the need to mitigate crosstalk against the exponential decoherence errors introduced by serialization. In real-system runs on three IBMQ systems, our scheduler improves the error rate of application circuits by up to 5.6x compared to the IBM instruction scheduler and offers near-optimal crosstalk mitigation in practice. In a broader picture, the difficulty of mitigating crosstalk has recently driven QC vendors to move towards sparser qubit connectivity or to disable nearby operations entirely in hardware, which can be detrimental to performance. Our work makes the case for software mitigation of crosstalk errors.
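A hypothetical sketch of the scheduling idea (not the paper's scheduler, which also weighs the decoherence cost of serialization): given a characterized table of conditional error rates, a greedy pass places each gate in the earliest time slot that contains no high-crosstalk partner. The crosstalk table, gate names and threshold below are made-up placeholders.

```python
# Assumed characterization data: conditional error rate of a gate when another
# gate runs in parallel (placeholder numbers, not measured values).
crosstalk = {
    (("cx", (0, 1)), ("cx", (2, 3))): 0.030,
    (("cx", (0, 1)), ("cx", (4, 5))): 0.004,
    (("cx", (2, 3)), ("cx", (4, 5))): 0.006,
}
THRESHOLD = 0.01  # pairs above this conditional error are serialized

def pair_error(g, h):
    return crosstalk.get((g, h), crosstalk.get((h, g), 0.0))

def schedule(gates):
    """Greedy pass: place each gate in the earliest slot containing no
    high-crosstalk partner; otherwise open a new (serialized) slot."""
    slots = []
    for g in gates:
        for slot in slots:
            if all(pair_error(g, h) < THRESHOLD for h in slot):
                slot.append(g)
                break
        else:
            slots.append([g])
    return slots

gates = [("cx", (0, 1)), ("cx", (2, 3)), ("cx", (4, 5))]
for i, slot in enumerate(schedule(gates)):
    print(f"slot {i}: {slot}")
```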
We introduce Mitiq, a Python package for error mitigation on noisy quantum computers. Error mitigation techniques can reduce the impact of noise on near-term quantum computers with minimal overhead in quantum resources by relying on a mixture of quantum sampling and classical post-processing techniques. Mitiq is an extensible toolkit of different error mitigation methods, including zero-noise extrapolation, probabilistic error cancellation, and Clifford data regression. The library is designed to be compatible with generic backends and interfaces with different quantum software frameworks. We describe Mitiq using code snippets to demonstrate usage and discuss features and contribution guidelines. We present several examples demonstrating error mitigation on IBM and Rigetti superconducting quantum processors as well as on noisy simulators.
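A minimal usage sketch, assuming Mitiq's documented zne.execute_with_zne entry point together with a Cirq density-matrix executor; the one-qubit circuit and the depolarizing noise strength are illustrative choices, and the noise-scaling and extrapolation options are left at their defaults.

```python
import numpy as np
import cirq
from mitiq import zne

# A toy one-qubit circuit whose ideal <Z> expectation is cos(pi/3) = 0.5.
q = cirq.LineQubit(0)
circuit = cirq.Circuit(cirq.rx(np.pi / 3).on(q))

def executor(circ: cirq.Circuit) -> float:
    """Simulate the circuit with depolarizing noise and return <Z>."""
    noisy = circ.with_noise(cirq.depolarize(p=0.02))
    rho = cirq.DensityMatrixSimulator().simulate(noisy).final_density_matrix
    return float(np.real(rho[0, 0] - rho[1, 1]))

noisy_value = executor(circuit)
mitigated_value = zne.execute_with_zne(circuit, executor)  # zero-noise extrapolation
print(f"noisy <Z> = {noisy_value:.4f}, ZNE-mitigated <Z> = {mitigated_value:.4f}")
```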