The combination of machine learning and quantum computing has emerged as a promising approach for addressing previously intractable problems. Reservoir computing is an efficient learning paradigm that utilizes nonlinear dynamical systems for temporal information processing, i.e., processing of input sequences to produce output sequences. Here we propose quantum reservoir computing that harnesses complex dissipative quantum dynamics. Our class of quantum reservoirs is universal, in that any nonlinear fading memory map can be approximated arbitrarily closely and uniformly over all inputs by a quantum reservoir from this class. We describe a subclass of the universal class that is readily implementable using quantum gates native to current noisy gate-model quantum computers. Proof-of-principle experiments on remotely accessed cloud-based superconducting quantum computers demonstrate that small and noisy quantum reservoirs can tackle high-order nonlinear temporal tasks. Our theoretical and experimental results pave the way for attractive temporal processing applications of near-term gate-model quantum computers of increasing fidelity but without quantum error correction, signifying the potential of these devices for wider applications including neural modeling, speech recognition and natural language processing, going beyond static classification and regression tasks.
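To make the reservoir-computing pipeline concrete, the following is a minimal sketch under illustrative assumptions, not the paper's exact construction: a few qubits evolve under a fixed random unitary, each input is injected dissipatively by overwriting one qubit, Pauli-Z expectation values serve as reservoir features, and only a ridge-regression linear readout is trained on a toy nonlinear memory task. The reservoir size, input encoding, and target task are all assumptions made for the sake of the example.

```python
# Minimal sketch of a quantum reservoir computer, under illustrative assumptions:
# 3 qubits, a fixed random unitary as the reservoir dynamics, each input injected
# by dissipatively overwriting qubit 0, Pauli-Z expectations as features, and a
# ridge-regression linear readout. This is a toy density-matrix simulation, not
# the exact construction or tasks used in the paper.
import numpy as np

rng = np.random.default_rng(0)
n_qubits, dim = 3, 2**3

# Fixed random reservoir unitary via QR decomposition of a random complex matrix.
A = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
U, _ = np.linalg.qr(A)

Z, I2 = np.diag([1.0, -1.0]), np.eye(2)

def kron_all(ops):
    out = np.array([[1.0 + 0j]])
    for op in ops:
        out = np.kron(out, op)
    return out

# Observable Z_k for each qubit k: these give the reservoir's output features.
observables = [kron_all([Z if i == k else I2 for i in range(n_qubits)])
               for k in range(n_qubits)]

def inject_input(rho, u):
    """Dissipatively overwrite qubit 0 with an input-dependent pure state."""
    rho = rho.reshape(2, dim // 2, 2, dim // 2)
    rho_rest = np.trace(rho, axis1=0, axis2=2)      # trace out qubit 0
    theta = np.pi * u                               # input u assumed in [0, 1]
    psi = np.array([np.cos(theta / 2), np.sin(theta / 2)])
    return np.kron(np.outer(psi, psi.conj()), rho_rest)

def run_reservoir(inputs):
    rho = np.eye(dim) / dim                         # maximally mixed initial state
    feats = []
    for u in inputs:
        rho = U @ inject_input(rho, u) @ U.conj().T
        feats.append([np.real(np.trace(O @ rho)) for O in observables])
    return np.array(feats)

# Toy nonlinear memory task: y_t = u_{t-1} * u_{t-2}.
u = rng.uniform(0, 1, size=500)
y = np.r_[0.0, 0.0, u[:-2] * u[1:-1]]
X = np.c_[run_reservoir(u), np.ones(len(u))]        # features plus bias column

# Ridge regression: the linear readout is the only trained part.
lam = 1e-6
W_out = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)
print("training MSE:", np.mean((X @ W_out - y) ** 2))
```

The repeated overwrite of qubit 0 is what makes the map dissipative and hence gives it fading memory; everything except the linear readout stays fixed, which is the defining feature of reservoir computing.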
A reservoir computer is a temporal information-processing system that exploits artificial or physical dissipative dynamics to learn a dynamical system generating the target time series. This paper proposes the use of real superconducting quantum computers …
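For contrast, the standard "artificial dissipative dynamics" reservoir can be sketched in a few lines: an echo state network whose fixed random recurrent weights, scaled to a spectral radius below one so the dynamics has fading memory, play the role of the reservoir, again with only a linear readout trained. The reservoir size, scaling, and toy target below are illustrative choices, not parameters from the paper.

```python
# A compact classical counterpart: an echo state network (ESN) reservoir.
# Hyperparameters (reservoir size, spectral radius, ridge penalty) are
# illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
N = 100                                         # reservoir size
W = rng.normal(size=(N, N))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))       # spectral radius < 1 => fading memory
W_in = rng.uniform(-1, 1, size=N)

def esn_states(inputs):
    x, states = np.zeros(N), []
    for u in inputs:
        x = np.tanh(W @ x + W_in * u)           # fixed, untrained state update
        states.append(x.copy())
    return np.array(states)

u = rng.uniform(0, 1, size=500)
y = np.r_[0.0, 0.0, u[:-2] * u[1:-1]]           # same toy temporal target as above
X = np.c_[esn_states(u), np.ones(len(u))]
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(N + 1), X.T @ y)
print("ESN training MSE:", np.mean((X @ W_out - y) ** 2))
```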
We consider the hypothesis that quantum mechanics is not fundamental, but instead emerges from a theory with less computational power, such as classical mechanics. This hypothesis makes the prediction that quantum computers will not be capable of suf…
Quantum simulation represents the most promising quantum application to demonstrate quantum advantage on near-term noisy intermediate-scale quantum (NISQ) computers, yet available quantum simulation algorithms are prone to errors and thus difficult to…
Crosstalk is a major source of noise in Noisy Intermediate-Scale Quantum (NISQ) systems and is a fundamental challenge for hardware design. When multiple instructions are executed in parallel, crosstalk between the instructions can corrupt the quantum…
We introduce Mitiq, a Python package for error mitigation on noisy quantum computers. Error mitigation techniques can reduce the impact of noise on near-term quantum computers with minimal overhead in quantum resources by relying on a mixture of quantum…
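As a usage illustration, the sketch below applies Mitiq's zero-noise extrapolation to a toy circuit: the user supplies a circuit and an "executor" that returns a noisy expectation value, and zne.execute_with_zne runs the circuit at amplified noise levels and extrapolates the result back toward the zero-noise limit. The depolarizing-noise Cirq simulator and the Z observable are assumptions made for this example, not part of Mitiq itself.

```python
# Hedged sketch of zero-noise extrapolation with Mitiq on a simulated noisy backend.
# The noise model, circuit, and observable are illustrative assumptions.
import cirq
import numpy as np
from mitiq import zne

q = cirq.LineQubit(0)
# A circuit whose ideal <Z> is 1: an even number of self-cancelling X gates.
circuit = cirq.Circuit([cirq.X(q)] * 10)

def noisy_executor(circ: cirq.Circuit) -> float:
    """Simulate with depolarizing noise and return <Z> on the qubit."""
    noisy = circ.with_noise(cirq.depolarize(p=0.02))
    rho = cirq.DensityMatrixSimulator().simulate(noisy).final_density_matrix
    z = np.diag([1.0, -1.0])
    return float(np.real(np.trace(z @ rho)))

unmitigated = noisy_executor(circuit)
mitigated = zne.execute_with_zne(circuit, noisy_executor)
print(f"noisy <Z> = {unmitigated:.3f}, ZNE-mitigated <Z> = {mitigated:.3f}")
```

Because the noise is added inside the executor, the folded (noise-scaled) circuits that Mitiq generates accumulate proportionally more depolarization, which is exactly the behavior the extrapolation relies on.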