The great promise of quantum computers comes with the dual challenges of building them and finding their useful applications. We argue that these two challenges should be considered together, by co-designing full-stack quantum computer systems along with their applications in order to hasten their development and potential for scientific discovery. In this context, we identify scientific and community needs, opportunities, a sampling of use-case studies, and significant challenges for the development of quantum computers for science over the next 2--10 years. This document is written by a community of university, national laboratory, and industrial researchers in the field of Quantum Information Science and Technology, and is based on a summary from a U.S. National Science Foundation workshop on Quantum Computing held on October 21--22, 2019 in Alexandria, VA.
Gauge theories are fundamental to our understanding of interactions between the elementary constituents of matter as mediated by gauge bosons. However, computing the real-time dynamics in gauge theories is a notorious challenge for classical computational methods. In the spirit of Feynman's vision of a quantum simulator, this has recently stimulated theoretical effort to devise schemes for simulating such theories on engineered quantum-mechanical devices, with the difficulty that gauge invariance and the associated local conservation laws (Gauss laws) need to be implemented. Here we report the first experimental demonstration of a digital quantum simulation of a lattice gauge theory, by realising 1+1-dimensional quantum electrodynamics (Schwinger model) on a few-qubit trapped-ion quantum computer. We are interested in the real-time evolution of the Schwinger mechanism, describing the instability of the bare vacuum due to quantum fluctuations, which manifests itself in the spontaneous creation of electron-positron pairs. To make efficient use of our quantum resources, we map the original problem to a spin model by eliminating the gauge fields in favour of exotic long-range interactions, which have a direct and efficient implementation on an ion trap architecture. We explore the Schwinger mechanism of particle-antiparticle generation by monitoring the mass production and the vacuum persistence amplitude. Moreover, we track the real-time evolution of entanglement in the system, which illustrates how particle creation and entanglement generation are directly related. Our work represents a first step towards quantum simulating high-energy theories with atomic physics experiments, the long-term vision being the extension to real-time quantum simulations of non-Abelian lattice gauge theories.
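For context, a widely used route to such a spin formulation (presented here as an illustrative sketch in standard textbook conventions, not necessarily the exact notation of the experiment) starts from the Kogut-Susskind staggered-fermion Schwinger model on $N$ sites, applies a Jordan-Wigner transformation, and eliminates the gauge field on open boundaries via Gauss's law, leaving

$$H = w \sum_{n=1}^{N-1}\left[\sigma^{+}_{n}\sigma^{-}_{n+1} + \mathrm{h.c.}\right] + \frac{m}{2}\sum_{n=1}^{N}(-1)^{n}\sigma^{z}_{n} + J\sum_{n=1}^{N-1} L_{n}^{2}, \qquad L_{n} = \epsilon_{0} + \frac{1}{2}\sum_{\ell=1}^{n}\left[\sigma^{z}_{\ell} + (-1)^{\ell}\right],$$

where $w$ sets the pair creation/annihilation amplitude, $m$ is the fermion mass, $J$ the electric-field energy, and $\epsilon_{0}$ a background field; expanding $L_{n}^{2}$ produces exactly the long-range $\sigma^{z}\sigma^{z}$ couplings referred to above.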
While quantum computing promises solutions to computational problems that are not accessible with classical approaches, current hardware constraints mean that most quantum algorithms cannot yet handle problem sizes of practical relevance, and classical counterparts still outperform them. To benefit from quantum architectures in practice, one has to identify problems and algorithms with favorable scaling and mitigate the corresponding limitations of the available hardware. For this reason, we developed an algorithm that solves integer linear programming (ILP) problems, a classically NP-hard problem class, on a quantum annealer, and investigated problem- and hardware-specific limitations. This work presents the formalism for mapping ILP problems onto the annealing architecture, shows how to systematically improve computations using optimized anneal schedules, and models the anneal process through simulation. It illustrates the effects of decoherence and many-body localization for the minimum dominating set problem, and compares annealing results against numerical simulations of the quantum architecture. We find that the algorithm outperforms random guessing but is limited to small problems, and that annealing schedules can be adjusted to reduce the effects of decoherence. Simulations qualitatively reproduce the algorithmic improvements of the modified annealing schedule, suggesting that the improvements originate from quantum effects.
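To make the mapping concrete, the sketch below shows one generic way (not necessarily the authors' exact construction) to cast a minimum dominating set instance, an ILP with one covering constraint per vertex, into the QUBO form a quantum annealer accepts: each constraint $x_v + \sum_{u \in N(v)} x_u \ge 1$ becomes a quadratic penalty with binary slack bits. The function names and the penalty weight are illustrative.

```python
import itertools
import math

def add_squared_penalty(Q, terms, const, weight):
    """Add weight * (sum_i c_i x_i + const)^2 to the QUBO dict Q
    (x_i is binary, so x_i^2 = x_i)."""
    items = list(terms.items())
    for i, ci in items:
        Q[(i, i)] = Q.get((i, i), 0.0) + weight * (ci * ci + 2.0 * ci * const)
    for a in range(len(items)):
        for b in range(a + 1, len(items)):
            (i, ci), (j, cj) = items[a], items[b]
            key = (min(i, j), max(i, j))
            Q[key] = Q.get(key, 0.0) + weight * 2.0 * ci * cj

def dominating_set_qubo(adj, penalty=10.0):
    """QUBO for minimum dominating set: minimise sum_v x_v subject to
    x_v + sum_{u in N(v)} x_u >= 1, enforced via penalised slack bits."""
    verts = sorted(adj)
    idx = {v: k for k, v in enumerate(verts)}
    Q, nvar = {}, len(verts)
    for v in verts:                                    # objective: count chosen vertices
        Q[(idx[v], idx[v])] = Q.get((idx[v], idx[v]), 0.0) + 1.0
    for v in verts:                                    # one covering constraint per vertex
        closed = [v] + sorted(adj[v])
        max_slack = len(closed) - 1
        nbits = math.ceil(math.log2(max_slack + 1)) if max_slack > 0 else 0
        terms = {idx[u]: 1.0 for u in closed}
        for k in range(nbits):                         # binary slack expansion
            terms[nvar] = -float(2 ** k)
            nvar += 1
        add_squared_penalty(Q, terms, -1.0, penalty)
    return Q, nvar, idx

def qubo_energy(Q, x):
    return sum(c * x[i] * x[j] for (i, j), c in Q.items())

# Tiny path graph 0-1-2-3: the optimum is a dominating set of size two.
adj = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2}}
Q, nvar, idx = dominating_set_qubo(adj)
best = min(itertools.product([0, 1], repeat=nvar), key=lambda x: qubo_energy(Q, x))
print("dominating set:", [v for v in adj if best[idx[v]] == 1])
```

On hardware, a QUBO like this would still have to be minor-embedded onto the annealer's qubit-connectivity graph, which is one of the hardware-specific limitations the abstract refers to.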
We present an algorithm that extends existing quantum algorithms for simulating fermion systems in quantum chemistry and condensed matter physics to include bosons in general and phonons in particular. We introduce a qubit representation for the low-energy subspace of phonons which allows an efficient simulation of the evolution operator of electron-phonon systems. As a consequence of the Nyquist-Shannon sampling theorem, the phonons are represented with exponential accuracy on a discretized Hilbert space with a size that increases linearly with the cutoff of the maximum phonon number. The additional number of qubits required by the presence of phonons scales linearly with the size of the system. The additional circuit depth is constant for systems with finite-range electron-phonon and phonon-phonon interactions and linear for long-range electron-phonon interactions. Our algorithm for the Holstein polaron problem was implemented on an Atos Quantum Learning Machine (QLM) quantum simulator employing the Quantum Phase Estimation method. The energy and the phonon number distribution of the polaron state agree with exact diagonalization results for weak, intermediate, and strong electron-phonon coupling regimes.
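As a point of reference for the kind of classical check mentioned above, the sketch below builds the single-electron Holstein Hamiltonian on a small ring with a hard phonon-number cutoff per site and diagonalizes it exactly; the truncated single-mode space has dimension $n_{\max}+1$, i.e. it grows linearly with the cutoff. This is a plain numpy illustration of the truncated phonon space, not the quantum circuit, QPE procedure, or QLM implementation described in the abstract, and the parameter values are arbitrary.

```python
import numpy as np
from functools import reduce

def holstein_hamiltonian(L=4, n_max=2, t=1.0, omega=1.0, g=1.0):
    """Single-electron Holstein model on an L-site ring with at most
    n_max phonons per site (discretized/truncated phonon Hilbert space):
    H = -t sum_<ij> c_i^dag c_j + omega sum_i b_i^dag b_i
        + g sum_i n_i (b_i^dag + b_i)."""
    d = n_max + 1
    b = np.diag(np.sqrt(np.arange(1, d)), 1)   # truncated annihilation operator
    num = b.T @ b                               # phonon number operator
    x = b + b.T                                 # b + b^dag
    I_ph = np.eye(d)

    def site_op(op, site):
        """Embed a single-mode operator on one phonon site."""
        ops = [I_ph] * L
        ops[site] = op
        return reduce(np.kron, ops)

    dim_ph = d ** L
    H = np.zeros((L * dim_ph, L * dim_ph))
    # electron hopping: |i><j| on the electron index, identity on phonons
    for i in range(L):
        j = (i + 1) % L
        hop = np.zeros((L, L))
        hop[i, j] = hop[j, i] = 1.0
        H += -t * np.kron(hop, np.eye(dim_ph))
    # phonon energy and on-site electron-phonon coupling
    for i in range(L):
        H += omega * np.kron(np.eye(L), site_op(num, i))
        proj = np.zeros((L, L)); proj[i, i] = 1.0
        H += g * np.kron(proj, site_op(x, i))
    return H

H = holstein_hamiltonian()
print("ground-state (polaron) energy:", np.linalg.eigvalsh(H)[0])
```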
Machine learning, a branch of artificial intelligence, learns from previous experience to optimize performance, and is ubiquitous in fields such as computer science, financial analysis, robotics, and bioinformatics. A challenge is that machine learning on rapidly growing big data could become intractable for classical computers. Recently, quantum machine learning algorithms [Lloyd, Mohseni, and Rebentrost, arXiv:1307.0411] were proposed which could offer an exponential speedup over classical algorithms. Here, we report the first experimental entanglement-based classification of 2-, 4-, and 8-dimensional vectors into different clusters using a small-scale photonic quantum computer, which is then used to implement supervised and unsupervised machine learning. The results demonstrate the working principle of using quantum computers to manipulate and classify high-dimensional vectors, a core mathematical routine in machine learning. The method can in principle be scaled to a larger number of qubits, and may provide a new route to accelerate machine learning.
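The core routine alluded to, in the Lloyd-Mohseni-Rebentrost framework, is estimating the overlap (or distance) between amplitude-encoded vectors and assigning each vector to the closest reference cluster. The numpy sketch below mimics that logic classically: it samples a swap-test-style measurement whose outcome statistics encode $|\langle u|v\rangle|^{2}$ and classifies by the largest estimated overlap. It illustrates the mathematics only, not the photonic experiment; the shot count and example vectors are arbitrary.

```python
import numpy as np

def amplitude_encode(v):
    """Normalise a real vector so it can serve as state amplitudes."""
    v = np.asarray(v, dtype=float)
    return v / np.linalg.norm(v)

def swap_test_overlap(u, v, shots=10_000, rng=np.random.default_rng(7)):
    """Estimate |<u|v>|^2 the way a swap test would: the ancilla reads 0
    with probability (1 + |<u|v>|^2) / 2, sampled here with finite shots."""
    p0 = 0.5 * (1.0 + np.abs(np.dot(u, v)) ** 2)
    counts0 = rng.binomial(shots, p0)
    return 2.0 * counts0 / shots - 1.0

def classify(x, centroids, shots=10_000):
    """Assign x to the reference vector with the largest estimated overlap."""
    x = amplitude_encode(x)
    overlaps = [swap_test_overlap(x, amplitude_encode(c), shots) for c in centroids]
    return int(np.argmax(overlaps))

# Toy example with 4-dimensional vectors and two reference clusters.
centroids = [[1, 1, 0, 0], [0, 0, 1, 1]]
print(classify([0.9, 1.1, 0.1, 0.0], centroids))   # -> 0
print(classify([0.0, 0.2, 1.0, 0.8], centroids))   # -> 1
```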
Quantum computers promise tremendous impact across applications -- and have shown great strides in hardware engineering -- but remain notoriously error prone. Careful design of low-level controls has been shown to compensate for the processes which induce hardware errors, leveraging techniques from optimal and robust control. However, these techniques rely heavily on the availability of highly accurate and detailed physical models, which generally only achieve sufficient representative fidelity for the simplest operations and generic noise modes. In this work, we use deep reinforcement learning to design a universal set of error-robust quantum logic gates on a superconducting quantum computer, without requiring knowledge of a specific Hamiltonian model of the system, its controls, or its underlying error processes. We experimentally demonstrate that a fully autonomous deep reinforcement learning agent can design single-qubit gates up to $3\times$ faster than default DRAG operations without additional leakage error, while exhibiting robustness against calibration drifts over weeks. We then show that $ZX(-\pi/2)$ operations implemented using the cross-resonance interaction can outperform hardware default gates by over $2\times$ and equivalently exhibit superior calibration-free performance up to 25 days post-optimization, using various metrics. We benchmark the performance of deep-reinforcement-learning-derived gates against other black-box optimization techniques, showing that deep reinforcement learning can achieve comparable or marginally superior performance, even with limited hardware access.
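To give a flavour of model-free gate design, the sketch below optimises a piecewise-constant pulse for a single-qubit X gate on a noiseless simulator using a REINFORCE-style policy gradient. It is a deliberately tiny stand-in for the deep reinforcement learning agent described above, not the authors' method or a hardware experiment, and every parameter (segment count, learning rate, batch size) is illustrative.

```python
import numpy as np

# Model-free gate-design toy: learn piecewise-constant pulse amplitudes that
# implement an X gate on one simulated qubit via a REINFORCE policy gradient.
sx = np.array([[0, 1], [1, 0]], dtype=complex)
target = sx                        # ideal X gate
segments, dt = 8, 0.125            # piecewise-constant control segments

def propagate(amps):
    """Evolve under H_k = amps[k] * sx / 2 for one segment of length dt each."""
    U = np.eye(2, dtype=complex)
    for a in amps:
        theta = a * dt / 2.0
        Uk = np.cos(theta) * np.eye(2) - 1j * np.sin(theta) * sx
        U = Uk @ U
    return U

def fidelity(amps):
    """Gate fidelity |Tr(target^dag U)|^2 / 4, the reward signal."""
    return np.abs(np.trace(target.conj().T @ propagate(amps))) ** 2 / 4.0

rng = np.random.default_rng(0)
mu = np.full(segments, 1.0)        # crude initial pulse guess
sigma, lr, batch = 0.5, 2.0, 64
for step in range(300):
    actions = rng.normal(mu, sigma, size=(batch, segments))  # Gaussian policy
    rewards = np.array([fidelity(a) for a in actions])
    advantage = rewards - rewards.mean()                      # simple baseline
    grad = (advantage[:, None] * (actions - mu) / sigma**2).mean(axis=0)
    mu += lr * grad                                           # REINFORCE update
print("final gate fidelity:", fidelity(mu))
```

In the setting described above, the reward signal comes from measurements on the device itself rather than a simulator, which is what removes the need for an explicit Hamiltonian model.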