One of the most promising applications of quantum computing is simulating quantum many-body systems. However, methods are still needed to investigate these systems efficiently and natively, capturing their full complexity. Here, we propose variational quantum anomaly detection, an unsupervised quantum machine learning algorithm for analyzing quantum data from quantum simulation. The algorithm extracts the phase diagram of a system without any prior physical knowledge and can be performed end-to-end on the same quantum device on which the system is simulated. We showcase its capabilities by mapping out the phase diagram of the one-dimensional extended Bose-Hubbard model with dimerized hoppings, which exhibits a symmetry-protected topological phase. Further, we show that the method works with readily accessible present-day devices by performing the algorithm on a real quantum computer.
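A minimal sketch of the anomaly-detection principle at work here, assuming a quantum-autoencoder-style cost in which a trained circuit should leave a designated "trash" qubit in |0⟩ on normal data; the one-parameter circuit, the pure-NumPy statevector simulation, and all names are illustrative, not the paper's implementation:

```python
# Illustrative variational quantum anomaly detection on 2 qubits
# (pure-NumPy statevector simulation; qubit 0 is the "trash" qubit).
import numpy as np

rng = np.random.default_rng(0)

def ry(theta):
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]], complex)

def circuit(theta, state):
    """Apply RY(theta) to qubit 0 (the first tensor factor)."""
    return np.kron(ry(theta), np.eye(2)) @ state

def trash_cost(theta, states):
    """1 - mean probability that the trash qubit reads |0>."""
    p0 = sum(abs(circuit(theta, psi)[0])**2 + abs(circuit(theta, psi)[1])**2
             for psi in states)
    return 1.0 - p0 / len(states)

# "Normal" training states: qubit 0 rotated by roughly the same angle.
zero = np.array([1, 0, 0, 0], complex)
normal = [circuit(0.7 + 0.05 * rng.standard_normal(), zero) for _ in range(8)]

# 1-D parameter scan standing in for the classical optimizer.
grid = np.linspace(-np.pi, np.pi, 201)
theta_opt = grid[np.argmin([trash_cost(t, normal) for t in grid])]

anomaly = circuit(2.5, zero)            # state from a "different phase"
print("normal score :", trash_cost(theta_opt, normal))    # near 0
print("anomaly score:", trash_cost(theta_opt, [anomaly])) # clearly larger
```

States resembling the training set score near zero while unfamiliar states do not; scanning such a score across parameter space is what traces out phase boundaries.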
Machine learning techniques have led to broad adoption of a statistical model of computing. The statistical distributions natively available on quantum processors are a superset of those available classically. Harnessing this attribute has the potential to accelerate or otherwise improve machine learning relative to purely classical performance. A key challenge toward that goal is learning to hybridize classical computing resources and traditional learning techniques with the emerging capabilities of general-purpose quantum processors. Here, we demonstrate such hybridization by training a 19-qubit gate model processor to solve a clustering problem, a foundational challenge in unsupervised learning. We use the quantum approximate optimization algorithm in conjunction with gradient-free Bayesian optimization to train the quantum machine. This quantum/classical hybrid algorithm shows robustness to realistic noise, and we find evidence that classical optimization can be used to train around both coherent and incoherent imperfections.
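To make the pipeline concrete, here is a hedged, pure-NumPy sketch of depth-one QAOA for a MaxCut-style clustering objective on a toy three-node graph; a simple grid scan stands in for the gradient-free Bayesian optimizer, and the graph and names are illustrative:

```python
# Illustrative depth-one QAOA for MaxCut on a 3-node triangle graph
# (pure NumPy; qubit 0 is the most significant bit of the state index).
import numpy as np
from itertools import product

n = 3
edges = [(0, 1), (1, 2), (0, 2)]

# Diagonal MaxCut cost: number of cut edges for each basis bitstring.
bits = [tuple((z >> (n - 1 - q)) & 1 for q in range(n)) for z in range(2**n)]
cost = np.array([sum(b[i] != b[j] for i, j in edges) for b in bits], float)

X = np.array([[0, 1], [1, 0]], complex)

def mixer(beta):
    """exp(-i beta X) applied to every qubit."""
    rx = np.cos(beta) * np.eye(2) - 1j * np.sin(beta) * X
    u = np.array([[1]], complex)
    for _ in range(n):
        u = np.kron(u, rx)
    return u

def expected_cut(gamma, beta):
    psi = np.full(2**n, 1 / np.sqrt(2**n), complex)   # |+...+>
    psi = np.exp(-1j * gamma * cost) * psi            # phase-separation layer
    psi = mixer(beta) @ psi                           # mixing layer
    return float(np.abs(psi)**2 @ cost)

# Grid scan standing in for the gradient-free Bayesian optimizer.
grid = np.linspace(0, np.pi, 40)
gamma, beta = max(product(grid, grid), key=lambda gb: expected_cut(*gb))
print("best expected cut:", expected_cut(gamma, beta))  # max cut is 2
```

The optimal bitstrings of the cost function assign the graph's nodes to two clusters; the classical outer loop only ever queries the expectation value, which is why gradient-free methods such as Bayesian optimization fit naturally here.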
The preparation of thermal equilibrium states is important for the simulation of condensed-matter and cosmological systems using a quantum computer. We present a method to prepare such mixed states with unitary operators, and demonstrate this technique experimentally using a gate-based quantum processor. Our method targets the generation of thermofield double states using a hybrid quantum-classical variational approach motivated by quantum approximate optimization algorithms, without prior calculation of optimal variational parameters by numerical simulation. The fidelity of the generated states to the thermal-equilibrium state varies smoothly from 99% to 75% between infinite and near-zero simulated temperature, in quantitative agreement with numerical simulations of the noisy quantum processor with error parameters drawn from experiment.
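As a toy illustration of the target states, the sketch below variationally prepares the thermofield double of the single-qubit Hamiltonian H = Z on a system-ancilla pair; a hypothetical one-parameter RY-plus-CNOT ansatz stands in for the paper's QAOA-motivated circuit, and a one-dimensional scan replaces the classical optimizer:

```python
# Illustrative variational thermofield-double preparation for H = Z
# (two qubits total: one system qubit plus one ancilla; pure NumPy).
import numpy as np

def tfd(beta):
    """Exact |TFD(beta)>: amplitudes e^{-beta E_n / 2} on |n>|n>."""
    w = np.exp(-beta * np.array([1.0, -1.0]) / 2)   # E_0 = +1, E_1 = -1
    psi = np.zeros(4, complex)
    psi[0b00], psi[0b11] = w
    return psi / np.linalg.norm(psi)

CNOT = np.array([[1, 0, 0, 0], [0, 1, 0, 0],
                 [0, 0, 0, 1], [0, 0, 1, 0]], complex)

def ansatz(theta):
    """CNOT · (RY(theta) ⊗ I) |00> = cos(θ/2)|00> + sin(θ/2)|11>."""
    ry = np.array([[np.cos(theta / 2), -np.sin(theta / 2)],
                   [np.sin(theta / 2),  np.cos(theta / 2)]], complex)
    return CNOT @ np.kron(ry, np.eye(2)) @ np.array([1, 0, 0, 0], complex)

beta = 1.0
target = tfd(beta)

# 1-D scan standing in for the hybrid classical optimizer.
grid = np.linspace(0, np.pi, 1001)
theta = max(grid, key=lambda t: abs(target.conj() @ ansatz(t))**2)
print("fidelity:", abs(target.conj() @ ansatz(theta))**2)  # -> ~1.0
```

At β = 0 the target is the maximally entangled Bell pair, and as β grows the weight shifts toward the system ground state; tracing out the ancilla then leaves the system in the corresponding thermal mixed state.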
We demonstrate how to explore phase diagrams with automated and unsupervised machine learning to find regions of interest for possible new phases. In contrast to supervised learning, where data is classified using predetermined labels, we here perform anomaly detection, where the task is to differentiate a normal data set, composed of one or several classes, from anomalous data. As a paradigmatic example, we explore the phase diagram of the extended Bose-Hubbard model in one dimension at exact integer filling and employ deep neural networks to determine the entire phase diagram in a completely unsupervised and automated fashion. As input data for learning, we first use the entanglement spectra and central tensors derived from tensor-network algorithms for ground-state computation; we later extend our method to experimentally accessible data such as low-order correlation functions. Our method allows us to reveal a phase-separated region between supersolid and superfluid parts with unexpected properties, which appears in the system in addition to the standard superfluid, Mott insulator, Haldane-insulating, and density wave phases.
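The underlying anomaly-detection scheme can be sketched classically: train an autoencoder on "normal" samples only and score new samples by their reconstruction error. The tiny linear autoencoder below stands in for the paper's deep networks, and the synthetic low-rank data stands in for entanglement spectra or correlation functions; everything here is an illustrative assumption:

```python
# Illustrative reconstruction-error anomaly detection with a tiny
# linear autoencoder trained by gradient descent (pure NumPy).
import numpy as np

rng = np.random.default_rng(0)

# "Normal" data: 64-dim vectors confined to a 4-dim subspace (+ noise).
basis = rng.standard_normal((4, 64))
normal = (rng.standard_normal((500, 4)) @ basis
          + 0.01 * rng.standard_normal((500, 64)))

# Linear autoencoder x -> x W W^T, trained on mean squared error.
W = 0.01 * rng.standard_normal((64, 8))
lr = 1e-3
for _ in range(300):
    err = normal @ W @ W.T - normal
    grad = 2 * (normal.T @ err @ W + err.T @ normal @ W) / len(normal)
    W -= lr * grad

def score(x):
    """Anomaly score: reconstruction error of a single sample."""
    return float(np.sum((x @ W @ W.T - x)**2))

anomaly = rng.standard_normal(64)          # sample off the learned subspace
print("normal score :", score(normal[0]))  # small
print("anomaly score:", score(anomaly))    # large
```

Because the model is trained only on one region of the phase diagram, points whose input features it fails to reconstruct flag candidate phase boundaries, with no labels required.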
We revisit the question of universality in quantum computing and propose a new paradigm. Instead of forcing a physical system to enact a predetermined set of universal gates (e.g., single-qubit operations and CNOT), we focus on the intrinsic ability of a system to act as a universal quantum computer using only its naturally available interactions. A key element of this approach is the realization that the fungible nature of quantum information allows for universal manipulations using quantum information encoded in a subspace of the full system Hilbert space, as an alternative to using physical qubits directly. Starting with the interactions intrinsic to the physical system, we show how to determine the possible universality resulting from these interactions over an encoded subspace. We outline a general Lie-algebraic framework which can be used to find the encoding for universality and give several examples relevant to solid-state quantum computing.
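The algebraic criterion admits a compact numerical sketch: grow the real Lie algebra spanned by the available Hamiltonians (made skew-Hermitian as iH) under repeated commutators, and compare its dimension to dim su(2^n) = 4^n - 1. The generator set below, local fields plus one XX coupling on two unencoded qubits, is an illustrative choice, not one of the paper's encoded examples:

```python
# Illustrative Lie-closure computation for a fixed set of interactions.
import numpy as np
from itertools import combinations

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], complex)
Z = np.diag([1.0, -1.0]).astype(complex)

# Available interactions: local X and Z fields plus one XX coupling.
hams = [np.kron(X, X),
        np.kron(X, I2), np.kron(I2, X),
        np.kron(Z, I2), np.kron(I2, Z)]

def lie_closure_dim(generators, max_rounds=10):
    """Dimension of the real Lie algebra generated by {iH}."""
    basis, elems = [], []     # orthonormal real vectors / member matrices

    def vec(m):               # flatten to a real vector for Gram-Schmidt
        return np.concatenate([m.real.ravel(), m.imag.ravel()])

    def add(m):
        norm = np.linalg.norm(m)
        if norm < 1e-12:
            return False
        m = m / norm          # normalize to keep the numerics tame
        v = vec(m)
        for b in basis:
            v = v - (v @ b) * b
        if np.linalg.norm(v) > 1e-7:
            basis.append(v / np.linalg.norm(v))
            elems.append(m)
            return True
        return False

    for h in generators:
        add(1j * h)
    for _ in range(max_rounds):           # close under commutators
        added = False
        for a, b in combinations(list(elems), 2):
            added |= add(a @ b - b @ a)
        if not added:
            break
    return len(basis)

# dim su(4) = 15: this set acts universally on the two qubits.
print("Lie-algebra dimension:", lie_closure_dim(hams))
```

For encoded universality, the same closure would be computed after projecting the interactions onto the chosen code subspace and compared against the unitary algebra of that subspace instead.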
We present efficient quantum algorithms for simulating time-dependent Hamiltonian evolution of general input states using an oracular model of a quantum computer. Our algorithms use either constant or adaptively chosen time steps and are significant because they are the first whose time complexities are comparable to the best known methods for simulating time-independent Hamiltonian evolution, provided appropriate smoothness criteria on the Hamiltonian are satisfied. We provide a thorough cost analysis of these algorithms that accounts for discretization errors in both the time and the representation of the Hamiltonian. In addition, we provide the first upper bounds on the error of Lie-Trotter-Suzuki approximations to unitary evolution operators that use adaptively chosen time steps.
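For intuition about adaptively chosen time steps, the sketch below integrates a toy time-dependent Hamiltonian with a second-order (Strang) Lie-Trotter-Suzuki step evaluated at the interval midpoint, halving or growing the step based on a two-half-steps error estimate; this illustrates the idea under assumed step-control rules rather than reproducing the paper's oracular algorithms:

```python
# Illustrative adaptive-step simulation of a toy time-dependent H(t)
# via a second-order Lie-Trotter-Suzuki (Strang) splitting (pure NumPy).
import numpy as np

X = np.array([[0, 1], [1, 0]], complex)
Z = np.diag([1.0, -1.0]).astype(complex)

def expmi(A):
    """exp(-i A) for a Hermitian matrix A, via eigendecomposition."""
    w, v = np.linalg.eigh(A)
    return (v * np.exp(-1j * w)) @ v.conj().T

def step(t, h):
    """Strang step for H(t) = cos(t) X + sin(2t) Z, frozen at the midpoint."""
    tm = t + h / 2
    ux = expmi(h / 2 * np.cos(tm) * X)
    uz = expmi(h * np.sin(2 * tm) * Z)
    return ux @ uz @ ux

def evolve(psi, T, tol=1e-6):
    t, h = 0.0, 0.1
    while T - t > 1e-12:
        h = min(h, T - t)
        coarse = step(t, h)
        fine = step(t + h / 2, h / 2) @ step(t, h / 2)
        if np.linalg.norm((fine - coarse) @ psi) < tol:
            psi, t, h = fine @ psi, t + h, 1.5 * h    # accept, grow the step
        else:
            h /= 2                                    # reject, halve the step
    return psi

psi = evolve(np.array([1, 0], complex), T=5.0)
print("norm after evolution:", np.linalg.norm(psi))   # unitary: stays 1
```

The step controller spends small steps where the Hamiltonian varies rapidly and large steps where it is smooth, which is the basic advantage of adaptive over constant-step product formulas.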