Current implementations of quantum logic gates can be highly faulty and introduce errors. To correct these errors, it is necessary first to identify the faulty gates. We demonstrate a procedure for diagnosing where gate faults occur in a circuit using a hybrid quantum-classical K-Nearest-Neighbors (KNN) machine-learning technique. We accomplish this task using a diagnostic circuit and selected input states to obtain the fidelities between a set of output states and reference states. The outcomes of the circuit are then stored for use in a classical KNN algorithm. We numerically demonstrate the ability to locate a faulty gate in circuits with over 30 gates and up to nine qubits with over 90% accuracy.
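To make the pipeline concrete, here is a minimal sketch of the fidelity-feature plus classical-KNN idea, assuming Qiskit and scikit-learn. The toy circuit, the over-rotation fault model, and the choice of input states below are illustrative stand-ins, not the paper's exact protocol.

```python
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector, state_fidelity
from sklearn.neighbors import KNeighborsClassifier

N_QUBITS = 3
GATE_SEQ = [("h", 0), ("cx", 0, 1), ("cx", 1, 2), ("h", 2)]  # toy circuit

def build_circuit(faulty_idx=None):
    """Apply GATE_SEQ; model a fault as an over-rotation after one gate."""
    qc = QuantumCircuit(N_QUBITS)
    for i, gate in enumerate(GATE_SEQ):
        getattr(qc, gate[0])(*gate[1:])
        if i == faulty_idx:
            qc.rx(0.3, gate[1])   # stray rotation on the gate's first qubit
    return qc

def fidelity_features(qc, input_labels):
    """Fidelity of each output state against the ideal reference output."""
    ref = build_circuit()
    return [state_fidelity(Statevector.from_label(s).evolve(qc),
                           Statevector.from_label(s).evolve(ref))
            for s in input_labels]

inputs = ["000", "0+1", "+1-", "r0l", "111"]   # selected input states
X, y = [], []
for k in range(len(GATE_SEQ)):                 # class label = faulty gate index
    X.append(fidelity_features(build_circuit(k), inputs))
    y.append(k)

knn = KNeighborsClassifier(n_neighbors=1).fit(X, y)
print(knn.predict([fidelity_features(build_circuit(2), inputs)]))  # -> [2]
```

With noiseless simulation the feature vectors are deterministic, so one labeled example per fault location suffices here; on hardware the fidelities would be estimated from repeated shots.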
Noise mitigation and reduction will be crucial for obtaining useful answers from near-term quantum computers. In this work, we present a general framework based on machine learning for reducing the impact of quantum hardware noise on quantum circuits. Our method, called noise-aware circuit learning (NACL), applies to circuits designed to compute a unitary transformation, prepare a set of quantum states, or estimate an observable of a many-qubit state. Given a task and a device model that captures information about the noise and connectivity of qubits in a device, NACL outputs an optimized circuit to accomplish this task in the presence of noise. It does so by minimizing a task-specific cost function over circuit depths and circuit structures. To demonstrate NACL, we construct circuits resilient to a fine-grained noise model derived from gate set tomography on a superconducting-circuit quantum device, for applications including quantum state overlap, quantum Fourier transform, and W-state preparation.
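As a caricature of NACL's core loop, the sketch below minimizes a task infidelity evaluated under a noise model and scans circuit depth to expose the expressiveness-versus-noise trade-off. It is pure NumPy/SciPy; the single-qubit state-preparation task and per-gate depolarizing channel are our own stand-ins for the paper's gate-set-tomography-derived device model.

```python
import numpy as np
from scipy.optimize import minimize

I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]])

def rx(t): return np.cos(t / 2) * I2 - 1j * np.sin(t / 2) * X
def rz(t): return np.diag([np.exp(-1j * t / 2), np.exp(1j * t / 2)])

P_DEPOL = 0.02                              # assumed per-gate noise strength
TARGET = np.array([1, 1j]) / np.sqrt(2)     # task: prepare |+i>

def apply_noisy(rho, U):
    """Unitary followed by a single-qubit depolarizing channel."""
    rho = U @ rho @ U.conj().T
    return (1 - P_DEPOL) * rho + P_DEPOL * I2 / 2

def infidelity(params):
    rho = np.diag([1.0, 0.0]).astype(complex)        # start in |0><0|
    for tz, tx in zip(params[::2], params[1::2]):    # alternating Rz/Rx layers
        rho = apply_noisy(rho, rz(tz))
        rho = apply_noisy(rho, rx(tx))
    return 1 - np.real(TARGET.conj() @ rho @ TARGET)

# Deeper circuits are more expressive but accumulate more noise; scanning
# depths exposes the trade-off that a noise-aware search exploits.
for layers in (1, 2, 4):
    trials = [minimize(infidelity, np.random.uniform(0, 2 * np.pi, 2 * layers),
                       method="Nelder-Mead") for _ in range(5)]
    print(layers, "layers -> infidelity", round(min(t.fun for t in trials), 4))
```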
We generalize the quantum circuits for the Toffoli gate presented by Selinger and Jones to functionally controlled NOT gates, i.e., $X$ gates controlled by arbitrary $n$-variable Boolean functions. Our constructions target the gate set consisting of Clifford gates and single-qubit rotations by arbitrary angles. They use the Walsh-Hadamard spectrum of Boolean functions and build on work by Schuch and Siewert and by Welch et al. We present quantum circuits for the case where the target qubit is in an arbitrary state, as well as for the special case where the target is in a known state. Additionally, we present constructions that require no auxiliary qubits and constructions that have a rotation depth of 1.
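The Walsh-Hadamard spectrum that drives such constructions is easy to compute: in the sign representation $(-1)^{f(x)}$, expanding $f$ over parity functions yields one $R_z$-type rotation per nonzero coefficient, in the spirit of Welch et al. The sketch below (our own plain-Python illustration) computes the spectrum with the fast Walsh-Hadamard transform.

```python
import numpy as np

def walsh_hadamard_spectrum(f_table):
    """Spectrum of (-1)^f via the fast Walsh-Hadamard transform."""
    s = [(-1) ** b for b in f_table]          # sign representation of f
    n = len(s)
    h = 1
    while h < n:                              # in-place butterfly passes
        for i in range(0, n, 2 * h):
            for j in range(i, i + h):
                s[j], s[j + h] = s[j] + s[j + h], s[j] - s[j + h]
        h *= 2
    return np.array(s) / n                    # normalized coefficients

# Example: f = AND of two variables (the controlled-Z target function).
# Nonzero coefficients on the constant, x1, x2, and x1 XOR x2 parities
# reproduce the familiar decomposition of CZ into phase rotations and CNOTs.
print(walsh_hadamard_spectrum([0, 0, 0, 1]))  # -> [ 0.5  0.5  0.5 -0.5]
```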
The execution of quantum circuits on real systems has largely been limited to those which are simply time-ordered sequences of unitary operations followed by a projective measurement. As hardware platforms for quantum computing continue to mature in size and capability, it is imperative to enable quantum circuits beyond their conventional construction. Here we break into the realm of dynamic quantum circuits on a superconducting-based quantum system. Dynamic quantum circuits involve not only the evolution of the quantum state throughout the computation, but also periodic measurements of a subset of qubits mid-circuit and concurrent processing of the resulting classical information within timescales shorter than the execution times of the circuits. Using noisy quantum hardware, we explore one of the most fundamental quantum algorithms, quantum phase estimation, in its adaptive version, which exploits dynamic circuits, and compare the results to a non-adaptive implementation of the same algorithm. We demonstrate that real-time quantum computing with dynamic circuits can offer a substantial and tangible advantage when noise and latency are sufficiently low in the system, opening the door to a new realm of algorithms available on real quantum systems.
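The adaptive algorithm's feedback structure can be seen in a toy model: one ancilla is measured mid-circuit in each round, and the outcome classically conditions the phase correction applied in the next round. The pure-NumPy sketch below simulates this iterative phase-estimation loop under our own simplifying assumptions (noiseless evolution, an exact eigenstate, a 6-bit phase); it is not the paper's hardware experiment.

```python
import numpy as np

PHASE = 27 / 64    # true eigenphase, exactly representable in 6 bits
BITS = 6

def measure_round(k, feedback):
    """One round: H, controlled-U^(2^k), Rz(-feedback), H, measure ancilla."""
    # Relative phase on the ancilla after controlled evolution and feedback:
    rel = 2 * np.pi * (2 ** k) * PHASE - feedback
    p0 = np.cos(rel / 2) ** 2             # P(ancilla = 0) after the final H
    return 0 if np.random.rand() < p0 else 1

bits, feedback = [], 0.0
for k in reversed(range(BITS)):           # least-significant bit first
    bit = measure_round(k, feedback)
    bits.append(bit)
    feedback = feedback / 2 + (np.pi / 2) * bit   # classical feed-forward
estimate = sum(b / 2 ** (i + 1) for i, b in enumerate(reversed(bits)))
print("estimate:", estimate, "true:", PHASE)
```

Because the phase is exactly representable here, every round's outcome is deterministic and the estimate is exact; noise and feed-forward latency are precisely what degrade this loop on hardware.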
Machine learning techniques have led to broad adoption of a statistical model of computing. The statistical distributions natively available on quantum processors are a superset of those available classically. Harnessing this attribute has the potential to accelerate or otherwise improve machine learning relative to purely classical performance. A key challenge toward that goal is learning to hybridize classical computing resources and traditional learning techniques with the emerging capabilities of general-purpose quantum processors. Here, we demonstrate such hybridization by training a 19-qubit gate-model processor to solve a clustering problem, a foundational challenge in unsupervised learning. We use the quantum approximate optimization algorithm in conjunction with gradient-free Bayesian optimization to train the quantum machine. This quantum/classical hybrid algorithm shows robustness to realistic noise, and we find evidence that classical optimization can be used to train around both coherent and incoherent imperfections.
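A scaled-down version of this loop is straightforward to write: clustering is cast as MaxCut, a depth-1 QAOA statevector is simulated, and a gradient-free optimizer tunes the two angles. In the sketch below (NumPy/SciPy, our own toy), Nelder-Mead stands in for the Bayesian optimizer used in the experiment.

```python
import numpy as np
from itertools import product
from scipy.optimize import minimize

EDGES = [(0, 1), (1, 2), (2, 3), (3, 0)]   # toy graph; clustering as MaxCut
N = 4

# Cut value C(z) for every computational basis state z.
zs = list(product([0, 1], repeat=N))
cut = np.array([sum(z[i] != z[j] for i, j in EDGES) for z in zs], float)

def apply_rx_all(psi, beta):
    """Mixer layer: Rx(2*beta) on every qubit of the statevector."""
    Rx = np.array([[np.cos(beta), -1j * np.sin(beta)],
                   [-1j * np.sin(beta), np.cos(beta)]])
    for q in range(N):
        psi = psi.reshape(2 ** (N - 1 - q), 2, 2 ** q)
        psi = np.einsum('ab,ibj->iaj', Rx, psi).reshape(-1)
    return psi

def neg_expected_cut(params):
    gamma, beta = params
    psi = np.full(2 ** N, 2 ** (-N / 2), complex)   # uniform |+>^N
    psi = psi * np.exp(-1j * gamma * cut)           # cost layer (diagonal)
    psi = apply_rx_all(psi, beta)                   # mixer layer
    return -np.real(np.abs(psi) ** 2 @ cut)

# Gradient-free outer loop; a Bayesian optimizer matters more when each
# evaluation is a noisy hardware run rather than an exact simulation.
trials = [minimize(neg_expected_cut, np.random.uniform(0, np.pi, 2),
                   method="Nelder-Mead") for _ in range(10)]
best = min(trials, key=lambda r: r.fun)
print("expected cut:", round(-best.fun, 3), "max cut:", cut.max())
```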
Machine learning is seen as a promising application of quantum computation. For near-term noisy intermediate-scale quantum (NISQ) devices, parametrized quantum circuits (PQCs) have been proposed as machine learning models due to their robustness and ease of implementation. However, the cost function is normally calculated classically from repeated measurement outcomes, so that it is no longer encoded in a quantum state. This prevents the value from being directly manipulated by a quantum computer. To solve this problem, we give a routine that embeds the machine-learning cost function into a quantum circuit, accepting a training dataset encoded in superposition or in an easily preparable mixed state. We also demonstrate the ability to evaluate the gradient of the encoded cost function in a quantum state.
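The underlying mechanism, keeping a scalar such as a cost value in amplitudes rather than in classical measurement statistics, can be illustrated with a textbook Hadamard test, which writes $\mathrm{Re}\langle\psi|U|\psi\rangle$ into an ancilla where later gates could act on it coherently. The sketch below is this generic construction, not the paper's specific embedding routine.

```python
import numpy as np

def hadamard_test_p0(psi, U):
    """P(ancilla = 0) after H, controlled-U, H equals (1 + Re<psi|U|psi>)/2."""
    branch0 = (psi + U @ psi) / 2    # ancilla-|0> branch of the final state
    return float(np.real(np.vdot(branch0, branch0)))

# Example: keep the cost Re<psi|Z|psi> in the ancilla amplitude rather than
# reading out <Z> classically; later circuitry could process it coherently.
psi = np.array([np.cos(0.3), np.sin(0.3)])
Z = np.diag([1.0, -1.0])
p0 = hadamard_test_p0(psi, Z)
print("encoded cost:", 2 * p0 - 1, "direct:", psi @ Z @ psi)
```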