Quantum algorithms for Noisy Intermediate-Scale Quantum (NISQ) machines have recently emerged as promising routes towards demonstrating near-term quantum advantage (or supremacy) over classical systems. In these systems, samples are typically drawn from probability distributions which --- under plausible complexity-theoretic conjectures --- cannot be efficiently generated classically. Rather than first defining a physical system and then determining computational features of the output state, we ask the converse question: given direct access to the quantum state, what features of the generating system can we efficiently learn? In this work we introduce the Variational Quantum Unsampling (VQU) protocol, a nonlinear quantum neural network approach for verification and inference of near-term quantum circuit outputs. In our approach one can variationally train a quantum operation to unravel the action of an unknown unitary on a known input state, essentially learning the inverse of the black-box quantum dynamics. While the principle of our approach is platform independent, its implementation will depend on the unique architecture of a specific quantum processor. Here, we experimentally demonstrate the VQU protocol on a quantum photonic processor. Alongside quantum verification, our protocol has broad applications, including optimal quantum measurement and tomography, quantum sensing and imaging, and ansatz validation.
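The unsampling idea --- variationally training an operation V(θ) so that V(θ)U maps the known input state back to itself, so that V approximates U† --- can be illustrated with a toy single-qubit numpy sketch. The fixed unitary, the one-parameter RY ansatz, and the grid search below are illustrative assumptions, not the paper's experimental setup:

```python
import numpy as np

def ry(theta):
    # Single-qubit rotation about the Y axis
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

U = ry(1.2)                      # "black-box" unitary (hypothetical)
psi_in = np.array([1.0, 0.0])    # known input state |0>

def cost(theta):
    # Infidelity between V(theta) U |0> and the known input |0>
    out = ry(theta) @ U @ psi_in
    return 1.0 - abs(np.vdot(psi_in, out)) ** 2

# Coarse variational search over the one-parameter ansatz;
# the minimum sits near theta = -1.2, i.e. V(theta) = U†
thetas = np.linspace(-np.pi, np.pi, 2001)
best = thetas[np.argmin([cost(t) for t in thetas])]
```

In practice the ansatz has many parameters and the cost is estimated from measurement samples rather than computed exactly, but the training target is the same infidelity.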
Solving finite-temperature properties of quantum many-body systems is generally challenging for classical computers due to the high computational complexity involved. In this article, we present experiments to demonstrate a hybrid quantum-classical simulation of thermal quantum states. By combining a classical probabilistic model and a 5-qubit programmable superconducting quantum processor, we prepare Gibbs states and excited states of Heisenberg XY and XXZ models with high fidelity and compute thermal properties including the variational free energy, energy, and entropy with a small statistical error. Our approach combines the advantage of classical probabilistic models for sampling with that of quantum co-processors for unitary transformations. We show that the approach is scalable in the number of qubits and has a self-verifiable feature, revealing its potential in solving large-scale quantum statistical mechanics problems on near-term intermediate-scale quantum computers.
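The variational quantity at the heart of such hybrid schemes, the free energy F = ⟨E⟩ − TS, can be sketched classically for a toy spectrum. The four energies and the temperature below are made-up numbers, and the exact Gibbs distribution stands in for the trained probabilistic model:

```python
import numpy as np

# Energies of hypothetical eigenstates (in the experiment these would be
# measured through the quantum co-processor)
energies = np.array([-1.0, 0.5, 0.5, 1.0])
T = 0.7  # temperature, with k_B = 1

def free_energy(p):
    # Variational free energy F = <E> - T*S of a classical distribution p
    entropy = -np.sum(p * np.log(p))
    return np.dot(p, energies) - T * entropy

# F is minimized by the Gibbs distribution p_i ∝ exp(-E_i / T),
# where it equals the exact free energy -T*ln(Z)
gibbs = np.exp(-energies / T)
gibbs /= gibbs.sum()
uniform = np.full(4, 0.25)
```

The self-verifiable feature mentioned above relies on exactly this variational bound: any trial distribution gives F at or above the true free energy, so a lower value certifies a better approximation.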
In the near term, hybrid quantum-classical algorithms hold great potential for outperforming classical approaches. Understanding how these two computing paradigms work in tandem is critical for identifying areas where such hybrid algorithms could provide a quantum advantage. In this work, we study a QAOA-based quantum optimization algorithm by implementing the Variational Quantum Factoring (VQF) algorithm. We carry out experimental demonstrations on a superconducting quantum processor and investigate the trade-off between quantum resources (number of qubits and circuit depth) and the probability that a given biprime is successfully factored. In our experiments, the integers 1099551473989, 3127, and 6557 are factored with 3, 4, and 5 qubits, respectively, using a QAOA ansatz with up to 8 layers, and we identify the optimal number of circuit layers for a given instance to maximize success probability. Furthermore, we demonstrate the impact of different noise sources on the performance of QAOA and reveal that coherent error caused by residual ZZ-coupling between qubits is a dominant source of error in the superconducting quantum processor.
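The optimization target behind VQF --- encoding a factoring instance as a cost function whose minimum reveals the factors --- can be sketched with a classical brute-force stand-in for the quantum search. The 6-bit factor registers and the exhaustive search below are illustrative stand-ins; the experiment minimizes an equivalent Ising encoding with QAOA on far fewer variables:

```python
from itertools import product

N = 3127  # one of the biprimes factored in the experiment

def cost(p_bits, q_bits):
    # Clause energy (N - p*q)^2 whose zero-energy assignments encode
    # the factors; QAOA searches bit assignments minimizing it.
    p = sum(b << i for i, b in enumerate(p_bits))
    q = sum(b << i for i, b in enumerate(q_bits))
    return (N - p * q) ** 2

# Brute-force the two 6-bit factor registers (both factors of 3127
# fit in 6 bits) as a classical stand-in for the quantum optimizer.
best = min(product(product([0, 1], repeat=6), repeat=2),
           key=lambda pq: cost(*pq))
p = sum(b << i for i, b in enumerate(best[0]))
q = sum(b << i for i, b in enumerate(best[1]))
```

The number of binary variables, and hence qubits, grows with the bit-length of the unknown factors, which is why classical simplification of the clauses before the quantum step matters for the qubit counts quoted above.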
Photonic processors are pivotal for both quantum and classical information processing tasks using light. In particular, linear optical quantum information processing requires both large-scale and low-loss programmable photonic processors. In this paper, we report the demonstration of the largest universal quantum photonic processor to date: a low-loss, 12-mode, fully tunable linear interferometer with all-to-all coupling based on stoichiometric silicon nitride waveguides.
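Universal programmable interferometers of this kind are typically built as meshes of tunable two-mode Mach-Zehnder unit cells, with enough cells to realize any unitary on the modes. A minimal numpy sketch of one such cell follows; the coupler convention and phase placement are illustrative assumptions, not the device's actual layout:

```python
import numpy as np

def mzi(theta, phi):
    # Transfer matrix of a tunable Mach-Zehnder cell: an input phase
    # shifter and an internal phase shifter between two 50:50 couplers.
    bs = np.array([[1, 1j], [1j, 1]]) / np.sqrt(2)  # 50:50 coupler
    return (bs
            @ np.diag([np.exp(1j * theta), 1])  # internal phase
            @ bs
            @ np.diag([np.exp(1j * phi), 1]))   # input phase

T = mzi(0.4, 1.1)  # one programmed cell; T is a 2x2 unitary
```

Sweeping the two phases steers light arbitrarily between the cell's two modes, and a triangular or rectangular mesh of such cells spans the full 12-mode unitary group.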
A quantum processor to import, process, and export optical quantum states is a common core technology enabling a variety of photonic quantum information processing tasks. However, no photonic processor to date has been simultaneously universal, scalable, and programmable. Here, we report an original loop-based, single-mode, versatile photonic quantum processor designed to meet all three requirements. Our processor can perform arbitrarily many steps of programmable quantum operations on a given single-mode optical quantum state by time-domain processing in a dynamically controlled loop-based optical circuit. We use this processor to demonstrate programmable single-mode Gaussian gates and multi-step squeezing gates. In addition, we prove that the processor can perform universal quantum operations by injecting appropriate ancillary states, and that it can be straightforwardly extended to a multi-mode processor. These results show that our processor is programmable, scalable, and potentially universal, making it suitable for general-purpose applications.
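The multi-step squeezing demonstration can be sketched at the level of symplectic matrices on the quadratures (x, p): repeated passes through the loop compose single-mode squeezers into one stronger squeezer. The gate values below are illustrative, and the sketch ignores loss and the time-domain control:

```python
import numpy as np

def squeeze(r):
    # Symplectic matrix of a single-mode squeezing gate on (x, p):
    # x is squeezed by e^{-r}, p anti-squeezed by e^{r}
    return np.diag([np.exp(-r), np.exp(r)])

# Four loop round-trips, each applying squeeze(0.3), compose into a
# single squeezer with four times the squeezing parameter
S = np.linalg.matrix_power(squeeze(0.3), 4)
```

This additivity of the squeezing parameter under composition is what makes a re-circulating loop attractive: one physical gate, applied repeatedly, programs a whole family of operation strengths.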
Entanglement is a fundamental property of quantum mechanics, and is a primary resource in quantum information systems. Its manipulation remains a central challenge in the development of quantum technology. In this work, we demonstrate a device which can generate, manipulate, and analyse two-qubit entangled states, using miniature and mass-manufacturable silicon photonics. By combining four photon-pair sources with a reconfigurable six-mode interferometer, embedding a switchable entangling gate, we generate two-qubit entangled states, manipulate their entanglement, and analyse them, all in the same silicon chip. Using quantum state tomography, we show how our source can produce a range of entangled and separable states, and how our switchable controlled-Z gate operates on them, entangling them or making them separable depending on its configuration.
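The switchable entangling gate's effect can be sketched in numpy: a controlled-Z applied to the separable input |+⟩|+⟩ yields a maximally entangled state, which a Schmidt decomposition detects. This is a toy state-vector sketch, not a model of the photonic implementation:

```python
import numpy as np

plus = np.array([1.0, 1.0]) / np.sqrt(2)
state = np.kron(plus, plus)      # separable two-qubit input |+>|+>

CZ = np.diag([1.0, 1.0, 1.0, -1.0])  # controlled-Z gate

def schmidt_number(psi):
    # Number of nonzero Schmidt coefficients across the two-qubit cut;
    # 1 means the state is separable, 2 means it is entangled
    s = np.linalg.svd(psi.reshape(2, 2), compute_uv=False)
    return int(np.sum(s > 1e-9))

# Gate off: the state stays separable (Schmidt number 1).
# Gate on: CZ|+>|+> has two equal Schmidt coefficients, i.e. it is
# maximally entangled.
```

The tomography reported above is the experimental counterpart of this check: reconstructing the two-qubit density matrix and verifying whether the gate configuration left the state separable or entangled.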