As quantum computers improve in qubit count and fidelity, the question of when they surpass state-of-the-art classical computation for a well-defined computational task is attracting much attention. The leading candidate task for this milestone entails sampling from the output distribution defined by a random quantum circuit. We develop a massively parallel simulation tool, Rollright, that requires neither inter-process communication (IPC) nor proprietary hardware. We also develop two ways to trade circuit fidelity for computational speedups, so as to match the fidelity of a given quantum computer, a task previously thought impossible. We report massive speedups for the sampling task over prior software from Microsoft, IBM, Alibaba and Google, as well as over supercomputer- and GPU-based simulations. Using publicly available Google Cloud computing, we price such simulations and enable comparisons by total cost across hardware platforms. We simulate approximate sampling from the output of a circuit with 7x8 qubits and depth 1+40+1 by producing one million bitstring probabilities with fidelity 0.5%, at an estimated cost of $35,184. The simulation costs scale linearly with fidelity, and using this scaling we estimate that extending the circuit depth to 1+48+1 increases the cost to one million dollars. Scaling the simulation to the 10M bitstring probabilities needed for sampling 1M bitstrings enables a direct comparison between simulation and quantum computers. We also describe benchmark refinements that slow down leading simulators, halving the circuit depth they can simulate within the same time.
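The quoted prices follow the linear-in-fidelity cost scaling stated above. As a minimal Python sketch of that model, anchored only to the $35,184 figure for fidelity 0.5% at depth 1+40+1 (the function and constant names are illustrative, not from the paper):

COST_AT_HALF_PERCENT = 35184.0  # USD for 1M bitstring probabilities at fidelity 0.5%, depth 1+40+1

def estimated_cost_usd(fidelity: float) -> float:
    # Cost scales linearly with the target fidelity, per the scaling reported above.
    return COST_AT_HALF_PERCENT * (fidelity / 0.005)

print(estimated_cost_usd(0.005))  # 35184.0, the quoted anchor point
print(estimated_cost_usd(0.010))  # 70368.0: doubling the fidelity doubles the estimated cost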
As Moore's law reaches its limits, quantum computers are emerging with the promise of dramatically outperforming classical computers. We have witnessed the advent of quantum processors with over $50$ quantum bits (qubits), which are expected to be beyond the reach of classical simulation. Quantum supremacy is the event at which the old Extended Church-Turing Thesis is overturned: a quantum computer performs a task that is practically impossible for any classical (super)computer. The demonstration requires both a solid theoretical guarantee and an experimental realization. The leading candidate is Random Circuit Sampling (RCS), the task of sampling from the output distribution of random quantum circuits. Google recently announced a $53$-qubit experimental demonstration of RCS. Soon after, classical algorithms appeared that challenge the supremacy of random circuits by estimating their outputs. How hard is it to classically simulate the output of random quantum circuits? We prove that estimating the output probabilities of random quantum circuits is formidably hard ($\#P$-hard) for any classical computer. This makes RCS the strongest candidate for demonstrating quantum supremacy relative to all other proposals. The robustness to estimation error that we prove may serve as a new hardness criterion for the performance of classical algorithms. To achieve this, we introduce the Cayley path interpolation between any two gates of a quantum computation and combine recent advances in quantum complexity and information with tools from probability and random matrix theory. Furthermore, we apply algebraic geometry to generalize the well-known Berlekamp-Welch algorithm, which is widely used in coding theory and cryptography. Our results imply that there is an exponential hardness barrier to the classical simulation of most quantum circuits.
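For orientation, here is a hedged sketch of the Cayley construction; the precise normalization in the paper may differ, but the underlying fact is the standard Cayley correspondence between Hermitian and unitary matrices. Given unitary gates $h_0$ and $h_1$, choose a Hermitian $M$ with $(\mathbb{1}-iM)(\mathbb{1}+iM)^{-1} = h_0^\dagger h_1$ (possible whenever $-1$ is not an eigenvalue of $h_0^\dagger h_1$) and set
$$ h(\theta) = h_0\,(\mathbb{1} - i\theta M)(\mathbb{1} + i\theta M)^{-1}, \qquad \theta \in [0,1], $$
so that $h(0) = h_0$, $h(1) = h_1$, every $h(\theta)$ is unitary, and each matrix entry of $h(\theta)$ is a low-degree rational function of $\theta$. It is this algebraic structure that lets Berlekamp-Welch-style polynomial interpolation be brought to bear.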
This paper answers Bell's question: what does quantum information refer to? It refers to quantum properties represented by subspaces of the quantum Hilbert space, or their projectors, to which standard (Kolmogorov) probabilities can be assigned by using a projective decomposition of the identity (PDI, or framework) as a quantum sample space. The single-framework rule of consistent histories prevents paradoxes and contradictions. When only one framework is employed, classical (Shannon) information theory can be imported unchanged into the quantum domain. A particular case is the macroscopic world of classical physics, whose quantum description needs only a single quasiclassical framework. Nontrivial issues unique to quantum information, those with no classical analog, arise when aspects of two or more incompatible frameworks are compared.
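Concretely, a PDI is a family of orthogonal projectors $\{P_j\}$ on the Hilbert space satisfying
$$ \sum_j P_j = I, \qquad P_j P_k = \delta_{jk} P_j, $$
and, within the single framework it defines, a density operator $\rho$ assigns the ordinary Kolmogorov probabilities $p_j = \mathrm{Tr}(\rho P_j)$ to its elements. This is the sense in which a PDI serves as a quantum sample space.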
Entanglement has long stood as one of the characteristic features of quantum mechanics, yet recent developments have emphasized the importance of quantumness beyond entanglement for quantum foundations and technologies. We demonstrate that entanglement cannot entirely capture the worst-case sensitivity in quantum interferometry when quantum probes are used to estimate the phase imprinted by a Hamiltonian, with fixed energy levels but variable eigenbasis, acting on one arm of an interferometer. This is shown by defining a bipartite entanglement monotone tailored to this interferometric setting and proving that it never exceeds the so-called interferometric power, a quantity that relies on more general quantum correlations beyond entanglement and captures the relevant resource. We then prove that the interferometric power can never increase when local commutativity-preserving operations are applied to qubit probes, an important step towards validating this quantity as a genuine quantum correlations monotone. These findings are accompanied by a room-temperature nuclear magnetic resonance experimental investigation, in which two-qubit states with extremal (maximal and minimal) interferometric power at fixed entanglement are produced and characterized.
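For reference, a sketch of the standard definition from the quantum metrology literature (normalization conventions vary): the interferometric power of a bipartite state $\rho_{AB}$, with the probe on subsystem $A$, is the worst-case quantum Fisher information over the allowed local generators,
$$ \mathcal{P}^{A}(\rho_{AB}) = \tfrac{1}{4}\, \min_{H_A} \mathcal{F}\big(\rho_{AB};\, H_A \otimes \mathbb{1}_B\big), $$
where the minimum runs over Hamiltonians $H_A$ with fixed, nondegenerate energy levels but arbitrary eigenbasis, and $\mathcal{F}$ is the quantum Fisher information governing the precision with which the phase imprinted by $e^{-i\phi H_A}$ can be estimated.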
Operational frameworks are very useful for studying the foundations of quantum mechanics, and they are sometimes used to promote antirealist attitudes towards the theory. The aim of this paper is to review three arguments that aim to defend an antirealist reading of quantum physics based on various developments of standard quantum mechanics appealing to notions such as quantum information, non-causal correlations and indefinite causal orders. These arguments are discussed in order to show that they are not convincing. Instead, it is argued that no argument based solely on formal features of a theory can favour realist or antirealist attitudes towards quantum mechanics. In particular, both realist and antirealist views are readily accommodated within operational formulations of the theory. The reason for this is that the realist/antirealist debate is located at a purely epistemic level, which is not engaged by the formal aspects of theories. As such, operational formulations of quantum mechanics are epistemologically and ontologically neutral. This discussion aims to clarify the limits of the historical and methodological affinities between scientific antirealism and operational physics while engaging with recent discoveries in quantum foundations. It also aims to present various realist strategies to account for those developments.
We show that quantum mechanics is the first theory in human history that violates the basic a priori principles that have shaped human thought since time immemorial. Therefore, although it is more contrary to magic than any body of knowledge could be, what might be called its magic resides precisely in this violation.