We introduce a reliable compressive procedure to uniquely characterize any given low-rank quantum measurement using a minimal set of probe states, based solely on data collected from the unknown measurement itself. The procedure is most compressive when the measurement constitutes pure detection outcomes, requiring only an informationally complete number of probe states that scales linearly with the system dimension. We argue, and provide numerical evidence, that owing to the quantum constraint the minimal number of probe states needed is generally even smaller than the numbers known for the closely related classical phase-retrieval problem. We also present affirmative results from polarization experiments that illustrate significant compressive behavior for both two- and four-qubit detectors using only random product probe states.
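A minimal sketch, under stated assumptions, of the baseline this abstract improves on: reconstructing a single detector outcome Pi from probe-state statistics p_k = Tr(rho_k Pi) by plain linear inversion with d^2 random pure probes. The function names and the random-probe choice are illustrative; the paper's compressive procedure exploits the low rank of Pi to get by with far fewer probes.

```python
# Hypothetical baseline sketch: linear-inversion reconstruction of one POVM
# outcome Pi from probe-state probabilities p_k = Tr(rho_k Pi), using an
# informationally complete set of d^2 random pure probes. The compressive
# scheme described in the abstract needs far fewer probes for low-rank Pi.
import numpy as np

def random_pure_state(d, rng):
    """Random pure state as a d x d density matrix."""
    psi = rng.normal(size=d) + 1j * rng.normal(size=d)
    psi /= np.linalg.norm(psi)
    return np.outer(psi, psi.conj())

def reconstruct_povm_element(probes, probs):
    """Least-squares estimate of Pi from p_k = Tr(rho_k Pi)."""
    d = probes[0].shape[0]
    # For Hermitian rho, Tr(rho Pi) = <vec(rho), vec(Pi)>.
    A = np.array([rho.conj().ravel() for rho in probes])
    x, *_ = np.linalg.lstsq(A, probs.astype(A.dtype), rcond=None)
    Pi = x.reshape(d, d)
    return (Pi + Pi.conj().T) / 2  # enforce Hermiticity

rng = np.random.default_rng(0)
d = 4
true_Pi = 0.7 * random_pure_state(d, rng)            # a rank-1 ("pure") outcome
probes = [random_pure_state(d, rng) for _ in range(d * d)]
probs = np.array([np.trace(rho @ true_Pi).real for rho in probes])
est = reconstruct_povm_element(probes, probs)
print(np.linalg.norm(est - true_Pi))                  # ~0 for noiseless data
```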
High-dimensional encoding of quantum information provides a promising method of transcending current limitations in quantum communication. One of the central challenges in the pursuit of such an approach is the certification of high-dimensional entanglement. In particular, it is desirable to do so without resorting to inefficient full state tomography. Here, we show how carefully constructed measurements in two bases (one of which is not orthonormal) can be used to faithfully and efficiently certify bipartite high-dimensional states and their entanglement for any physical platform. To showcase the practicality of this approach under realistic conditions, we put it to the test for photons entangled in their orbital angular momentum. In our experimental setup, we are able to verify 9-dimensional entanglement for a pair of photons on an 11-dimensional subspace each, the highest entanglement dimensionality certified to date without any assumptions on the state.
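As a hedged illustration of the dimensionality statement (not the paper's two-basis protocol), the standard fidelity witness converts an estimated fidelity F with a maximally entangled state into a certified Schmidt number: F > k/d implies entanglement dimensionality at least k+1. The numerical fidelity below is invented purely for illustration.

```python
# Illustrative sketch (not the paper's protocol): fidelity-based dimensionality
# witness. If the fidelity F of rho with a d-dimensional maximally entangled
# state exceeds k/d, the Schmidt number of rho is at least k+1.
import numpy as np

def certified_schmidt_number(F, d):
    """Largest k such that F > (k-1)/d, i.e. Schmidt number >= k."""
    return int(np.floor(F * d - 1e-12)) + 1

# Assumed fidelity of 0.80 on an 11-dimensional subspace (hypothetical value)
print(certified_schmidt_number(0.80, 11))   # -> 9, i.e. 9-dimensional entanglement
```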
The power of quantum computers relies on the capability of their components to faithfully maintain and accurately process quantum information. Since this property eludes classical certification methods, fundamentally new protocols are required to guarantee that elementary components are suitable for quantum computation. These protocols must be device-independent, that is, they cannot rely on a particular physical description of the actual implementation if one is to qualify a block for all possible usages. Bell's theorem has been proposed to certify, in a device-independent way, blocks either producing or measuring quantum states. In this manuscript, we provide the missing piece: a method based on Bell's theorem to certify coherent operations such as storage, processing and transfer of quantum information. This completes the set of tools needed to certify all building blocks of a quantum computer. Our method is robust to experimental imperfections, and so can be readily used to certify that today's quantum devices are qualified for usage in future quantum computers.
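A minimal, assumption-laden illustration of the Bell-theorem certification this work builds on: evaluating the CHSH expression for a two-qubit singlet at the standard optimal angles. A value beyond the classical bound of 2 certifies source and measurements device-independently; certifying coherent operations, the contribution of this paper, requires more than this sketch. Function names and angle choices are illustrative.

```python
# Minimal CHSH illustration: the singlet at optimal settings violates the
# classical bound 2, reaching the quantum maximum 2*sqrt(2) ~ 2.828.
import numpy as np

def pauli_meas(theta):
    """Spin measurement cos(theta) Z + sin(theta) X."""
    Z = np.array([[1, 0], [0, -1]], dtype=complex)
    X = np.array([[0, 1], [1, 0]], dtype=complex)
    return np.cos(theta) * Z + np.sin(theta) * X

def chsh_value(state, a0, a1, b0, b1):
    """<A0 B0> + <A0 B1> + <A1 B0> - <A1 B1> for a two-qubit pure state."""
    def corr(A, B):
        return np.real(state.conj() @ np.kron(A, B) @ state)
    return corr(a0, b0) + corr(a0, b1) + corr(a1, b0) - corr(a1, b1)

singlet = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)
A0, A1 = pauli_meas(0), pauli_meas(np.pi / 2)
B0, B1 = pauli_meas(np.pi / 4), pauli_meas(-np.pi / 4)
print(abs(chsh_value(singlet, A0, A1, B0, B1)))   # ~2.828 > classical bound 2
```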
In response to recent criticisms by Okon and Sudarsky, various aspects of the consistent histories (CH) resolution of the quantum measurement problem(s) are discussed using a simple Stern-Gerlach device, and compared with the alternative approaches to the measurement problem provided by spontaneous localization (GRW), Bohmian mechanics, many worlds, and standard (textbook) quantum mechanics. Among these, CH is unique in solving the second measurement problem: inferring from the measurement outcome a property of the measured system at a time before the measurement took place, as is done routinely by experimental physicists. The main respect in which CH differs from other quantum interpretations is in allowing multiple stochastic descriptions of a given measurement situation, from which one (or more) can be selected on the basis of its utility. This requires abandoning a principle (termed unicity), central to classical physics, that at any instant of time there is only a single correct description of the world.
The resources needed to conventionally characterize a quantum system are overwhelmingly large for high-dimensional systems. This obstacle may be overcome by abandoning traditional cornerstones of quantum measurement, such as general quantum states, strong projective measurement, and assumption-free characterization. Following this reasoning, we demonstrate an efficient technique for characterizing high-dimensional, spatial entanglement with one set of measurements. We recover sharp distributions with local, random filtering of the same ensemble in momentum followed by position---something the uncertainty principle forbids for projective measurements. Exploiting the expectation that entangled signals are highly correlated, we use fewer than 5,000 measurements to characterize a 65,536-dimensional state. Finally, we use entropic inequalities to witness entanglement without a density matrix. Our method represents the sea change unfolding in quantum measurement, where methods influenced by the information-theory and signal-processing communities replace unscalable, brute-force techniques---a progression previously followed by classical sensing.
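A rough sketch of an entropic witness of the kind mentioned in the closing sentences, with assumed binning and bound: conditional Shannon entropies are estimated directly from position and momentum coincidence histograms, and a sum below log2(d) (the mutually-unbiased-basis bound satisfied by separable states) flags entanglement without reconstructing a density matrix. Function names and the toy data are illustrative.

```python
# Hypothetical entropic entanglement witness: estimate H(X_A|X_B) + H(K_A|K_B)
# from coincidence histograms; for separable states measured in mutually
# unbiased d-outcome bases this sum is at least log2(d), so a smaller value
# witnesses entanglement with no density-matrix reconstruction.
import numpy as np

def conditional_entropy(joint_counts):
    """H(A|B) in bits from a 2D histogram of coincidence counts."""
    p = joint_counts / joint_counts.sum()
    p_b = p.sum(axis=0)                       # marginal of B
    nz = p > 0
    H_ab = -np.sum(p[nz] * np.log2(p[nz]))    # joint entropy H(A,B)
    nzb = p_b > 0
    H_b = -np.sum(p_b[nzb] * np.log2(p_b[nzb]))
    return H_ab - H_b

def witnesses_entanglement(pos_counts, mom_counts, d):
    total = conditional_entropy(pos_counts) + conditional_entropy(mom_counts)
    return total < np.log2(d)

# Toy data: perfectly correlated positions and anti-correlated momenta on a
# d-outcome grid give zero conditional entropies and trigger the witness.
d = 8
pos = np.eye(d)            # coincidences only on the diagonal
mom = np.eye(d)[::-1]      # coincidences only on the anti-diagonal
print(witnesses_entanglement(pos, mom, d))    # -> True
```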
Separability criteria are typically of the necessary, but not sufficient, variety, in that satisfying some separability criterion, such as positivity of eigenvalues under partial transpose, does not strictly imply separability. Certifying separability amounts to proving the existence of a decomposition of a target mixed state into some convex combination of separable states; determining the existence of such a decomposition is hard. We show that it is effective to ask, instead, whether the target mixed state fits some preconstructed separable form, in that one can generate a sufficient separability criterion relevant to all target states in some family by ensuring enough degrees of freedom in the preconstructed separable form. We demonstrate this technique by inducing a sufficient criterion for diagonally symmetric states of N qubits. A sufficient separability criterion opens the door to study precisely how entanglement is (not) formed; we use ours to prove that, counterintuitively, entanglement is not generated in idealized Dicke-model superradiance despite its exemplification of many-body effects. We introduce a quantification of the extent to which a given preconstructed parametrization covers the set of all separable states; for diagonally symmetric states our preconstruction is shown to be fully complete. This implies that our criterion is necessary in addition to sufficient, among other ramifications which we explore.
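For contrast with the sufficient criterion developed here, a short sketch of the necessary-but-not-sufficient test named in the first sentence, the PPT check: a negative eigenvalue under partial transposition certifies entanglement, while passing the test leaves separability undecided outside the 2x2 and 2x3 cases. Function names are illustrative.

```python
# Sketch of the PPT (partial transpose) test: a necessary separability
# criterion. A negative eigenvalue of the partially transposed state
# certifies entanglement; non-negativity does not certify separability
# in general, which is the gap a sufficient criterion addresses.
import numpy as np

def partial_transpose(rho, dA, dB):
    """Transpose subsystem B of a bipartite density matrix on dA x dB."""
    r = rho.reshape(dA, dB, dA, dB)
    return r.transpose(0, 3, 2, 1).reshape(dA * dB, dA * dB)

def is_ppt(rho, dA, dB, tol=1e-10):
    eigs = np.linalg.eigvalsh(partial_transpose(rho, dA, dB))
    return bool(np.min(eigs) >= -tol)

# Two-qubit singlet: not PPT, hence certified entangled.
psi = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)
print(is_ppt(np.outer(psi, psi.conj()), 2, 2))   # -> False
```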