A crucial subroutine for various quantum computing and communication algorithms is to efficiently extract different classical properties of quantum states. In a notable recent theoretical work by Huang, Kueng, and Preskill~\cite{huang2020predicting}, a thrifty scheme showed how to project a quantum state into classical shadows and simultaneously predict $M$ different functions of the state with only $\mathcal{O}(\log_2 M)$ measurements, independent of the system size and saturating the information-theoretic limit. Here, we experimentally explore the feasibility of the scheme in the realistic scenario with a finite number of measurements and noisy operations. We prepare a four-qubit GHZ state and show how to estimate expectation values of multiple observables and of a Hamiltonian. We compare strategies based on uniform, biased, and derandomized classical shadows to conventional ones that sequentially measure each state function, exploiting either importance sampling or observable grouping. We next demonstrate the estimation of nonlinear functions using classical shadows and analyze the entanglement of the prepared quantum state. Our experiment verifies the efficacy of exploiting (derandomized) classical shadows and sheds light on efficient quantum computing with noisy intermediate-scale quantum hardware.
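As an illustration of the measurement primitive described above, the following is a minimal numerical sketch of the classical-shadow protocol with uniformly random single-qubit Pauli bases and a median-of-means estimator, applied to a four-qubit GHZ state. The snapshot count, the number of median-of-means groups, and the observables are illustrative choices, not the settings used in the experiment.

```python
# Sketch of the classical-shadow protocol (random Pauli measurements) on a
# 4-qubit GHZ state; estimates are statistically noisy for this shot budget.
import numpy as np

rng = np.random.default_rng(0)
n = 4                                   # qubits
N_shots, K = 3000, 10                   # snapshots and median-of-means groups (illustrative)

# |psi> = (|0000> + |1111>)/sqrt(2)
psi = np.zeros(2**n, complex); psi[0] = psi[-1] = 1 / np.sqrt(2)

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], complex)
Y = np.array([[0, -1j], [1j, 0]], complex)
Z = np.diag([1.0, -1.0]).astype(complex)
Hgate = np.array([[1, 1], [1, -1]], complex) / np.sqrt(2)
ROT = {'X': Hgate, 'Y': Hgate @ np.diag([1, -1j]), 'Z': I2}   # Pauli eigenbasis -> Z basis
PAULI = {'I': I2, 'X': X, 'Y': Y, 'Z': Z}

def snapshot():
    """One classical shadow: random Pauli bases, one projective measurement."""
    bases = rng.choice(['X', 'Y', 'Z'], size=n)
    U = ROT[bases[0]]
    for b in bases[1:]:
        U = np.kron(U, ROT[b])
    probs = np.abs(U @ psi) ** 2
    outcome = rng.choice(2**n, p=probs / probs.sum())
    bits = [(outcome >> (n - 1 - q)) & 1 for q in range(n)]
    # per-qubit inverted-channel snapshot: 3 U_q^dag |b><b| U_q - I
    frags = []
    for q in range(n):
        e = np.zeros(2, complex); e[bits[q]] = 1.0
        Uq = ROT[bases[q]]
        frags.append(3 * (Uq.conj().T @ np.outer(e, e.conj()) @ Uq) - I2)
    return frags

def estimate(pauli_string, shadows):
    """Median-of-means estimate of <P_1 x ... x P_n> from stored shadows."""
    vals = np.array([np.prod([np.trace(PAULI[p] @ f).real
                              for p, f in zip(pauli_string, frags)])
                     for frags in shadows])
    return np.median([chunk.mean() for chunk in np.array_split(vals, K)])

shadows = [snapshot() for _ in range(N_shots)]
print('<ZZZZ> ~', estimate('ZZZZ', shadows))   # ideal value for the GHZ state: +1
print('<XXXX> ~', estimate('XXXX', shadows))   # ideal value for the GHZ state: +1
```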
Obtaining precise estimates of quantum observables is a crucial step of variational quantum algorithms. We consider the problem of estimating expectation values of molecular Hamiltonians on states prepared on a quantum computer. We propose a novel estimator for this task, which is locally optimised with knowledge of the Hamiltonian and a classical approximation to the underlying quantum state. Our estimator is based on the concept of classical shadows of a quantum state and has the important property of not adding to the circuit depth for the state preparation. We test its performance numerically for molecular Hamiltonians of increasing size, finding a sizable reduction in variance with respect to current measurement protocols that do not increase circuit depths.
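The sketch below illustrates the general idea behind this kind of locally optimised estimator: each qubit draws its measurement basis from a distribution biased towards the Pauli operators appearing in the Hamiltonian, and the single-shot estimate is reweighted so that it stays unbiased. The two-qubit toy Hamiltonian, the bias probabilities, and the Bell test state are assumptions for illustration only and are not taken from the paper.

```python
# Importance-weighted Pauli-basis sampling for Hamiltonian estimation
# (a "locally biased" measurement sketch); all numbers are toy values.
import numpy as np

rng = np.random.default_rng(1)
n = 2
# Toy Hamiltonian: list of (coefficient, Pauli string); 'I' marks untouched qubits.
hamiltonian = [(0.5, 'ZI'), (0.25, 'ZZ'), (0.3, 'XX')]

# Per-qubit basis probabilities, biased towards the bases the Hamiltonian needs.
beta = [{'X': 0.35, 'Y': 0.05, 'Z': 0.60},
        {'X': 0.40, 'Y': 0.05, 'Z': 0.55}]

# Toy state: |psi> = (|00> + |11>)/sqrt(2); exact <H> = 0.25 + 0.3 = 0.55
psi = np.array([1, 0, 0, 1], complex) / np.sqrt(2)

I2 = np.eye(2, dtype=complex)
Hgate = np.array([[1, 1], [1, -1]], complex) / np.sqrt(2)
ROT = {'X': Hgate, 'Y': Hgate @ np.diag([1, -1j]), 'Z': I2}

def one_shot():
    bases = [rng.choice(['X', 'Y', 'Z'], p=[b['X'], b['Y'], b['Z']]) for b in beta]
    U = np.kron(ROT[bases[0]], ROT[bases[1]])
    probs = np.abs(U @ psi) ** 2
    out = rng.choice(4, p=probs / probs.sum())
    bits = [(out >> 1) & 1, out & 1]             # qubit 0 is the most significant bit
    est = 0.0
    for coeff, pstr in hamiltonian:
        term, hit = 1.0, True
        for q, p in enumerate(pstr):
            if p == 'I':
                continue
            if bases[q] != p:                    # term not covered by this shot
                hit = False
                break
            term *= (1 - 2 * bits[q]) / beta[q][p]   # eigenvalue / sampling probability
        if hit:
            est += coeff * term
    return est

samples = np.array([one_shot() for _ in range(20000)])
print('estimate of <H>:', samples.mean(), '+/-', samples.std() / np.sqrt(len(samples)))
```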
The measurement-device-independent quantum key distribution (MDI-QKD) protocol plays an important role in quantum communications due to its high level of security and practicality. It is immune to all side-channel attacks directed at the detecting devices. However, most existing MDI-QKD schemes still impose strict requirements on state preparation, e.g., perfect state preparation or perfectly characterized sources, which are very hard to realize in practice. In this letter, we investigate uncharacterized MDI-QKD by utilizing a three-state method, greatly reducing the finite-size effect. The only requirement on state preparation is that the states are prepared in a two-dimensional Hilbert space. Furthermore, a proof-of-principle demonstration over a 170 km transmission distance is achieved, representing the longest transmission distance on record at the same security level.
Quantum polyspectra of up to fourth order are introduced for modeling and evaluating quantum transport measurements, offering a powerful alternative to traditional full counting statistics. Experimental time traces of the occupation dynamics of a single quantum dot are evaluated by simultaneously fitting their 2nd-, 3rd-, and 4th-order spectra. The scheme recovers the same electron tunneling and spin relaxation rates as previously obtained from an analysis of the same data in terms of factorial cumulants of the full counting statistics and waiting-time distributions. Moreover, the evaluation of time traces via quantum polyspectra is shown to remain feasible in the weak-measurement regime, where quantum jumps can no longer be identified in the traces and methods related to the full counting statistics cease to be applicable. A numerical study of a double-dot system shows strongly changing features in the quantum polyspectra across the transition from the weak-measurement regime to the Zeno regime, where coherent tunneling dynamics is suppressed. Quantum polyspectra thus constitute a general unifying approach to the strong and weak regimes of quantum measurement, with possible applications in fields as diverse as nano-electronics, circuit quantum electrodynamics, spin noise spectroscopy, and quantum optics.
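As a purely classical illustration of the statistics being fitted, the sketch below estimates second- and third-order spectra (power spectrum and bispectrum) from a simulated random-telegraph detector trace of the kind produced by single-dot occupation dynamics. The tunneling rates, noise level, segmenting, and normalization conventions are illustrative assumptions and do not reproduce the quantum-polyspectra formalism itself.

```python
# Segment-averaged estimates of 2nd- and 3rd-order spectra of a noisy
# random-telegraph trace; all parameters are toy values.
import numpy as np

rng = np.random.default_rng(2)
dt, n_seg, seg_len = 1e-3, 300, 256          # time step [s], segments, samples per segment
gamma_in, gamma_out = 30.0, 50.0             # toy tunneling rates [1/s]

def trace_segment(m):
    """Random-telegraph occupation trace (0/1) with added detector noise."""
    x, state = np.empty(m), 0
    for i in range(m):
        rate = gamma_in if state == 0 else gamma_out
        if rng.random() < rate * dt:
            state ^= 1
        x[i] = state
    return x + 0.2 * rng.standard_normal(m)

idx = (np.arange(seg_len)[:, None] + np.arange(seg_len)[None, :]) % seg_len
S2 = np.zeros(seg_len)
S3 = np.zeros((seg_len, seg_len), complex)
for _ in range(n_seg):
    x = trace_segment(seg_len)
    z = np.fft.fft(x - x.mean())
    S2 += np.abs(z) ** 2                                  # 2nd order: power spectrum
    S3 += z[:, None] * z[None, :] * np.conj(z[idx])       # 3rd order: <z(w1) z(w2) z*(w1+w2)>
S2 /= n_seg * seg_len
S3 /= n_seg * seg_len**2                                  # normalization conventions vary
freqs = np.fft.fftfreq(seg_len, dt)
print('power spectrum at the first few frequencies:', np.round(S2[:4], 4))
```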
We report on an experiment to detect non-classical correlations in a highly mixed state. The correlations are characterized by the quantum discord and are observed using four qubits in a liquid-state nuclear magnetic resonance quantum information processor. The state analyzed is the output of a DQC1 computation, whose input is a single quantum bit accompanied by $n$ maximally mixed qubits. This model of computation outperforms the best known classical algorithms and, although it contains vanishing entanglement, is known to have quantum correlations characterized by the quantum discord. This experiment detects a non-vanishing quantum discord, confirming the existence of non-classical correlations in the output state.
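For reference, the following exact linear-algebra sketch reproduces the DQC1 circuit whose output state is analyzed: a Hadamard on the single pure control qubit followed by a controlled-$U$ on $n$ maximally mixed qubits, after which $\langle X\rangle + i\langle Y\rangle$ on the control equals $\mathrm{Tr}(U)/2^n$. The random $U$ and the choice $n=3$ are illustrative, not the unitary used in the experiment.

```python
# DQC1 ("power of one qubit") circuit simulated with exact density matrices.
import numpy as np

rng = np.random.default_rng(3)
n = 3                                        # maximally mixed target qubits (illustrative)
d = 2**n

# Haar-ish random unitary on the target register (example only)
A = rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d))
Q, R = np.linalg.qr(A)
U = Q * (np.diag(R) / np.abs(np.diag(R)))

X = np.array([[0, 1], [1, 0]], complex)
Y = np.array([[0, -1j], [1j, 0]], complex)
H = np.array([[1, 1], [1, -1]], complex) / np.sqrt(2)

# initial state: pure |0><0| control, maximally mixed I/d target
rho = np.kron(np.diag([1.0, 0.0]).astype(complex), np.eye(d) / d)

# Hadamard on the control, then controlled-U (U acts when the control is |1>)
HI = np.kron(H, np.eye(d))
CU = np.block([[np.eye(d), np.zeros((d, d))],
               [np.zeros((d, d)), U]])
rho = CU @ (HI @ rho @ HI.conj().T) @ CU.conj().T

# reduced control-qubit state (partial trace over the target register)
rho_c = np.trace(rho.reshape(2, d, 2, d), axis1=1, axis2=3)

est = np.trace(X @ rho_c) + 1j * np.trace(Y @ rho_c)     # <X> + i<Y>
print('DQC1 readout <X> + i<Y>:', np.round(est, 6))
print('exact Tr(U)/2^n        :', np.round(np.trace(U) / d, 6))
```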
An experimental test of the special state theory of quantum measurement is proposed. It should be feasible with present-day laboratory equipment and involves a slightly elaborated Stern-Gerlach setup. The special state theory is conservative with respect to quantum mechanics, but radical with respect to statistical mechanics, in particular regarding the arrow of time. In this article background material is given on both quantum measurement and statistical mechanics aspects. For example, it is shown that future boundary conditions would not contradict experience, indicating that the fundamental equal-a-priori-probability assumption at the foundations of statistical mechanics is far too strong (since future conditioning reduces the class of allowed states). The test is based on a feature of this theory that was found necessary in order to recover standard (Born) probabilities in quantum measurements. Specifically, certain systems should have noise whose amplitude follows the long-tailed Cauchy distribution. This distribution is marked by the occasional occurrence of extremely large signals as well as a non-self-averaging property. The proposed test is a variant of the Stern-Gerlach experiment in which protocols are devised, some of which will require the presence of this noise, some of which will not. The likely observational schemes would involve the distinction between detection and non-detection of that noise. The signal to be detected (or not) would be either single photons in the visible and UV range or electric fields (and related excitations) in the neighborhood of the ends of the magnets.
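A short numerical illustration of the two Cauchy-noise signatures the proposed test relies on, namely occasional extremely large excursions and the absence of self-averaging (the running mean of Cauchy samples never settles down the way a Gaussian running mean does), is given below; the sample sizes and unit scale are arbitrary.

```python
# Cauchy vs. Gaussian noise: heavy tails and failure of the running mean to converge.
import numpy as np

rng = np.random.default_rng(4)
N = 100_000
cauchy = rng.standard_cauchy(N)              # long-tailed noise predicted by the theory
gauss = rng.standard_normal(N)               # ordinary noise for comparison

k = np.arange(1, N + 1)
mean_cauchy = np.cumsum(cauchy) / k          # running sample means
mean_gauss = np.cumsum(gauss) / k

for m in (100, 10_000, 100_000):
    print(f'n={m:>7}:  Cauchy running mean {mean_cauchy[m-1]:+9.3f}   '
          f'Gaussian running mean {mean_gauss[m-1]:+8.4f}')
print('largest |sample|:  Cauchy', round(float(np.abs(cauchy).max()), 1),
      '  Gaussian', round(float(np.abs(gauss).max()), 2))
```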