Weak measurements performed on a bipartite quantum state can reveal more quantumness of correlations than standard projective measurements. We show that the quantumness of correlations revealed by weak measurements can be consumed to encode information that is accessible only by coherent quantum interactions. It can therefore be regarded as a resource for quantum information processing, and it quantifies this quantum advantage. We conclude that weak measurements can create more valuable quantum correlations.
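One common parametrization of such two-outcome weak measurements, due to Oreshkov and Brun (our choice for illustration; the abstract itself does not fix a formalism), replaces the projectors $\Pi_0$ and $\Pi_1$ with the strength-$x$ operators

$$ P(\pm x) = \sqrt{\tfrac{1 \mp \tanh x}{2}}\,\Pi_0 + \sqrt{\tfrac{1 \pm \tanh x}{2}}\,\Pi_1, \qquad P(x)^{\dagger}P(x) + P(-x)^{\dagger}P(-x) = I. $$

In the limit $x \to \infty$ these reduce to the projective measurement $\{\Pi_0, \Pi_1\}$, while small $x$ extracts little information and disturbs the state only weakly.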
The standard quantum error correction protocols use projective measurements to extract the error syndromes from the encoded states. We consider the more general scenario of weak measurements, where only partial information about the error syndrome can be extracted from the encoded state. We construct a feedback protocol that probabilistically corrects the error based on the extracted information. Using numerical simulations of one-qubit error correction codes, we show that our error correction succeeds over a range of weak measurement strengths, where (a) the error rate is below the threshold beyond which multiple errors dominate, and (b) the error rate is less than the rate at which the weak measurement extracts information. It follows that error correction with too small a measurement strength should be avoided: feedback based on too little syndrome information does more harm than good.
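A minimal toy model (our own sketch, not the authors' protocol) makes condition (b) and the small-strength caveat concrete: a single qubit prepared in $|0\rangle$ suffers a bit flip with probability $p$, a weak $Z$ measurement of strength $\epsilon$ is performed, and naive feedback applies $X$ whenever the outcome points to a flip. The recovered fidelity works out to $(1+\epsilon)/2$, which beats the uncorrected fidelity $1-p$ only when $\epsilon > 1-2p$:

```python
# A minimal toy model (our sketch, NOT the authors' protocol): a single qubit
# prepared in |0>, a bit-flip error with probability p, a two-outcome weak Z
# measurement of strength eps (eps=1 is projective), and naive feedback that
# applies X whenever the outcome points to a flip.
import numpy as np

X = np.array([[0.0, 1.0], [1.0, 0.0]])
P0 = np.diag([1.0, 0.0])   # |0><0|
P1 = np.diag([0.0, 1.0])   # |1><1|

def weak_kraus(eps):
    """Kraus operators of a weak Z measurement; m0'm0 + m1'm1 = identity."""
    m0 = np.sqrt((1 + eps) / 2) * P0 + np.sqrt((1 - eps) / 2) * P1
    m1 = np.sqrt((1 - eps) / 2) * P0 + np.sqrt((1 + eps) / 2) * P1
    return m0, m1

def corrected_fidelity(p, eps):
    """Fidelity with |0> after error channel -> weak measurement -> feedback."""
    rho = (1 - p) * P0 + p * (X @ P0 @ X)   # bit-flip channel on |0><0|
    m0, m1 = weak_kraus(eps)
    # Outcome 0 suggests "no flip": do nothing. Outcome 1: apply X.
    rho_avg = m0 @ rho @ m0.T + X @ (m1 @ rho @ m1.T) @ X
    return rho_avg[0, 0]

p = 0.1
print(f"uncorrected fidelity: {1 - p:.3f}")
for eps in (0.0, 0.4, 0.8, 0.9, 1.0):
    # Correction wins only for eps > 1 - 2p, i.e. only when the measurement
    # extracts information faster than the error rate accumulates.
    print(f"eps={eps:.1f}  corrected fidelity={corrected_fidelity(p, eps):.3f}")
```

The naive decoder deliberately overreacts; a Bayesian decoder that acts only when the posterior flip probability exceeds $1/2$ would never do worse than leaving the qubit alone. The threshold behavior in $\epsilon$ is the point of the illustration.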
The ability to follow the dynamics of a quantum system in a quantitative manner is of key importance for quantum technology. Despite its central role, a justifiable deduction of the quantum dynamics of a single quantum system in terms of a macroscopic observable remains a challenge. Here we show that the relation between the readout signal of a single electron spin and the quantum dynamics of a single nuclear spin is given by a parameter related to the measurement strength. We determine this measurement strength in independent experiments and use the value to compare our analysis of the quantum dynamics with experimental results. We confirm the validity of our approach by measuring violations of the Leggett-Garg inequality.
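For reference, the inequality being violated is the simplest three-time Leggett-Garg inequality for a dichotomic observable $Q = \pm 1$ measured at times $t_1 < t_2 < t_3$, with two-time correlators $C_{ij} = \langle Q(t_i)\,Q(t_j)\rangle$:

$$ K = C_{12} + C_{23} - C_{13} \le 1, $$

which any macrorealistic, noninvasively measurable system obeys, while coherent two-level quantum dynamics reaches values up to $K = 3/2$.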
We show that postselection offers a nonclassical advantage in metrology. In every parameter-estimation experiment, the final measurement or the postprocessing incurs some cost. Postselection can improve the rate of Fisher information (the average information learned about an unknown parameter from an experimental trial) to this cost. This improvement, we show, stems from the negativity of a quasiprobability distribution, a quantum extension of a probability distribution. In a classical theory, in which all observables commute, our quasiprobability distribution can be expressed as real and nonnegative. In a quantum-mechanically noncommuting theory, nonclassicality manifests in negative or nonreal quasiprobabilities. The distribution's nonclassical negative values enable postselected experiments to outperform even postselection-free experiments whose input states and final measurements are optimized: postselected quantum experiments can yield anomalously large information-cost rates. We prove that this advantage is genuinely nonclassical: no classical theory of commuting observables can describe any quantum experiment that delivers an anomalously large Fisher information. Finally, we outline a preparation-and-postselection procedure that can yield an arbitrarily large Fisher information. Our results establish the nonclassicality of a metrological advantage, leveraging our quasiprobability distribution as a mathematical tool.
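For reference, the Fisher information whose rate per unit cost is at stake here is the standard classical quantity for an outcome distribution $p(x|\theta)$:

$$ F(\theta) = \sum_x p(x|\theta)\,\big[\partial_\theta \ln p(x|\theta)\big]^2, $$

which bounds the variance of any unbiased estimator through the Cramér-Rao bound $\mathrm{Var}(\hat{\theta}) \ge 1/[N F(\theta)]$ after $N$ trials.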
A quantum computer, which harnesses quantum superposition to boost parallel computational power, promises to outperform its classical counterparts with exponentially better scaling. The term quantum advantage was proposed to mark the key point at which a classically intractable problem is solved by artificially controlling a quantum system at an unprecedented scale, even without error correction or known practical applications. Boson sampling, a problem concerning the quantum evolution of multiple photons on multimode photonic networks, together with its variants, has been considered a promising candidate for reaching this milestone. However, current photonic platforms suffer from scaling problems in both photon number and circuit modes. Here, we propose a new variant of the problem, timestamp membosonsampling, which exploits the timestamp information of single photons as a free resource, so that the scaling of the problem can in principle be extended without bound. We experimentally verify the scheme on a self-looped photonic chip inspired by the memristor, and obtain multi-photon registrations up to 56-fold in 750,000 modes with a Hilbert space dimension up to $10^{254}$. Our work demonstrates an integrated and cost-efficient shortcut into the quantum-advantage regime in a photonic system far beyond previous scenarios, and provides a scalable and controllable platform for quantum information processing.
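The classical intractability invoked here ultimately rests on the matrix permanent: the amplitude of a boson-sampling outcome is the permanent of a submatrix of the network unitary, and the permanent is #P-hard. A short sketch of Ryser's formula (exact, but with $O(2^n n^2)$ cost; the function name is ours) shows why the cost doubles with every added photon:

```python
# A minimal sketch: Ryser's formula for the matrix permanent, the quantity
# whose evaluation makes boson sampling classically hard. Exact but O(2^n n^2).
from itertools import combinations
import numpy as np

def permanent(a):
    """Ryser: perm(A) = (-1)^n sum over nonempty column subsets S of
    (-1)^|S| * prod_i sum_{j in S} a[i, j]."""
    n = a.shape[0]
    total = 0.0
    for r in range(1, n + 1):
        for cols in combinations(range(n), r):
            row_sums = a[:, list(cols)].sum(axis=1)
            total += (-1) ** r * np.prod(row_sums)
    return (-1) ** n * total

a = np.ones((4, 4))
print(permanent(a))  # permanent of the all-ones n x n matrix is n! = 24
```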
Gaussian boson sampling exploits squeezed states to provide a highly efficient way to demonstrate quantum computational advantage. We perform experiments with 50 input single-mode squeezed states with high indistinguishability and large squeezing parameters, which are fed into a 100-mode ultralow-loss interferometer with full connectivity and a random transformation, and sampled using 100 high-efficiency single-photon detectors. The whole optical setup is phase-locked to maintain high coherence among the superposed photon-number states. We observe up to 76 output photon clicks, which yield an output state-space dimension of $10^{30}$ and a sampling rate that is $10^{14}$ times faster than using the state-of-the-art simulation strategy on supercomputers. The obtained samples are validated against various alternative hypotheses, including thermal states, distinguishable photons, and a uniform distribution.
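For context (a schematic statement, with normalization and detector details omitted): whereas standard boson sampling samples from permanents, a Gaussian boson sampler with pure squeezed inputs and photon-number-resolving detectors assigns a collision-free outcome $S$ a probability

$$ \Pr(S) \propto \big|\mathrm{Haf}(A_S)\big|^2, $$

where $A_S$ is the submatrix, indexed by the occupied modes, of a kernel matrix determined by the squeezing parameters and the interferometer unitary; threshold detectors, as used here, lead to a related quantity, the Torontonian. Like the permanent, the hafnian is #P-hard to compute.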