The duration, strength and structure of memory effects are crucial properties of physical evolution. Due to the invasive nature of quantum measurement, such properties must be defined with respect to the probing instruments employed. Here, using a photonic platform, we experimentally demonstrate this necessity via two paradigmatic processes: future-history correlations in the first process can be erased by an intermediate quantum measurement; for the second process, a noisy classical measurement blocks the effect of history. We then apply memory truncation techniques to recover an efficient description that approximates expectation values for multi-time observables. Our proof-of-principle analysis paves the way for experiments concerning more general non-Markovian quantum processes and highlights where standard open systems techniques break down.
Every quantum system is coupled to an environment. Such system-environment interactions lead to temporal correlations between quantum operations applied at different times, resulting in non-Markovian noise. In principle, a full characterisation of non-Markovian noise requires tomography of a multi-time process matrix, which is both computationally and experimentally demanding. In this paper, we propose a more efficient solution. We employ machine learning models to estimate the amount of non-Markovianity, as quantified by an information-theoretic measure, from tomographically incomplete measurements. We test our model on a quantum optical experiment, and we are able to predict the non-Markovianity measure with $90\%$ accuracy. Our experiment paves the way for efficient detection of non-Markovian noise appearing in large-scale quantum computers.
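A minimal sketch of the estimation pipeline sketched above, assuming a generic supervised regressor (scikit-learn) and purely synthetic stand-in data; the feature count, label source, and model choice are illustrative assumptions, not the authors' trained model.

```python
# Hedged sketch: regress a non-Markovianity measure from tomographically
# incomplete measurement statistics. All data below are synthetic stand-ins;
# in an experiment the features would be outcome frequencies of a reduced set
# of multi-time measurements, and the training labels would come from full
# characterisation or simulation.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_samples, n_features = 2000, 24            # assumed number of measurement frequencies per sample
X = rng.uniform(0.0, 1.0, size=(n_samples, n_features))
# Hypothetical ground-truth non-Markovianity labels for training.
y = np.tanh(X[:, :4].sum(axis=1)) + 0.05 * rng.normal(size=n_samples)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("held-out R^2:", model.score(X_te, y_te))
```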
Simulating complex processes can be intractable when memory effects are present, often necessitating approximations in the length or strength of the memory. However, quantum processes display distinct memory effects when probed differently, precluding memory approximations that are both universal and operational. Here, we show that it is nevertheless sensible to characterize the memory strength across a duration of time with respect to a sequence of probing instruments. We propose a notion of process recovery, leading to accurate predictions for any multi-time observable, with errors bounded by the memory strength. We then apply our framework to an exactly solvable non-Markovian model, highlighting a decay of memory for certain instruments that justifies its truncation. Our formalism provides an unambiguous description of memory strength, paving the way for practical compression and recovery techniques pivotal to near-term quantum technologies.
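To make the recovery-error statement concrete, the following is a classical three-time caricature (not the quantum process-tensor construction of the paper): a joint outcome distribution is replaced by a memory-truncated reconstruction, and the error of any multi-time expectation value is bounded by a distance that plays the role of the memory strength.

```python
# Schematic classical caricature: truncate the memory of a three-time outcome
# distribution to a single step, then compare multi-time expectation values
# against the bound set by a memory-strength-like distance.
import numpy as np

rng = np.random.default_rng(1)
d = 2                                        # two outcomes per time step
P = rng.random((d, d, d))
P /= P.sum()                                 # joint distribution P(x1, x2, x3)

# Memory-truncated (Markov) reconstruction: P(x1) P(x2|x1) P(x3|x2).
P1 = P.sum(axis=(1, 2))
P21 = P.sum(axis=2) / P1[:, None]                      # P(x2 | x1)
P32 = P.sum(axis=0) / P.sum(axis=(0, 2))[:, None]      # P(x3 | x2)
P_rec = P1[:, None, None] * P21[:, :, None] * P32[None, :, :]

# Any multi-time observable O(x1, x2, x3); here a parity-like one.
O = np.fromfunction(lambda a, b, c: (-1.0) ** (a + b + c), (d, d, d), dtype=int)
err = abs((O * P).sum() - (O * P_rec).sum())
memory_strength = 0.5 * np.abs(P - P_rec).sum()        # total-variation distance
print(err, "<=", 2 * np.abs(O).max() * memory_strength)
```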
The ability to communicate quantum information over long distances is of central importance in quantum science and engineering. For example, it enables secure quantum key distribution (QKD) relying on fundamental principles that prohibit the cloning of unknown quantum states. While QKD is being successfully deployed, its range is currently limited by photon losses and cannot be extended using straightforward measure-and-repeat strategies without compromising its unconditional security. Alternatively, quantum repeaters, which utilize intermediate quantum memory nodes and error correction techniques, can extend the range of quantum channels. However, their implementation remains an outstanding challenge, requiring a combination of efficient and high-fidelity quantum memories, gate operations, and measurements. Here we report the experimental realization of memory-enhanced quantum communication. We use a single solid-state spin memory integrated in a nanophotonic diamond resonator to implement asynchronous Bell-state measurements. This enables a four-fold increase in the secret key rate of measurement-device-independent (MDI) QKD over the loss-equivalent direct-transmission method while operating at megahertz clock rates. Our results represent a significant step towards practical quantum repeaters and large-scale quantum networks.
Characterisation protocols have so far played a central role in the development of noisy intermediate-scale quantum (NISQ) computers capable of impressive quantum feats. This trajectory is expected to continue in building the next generation of devices: ones that can surpass classical computers for particular tasks -- but progress in characterisation must keep up with the complexities of intricate device noise. A missing piece in the zoo of characterisation procedures is a tomographic protocol that can completely describe non-Markovian dynamics. Here, we formally introduce a generalisation of quantum process tomography, which we call process tensor tomography. We detail the experimental requirements, construct the necessary post-processing algorithms for maximum-likelihood estimation, outline the best-practice aspects for accurate results, and make the procedure efficient for low-memory processes. This characterisation is the pathway to diagnostics and informed control of correlated noise. As an example application of the technique, we improve multi-time circuit fidelities on IBM Quantum devices, both for standalone qubits and in the presence of crosstalk, to a level comparable with the fault-tolerant noise threshold under a variety of noise conditions. Our methods could form the core of carefully developed software that may help hardware consistently pass the fault-tolerant noise threshold.
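As a hedged illustration of the maximum-likelihood ingredient only (not the authors' full process tensor tomography pipeline), the sketch below reconstructs a single-qubit density matrix from assumed Pauli-eigenstate POVM frequencies by projected gradient ascent; a process tensor is itself estimated as a many-body Choi-type state, so the same projection-plus-likelihood step appears there with larger matrices.

```python
# Hedged sketch of generic maximum-likelihood state reconstruction by
# projected gradient ascent; the POVM, data size, and step size are assumptions.
import numpy as np

def project_to_density_matrix(H):
    """Project a Hermitian matrix onto unit-trace PSD matrices
    (eigenvalues projected onto the probability simplex)."""
    w, V = np.linalg.eigh(H)
    u = np.sort(w)[::-1]
    css = np.cumsum(u)
    j = np.nonzero(u + (1.0 - css) / np.arange(1, len(u) + 1) > 0)[0][-1]
    theta = (1.0 - css[j]) / (j + 1)
    w = np.maximum(w + theta, 0.0)
    return (V * w) @ V.conj().T

# Assumed POVM: the six Pauli eigenstate projectors, each weighted by 1/3.
kets = [np.array(v, complex) for v in
        ([1, 0], [0, 1],
         [2**-0.5, 2**-0.5], [2**-0.5, -(2**-0.5)],
         [2**-0.5, 1j * 2**-0.5], [2**-0.5, -1j * 2**-0.5])]
E = [np.outer(k, k.conj()) / 3.0 for k in kets]

rho_true = np.array([[0.7, 0.2 + 0.1j], [0.2 - 0.1j, 0.3]])   # example target state
p = np.real([np.trace(Ei @ rho_true) for Ei in E])
p /= p.sum()
f = np.random.default_rng(2).multinomial(5000, p) / 5000       # simulated frequencies

rho = np.eye(2) / 2
for _ in range(500):                                           # projected gradient ascent on log-likelihood
    probs = np.real([np.trace(Ei @ rho) for Ei in E]) + 1e-12
    grad = sum(fi / pi * Ei for fi, pi, Ei in zip(f, probs, E))
    rho = project_to_density_matrix(rho + 0.05 * grad)
print("overlap with true state:", np.real(np.trace(rho @ rho_true)))
```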
Efficient simulation of the dynamics of open systems is of wide importance for quantum science and technology. Here, we introduce a generalization of the transfer-tensor, or discrete-time memory-kernel, formalism to multi-time measurement scenarios. The transfer-tensor method sets out to compute the state of an open few-body quantum system at long times, given that only short-time system trajectories are available. We show that the transfer-tensor method can be extended to processes which include multiple interrogations (e.g. measurements) of the open system as it evolves, allowing us to propagate high-order short-time correlation functions to later times without further recourse to the underlying system-environment evolution. Our approach exploits the process-tensor description of open quantum processes to represent and propagate the dynamics in terms of an object from which any multi-time correlation can be extracted. As an illustration of the utility of the method, we study the build-up of system-environment correlations in the paradigmatic spin-boson model and compute steady-state emission spectra, fully taking into account the system-environment correlations present in the steady state.
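A hedged sketch of the standard single-time transfer-tensor machinery that the paper generalises, using an assumed toy model of one system qubit coupled to one environment qubit (not the spin-boson model): transfer tensors are learned from short-time dynamical maps and then used to propagate the map to later times.

```python
# Hedged sketch of the standard transfer-tensor method (single-time version).
# Toy model assumed for illustration; because this tiny environment is
# recurrent, the printed error quantifies the memory-truncation approximation.
import numpy as np

sx = np.array([[0, 1], [1, 0]], complex)
sz = np.array([[1, 0], [0, -1]], complex)
H = 0.5 * np.kron(sz, np.eye(2)) + 0.3 * np.kron(sx, sx)   # system term + coupling
dt, N, K = 0.2, 40, 15                                     # time step, total steps, memory cutoff
w, V = np.linalg.eigh(H)
U = (V * np.exp(-1j * w * dt)) @ V.conj().T                # one-step joint propagator
env0 = np.array([[1, 0], [0, 0]], complex)                 # environment starts in |0><0|

def dyn_map(n):
    """4x4 superoperator (column-stacking) for rho_S(0) -> rho_S(n*dt)."""
    Un = np.linalg.matrix_power(U, n)
    M = np.zeros((4, 4), complex)
    for k in range(4):
        basis = np.zeros((2, 2), complex)
        basis[k % 2, k // 2] = 1.0                         # column-stacked basis element
        total = Un @ np.kron(basis, env0) @ Un.conj().T
        reduced = total[0::2, 0::2] + total[1::2, 1::2]    # partial trace over environment
        M[:, k] = reduced.reshape(-1, order='F')
    return M

E = [np.eye(4)] + [dyn_map(n) for n in range(1, N + 1)]    # exact maps, E[0] = identity
T = [None] * (K + 1)
for n in range(1, K + 1):                                  # T_n = E_n - sum_{m<n} T_m E_{n-m}
    T[n] = E[n] - sum(T[m] @ E[n - m] for m in range(1, n))
E_pred = list(E[:K + 1])                                   # propagate using only the first K tensors
for n in range(K + 1, N + 1):
    E_pred.append(sum(T[m] @ E_pred[n - m] for m in range(1, K + 1)))
print("map error at final time:", np.linalg.norm(E_pred[N] - E[N]))
```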