The accurate and reliable description of measurement devices is a central problem both in observing uniquely non-classical behaviors and in realizing quantum technologies, from powerful computing to precision metrology. To date, quantum tomography has been the prevalent tool for characterizing quantum detectors. However, such a characterization relies on accurately characterized probe states, so the reliability of the result rests on a circular argument. Here we report a self-characterization method for quantum measurements based on reconstructing the response range, the entirety of attainable measurement outcomes, which eliminates the reliance on known states. We characterize two representative measurements implemented with photonic setups and obtain fidelities above 99.99% with respect to the conventional tomographic reconstructions. This initiates range-based techniques for characterizing quantum systems and foreshadows novel device-independent protocols for quantum information applications.
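As an illustration of the figure of merit quoted above, the following minimal sketch compares a reconstructed POVM element against a tomographic counterpart via an Uhlmann-type fidelity. The unit-trace normalization of the elements and the example matrices are assumptions made for illustration, not the authors' procedure.

```python
# Minimal sketch (not the authors' reconstruction): fidelity between two
# positive operators representing the same POVM element, e.g. a
# self-characterized detector element vs. a tomographic one.
# Normalizing each element to unit trace before applying the Uhlmann
# fidelity is an assumption made here for illustration.
import numpy as np

def psd_sqrt(M):
    """Square root of a positive semidefinite matrix via eigendecomposition."""
    w, v = np.linalg.eigh(M)
    w = np.clip(w, 0.0, None)
    return (v * np.sqrt(w)) @ v.conj().T

def element_fidelity(A, B):
    """Uhlmann fidelity of two POVM elements, each normalized to unit trace."""
    A = A / np.trace(A)
    B = B / np.trace(B)
    s = psd_sqrt(A)
    return np.real(np.trace(psd_sqrt(s @ B @ s)))**2

# Example: an ideal "click" element vs. a slightly imperfect reconstruction.
ideal = np.diag([0.0, 1.0])
reconstructed = np.diag([0.005, 0.995])
print(f"fidelity = {element_fidelity(ideal, reconstructed):.4f}")
```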
We report the characterization of a universal set of logic gates for one-way quantum computing using a four-photon 'star' cluster state generated by fusing photons from two independent photonic crystal fibre sources. We obtain a fidelity for the cluster state of 0.66 +/- 0.01 with respect to the ideal case. We perform quantum process tomography to completely characterize a controlled-NOT (CNOT), a Hadamard and a T gate, all on the same compact entangled resource. Together, these operations make up a universal set of gates, such that arbitrary quantum logic can be efficiently constructed from combinations of them. We find process fidelities with respect to the ideal cases of 0.64 +/- 0.01 for the CNOT, 0.67 +/- 0.03 for the Hadamard and 0.76 +/- 0.04 for the T gate. The characterisation of these gates enables the simulation of larger protocols and algorithms. As a basic example, we simulate a Swap gate consisting of three concatenated CNOT gates. Our work provides some pragmatic insights into the prospects for building up to a fully scalable and fault-tolerant one-way quantum computer with photons under realistic conditions.
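The identity behind the Swap-gate example is standard and easy to verify numerically: three alternating CNOTs compose to a SWAP. The short check below is an independent sketch of that decomposition, not the photonic implementation reported in the paper.

```python
# Minimal sketch: verify SWAP = CNOT(1->2) CNOT(2->1) CNOT(1->2) on two qubits.
import numpy as np

I = np.eye(2)
X = np.array([[0, 1], [1, 0]])
P0 = np.diag([1, 0])          # projector onto |0>
P1 = np.diag([0, 1])          # projector onto |1>

# CNOT with qubit 1 as control / qubit 2 as target, and the reversed version.
CNOT_12 = np.kron(P0, I) + np.kron(P1, X)
CNOT_21 = np.kron(I, P0) + np.kron(X, P1)

SWAP = np.array([[1, 0, 0, 0],
                 [0, 0, 1, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1]])

# Three concatenated, alternating CNOTs compose to a SWAP.
print(np.allclose(CNOT_12 @ CNOT_21 @ CNOT_12, SWAP))  # True
```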
As the strength of disorder increases beyond a threshold value in a many-body system, a fundamental transformation occurs through which the entire spectrum localizes, a phenomenon known as many-body localization. This has profound implications, as it breaks down fundamental principles of statistical mechanics, such as thermalization and ergodicity. Due to the complexity of the problem, investigating the many-body localization transition has remained a major challenge. Experimentally locating the transition point is even more challenging, as most of the quantities proposed for studying this effect are practically infeasible to measure. Here, we experimentally implement a scalable protocol for detecting the many-body localization transition point, using the dynamics of an $N=12$ superconducting qubit array. We show that the sensitivity of the dynamics to random disorder samples becomes maximal at the transition point, leaving its fingerprint at all spatial scales. By exploiting three quantities, each with a different spatial resolution, we identify the transition point with an excellent match between simulation and experiment. In addition, evidence of a mobility edge can be detected through the slight variation of the transition point as the initial state is varied. The protocol is easily scalable and can be performed across various physical platforms.
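The kind of sample-to-sample sensitivity the protocol exploits can be illustrated with a small classical simulation. The sketch below uses an assumed toy model, a six-site Heisenberg chain with random fields and a staggered-magnetization (imbalance) diagnostic, rather than the $N=12$ superconducting array or the three quantities defined in the paper; it merely shows how the spread of a quench observable over disorder samples can be compared across disorder strengths.

```python
# Toy sketch (not the experimental protocol): sample-to-sample spread of quench
# dynamics in a small disordered Heisenberg chain, scanned over disorder strength W.
import numpy as np
from numpy.linalg import eigh

L = 6  # assumed toy system size
sx = np.array([[0, 1], [1, 0]]) / 2
sy = np.array([[0, -1j], [1j, 0]]) / 2
sz = np.array([[1, 0], [0, -1]]) / 2

def op(single, site):
    """Embed a single-site operator at `site` in an L-site chain."""
    mats = [np.eye(2)] * L
    mats[site] = single
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

def hamiltonian(fields, J=1.0):
    """Heisenberg chain with random z fields (open boundaries)."""
    H = np.zeros((2**L, 2**L), dtype=complex)
    for i in range(L - 1):
        H += J * (op(sx, i) @ op(sx, i + 1) + op(sy, i) @ op(sy, i + 1)
                  + op(sz, i) @ op(sz, i + 1))
    for i in range(L):
        H += fields[i] * op(sz, i)
    return H

def imbalance(psi):
    """Staggered magnetization of |psi>, normalized to 1 for the Neel state."""
    return sum((-1)**i * np.real(psi.conj() @ (op(sz, i) @ psi))
               for i in range(L)) / (L / 2)

# Neel initial state |010101>
psi0 = np.zeros(2**L, dtype=complex)
psi0[int("01" * (L // 2), 2)] = 1.0

rng = np.random.default_rng(0)
t = 20.0
for W in (0.5, 4.0, 10.0):            # weak, intermediate, strong disorder
    samples = []
    for _ in range(10):               # disorder realizations
        h = rng.uniform(-W, W, size=L)
        E, V = eigh(hamiltonian(h))
        psi_t = V @ (np.exp(-1j * E * t) * (V.conj().T @ psi0))
        samples.append(imbalance(psi_t))
    print(f"W={W:4.1f}  mean I(t)={np.mean(samples):+.3f}  sample std={np.std(samples):.3f}")
```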
We characterize the energetic footprint of a two-qubit quantum gate from the perspective of non-equilibrium quantum thermodynamics. We experimentally reconstruct the statistics of energy and entropy fluctuations following the implementation of a controlled-unitary gate, linking them to the performance of the gate itself and to the phenomenology of the Landauer principle at the single-quantum level. Our work thus addresses the energetic cost of operating quantum circuits, a problem that is crucial for grounding the upcoming quantum technologies.
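For context, the bound alluded to here can be written in its standard form; the notation below (heat $Q$ delivered to a reservoir at inverse temperature $\beta$, von Neumann entropy $S$) is a common convention assumed for illustration rather than the specific formulation used in the paper:
$$\beta\,\langle Q\rangle \;\geq\; \Delta S \;=\; S(\rho_{\mathrm{in}}) - S(\rho_{\mathrm{out}}),$$
which, for the erasure of a single bit, reduces to the familiar $\langle Q\rangle \geq k_B T \ln 2$.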
A goal of the emerging field of quantum control is to develop methods for quantum technologies to function robustly in the presence of noise. Central issues are the fundamental limitations on the information available about a quantum system and the disturbance the system suffers in the process of measurement. In the context of a simple quantum control scenario, the stabilization of non-orthogonal states of a qubit against dephasing, we experimentally explore the use of weak measurements in feedback control. We find that, despite the intrinsic difficulty of implementing them, weak measurements allow us to control the qubit better in practice than is even theoretically possible without them. Our work shows that these more general quantum measurements can play an important role in the feedback control of quantum systems.
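The information-disturbance trade-off mentioned above can be made concrete with a toy calculation. The sketch below is not the experiment's feedback protocol; the Kraus operators and the "strength" parameterization are assumptions chosen only to show that a weak measurement extracts partial information at the cost of partial disturbance.

```python
# Toy sketch of a weak sigma_z measurement on a qubit: strength 0 is no
# measurement, strength 1 is projective.  Kraus operators are an assumed model.
import numpy as np

def kraus(strength):
    p = 0.5 * (1 + strength)
    M_plus = np.diag([np.sqrt(p), np.sqrt(1 - p)])
    M_minus = np.diag([np.sqrt(1 - p), np.sqrt(p)])
    return M_plus, M_minus

plus = np.array([1, 1]) / np.sqrt(2)   # |+>, maximally disturbed by a z readout
rho = np.outer(plus, plus)

for s in (0.1, 0.5, 1.0):
    Mp, Mm = kraus(s)
    # Probability of the "+" outcome if the input were |0> vs |1>:
    # their difference quantifies how well the outcomes distinguish the states.
    p0 = np.trace(Mp @ np.diag([1, 0]) @ Mp.T)
    p1 = np.trace(Mp @ np.diag([0, 1]) @ Mp.T)
    info = abs(p0 - p1)
    # Average post-measurement state of |+> and its loss of overlap with |+>.
    rho_out = Mp @ rho @ Mp.T + Mm @ rho @ Mm.T
    disturbance = 1 - np.real(plus @ rho_out @ plus)
    print(f"strength={s:.1f}  distinguishability={info:.2f}  disturbance of |+>={disturbance:.2f}")
```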
Quantum tomography is a critically important tool to evaluate quantum hardware, making it essential to develop optimized measurement strategies that are both accurate and efficient. We compare a variety of strategies using nearly pure test states. Those that are informationally complete for all states are found to be accurate and reliable even in the presence of errors in the measurements themselves, while those designed to be complete only for pure states are far more efficient but highly sensitive to such errors. Our results highlight the unavoidable tradeoffs inherent to quantum tomography.
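As a concrete example of a measurement set that is informationally complete for all states (not one of the specific strategies benchmarked in the paper), the sketch below reconstructs a nearly pure single-qubit state from simulated Pauli measurements by linear inversion; the test state, shot count, and sampling model are illustrative assumptions.

```python
# Minimal sketch: single-qubit state tomography from Pauli expectation values.
import numpy as np

rng = np.random.default_rng(1)
I = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.diag([1.0, -1.0]).astype(complex)

# A nearly pure test state: |0> mixed slightly with the identity.
psi = np.array([1, 0], dtype=complex)
rho_true = 0.98 * np.outer(psi, psi.conj()) + 0.02 * I / 2

def estimate(rho, shots=10_000):
    """Estimate <X>, <Y>, <Z> from finite samples and invert to a density matrix."""
    r = []
    for P in (X, Y, Z):
        p_plus = np.real(np.trace((I + P) / 2 @ rho))   # Born rule for the +1 outcome
        counts = rng.binomial(shots, p_plus)
        r.append(2 * counts / shots - 1)
    # Linear inversion (may be slightly non-physical at finite shot counts).
    return (I + r[0] * X + r[1] * Y + r[2] * Z) / 2

rho_est = estimate(rho_true)
fidelity = np.real(psi.conj() @ rho_est @ psi)          # overlap with the pure target
print(f"estimated <psi|rho|psi> = {fidelity:.4f}")
```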