Multi-particle interference is a key resource for quantum information processing, as exemplified by Boson Sampling. Given its fragile nature, a solid and reliable framework for its validation is therefore essential. However, while several protocols have been introduced to this end, the approach remains fragmented and does not yet form a coherent picture to guide future developments. In this work, we propose an operational approach to validation that encompasses and strengthens the state of the art of these protocols. To this end, we consider Bayesian hypothesis testing and the statistical benchmark as the most suitable protocols for small- and large-scale applications, respectively. We numerically investigate their operation with finite sample size, extending previous tests to larger dimensions, and against two adversarial algorithms for classical simulation: the Mean-Field sampler and the Metropolized Independent Sampler. To demonstrate the need for refined validation techniques, we show how the assessment of numerically simulated data depends on the available sample size, as well as on the internal hyper-parameters and other practically relevant constraints. Our analyses provide general insights into the challenge of validation, and can inspire the design of algorithms with a measurable quantum advantage.
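As an illustration of the Bayesian protocol, the sketch below accumulates the log-likelihood ratio of a finite sample under two candidate models and returns the posterior probability of the quantum hypothesis. This is a minimal sketch, assuming access to model probability functions `p_quantum` and `p_alternative` (hypothetical names); the protocol studied in the paper may differ in priors, stopping rules, and the specific adversarial models.

```python
import numpy as np

def bayesian_validation(samples, p_quantum, p_alternative, prior_q=0.5):
    """Posterior probability that the observed data came from the quantum sampler.

    samples        : iterable of observed output patterns
    p_quantum      : function mapping a pattern to its probability under the
                     quantum (indistinguishable-photon) model
    p_alternative  : same, under the adversarial model (e.g. a mean-field sampler)
    """
    log_ratio = 0.0
    for x in samples:
        log_ratio += np.log(p_quantum(x)) - np.log(p_alternative(x))
    # Bayes' rule with the given prior on the quantum hypothesis
    log_odds = log_ratio + np.log(prior_q) - np.log(1.0 - prior_q)
    return 1.0 / (1.0 + np.exp(-log_odds))
```

With a finite sample size the posterior may remain inconclusive, which is precisely the regime the numerical analysis above addresses.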
Metasurfaces based on resonant nanophotonic structures have enabled novel types of flat-optics devices often outperforming the capabilities of bulk components, yet these advances remain largely unexplored for quantum applications. We show that non-classical multi-photon interferences can be achieved at the subwavelength scale in all-dielectric metasurfaces. We simultaneously image multiple projections of quantum states with a single metasurface, enabling a robust reconstruction of amplitude, phase, coherence, and entanglement of multi-photon polarization-encoded states. One- and two-photon states are reconstructed through nonlocal photon correlation measurements with polarization-insensitive click-detectors positioned after the metasurface, and the scalability to higher photon numbers is established theoretically. Our work illustrates the feasibility of ultra-thin quantum metadevices for the manipulation and measurement of multi-photon quantum states with applications in free-space quantum imaging and communications.
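The reconstruction principle can be illustrated for a single polarization qubit: from the probabilities of the six canonical polarization projections one obtains the Stokes parameters and, by linear inversion, the density matrix. This is a generic textbook sketch, not the metasurface-specific reconstruction of the work above, whose projection set is fixed by the metasurface design; the basis convention (|H⟩ = |0⟩, |V⟩ = |1⟩) is an assumption of the example.

```python
import numpy as np

# Pauli matrices for the linear-inversion formula
# rho = (I + s1*Z + s2*X + s3*Y) / 2, assuming |H> = |0>, |V> = |1>
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def reconstruct_qubit(p_H, p_V, p_D, p_A, p_R, p_L):
    """Linear-inversion reconstruction of a polarization qubit from the
    probabilities of the six canonical projections (H/V, D/A, R/L)."""
    s1 = p_H - p_V   # horizontal / vertical
    s2 = p_D - p_A   # diagonal / anti-diagonal
    s3 = p_R - p_L   # right / left circular
    return 0.5 * (I2 + s1 * Z + s2 * X + s3 * Y)

# Example: ideal horizontally polarized light reconstructs to |H><H|
print(reconstruct_qubit(1.0, 0.0, 0.5, 0.5, 0.5, 0.5))
```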
Multi-photon entangled states of light are key to advancing quantum communication, computation, and metrology. Current methods for building such states are based on stitching together photons from probabilistic sources. The probability of $N$ such sources firing simultaneously decreases exponentially with $N$, imposing severe limitations on the practically achievable number of coincident photons. We tackle this challenge with a quantum interference buffer (QIB), which combines three functionalities: firstly, it stores polarization qubits, enabling the use of polarization-entangled states as resource; secondly, it implements entangled-source multiplexing, greatly enhancing the resource-state generation rates; thirdly, it implements time-multiplexed, on-demand linear optical networks for interfering subsequent states. Using the QIB, we multiplex 21 Bell-state sources and demonstrate a nine-fold enhancement in the generation rate of four-photon GHZ states. The enhancement scales exponentially with the photon number; larger states benefit more strongly. Multiplexed photon entanglement and interference will find diverse applications in quantum photonics, allowing for practical realisations of multi-photon protocols.
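A toy calculation makes the stated scaling concrete. Assuming each probabilistic source fires with probability p per trial and that switching and storage are lossless (both assumptions of this sketch, with hypothetical numbers rather than the experimental parameters), multiplexing M attempts per source raises the per-source success probability before the N-fold coincidence is formed:

```python
def coincidence_rate(p: float, n_sources: int, multiplexing: int = 1) -> float:
    """N-fold coincidence probability when each source is multiplexed over
    `multiplexing` attempts: the per-source success probability becomes
    1 - (1 - p)**M, and all n_sources must succeed simultaneously."""
    p_eff = 1.0 - (1.0 - p) ** multiplexing
    return p_eff ** n_sources

p = 0.01  # hypothetical per-trial success probability of one Bell-state source
for m in (1, 21):
    print(m, coincidence_rate(p, n_sources=2, multiplexing=m))
# The ratio between the multiplexed and unmultiplexed rates grows
# exponentially with the number of sources, mirroring the claim that
# larger states benefit more strongly from multiplexing.
```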
Quantum entanglement can help to increase the precision of optical phase measurements beyond the shot-noise limit (SNL) to the ultimate Heisenberg limit. However, the N-photon parity measurements required to achieve this optimal sensitivity are extremely difficult to realize with current photon-detection technologies, requiring high-fidelity resolution of N+1 different photon distributions between the output ports. Recent experimental demonstrations of precision beyond the SNL have therefore used only one or two photon-number detection patterns instead of parity measurements. Here we investigate the achievable phase sensitivity of the simple and efficient single-interference-fringe detection technique. We show that the maximally entangled NOON state does not achieve optimal phase sensitivity when N > 4; instead, the Holland-Burnett state is optimal. We experimentally demonstrate this enhanced sensitivity using a single photon-counted fringe of the six-photon Holland-Burnett state. Specifically, our single-fringe six-photon measurement achieves a phase variance three times below the SNL.
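For orientation, the two standard benchmarks for an N-photon phase measurement are (these are the textbook definitions, not results specific to the paper)
\[
(\Delta\phi)^2_{\mathrm{SNL}} = \frac{1}{N}, \qquad (\Delta\phi)^2_{\mathrm{HL}} = \frac{1}{N^2},
\]
so for $N = 6$ the shot-noise-limited variance is $1/6 \approx 0.167$, a variance three times below the SNL is $\approx 0.056$, and the Heisenberg limit is $1/36 \approx 0.028$. The reported six-photon result thus sits between the SNL and the ultimate Heisenberg bound.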
We introduce the concept of hypergraphs to describe quantum optical experiments with probabilistic multi-photon sources. Every hyperedge represents a correlated photon source, and every vertex stands for an optical output path. Such a general graph description provides new insights for producing complex high-dimensional multi-photon quantum entangled states, going beyond the limitations imposed by pair creation via spontaneous parametric down-conversion. Furthermore, properties of hypergraphs can be investigated experimentally. For example, the NP-complete problem of deciding whether a hypergraph has a perfect matching can be answered by experimentally detecting multi-photon events in quantum experiments. By introducing complex weights in hypergraphs, we show how to describe general many-particle quantum interference and manipulate entanglement in a pictorial way. Our work paves the way for the development of multi-photon high-dimensional state generation and might inspire new applications of quantum computation using hypergraph mappings.
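The combinatorial question being mapped onto the experiment can be stated compactly: a perfect matching is a set of pairwise-disjoint hyperedges (sources) that covers every vertex (output path) exactly once, which in this picture corresponds to an event where every detector clicks. The brute-force check below is only an illustrative toy with a hypothetical hypergraph; its exponential cost reflects the NP-completeness of the general problem, whereas the experiment answers the question through coincidence detection.

```python
from itertools import combinations

def has_perfect_matching(vertices, hyperedges):
    """Brute-force test whether some set of pairwise-disjoint hyperedges
    covers every vertex exactly once (exponential in the number of edges)."""
    vertices = frozenset(vertices)
    edges = [frozenset(e) for e in hyperedges]
    for r in range(1, len(edges) + 1):
        for subset in combinations(edges, r):
            covered = [v for e in subset for v in e]
            if len(covered) == len(set(covered)) and set(covered) == vertices:
                return True
    return False

# Toy example: four output paths and three photon-pair sources (hyperedges).
print(has_perfect_matching({1, 2, 3, 4}, [{1, 2}, {3, 4}, {2, 3}]))  # True
```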
We present a quantum fingerprinting protocol relying on two-photon interference which does not require a shared phase reference between the parties preparing optical signals carrying data fingerprints. We show that the scaling of the protocol, in terms of transmittable classical information, is analogous to the recently proposed and demonstrated scheme based on coherent pulses and first-order interference, offering a comparable advantage over classical fingerprinting protocols without access to shared prior randomness. We analyze the protocol taking into account the non-Poissonian photon statistics of optical signals and a variety of imperfections, such as transmission losses, dark counts, and residual distinguishability. The impact of these effects on the protocol performance is quantified with the help of the Chernoff information.
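The Chernoff information between two discrete distributions P and Q is $C(P,Q) = -\min_{0 \le s \le 1} \log \sum_x P(x)^s Q(x)^{1-s}$ and sets the optimal asymptotic error exponent for discriminating the two hypotheses. The sketch below evaluates it numerically; the click distributions shown are hypothetical placeholders, whereas the paper derives the relevant distributions from the photon statistics and the listed imperfections.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def chernoff_information(p, q):
    """Chernoff information C(P, Q) = -min_{0<=s<=1} log sum_x p(x)^s q(x)^(1-s)
    for two discrete probability distributions with matching support."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    def log_moment(s):
        return np.log(np.sum(p ** s * q ** (1.0 - s)))
    res = minimize_scalar(log_moment, bounds=(0.0, 1.0), method="bounded")
    return -res.fun

# Hypothetical click statistics for matching vs. non-matching fingerprints.
print(chernoff_information([0.9, 0.1], [0.6, 0.4]))
```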