Universal quantum computers promise a dramatic speed-up over classical computers, but a full-size realization remains challenging. However, intermediate quantum computational models have been proposed that are not universal, yet can solve problems that are strongly believed to be classically hard. Aaronson and Arkhipov have shown that interference of single photons in random optical networks can solve the hard problem of sampling the bosonic output distribution, which is directly connected to computing matrix permanents. Remarkably, this computation does not require measurement-based interactions or adaptive feed-forward techniques. Here we demonstrate this model of computation using high-quality laser-written integrated quantum networks that were designed to implement random unitary matrix transformations. We experimentally characterize the integrated devices using an in situ reconstruction method and observe three-photon interference that leads to the boson-sampling output distribution. Our results set a benchmark for quantum computers that hold the potential to outperform conventional ones using only a few dozen photons and linear-optical elements.
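To make the permanent connection concrete, the probability of a collision-free output pattern is proportional to |Perm(A_S)|^2, where A_S is the n x n submatrix of the network unitary selected by the input and output modes. The following sketch is illustrative only: the random unitary, mode count and photon configuration are assumptions, not the laser-written networks used in the experiment.

import itertools
import numpy as np

def permanent(a):
    # Naive permanent via the permutation expansion (fine for small n).
    n = a.shape[0]
    return sum(np.prod([a[i, p[i]] for i in range(n)])
               for p in itertools.permutations(range(n)))

m, n = 5, 3                                   # modes and photons (assumed values)
rng = np.random.default_rng(0)
z = rng.normal(size=(m, m)) + 1j * rng.normal(size=(m, m))
u, _ = np.linalg.qr(z)                        # Haar-like random unitary network

inp = (0, 1, 2)                               # photons enter modes 0, 1, 2
probs = {}
for out in itertools.combinations(range(m), n):   # collision-free outputs only
    sub = u[np.ix_(out, inp)]                     # n x n submatrix of the unitary
    probs[out] = abs(permanent(sub)) ** 2         # |Perm|^2 gives the probability

print(sum(probs.values()))   # < 1: the remaining weight sits on collision events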
Gaussian boson sampling (GBS) provides a highly efficient approach that uses squeezed states from parametric down-conversion to solve a classically hard sampling problem. The GBS protocol not only significantly enhances the photon generation probability compared to standard boson sampling with single-photon Fock states, but also links to potential applications such as dense subgraph problems and molecular vibronic spectra. Here, we report the first experimental demonstration of GBS using squeezed-state sources with simultaneously high photon indistinguishability and collection efficiency. We implement and validate 3-, 4- and 5-photon GBS with high sampling rates of 832 kHz, 163 kHz and 23 kHz, respectively, which is more than 4.4, 12.0, and 29.5 times faster than previous experiments. Furthermore, we observe a quantum speed-up on an NP-hard optimization problem when compared with a simulated thermal sampler and a uniform sampler.
Quantum computation, which aims to tackle hard problems beyond the reach of classical approaches, has been advancing rapidly. Unfortunately, a fully scalable and fault-tolerant universal quantum computer remains challenging with current technology. Boson sampling, first proposed by Aaronson and Arkhipov, is widely regarded as one of the most promising candidates to reach the intermediate quantum computational milestone, namely, quantum supremacy. Following this proposal, many experimental implementations as well as variants of boson sampling have been demonstrated. However, most of these works are limited to small scales and do not reach the regime required by the permanent-of-Gaussians conjecture. Here, we experimentally demonstrate the largest-scale boson sampling in the collision-free-dominant regime using a multi-port interferometer in a 3D photonic chip. We measure all 6,545 no-collision output combinations and validate the experimental results. Our work shows the potential of the 3D photonic-chip platform and represents a solid step toward large-scale boson sampling.
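As a quick check on the quoted count, the number of no-collision output combinations for n photons in m modes is the binomial coefficient C(m, n). The sketch below is illustrative; the specific values m = 35 and n = 3 are an assumption, chosen only because they reproduce 6,545, and are not necessarily the exact experimental parameters.

from math import comb

def no_collision_outcomes(m, n):
    # Number of collision-free output patterns: choose which n of the m
    # output modes hold exactly one photon each.
    return comb(m, n)

print(no_collision_outcomes(35, 3))   # 6545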
A boson sampling device is a specialised quantum computer that solves a problem which is strongly believed to be computationally hard for classical computers. Recently, a number of small-scale implementations have been reported, all based on multi-photon interference in multimode interferometers. In the hard-to-simulate regime, even validating the device's functioning may pose a problem. In a recent paper, Gogolin et al. showed that so-called symmetric algorithms would be unable to distinguish the experimental distribution from the trivial, uniform distribution. Here we report new boson sampling experiments on larger photonic chips and analyse the data using a scalable statistical test recently proposed by Aaronson and Arkhipov. We show that the test successfully validates small experimental data samples against the hypothesis that they are uniformly distributed. We also show how to discriminate data arising from either indistinguishable or distinguishable photons. Our results pave the way towards larger boson sampling experiments whose functioning, despite being non-trivial to simulate, can be certified against alternative hypotheses.
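The sketch below outlines a row-norm discriminator in the spirit of the Aaronson-Arkhipov test: for each sampled output pattern, the product of squared row norms of the selected submatrix is compared against its expectation under the uniform distribution. The normalisation and decision rule shown here are assumptions for illustration, not the paper's exact recipe.

import numpy as np

def row_norm_estimator(u, inp, out):
    # Product of squared row norms of the selected submatrix, rescaled so
    # that its mean under uniform sampling is of order one.
    sub = u[np.ix_(out, inp)]
    n, m = len(inp), u.shape[0]
    return np.prod(np.sum(np.abs(sub) ** 2, axis=1)) * (m / n) ** n

def mean_estimator(u, inp, samples):
    # Sample mean over observed output patterns; values noticeably above the
    # uniform baseline favour genuine boson-sampling data over uniform noise.
    return float(np.mean([row_norm_estimator(u, inp, s) for s in samples]))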
Sampling the distribution of bosons that have undergone a random unitary evolution is strongly believed to be a computationally hard problem. Key to outperforming classical simulations of this task is to increase both the number of input photons and the size of the network. We propose driven boson sampling, in which photons are input within the network itself, as a means to approach this goal. When using heralded single-photon sources based on parametric down-conversion, this approach offers an $\sim e$-fold enhancement in the input state generation rate over scattershot boson sampling, reaching the scaling limit for such sources. More significantly, this approach offers a dramatic increase in the signal-to-noise ratio with respect to higher-order photon generation from such probabilistic sources, which removes the need for photon number resolution during the heralding process as the size of the system increases.
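As an illustration of the heralding statistics involved (a simplified model, not the paper's derivation): if each of k independent parametric down-conversion sources fires with per-pulse pair probability p, the chance of obtaining exactly n heralds in one pulse follows a binomial law, and it is this n-fold rate that scattershot-type schemes, and the driven scheme above, seek to improve.

from math import comb

def n_fold_heralding_prob(k, n, p):
    # Probability that exactly n of k independent heralded PDC sources fire
    # in one pulse (pair probability p each); higher-order emissions ignored.
    return comb(k, n) * p ** n * (1 - p) ** (k - n)

# Example: 3 heralds from 50 sources at a 1% pair probability per pulse.
print(n_fold_heralding_prob(50, 3, 0.01))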
Boson Sampling has emerged as a tool to explore the advantages of quantum over classical computers, as it does not require universal control over the quantum system, which favours current photonic experimental platforms. Here, we introduce Gaussian Boson Sampling, a classically hard-to-solve problem that uses squeezed states as a non-classical resource. We relate the probability of measuring specific photon patterns from a general Gaussian state in the Fock basis to a matrix function called the hafnian, which answers the last remaining question of sampling from Gaussian states. Based on this result, we design Gaussian Boson Sampling, a #P-hard problem, using squeezed states. This approach leads to a more efficient photonic boson sampler with significant advantages in generation probability and measurement time over currently existing protocols.
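For a concrete handle on the matrix function, the hafnian of a 2n x 2n symmetric matrix sums, over all perfect matchings of the indices, the products of the matched entries; for pure squeezed-state inputs the pattern probability is proportional to |Haf(A_S)|^2 for a submatrix A_S built from the Gaussian state. The sketch below enumerates matchings directly and is meant only for tiny matrices; the example matrix is arbitrary, not one from the protocol.

import numpy as np

def hafnian(a):
    # Hafnian by direct enumeration of perfect matchings: pair the first
    # remaining index with every other one and recurse (exponential cost).
    def rec(rem):
        if not rem:
            return 1.0
        i, rest = rem[0], rem[1:]
        return sum(a[i, j] * rec([k for k in rest if k != j]) for j in rest)
    return rec(list(range(a.shape[0])))

# Example: for a 4 x 4 symmetric matrix, Haf(B) = b01*b23 + b02*b13 + b03*b12.
b = np.arange(16, dtype=float).reshape(4, 4)
b = (b + b.T) / 2    # symmetrise an arbitrary example matrix
print(hafnian(b))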