Experimental Collision-Free Dominant Boson Sampling

Published by: Xian-Min Jin
Publication date: 2019
Research field: Physics
Paper language: English





Quantum computation, which aims to tackle hard problems beyond the reach of classical approaches, has been advancing rapidly. Unfortunately, a fully scalable and fault-tolerant universal quantum computer remains challenging with current technology. Boson sampling, first proposed by Aaronson and Arkhipov, is widely believed to be the most promising candidate to reach the intermediate quantum computational milestone of quantum supremacy. Following this leading proposal, many experimental implementations as well as variants of boson sampling have been demonstrated. However, most of these works are limited to small scales and cannot fulfill the permanent-of-Gaussians conjecture. Here, we experimentally demonstrate the largest-scale boson sampling in the collision-free dominant regime, using a multi-port interferometer in a 3D photonic chip. We measure all 6,545 no-collision output combinations and validate the experimental results. Our work shows the potential of the 3D photonic chip platform and represents a solid step toward large-scale boson sampling.
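As a quick consistency check on the 6,545 figure (a minimal sketch; the mode and photon numbers below are our inference, not quoted from the abstract): with n single photons injected into an m-mode interferometer, the collision-free outcomes are the ways to place one photon in each of n distinct output modes, i.e. C(m, n), and C(35, 3) = 6,545.

```python
from math import comb

def collision_free_outcomes(m: int, n: int) -> int:
    """Count no-collision output patterns: choose which n of the m output
    modes each register exactly one of the n photons."""
    return comb(m, n)

# C(35, 3) = 6545 matches the 6,545 combinations measured above
# (m = 35 modes, n = 3 photons are inferred, not quoted from the abstract).
print(collision_free_outcomes(35, 3))  # 6545
```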




Read also

Universal quantum computers promise a dramatic speed-up over classical computers, but a full-size realization remains challenging. However, intermediate quantum computational models have been proposed that are not universal but can solve problems that are strongly believed to be classically hard. Aaronson and Arkhipov have shown that interference of single photons in random optical networks can solve the hard problem of sampling the bosonic output distribution, which is directly connected to computing matrix permanents. Remarkably, this computation does not require measurement-based interactions or adaptive feed-forward techniques. Here we demonstrate this model of computation using high-quality laser-written integrated quantum networks that were designed to implement random unitary matrix transformations. We experimentally characterize the integrated devices using an in-situ reconstruction method and observe three-photon interference that leads to the boson-sampling output distribution. Our results set a benchmark for quantum computers that hold the potential of outperforming conventional ones using only a few dozen photons and linear-optical elements.
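The link to matrix permanents mentioned above can be made concrete: the amplitude of each no-collision outcome is the permanent of an n x n submatrix of the network's unitary, a #P-hard quantity. The sketch below (ours, for illustration) evaluates it with Ryser's formula in exponential time.

```python
from itertools import combinations

def permanent(A):
    """Matrix permanent via Ryser's formula, O(2^n * n^2) time; this
    exponential cost is what underpins the hardness of boson sampling."""
    n = len(A)
    total = 0
    for r in range(1, n + 1):
        for cols in combinations(range(n), r):
            prod = 1
            for row in A:
                prod *= sum(row[j] for j in cols)
            total += (-1) ** r * prod
    return (-1) ** n * total

print(permanent([[1, 2], [3, 4]]))  # 1*4 + 2*3 = 10
```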
Gaussian boson sampling (GBS) provides a highly efficient approach to make use of squeezed states from parametric down-conversion to solve a classically hard-to-solve sampling problem. The GBS protocol not only significantly enhances the photon generation probability, compared to standard boson sampling with single-photon Fock states, but also links to potential applications such as dense subgraph problems and molecular vibronic spectra. Here, we report the first experimental demonstration of GBS using squeezed-state sources with simultaneously high photon indistinguishability and collection efficiency. We implement and validate 3-, 4- and 5-photon GBS with high sampling rates of 832 kHz, 163 kHz and 23 kHz, respectively, which is more than 4.4, 12.0, and 29.5 times faster than the previous experiments. Further, we observe a quantum speed-up on an NP-hard optimization problem when compared with a simulated thermal sampler and a uniform sampler.
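For context on why GBS with squeezed states is also believed to be hard (a sketch under our framing, not the paper's code): where standard boson sampling probabilities involve permanents, GBS output probabilities are governed by hafnians of submatrices of a kernel matrix built from the Gaussian state. A tiny perfect-matching recursion, feasible only at the few-photon scale reported above:

```python
def hafnian(A):
    """Hafnian of a symmetric matrix: the sum over all perfect matchings
    of products of matched entries. Exponential time, fine only for the
    few-photon regime (3-5 photons) reported above."""
    n = len(A)
    if n == 0:
        return 1
    if n % 2:
        return 0  # no perfect matching for an odd number of indices
    total = 0
    for k in range(1, n):  # pair index 0 with k, recurse on the remainder
        rest = [i for i in range(1, n) if i != k]
        total += A[0][k] * hafnian([[A[i][j] for j in rest] for i in rest])
    return total

# An all-ones 4x4 matrix has 3 perfect matchings, so its hafnian is 3.
print(hafnian([[1, 1, 1, 1], [1, 1, 1, 1], [1, 1, 1, 1], [1, 1, 1, 1]]))  # 3
```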
A boson sampling device is a specialised quantum computer that solves a problem which is strongly believed to be computationally hard for classical computers. Recently, a number of small-scale implementations have been reported, all based on multi-photon interference in multimode interferometers. In the hard-to-simulate regime, even validating the device's functioning may pose a problem. In a recent paper, Gogolin et al. showed that so-called symmetric algorithms would be unable to distinguish the experimental distribution from the trivial, uniform distribution. Here we report new boson sampling experiments on larger photonic chips, and analyse the data using a scalable statistical test recently proposed by Aaronson and Arkhipov. We show the test successfully validates small experimental data samples against the hypothesis that they are uniformly distributed. We also show how to discriminate data arising from either indistinguishable or distinguishable photons. Our results pave the way towards larger boson sampling experiments whose functioning, despite being non-trivial to simulate, can be certified against alternative hypotheses.
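The scalable test attributed to Aaronson and Arkhipov above can be sketched as a row-norm discriminator (our reading of the protocol; sizes and names below are illustrative): for each sample, take the submatrix of the interferometer unitary linking the occupied inputs to the detected outputs and compute the product of its squared row norms. Boson-sampling data concentrates at larger values of this statistic than uniform samples, so a simple threshold separates the two hypotheses.

```python
import numpy as np

rng = np.random.default_rng(0)

def haar_unitary(m):
    """Haar-random m x m unitary via QR of a complex Gaussian matrix."""
    z = (rng.normal(size=(m, m)) + 1j * rng.normal(size=(m, m))) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    d = np.diag(r)
    return q * (d / np.abs(d))  # phase fix so the distribution is Haar

def row_norm_statistic(U, inputs, outputs):
    """Product of squared row norms of the submatrix linking occupied
    input modes to a sampled set of output modes. Boson-sampling data
    skews to larger values of this statistic than uniform outcomes do."""
    A = U[np.ix_(inputs, outputs)]
    return float(np.prod(np.sum(np.abs(A) ** 2, axis=1)))

# Illustrative sizes (not the paper's): 3 photons in modes 0..2 of a
# 16-mode network; score one uniformly drawn no-collision outcome.
m, n = 16, 3
U = haar_unitary(m)
sample = sorted(rng.choice(m, size=n, replace=False))
print(row_norm_statistic(U, list(range(n)), sample))
```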
Quantum advantage, benchmarking the computational power of quantum machines outperforming all classical computers in a specific task, represents a crucial milestone in developing quantum computers and has been driving different physical implementations since the concept was proposed. The boson sampling machine, an analog quantum computer that only requires multiphoton interference and single-photon detection, is considered to be a promising candidate to reach this goal. However, the probabilistic nature of photon sources and inevitable loss in the evolution network make the execution time increase exponentially with the problem size. Here, we propose and experimentally demonstrate timestamp boson sampling, which can reduce the execution time by 2 orders of magnitude for any problem size. We theoretically show that the registration time of sampling events can be retrieved to reconstruct the probability distribution at an extremely low flux rate. By developing a time-of-flight storage technique with precision at the picosecond level, we are able to detect and record the complete time information of 30 individual modes out of a large-scale 3D photonic chip. We successfully validate boson sampling with only one registered event. We show that timestamp boson sampling is promptly applicable to fill the remaining gap toward realizing quantum advantage. The approach, which exploits time information as a new resource, can boost all count-rate-limited experiments, suggesting an emerging field of timestamp quantum optics.
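One way to read the timestamp idea (a toy model of the principle, not the paper's reconstruction procedure): at low flux, events in each output combination arrive approximately as a Poisson process whose rate is proportional to that outcome's probability, so even a single registered arrival time carries rate information, since the expected first-arrival time is the inverse rate.

```python
import numpy as np

rng = np.random.default_rng(1)

def probs_from_first_arrivals(first_times):
    """Single-event rate estimate: for a Poisson process of rate r, the
    first arrival time t has mean 1/r, so 1/t crudely estimates r.
    Normalizing across outcomes gives relative probabilities."""
    rates = 1.0 / np.asarray(first_times)
    return rates / rates.sum()

# Toy demo (illustrative numbers): three outcomes with true probabilities
# 0.6 / 0.3 / 0.1 at a 1 kHz total event rate; one timestamp per outcome.
true_p = np.array([0.6, 0.3, 0.1])
first_times = rng.exponential(scale=1.0 / (1000.0 * true_p))
print(probs_from_first_arrivals(first_times))
```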
The tantalizing promise of quantum computational speedup in solving certain problems has been strongly supported by recent experimental evidence from a high-fidelity 53-qubit superconducting processor [1] and Gaussian boson sampling (GBS) with up to 76 detected photons. Analogous to the increasingly sophisticated Bell tests that continued to refute local hidden variable theories, quantum computational advantage tests are expected to provide increasingly compelling experimental evidence against the Extended Church-Turing thesis. In this direction, continued competition between upgraded quantum hardware and improved classical simulations is required. Here, we report a new GBS experiment that produces up to 113 detection events out of a 144-mode photonic circuit. We develop a new high-brightness and scalable quantum light source, exploring the idea of stimulated squeezed photons, which simultaneously achieves near-unity purity and efficiency. This GBS is programmable by tuning the phase of the input squeezed states. We demonstrate a new method to efficiently validate the samples by inferring from computationally friendly subsystems, which rules out hypotheses including distinguishable photons and thermal states. We show that our noisy GBS experiment passes the nonclassicality test using an inequality, and we reveal non-trivial genuine high-order correlations in the GBS samples, which are evidence of robustness against possible classical simulation schemes. The photonic quantum computer, Jiuzhang 2.0, yields a Hilbert space dimension up to $10^{43}$, and a sampling rate $10^{24}$ times faster than brute-force simulation on supercomputers.
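The quoted Hilbert-space dimension can be sanity-checked under one assumption (ours, not stated in the abstract): that it counts click/no-click patterns of threshold detectors across the 144 modes, giving 2^144.

```python
# Assumption (ours, for illustration): the ~1e43 dimension counts the
# click/no-click patterns of threshold detectors on the 144 modes.
print(f"{float(2**144):.3e}")  # 2.230e+43, consistent with the 1e43 figure
```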