
Faster classical Boson Sampling

Added by Raphael Clifford
Publication date: 2020
Language: English





Since its introduction, Boson Sampling has been the subject of intense study in the world of quantum computing. The task is to sample independently from the set of all $n \times n$ submatrices built from possibly repeated rows of a larger $m \times n$ complex matrix according to a probability distribution related to the permanents of the submatrices. Experimental systems exploiting quantum photonic effects can in principle perform the task at great speed. In the framework of classical computing, Aaronson and Arkhipov (2011) showed that the exact Boson Sampling problem cannot be solved in polynomial time unless the polynomial hierarchy collapses to the third level. Indeed, for a number of years the fastest known exact classical algorithm ran in $O(\binom{m+n-1}{n} n 2^n)$ time per sample, emphasising the potential speed advantage of quantum computation. The advantage was reduced by Clifford and Clifford (2018), who gave a significantly faster classical solution taking $O(n 2^n + \operatorname{poly}(m,n))$ time and linear space, matching the complexity of computing the permanent of a single matrix when $m$ is polynomial in $n$. We continue by presenting an algorithm for Boson Sampling whose average-case time complexity is much faster when $m$ is proportional to $n$. In particular, when $m = n$ our algorithm runs in approximately $O(n \cdot 1.69^n)$ time on average. This result further increases the problem size needed to establish quantum computational supremacy via Boson Sampling.
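To make the task concrete, here is a minimal brute-force sketch in Python (not the algorithm of this paper nor that of Clifford and Clifford 2018): it evaluates permanents with Ryser's inclusion-exclusion formula, the $O(n 2^n)$ kernel referred to above, and enumerates every one of the $\binom{m+n-1}{n}$ output patterns to draw a single sample. The matrix is assumed to consist of the first $n$ columns of an $m \times m$ unitary, as in the standard Boson Sampling setup, and the function names are illustrative.

```python
import numpy as np
from itertools import combinations, combinations_with_replacement
from math import factorial
from collections import Counter


def permanent_ryser(A):
    """Permanent of an n x n matrix via Ryser's inclusion-exclusion formula.
    With Gray-code ordering this is O(n 2^n); the plain version below is a
    factor of n slower but easier to read."""
    n = len(A)
    total = 0
    for r in range(1, n + 1):
        for S in combinations(range(n), r):
            prod = 1
            for row in A:
                prod *= sum(row[j] for j in S)
            total += (-1) ** r * prod
    return (-1) ** n * total


def brute_force_boson_sample(A, rng):
    """Draw one output pattern by enumerating every multiset of n rows of the
    m x n matrix A and weighting it by |Per(A_S)|^2 / (s_1! ... s_m!).
    Exponential in both m and n -- usable only for tiny instances."""
    m, n = A.shape
    patterns, weights = [], []
    for rows in combinations_with_replacement(range(m), n):
        A_S = A[list(rows), :]  # n x n submatrix with possibly repeated rows
        norm = np.prod([factorial(c) for c in Counter(rows).values()])
        patterns.append(rows)
        weights.append(abs(permanent_ryser(A_S)) ** 2 / norm)
    p = np.array(weights)
    p /= p.sum()  # guard against floating-point rounding
    return patterns[rng.choice(len(patterns), p=p)]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    m, n = 5, 3
    # First n columns of an (approximately Haar-random) m x m unitary,
    # obtained from the QR decomposition of a complex Gaussian matrix.
    Q, _ = np.linalg.qr(rng.normal(size=(m, m)) + 1j * rng.normal(size=(m, m)))
    print(brute_force_boson_sample(Q[:, :n], rng))
```

Enumerating the whole distribution in this way is essentially the $O(\binom{m+n-1}{n} n 2^n)$-per-sample route mentioned above; the algorithms of Clifford and Clifford (2018) and of this paper sample without ever materialising it.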

Related research

Quantum mechanics promises computational powers beyond the reach of classical computers. Current technology is on the brink of an experimental demonstration of the superior power of quantum computation compared to classical devices. For such a demonstration to be meaningful, experimental noise must not affect the computational power of the device; this occurs when a classical algorithm can use the noise to simulate the quantum system. In this work, we demonstrate an algorithm which simulates boson sampling, a quantum advantage demonstration based on many-body quantum interference of indistinguishable bosons, in the presence of optical loss. Finding the level of noise where this approximation becomes efficient lets us map out the maximum level of imperfections at which it is still possible to demonstrate a quantum advantage. We show that current photonic technology falls short of this benchmark. These results call into question the suitability of boson sampling as a quantum advantage demonstration.
Hui Wang, Yu He, Yu-Huai Li (2016)
Boson sampling is considered a strong candidate to demonstrate quantum computational supremacy over classical computers. However, previous proof-of-principle experiments suffered from small photon numbers and low sampling rates owing to the inefficiencies of the single-photon sources and multi-port optical interferometers. Here, we develop two central components for high-performance boson sampling: robust multi-photon interferometers with a 0.99 transmission rate, and actively demultiplexed single-photon sources from a quantum-dot micropillar with simultaneously high efficiency, purity and indistinguishability. We implement and validate 3-, 4-, and 5-photon boson sampling, and achieve sampling rates of 4.96 kHz, 151 Hz, and 4 Hz, respectively, which are over 24,000 times faster than the previous experiments, and over 220 times faster than obtaining one sample through calculating the matrix permanent using the first electronic computer (ENIAC) and transistorized computer (TRADIC) in human history. Our architecture can feasibly be scaled up to larger numbers of photons and higher rates to race against classical computers, and might provide experimental evidence against the Extended Church-Turing Thesis.
Universal quantum computers promise a dramatic speed-up over classical computers, but a full-size realization remains challenging. However, intermediate quantum computational models have been proposed that are not universal, but can solve problems that are strongly believed to be classically hard. Aaronson and Arkhipov have shown that interference of single photons in random optical networks can solve the hard problem of sampling the bosonic output distribution, which is directly connected to computing matrix permanents. Remarkably, this computation does not require measurement-based interactions or adaptive feed-forward techniques. Here we demonstrate this model of computation using high-quality laser-written integrated quantum networks that were designed to implement random unitary matrix transformations. We experimentally characterize the integrated devices using an in-situ reconstruction method and observe three-photon interference that leads to the boson-sampling output distribution. Our results set a benchmark for quantum computers that hold the potential of outperforming conventional ones using only a few dozen photons and linear-optical elements.
Sampling the distribution of bosons that have undergone a random unitary evolution is strongly believed to be a computationally hard problem. Key to outperforming classical simulations of this task is to increase both the number of input photons and the size of the network. We propose driven boson sampling, in which photons are input within the network itself, as a means to approach this goal. When using heralded single-photon sources based on parametric down-conversion, this approach offers an $\sim e$-fold enhancement in the input state generation rate over scattershot boson sampling, reaching the scaling limit for such sources. More significantly, this approach offers a dramatic increase in the signal-to-noise ratio with respect to higher-order photon generation from such probabilistic sources, which removes the need for photon number resolution during the heralding process as the size of the system increases.
Boson Sampling has emerged as a tool to explore the advantages of quantum over classical computers, as it does not require universal control over the quantum system, which favours current photonic experimental platforms. Here, we introduce Gaussian Boson Sampling, a classically hard-to-solve problem that uses squeezed states as a non-classical resource. We relate the probability to measure specific photon patterns from a general Gaussian state in the Fock basis to a matrix function called the hafnian, which answers the last remaining question of sampling from Gaussian states. Based on this result, we design Gaussian Boson Sampling, a #P-hard problem, using squeezed states. This approach leads to a more efficient photonic boson sampler with significant advantages in generation probability and measurement time over currently existing protocols.
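As a small illustration of the matrix function mentioned above, the sketch below evaluates the hafnian of a symmetric matrix directly from its definition as a sum over perfect matchings. The function name is illustrative and the recursion is exponential, so this is only meant to make the definition explicit, not to serve as a practical Gaussian Boson Sampling simulator.

```python
def hafnian(A):
    """Hafnian of an even-dimensional symmetric matrix A, computed by summing
    the product of entries over every perfect matching of the row indices.
    Exponential time; intended only to illustrate the definition."""
    n = len(A)
    if n == 0:
        return 1
    if n % 2:
        raise ValueError("the hafnian is defined for even-dimensional matrices")
    rest = list(range(1, n))
    total = 0
    # Pair index 0 with each remaining index j, then recurse on the rest.
    for k, j in enumerate(rest):
        remaining = rest[:k] + rest[k + 1:]
        sub = [[A[r][c] for c in remaining] for r in remaining]
        total += A[0][j] * hafnian(sub)
    return total


# Example: for a 2x2 symmetric matrix the hafnian is the off-diagonal entry.
assert hafnian([[0, 3], [3, 0]]) == 3
```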