
Stringent and efficient assessment of Boson-Sampling devices

Posted by: Malte Tichy
Publication date: 2013
Research field: Physics
Paper language: English





Boson-Sampling holds the potential to experimentally falsify the extended Church-Turing thesis. The computational hardness of Boson-Sampling, however, complicates the certification that an experimental device yields correct results in the regime in which it outmatches classical computers. To certify a boson-sampler, one needs to verify quantum predictions and rule out models that yield these predictions without true many-boson interference. We show that a semiclassical model for many-boson propagation reproduces coarse-grained observables that were proposed as witnesses of Boson-Sampling. A test based on Fourier matrices is demonstrated to falsify physically plausible alternatives to coherent many-boson propagation.
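The kind of quantum prediction such a certification has to check can be illustrated with a small numerical sketch (Python/NumPy, not the paper's actual protocol; the helper names below are illustrative): for indistinguishable photons, the probability of a collision-free output pattern is the squared modulus of the permanent of the submatrix of the interferometer unitary selected by the occupied input and output modes, here taken to be a Fourier matrix.

```python
# Minimal sketch (not the paper's exact Fourier-matrix test): collision-free
# output probabilities of indistinguishable photons are |Perm(U_S)|^2, where
# U_S is the submatrix of the interferometer unitary picked out by the
# occupied input and output modes.
import numpy as np

def permanent(a):
    """Matrix permanent via Ryser's formula (cost exponential in the size)."""
    n = a.shape[0]
    total = 0j
    for mask in range(1, 1 << n):
        cols = [j for j in range(n) if mask >> j & 1]
        row_sums = a[:, cols].sum(axis=1)
        total += (-1) ** len(cols) * np.prod(row_sums)
    return (-1) ** n * total

def fourier_matrix(n):
    """n-mode discrete-Fourier-transform interferometer."""
    j, k = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    return np.exp(2j * np.pi * j * k / n) / np.sqrt(n)

def output_probability(u, in_modes, out_modes):
    """P(one photon in each out_mode | one photon in each in_mode)."""
    sub = u[np.ix_(list(out_modes), list(in_modes))]
    return abs(permanent(sub)) ** 2

U = fourier_matrix(5)
p = output_probability(U, in_modes=[0, 1, 2], out_modes=[0, 2, 4])
print(f"three-photon event probability: {p:.4f}")
```

The Fourier-matrix test discussed in the abstract exploits structured suppressions in exactly these permanent-governed probabilities, which models without coherent many-boson interference fail to reproduce.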




Read also

We demonstrate how boson sampling with photons of partial distinguishability can be expressed in terms of interference of fewer photons. We use this observation to propose a classical algorithm to simulate the output of a boson sampler fed with photons of partial distinguishability. We find conditions for which this algorithm is efficient, which gives a lower limit on the required indistinguishability to demonstrate a quantum advantage. Under these conditions, adding more photons only polynomially increases the computational cost to simulate a boson sampling experiment.
A boson sampling device is a specialised quantum computer that solves a problem which is strongly believed to be computationally hard for classical computers. Recently a number of small-scale implementations have been reported, all based on multi-photon interference in multimode interferometers. In the hard-to-simulate regime, even validating the device's functioning may pose a problem. In a recent paper, Gogolin et al. showed that so-called symmetric algorithms would be unable to distinguish the experimental distribution from the trivial, uniform distribution. Here we report new boson sampling experiments on larger photonic chips, and analyse the data using a scalable statistical test recently proposed by Aaronson and Arkhipov. We show the test successfully validates small experimental data samples against the hypothesis that they are uniformly distributed. We also show how to discriminate data arising from either indistinguishable or distinguishable photons. Our results pave the way towards larger boson sampling experiments whose functioning, despite being non-trivial to simulate, can be certified against alternative hypotheses.
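The flavour of such a scalable test can be sketched with a row-norm discriminator in the spirit of the Aaronson-Arkhipov proposal (an illustrative sketch only; the exact statistic, normalisation, and acceptance threshold used in the experiments may differ, and the helper names are hypothetical):

```python
# Illustrative row-norm discriminator: the product of squared row norms of the
# sampled n x n submatrix is cheap to compute and tends to take larger values
# under the boson-sampling distribution than under the uniform distribution.
import numpy as np

def row_norm_statistic(u, in_modes, out_modes):
    """Normalised product of squared row norms of U[out_modes, in_modes]."""
    sub = u[np.ix_(list(out_modes), list(in_modes))]
    n, m = sub.shape[0], u.shape[0]
    row_norms_sq = np.sum(np.abs(sub) ** 2, axis=1)
    # For a Haar-random m-mode unitary each squared row norm is roughly n/m,
    # so the normalised product is of order one.
    return np.prod(m / n * row_norms_sq)

def fraction_above_one(u, in_modes, sampled_outputs):
    """Fraction of samples whose statistic exceeds 1; compare this fraction
    between experimental data and uniformly drawn output patterns."""
    return np.mean([row_norm_statistic(u, in_modes, out) > 1
                    for out in sampled_outputs])
```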
In this work we prove that boson sampling with $N$ particles in $M$ modes is equivalent to short-time evolution with $N$ excitations in an XY model of $2N$ spins. This mapping is efficient whenever the boson bunching probability is small, and errors can be efficiently postselected. This mapping opens the door to boson sampling with quantum simulators or general purpose quantum computers, and highlights the complexity of time-evolution with critical spin models, even for very short times.
Universal quantum computers promise a dramatic speed-up over classical computers but a full-size realization remains challenging. However, intermediate quantum computational models have been proposed that are not universal, but can solve problems that are strongly believed to be classically hard. Aaronson and Arkhipov have shown that interference of single photons in random optical networks can solve the hard problem of sampling the bosonic output distribution which is directly connected to computing matrix permanents. Remarkably, this computation does not require measurement-based interactions or adaptive feed-forward techniques. Here we demonstrate this model of computation using high-quality laser-written integrated quantum networks that were designed to implement random unitary matrix transformations. We experimentally characterize the integrated devices using an in-situ reconstruction method and observe three-photon interference that leads to the boson-sampling output distribution. Our results set a benchmark for quantum computers that hold the potential of outperforming conventional ones using only a few dozen photons and linear-optical elements.
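How sampling the bosonic output distribution connects to matrix permanents can be made concrete with a brute-force classical sampler for a handful of photons (a self-contained sketch; the Haar-random unitary below is a placeholder, not a reconstructed experimental network, and the function names are illustrative):

```python
# Brute-force classical sampling of the collision-free output distribution of
# a small random network: enumerate every output pattern, weight it by the
# squared modulus of a matrix permanent, and draw from the result. The cost
# explodes with photon number, which is why larger experiments are believed
# to be classically hard.
import numpy as np
from itertools import combinations

def permanent(a):
    """Permanent by expansion along the first row (adequate for tiny matrices)."""
    if a.shape[0] == 1:
        return a[0, 0]
    rest = np.delete(a, 0, axis=0)
    return sum(a[0, j] * permanent(np.delete(rest, j, axis=1))
               for j in range(a.shape[1]))

def haar_unitary(m, rng):
    """Haar-random m x m unitary from the QR decomposition of a Ginibre matrix."""
    z = (rng.standard_normal((m, m)) + 1j * rng.standard_normal((m, m))) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    return q * (np.diag(r) / np.abs(np.diag(r)))

def collision_free_distribution(u, in_modes):
    """Probabilities of all output patterns with at most one photon per mode."""
    m, n = u.shape[0], len(in_modes)
    events = list(combinations(range(m), n))
    probs = np.array([abs(permanent(u[np.ix_(list(out), list(in_modes))])) ** 2
                      for out in events])
    return events, probs / probs.sum()   # renormalise over collision-free events

rng = np.random.default_rng(seed=0)
U = haar_unitary(6, rng)                 # 6-mode random interferometer
events, probs = collision_free_distribution(U, in_modes=[0, 1, 2])
drawn = events[rng.choice(len(events), p=probs)]
print("sampled three-photon output pattern:", drawn)
```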
Sampling the distribution of bosons that have undergone a random unitary evolution is strongly believed to be a computationally hard problem. Key to outperforming classical simulations of this task is to increase both the number of input photons and the size of the network. We propose driven boson sampling, in which photons are input within the network itself, as a means to approach this goal. When using heralded single-photon sources based on parametric down-conversion, this approach offers an $\sim e$-fold enhancement in the input state generation rate over scattershot boson sampling, reaching the scaling limit for such sources. More significantly, this approach offers a dramatic increase in the signal-to-noise ratio with respect to higher-order photon generation from such probabilistic sources, which removes the need for photon number resolution during the heralding process as the size of the system increases.