
Testing Randomness in Quantum Mechanics

Posted by Isaac Pérez Castillo
Published: 2018
Language: English





Pseudo-random number generators are widely used in many branches of science, mainly in applications related to Monte Carlo methods, although they are deterministic in design and, therefore, unsuitable for tackling fundamental problems in security and cryptography. The natural laws of the microscopic realm provide a fairly simple method to generate non-deterministic sequences of random numbers, based on measurements of quantum states. In practice, however, the experimental devices on which quantum random number generators are based are often unable to pass some tests of randomness. In this review, we briefly discuss two such tests, point out the challenges that we have encountered and finally present a fairly simple method that successfully generates non-deterministic maximally random sequences.
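The abstract does not name the two randomness tests it discusses, so as a concrete illustration of what such a statistical test looks like, here is a minimal sketch of the NIST SP 800-22 frequency (monobit) test applied to a bit stream. The OS CSPRNG is used only as a stand-in source; it is not the quantum generator studied in the paper.

```python
# Minimal sketch of the NIST SP 800-22 frequency (monobit) test.
# Illustrative only; the two tests discussed in the review are not named in the abstract.
import math
import secrets

def monobit_test(bits):
    """Return the monobit-test p-value for a sequence of 0/1 bits."""
    n = len(bits)
    s = sum(1 if b else -1 for b in bits)      # map 0 -> -1, 1 -> +1 and sum
    s_obs = abs(s) / math.sqrt(n)              # normalised partial sum
    return math.erfc(s_obs / math.sqrt(2))     # p-value; reject randomness if below ~0.01

# Example: bits from the OS CSPRNG (a stand-in for a quantum random number generator).
sample = [secrets.randbits(1) for _ in range(10_000)]
print(f"monobit p-value: {monobit_test(sample):.4f}")
```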




Read also

Kevin Slagle (2021)
We consider the hypothesis that quantum mechanics is not fundamental, but instead emerges from a theory with less computational power, such as classical mechanics. This hypothesis makes the prediction that quantum computers will not be capable of sufficiently complex quantum computations. Utilizing this prediction, we outline a proposal to test for such a breakdown of quantum mechanics using near-term noisy intermediate-scale quantum (NISQ) computers. Our procedure involves simulating a non-Clifford random circuit, followed by its inverse, and then checking that the resulting state is the same as the initial state. We show that quantum mechanics predicts that the fidelity of this procedure decays exponentially with circuit depth (due to noise in NISQ computers). However, if quantum mechanics emerges from a theory with significantly less computational power, then we expect the fidelity to decay significantly more rapidly than the quantum mechanics prediction for sufficiently deep circuits, which is the experimental signature that we propose to search for. Useful experiments can be performed with 80 qubits and gate infidelity $10^{-3}$, while highly informative experiments should require only 1000 qubits and gate infidelity $10^{-5}$.
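The echo procedure described above can be sketched in a few lines of linear algebra. The toy below builds a random circuit from H, T, and CNOT gates (the T gate making it non-Clifford), applies it and then its inverse to |0...0>, and checks the return fidelity. The gate set, qubit count, and depth are arbitrary illustrative choices, and hardware noise, which drives the exponential decay discussed above, is not modelled here; this is not the authors' NISQ protocol.

```python
import numpy as np

rng = np.random.default_rng(7)
n_qubits, depth = 4, 20
dim = 2 ** n_qubits

# Elementary gates: H and CNOT are Clifford; the T gate makes the circuit non-Clifford.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
T = np.diag([1.0, np.exp(1j * np.pi / 4)])
CNOT = np.eye(4, dtype=complex)[:, [0, 1, 3, 2]]

def embed(gate, first_qubit):
    """Expand a gate on consecutive qubits (starting at first_qubit) to the full register."""
    k = int(np.log2(gate.shape[0]))
    left = np.eye(2 ** first_qubit, dtype=complex)
    right = np.eye(2 ** (n_qubits - first_qubit - k), dtype=complex)
    return np.kron(np.kron(left, gate), right)

# Build a random non-Clifford circuit as a single unitary, layer by layer.
U = np.eye(dim, dtype=complex)
for _ in range(depth):
    q = int(rng.integers(n_qubits))
    one_qubit_gate = H if rng.random() < 0.5 else T
    U = embed(one_qubit_gate, q) @ U
    q2 = int(rng.integers(n_qubits - 1))
    U = embed(CNOT, q2) @ U                    # CNOT on neighbouring qubits (q2, q2+1)

# Echo test: run the circuit forward, then its inverse, starting from |0...0>.
psi0 = np.zeros(dim, dtype=complex)
psi0[0] = 1.0
psi = U.conj().T @ (U @ psi0)
fidelity = abs(np.vdot(psi0, psi)) ** 2
print(f"noiseless return fidelity: {fidelity:.6f}")   # ~1 here; hardware noise lowers it
```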
Steganographic protocols enable one to embed covert messages into inconspicuous data over a public communication channel in such a way that no one, aside from the sender and the intended receiver, can even detect the presence of the secret message. In this paper, we provide a new provably-secure, private-key steganographic encryption protocol secure in the framework of Hopper et al. We first present a one-time stegosystem that allows two parties to transmit messages of length at most that of the shared key with information-theoretic security guarantees. The employment of a pseudorandom generator (PRG) permits secure transmission of longer messages in the same way that such a generator allows the use of one-time pad encryption for messages longer than the key in symmetric encryption. The advantage of our construction, compared to all previous work, is randomness efficiency: in the information-theoretic setting our protocol embeds a message of length n bits using a shared secret key of length (1+o(1))n bits while achieving security 2^{-n/log^{O(1)}n}; simply put, this gives a rate of key over message that is 1 as n tends to infinity (the previous best result achieved a constant rate greater than 1 regardless of the security offered). In this sense, our protocol is the first truly randomness-efficient steganographic system. Furthermore, in our protocol, we can permit a portion of the shared secret key to be public while retaining precisely n private key bits. In this setting, by separating the public and the private randomness of the shared key, we achieve security of 2^{-n}. Our result comes as an effect of the application of randomness extractors to stegosystem design. To the best of our knowledge this is the first time extractors have been applied in steganography.
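For readers unfamiliar with the Hopper et al. setting, the sketch below illustrates the generic rejection-sampling idea behind such stegosystems: to hide a bit, the sender keeps drawing innocuous covertexts from the channel until a keyed one-bit function of the covertext equals the message bit. This is not the paper's randomness-efficient construction; `sample_cover` and `bit_of` are hypothetical stand-ins for the covertext channel and the keyed function.

```python
# Illustrative rejection-sampling stegosystem sketch (not the paper's protocol).
import hmac, hashlib, random

def bit_of(key, index, cover):
    """Keyed one-bit function of a covertext (an assumed PRF instantiation)."""
    mac = hmac.new(key, f"{index}:{cover}".encode(), hashlib.sha256).digest()
    return mac[0] & 1

def sample_cover():
    # Hypothetical channel: innocuous-looking strings drawn at random.
    return random.choice(["ok", "sure", "thanks", "see you", "sounds good", "maybe later"])

def embed(key, message_bits, max_tries=64):
    stego = []
    for i, m in enumerate(message_bits):
        cover = sample_cover()
        tries = 1
        while bit_of(key, i, cover) != m and tries < max_tries:
            cover = sample_cover()          # rejection-sample until the keyed bit matches
            tries += 1
        stego.append(cover)
    return stego

def extract(key, stego):
    return [bit_of(key, i, cover) for i, cover in enumerate(stego)]

key = b"shared-secret-key"
msg = [1, 0, 1, 1, 0]
covers = embed(key, msg)
assert extract(key, covers) == msg
```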
In Mod. Phys. Lett. A 9, 3119 (1994), one of us (R.D.S.) investigated a formulation of quantum mechanics as a generalized measure theory. Quantum mechanics computes probabilities from the absolute squares of complex amplitudes, and the resulting interference violates the (Kolmogorov) sum rule expressing the additivity of probabilities of mutually exclusive events. However, there is a higher-order sum rule that quantum mechanics does obey, involving the probabilities of three mutually exclusive possibilities. We could imagine a yet more general theory by assuming that it violates the next higher sum rule. In this paper, we report results from an ongoing experiment that sets out to test the validity of this second sum rule by measuring the interference patterns produced by three slits and all the possible combinations of those slits being open or closed. We use attenuated laser light combined with single-photon counting to confirm the particle character of the measured light.
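For context, the second sum rule probed by such three-slit experiments is commonly expressed through the Sorkin parameter (standard notation, not taken from this abstract): $\varepsilon_3 = P_{ABC} - P_{AB} - P_{BC} - P_{AC} + P_A + P_B + P_C$, where $P_X$ is the detection probability with only the slits in $X$ open. The Born rule predicts $\varepsilon_3 = 0$, even though the ordinary two-slit interference term $I_{AB} = P_{AB} - P_A - P_B$ is generically nonzero.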
We present a new experimental approach using a three-path interferometer and find a tighter empirical upper bound on possible violations of Born's rule. A deviation from Born's rule would result in multi-order interference. Among the potential systematic errors that could lead to an apparent violation, we specifically study the nonlinear response of our detectors and present ways to calibrate this error in order to obtain an even better bound.
We consider sequential hypothesis testing between two quantum states using adaptive and non-adaptive strategies. In this setting, samples of an unknown state are requested sequentially and a decision to either continue or to accept one of the two hypotheses is made after each test. Under the constraint that the number of samples is bounded, either in expectation or with high probability, we exhibit adaptive strategies that minimize both types of misidentification errors. Namely, we show that these errors decrease exponentially (in the stopping time) with decay rates given by the measured relative entropies between the two states. Moreover, if we allow joint measurements on multiple samples, the rates are increased to the respective quantum relative entropies. We also fully characterize the achievable error exponents for non-adaptive strategies and provide numerical evidence showing that adaptive measurements are necessary to achieve our bounds under some additional assumptions.
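For context, the decay rates mentioned here are governed by two standard quantities (standard definitions, with notation chosen here rather than taken from the paper): the measured relative entropy $D_{\mathbb{M}}(\rho\|\sigma) = \sup_{M} D\big(P_{M,\rho}\,\big\|\,P_{M,\sigma}\big)$ with $P_{M,\rho}(x) = \mathrm{Tr}[M_x\rho]$, where the supremum runs over POVMs $M=\{M_x\}$ and $D(\cdot\|\cdot)$ is the classical Kullback-Leibler divergence, and the quantum relative entropy $D(\rho\|\sigma) = \mathrm{Tr}\big[\rho(\log\rho - \log\sigma)\big]$, which becomes attainable when joint measurements on multiple samples are allowed.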