Random numbers are an indispensable resource for many applications. A remarkable recent result is the realization that non-locality in quantum mechanics can be used to certify genuine randomness through Bell's theorem, producing reliable random numbers in a device-independent way. Here, we explore the contextuality aspect of quantum mechanics and show that true random numbers can be generated using single qutrits (three-state systems) alone, without entanglement or non-locality. In particular, we show that any observed violation of the Klyachko-Can-Binicioglu-Shumovsky (KCBS) inequality [Phys. Rev. Lett. 101, 20403 (2008)] provides a positive lower bound on genuine randomness. As a proof-of-concept experiment, we demonstrate with photonic qutrits that at least 5246 net true random numbers are generated with a confidence level of 99.9%.
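For reference, the KCBS scenario uses five rank-one qutrit projectors $\Pi_i$ with adjacent pairs compatible; noncontextual models obey $\sum_{i=1}^{5}\langle\Pi_i\rangle \le 2$, while quantum mechanics reaches $\sqrt{5}\approx 2.236$. The sketch below reproduces this standard construction numerically; it illustrates the inequality itself, not the randomness-certification protocol or the experiment reported in the abstract.

```python
import numpy as np

# Standard qutrit construction that maximally violates the KCBS inequality
# sum_i <Pi_i> <= 2 (noncontextual bound), reaching sqrt(5) ~ 2.236.
theta = np.arccos(np.sqrt(np.cos(np.pi / 5) / (1 + np.cos(np.pi / 5))))
vectors = []
for j in range(5):
    phi = 4 * np.pi * j / 5
    vectors.append(np.array([np.cos(theta),
                             np.sin(theta) * np.cos(phi),
                             np.sin(theta) * np.sin(phi)]))

# Adjacent projectors are compatible: their vectors are orthogonal.
for j in range(5):
    assert abs(np.dot(vectors[j], vectors[(j + 1) % 5])) < 1e-12

psi = np.array([1.0, 0.0, 0.0])                  # state giving the maximal violation
kcbs_value = sum(np.dot(v, psi) ** 2 for v in vectors)
print(kcbs_value)                                # ~2.236 > 2, violating the KCBS bound
```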
Random numbers are essential for applications ranging from secure communications to numerical simulation and quantitative finance. Algorithms can rapidly produce pseudo-random outcomes, series of numbers that mimic most properties of true random numbers, whereas quantum random number generators (QRNGs) exploit intrinsic quantum randomness to produce true random numbers. Single-photon QRNGs are conceptually simple but produce few random bits per detection. In contrast, vacuum fluctuations are a vast resource for QRNGs: they are broad-band and thus can encode many random bits per second. Direct recording of vacuum fluctuations is possible, but requires shot-noise-limited detectors, which come at the cost of bandwidth. We demonstrate efficient conversion of vacuum fluctuations to true random bits using optical amplification of vacuum and interferometry. Using commercially available optical components we demonstrate a QRNG at a bit rate of 1.11 Gbps. The proposed scheme has the potential to be extended to 10 Gbps and even up to 100 Gbps by taking advantage of high-speed modulation sources and detectors developed for optical fiber telecommunications.
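Purely as an illustration of the generic processing chain behind such QRNGs (digitizing shot-noise-limited quadrature samples, then applying randomness extraction), the sketch below simulates Gaussian vacuum-quadrature measurements and distils bits with a Toeplitz-hash extractor. The 8-bit ADC, clipping range, and 50% extraction ratio are assumptions made for the example, not parameters taken from the experiment described above.

```python
import numpy as np
from scipy.linalg import toeplitz

rng = np.random.default_rng(0)   # stands in for the physical entropy source

def digitize(samples, bits=8):
    """Map shot-noise-limited (Gaussian) samples onto 2**bits ADC codes, clipped at +/-4 sigma."""
    clipped = np.clip(samples, -4.0, 4.0)
    codes = ((clipped + 4.0) / 8.0 * (2**bits - 1)).astype(np.uint8)
    return np.unpackbits(codes)

def toeplitz_extract(raw_bits, out_len):
    """Distil nearly uniform bits via a random binary Toeplitz hash (mod-2 matrix product)."""
    n = len(raw_bits)
    seed = rng.integers(0, 2, size=out_len + n - 1)
    T = toeplitz(seed[:out_len], np.concatenate([seed[:1], seed[out_len:]]))
    return T.dot(raw_bits) % 2

samples = rng.standard_normal(256)                     # simulated vacuum-quadrature samples
raw = digitize(samples)                                # 8 raw bits per sample
bits = toeplitz_extract(raw, out_len=len(raw) // 2)    # keep ~50% after extraction
print(len(bits), "extracted random bits")
```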
A central result in the foundations of quantum mechanics is the Kochen-Specker theorem. In short, it states that quantum mechanics is in conflict with classical models in which the result of a measurement does not depend on which other compatible measurements are jointly performed. Here, compatible measurements are those that can be performed simultaneously or in any order without disturbance. This conflict is generically called quantum contextuality. In this article, we present an introduction to this subject and its current status. We review several proofs of the Kochen-Specker theorem and different notions of contextuality. We explain how to experimentally test some of these notions and discuss connections between contextuality and nonlocality or graph theory. Finally, we review some applications of contextuality in quantum information processing.
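As a concrete illustration of the kind of state-independent argument such a review covers (chosen here for illustration, not singled out by the abstract), the snippet below checks the Peres-Mermin square numerically: the two-qubit Pauli observables in each row and each column are mutually compatible, every row multiplies to $+I$, yet one column multiplies to $-I$, which no noncontextual assignment of $\pm 1$ values can reproduce.

```python
import numpy as np

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]], dtype=complex)
kron = np.kron

# Peres-Mermin square of two-qubit Pauli observables.
square = [[kron(X, I2), kron(I2, X), kron(X, X)],
          [kron(I2, Z), kron(Z, I2), kron(Z, Z)],
          [kron(X, Z), kron(Z, X), kron(Y, Y)]]

def product(ops):
    out = np.eye(4, dtype=complex)
    for op in ops:
        out = out @ op
    return out

for r in range(3):
    row = square[r]
    col = [square[i][r] for i in range(3)]
    # Operators within each row and each column commute, so they are compatible.
    assert all(np.allclose(a @ b, b @ a) for a in row for b in row)
    assert all(np.allclose(a @ b, b @ a) for a in col for b in col)
    print("row", r, "product = +I:", np.allclose(product(row), np.eye(4)))
    print("col", r, "product = -I:", np.allclose(product(col), -np.eye(4)))
```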
In quantum physics the term `contextual' can be used in more than one way. One usage, here called `Bell contextual' since the idea goes back to Bell, is that if $A$, $B$ and $C$ are three quantum observables, with $A$ compatible (i.e., commuting) with $B$ and also with $C$, whereas $B$ and $C$ are incompatible, a measurement of $A$ might yield a different result (indicating that quantum mechanics is contextual) depending upon whether $A$ is measured along with $B$ (the $\{A,B\}$ context) or with $C$ (the $\{A,C\}$ context). An analysis of what projective quantum measurements measure shows that quantum theory is Bell noncontextual: the outcome of a particular $A$ measurement when $A$ is measured along with $B$ would have been exactly the same if $A$ had, instead, been measured along with $C$. A different definition, here called `globally (non)contextual', refers to whether there is (noncontextual) or is not (contextual) a single joint probability distribution that simultaneously assigns probabilities in a consistent manner to the outcomes of measurements of a certain collection of observables, not all of which are compatible. A simple example shows that such a joint probability distribution can exist even in a situation where the measurement probabilities cannot refer to properties of a quantum system, and hence lack physical significance even though they are mathematically well defined. It is noted that the quantum sample space, a projective decomposition of the identity, required for interpreting measurements of incompatible properties in different runs of an experiment using different types of apparatus has a tensor product structure, a fact sometimes overlooked.
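To make the `global' notion concrete, the sketch below gives an illustrative feasibility test (an example of our own, not one taken from the article): three $\pm 1$ observables that are pairwise perfectly anticorrelated admit no single joint probability distribution reproducing all three pairwise correlations, which a small linear program confirms.

```python
import itertools
import numpy as np
from scipy.optimize import linprog

# Three +/-1 observables A, B, C, only pairwise jointly measurable.
outcomes = list(itertools.product([-1, 1], repeat=3))   # candidate global assignments

# Target pairwise correlations: perfect anticorrelation for every pair.
# Classically impossible: ab = bc = -1 forces ca = +1, contradicting ca = -1.
targets = [(lambda a, b, c: a * b, -1.0),
           (lambda a, b, c: b * c, -1.0),
           (lambda a, b, c: c * a, -1.0)]

A_eq = [np.ones(len(outcomes))]                          # normalization row
b_eq = [1.0]
for f, value in targets:
    A_eq.append([f(a, b, c) for a, b, c in outcomes])
    b_eq.append(value)

res = linprog(c=np.zeros(len(outcomes)), A_eq=np.array(A_eq), b_eq=np.array(b_eq),
              bounds=[(0, None)] * len(outcomes), method="highs")
print("global joint distribution exists:", res.success)  # False: no consistent assignment
```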
The notion of contextuality, which emerges from a theorem established by Simon Kochen and Ernst Specker (1960-1967) and by John Bell (1964-1966), is certainly one of the most fundamental aspects of quantum weirdness. While it was a questioning of scholastic philosophy and a study of counterfactual logic that led Specker to his demonstration with Kochen, it was a criticism of von Neumann's proof that led John Bell to the result. A misinterpretation of this famous proof led them to diametrically opposite conclusions. Over the last decades, remarkable theoretical progress has been made on the subject in the context of the study of quantum foundations and quantum information. Thus, the graph-theoretic generalizations of Cabello-Severini-Winter and Acin-Fritz-Leverrier-Sainz raise the question of the connection between non-locality and contextuality. The same is true of the sheaf-theoretic approach of Samson Abramsky et al., which also invites us to compare contextuality with the logical structure of certain classical logical paradoxes. Another approach, initiated by Robert Spekkens, generalizes the concept to any type of experimental procedure. This new form of universal contextuality has been proposed as a criterion of non-classicality, i.e. of weirdness. It has notably led to identifying the nature of curious quantum paradoxes involving post-selections and weak measurements. In the light of the fiftieth anniversary of the publication of the Kochen-Specker theorem, this report aims to introduce these results, little known to the French scientific public, in the context of an investigation into the nature of the weirdness of quantum physics.
It is well known that certain measurement scenarios behave in a way that cannot be explained by classical theories but can be explained by quantum theories. These behaviours are usually studied via Bell or non-contextuality (NC) inequalities. Knowing the maximal classical and quantum bounds of these inequalities is interesting, but tells us little about the quantum set Q of all quantum behaviours P. Despite having a constructive description of the quantum set associated with a given inequality, the freedom to choose the quantum dimension, quantum states, and quantum measurements makes the shape of such convex bodies quite elusive. It is well known that an NC inequality can be associated with a graph, and the corresponding quantum set is a combinatorial object. Extra conditions, like Bell's concept of parts, may restrict the behaviours achievable within quantum theory for a given scenario. For the simplest case, the CHSH inequality, the NC and Be
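As a minimal illustration of the graph-theoretic viewpoint mentioned above, the sketch below computes the Lovász theta number of the 5-cycle (the exclusivity graph associated with the KCBS inequality) by semidefinite programming; in the Cabello-Severini-Winter approach this number, $\sqrt{5}$, is the maximal quantum value of the corresponding NC inequality. The use of cvxpy and the choice of this particular graph are assumptions made for the example.

```python
import cvxpy as cp

n = 5
edges = [(i, (i + 1) % n) for i in range(n)]      # 5-cycle: KCBS exclusivity graph

# Lovasz theta SDP: maximize the total sum of X subject to X being PSD,
# trace(X) = 1, and X_ij = 0 on every edge of the graph.
X = cp.Variable((n, n), symmetric=True)
constraints = [X >> 0, cp.trace(X) == 1]
constraints += [X[i, j] == 0 for i, j in edges]
problem = cp.Problem(cp.Maximize(cp.sum(X)), constraints)
problem.solve()
print(problem.value)   # ~2.236 = sqrt(5), the quantum maximum for the 5-cycle
```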