
From the problem of Future Contingents to Peres-Mermin square experiments: An introductory review to Contextuality

Posted by Gael Masse
Publication date: 2021
Research field: Physics
Paper language: English
Author: G. Masse





We review the field of contextuality in quantum mechanics. We study the historical emergence of the concept from philosophical and logical issues. We present and compare the main theoretical frameworks that have been derived. Finally, we focus on the complex task of establishing experimental tests of contextuality. Throughout this work, we try to show that the conceptualisation of contextuality has progressed through different complementary perspectives, before bringing them together to analyse the significance of contextuality experiments. In doing so, we argue that contextuality emerged as a discrete logical problem and developed into a quantifiable quantum resource.




Read also

Pawel Blasiak (2013)
Contextuality lies at the heart of quantum mechanics. In the prevailing opinion it is considered a signature of quantumness that classical theories lack. However, this assertion is only partially justified. Although contextuality is certainly true of quantum mechanics, it cannot by itself be taken as discriminating against classical theories. Here we consider a representative example of contextual behaviour, the so-called Mermin-Peres square, and present a discrete toy model of a bipartite system which reproduces the pattern of quantum predictions that leads to a contradiction with the assumption of non-contextuality. This illustrates that quantum-like contextual effects have analogues within classical models with epistemic constraints such as limited information gain and measurement disturbance.
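To make the structure of the Mermin-Peres square concrete, here is a minimal NumPy sketch (not taken from either paper) that builds the nine two-qubit Pauli observables of the square and checks the algebraic relations behind the argument: every row multiplies to the identity, the first two columns do as well, and the third column multiplies to minus the identity, which is what no noncontextual value assignment can reproduce.

# Minimal sketch: verify the operator algebra of the Peres-Mermin square.
import numpy as np

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

# The nine two-qubit observables of the Peres-Mermin square, row by row.
square = [
    [np.kron(X, I2), np.kron(I2, X), np.kron(X, X)],
    [np.kron(I2, Y), np.kron(Y, I2), np.kron(Y, Y)],
    [np.kron(X, Y),  np.kron(Y, X),  np.kron(Z, Z)],
]

I4 = np.eye(4, dtype=complex)

# Each row of observables multiplies to +identity ...
for r in range(3):
    assert np.allclose(square[r][0] @ square[r][1] @ square[r][2], I4)

# ... and so do the first two columns, while the third column gives -identity.
for c in range(3):
    expected = -I4 if c == 2 else I4
    assert np.allclose(square[0][c] @ square[1][c] @ square[2][c], expected)

print("rows multiply to +1, +1, +1; columns to +1, +1, -1 (times identity)")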
A central result in the foundations of quantum mechanics is the Kochen-Specker theorem. In short, it states that quantum mechanics cannot be reconciled with classical models that are noncontextual for ideal measurements. The first explicit derivation by Kochen and Specker was rather complex, but considerable simplifications have been achieved since. We propose a systematic approach to finding minimal Hardy-type and Greenberger-Horne-Zeilinger-type (GHZ-type) proofs of the Kochen-Specker theorem; these are characterized by the fact that the predictions of classical models are opposite to the predictions of quantum mechanics. Based on our results, we show that the Kochen-Specker set with 18 vectors from Cabello et al. [A. Cabello et al., Phys. Lett. A 212, 183 (1996)] is the minimal set for any dimension, verifying a longstanding conjecture by Peres. Our results allow us to identify minimal contextuality scenarios and to study their usefulness for information processing.
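The "opposite predictions" of a GHZ-type proof can be illustrated with a small brute-force search. The sketch below does this for the Peres-Mermin constraints rather than the 18-vector set of Cabello et al.; the layout of rows, columns and signs is the standard one for that square, not code from the paper.

# Minimal sketch: no noncontextual +/-1 assignment to the nine Peres-Mermin
# observables reproduces all six quantum product constraints.
from itertools import product

# Index the nine observables 0..8, laid out row by row.
rows = [(0, 1, 2), (3, 4, 5), (6, 7, 8)]
cols = [(0, 3, 6), (1, 4, 7), (2, 5, 8)]

# Quantum mechanics fixes the product of outcomes in each context:
# +1 for every row and the first two columns, -1 for the last column.
constraints = [(ctx, +1) for ctx in rows] + \
              [(cols[0], +1), (cols[1], +1), (cols[2], -1)]

solutions = [
    v for v in product((+1, -1), repeat=9)
    if all(v[i] * v[j] * v[k] == target for (i, j, k), target in constraints)
]

# No assignment survives: every noncontextual model must contradict
# the quantum prediction in at least one context.
print("noncontextual assignments found:", len(solutions))  # -> 0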
Exploring the graph approach, we restate the extended definition of noncontextuality provided by the contextuality-by-default framework. This extended definition avoids the assumption of nondisturbance, which states that whenever two contexts overlap, the marginal distribution obtained for the intersection must be the same. We show how standard tools for characterizing contextuality can also be used in this extended framework for any set of measurements and, in addition, we provide several conditions that can be tested directly in any contextuality experiment. Our conditions reduce to the traditional noncontextuality conditions when the nondisturbance assumption is satisfied.
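As an illustration of the nondisturbance condition mentioned above, the following sketch uses hypothetical outcome statistics for two overlapping contexts {A, B} and {A, C} and compares the marginal distribution of the shared measurement A; all numbers are invented for this example.

# Minimal sketch: check nondisturbance for two contexts sharing measurement A.
import numpy as np

# p[a, b]: joint outcome distribution in context {A, B} (hypothetical data).
p_AB = np.array([[0.40, 0.10],
                 [0.10, 0.40]])
# p[a, c]: joint outcome distribution in context {A, C} (hypothetical data).
p_AC = np.array([[0.35, 0.25],
                 [0.15, 0.25]])

marginal_A_from_AB = p_AB.sum(axis=1)   # sum over outcomes of B
marginal_A_from_AC = p_AC.sum(axis=1)   # sum over outcomes of C

nondisturbing = np.allclose(marginal_A_from_AB, marginal_A_from_AC, atol=1e-9)
print("marginal of A in {A,B}:", marginal_A_from_AB)   # [0.5, 0.5]
print("marginal of A in {A,C}:", marginal_A_from_AC)   # [0.6, 0.4]
print("nondisturbance satisfied:", nondisturbing)      # False: exactly the
# kind of disturbing data the contextuality-by-default framework accommodates.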
Riccardo Scarpa (2006)
By the time the Swiss astronomer Zwicky measured the velocity dispersion of the Coma cluster of galaxies in 1937, astronomers had somehow become acquainted with the idea that the universe is filled with some kind of dark matter. After almost a century of investigations, we have learned two things about dark matter: (i) it has to be non-baryonic, that is, made of something new that interacts with normal matter only through gravitation, and (ii) its effects are observed in stellar systems when, and only when, their internal acceleration of gravity falls below a fixed value a0 = 1.2x10^-8 cm s^-2. This systematic behaviour, more than anything else, tells us we might be facing a failure of the law of gravity in the weak-field limit rather than the effects of dark matter. Thus, in an attempt to avoid the need for dark matter, Modified Newtonian Dynamics (MOND) was proposed. MOND posits a breakdown of Newton's law of gravity (or inertia) below a0, below which the dependence on distance becomes linear. Despite many attempts, MOND has stubbornly resisted falsification as an alternative to dark matter and succeeds in explaining the properties of an impressively large number of objects without invoking the presence of non-baryonic dark matter. In this paper, I review the basics of MOND and its ability to explain observations without the need for dark matter.
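For concreteness, the sketch below (not from the paper) contrasts the Newtonian circular velocity of a point mass with the deep-MOND prediction obtained from the standard low-acceleration relation g = sqrt(g_N * a0), using the a0 value quoted above; the mass and radii are illustrative choices.

# Minimal sketch: Newtonian vs deep-MOND circular velocity around a point mass.
import numpy as np

G = 6.674e-8          # gravitational constant, cm^3 g^-1 s^-2
a0 = 1.2e-8           # MOND acceleration scale, cm s^-2
M = 1e11 * 1.989e33   # illustrative mass: 10^11 solar masses, in grams

r = np.logspace(21, 23, 5)          # radii in cm (roughly 0.3 to 30 kpc)
g_newton = G * M / r**2             # Newtonian acceleration
v_newton = np.sqrt(g_newton * r)    # Keplerian velocity, falls with radius

# Deep-MOND regime (g << a0): g = sqrt(g_newton * a0), so v^4 = G * M * a0,
# i.e. the rotation curve flattens at large radii.
g_mond = np.sqrt(g_newton * a0)
v_mond = np.sqrt(g_mond * r)

for ri, vn, vm in zip(r, v_newton / 1e5, v_mond / 1e5):
    print(f"r = {ri:.1e} cm : v_Newton = {vn:6.1f} km/s, v_MOND = {vm:6.1f} km/s")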
The connection between contextuality and graph theory has led to many developments in the field. In particular, the sets of probability distributions in many contextuality scenarios can be described using well-known convex sets from graph theory, leading to a beautiful geometric characterization of such sets. This geometry can also be exploited to define contextuality quantifiers based on geometric distances, which is important for the resource theory of contextuality, developed after the recognition of contextuality as a potential resource for quantum computation. In this paper we review the geometric aspects of contextuality and use them to define several quantifiers, which have the advantage of being applicable to the exclusivity approach to contextuality, where previously defined quantifiers do not fit.
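A simple example of a distance-based quantifier of the kind discussed here is the L1 distance from an empirical behaviour to the noncontextual polytope, which can be computed as a linear program. The sketch below does this for the CHSH scenario with the PR box as an illustrative target behaviour; the choice of scenario, target and scipy solver are assumptions made for this example, not constructions from the paper.

# Minimal sketch: L1 distance from a behaviour to the noncontextual polytope
# of the CHSH scenario, computed as a linear program.
import numpy as np
from itertools import product
from scipy.optimize import linprog

def index(x, y, a, b):
    # Position of p(a, b | x, y) in the 16-entry behaviour vector.
    return ((x * 2 + y) * 2 + a) * 2 + b

# Deterministic noncontextual vertices: a = f(x), b = g(y).
vertices = []
for f in product((0, 1), repeat=2):
    for g in product((0, 1), repeat=2):
        v = np.zeros(16)
        for x, y in product((0, 1), repeat=2):
            v[index(x, y, f[x], g[y])] = 1.0
        vertices.append(v)
V = np.array(vertices).T            # columns are the 16 deterministic behaviours

# Target behaviour: the PR box, p(a, b | x, y) = 1/2 iff a XOR b = x * y.
p = np.zeros(16)
for x, y, a, b in product((0, 1), repeat=4):
    if a ^ b == x * y:
        p[index(x, y, a, b)] = 0.5

# Minimise sum_i t_i subject to |p - V w|_i <= t_i, w >= 0, sum(w) = 1.
c = np.concatenate([np.zeros(16), np.ones(16)])
I16 = np.eye(16)
A_ub = np.vstack([np.hstack([V, -I16]),     #  V w - t <= p
                  np.hstack([-V, -I16])])   # -V w - t <= -p
b_ub = np.concatenate([p, -p])
A_eq = np.hstack([np.ones((1, 16)), np.zeros((1, 16))])
b_eq = np.array([1.0])

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None)] * 32, method="highs")
print("L1 distance of the PR box to the noncontextual polytope:", res.fun)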