The relationship between quantum logic, standard propositional logic, and the (consistent) histories rules for quantum reasoning is discussed. It is shown that Maudlin's claim [Am. J. Phys. 79 (2011) 954] that the histories approach is inconsistent is incorrect. The histories approach is both internally consistent and adequate for discussing the physical situations considered by Maudlin.
In arXiv:1707.08641, Tim Maudlin claims to construct a counterexample to the result of Proc. Roy. Soc. A vol. 473, iss. 2202, 2017 (arXiv:1607.07871), in which it was shown that no realist model satisfying a certain notion of time-symmetry (in addition to three other assumptions) can reproduce the predictions of quantum theory without retrocausality (influences travelling backwards in time). In this comment, I explain why Maudlin's model is not a counterexample: it does not satisfy our time-symmetry assumption. I also explain why Maudlin's claim that one of the lemmas used in our proof is incorrect is itself mistaken.
A recent post by Tim Maudlin to this archive (arXiv:1408.1828) was entitled "Reply to Werner". However, it was not clear to what text this was supposed to be a reply. Here I briefly provide this context, and show that Maudlin's post is as ill-conceived as the original paper (arXiv:1408.1826).
The decoherence of a two-state tunneling molecule, such as a chiral molecule or ammonia, due to collisions with a buffer gas is analyzed in terms of a succession of quantum states of the molecule satisfying the conditions for a consistent family of histories. With $\hbar\omega$ the separation in energy of the levels in the isolated molecule and $\gamma$ a decoherence rate proportional to the rate of collisions, we find for $\gamma \gg \omega$ (strong decoherence) a consistent family in which the molecule flips randomly back and forth between the left- and right-handed chiral states in a stationary Markov process. For $\gamma < \omega$ there is a family in which the molecule oscillates continuously between the different chiral states, but with occasional random changes of phase, at a frequency that goes to zero at a phase transition $\gamma = \omega$. This transition is similar to the behavior of the inversion frequency of ammonia with increasing pressure, but will be difficult to observe in chiral molecules such as D$_2$S$_2$. There are additional consistent families both for $\gamma > \omega$ and for $\gamma < \omega$. In addition we relate the speed with which chiral information is transferred to the environment to the rate of decrease of complementary types of information (e.g., parity information) remaining in the molecule itself.
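The strong-decoherence regime described in this abstract, in which the molecule flips randomly between left- and right-handed chiral states in a stationary Markov process, can be caricatured by a simple two-state jump simulation. This is an illustrative sketch only: the flip rate `gamma` here is a stand-in parameter, not the decoherence rate derived in the paper, and the function name is invented for this example.

```python
import random

def simulate_chiral_flips(gamma, t_max, seed=0):
    """Two-state Markov jump process: random L/R chirality flips.

    gamma: flip rate (illustrative stand-in, not the paper's derived rate)
    t_max: total simulated time
    Returns a list of (time, state) pairs, one per jump.
    """
    rng = random.Random(seed)
    t, state = 0.0, "L"
    trajectory = [(0.0, state)]
    while True:
        # Waiting times between flips are exponential with rate gamma
        t += rng.expovariate(gamma)
        if t > t_max:
            break
        state = "R" if state == "L" else "L"
        trajectory.append((t, state))
    return trajectory
```

In a stationary Markov process of this kind the molecule spends, on average, equal time in each chiral state, which is what makes the flipping history family consistent in the strong-decoherence limit.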
An example shows that weak decoherence is more restrictive than the minimal logical decoherence structure that allows probabilities to be used consistently for quantum histories. The probabilities in the sum rules that define minimal decoherence are all calculated by using a projection operator to describe each possibility for the state at each time. Weak decoherence requires more sum rules. These bring in additional variables that require different measurements and a different way to calculate probabilities, and they raise questions of operational meaning. The example shows that extending the linearly positive probability formula from weak to minimal decoherence gives probabilities that differ from those calculated in the usual way, using the Born and von Neumann rules and a projection operator at each time.
Although the notion of superdeterminism can, in principle, account for the violation of the Bell inequalities, this potential explanation has been roundly rejected by the quantum foundations community. The arguments for rejection, one of the most substantive coming from Bell himself, are critically reviewed. In particular, analysis of Bell's argument reveals an implicit unwarranted assumption: that the Euclidean metric is the appropriate yardstick for measuring distances in state space. Bell's argument is largely negated if this yardstick is instead based on the alternative $p$-adic metric. Such a metric, common in number theory, arises naturally when describing chaotic systems which evolve precisely on self-similar invariant sets in their state space. A locally-causal realistic model of quantum entanglement is developed, based on the premise that the laws of physics ultimately derive from an invariant-set geometry in the state space of a deterministic quasi-cyclic mono-universe. Based on this, the notion of a complex Hilbert vector is reinterpreted in terms of an uncertain selection from a finite sample space of states, leading to a novel form of `consistent histories' based on number-theoretic properties of the transcendental cosine function. This leads to novel realistic interpretations of position/momentum non-commutativity, EPR, the Bell Theorem and the Tsirelson bound. In this inherently holistic theory - neither conspiratorial, retrocausal, fine-tuned nor nonlocal - superdeterminism is not invoked by fiat but is emergent from these `consistent histories' number-theoretic constraints. Invariant set theory provides new perspectives on many of the contemporary problems at the interface of quantum and gravitational physics, and, if correct, may signal the end of particle physics beyond the Standard Model.
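The $p$-adic metric invoked in this abstract is a standard number-theoretic notion: for a prime $p$, the distance between integers $a$ and $b$ is $p^{-v_p(a-b)}$, where $v_p$ is the largest power of $p$ dividing $a - b$. The following minimal sketch (function names are ours, for illustration only; this is not the paper's model) shows how it reorders "closeness" relative to the Euclidean yardstick:

```python
def padic_valuation(n, p):
    """Largest k such that p**k divides n (requires n != 0)."""
    k = 0
    while n % p == 0:
        n //= p
        k += 1
    return k

def padic_distance(a, b, p):
    """p-adic distance |a - b|_p = p**(-v_p(a - b)); zero when a == b."""
    if a == b:
        return 0.0
    return p ** -padic_valuation(abs(a - b), p)
```

Note the inversion of intuition this metric produces: 2 and 10 differ by 8 = 2^3, so their 2-adic distance is 1/8, whereas the Euclidean neighbours 1 and 2 are at the maximal 2-adic distance 1. It is this kind of reordering of state-space distances that the abstract argues undermines the Euclidean assumption in Bell's argument.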