
Probability sum rules and consistent quantum histories

Added by Eric D. Chisolm
Publication date: 2009
Fields: Physics
Language: English





An example shows that weak decoherence is more restrictive than the minimal logical decoherence structure that allows probabilities to be assigned consistently to quantum histories. The probabilities in the sum rules that define minimal decoherence are all calculated using a projection operator to describe each possibility for the state at each time. Weak decoherence requires more sum rules; these bring in additional variables that require different measurements and a different way of calculating probabilities, and they raise questions of operational meaning. The example shows that extending the linearly positive probability formula from weak to minimal decoherence gives probabilities that differ from those calculated in the usual way, using the Born and von Neumann rules with a projection operator at each time.
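To fix notation for the objects the abstract refers to (a sketch in standard consistent-histories conventions, which may differ in detail from the paper's): a history $\alpha$ assigns a projector to each of the times $t_1 < \dots < t_n$, with chain operator $C_\alpha = P^{n}_{\alpha_n}(t_n) \cdots P^{1}_{\alpha_1}(t_1)$, Born/von Neumann probability $p(\alpha) = {\rm Tr}[C_\alpha \rho\, C_\alpha^\dagger]$, and decoherence functional $D(\alpha,\beta) = {\rm Tr}[C_\alpha \rho\, C_\beta^\dagger]$. Since coarse graining adds chain operators, $p(\alpha \vee \beta) = p(\alpha) + p(\beta) + 2\,{\rm Re}\,D(\alpha,\beta)$, so each additivity sum rule holds exactly when ${\rm Re}\,D(\alpha,\beta) = 0$; weak decoherence demands this for every pair of distinct histories in the family, while the linearly positive formula instead assigns $p(\alpha) = {\rm Re}\,{\rm Tr}[C_\alpha \rho]$.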



Related research

137 - Akira Inomata 1998
A brief history of Feynman's sum over histories is presented, with a focus on the progress of path-integration techniques for exactly path-integrable problems in quantum mechanics.
123 - Robert B. Griffiths 2011
The relationship between quantum logic, standard propositional logic, and the (consistent) histories rules for quantum reasoning is discussed. It is shown that Maudlin's claim [Am. J. Phys. 79 (2011) 954] that the histories approach is inconsistent is incorrect. The histories approach is both internally consistent and adequate for discussing the physical situations considered by Maudlin.
The decoherence of a two-state tunneling molecule, such as a chiral molecule or ammonia, due to collisions with a buffer gas is analyzed in terms of a succession of quantum states of the molecule satisfying the conditions for a consistent family of histories. With $\hbar\omega$ the separation in energy of the levels in the isolated molecule and $\gamma$ a decoherence rate proportional to the rate of collisions, we find for $\gamma \gg \omega$ (strong decoherence) a consistent family in which the molecule flips randomly back and forth between the left- and right-handed chiral states in a stationary Markov process. For $\gamma < \omega$ there is a family in which the molecule oscillates continuously between the different chiral states, but with occasional random changes of phase, at a frequency that goes to zero at a phase transition $\gamma = \omega$. This transition is similar to the behavior of the inversion frequency of ammonia with increasing pressure, but will be difficult to observe in chiral molecules such as D$_2$S$_2$. There are additional consistent families both for $\gamma > \omega$ and for $\gamma < \omega$. In addition we relate the speed with which chiral information is transferred to the environment to the rate of decrease of complementary types of information (e.g., parity information) remaining in the molecule itself.
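Schematically (an illustrative assumption, not the paper's derivation), the chiral dynamics behaves like a damped two-level oscillation: for $\gamma < \omega$ the oscillation persists at a reduced effective frequency, of the form $\Omega \sim \sqrt{\omega^2 - \gamma^2}$ up to convention-dependent factors, which vanishes as $\gamma \to \omega$; beyond that point the coherent oscillation is replaced by incoherent random flips, in analogy with the underdamped-to-overdamped transition of a classical oscillator.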
131 - S. Rodini 2020
Different decompositions of the nucleon mass, in terms of the masses and energies of the underlying constituents, have been proposed in the literature. We explore the corresponding sum rules in quantum electrodynamics for an electron at one-loop order in perturbation theory. To this end we compute the form factors of the energy-momentum tensor, paying particular attention to the renormalization of ultraviolet divergences, operator mixing and scheme dependence. We clarify the expressions of all the proposed sum rules in the electron rest frame in terms of renormalized operators. Furthermore, we consider the same sum rules in a moving frame, where they become energy decompositions. Finally, we discuss some implications of our study for the mass sum rules for the nucleon.
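For orientation (a sketch in one common convention; parametrizations of the energy-momentum tensor form factors vary across the literature), with $P = (p + p')/2$, $\Delta = p' - p$, and $t = \Delta^2$, the matrix element for constituent $a$ (electron or photon in QED, quark or gluon in QCD) can be written as $\langle p'|T_a^{\mu\nu}|p\rangle = \bar u(p')\big[A_a(t)\,\gamma^{(\mu}P^{\nu)} + B_a(t)\,P^{(\mu} i\sigma^{\nu)\rho}\Delta_\rho/(2M) + C_a(t)\,(\Delta^\mu\Delta^\nu - g^{\mu\nu}\Delta^2)/M + \bar C_a(t)\,M g^{\mu\nu}\big]u(p)$. Conservation of the total tensor enforces $\sum_a \bar C_a(0) = 0$, and the momentum sum rule gives $\sum_a A_a(0) = 1$; these are the constraints from which the various mass and energy decompositions are assembled.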
62 - T.N.Palmer 2016
Although the notion of superdeterminism can, in principle, account for the violation of the Bell inequalities, this potential explanation has been roundly rejected by the quantum foundations community. The arguments for rejection, one of the most substantive coming from Bell himself, are critically reviewed. In particular, analysis of Bell's argument reveals an implicit unwarranted assumption: that the Euclidean metric is the appropriate yardstick for measuring distances in state space. Bell's argument is largely negated if this yardstick is instead based on the alternative $p$-adic metric. Such a metric, common in number theory, arises naturally when describing chaotic systems which evolve precisely on self-similar invariant sets in their state space. A locally causal realistic model of quantum entanglement is developed, based on the premise that the laws of physics ultimately derive from an invariant-set geometry in the state space of a deterministic quasi-cyclic mono-universe. Based on this, the notion of a complex Hilbert vector is reinterpreted in terms of an uncertain selection from a finite sample space of states, leading to a novel form of 'consistent histories' based on number-theoretic properties of the transcendental cosine function. This leads to novel realistic interpretations of position/momentum non-commutativity, EPR, the Bell theorem and the Tsirelson bound. In this inherently holistic theory - neither conspiratorial, retrocausal, fine-tuned nor nonlocal - superdeterminism is not invoked by fiat but is emergent from these 'consistent histories' number-theoretic constraints. Invariant set theory provides new perspectives on many of the contemporary problems at the interface of quantum and gravitational physics and, if correct, may signal the end of particle physics beyond the Standard Model.
