Motivated by the question of what it is that makes quantum mechanics a holistic theory (if indeed it is one), I try to define, for general physical theories, what we mean by `holism'. For this purpose I propose an epistemological criterion to decide whether or not a physical theory is holistic, namely: a physical theory is holistic if and only if it is impossible in principle to infer the global properties, as assigned in the theory, using the local resources available to an agent. I propose that these resources include at least all local operations and classical communication. This approach is contrasted with the well-known approaches to holism in terms of supervenience. The criterion for holism proposed here involves a shift in emphasis from ontology to epistemology. I apply this epistemological criterion to classical physics and Bohmian mechanics, as represented on a phase space and a configuration space respectively, and to quantum mechanics (in the orthodox interpretation) using the formalism of general quantum operations as completely positive trace-non-increasing maps. Furthermore, I provide an interesting example from which one can conclude that quantum mechanics is holistic in the above-mentioned sense, although, perhaps surprisingly, no entanglement is needed.
On the basis of years of experience working on the problem of the physical foundations of quantum mechanics, the author offers principles for solving it. Under a certain pressure from the mathematical formalism, a hypothesis has arisen concerning the complexity of Minkowski space and time, significant mainly for quantum objects. In this eight-dimensional space-time, with six space and two time dimensions, all the problems and peculiarities of the quantum-mechanical formalism disappear, the reasons for their appearance become clear, and a clear and physically transparent picture of the foundations of quantum mechanics emerges.
Mathematical modeling should present a consistent description of physical phenomena. We illustrate an inconsistency with two Hamiltonians -- the standard Hamiltonian and an example found in Goldstein -- for the simple harmonic oscillator and its quantisation. Both descriptions are rich in Lie point symmetries and so one can calculate many Jacobi Last Multipliers and therefore Lagrangians. The Last Multiplier provides the route to the resolution of this problem and indicates that the great debate about the quantisation of dissipative systems should never have occurred.
We present a derivation of the third postulate of Relational Quantum Mechanics (RQM) from the properties of conditional probabilities. The first two RQM postulates are based on the information that can be extracted from the interaction of different systems, and the third postulate defines the properties of the probability function. Here we demonstrate that, given a rigorous definition of the conditional probability for the possible outcomes of different measurements, the third postulate is unnecessary, and Born's rule emerges naturally from the first two postulates by applying Gleason's theorem. We demonstrate in addition that the probability function is uniquely defined for classical and quantum phenomena. The presence or absence of interference terms is shown to be related to the precise formulation of the conditional probability, where the distributive property on its arguments cannot be taken for granted. In the particular case of Young's slits experiment, the two possible argument formulations correspond to whether or not the particle's passage through a particular path can be determined.
It is unusual to find QCD factorization explained in the language of quantum information science. However, we will discuss how the issue of factorization and its breaking in high-energy QCD processes relates to phenomena like decoherence and entanglement. We will elaborate with several examples and explain them in terms familiar from basic quantum mechanics and quantum information science.
The experimental realization of successive non-demolition measurements on single microscopic systems raises the question of ergodicity in Quantum Mechanics (QM). We investigate whether time averages over one realization of a single system are related to QM averages over an ensemble of similarly prepared systems. We adopt a generalization of the von Neumann model of measurement, coupling the system to $N$ probes -- with a strength that is at our disposal -- and detecting the latter. The model parallels the procedure followed in experiments on quantum-electrodynamic cavities. The modification of the probability of the observable's eigenvalues due to the coupling to the probes can be computed analytically, and the results compare qualitatively well with those obtained numerically by the experimental groups. We find that the problem is not ergodic, except in the case of an eigenstate of the observable being studied.