
A Consistent Quantum Ontology

 Added by Robert B. Griffiths
Publication date: 2011
Field: Physics
Language: English





The (consistent or decoherent) histories interpretation provides a consistent realistic ontology for quantum mechanics, based on two main ideas. First, a logic (system of reasoning) is employed which is compatible with the Hilbert-space structure of quantum mechanics as understood by von Neumann: quantum properties and their negations correspond to subspaces and their orthogonal complements. It employs a special (single framework) syntactical rule to construct meaningful quantum expressions, quite different from the quantum logic of Birkhoff and von Neumann. Second, quantum time development is treated as an inherently stochastic process under all circumstances, not just when measurements take place. The time-dependent Schrödinger equation provides probabilities, not a deterministic time development of the world. The resulting interpretive framework has no measurement problem and can be used to analyze in quantum terms what is going on before, after, and during physical preparation and measurement processes. In particular, appropriate measurements can reveal quantum properties possessed by the measured system before the measurement took place. There are no mysterious superluminal influences: quantum systems satisfy an appropriate form of Einstein locality. This ontology provides a satisfactory foundation for quantum information theory, since it supplies definite answers as to what the information is about. The formalism of classical (Shannon) information theory applies without change in suitable quantum contexts, and this suggests the way in which quantum information theory extends beyond its classical counterpart.
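
For concreteness, the stochastic time development referred to above is usually written in terms of chain (class) operators and a consistency condition. The following is a standard textbook formulation; the notation is assumed here rather than taken from the abstract. For a family of histories Y^\alpha specified by projectors P_k^{\alpha_k} at times t_1 < \dots < t_n and an initial state \rho_0,

\[
C_\alpha = P_n^{\alpha_n}(t_n)\cdots P_1^{\alpha_1}(t_1),
\qquad
D(\alpha,\beta) = \operatorname{Tr}\!\big[\,C_\alpha\,\rho_0\,C_\beta^{\dagger}\,\big],
\]

with P(t) = U^{\dagger}(t,t_0)\,P\,U(t,t_0) the Heisenberg-picture projectors. The family forms a valid (single) framework when \operatorname{Re} D(\alpha,\beta) = 0 for \alpha \neq \beta, and the Schrödinger equation then enters only through the probabilities \Pr(Y^\alpha) = D(\alpha,\alpha).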

Related research

In response to recent criticisms by Okon and Sudarsky, various aspects of the consistent histories (CH) resolution of the quantum measurement problem(s) are discussed using a simple Stern-Gerlach device, and compared with the alternative approaches to the measurement problem provided by spontaneous localization (GRW), Bohmian mechanics, many worlds, and standard (textbook) quantum mechanics. Among these CH is unique in solving the second measurement problem: inferring from the measurement outcome a property of the measured system at a time before the measurement took place, as is done routinely by experimental physicists. The main respect in which CH differs from other quantum interpretations is in allowing multiple stochastic descriptions of a given measurement situation, from which one (or more) can be selected on the basis of its utility. This requires abandoning a principle (termed unicity), central to classical physics, that at any instant of time there is only a single correct description of the world.
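
A minimal sketch of this retrodiction for an idealized Stern-Gerlach measurement, with spin amplitudes c_\pm and pointer states |M^\pm\rangle (all notation assumed here, not quoted from the paper): the unitary measurement interaction takes

\[
\big(c_+|z^+\rangle + c_-|z^-\rangle\big)\otimes|M_0\rangle
\;\longmapsto\;
c_+|z^+\rangle|M^+\rangle + c_-|z^-\rangle|M^-\rangle .
\]

In the consistent family containing the spin projectors [z^\pm] at an intermediate time t_1 (after preparation, before the interaction) and the pointer projectors [M^\pm] at t_2, the Born weights give

\[
\Pr\big([z^+]_{t_1}\,\big|\,[M^+]_{t_2}\big)=1,
\qquad
\Pr\big([z^-]_{t_1}\,\big|\,[M^-]_{t_2}\big)=1,
\]

so within this framework the recorded outcome reveals the value of S_z the particle possessed before the apparatus acted.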
P. C. Hohenberg, 2011
The foundations of quantum mechanics have been plagued by controversy throughout the 85-year history of the field. It is argued that lack of clarity in the formulation of basic philosophical questions leads to unnecessary obscurity and controversy, and an attempt is made to identify the main forks in the road that separate the most important interpretations of quantum theory. The consistent histories formulation, also known as consistent quantum theory, is described as one particular way (favored by the author) to answer the essential questions of interpretation. The theory is shown to be a realistic formulation of quantum mechanics, in contrast to the orthodox or Copenhagen formulation, which will be referred to as an operationalist theory.
A. S. Sanz, 2021
The ontological aspect of Bohmian mechanics, as a hidden-variable theory that provides us with an objective description of a quantum world without observers, is widely known. Yet its practical side is gaining more and more acceptance and relevance, for it has proven to be an efficient and useful resource for tackling, exploring, describing and explaining quantum phenomena. This practical aspect emerges precisely when the pragmatic application of the formalism prevails over interpretational questions, which are still a matter of debate and controversy. In this regard, the purpose here is to show and discuss how Bohmian mechanics brings out in a natural manner a series of dynamical features that are difficult to access through other quantum approaches. This arises from the fact that Bohmian mechanics allows us to establish a direct link between the dynamics exhibited by quantum systems and the local variations of the quantum phase associated with their state. To illustrate these facts, simple models of two physically insightful quantum phenomena have been chosen, namely, the dispersion of a free Gaussian wave packet and Young-type two-slit interference. As is shown, the outcomes of their analysis provide a novel, alternative understanding of the dynamics displayed by these quantum phenomena in terms of the underlying local velocity field that connects the probability density with the quantum flux. This field, which is nothing but the so-called guidance condition of standard Bohmian mechanics, thus acquires a prominent role in understanding quantum dynamics, as the mechanism responsible for that dynamics. This goes beyond the passive role typically assigned to it in Bohmian mechanics, where trajectories and quantum potentials have traditionally received more attention.
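
In the polar (Madelung) form \psi = \rho^{1/2} e^{iS/\hbar} (conventions assumed here), the local velocity field mentioned above is just the ratio of the quantum flux to the probability density,

\[
\mathbf{v}(\mathbf{r},t)
= \frac{\mathbf{J}(\mathbf{r},t)}{\rho(\mathbf{r},t)}
= \frac{\hbar}{m}\,\operatorname{Im}\frac{\nabla\psi}{\psi}
= \frac{\nabla S}{m},
\qquad
\frac{\partial\rho}{\partial t} + \nabla\!\cdot\!\big(\rho\,\mathbf{v}\big) = 0 .
\]

For the free Gaussian packet of the first example (initial position spread \sigma_0, zero mean momentum), this field reduces to v(x,t) = x\,\dot{\sigma}_t/\sigma_t with \sigma_t = \sigma_0\sqrt{1 + (\hbar t/2m\sigma_0^2)^2}, so each trajectory simply dilates with the packet, x(t) = x(0)\,\sigma_t/\sigma_0.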
Quantum process tomography is a necessary tool for verifying quantum gates and diagnosing faults in architectures and gate design. We show that the standard approach to process tomography is grossly inaccurate when the states and measurement operators used to interrogate the system are generated by gates that have some systematic error, a situation all but unavoidable in any practical setting. These errors in tomography cannot be fully corrected through oversampling or by performing a larger set of experiments. We present an alternative method of tomography that reconstructs an entire library of gates in a self-consistent manner. The essential ingredient is to define a likelihood function that assumes nothing about the gates used for preparation and measurement. In order to make the resulting optimization tractable, we linearize about the target, a reasonable approximation when benchmarking a quantum computer as opposed to probing a black-box function.
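
A rough sketch of the structure described above, in superoperator (Liouville) notation with names assumed here rather than taken from the abstract: each experiment applies a sequence s of gates from the library between an imperfect preparation and an imperfect measurement, predicting

\[
p_s = \langle\langle E\,|\;G_{s_L}\cdots G_{s_1}\;|\,\rho_0\rangle\rangle ,
\]

and the preparation \rho_0, the measurement effect E, and all gates G_k are estimated jointly from a likelihood built only from the observed counts, for instance

\[
\mathcal{L} = \prod_s p_s^{\,n_s}\,\big(1-p_s\big)^{\,N_s - n_s},
\]

made tractable by writing G_k = G_k^{\mathrm{target}} + \Delta_k and keeping only terms of first order in the perturbations \Delta_k.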
The quantum adiabatic theorem states that if a quantum system starts in an eigenstate of the Hamiltonian, and this Hamiltonian varies sufficiently slowly, the system stays in this eigenstate. We investigate experimentally the conditions that must be fulfilled for this theorem to hold. We show that the traditional adiabatic condition and some recently suggested conditions are either not sufficient or not necessary. Experimental evidence is presented from a simple experiment using nuclear spins.
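
For reference, the traditional adiabatic condition at issue is usually stated as (standard form, not quoted from the paper)

\[
\max_t\;\frac{\hbar\,\big|\langle m(t)|\,\partial_t H(t)\,|n(t)\rangle\big|}{\big[E_m(t)-E_n(t)\big]^2} \;\ll\; 1
\qquad\text{for all } m\neq n,
\]

equivalently |\langle m(t)|\partial_t n(t)\rangle| \ll |E_m(t)-E_n(t)|/\hbar; the experiment probes whether satisfying or violating this inequality actually tracks whether the system remains in the instantaneous eigenstate.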