
Measurement and Quantum Dynamics in the Minimal Modal Interpretation of Quantum Theory

Added by David Kagan
Publication date: 2018
Fields: Physics
Language: English





Any realist interpretation of quantum theory must grapple with the measurement problem and the status of state-vector collapse. In a no-collapse approach, measurement is typically modeled as a dynamical process involving decoherence. We describe how the minimal modal interpretation closes a gap in this dynamical description, leading to a complete and consistent resolution of the measurement problem and an effective form of state collapse. Our interpretation also provides insight into the indivisible nature of measurement: the fact that you cannot stop a measurement part-way through and uncover the underlying ontic dynamics of the system in question. Having discussed the hidden dynamics of a system's ontic state during measurement, we turn to more general forms of open-system dynamics and explore the extent to which the details of the underlying ontic behavior of a system can be described. We construct a space of ontic trajectories and describe obstructions to defining a probability measure on this space.
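As a rough illustration of the kind of dynamical account referred to above, the sketch below is a generic decoherence toy model (not the minimal modal interpretation's specific machinery): entangling a qubit with a growing environment suppresses the off-diagonal element of its reduced density matrix, which is what "effective collapse" looks like at the density-matrix level. The amplitudes, coupling phases, and environment size are arbitrary choices made only for the example.

```python
# Generic toy: a qubit entangles with N environment qubits, each initially in |+>,
# via a controlled phase exp(-i*theta_k*sigma_z) applied when the system is in |1>.
# The environment overlap <E_0|E_1> = prod_k cos(theta_k) multiplies the system's
# off-diagonal density-matrix element, so coherence decays while populations stay put.
import numpy as np

rng = np.random.default_rng(0)
a, b = 1 / np.sqrt(2), 1 / np.sqrt(2)        # system amplitudes for |0> and |1>
thetas = rng.uniform(0.1, 0.5, size=40)      # arbitrary coupling phases, one per env qubit

coherence = abs(a * np.conjugate(b))         # |rho_01| before any interaction
for n, theta in enumerate(thetas, start=1):
    coherence *= abs(np.cos(theta))
    if n % 10 == 0:
        print(f"{n:2d} environment qubits: |rho_01| = {coherence:.3e}")

print(f"populations stay |a|^2 = {abs(a)**2:.2f}, |b|^2 = {abs(b)**2:.2f} throughout")
```

The populations never change; only the coherence is suppressed, so the reduced state comes to look like a classical mixture even though the global evolution remains unitary.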



Related research

H. D. Zeh (2011)
A historically important but little-known debate regarding the necessity and meaning of macroscopic superpositions, in particular those containing different gravitational fields, is discussed from a modern perspective.
Quantum chromodynamics (QCD) describes the structure of hadrons such as the proton at a fundamental level. The precision of calculations in QCD limits the precision of the values of many physical parameters extracted from collider data. For example, uncertainty in the parton distribution function (PDF) is the dominant source of error in the $W$ mass measurement at the LHC. Improving the precision of such measurements is essential in the search for new physics. Quantum simulation offers an efficient way of studying quantum field theories (QFTs) such as QCD non-perturbatively. Previous quantum algorithms for simulating QFTs have qubit requirements that are well beyond the most ambitious experimental proposals for large-scale quantum computers. Can the qubit requirements for such algorithms be brought into range of quantum computation with several thousand logical qubits? We show how this can be achieved by using the light-front formulation of quantum field theory. This work was inspired by the similarity of the light-front formulation to quantum chemistry, first noted by Kenneth Wilson.
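To make the light-front intuition concrete, here is a small counting exercise (a toy, not taken from the paper): in discretized light-cone quantization of a single scalar field in 1+1 dimensions, longitudinal momenta are positive integers that must add up to the harmonic resolution K, so the Fock basis at fixed K is finite and the number of qubits needed just to label a basis state grows only logarithmically with its size. The single bosonic species and the particular K values are assumptions made purely for illustration.

```python
# Toy counting for discretized light-cone quantization (DLCQ) of one scalar field
# in 1+1 dimensions: a Fock state is a partition of the harmonic resolution K into
# positive integer longitudinal momenta, so the basis at fixed K is finite.
from functools import lru_cache
from math import ceil, log2

@lru_cache(maxsize=None)
def partitions(n, max_part):
    """Number of ways to write n as a sum of positive integers, each <= max_part."""
    if n == 0:
        return 1
    return sum(partitions(n - k, min(k, n - k)) for k in range(1, min(max_part, n) + 1))

for K in (8, 16, 24, 32):
    dim = partitions(K, K)
    print(f"K = {K:2d}: {dim:5d} Fock states, ~{ceil(log2(dim))} qubits to label one")
```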
We describe a quantum mechanical measurement as a variational principle including interaction between the system under measurement and the measurement apparatus. Augmenting the action with a nonlocal term (a double integration over the duration of the interaction) results in a theory capable of describing both the measurement process (agreement between system state and pointer state) and the collapse of both systems into a single eigenstate (or superposition of degenerate eigenstates) of the relevant operator. In the absence of the interaction, a superposition of states is stable, and the theory agrees with the predictions of standard quantum theory. Because the theory is nonlocal, the resulting wave equation is an integro-differential equation (IDE). We demonstrate these ideas using a simple Lagrangian for both systems, as a proof of principle. The variational principle is time-symmetric and retrocausal, so the solution for the measurement process is determined by boundary conditions at both initial and final times; the initial condition is determined by the experimental preparation and the final condition is the natural boundary condition of variational calculus. We hypothesize that one or more hidden variables (not ruled out by Bell's Theorem, due both to the retrocausality and the nonlocality of the theory) influence the outcome of the measurement, and that distributions of the hidden variables that arise plausibly in a typical ensemble of experimental realizations give rise to outcome frequencies consistent with Born's rule. We outline steps in a theoretical validation of the hypothesis. We discuss the role of both initial and final conditions in determining a solution at intermediate times, the mechanism by which a system responds to measurement, the time symmetry of the new theory, causality concerns, and issues surrounding solution of the IDE.
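The structural point, that a double time integral in the action produces an integro-differential stationarity condition fixed by an initial condition plus the natural condition at the final time, can already be seen in a purely classical toy. The sketch below is an assumption-laden illustration, not the paper's quantum model: it adds a nonlocal quadratic term with an exponential kernel to a discretized harmonic-oscillator action, and the mass, spring constant, coupling lam, kernel range tau, and kernel shape are all invented for the example.

```python
# Classical toy: harmonic-oscillator action plus a nonlocal term
#   (lam/2) * integral dt dt' K(t,t') q(t) q(t'),   K(t,t') = exp(-|t-t'|/tau),
# discretized on a grid.  Stationarity with q(0) fixed and q(T) left free (the
# natural boundary condition) reduces to a dense linear system.
import numpy as np

m, ks, lam, tau = 1.0, 1.0, 0.3, 0.5   # mass, spring constant, nonlocal strength, kernel range
T, N = 10.0, 400
dt = T / N
t = np.linspace(0.0, T, N + 1)

# Local part of the discretized action, written as (1/2) q^T M q
D = (np.eye(N + 1, k=1) - np.eye(N + 1))[:-1]            # forward-difference matrix
M = (m / dt) * D.T @ D - ks * dt * np.diag(np.r_[np.ones(N), 0.0])

# Nonlocal part couples every time to every other time (the "double integration")
Knl = np.exp(-np.abs(t[:, None] - t[None, :]) / tau)
M += lam * dt**2 * Knl

# Stationarity over q_1..q_N with q_0 fixed; q_N is left unconstrained.
q0 = 1.0
q = np.r_[q0, np.linalg.solve(M[1:, 1:], -M[1:, 0] * q0)]
print(f"q(T) = {q[-1]:+.4f}  (set lam = 0 to recover the purely local solution)")
```

Because every grid point couples to every other, the stationarity condition is a dense system rather than a local recursion, which is the discrete analogue of the IDE structure described above.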
A novel theory of hybrid quantum-classical systems is developed, utilizing the mathematical framework of constrained dynamical systems on the quantum-classical phase space. Both the quantum and the classical descriptions of the respective parts of the hybrid system are treated as fundamental. Therefore, the description of the quantum-classical interaction has to be postulated, and it includes the effects of neglected degrees of freedom. The dynamical law of the theory is given in terms of nonlinear stochastic differential equations with Hamiltonian and gradient terms. The theory provides a successful dynamical description of the collapse during quantum measurement.
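For readers unfamiliar with this class of dynamics, the sketch below shows a generic nonlinear stochastic Schrödinger equation of the "Hamiltonian plus nonlinear noise terms" type (the standard continuous-measurement form, not the hybrid quantum-classical equations of the paper): repeated runs drive a qubit superposition onto sigma_z eigenstates with frequencies close to the Born weights. The coupling kappa, step size, and run counts are arbitrary.

```python
# Generic nonlinear stochastic Schrodinger equation for continuous measurement of
# sigma_z, integrated with Euler-Maruyama.  Each run collapses the qubit toward a
# sigma_z eigenstate; the fraction of runs ending "up" should be close to the
# initial Born weight of |0>.
import numpy as np

rng = np.random.default_rng(1)
sz = np.diag([1.0, -1.0]).astype(complex)
H = np.zeros((2, 2), dtype=complex)                  # trivial Hamiltonian for the toy
kappa, dt, steps, runs = 2.0, 1e-3, 3000, 400

psi0 = np.array([np.sqrt(0.3), np.sqrt(0.7)], dtype=complex)   # Born weights 0.3 / 0.7

ups = 0
for _ in range(runs):
    psi = psi0.copy()
    for _ in range(steps):
        expA = np.real(np.vdot(psi, sz @ psi))
        A = sz - expA * np.eye(2)                    # fluctuation operator (A - <A>)
        dW = rng.normal(0.0, np.sqrt(dt))
        psi = psi + (-1j * (H @ psi) * dt
                     + np.sqrt(kappa) * (A @ psi) * dW
                     - 0.5 * kappa * (A @ (A @ psi)) * dt)
        psi /= np.linalg.norm(psi)                   # remove residual numerical drift
    ups += np.real(np.vdot(psi, sz @ psi)) > 0.0

print(f"collapsed 'up' in {ups / runs:.3f} of {runs} runs (Born weight: 0.30)")
```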
We investigate the properties of a recently proposed Gradient Echo Memory (GEM) scheme for information mapping between optical and atomic systems. We show that GEM can be described by the dynamic formation of polaritons in k-space. This picture highlights the flexibility and robustness with regard to the external control of the storage process. Our results also show that, as GEM is a frequency-encoding memory, it can accurately preserve the shape of signals that have large time-bandwidth products, even at moderate optical depths. At higher optical depths, we show that GEM is a high-fidelity multi-mode quantum memory.
