81 - Sean M. Carroll 2021
I defend the extremist position that the fundamental ontology of the world consists of a vector in Hilbert space evolving according to the Schrödinger equation. The laws of physics are determined solely by the energy eigenspectrum of the Hamiltonian. The structure of our observed world, including space and fields living within it, should arise as a higher-level emergent description. I sketch how this might come about, although much work remains to be done.
We study the conservation of energy, or lack thereof, when measurements are performed in quantum mechanics. The expectation value of the Hamiltonian of a system can clearly change when wave functions collapse in accordance with the standard textbook (Copenhagen) treatment of quantum measurement, but one might imagine that the change in energy is compensated by the measuring apparatus or environment. We show that this is not true; the change in the energy of a state after measurement can be arbitrarily large, independent of the physical measurement process. In Everettian quantum theory, while the expectation value of the Hamiltonian is conserved for the wave function of the universe (including all the branches), it is not constant within individual worlds. It should therefore be possible to experimentally measure violations of conservation of energy, and we suggest an experimental protocol for doing so.
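The central claim, that collapse can change the energy expectation value, can be illustrated with a two-level toy model. This is a minimal NumPy sketch, not the paper's proposed experimental protocol; the Hamiltonian H = sigma_x is chosen purely so that the measurement (computational) basis differs from the energy basis:

```python
import numpy as np

# Toy Hamiltonian H = sigma_x, with eigenvalues -1 and +1.
H = np.array([[0, 1], [1, 0]], dtype=complex)

# Prepare the ground state of H: (|0> - |1>)/sqrt(2), with energy -1.
psi = np.array([1, -1], dtype=complex) / np.sqrt(2)
e_before = np.real(psi.conj() @ H @ psi)           # <H> = -1.0

# Measure in the computational (sigma_z) basis. Either outcome occurs
# with probability 1/2; suppose the state collapses to |0>.
psi_post = np.array([1, 0], dtype=complex)
e_after = np.real(psi_post.conj() @ H @ psi_post)  # <H> = 0.0

# The measurement changed <H> by a finite amount, independent of any
# details of the physical measurement process.
```

Because |0> and |1> are each equal superpositions of the two energy eigenstates, the expectation value of the energy shifts no matter which measurement outcome occurs.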
125 - Sean M. Carroll 2021
Effective Field Theory (EFT) is the successful paradigm underlying modern theoretical physics, including the Core Theory of the Standard Model of particle physics plus Einstein's general relativity. I will argue that EFT grants us a unique insight: each EFT model comes with a built-in specification of its domain of applicability. Hence, once a model is tested within some domain (of energies and interaction strengths), we can be confident that it will continue to be accurate within that domain. Currently, the Core Theory has been tested in regimes that include all of the energy scales relevant to the physics of everyday life (biology, chemistry, technology, etc.). Therefore, we have reason to be confident that the laws of physics underlying the phenomena of everyday life are completely known.
We study the question of how to decompose Hilbert space into a preferred tensor-product factorization without any pre-existing structure other than a Hamiltonian operator, in particular the case of a bipartite decomposition into system and environment. Such a decomposition can be defined by looking for subsystems that exhibit quasi-classical behavior. The correct decomposition is one in which pointer states of the system are relatively robust against environmental monitoring (their entanglement with the environment does not continually and dramatically increase) and remain localized around approximately-classical trajectories. We present an in-principle algorithm for finding such a decomposition by minimizing a combination of entanglement growth and internal spreading of the system. Both of these properties are related to locality in different ways. This formalism could be relevant to the emergence of spacetime from quantum entanglement.
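The entanglement that such a decomposition monitors can be quantified by the von Neumann entropy of the system's reduced density matrix. The sketch below computes that entropy from the Schmidt (singular-value) decomposition of a pure bipartite state; it is an illustrative helper, not the paper's in-principle algorithm:

```python
import numpy as np

def entanglement_entropy(psi, dim_sys, dim_env):
    """Von Neumann entropy of the system's reduced density matrix,
    for a pure state psi on a dim_sys x dim_env tensor product."""
    # Reshape the state vector into a dim_sys x dim_env matrix; its
    # singular values are the Schmidt coefficients of the bipartition.
    mat = psi.reshape(dim_sys, dim_env)
    s = np.linalg.svd(mat, compute_uv=False)
    p = s**2
    p = p[p > 1e-15]          # drop numerically-zero Schmidt weights
    return float(-np.sum(p * np.log(p)))

# A product state has zero entanglement entropy...
prod = np.kron([1, 0], [1, 0]).astype(complex)
# ...while a Bell state is maximally entangled, with entropy log 2.
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
```

A candidate system/environment split could then be scored by how slowly this entropy grows under Hamiltonian evolution from quasi-classical initial states.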
207 - Sean M. Carroll 2018
It seems natural to ask why the universe exists at all. Modern physics suggests that the universe can exist all by itself as a self-contained system, without anything external to create or sustain it. But there might not be an absolute answer to why it exists. I argue that any attempt to account for the existence of something rather than nothing must ultimately bottom out in a set of brute facts; the universe simply is, without ultimate cause or explanation.
To the best of our current understanding, quantum mechanics is part of the most fundamental picture of the universe. It is natural to ask how pure and minimal this fundamental quantum description can be. The simplest quantum ontology is that of the Everett or Many-Worlds interpretation, based on a vector in Hilbert space and a Hamiltonian. Typically one also relies on some classical structure, such as space and local configuration variables within it, which then gets promoted to an algebra of preferred observables. We argue that even such an algebra is unnecessary, and the most basic description of the world is given by the spectrum of the Hamiltonian (a list of energy eigenvalues) and the components of some particular vector in Hilbert space. Everything else - including space and fields propagating on it - is emergent from these minimal elements.
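Under this minimal ontology, dynamics is especially simple: written in the energy eigenbasis, each component of the state vector evolves only by a phase set by its eigenvalue. A toy sketch with made-up eigenvalues and components (illustrative numbers, not from the paper):

```python
import numpy as np

# Minimal data: a spectrum (list of energy eigenvalues) and the
# components of a state vector in the energy eigenbasis.
energies = np.array([0.0, 1.0, 2.5])
c0 = np.array([0.6, 0.8j, 0.0])   # normalized: |0.6|^2 + |0.8|^2 = 1

def evolve(c, E, t):
    """Schrodinger evolution in the energy eigenbasis: pure phases."""
    return c * np.exp(-1j * E * t)

c_t = evolve(c0, energies, t=3.7)
norm = np.vdot(c_t, c_t).real               # conserved: stays 1
energy = np.sum(np.abs(c_t)**2 * energies)  # conserved: stays 0.64
```

Everything beyond this phase evolution (which factor is "space", which degrees of freedom are "fields") would have to be read off from patterns in the spectrum and the state, which is the emergence program the abstract describes.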
114 - Sean M. Carroll 2018
Cosmological models that invoke a multiverse - a collection of unobservable regions of space where conditions are very different from the region around us - are controversial, on the grounds that unobservable phenomena shouldn't play a crucial role in legitimate scientific theories. I argue that the way we evaluate multiverse models is precisely the same as the way we evaluate any other models, on the basis of abduction, Bayesian inference, and empirical success. There is no scientifically respectable way to do cosmology without taking into account different possibilities for what the universe might be like outside our horizon. Multiverse theories are utterly conventionally scientific, even if evaluating them can be difficult in practice.
215 - Ning Bao , Sean M. Carroll , 2017
We argue in a model-independent way that the Hilbert space of quantum gravity is locally finite-dimensional. In other words, the density operator describing the state corresponding to a small region of space, when such a notion makes sense, is defined on a finite-dimensional factor of a larger Hilbert space. Because quantum gravity potentially describes superpositions of different geometries, it is crucial that we associate Hilbert-space factors with spatial regions only on individual decohered branches of the universal wave function. We discuss some implications of this claim, including the fact that quantum field theory cannot be a fundamental description of Nature.
129 - Sean M. Carroll 2017
Some modern cosmological models predict the appearance of Boltzmann Brains: observers who randomly fluctuate out of a thermal bath rather than naturally evolving from a low-entropy Big Bang. A theory in which most observers are of the Boltzmann Brain type is generally thought to be unacceptable, although opinions differ. I argue that such theories are indeed unacceptable: the real problem is with fluctuations into observers who are locally identical to ordinary observers, and their existence cannot be swept under the rug by a choice of probability distributions over observers. The issue is not that the existence of such observers is ruled out by data, but that the theories that predict them are cognitively unstable: they cannot simultaneously be true and justifiably believed.
We derive a generalization of the Second Law of Thermodynamics that uses Bayesian updates to explicitly incorporate the effects of a measurement of a system at some point in its evolution. By allowing an experimenter's knowledge to be updated by the measurement process, this formulation resolves a tension between the fact that the entropy of a statistical system can sometimes fluctuate downward and the information-theoretic idea that knowledge of a stochastically-evolving system degrades over time. The Bayesian Second Law can be written as $\Delta H(\rho_m, \rho) + \langle \mathcal{Q}\rangle_{F|m} \geq 0$, where $\Delta H(\rho_m, \rho)$ is the change in the cross entropy between the original phase-space probability distribution $\rho$ and the measurement-updated distribution $\rho_m$, and $\langle \mathcal{Q}\rangle_{F|m}$ is the expectation value of a generalized heat flow out of the system. We also derive refin…
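The paper's inequality involves cross entropies between continuous phase-space distributions; a discrete toy version of the cross entropy and of the Bayesian update that produces $\rho_m$ can be sketched as follows (illustrative numbers, not from the paper):

```python
import numpy as np

def cross_entropy(p, q):
    """H(p, q) = -sum_x p(x) log q(x): the discrete analogue of the
    cross entropy appearing in the Bayesian Second Law."""
    mask = p > 0
    return float(-np.sum(p[mask] * np.log(q[mask])))

# Prior distribution over microstates, and the likelihood of some
# measurement outcome m in each microstate (toy numbers).
rho = np.array([0.5, 0.3, 0.2])
likelihood = np.array([0.9, 0.2, 0.1])   # P(m | state)

# Bayes update on observing outcome m.
rho_m = rho * likelihood
rho_m /= rho_m.sum()

# By Gibbs' inequality, the cross entropy is bounded below by the
# Shannon entropy of the updated distribution.
h_cross = cross_entropy(rho_m, rho)
h_rho_m = -np.sum(rho_m * np.log(rho_m))
```

In the full statement, it is the *change* in this cross entropy along the system's evolution, plus the conditioned expected heat flow, that is bounded below by zero.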