When we want to predict the future, we compute it from what we know about the present. Specifically, we take a mathematical representation of observed reality, plug it into some dynamical equations, and then map the time-evolved result back to real-world predictions. But while this computational process can tell us what we want to know, we have taken this procedure too literally, implicitly assuming that the universe must compute itself in the same manner. Physical theories that do not follow this computational framework are deemed illogical right from the start. But this anthropocentric assumption has steered our physical models into an impossible corner, primarily because of quantum phenomena. Meanwhile, we have not been exploring other models in which the universe is not so limited. In fact, some of these alternate models already have a well-established importance, but are thought to be mathematical tricks without physical significance. This essay argues that only by dropping our assumption that the universe is a computer can we fully develop such models, explain quantum phenomena, and understand the workings of our universe. (This essay was awarded third prize in the 2012 FQXi essay contest; a new afterword compares and contrasts this essay with Robert Spekkens' first-prize entry.)
The standard model of cosmology is based on the existence of homogeneous spatial surfaces as the background arena for structure formation. Homogeneity underpins both general relativistic and modified gravity models and is central to the way in which we interpret observations of the CMB and the galaxy distribution. However, homogeneity cannot be directly observed in the galaxy distribution or CMB, even with perfect observations, since we observe on the past lightcone and not on spatial surfaces. We can directly observe and test for isotropy, but to link this to homogeneity we need to assume the Copernican Principle. First, we discuss the link between isotropic observations on the past lightcone and isotropic spacetime geometry: which observations need to be isotropic in order for us to deduce spacetime isotropy? Second, we discuss what we can say with the Copernican assumption. The most powerful result is based on the CMB: the vanishing of the dipole, quadrupole and octupole of the CMB is sufficient to impose homogeneity. Real observations lead to near-isotropy on large scales; does this lead to near-homogeneity? There are important partial results, and we discuss why this remains a difficult open question. Thus we are currently unable to prove homogeneity of the Universe on large scales, even with the Copernican Principle. However, we can use observations of the CMB, galaxies and clusters to test homogeneity itself.
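A schematic rendering of the multipole condition, in standard spherical-harmonic notation (our paraphrase of the type of result discussed, not the paper's exact theorem): if every fundamental observer, at every point, sees vanishing CMB anisotropy multipoles
$$ a_{\ell m} = 0 \qquad (\ell = 1, 2, 3), $$
then, given the Copernican Principle and standard matter content, the metric is forced to the homogeneous and isotropic FLRW form
$$ ds^2 = -dt^2 + a^2(t)\left[\frac{dr^2}{1-kr^2} + r^2\, d\Omega^2\right]. $$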
Evidence for dark matter (DM) comes from long-range gravitational observations, and DM is considered to be something that neither interacts with ordinary matter nor emits light. However, on much smaller scales, a number of unexpected observations of solar activity and of the dynamic Earth atmosphere might arise from DM, contradicting this picture: gravitational (self-)focusing of streaming DM by the Sun or its planets fits as an interpretation of the otherwise puzzling 11-year solar cycle, the mysterious heating of the solar corona, atmospheric transients, etc. Observationally driven, an external impact by overlooked streaming invisible matter reconciles the investigated mysterious behavior, which shows otherwise unexpected planetary relationships; this is a signature of gravitational focusing of streaming DM by solar-system bodies. Focusing of DM streams could then also occur in exoplanetary systems, suggesting, for the first time, searches for the associated stellar activity as a function of exoplanetary orbital phase.
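For a sense of scale (a textbook estimate on our part, not a calculation quoted from the paper): a body of radius $R$ and surface escape velocity $v_{\rm esc}$ captures a stream arriving with asymptotic speed $v_\infty$ with the gravitationally enhanced cross-section
$$ \sigma = \pi R^2 \left( 1 + \frac{v_{\rm esc}^2}{v_\infty^2} \right), $$
so the Sun ($v_{\rm esc} \approx 618$ km/s) enhances a galactic-speed stream ($v_\infty \approx 300$ km/s) by a factor of about 5, and slower streams by far more.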
A recent article by Mathur attempts a precise formulation of the paradox of black hole information loss [S. D. Mathur, arXiv:1108.0302v2 (hep-th)]. We point out that a key component of that work, which refers to entangled pairs inside and outside of the horizon and their associated entropy gain or information loss during black hole evaporation, is a presumed but false outcome, not backed by the foundations of physics. The very foundation of Mathur's work is thus incorrect. We further show that, within the framework of Hawking radiation as tunneling, the so-called small corrections are sufficient to resolve the information loss problem.
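For context, the tunneling framework referred to is that of Parikh and Wilczek, in which energy conservation makes the emission probability of a quantum of energy $E$ from a black hole of mass $M$ non-thermal (units $G = c = \hbar = k_B = 1$):
$$ \Gamma \sim e^{\Delta S_{\rm BH}} = \exp\left[-8\pi E\left(M - \frac{E}{2}\right)\right]. $$
The $E^2$ term is the deviation from the thermal Boltzmann factor $e^{-8\pi M E}$; corrections of this kind correlate successive emitted quanta and are the sort of "small corrections" invoked above (this gloss is our sketch of the standard tunneling result, not the paper's detailed argument).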
The measurement of the present-day temperature of the Cosmic Microwave Background (CMB), $T_0 = 2.72548 \pm 0.00057$ K (1$\sigma$), made by the Far-InfraRed Absolute Spectrophotometer (FIRAS), is one of the most precise measurements ever made in cosmology. On the other hand, estimates of the Hubble constant, $H_0$, obtained from measurements of the CMB temperature fluctuations assuming the standard $\Lambda$CDM model exhibit a large ($4.1\sigma$) tension when compared with low-redshift, model-independent observations. Recently, some authors argued that a slight change in $T_0$ could alleviate or solve the $H_0$-tension problem. Here, we investigate evidence for a hotter or colder universe by performing an independent analysis using currently available temperature-redshift $T(z)$ measurements. Our analysis (parametric and non-parametric) shows a good agreement with the FIRAS measurement and a discrepancy of $\gtrsim 1.9\sigma$ from the $T_0$ values required to solve the $H_0$ tension. This result reinforces the idea that a solution of the $H_0$-tension problem in fact requires either a better understanding of the systematic errors on the $H_0$ measurements or new physics.
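A common parametrization in such $T(z)$ analyses (the standard choice in this literature; we state it as an assumption, since the abstract does not spell out the estimator) is
$$ T(z) = T_0\,(1+z)^{1-\beta}, $$
where $\beta = 0$ recovers the adiabatic scaling $T \propto (1+z)$ of the standard model; fitting $(T_0, \beta)$ to the $T(z)$ data then yields the independent $T_0$ constraint that is compared with FIRAS.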
We specify the semiclassical no-boundary wave function of the universe without relying on a functional integral of any kind. The wave function is given as a sum over specific saddle points of the dynamical theory that satisfy conditions of regularity on geometry and fields, and which together yield a time-neutral state that is normalizable in an appropriate inner product. This specifies a predictive framework of semiclassical quantum cosmology that is adequate for making probabilistic predictions, which are in agreement with observations in simple models. The use of holography to go beyond the semiclassical approximation is briefly discussed.
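Schematically, in the usual semiclassical notation (our sketch of the kind of expression meant, not the paper's exact formula), the wave function evaluated on a three-geometry $h$ with boundary field value $\chi$ is a sum over the regular saddle points, each with complex action $I = I_R - iS$:
$$ \Psi[h, \chi] \approx \sum_{\text{saddles}} \exp\left[\left(-I_R[h,\chi] + i\, S[h,\chi]\right)/\hbar\right], $$
with time neutrality meaning that the saddles occur in complex-conjugate pairs, so that $\Psi$ is real.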