Measurement of the dispersion of radiation from a steady cosmological source


Abstract

The "missing baryons" of the nearby universe are believed to reside principally in a partially ionized state. Although passing electromagnetic waves are dispersed by this plasma, the effect has hitherto not been exploited as a means of detection, because it is generally believed that a successful observation requires the background source to be highly variable, i.e., the class of sources that could potentially deliver a verdict is limited. We argue in two stages that this condition is not necessary. First, by modeling the fluctuations on macroscopic scales as interference between wave packets, we show that, in accordance with the ideas advanced by Einstein in 1917, both the behavior of photons as bosons (i.e., the intensity variance has contributions from Poisson noise and phase noise) and the van Cittert-Zernike theorem are consequences of wave-particle duality. Second, we point out that in general the variance on a macroscopic timescale $\tau$ consists of (a) a main contributing term $\propto 1/\tau$, plus (b) a small negative term $\propto 1/\tau^2$ due to the finite size of the wave packets. If the radiation passes through a dispersive medium, this size is enlarged well beyond its vacuum minimum of $\Delta t \approx 1/\Delta\nu$, making term (b) more negative (while (a) remains unchanged) and hence suppressing the variance with respect to the vacuum scenario. The phenomenon, typically at the level of a few parts in $10^5$, in principle enables one to measure cosmological dispersion. Signal-to-noise estimates are presented, along with systematic issues and how to overcome them.
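To make the mechanism concrete, a schematic form of the variance budget may be written as follows (our own notation; the coefficients $a$ and $b$ are hypothetical placeholders, and the linear dependence of term (b) on the packet duration is an assumption for illustration, not a result quoted from the abstract):

$$ \frac{\sigma_I^2(\tau)}{\langle I \rangle^2} \;\simeq\; \frac{a}{\tau} \;-\; \frac{b\,\Delta t}{\tau^2}, \qquad \tau \gg \Delta t, $$

where $\Delta t$ is the wave-packet duration. Dispersion enlarges $\Delta t$ while leaving the $1/\tau$ term unchanged, so the fractional suppression of the variance relative to vacuum scales as $\sim b\,\Delta t/(a\,\tau)$; equating this to a few parts in $10^5$ fixes the required ratio $\tau/\Delta t$ for given coefficients.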
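For a sense of how strongly cosmological dispersion stretches a wave packet beyond its transform limit, the sketch below applies the standard cold-plasma dispersion delay, $t \approx 4.149\,\mathrm{ms} \times \mathrm{DM} \times \nu_{\mathrm{GHz}}^{-2}$. The dispersion measure, centre frequency, and bandwidth are illustrative choices (roughly a $z \sim 1$ sightline observed at 1 GHz), not numbers taken from the paper.

# Hypothetical numerical illustration: dispersion-broadened packet duration
# versus the vacuum (transform-limited) minimum Delta t ~ 1/Delta nu.

K_DM_MS = 4.149  # standard cold-plasma dispersion constant, ms GHz^2 cm^3 pc^-1

def vacuum_packet_s(bandwidth_hz: float) -> float:
    """Transform-limited packet duration, Delta t ~ 1/Delta nu."""
    return 1.0 / bandwidth_hz

def dispersion_broadening_s(dm_pc_cm3: float, nu_ghz: float, bandwidth_hz: float) -> float:
    """Differential dispersion delay across the band, |dt/dnu| * Delta nu."""
    dnu_ghz = bandwidth_hz / 1e9
    return 2.0 * K_DM_MS * 1e-3 * dm_pc_cm3 * dnu_ghz / nu_ghz**3

if __name__ == "__main__":
    dm, nu, bw = 1000.0, 1.0, 1e6   # illustrative: DM in pc cm^-3, 1 GHz, 1 MHz channel
    t_vac = vacuum_packet_s(bw)
    t_disp = dispersion_broadening_s(dm, nu, bw)
    print(f"vacuum packet ~ {t_vac:.2e} s, dispersed packet ~ {t_disp:.2e} s")
    # For these numbers the packet is stretched by roughly four orders of
    # magnitude, consistent with "enlarged well beyond its vacuum minimum".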
