21 cm power spectrum observations have the potential to revolutionize our understanding of the Epoch of Reionization and Dark Energy, but require extraordinarily precise data analysis methods to separate the cosmological signal from the astrophysical and instrumental contaminants. This analysis challenge has led to a diversity of proposed analyses, including delay spectra, imaging power spectra, m-mode analysis, and numerous others. This diversity of approach is a strength, but it has also led to confusion within the community about whether insights gleaned by one group are applicable to teams working in different analysis frameworks. In this paper we show that all existing analysis proposals can be classified into two distinct families based on whether they estimate the power spectrum of the measured or the reconstructed sky. This subtle difference in the statistical question posed largely determines the susceptibility of an analysis to foreground emission and calibration errors, and ultimately the science different analyses can pursue. We detail the origin of the two analysis families, categorize the analyses being actively developed, and explore their relative sensitivities to foreground contamination and calibration errors.
Measurement of the spatial distribution of neutral hydrogen via the redshifted 21 cm line promises to revolutionize our knowledge of the epoch of reionization (EoR) and the first galaxies, and may provide a powerful new tool for observational cosmology at redshifts $1 < z < 4$. In this review we discuss recent advances in our theoretical understanding of the EoR, the application of 21 cm tomography to cosmology and measurements of the dark energy equation of state after reionization, and the instrumentation and observational techniques shared by 21 cm EoR and post-reionization cosmology machines. We place particular emphasis on the expected signal and the observational capabilities of first-generation 21 cm fluctuation instruments.
We present an investigation of the horizon and its effect on global 21-cm observations and analysis. We find that the horizon cannot be ignored when modeling low frequency observations. Even if the sky and antenna beam are known exactly, forward models cannot fully describe the beam-weighted foreground component without accurate knowledge of the horizon. When fitting data to extract the 21-cm signal, a single time-averaged spectrum or independent multi-spectrum fits may be able to compensate for the bias imposed by the horizon. However, these types of fits lack constraining power on the 21-cm signal, leading to large uncertainties on the signal extraction, in some cases larger in magnitude than the 21-cm signal itself. A significant decrease in signal uncertainty can be achieved by performing multi-spectrum fits in which the spectra are modeled simultaneously with common parameters. The cost of this greatly increased constraining power, however, is that the time dependence of the horizon's effect, which is more complex than its spectral dependence, must be precisely modeled to achieve a good fit. To aid in modeling the horizon, we present an algorithm and Python package for calculating the horizon profile from a given observation site using elevation data. We also address several practical concerns such as pixelization error, uncertainty in the horizon profile, and foreground obstructions such as surrounding buildings and vegetation. We demonstrate that our training-set-based analysis pipeline can account for all of these factors to model the horizon well enough to precisely extract the 21-cm signal from simulated observations.
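The core of any horizon-profile calculation of this kind is, for each azimuth, finding the maximum apparent elevation angle of the surrounding terrain as seen from the antenna. A minimal sketch of that idea is below; it is not the package described above, and the terrain interface `elev_func` (returning terrain height for a bearing and distance) is a hypothetical stand-in for real elevation data:

```python
import numpy as np

def horizon_profile(elev_func, site_height, azimuths_deg,
                    max_dist_m=50e3, step_m=100.0):
    """Approximate the horizon elevation angle (degrees) at each azimuth.

    Walks outward from the site in steps of `step_m` and takes the maximum
    apparent elevation of the terrain along each bearing. `elev_func(az, d)`
    is an assumed interface returning terrain height (m) at azimuth `az`
    (degrees) and ground distance `d` (m) from the site.
    """
    dists = np.arange(step_m, max_dist_m + step_m, step_m)
    profile = []
    for az in azimuths_deg:
        heights = np.array([elev_func(az, d) for d in dists])
        # Apparent elevation of each terrain sample relative to the site.
        angles = np.degrees(np.arctan2(heights - site_height, dists))
        # A flat horizon (0 degrees) is the floor for an unobstructed bearing.
        profile.append(max(angles.max(), 0.0))
    return np.array(profile)
```

Obstructions such as buildings or vegetation can be folded in by adding their heights to `elev_func`; pixelization error in this simple scheme is controlled by `step_m` and the azimuth sampling.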
Maximally Smooth Functions (MSFs) are a form of constrained functions in which there are no inflection points or zero crossings in high-order derivatives. Consequently, they have applications to signal recovery in experiments where signals of interest are expected to be non-smooth features masked by larger smooth signals or foregrounds. They can also act as a powerful tool for diagnosing the presence of systematics. The constrained nature of MSFs makes fitting these functions a non-trivial task. We introduce maxsmooth, an open source package that uses quadratic programming to rapidly fit MSFs. We demonstrate the efficiency and reliability of maxsmooth by comparison to commonly used fitting routines and show that we can reduce the fitting time by approximately two orders of magnitude. We also introduce, and implement with maxsmooth, Partially Smooth Functions, which are useful for describing elements of non-smooth structure in foregrounds. This work has been motivated by the problem of foreground modelling in 21-cm cosmology. We discuss applications of maxsmooth to 21-cm cosmology and highlight this with examples using data from the Experiment to Detect the Global Epoch of Reionization Signature (EDGES) and the Large-aperture Experiment to Detect the Dark Ages (LEDA). We demonstrate the presence of a sinusoidal systematic in the EDGES data with a log-evidence difference of $86.19 \pm 0.12$ when compared to a pure foreground fit. MSFs are applied to data from LEDA for the first time in this paper and we identify the presence of sinusoidal systematics. maxsmooth is pip-installable and available for download at: https://github.com/htjb/maxsmooth
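The defining MSF condition, that derivatives above some order have no zero crossings over the band, can be checked numerically. The sketch below illustrates that condition for a polynomial model; it is only an illustration of the constraint, not the quadratic-programming fit that maxsmooth itself performs, and the function name and `max_order` cutoff are assumptions for this example:

```python
import numpy as np

def is_maximally_smooth(coeffs, x, max_order=6):
    """Numerically check the MSF condition for a polynomial.

    `coeffs` are polynomial coefficients (highest power first, as in
    np.polyval). Returns True if no derivative of order 2..max_order
    changes sign on the sample points `x`, i.e. no inflection points or
    zero crossings in the high-order derivatives over the band.
    """
    poly = np.poly1d(coeffs)
    for m in range(2, max_order + 1):
        dvals = poly.deriv(m)(x)
        # A sign change means both positive and negative values appear.
        if np.any(dvals > 0) and np.any(dvals < 0):
            return False
    return True
```

In a fit, maxsmooth enforces constraints of this type while minimizing the residual to the data; a Partially Smooth Function relaxes the condition on a chosen subset of derivative orders.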
The separation of cosmological signal from astrophysical foregrounds is a fundamental challenge for any effort to probe the evolution of neutral hydrogen during the Cosmic Dawn and epoch of reionization (EoR) using the 21 cm hyperfine transition. Foreground separation is made possible by the foregrounds' intrinsic spectral smoothness, which makes them distinguishable from the spectrally complex cosmological signal even though they are ~5 orders of magnitude brighter. Precisely calibrated radio interferometers are essential to maintaining the smoothness and thus separability of the foregrounds. One powerful calibration strategy is to use redundant measurements between pairs of antennas with the same physical separation in order to solve for each antenna's spectral response without reference to a sky model. This strategy is being employed by the Hydrogen Epoch of Reionization Array (HERA), a large radio telescope in South Africa that is now observing while being built out to 350 14-m dishes. However, the deviations from perfect redundancy inherent in any real radio telescope complicate the calibration problem. Using simulations of HERA, we show how antenna-to-antenna variations in dish construction and placement generally lead to calibration errors that imprint spectral structure on otherwise smooth foregrounds, significantly reducing the number of cosmological modes available to a 21 cm measurement. However, we also show that this effect can be largely eliminated by a modified redundant-baseline calibration strategy that relies predominantly on short baselines.
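Redundant-baseline calibration exploits the fact that identical baselines should measure the same sky visibility, so that $V_{ij} = g_i g_j^* y_{t(ij)}$ can be solved for the per-antenna gains $g_i$ and per-group visibilities $y_t$ without a sky model. A toy amplitude-only version ("logcal") becomes a linear least-squares problem in the logarithms. The sketch below illustrates that idea only; real pipelines solve the full complex problem and handle its additional phase degeneracies, and the function name and gauge choice here are assumptions for this example:

```python
import numpy as np

def redundant_logcal(vis, pairs, types, n_ant, n_types):
    """Toy amplitude-only redundant calibration ('logcal').

    Solves log|V_ij| = log|g_i| + log|g_j| + log|y_t| by linear least
    squares, where pairs[k] = (i, j) are the antennas of measurement k
    and types[k] is its redundant-baseline group. The overall amplitude
    degeneracy is fixed by pinning the sum of log-gains to zero.
    """
    A = np.zeros((len(vis) + 1, n_ant + n_types))
    b = np.zeros(len(vis) + 1)
    for k, ((i, j), t) in enumerate(zip(pairs, types)):
        A[k, i] += 1.0                # log|g_i|
        A[k, j] += 1.0                # log|g_j|
        A[k, n_ant + t] += 1.0        # log|y_t| for this redundant group
        b[k] = np.log(np.abs(vis[k]))
    A[-1, :n_ant] = 1.0               # gauge constraint: sum of log-gains = 0
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    return np.exp(sol[:n_ant]), np.exp(sol[n_ant:])
```

Non-redundancy enters this picture when nominally identical baselines no longer share a common $y_t$, which biases the recovered gains and imprints the spectral structure described above.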
We use the results of previous work building a halo model formalism for the distribution of neutral hydrogen, along with experimental parameters of future radio facilities, to forecast astrophysical and cosmological parameter constraints from next-generation surveys. We consider 21 cm intensity mapping surveys conducted using the BINGO, CHIME, FAST, TianLai, MeerKAT and SKA experimental configurations. We work with the five-parameter cosmological dataset $\{\Omega_m, \sigma_8, h, n_s, \Omega_b\}$, assuming a flat $\Lambda$CDM model, and the astrophysical parameters $\{v_{c,0}, \beta\}$, which represent the cutoff and slope of the HI-halo mass (HIHM) relation. We explore (i) quantifying the effects of the astrophysics on the recovery of the cosmological parameters, (ii) the dependence of the cosmological forecasts on the details of the astrophysical parametrization, and (iii) the improvement of the constraints from probing smaller scales in the HI power spectrum. For an SKA I MID intensity mapping survey alone, probing scales up to $\ell_{\rm max} = 1000$, we find a factor of $1.1 - 1.3$ broadening in the constraints on $\Omega_b$ and $\Omega_m$, and of $2.4 - 2.6$ on $h$, $n_s$ and $\sigma_8$, if we marginalize over the astrophysical parameters without any priors. However, even the prior information coming from present knowledge of the astrophysics largely alleviates this broadening. These findings do not change significantly when considering an extended HIHM relation, illustrating the robustness of the results to the choice of astrophysical parametrization. Probing scales up to $\ell_{\rm max} = 2000$ improves the constraints by factors of $1.5 - 1.8$. The forecasts improve as the number of tomographic redshift bins increases, saturating in many cases with 4 - 5 redshift bins. We also forecast constraints for intensity mapping with other experiments, and draw similar conclusions.
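The broadening and its alleviation by priors can be illustrated with standard Fisher-matrix algebra: marginalizing over nuisance parameters means inverting the full Fisher matrix, while a Gaussian prior on a nuisance parameter adds $1/\sigma_{\rm prior}^2$ to its diagonal entry before inversion. A minimal sketch of that bookkeeping follows; the function name and ordering convention (cosmological parameters first) are assumptions for this example, not the paper's forecasting code:

```python
import numpy as np

def marginalized_errors(F, n_cosmo, prior_sigma=None):
    """1-sigma marginalized errors on the first `n_cosmo` parameters.

    F is a Fisher matrix ordered with cosmological parameters first and
    astrophysical (nuisance) parameters last. Optional Gaussian priors of
    width `prior_sigma[k]` on the nuisance parameters are applied by
    adding 1/sigma^2 to the corresponding diagonal entries.
    """
    F = F.copy()
    if prior_sigma is not None:
        for k, s in enumerate(prior_sigma):
            F[n_cosmo + k, n_cosmo + k] += 1.0 / s**2
    cov = np.linalg.inv(F)            # marginalization = full inverse
    return np.sqrt(np.diag(cov)[:n_cosmo])
```

With no priors, correlations between cosmological and astrophysical parameters inflate the marginalized errors; as the priors tighten, the errors shrink back toward the fixed-nuisance values, which is the alleviation described above.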