One of the last unexplored windows to the cosmos, the Dark Ages and Cosmic Dawn, can be opened using a simple low-frequency radio telescope on the stable, radio-quiet lunar farside to measure the Global 21-cm spectrum. This frontier remains an enormous gap in our knowledge of the Universe, and standard models of physics and cosmology are untested during this critical epoch. The messenger of information about this period is the 1420 MHz (21-cm) radiation from the hyperfine transition of neutral hydrogen, redshifted to low radio astronomy frequencies by the expansion of the Universe. The Global 21-cm spectrum uniquely probes the cosmological model during the Dark Ages plus the evolving astrophysics during Cosmic Dawn, yielding constraints on the first stars, on accreting black holes, and on exotic physics such as dark matter-baryon interactions. A single low-frequency radio telescope can measure the Global spectrum between ~10 and 110 MHz because of the ubiquity of neutral hydrogen. Precise characterization of the telescope and its surroundings is required to detect this weak, isotropic emission of hydrogen amidst the bright foreground Galactic radiation. We describe how two antennas will permit observations over the full frequency band: a pair of orthogonal wire antennas and a 0.3-m$^3$ patch antenna. A four-channel correlation spectropolarimeter forms the core of the detector electronics. Technology challenges include advanced calibration to disentangle covariances between the bright foreground and the weak 21-cm signal, using techniques similar to those developed for the CMB; thermal management for temperature swings of >250 °C; and efficient power systems to allow operation through the two-week lunar night. This simple telescope sets the stage for a lunar farside interferometric array to measure the Dark Ages power spectrum.
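As an illustrative aside (not part of the original abstract), the band quoted above maps directly onto redshift through the standard relation for the redshifted 21-cm line, nu_obs = 1420.4 MHz / (1 + z); a few lines of Python make the correspondence explicit:

    # Illustrative only: map the ~10-110 MHz observing band to 21-cm redshift.
    NU_REST_MHZ = 1420.4  # rest-frame frequency of the hydrogen hyperfine line

    def redshift_of(nu_obs_mhz):
        """Redshift at which the 21-cm line appears at the given observed frequency."""
        return NU_REST_MHZ / nu_obs_mhz - 1.0

    # ~110 MHz corresponds to z ~ 12 (Cosmic Dawn) and ~10 MHz to z ~ 141 (Dark Ages).
    print(redshift_of(110.0), redshift_of(10.0))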
21-cm power spectrum observations have the potential to revolutionize our understanding of the Epoch of Reionization and Dark Energy, but require extraordinarily precise data analysis methods to separate the cosmological signal from the astrophysical and instrumental contaminants. This analysis challenge has led to a diversity of proposed analyses, including delay spectra, imaging power spectra, m-mode analysis, and numerous others. This diversity of approach is a strength, but it has also led to confusion within the community about whether insights gleaned by one group are applicable to teams working in different analysis frameworks. In this paper we show that all existing analysis proposals can be classified into two distinct families based on whether they estimate the power spectrum of the measured or the reconstructed sky. This subtle difference in the statistical question posed largely determines the susceptibility of the analyses to foreground emission and calibration errors, and ultimately the science different analyses can pursue. We detail the origin of the two analysis families, categorize the analyses being actively developed, and explore their relative sensitivities to foreground contamination and calibration errors.
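To make one of the approaches named above concrete, the sketch below (illustrative code of my own, not taken from any of the cited pipelines) computes a delay spectrum for a single baseline: the visibility is tapered, Fourier-transformed along frequency, and squared, so spectrally smooth foregrounds concentrate at low delays while line-of-sight cosmological structure appears at higher delays:

    import numpy as np
    from scipy.signal import get_window

    def delay_power_spectrum(vis, freqs_hz, window="blackmanharris"):
        """Illustrative single-baseline delay-spectrum estimate.

        vis      : complex visibility spectrum, shape (n_freq,)
        freqs_hz : evenly spaced channel frequencies in Hz
        Returns (delays_s, power) with power = |FT_nu[taper * vis]|^2.
        """
        n = len(freqs_hz)
        taper = get_window(window, n)            # suppress spectral leakage
        dnu = freqs_hz[1] - freqs_hz[0]          # channel width
        vis_tau = np.fft.fftshift(np.fft.fft(vis * taper))
        delays = np.fft.fftshift(np.fft.fftfreq(n, d=dnu))
        return delays, np.abs(vis_tau) ** 2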
An array of low-frequency dipole antennas on the lunar farside surface will probe a unique, unexplored epoch in the early Universe called the Dark Ages. It begins at Recombination, when neutral hydrogen atoms formed, an event first revealed by the cosmic microwave background. This epoch is free of stars and astrophysics, so it is ideal for investigating high-energy particle processes including dark matter, early Dark Energy, neutrinos, and cosmic strings. A NASA-funded study investigated the design of the instrument and the strategy for deploying, from a lander, 128 pairs of dipole antennas across a 10 km x 10 km area on the lunar surface. The antenna nodes are tethered to the lander for central data processing, power, and data transmission to a relay satellite. The array, named FARSIDE, would provide the capability to image the entire sky in 1400 channels spanning frequencies from 100 kHz to 40 MHz, extending two orders of magnitude below the bands accessible to ground-based radio astronomy. The lunar farside simultaneously provides isolation from terrestrial radio frequency interference, the Earth's auroral kilometric radiation, and plasma noise from the solar wind. It is thus the only location within the inner solar system from which sky-noise-limited observations can be carried out at sub-MHz frequencies. Through precision calibration via an orbiting beacon and exquisite foreground characterization, the farside array would measure the Dark Ages global 21-cm signal at redshifts z~35-200. It will also be a pathfinder for a larger 21-cm power spectrum instrument by carefully measuring the foreground with high dynamic range.
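As a rough, illustrative aside (my own numbers, not from the study), the quoted ~10 km array extent sets the diffraction-limited angular resolution, theta ~ lambda / D, across the 100 kHz-40 MHz band:

    # Back-of-the-envelope resolution estimate; the 10 km baseline is taken
    # from the array extent quoted above, everything else is standard physics.
    import math

    C = 299_792_458.0      # speed of light, m/s
    D_MAX = 10_000.0       # assumed maximum baseline, m

    for nu_mhz in (0.1, 1.0, 10.0, 40.0):
        wavelength_m = C / (nu_mhz * 1e6)
        theta_deg = math.degrees(wavelength_m / D_MAX)
        print(f"{nu_mhz:5.1f} MHz -> ~{theta_deg:.2f} deg")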
We present a methodology for ensuring the robustness of our analysis pipeline in separating the global 21-cm hydrogen cosmology signal from large systematics based on singular value decomposition (SVD) of training sets. We show how traditional goodness-of-fit metrics such as the $\chi^2$ statistic that assess the fit to the full data may not be able to detect a suboptimal extraction of the 21-cm signal when it is fit alongside one or more additional components, due to significant covariance between them. However, we find that comparing the number of SVD eigenmodes chosen by the pipeline for each component in a given fit to the distribution of eigenmodes chosen for synthetic data realizations created from training set curves can detect when one or more of the training sets is insufficient to optimally extract the signal. Furthermore, this test can distinguish which training set (e.g. foreground, 21-cm signal) needs to be modified in order to better describe the data and improve the quality of the 21-cm signal extraction. We also extend this goodness-of-fit testing to cases where a prior distribution derived from the training sets is applied and find that, in this case, the $\chi^2$ statistic as well as the recently introduced $\psi^2$ statistic are able to detect inadequacies in the training sets due to the increased restrictions imposed by the prior. Crucially, the tests described in this paper can be performed when analyzing any type of observations with our pipeline.
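The SVD step at the heart of the pipeline can be sketched in a few lines (an illustrative simplification with assumed variable names, not the pipeline's actual interface): eigenmodes are extracted from a matrix of training-set curves and a truncated basis is fit jointly to the data by linear least squares:

    import numpy as np

    def svd_eigenmodes(training_set, n_modes):
        """Leading SVD eigenmodes of a training set (one curve per row)."""
        # Right singular vectors are the spectral eigenmodes of the curves.
        _, _, vt = np.linalg.svd(training_set, full_matrices=False)
        return vt[:n_modes]

    def fit_components(data, foreground_modes, signal_modes):
        """Joint linear least-squares fit of foreground and 21-cm signal eigenmodes."""
        basis = np.vstack([foreground_modes, signal_modes]).T   # (n_freq, n_total)
        coeffs, *_ = np.linalg.lstsq(basis, data, rcond=None)
        n_fg = foreground_modes.shape[0]
        return foreground_modes.T @ coeffs[:n_fg], signal_modes.T @ coeffs[n_fg:]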
We present an investigation of the horizon and its effect on global 21-cm observations and analysis. We find that the horizon cannot be ignored when modeling low frequency observations. Even if the sky and antenna beam are known exactly, forward models cannot fully describe the beam-weighted foreground component without accurate knowledge of the horizon. When fitting data to extract the 21-cm signal, a single time-averaged spectrum or independent multi-spectrum fits may be able to compensate for the bias imposed by the horizon. However, these types of fits lack constraining power on the 21-cm signal, leading to large uncertainties on the signal extraction, in some cases larger in magnitude than the 21-cm signal itself. A significant decrease in signal uncertainty can be achieved by performing multi-spectrum fits in which the spectra are modeled simultaneously with common parameters. The cost of this greatly increased constraining power, however, is that the time dependence of the horizon's effect, which is more complex than its spectral dependence, must be precisely modeled to achieve a good fit. To aid in modeling the horizon, we present an algorithm and Python package for calculating the horizon profile of a given observation site from elevation data. We also address several practical concerns such as pixelization error, uncertainty in the horizon profile, and foreground obstructions such as surrounding buildings and vegetation. We demonstrate that our training set-based analysis pipeline can account for all of these factors, modeling the horizon well enough to precisely extract the 21-cm signal from simulated observations.
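The released package implements this in full; the sketch below is a deliberately simplified illustration of the basic idea (the function name, grid layout, and cutoff distance are my own assumptions): for each azimuth, the horizon elevation angle is the largest angular altitude of the surrounding terrain as seen from the site:

    import numpy as np

    def horizon_profile(elev_grid, dx, observer_ij, n_az=360, max_dist=50_000.0):
        """Simplified horizon profile from a gridded elevation model.

        elev_grid   : 2D terrain heights (m) on a regular grid with spacing dx (m)
        observer_ij : (row, col) indices of the observation site
        Returns the horizon elevation angle (deg) at n_az evenly spaced azimuths.
        """
        i0, j0 = observer_ij
        h0 = elev_grid[i0, j0]
        angles = np.zeros(n_az)
        dists = np.arange(dx, max_dist, dx)
        for k, az in enumerate(np.linspace(0.0, 2.0 * np.pi, n_az, endpoint=False)):
            # Sample terrain along this azimuth out to max_dist.
            ii = np.round(i0 + dists * np.cos(az) / dx).astype(int)
            jj = np.round(j0 + dists * np.sin(az) / dx).astype(int)
            keep = (ii >= 0) & (ii < elev_grid.shape[0]) & (jj >= 0) & (jj < elev_grid.shape[1])
            if not keep.any():
                continue
            heights = elev_grid[ii[keep], jj[keep]]
            # Horizon angle: largest elevation angle of terrain relative to the site.
            angles[k] = np.degrees(np.max(np.arctan2(heights - h0, dists[keep])))
        return angles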
The 21-cm signal of neutral hydrogen is a sensitive probe of the Epoch of Reionization (EoR) and Cosmic Dawn. Currently operating radio telescopes have ushered in a data-driven era of 21-cm cosmology, providing the first constraints on the astrophysical properties of the sources that drive this signal. However, extracting astrophysical information from the data is highly non-trivial and requires the rapid generation of theoretical templates over a wide range of astrophysical parameters. To this end, emulators are often employed, with previous efforts focused on predicting the power spectrum. In this work we introduce 21cmGEM, the first emulator of the global 21-cm signal from Cosmic Dawn and the EoR. The smoothness of the output signal is guaranteed by design. We train neural networks to predict the cosmological signal using a database of ~30,000 simulated signals created by varying seven astrophysical parameters: the star formation efficiency, the minimal mass of star-forming halos, the efficiency of the first X-ray sources, the spectral index and low-energy cutoff that parameterize their spectrum, the mean free path of ionizing photons, and the CMB optical depth. We test the performance with a set of ~2,000 simulated signals, showing that the relative error in the prediction has an r.m.s. of 0.0159. The algorithm is efficient, with a running time of 0.16 sec per parameter set. Finally, we use the database of models to check the robustness of relations between the features of the global signal and the astrophysical parameters that we previously reported.
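The emulation idea can be sketched with a toy example (this is not the 21cmGEM architecture or code; the layer sizes, library, and placeholder data below are assumptions): a small fully connected network learns the mapping from the seven astrophysical parameters to the global signal sampled on a fixed redshift grid:

    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.preprocessing import StandardScaler

    # Placeholder stand-ins for the training database (the real one holds ~30,000
    # simulated signals): X is the 7 astrophysical parameters per model, Y the
    # global signal T_21(z) in mK sampled at 100 redshifts.
    rng = np.random.default_rng(0)
    X = rng.uniform(size=(3_000, 7))
    Y = rng.normal(size=(3_000, 100))

    scaler = StandardScaler().fit(X)
    emulator = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=200)
    emulator.fit(scaler.transform(X), Y)       # learn parameters -> signal mapping

    # Prediction for a new parameter set takes a fraction of a second.
    theta = rng.uniform(size=(1, 7))
    t21_pred = emulator.predict(scaler.transform(theta))[0]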