We report on the MIT Epoch of Reionization (MITEoR) experiment, a pathfinder low-frequency radio interferometer whose goal is to test technologies that improve the calibration precision and reduce the cost of the high-sensitivity 3D mapping required for 21 cm cosmology. MITEoR accomplishes this by using massive baseline redundancy, which enables both automated precision calibration and correlator cost reduction. We demonstrate and quantify the power and robustness of redundancy for scalability and precision. We find that the calibration parameters precisely describe the effect of the instrument upon our measurements, allowing us to form a model that is consistent with $\chi^2$ per degree of freedom < 1.2 for as much as 80% of the observations. We use these results to develop an optimal estimator of calibration parameters using Wiener filtering, and explore the question of how often and how finely in frequency visibilities must be reliably measured to solve for calibration coefficients. The success of MITEoR with its 64 dual-polarization elements bodes well for the more ambitious Hydrogen Epoch of Reionization Array (HERA) project and other next-generation instruments, which would incorporate many identical or similar technologies.
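The redundant-calibration idea underlying MITEoR can be illustrated with a toy "logcal" amplitude solve: if baselines of equal separation measure the same true visibility, then in log space each measured amplitude is a linear combination of per-antenna gain amplitudes and one unknown per redundant-baseline group, and the system can be solved by least squares. The sketch below is a minimal illustration under assumed values (a 1D four-antenna array, made-up gains and visibilities), not the MITEoR pipeline itself.

```python
import numpy as np

# Toy redundant-calibration sketch (amplitude-only "logcal"), assuming a
# 1D array of 4 antennas with unit spacing, so equal-length baselines are
# redundant.  All numerical values here are illustrative assumptions.
n_ant = 4
rng = np.random.default_rng(0)

true_gain_amp = rng.normal(1.0, 0.05, n_ant)   # per-antenna |g_i|
true_vis_amp = {1: 2.0, 2: 1.2, 3: 0.7}        # |y| per baseline length

# Simulated noiseless measurements: |v_ij| = |g_i| |g_j| |y_{j-i}|
pairs = [(i, j) for i in range(n_ant) for j in range(i + 1, n_ant)]
meas = {(i, j): true_gain_amp[i] * true_gain_amp[j] * true_vis_amp[j - i]
        for i, j in pairs}

# Linear system in log space: ln|v_ij| = ln|g_i| + ln|g_j| + ln|y_{j-i}|
lengths = sorted({j - i for i, j in pairs})
n_par = n_ant + len(lengths)
A = np.zeros((len(pairs), n_par))
b = np.zeros(len(pairs))
for row, (i, j) in enumerate(pairs):
    A[row, i] = A[row, j] = 1.0
    A[row, n_ant + lengths.index(j - i)] = 1.0
    b[row] = np.log(meas[(i, j)])

# Pin the overall-amplitude degeneracy by requiring sum(ln|g_i|) = 0
A = np.vstack([A, np.r_[np.ones(n_ant), np.zeros(len(lengths))]])
b = np.append(b, 0.0)

sol, *_ = np.linalg.lstsq(A, b, rcond=None)
gains = np.exp(sol[:n_ant])   # recovered |g_i| up to an overall scale
```

Because the overall amplitude is unconstrained by redundancy alone, the recovered gains agree with the true ones only up to a common multiplicative constant; real pipelines fix such degeneracies with a sky reference.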
Calibration precision is currently a limiting systematic in 21 cm cosmology experiments. While there are innumerable calibration approaches, most can be categorized as either `sky-based', relying on an extremely accurate model of astronomical foreground emission, or `redundant', requiring a precisely regular array with near-identical antenna response patterns. Both of these classes of calibration are inflexible to the realities of interferometric measurement. In practice, errors in the foreground model, antenna position offsets, and beam response inhomogeneities degrade calibration performance and contaminate the cosmological signal. Here we show that sky-based and redundant calibration can be unified into a highly general and physically motivated calibration framework based on a Bayesian statistical formalism. Our new framework includes sky and redundant calibration as special cases but can additionally support relaxing the rigid assumptions implicit in those approaches. Furthermore, we present novel calibration techniques such as redundant calibration for arrays with no redundant baselines, representing an alternative calibration method for imaging arrays such as the MWA Phase I. These new calibration approaches could mitigate systematics and reduce calibration error, thereby improving the precision of cosmological measurements.
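Schematically, and in our own notation rather than the paper's, the unification can be sketched as a single posterior over per-antenna gains $g_i$ and true visibilities $y_{ij}$:

```latex
P\!\left(\{g_i\},\{y_{ij}\}\,\middle|\,\{v_{ij}\}\right) \;\propto\;
\exp\!\left[-\sum_{i<j}\frac{\left|v_{ij}-g_i\,g_j^{*}\,y_{ij}\right|^{2}}{2\sigma_{ij}^{2}}\right]
P\!\left(\{y_{ij}\}\right),
```

where a delta-function prior pinning each $y_{ij}$ to a foreground model recovers sky-based calibration, while a prior forcing $y_{ij}$ to be equal within redundant-baseline groups (and flat otherwise) recovers redundant calibration; intermediate priors interpolate between the two and relax their rigid assumptions.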
In this white paper, we lay out a US roadmap for high-redshift 21 cm cosmology (6 < z < 30) in the 2020s. Beginning with the currently-funded HERA and MWA Phase II projects and advancing through the decade with a coordinated program of small-scale instrumentation, software, and analysis projects targeting technology development, this roadmap incorporates our current best understanding of the systematics confronting 21 cm cosmology into a plan for overcoming them, enabling next-generation, mid-scale 21 cm arrays to be proposed late in the decade. Submitted for consideration by the Astro2020 Decadal Survey Program Panel for Radio, Millimeter, and Submillimeter Observations from the Ground as a Medium-Sized Project.
An array of low-frequency dipole antennas on the lunar farside surface will probe a unique, unexplored epoch in the early Universe called the Dark Ages. It begins at Recombination, when neutral hydrogen atoms formed, first revealed by the cosmic microwave background. This epoch is free of stars and astrophysics, so it is ideal for investigating high-energy particle processes including dark matter, early Dark Energy, neutrinos, and cosmic strings. A NASA-funded study investigated the design of the instrument and the strategy for deploying, from a lander, 128 pairs of dipole antennas across a 10 km x 10 km area on the lunar surface. The antenna nodes are tethered to the lander for central data processing, power, and data transmission to a relay satellite. The array, named FARSIDE, would provide the capability to image the entire sky in 1400 channels spanning frequencies from 100 kHz to 40 MHz, extending down two orders of magnitude below bands accessible to ground-based radio astronomy. The lunar farside can simultaneously provide isolation from terrestrial radio frequency interference, the Earth's auroral kilometric radiation, and plasma noise from the solar wind. It is thus the only location within the inner solar system from which sky-noise-limited observations can be carried out at sub-MHz frequencies. Through precision calibration via an orbiting beacon and exquisite foreground characterization, the farside array would measure the Dark Ages global 21-cm signal at redshifts z~35-200. It will also be a pathfinder for a larger 21-cm power spectrum instrument by carefully measuring the foreground with high dynamic range.
We present a case study of a cloud-based computational workflow for processing large astronomical data sets from the Murchison Widefield Array (MWA) cosmology experiment. Cloud computing is well-suited to large-scale, episodic computation because it offers extreme scalability in a pay-for-use model. This facilitates fast turnaround times for testing computationally expensive analysis techniques. We describe how we have used the Amazon Web Services (AWS) cloud platform to efficiently and economically test and implement our data analysis pipeline. We discuss the challenges of working with the AWS spot market, which reduces costs at the expense of longer processing turnaround times, and we explore this tradeoff with a Monte Carlo simulation.
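The spot-market tradeoff described above lends itself to a simple Monte Carlo: spot instances are cheaper per hour but can be evicted, forcing a job to restart and lengthening its wall-clock turnaround. The sketch below is a minimal illustration with made-up prices, job lengths, and eviction rates (not the paper's measured values or the actual AWS pricing model, which bills partial hours differently).

```python
import numpy as np

# Illustrative Monte Carlo of spot-market cost vs. turnaround time.
# All rates and prices below are assumptions for demonstration only.
rng = np.random.default_rng(42)

job_hours = 6.0          # wall time of one analysis job if uninterrupted
p_evict_per_hour = 0.05  # assumed hourly spot-eviction probability
spot_price = 0.30        # assumed spot price, $/hour
ondemand_price = 1.00    # assumed on-demand price, $/hour
n_trials = 20000

def run_spot_job():
    """Return (total wall-clock hours, total cost) for one job that
    restarts from scratch whenever its spot instance is evicted."""
    hours = cost = 0.0
    while True:
        # Whole hours until eviction, geometrically distributed
        t_evict = rng.geometric(p_evict_per_hour)
        if t_evict >= job_hours:
            # Instance survives long enough: job completes this attempt
            return hours + job_hours, cost + job_hours * spot_price
        hours += t_evict               # wasted wall-clock time
        cost += t_evict * spot_price   # wasted hours are still billed

stats = np.array([run_spot_job() for _ in range(n_trials)])
mean_hours, mean_cost = stats.mean(axis=0)
ondemand_cost = job_hours * ondemand_price
```

Under these assumed numbers the spot job is slower on average than an on-demand run but substantially cheaper, which is exactly the tradeoff a simulation like this lets one quantify before committing a large processing campaign.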
Contamination from instrumental effects interacting with bright astrophysical sources is the primary impediment to measuring Epoch of Reionization and BAO 21 cm power spectra---an effect called mode-mixing. In this paper we identify four fundamental power spectrum shapes produced by mode-mixing that will affect all upcoming observations. We are able, for the first time, to explain the wedge-like structure seen in advanced simulations and to forecast the shape of an EoR window that is mostly free of contamination. Understanding the origins of these contaminations also enables us to identify calibration and foreground subtraction errors below the imaging limit, providing a powerful new tool for precision observations.
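The wedge-like structure described above has a simple geometric boundary: a flat-spectrum source at angle $\theta$ from zenith contaminates line-of-sight modes up to $k_\parallel = k_\perp \sin\theta \, H(z) D_C(z) / [c\,(1+z)]$, with the horizon ($\theta = 90°$) setting the edge of the contamination-free EoR window. The sketch below evaluates that slope under assumed cosmological parameters (flat LCDM with H0 = 70 km/s/Mpc, Omega_m = 0.3); the function names are ours, not from the paper.

```python
import numpy as np

# Hedged sketch of the horizon-limited "wedge" boundary in (k_perp, k_par)
# space.  Cosmology below is an illustrative assumption, not the paper's.
c_kms = 299792.458       # speed of light, km/s
H0, Om = 70.0, 0.3       # assumed flat-LCDM parameters

def hubble(z):
    """H(z) in km/s/Mpc for flat LCDM."""
    return H0 * np.sqrt(Om * (1 + z) ** 3 + (1 - Om))

def comoving_distance(z, n=10000):
    """Line-of-sight comoving distance in Mpc (trapezoidal integration)."""
    zz = np.linspace(0.0, z, n)
    integrand = c_kms / hubble(zz)
    return np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(zz))

def wedge_slope(z, theta=np.pi / 2):
    """Maximum k_par / k_perp contaminated by sources at angle theta;
    theta = pi/2 gives the horizon wedge."""
    return np.sin(theta) * comoving_distance(z) * hubble(z) / (c_kms * (1 + z))

# At an EoR redshift such as z = 8, horizon foregrounds contaminate
# modes with k_par < wedge_slope(8) * k_perp.
```

The slope grows with redshift, so the clean EoR window shrinks at higher z; this is why characterizing the wedge shape is central to forecasting which Fourier modes remain usable.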