New and upcoming radio interferometers will produce unprecedented amounts of data that demand extremely powerful computers for processing. The required computational power and the associated energy costs are limiting factors that restrict several key data processing steps in radio interferometry. One such step is calibration, in which systematic errors in the data are determined and corrected. Accurate calibration is essential for reaching many scientific goals in radio astronomy, and consensus optimization that exploits the continuity of systematic errors across frequency significantly improves calibration accuracy. To reach full consensus, however, data at all frequencies need to be calibrated simultaneously. In the SKA regime, this can become intractable if the available compute agents do not have the resources to process data from all frequency channels at once. In this paper, we propose a multiplexing scheme based on the alternating direction method of multipliers (ADMM) with cyclic updates. With this scheme, it is possible to calibrate the full dataset simultaneously using far fewer compute agents than the number of frequencies at which data are available. We give simulation results to show the feasibility of the proposed multiplexing scheme in simultaneously calibrating a full dataset when a limited number of compute agents are available.
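The cyclic-update idea can be illustrated on a toy scalar consensus problem. This is only a sketch under stated assumptions: the per-channel quadratic cost, the single consensus variable `z` (standing in for a frequency-smooth gain model), the penalty `rho`, and the channel/agent counts are all illustrative choices, not the paper's formulation.

```python
import numpy as np

# Toy consensus ADMM with cyclic (multiplexed) updates: n_agents < n_freq,
# so only a subset of frequency channels is updated per iteration, cycling
# until every channel has been visited many times.
rng = np.random.default_rng(0)
n_freq, n_agents, rho = 12, 3, 1.0
d = 2.0 + 0.1 * rng.standard_normal(n_freq)   # noisy per-channel "data"

g = np.zeros(n_freq)        # local per-channel estimates
u = np.zeros(n_freq)        # scaled dual variables
z = 0.0                     # consensus variable

for it in range(400):
    # cyclic multiplexing: pick the next n_agents channels round-robin
    sel = [(it * n_agents + k) % n_freq for k in range(n_agents)]
    for f in sel:
        # local update: argmin_g (g - d_f)^2 + (rho/2)(g - z + u_f)^2
        g[f] = (2.0 * d[f] + rho * (z - u[f])) / (2.0 + rho)
    z = np.mean(g + u)                  # consensus update over all channels
    for f in sel:
        u[f] += g[f] - z                # dual update for updated channels only

print(abs(z - d.mean()))  # consensus approaches the channel mean
```

Despite each channel being touched only every `n_freq / n_agents` iterations, the consensus variable still converges, which is the essence of multiplexing agents across channels.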
The redshifted 21 cm line of neutral hydrogen is a promising probe of the Epoch of Reionization (EoR). However, its detection requires a thorough understanding and control of the systematic errors. We study two systematic biases observed in the LOFAR EoR residual data after calibration and subtraction of bright discrete foreground sources. The first effect is a suppression in the diffuse foregrounds, which could potentially mean a suppression of the 21 cm signal. The second effect is an excess of noise beyond the thermal noise. The excess noise shows fluctuations on small frequency scales, and hence it cannot be easily removed by foreground removal or avoidance methods. Our analysis suggests that sidelobes of residual sources due to the chromatic point spread function and ionospheric scintillation cannot be the dominant causes of the excess noise. Rather, both the suppression of diffuse foregrounds and the excess noise can occur due to calibration with an incomplete sky model containing predominantly bright discrete sources. We show that calibrating only on bright sources can cause suppression of other signals and introduce an excess noise in the data. The levels of suppression and excess noise depend on the flux of sources not included in the model relative to the flux of the modeled sources. We discuss possible solutions, such as using only long baselines to calibrate the interferometric gain solutions as well as simultaneous multi-frequency calibration, along with their benefits and shortcomings.
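The suppression mechanism admits a minimal noise-free illustration. The following is a toy single-baseline setup with hypothetical flux values, not the LOFAR analysis: a per-channel gain fitted against an incomplete sky model absorbs the unmodeled flux, so the unmodeled source vanishes from the corrected residuals.

```python
import numpy as np

n = 64                                     # frequency channels
m = np.full(n, 10.0 + 0j)                  # modeled bright source (10 Jy, flat)
u = 1.0 * np.exp(2j * np.pi * 0.2 * np.arange(n))  # unmodeled 1 Jy fringe
g_true = 1.0 + 0j
d = g_true * (m + u)                       # observed visibilities, noise-free

# per-channel least-squares gain solved against the *incomplete* model m:
# g_hat minimises |d - g m|^2, i.e. g_hat = d m* / |m|^2 = (m + u) / m here
g_hat = d * np.conj(m) / np.abs(m) ** 2

residual = d / g_hat - m                   # corrected data minus model
print(np.max(np.abs(residual)))            # ~0: the unmodeled source is absorbed
```

Because the solve has enough freedom per channel, the 1 Jy unmodeled signal is entirely soaked up by the gains; with more baselines the absorption is partial, and the leftover appears as the kind of excess small-scale structure the abstract describes, scaling with the unmodeled-to-modeled flux ratio.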
It has recently been shown that radio interferometric gain calibration can be expressed succinctly in the language of complex optimisation. In addition to providing an elegant framework for further development, it exposes properties of the calibration problem which can be exploited to accelerate traditional non-linear least squares solvers such as Gauss-Newton and Levenberg-Marquardt. We extend existing derivations to chains of Jones terms: products of several gains which model different aberrant effects. In doing so, we find that the useful properties found in the single-term case still hold. We also develop several specialised solvers which deal with complex gains parameterised by real values. The newly developed solvers have been implemented in a Python package called CubiCal, which uses a combination of Cython, multiprocessing and shared memory to leverage the power of modern hardware. We apply CubiCal to both simulated and real data, and perform both direction-independent and direction-dependent self-calibration. Finally, we present the results of some rudimentary profiling to show that CubiCal is competitive with existing calibration tools such as MeqTrees.
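One concrete payoff of the complex-optimisation view is that, for diagonal gains, the Gauss-Newton update collapses to a closed-form per-antenna solve. The following is a hedged StEFCal-style sketch of that idea, not CubiCal's API; the antenna count, random model matrix, and averaging factor are assumptions for the toy.

```python
import numpy as np

rng = np.random.default_rng(2)
n_ant = 8
g_true = 1 + 0.3 * (rng.standard_normal(n_ant) + 1j * rng.standard_normal(n_ant))
M = rng.standard_normal((n_ant, n_ant)) + 1j * rng.standard_normal((n_ant, n_ant))
M = (M + M.conj().T) / 2                        # Hermitian model coherencies
V = np.outer(g_true, g_true.conj()) * M         # observed: V_pq = g_p M_pq g_q*

g = np.ones(n_ant, dtype=complex)
for _ in range(300):
    g_new = np.empty_like(g)
    for p in range(n_ant):
        # holding all other gains fixed, the cost is linear in g_p:
        # g_p = sum_q V_pq y_q* / sum_q |y_q|^2  with  y_q = M_pq g_q*
        y = M[p, :] * g.conj()
        g_new[p] = np.vdot(y, V[p, :]) / np.vdot(y, y)
    g = 0.5 * (g + g_new)                        # averaging stabilises the iteration

V_hat = np.outer(g, g.conj()) * M
print(np.max(np.abs(V_hat - V)))                 # reconstruction error
```

The recovered gains carry an arbitrary global phase, so convergence is checked on the reconstructed visibilities rather than on the gains themselves; autocorrelations are included here purely for simplicity.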
The Epoch of Reionisation (EoR) is the period within which the neutral universe transitioned to an ionised one. This period remains unobserved using low-frequency radio interferometers which target the 21 cm signal of neutral hydrogen emitted in this era. The Murchison Widefield Array (MWA) radio telescope was built with the detection of this signal as one of its major science goals. One of the most significant challenges towards a successful detection is that of calibration, especially in the presence of the Earth's ionosphere. By introducing refractive source shifts, distorting source shapes and scintillating flux densities, the ionosphere is a major nuisance in low-frequency radio astronomy. We introduce SIVIO, a software tool developed for simulating observations of the MWA through different ionospheric conditions estimated using thin-screen approximation models and propagated into the visibilities. This enables us to directly assess the impact of the ionosphere on observed EoR data and the resulting power spectra. We show that the simulated data capture the dispersive behaviour of ionospheric effects. We show that the spatial structure of the simulated ionospheric media is accurately reconstructed either from the resultant source positional offsets or from parameters evaluated during the data calibration procedure. In turn, this will inform the best strategies for identifying and efficiently eliminating ionospheric contamination in EoR data moving into the Square Kilometre Array era.
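The dispersive behaviour referred to above follows from the thin-screen picture: a refractive source offset is proportional to the gradient of the total electron content (TEC) at the pierce point, scaled by the squared wavelength. A minimal sketch (the TEC surface, coordinates, and proportionality constant here are illustrative assumptions, not SIVIO's implementation):

```python
import numpy as np

c = 299_792_458.0  # speed of light, m/s

def tec(x, y):
    # hypothetical smooth TEC screen (arbitrary units) over pierce-point coords
    return 10.0 + 0.5 * np.sin(0.1 * x) + 0.3 * np.cos(0.15 * y)

def refractive_offset(x, y, freq_hz, h=1e-3):
    # offset proportional to lambda^2 * grad(TEC); physical constant absorbed
    lam = c / freq_hz
    gx = (tec(x + h, y) - tec(x - h, y)) / (2 * h)
    gy = (tec(x, y + h) - tec(x, y - h)) / (2 * h)
    return lam ** 2 * np.array([gx, gy])

off_150 = refractive_offset(3.0, 7.0, 150e6)   # MWA-like band
off_75 = refractive_offset(3.0, 7.0, 75e6)     # half the frequency
print(np.linalg.norm(off_75) / np.linalg.norm(off_150))  # ~4: lambda^2 scaling
```

Halving the observing frequency doubles the wavelength, so the offset grows fourfold, which is exactly the λ² signature used to separate ionospheric shifts from achromatic position errors.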
Heterodyne receivers register the sky signal on either a circular polarization basis (where it is split into left-hand and right-hand circular polarization) or a linear polarization basis (where it is split into horizontal and vertical linear polarization). We study the problem of interferometric observations performed with telescopes that observe on different polarization bases, hence producing visibilities that we call mixed basis (i.e., linear in one telescope and circular in the other). We present novel algorithms for the proper calibration and treatment of such interferometric observations and test our algorithms with both simulations and real data. The use of our algorithms will be important for the optimum calibration of forthcoming observations with the Atacama Large mm/submm Array (ALMA) in very-long-baseline interferometry (VLBI) mode. Our algorithms will also allow us to optimally calibrate future VLBI observations at very high data rates (i.e., wide bandwidths), where linear-polarization feeds will be preferable at some stations, to overcome the polarimetric limitations due to the use of quarter-wave plates.
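A minimal sketch of the basis algebra involved follows. Note the sign convention R = (X - iY)/√2, L = (X + iY)/√2 is an assumption here (conventions vary between texts), and the function names are illustrative, not from the published algorithms.

```python
import numpy as np

# linear -> circular feed transform under the assumed convention
C = np.array([[1, -1j],
              [1,  1j]]) / np.sqrt(2)

def to_circular(V_lin):
    # both stations converted to the circular basis: V_circ = C V_lin C^H
    return C @ V_lin @ C.conj().T

def to_mixed(V_lin):
    # mixed basis: station A stays linear, station B converted to circular
    return V_lin @ C.conj().T

# unpolarized unit-flux source: linear-basis coherency is proportional to I
V_lin = 0.5 * np.eye(2, dtype=complex)
print(np.round(to_circular(V_lin), 6))  # RR = LL = 0.5, RL = LR = 0
```

Because C is unitary, an unpolarized source looks identical in both pure bases, while the mixed-basis visibilities mix the correlation products; undoing that mixing consistently across stations is the crux of calibrating mixed-basis VLBI data.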
Hydrogen intensity mapping is a new field in astronomy that promises to make three-dimensional maps of the matter distribution of the Universe using the redshifted $21\,\textrm{cm}$ line of neutral hydrogen gas (HI). Several ongoing and upcoming radio interferometers, such as Tianlai, CHIME, HERA, and HIRAX, are using this technique. These instruments are designed to map large swaths of the sky by drift scanning over periods of many months. One of the challenges of the observations is that the daytime data are contaminated by strong radio signals from the Sun. In the case of Tianlai, this results in almost half of the measured data being unusable. We try to address this issue by developing an algorithm for solar contamination removal (AlgoSCR) from the radio data. The algorithm is based on an eigenvalue analysis of the visibility matrix, and hence is applicable only to interferometers. We apply AlgoSCR to simulated visibilities, as well as real daytime data from the Tianlai dish array. The algorithm can remove most of the solar contamination without seriously affecting other sky signals and thus makes the data usable for certain applications.
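The eigenvalue idea can be sketched on a toy visibility matrix. This is a simplified stand-in rather than the published AlgoSCR: the antenna count, solar flux, and rank-1 solar model are assumptions. A strong compact source adds an approximately rank-1 term to the antenna-by-antenna visibility matrix, so subtracting the dominant eigenmode removes most of it.

```python
import numpy as np

rng = np.random.default_rng(3)
n_ant = 16
phases = rng.uniform(0, 2 * np.pi, n_ant)
s = np.exp(1j * phases)                     # steering vector toward the "Sun"
sun = 100.0 * np.outer(s, s.conj())         # dominant rank-1 contamination

sky = rng.standard_normal((n_ant, n_ant)) + 1j * rng.standard_normal((n_ant, n_ant))
sky = (sky + sky.conj().T) / 2              # weak Hermitian sky visibilities

V = sky + sun
w, U = np.linalg.eigh(V)                    # Hermitian eigendecomposition
k = np.argmax(np.abs(w))                    # dominant eigenmode ~ solar term
V_clean = V - w[k] * np.outer(U[:, k], U[:, k].conj())

print(np.linalg.norm(V_clean) / np.linalg.norm(sun))  # far below 1
```

The subtraction also removes the small component of the sky signal that overlaps the solar eigenvector, which is why the abstract is careful to say the method works "without seriously affecting" other signals rather than perfectly preserving them.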