Currently, HOPS and AIPS are the primary choices for the time-consuming process of (millimeter) Very Long Baseline Interferometry (VLBI) data calibration. However, for a full end-to-end pipeline, they either lack the ability to perform easily scriptable incremental calibration or do not provide full control over the workflow with the ability to manipulate and edit calibration solutions directly. The Common Astronomy Software Application (CASA) offers all of these abilities, together with a secure development future and an intuitive Python interface, which is very attractive for young radio astronomers. Inspired by the recent addition of a global fringe fitter, the capability to convert FITS-IDI files to measurement sets, and amplitude calibration routines based on ANTAB metadata, we have developed the CASA-based Radboud PIpeline for the Calibration of high Angular Resolution Data (rPICARD). The pipeline will be able to handle data from multiple arrays; the first release supports the EHT, GMVA, VLBA, and EVN. Polarization and phase-referencing calibration are supported, and a spectral line mode will be added in the future. The large bandwidths of future radio observatories call for scalable reduction software. Within CASA, a Message Passing Interface (MPI) implementation is used for parallelization, reducing the total processing time. The most significant gain is obtained for the time-consuming fringe-fitting task, where each scan can be processed in parallel.
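To illustrate the scan-level parallelism described above, the following is a minimal sketch that distributes CASA fringefit calls over MPI ranks with mpi4py. This is not rPICARD's actual implementation (which uses CASA's built-in MPI framework); the measurement set name, scan list, and reference antenna are hypothetical.

```python
# Minimal sketch of scan-parallel fringe fitting (illustrative only;
# rPICARD uses CASA's internal MPI framework rather than mpi4py).
from mpi4py import MPI
from casatasks import fringefit  # modular CASA fringe fitter

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

scans = [str(s) for s in range(1, 101)]  # hypothetical scan numbers
my_scans = scans[rank::size]             # round-robin over MPI ranks

for scan in my_scans:
    # Scans are independent, so each rank can fringe-fit its own
    # subset and write the solutions to a per-scan calibration table.
    fringefit(vis='obs.ms', caltable=f'fringe_scan{scan}.cal',
              scan=scan, refant='EF', solint='inf')

comm.Barrier()  # wait for all ranks before merging the tables
```

Because fringe solutions for different scans are mutually independent, this distribution scales nearly linearly with the number of ranks until I/O becomes the bottleneck.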
Realistic synthetic observations of theoretical source models are essential for our understanding of real observational data. Using synthetic data, one can verify the extent to which source parameters can be recovered and evaluate how various data corruption effects can be calibrated. These studies are important when proposing observations of new sources, when characterizing the capabilities of new or upgraded instruments, and when comparing model-based theoretical predictions with observational data. We present the SYnthetic Measurement creator for long Baseline Arrays (SYMBA), a novel synthetic data generation pipeline for Very Long Baseline Interferometry (VLBI) observations. SYMBA takes into account several realistic atmospheric, instrumental, and calibration effects. We used SYMBA to create synthetic observations for the Event Horizon Telescope (EHT), a mm VLBI array, which has recently captured the first image of a black hole shadow. After testing SYMBA with simple source and corruption models, we study the importance of including all corruption and calibration effects. Based on two example general relativistic magnetohydrodynamics (GRMHD) model images of M87, we performed case studies to assess the attainable image quality with the current and future EHT array for different weather conditions. The results show that the effects of atmospheric and instrumental corruptions on the measured visibilities are significant. Despite these effects, we demonstrate how the overall structure of the input models can be recovered robustly after performing calibration steps. With the planned addition of new stations to the EHT array, images could be reconstructed with higher angular resolution and dynamic range. In our case study, these improvements allowed for a distinction between a thermal and a non-thermal GRMHD model based on salient features in reconstructed images.
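As a toy illustration of the kind of station-based corruptions described above (not SYMBA's actual code; the gain and noise parameters below are invented for the example), ideal visibilities on one baseline can be corrupted with complex station gains and thermal noise:

```python
# Toy station-based corruption of ideal visibilities (illustrative
# only; all parameter values are assumptions, not SYMBA defaults).
import numpy as np

rng = np.random.default_rng(42)
n_times = 200

# Ideal visibilities on one baseline (antennas i, j) of a 1 Jy
# point source: constant unit amplitude, zero phase.
v_ideal = np.ones(n_times, dtype=complex)

def station_gain(n):
    # Slow amplitude errors (e.g. opacity fluctuations) and a
    # random-walk phase mimicking tropospheric turbulence.
    amp = rng.normal(1.0, 0.05, n)
    phase = np.cumsum(rng.normal(0.0, 0.1, n))
    return amp * np.exp(1j * phase)

g_i, g_j = station_gain(n_times), station_gain(n_times)

# Corrupted visibility: V'_ij = g_i * conj(g_j) * V_ij + noise.
sigma = 0.02  # assumed per-sample thermal noise in Jy
noise = rng.normal(0, sigma, n_times) + 1j * rng.normal(0, sigma, n_times)
v_obs = g_i * np.conj(g_j) * v_ideal + noise
```

The station-based factorization g_i * conj(g_j) is what makes such corruptions calibratable: the same gains appear on every baseline to a station, so they can be solved for and removed.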
We present VOLKS2, the second release of the VLBI Observation for transient Localization Keen Searcher. The pipeline aims at transient searches in regular VLBI observations as well as the detection of single pulses from known sources in dedicated VLBI observations. The underlying method draws on geodetic VLBI data processing: fringe fitting to maximize the signal power and geodetic VLBI solving for localization. By filtering the candidate signals with multiple windows within a baseline and by cross-matching across multiple baselines, RFI is eliminated effectively. Unlike methods based on station auto spectra, the VOLKS2 pipeline does not require RFI flagging. An EVN observation (EL060) was carried out to verify the pipeline's detection efficiency and localization accuracy across the whole FoV. The pipeline is parallelized with MPI and further accelerated with GPUs to exploit the hardware resources of modern GPU clusters. We show that, with proper optimization, VOLKS2 achieves performance comparable to that of auto-spectrum-based pipelines. All code and documentation are publicly available, in the hope that our pipeline will be useful for radio transient studies.
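The baseline cross-matching step can be illustrated with a short sketch (hypothetical candidate lists and thresholds; not VOLKS2's actual code): a candidate is kept only if it is detected at compatible times on several baselines, which local RFI rarely is.

```python
# Sketch of cross-matching single-pulse candidates across baselines
# to reject RFI (illustrative; thresholds and inputs are invented).
import numpy as np

def cross_match(cands_per_bl, tol=0.02, min_bl=3):
    """Keep candidate times (s) from the first baseline that are also
    detected within `tol` seconds on at least `min_bl` baselines."""
    kept = []
    for t0 in cands_per_bl[0]:
        n_match = sum(
            1 for cands in cands_per_bl
            if np.any(np.abs(np.asarray(cands) - t0) < tol))
        if n_match >= min_bl:
            kept.append(t0)
    return kept

# Hypothetical candidate arrival times on four baselines: the pulse
# near 10.02 s is seen everywhere, the RFI spikes on one baseline each.
cands = [[10.02, 55.30], [10.03, 42.11], [10.02, 77.70], [10.01]]
print(cross_match(cands))  # -> [10.02]
```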
As part of an ongoing effort to simplify the data analysis path for VLBI experiments, a pipeline procedure has been developed at JIVE to carry out much of the data reduction required for EVN experiments in an automated fashion. This pipeline runs entirely within AIPS, the standard data reduction package used in astronomical VLBI, and provides preliminary calibration of EVN experiments correlated at the EVN MkIV data processor. As well as simplifying the analysis for EVN users, the pipeline reduces the delay in providing feedback on data quality to participating telescopes, hence improving the overall performance of the array. A description of this pipeline is presented here.
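For readers unfamiliar with scripted AIPS reduction, the sketch below shows how such calibration steps can be driven programmatically via ParselTongue, a Python interface to AIPS. FRING and CLCAL are real AIPS tasks, but the catalogue names and parameters here are illustrative, and the EVN pipeline itself is implemented as AIPS procedures rather than this code.

```python
# Illustrative ParselTongue-style driving of AIPS tasks (not the EVN
# pipeline's actual code; names and adverb values are invented).
from AIPS import AIPS
from AIPSTask import AIPSTask
from AIPSData import AIPSUVData

AIPS.userno = 999                         # hypothetical AIPS user number
uvdata = AIPSUVData('EXPT', 'UVDATA', 1, 1)

fring = AIPSTask('FRING')                 # fringe fitting
fring.indata = uvdata
fring.refant = 1                          # assumed reference antenna
fring.go()

clcal = AIPSTask('CLCAL')                 # apply solutions to CL table
clcal.indata = uvdata
clcal.go()
```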
The calibration and analysis of polarization observations in Very Long Baseline Interferometry (VLBI) requires the use of specific algorithms that suffer from several limitations, closely related to assumptions about the data properties that may not hold in observations taken with new-generation VLBI equipment. Nowadays, the instantaneous bandwidth achievable with VLBI backends can be as high as several GHz, covering several radio bands simultaneously. In addition, the sensitivity of VLBI observations with state-of-the-art equipment may reach dynamic ranges of tens of thousands, both in total intensity and in polarization. In this paper, we discuss the impact of the limitations of common VLBI polarimetry algorithms on narrow-field observations taken with modern VLBI arrays (from the VLBI Global Observing System, VGOS, to the Event Horizon Telescope, EHT) and present new software that overcomes these limitations. In particular, our software is able to perform a simultaneous fit of multiple calibrator sources, include non-linear terms in the model of the instrumental polarization, and use a self-calibration approach to estimate the polarization leakage in the antenna receivers.
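To make the leakage model concrete, the following is a toy linearized D-term fit, a sketch under invented parameters; the software described above fits multiple calibrators simultaneously and includes non-linear terms. The cross-hand visibilities of a weakly polarized point source are modeled as V_RL,ij ≈ p + D^R_i + conj(D^L_j), and the D-terms are recovered by complex least squares.

```python
# Toy linearized D-term fit (illustrative sketch only; all values
# are invented, and parallactic-angle factors are omitted).
import numpy as np

rng = np.random.default_rng(1)
n_ant, n_vis = 5, 400
i_idx = rng.integers(0, n_ant, n_vis)
j_idx = rng.integers(0, n_ant, n_vis)
keep = i_idx != j_idx                   # drop autocorrelations
i_idx, j_idx = i_idx[keep], j_idx[keep]
n = i_idx.size

# True leakages ("D-terms") of a few percent and a source fractional
# polarization assumed known (e.g. from a calibrator).
d_r = 0.03 * (rng.normal(size=n_ant) + 1j * rng.normal(size=n_ant))
d_l = 0.03 * (rng.normal(size=n_ant) + 1j * rng.normal(size=n_ant))
p_src = 0.10 + 0.05j

# Linearized cross-hand model plus thermal noise.
v_rl = p_src + d_r[i_idx] + np.conj(d_l[j_idx])
v_rl += 0.002 * (rng.normal(size=n) + 1j * rng.normal(size=n))

# Complex linear least squares for the D-terms. Note: a constant
# offset between all D^R and all D^L stays degenerate on a single
# scan; real fits break it with parallactic-angle coverage.
a = np.zeros((n, 2 * n_ant), dtype=complex)
a[np.arange(n), i_idx] = 1.0            # D^R_i columns
a[np.arange(n), n_ant + j_idx] = 1.0    # conj(D^L_j) columns
d_fit, *_ = np.linalg.lstsq(a, v_rl - p_src, rcond=None)
```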
We present the data reduction pipeline for CHARIS, a high-contrast integral-field spectrograph for the Subaru Telescope. The pipeline constructs a ramp from the raw reads using the measured nonlinear pixel response, and reconstructs the data cube using one of three extraction algorithms: aperture photometry, optimal extraction, or $\chi^2$ fitting. We measure and apply both a detector flatfield and a lenslet flatfield and reconstruct the wavelength- and position-dependent lenslet point-spread function (PSF) from images taken with a tunable laser. We use these measured PSFs to implement a $\chi^2$-based extraction of the data cube, with typical residuals of ~5% due to imperfect models of the undersampled lenslet PSFs. The full two-dimensional residual of the $\chi^2$ extraction allows us to model and remove correlated read noise, dramatically improving CHARIS performance. The $\chi^2$ extraction produces a data cube that has been deconvolved with the line-spread function, and never performs any interpolations of either the data or the individual lenslet spectra. The extracted data cube also includes uncertainties for each spatial and spectral measurement. CHARIS software is parallelized, written in Python and Cython, and freely available on GitHub with a separate documentation page. Astrometric and spectrophotometric calibrations of the data cubes and PSF subtraction will be treated in a forthcoming paper.
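To sketch the idea behind a $\chi^2$ extraction (a toy example with invented Gaussian templates; the real pipeline uses lenslet PSFs measured with a tunable laser), each lenslet stamp is modeled as a linear combination of per-wavelength PSF templates, and the fluxes minimizing the inverse-variance-weighted $\chi^2$ solve the normal equations:

```python
# Toy chi^2 extraction of one lenslet spectrum (illustrative only;
# template shapes and the noise model are assumptions, not CHARIS code).
import numpy as np

ny, nx, n_wav = 16, 16, 10
yy, xx = np.mgrid[:ny, :nx]

# Hypothetical PSF templates: Gaussians stepping across the stamp
# with wavelength, standing in for the measured lenslet PSFs.
centers = np.linspace(3, 12, n_wav)
templates = np.array([
    np.exp(-((xx - c)**2 + (yy - ny / 2)**2) / (2 * 1.2**2))
    for c in centers])

# Simulate a stamp: known fluxes times templates, plus read noise.
flux_true = np.linspace(1.0, 2.0, n_wav)
rng = np.random.default_rng(0)
data = np.tensordot(flux_true, templates, axes=1)
data += rng.normal(0.0, 0.01, data.shape)
ivar = np.full(data.shape, 1.0 / 0.01**2)  # inverse variance

# chi^2 = sum ivar * (data - sum_k f_k T_k)^2 -> normal equations
# (A^T W A) f = A^T W b, solved directly for the fluxes f.
t = templates.reshape(n_wav, -1)
w = ivar.ravel()
ata = (t * w) @ t.T
atb = (t * w) @ data.ravel()
flux_fit = np.linalg.solve(ata, atb)  # recovered spectrum
```

Because the fluxes come from a linear solve against explicit templates, this approach never interpolates the data, consistent with the property highlighted above.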