We present VOLKS2, the second release of the VLBI Observation for transient Localization Keen Searcher. The pipeline targets transient searches in regular VLBI observations as well as the detection of single pulses from known sources in dedicated VLBI observations. The underlying method adopts ideas from geodetic VLBI data processing, including fringe fitting to maximize the signal power and geodetic VLBI solving for localization. By filtering candidate signals with multiple windows within a baseline and by cross matching across multiple baselines, RFI is eliminated effectively. Unlike station auto-spectrum based methods, no RFI flagging is required in the VOLKS2 pipeline. An EVN observation (EL060) was carried out to verify the pipeline's detection efficiency and localization accuracy over the whole FoV. The pipeline is parallelized with MPI and further accelerated with GPUs to exploit the hardware resources of modern GPU clusters. We show that, with proper optimization, VOLKS2 achieves performance comparable to that of auto-spectrum based pipelines. All code and documentation are publicly available, in the hope that our pipeline will be useful for radio transient studies.
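As a rough illustration of the multi-baseline cross-matching idea described above, the following Python sketch keeps only candidate pulses whose arrival times coincide on several baselines; the data layout, time tolerance and baseline threshold are assumptions made for illustration, not the actual VOLKS2 implementation.

```python
# Illustrative sketch (not the VOLKS2 code): reject RFI by requiring that a
# candidate single pulse appears on several baselines within a small
# time-of-arrival tolerance. Names and thresholds are assumptions.
import numpy as np

def cross_match_baselines(candidates, min_baselines=3, tol=0.01):
    """candidates: list of (baseline_id, toa_seconds, snr) tuples."""
    toas = np.array([c[1] for c in candidates])
    order = np.argsort(toas)
    matched = []
    i = 0
    while i < len(order):
        # collect all candidates whose arrival times fall within `tol` seconds
        j = i
        while j < len(order) and toas[order[j]] - toas[order[i]] <= tol:
            j += 1
        group = [candidates[k] for k in order[i:j]]
        baselines = {c[0] for c in group}
        # RFI tends to be local to one station: keep only multi-baseline hits
        if len(baselines) >= min_baselines:
            matched.append(group)
        i = j
    return matched
```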
As the sensitivity and observing time of gravitational-wave detectors increase, a more diverse range of signals is expected to be observed from a variety of sources. In particular, long-lived gravitational-wave transients have received growing interest over the last decade. Because most long-duration signals are poorly modeled, detection must rely on generic search algorithms, which make few or no assumptions about the nature of the signal. However, the computational cost of those searches remains a limiting factor, which leads to sub-optimal sensitivity. Several detection algorithms have been developed to cope with this issue. In this paper, we present a new data analysis pipeline to search for unmodeled long-lived transient gravitational-wave signals with durations between 10 and 1000 s, based on an excess cross-power statistic in a network of detectors. The pipeline implements several new features that are intended to reduce computational cost and increase detection sensitivity for a wide range of signal morphologies. The method is generalized to a network of an arbitrary number of detectors and aims to provide a stable interface for further improvements. Comparisons with a previous implementation of a similar method on simulated and real gravitational-wave data show an overall increase in detection efficiency, depending on the signal morphology, and a computing time reduced by at least a factor of 10.
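The excess cross-power idea can be illustrated with a minimal two-detector sketch: build a normalised time-frequency cross-power map from short-time Fourier transforms of the two strain series and sum it over a time-frequency box. The normalisation, segment length and function names below are assumptions for illustration, not the pipeline's implementation.

```python
# Minimal illustration of an excess cross-power statistic for two detectors.
import numpy as np
from scipy.signal import stft

def cross_power_map(h1, h2, fs, seg=1.0):
    """Normalised time-frequency cross-power of two strain series at fs Hz."""
    nper = int(seg * fs)
    f, t, X1 = stft(h1, fs=fs, nperseg=nper)
    _, _, X2 = stft(h2, fs=fs, nperseg=nper)
    # cross-power normalised by the product of the spectral amplitudes
    cross = np.real(X1 * np.conj(X2))
    norm = np.abs(X1) * np.abs(X2) + 1e-30
    return f, t, cross / norm

def box_statistic(cp_map, f, t, f0, f1, t0, t1):
    """Sum the normalised cross-power over a time-frequency box."""
    fm = (f >= f0) & (f <= f1)
    tm = (t >= t0) & (t <= t1)
    return cp_map[np.ix_(fm, tm)].sum()
```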
Currently, HOPS and AIPS are the primary choices for the time-consuming process of (millimeter) Very Long Baseline Interferometry (VLBI) data calibration. However, for a full end-to-end pipeline, they either lack the ability to perform easily scriptable incremental calibration or do not provide full control over the workflow with the ability to manipulate and edit calibration solutions directly. The Common Astronomy Software Application (CASA) offers all these abilities, together with a secure development future and an intuitive Python interface, which is very attractive for young radio astronomers. Inspired by the recent addition of a global fringe fitter, the capability to convert FITS-IDI files to measurement sets, and amplitude calibration routines based on ANTAB metadata, we have developed the CASA-based Radboud PIpeline for the Calibration of high Angular Resolution Data (rPICARD). In its first release, the pipeline will be able to handle data from multiple arrays: the EHT, GMVA, VLBA and EVN. Polarization and phase-referencing calibration are supported, and a spectral line mode will be added in the future. The large bandwidths of future radio observatories call for scalable reduction software. Within CASA, a message passing interface (MPI) implementation is used for parallelization, reducing the total time needed for processing. The most significant gain is obtained for the time-consuming fringe-fitting task, where each scan can be processed in parallel.
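A hedged sketch of the kind of scriptable, incremental CASA calibration that rPICARD builds on is shown below; it uses standard CASA tasks (importfitsidi, fringefit, applycal), but the file names, reference antenna and parameter values are placeholders rather than the pipeline's actual defaults.

```python
# Sketch of an incremental VLBI calibration step in CASA; values are
# illustrative placeholders, not rPICARD's actual configuration.
from casatasks import importfitsidi, fringefit, applycal

# Convert correlator output (FITS-IDI) into a CASA measurement set
importfitsidi(fitsidifile=['exp_1.idifits'], vis='exp.ms')

# Global fringe fit: solve for phase, delay and rate per scan
fringefit(vis='exp.ms', caltable='exp.fringe',
          solint='inf', refant='EF', minsnr=5, parang=True)

# Apply the fringe solutions on top of any earlier calibration tables
applycal(vis='exp.ms', gaintable=['exp.fringe'], parang=True)
```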
We describe the difference imaging pipeline (DiffImg) used to detect transients in deep images from the Dark Energy Survey Supernova program (DES-SN) in its first observing season, from Aug 2013 through Feb 2014. DES-SN is a search for transients in which ten 3-deg^2 fields are repeatedly observed in the g,r,i,z passbands with a cadence of about 1 week. The observing strategy has been optimized to measure high-quality light curves and redshifts for thousands of Type Ia supernovae (SNe Ia) with the goal of measuring dark energy parameters. The essential DiffImg functions are to align each search image to a deep reference image, perform a pixel-by-pixel subtraction, and then examine the subtracted image for significant positive detections of point-source objects. The vast majority of detections are subtraction artifacts, but after selection requirements and image filtering with an automated scanning program, there are 130 detections per deg^2 per observation in each band, of which only 25% are artifacts. Of the 7500 transients discovered by DES-SN in its first observing season, each requiring a detection on at least 2 separate nights, Monte Carlo simulations predict that 27% are supernovae. Another 30% of the transients are artifacts, and most of the remaining transients are AGN and variable stars. Fake SNe Ia are overlaid onto the images to rigorously evaluate detection efficiencies and to understand the DiffImg performance. The DiffImg efficiency measured with fake SNe agrees well with expectations from a Monte Carlo simulation that uses analytical calculations of the fluxes and their uncertainties. In our 8 shallow fields, with a single-epoch 50% completeness depth of 23.5, the SN Ia efficiency falls to 1/2 at redshift z ≈ 0.7; in our 2 deep fields, with a mag-depth of 24.5, the efficiency falls to 1/2 at z ≈ 1.1.
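The core DiffImg steps (subtract a deep reference from an aligned search image and flag significant positive detections) can be illustrated with the toy Python sketch below; the noise estimate, the 5-sigma threshold and the assumption of pre-aligned, PSF-matched images are illustrative simplifications, not the DES-SN selection requirements.

```python
# Toy illustration of difference imaging: subtract a reference image from a
# search image (assumed already aligned and PSF-matched) and flag significant
# positive detections. Thresholds are illustrative only.
import numpy as np
from scipy import ndimage

def detect_transients(search_img, ref_img, nsigma=5.0):
    diff = search_img - ref_img
    # robust noise estimate from the median absolute deviation of the residuals
    sigma = 1.4826 * np.median(np.abs(diff - np.median(diff)))
    mask = diff > nsigma * sigma
    # group contiguous above-threshold pixels into candidate objects
    labels, nobj = ndimage.label(mask)
    centroids = ndimage.center_of_mass(diff, labels, range(1, nobj + 1))
    return centroids  # (y, x) pixel positions of candidate detections
```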
The Australian Square Kilometre Array Pathfinder (ASKAP) collects images of the sky at radio wavelengths with an unprecedented field of view, combined with high angular resolution and sub-millijansky sensitivities. The large quantity of data produced is used by the ASKAP Variables and Slow Transients (VAST) survey science project to study the dynamic radio sky. Efficient pipelines are vital in such research, where searches often form a 'needle in a haystack' type of problem to solve. However, the existing pipelines developed within the radio-transient community are not suitable for the scale of ASKAP datasets. In this paper we provide a technical overview of the new VAST Pipeline: a modern and scalable Python-based data pipeline for transient searches, using up-to-date dependencies and methods. The pipeline allows source association to be performed at scale using the Pandas DataFrame interface and the well-known Astropy crossmatch functions. The Dask Python framework is used to parallelise operations and to scale them both vertically and horizontally by means of a cluster of workers. A modern web interface for data exploration and querying has also been developed using the latest Django web framework combined with Bootstrap.
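A minimal sketch of DataFrame-based source association with the Astropy crossmatch utilities, in the spirit of (but not copied from) the VAST Pipeline, is given below; the column names and the 10-arcsec association radius are assumptions.

```python
# Sketch of source association: match new measurements against known sources
# using Pandas and Astropy's catalog crossmatch. Column names are assumed.
import pandas as pd
import astropy.units as u
from astropy.coordinates import SkyCoord

def associate(new_df, known_df, radius=10 * u.arcsec):
    new_coords = SkyCoord(ra=new_df['ra'].values * u.deg,
                          dec=new_df['dec'].values * u.deg)
    known_coords = SkyCoord(ra=known_df['ra'].values * u.deg,
                            dec=known_df['dec'].values * u.deg)
    idx, sep, _ = new_coords.match_to_catalog_sky(known_coords)
    out = new_df.copy()
    out['matched_id'] = known_df.index.values[idx]
    out['sep_arcsec'] = sep.arcsec
    # measurements with no counterpart within the radius start a new source
    out.loc[sep > radius, 'matched_id'] = -1
    return out
```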
The Transient Optical Sky Survey (TOSS) is an automated, ground-based telescope system dedicated to searching for optical transient events. Small telescope tubes are mounted on a tracking, semi-equatorial frame with a single polar axis. Each fixed-declination telescope records successive exposures which overlap in right ascension. Nightly observations produce time-series images of fixed fields within each declination band. We describe the TOSS data pipeline, including the automated routines used for image calibration, object detection and identification, astrometry, and differential photometry. Time series of nightly observations are accumulated in a database for each declination band. Despite the modest cost of the mechanical system, results from the 2009-2010 observing campaign confirm the system's capability for producing light curves of satisfactory accuracy. Transients can be extracted from the individual time series by identifying deviations from baseline variability.
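The final step, extracting transients as deviations from baseline variability, can be illustrated by the short sketch below; the robust scatter estimate and the 5-sigma cut are assumptions, not the TOSS pipeline's actual criteria.

```python
# Illustrative check for transient behaviour in a nightly light curve: flag
# epochs that deviate from the source's baseline scatter.
import numpy as np

def flag_transient_epochs(mags, nsigma=5.0):
    """mags: 1-D array of differential magnitudes from the time series."""
    baseline = np.median(mags)
    # robust scatter estimate so that the outburst itself does not inflate sigma
    scatter = 1.4826 * np.median(np.abs(mags - baseline))
    return np.abs(mags - baseline) > nsigma * scatter
```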