We present a methodology for automated real-time analysis of a radio image data stream with the goal of finding transient sources. In contrast to previous work, the transients we are interested in occur on a time scale where dispersion starts to play a role, so we must search a higher-dimensional data space and yet work fast enough to keep up with the data stream in real time. The approach consists of five main steps: quality control, source detection, association, flux measurement, and physical parameter inference. We present parallelized methods based on convolutions and filters that can be accelerated on a GPU, allowing the pipeline to run in real time. In the parameter-inference step, we apply a convolutional neural network to the dynamic spectra obtained from the preceding steps. It infers physical parameters, among them the dispersion measure of the transient candidate. Based on critical values of these parameters, an alert can be sent out and the data saved for further investigation. Experimentally, the pipeline is applied to simulated data and to images from AARTFAAC (Amsterdam-ASTRON Radio Transients Facility And Analysis Centre), a transients facility based on the Low-Frequency Array (LOFAR). Results on simulated data demonstrate the efficacy of the pipeline, and in real data it discovered dispersed pulses. The current work targets transients on time scales that are longer than the fast transients of beam-formed searches but shorter than slow transients for which dispersion matters less. This fills a methodological gap that is relevant for the upcoming Square Kilometre Array (SKA). Additionally, since the analysis runs in real time, only data with promising detections need be saved to disk, providing a solution to the big-data problem that modern astronomy faces.
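As an illustration of the parameter-inference step, the sketch below shows a small CNN that regresses a dispersion measure from a candidate's dynamic spectrum. The architecture, input shape, and use of PyTorch are assumptions for illustration; the abstract does not specify the network actually used.

```python
# Minimal sketch (assumed architecture, not the paper's actual network):
# a CNN mapping a dynamic spectrum (frequency x time) of a transient
# candidate to an estimate of its dispersion measure.
import torch
import torch.nn as nn

class DMRegressor(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d((4, 4)),
        )
        self.head = nn.Linear(32 * 4 * 4, 1)  # single output: the DM estimate

    def forward(self, dynspec):
        # dynspec: (batch, 1, n_freq, n_time)
        return self.head(self.features(dynspec).flatten(1))

model = DMRegressor()
candidates = torch.randn(8, 1, 64, 128)  # 8 candidates, 64 channels, 128 time steps
dm_estimates = model(candidates)         # shape (8, 1)
```

An alert criterion could then be a simple threshold on the inferred parameters, triggering the save-to-disk path described above.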
In preparation for ESA's Euclid mission and the large amount of data it will produce, we train deep convolutional neural networks on Euclid simulations to distinguish solar system objects from other astronomical sources. Using transfer learning, we achieve good performance despite a small dataset of as few as 7512 images. Our best model correctly identifies objects with a top accuracy of 94%, which improves to 96% when Euclid's dither information is included. The neural network misses ~50% of the slowest-moving asteroids (v < 10 arcsec/h) but is otherwise able to correctly classify asteroids even down to 26 mag. We show that the same model also performs well at classifying stars, galaxies, and cosmic rays, and could potentially be applied to distinguish all types of objects in the Euclid data and other large optical surveys.
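A minimal transfer-learning setup in the spirit of this approach is sketched below; the backbone (ResNet-18), the frozen-feature strategy, and the training details are assumptions, as the abstract does not name the pretrained network.

```python
# Hypothetical transfer-learning sketch: reuse frozen ImageNet features and
# train only a small head for the binary SSO-vs-other classification task.
import torch
import torch.nn as nn
from torchvision import models

backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for p in backbone.parameters():
    p.requires_grad = False                          # freeze pretrained features
backbone.fc = nn.Linear(backbone.fc.in_features, 2)  # new trainable head

optimizer = torch.optim.Adam(backbone.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# One illustrative training step on a fake batch of image cutouts.
images = torch.randn(16, 3, 224, 224)
labels = torch.randint(0, 2, (16,))   # 1 = solar system object, 0 = other
loss = criterion(backbone(images), labels)
loss.backward()
optimizer.step()
```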
This work investigates the problem of detecting gravitational wave (GW) events based on simulated damped-sinusoid signals contaminated with white Gaussian noise. It is treated as a classification problem with one class for the events of interest. The proposed scheme consists of two successive steps: decomposing the data using a wavelet packet and representing the GW signal and noise by the derived decomposition coefficients, then determining the existence of any GW event using a convolutional neural network (CNN) with a logistic regression output layer. A distinguishing characteristic of this work is its comprehensive investigation of CNN structure, detection window width, data resolution, wavelet packet decomposition, and detection window overlap scheme. Extensive simulation experiments show excellent performance for reliable detection of signals with a range of GW model parameters and signal-to-noise ratios. While we use a simple waveform model in this study, we expect the method to be particularly valuable when the potential GW shapes are too complex to be characterized with a template bank.
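The wavelet-packet step could look like the sketch below; the sampling rate, wavelet family (db4), and decomposition depth are illustrative assumptions, since the paper explores several such settings.

```python
# Decompose a noisy damped sinusoid with a wavelet packet and arrange the
# coefficients into a 2D array that a CNN classifier could then consume.
import numpy as np
import pywt

fs = 4096                                  # assumed sampling rate (Hz)
t = np.arange(0, 0.25, 1 / fs)
signal = np.exp(-t / 0.05) * np.sin(2 * np.pi * 250 * t)  # damped sinusoid
data = signal + 0.5 * np.random.randn(t.size)             # white Gaussian noise

wp = pywt.WaveletPacket(data, wavelet='db4', maxlevel=4)
leaves = wp.get_level(4, order='freq')     # frequency-ordered leaf nodes
features = np.stack([node.data for node in leaves])  # (n_bands, n_coeffs)
print(features.shape)   # 2D coefficient map, one channel of CNN input
```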
We demonstrate a new technique for detecting radio transients based on interferometric closure quantities. The technique uses the bispectrum, the product of visibilities around a closed loop of baselines of an interferometer. The bispectrum is calibration independent, resistant to interference, and computationally efficient, so it can be built into correlators for real-time transient detection. Our technique could find celestial transients anywhere in the field of view and localize them to arcsecond precision. At the Karl G. Jansky Very Large Array (VLA), such a system would have a high survey speed and a 5-sigma sensitivity of 38 mJy on 10 ms timescales with 1 GHz of bandwidth. The ability to localize dispersed millisecond pulses to arcsecond precision in large volumes of interferometer data has several unique science applications. Localizing individual pulses from Galactic pulsars will help find X-ray counterparts that define their physical properties, while finding host galaxies of extragalactic transients will measure the electron density of the intergalactic medium with a single dispersed pulse. Exoplanets and active stars have distinct millisecond variability that can be used to identify them and probe their magnetospheres. We use millisecond time scale visibilities from the Allen Telescope Array (ATA) and VLA to show that the bispectrum can detect dispersed pulses and reject local interference. The computational and data efficiency of the bispectrum will help find transients on a range of time scales with next-generation radio interferometers.
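Since the bispectrum is defined directly from the visibilities, the detection statistic can be sketched in a few lines; the array shapes and the averaging over all triangles below are illustrative assumptions.

```python
# Bispectrum detection statistic: the product of visibilities around each
# closed antenna triangle, averaged over triangles. Per-antenna gain phases
# cancel around the loop, which is what makes it calibration independent.
import itertools
import numpy as np

n_ant = 8
rng = np.random.default_rng(0)
# vis[i, j]: complex visibility on baseline (i, j), i < j, one integration
vis = rng.standard_normal((n_ant, n_ant)) + 1j * rng.standard_normal((n_ant, n_ant))

bispectra = [
    vis[a, b] * vis[b, c] * np.conj(vis[a, c])   # V_ab * V_bc * V_ca
    for a, b, c in itertools.combinations(range(n_ant), 3)
]
statistic = np.mean(bispectra).real   # large values flag a candidate pulse
print(statistic)
```

In practice this statistic would be evaluated per dedispersed time sample, with a detection threshold set by the noise distribution.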
We present a novel application of partial convolutional neural networks (PCNN) that can inpaint masked images of the cosmic microwave background (CMB). The network can reconstruct both the maps and the power spectra to within a few percent for circular and irregularly shaped masks covering up to ~10% of the image area. By performing a Kolmogorov-Smirnov test, we show that the reconstructed maps and power spectra are indistinguishable from the input maps and power spectra at the 99.9% level. Moreover, we show that PCNNs can inpaint maps with regular and irregular masks to the same accuracy. This should be particularly beneficial for inpainting irregular CMB masks arising from astrophysical sources such as Galactic foregrounds. The proof-of-concept application presented here demonstrates that PCNNs can be an important tool in cosmological data-analysis pipelines.
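A simplified version of the partial-convolution building block is sketched below, assuming the standard formulation (convolve only observed pixels, renormalize by the number of valid inputs, shrink the mask); the full U-Net assembly and training loop of the paper are omitted.

```python
# Simplified partial convolution (sketch, not the paper's exact layer):
# holes contribute zero, outputs are renormalized by the count of observed
# pixels under the window, and the mask is updated as holes fill in.
import torch
import torch.nn as nn
import torch.nn.functional as F

class PartialConv2d(nn.Module):
    def __init__(self, in_ch, out_ch, k=3):
        super().__init__()
        self.k = k
        self.conv = nn.Conv2d(in_ch, out_ch, k, padding=k // 2, bias=False)
        # fixed all-ones kernel that counts observed pixels under the window
        self.register_buffer('ones', torch.ones(1, 1, k, k))

    def forward(self, x, mask):
        # mask: (B, 1, H, W) with 1 = observed pixel, 0 = masked hole
        valid = F.conv2d(mask, self.ones, padding=self.k // 2)
        out = self.conv(x * mask)                    # holes contribute zero
        scale = (self.k * self.k) / valid.clamp(min=1.0)
        out = out * scale * (valid > 0)              # renormalize; zero empty windows
        return out, (valid > 0).float()              # features and updated mask

layer = PartialConv2d(1, 8)
cmb_patch = torch.randn(1, 1, 64, 64)                # toy CMB map patch
mask = torch.ones(1, 1, 64, 64)
mask[:, :, 22:42, 22:42] = 0.0                       # square hole, ~10% of area
features, new_mask = layer(cmb_patch, mask)
```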
Rehabilitation is important for improving the quality of life of mobility-impaired patients. Smart walkers are a commonly used solution that should embed automatic and objective tools for data-driven human-in-the-loop control and monitoring. However, current solutions focus on extracting a few specific metrics from dedicated sensors, with no unified full-body approach. We investigate a general, real-time, full-body pose estimation framework based on two RGB+D camera streams with non-overlapping views, mounted on a smart walker used in rehabilitation. Human keypoint estimation is performed with a two-stage neural network framework. The 2D-Stage implements a detection module that locates body keypoints in the 2D image frames. The 3D-Stage implements a regression module that lifts and relates the keypoints detected in both cameras to the 3D space relative to the walker. Model predictions are low-pass filtered to improve temporal consistency. A custom acquisition method was used to obtain a dataset of 14 healthy subjects for training and offline evaluation of the proposed framework, which was then deployed on the real walker equipment. Overall keypoint detection errors of 3.73 pixels for the 2D-Stage and 44.05 mm for the 3D-Stage were obtained, with an inference time of 26.6 ms when deployed on the constrained hardware of the walker. We present a novel approach to patient monitoring and data-driven human-in-the-loop control in the context of smart walkers. It extracts a complete and compact body representation in real time from inexpensive sensors, serving as a common base for downstream metric-extraction solutions and human-robot interaction applications. Despite promising results, more data should be collected from users with impairments to assess its performance as a rehabilitation tool in real-world scenarios.
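The temporal-consistency step can be illustrated with a simple filter; the abstract only states that predictions are low-pass filtered, so the exponential moving average and its coefficient below are assumptions, not the deployed filter.

```python
# Hypothetical smoothing of the 3D keypoint stream with an exponential
# moving average; the actual filter used on the walker may differ.
import numpy as np

def smooth_keypoints(frames, alpha=0.3):
    """frames: (n_frames, n_keypoints, 3) walker-relative 3D positions."""
    smoothed = np.empty_like(frames)
    smoothed[0] = frames[0]
    for i in range(1, len(frames)):
        # small alpha -> heavier smoothing but more lag; tune per deployment
        smoothed[i] = alpha * frames[i] + (1 - alpha) * smoothed[i - 1]
    return smoothed

raw = np.random.randn(100, 17, 3)    # fake stream: 100 frames, 17 joints
stable = smooth_keypoints(raw)
```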