We demonstrate the application of a convolutional neural network to gravitational wave signals from core-collapse supernovae. Using simulated gravitational-wave detector time series, we show that a convolutional neural network can detect gravitational wave signals buried in noise and classify them by explosion mechanism. For the waveforms used to train the convolutional neural network, our results suggest that a network of advanced LIGO, advanced VIRGO and KAGRA, or a network of LIGO A+, advanced VIRGO and KAGRA, is likely to detect a magnetorotational core-collapse supernova within the Large and Small Magellanic Clouds, or a Galactic event if the explosion mechanism is the neutrino-driven mechanism. By testing the convolutional neural network on waveforms not used for training, we show that the true alarm probabilities at 60 kpc are 52% and 83% for waveforms R3E1AC and R4E1FC L, respectively. For waveforms s20 and SFHx at 10 kpc, the true alarm probabilities are 70% and 93%, respectively, all at a false alarm probability of 10%.
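To make the approach concrete, the core operation behind such a classifier is a learned 1D convolution over the detector time series, followed by pooling and a linear read-out. The following is a minimal NumPy sketch, not the authors' architecture: the filter widths, channel counts, and the three-class read-out (noise vs. two explosion mechanisms) are illustrative assumptions.

```python
import numpy as np

def conv1d(x, kernels, stride=1):
    """Valid-mode 1D convolution: x has shape (T,), kernels (C, K).
    Returns C feature maps of length (T - K) // stride + 1."""
    C, K = kernels.shape
    out_len = (x.shape[0] - K) // stride + 1
    out = np.empty((C, out_len))
    for c in range(C):
        for i in range(out_len):
            out[c, i] = np.dot(x[i * stride:i * stride + K], kernels[c])
    return out

def tiny_cnn_scores(x, kernels, w_out):
    """One conv layer -> ReLU -> global average pooling -> linear class scores."""
    h = np.maximum(conv1d(x, kernels), 0.0)  # ReLU feature maps
    pooled = h.mean(axis=1)                  # global average pool, shape (C,)
    return w_out @ pooled                    # class scores, shape (n_classes,)

rng = np.random.default_rng(0)
x = rng.normal(size=1024)                  # stand-in for a whitened strain segment
kernels = rng.normal(size=(8, 16)) * 0.1   # 8 hypothetical filters of width 16
w_out = rng.normal(size=(3, 8))            # hypothetical 3-class read-out weights
scores = tiny_cnn_scores(x, kernels, w_out)
print(scores.shape)
```

In a real search the kernels and read-out weights would be learned from labelled simulated signals rather than drawn at random.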
We present a first proof-of-principle study for using deep neural networks (DNNs) as a novel search method for continuous gravitational waves (CWs) from unknown spinning neutron stars. The sensitivity of current wide-parameter-space CW searches is limited by the available computing power, which makes neural networks an interesting alternative to investigate, as they are extremely fast once trained and have recently been shown to rival the sensitivity of matched filtering for black-hole merger signals. We train a convolutional neural network with residual (short-cut) connections and compare its detection power to that of a fully-coherent matched-filtering search using the WEAVE pipeline. As test benchmarks we consider two types of all-sky searches over the frequency range from $20\,\mathrm{Hz}$ to $1000\,\mathrm{Hz}$: an `easy' search using $T=10^5\,\mathrm{s}$ of data, and a `harder' search using $T=10^6\,\mathrm{s}$. Detection probability $p_\mathrm{det}$ is measured on a signal population for which matched filtering achieves $p_\mathrm{det}=90\%$ in Gaussian noise. In the easiest test case ($T=10^5\,\mathrm{s}$ at $20\,\mathrm{Hz}$) the DNN achieves $p_\mathrm{det}\sim 88\%$, corresponding to a loss in sensitivity depth of $\sim 5\%$ versus coherent matched filtering. However, at higher frequencies and longer observation times the DNN detection power decreases, down to $p_\mathrm{det}\sim 13\%$ and a loss of $\sim 66\%$ in sensitivity depth in the hardest case ($T=10^6\,\mathrm{s}$ at $1000\,\mathrm{Hz}$). We study the DNN's generalization ability by testing on signals with different frequencies, spindowns and signal strengths than those it was trained on. We observe excellent generalization: only five networks, each trained at a different frequency, would be able to cover the whole frequency range of the search.
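The residual (short-cut) connections mentioned above have a simple form: each block computes $y = x + F(x)$, so the identity path lets signals and gradients bypass the learned transformation $F$. A minimal NumPy forward pass illustrates this; the dense layers, sizes, and weights here are hypothetical stand-ins for the paper's convolutional blocks, and training is omitted.

```python
import numpy as np

def residual_block(x, w1, w2):
    """y = x + F(x), where F is two small dense layers with a ReLU.
    The identity short-cut eases the training of very deep networks."""
    h = np.maximum(w1 @ x, 0.0)  # inner layer + ReLU
    return x + w2 @ h            # add the short-cut connection

rng = np.random.default_rng(1)
x = rng.normal(size=64)
w1 = rng.normal(size=(32, 64)) * 0.1
w2 = rng.normal(size=(64, 32)) * 0.1
y = residual_block(x, w1, w2)
print(y.shape)
```

A useful property to note: with all weights set to zero the block reduces exactly to the identity map, which is why stacking many such blocks does not degrade the signal path.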
The next generation of observatories will facilitate the discovery of new types of astrophysical transients. The detection of such phenomena, whose characteristics are presently poorly constrained, will hinge on the ability to perform blind searches. We present a new algorithm for this purpose, based on deep learning. We incorporate two approaches, utilising anomaly detection and classification techniques. The first is model-independent, avoiding the use of background modelling and instrument simulations. The second method enables targeted searches, relying on generic spectral and temporal patterns as input. We compare our methodology with the existing approach to serendipitous detection of gamma-ray transients. The algorithm is shown to be more robust, especially for non-trivial spectral features. We use our framework to derive the detection prospects of low-luminosity gamma-ray bursts with the upcoming Cherenkov Telescope Array. Our method is an unbiased, completely data-driven approach for multiwavelength and multi-messenger transient detection.
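As a schematic of the model-independent (anomaly-detection) branch described above, one can score candidate events by their reconstruction error under a low-rank model fitted to background-only data, flagging events whose residual exceeds a background quantile. The PCA model below is an illustrative stand-in for the paper's deep networks, and all dimensions and thresholds are hypothetical.

```python
import numpy as np

def fit_background_model(X, k=3):
    """Fit a mean and the top-k principal components to background samples X (n, d)."""
    mu = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, Vt[:k]  # k principal directions spanning the background subspace

def anomaly_score(x, mu, V):
    """Residual norm after projecting x onto the background subspace."""
    r = x - mu
    return np.linalg.norm(r - V.T @ (V @ r))

rng = np.random.default_rng(2)
background = rng.normal(size=(500, 20))  # nominal (background-only) events
mu, V = fit_background_model(background, k=3)
# Threshold set at the 99th percentile of background scores:
threshold = np.quantile([anomaly_score(b, mu, V) for b in background], 0.99)

transient = rng.normal(size=20) + 5.0    # event with a strong overall excess
print(anomaly_score(transient, mu, V) > threshold)
```

The appeal of this scheme is the one stated in the abstract: the detector is calibrated entirely on data, with no background modelling or instrument simulation in the loop.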
Deep learning techniques have been well explored in the transiting-exoplanet field; however, previous work has mainly focused on classification and inspection. In this work, we develop a novel detection algorithm based on a well-proven object-detection framework from the computer vision field. Trained on the light curves of confirmed Kepler exoplanets, our model yields 94% precision and 95% recall for transits with signal-to-noise ratio higher than 6 (with the confidence threshold set to 0.6). With a slightly lower confidence threshold, recall can exceed 97%, which makes our model applicable to large-scale searches. We also transfer the trained model to TESS data and obtain similar performance. The output of our algorithm matches the intuition of human visual perception and makes it easy to find single-transit candidates. Moreover, the parameters of the output bounding boxes can also help to find multiplanet systems. Our network and detection functions are implemented in the Deep-Transit toolkit, an open-source Python package hosted on GitHub and PyPI.
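For a light curve, the bounding boxes produced by such an object-detection framework reduce to intervals along the time axis, and the standard way to score them against ground-truth transits is the intersection-over-union (IoU), from which precision and recall follow. The sketch below is generic object-detection bookkeeping, not Deep-Transit's internal code; the intervals and the IoU threshold are made-up examples.

```python
def iou_1d(a, b):
    """IoU of two 1D intervals a = (start, end), b = (start, end) on the time axis."""
    inter = max(0.0, min(a[1], b[1]) - max(a[0], b[0]))
    union = (a[1] - a[0]) + (b[1] - b[0]) - inter
    return inter / union if union > 0 else 0.0

def match_detections(pred, truth, iou_thresh=0.5):
    """Greedily match predicted to true transit intervals; return (precision, recall)."""
    matched = set()
    tp = 0
    for p in pred:
        for j, t in enumerate(truth):
            if j not in matched and iou_1d(p, t) >= iou_thresh:
                matched.add(j)
                tp += 1
                break
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(truth) if truth else 0.0
    return precision, recall

# Two true transits; the detector finds one well, one only partially, plus a false alarm.
truth = [(10.0, 12.0), (30.0, 32.0)]
pred = [(10.1, 12.1), (30.8, 33.0), (50.0, 51.0)]
print(match_detections(pred, truth))  # -> (1/3, 0.5): one true positive of three predictions
```

Lowering the detector's confidence threshold adds more predicted boxes, which is exactly the precision-for-recall trade-off quoted in the abstract.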
The follow-up of external science alerts received from gamma-ray burst (GRB) and gravitational wave (GW) detectors is one of the AGILE Team's current major activities. The AGILE team developed an automated real-time analysis pipeline to analyse AGILE Gamma-Ray Imaging Detector (GRID) data and detect possible counterparts in the 0.1-10 GeV energy range. This work presents a new approach for detecting GRBs, using a Convolutional Neural Network (CNN) to classify AGILE-GRID intensity maps and improving the GRB detection capability over the Li&Ma method currently used by the AGILE team. The CNN is trained on large simulated datasets of intensity maps. AGILE's complex observing pattern, due to its so-called spinning mode, is studied to prepare the datasets used to test and evaluate the CNN. A GRB emission model is defined from the Second Fermi-LAT GRB catalogue and convolved with the AGILE observing pattern. Different p-value distributions are calculated by evaluating with the CNN millions of background-only maps simulated at varying background levels. The CNN is then applied to real data, analysing the AGILE-GRID archive in search of GRB detections at the trigger times and positions taken from the Swift-BAT, Fermi-GBM, and Fermi-LAT GRB catalogues. From these catalogues, the CNN detects 21 GRBs with significance $\geq 3\sigma$, while the Li&Ma method detects only two. These results demonstrate that the CNN is more effective than the Li&Ma method at detecting GRBs in this context and can be implemented in the AGILE-GRID real-time analysis pipeline.
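For reference, the Li&Ma baseline against which the CNN is compared is the standard significance for on/off counting experiments, Eq. 17 of Li & Ma (1983). A direct implementation (the count values below are illustrative, not AGILE data):

```python
import math

def li_ma_significance(n_on, n_off, alpha):
    """Li & Ma (1983), Eq. 17: significance of an on-source count excess.

    n_on  : counts in the on-source region
    n_off : counts in the off-source (background) region
    alpha : ratio of on-source to off-source exposure
    """
    n_tot = n_on + n_off
    term_on = n_on * math.log((1.0 + alpha) / alpha * (n_on / n_tot))
    term_off = n_off * math.log((1.0 + alpha) * (n_off / n_tot))
    return math.sqrt(2.0 * (term_on + term_off))

# Equal exposures, no excess: significance is exactly zero.
print(li_ma_significance(100, 100, alpha=1.0))  # 0.0
# A clear excess (200 on vs. 100 off at equal exposure): about 5.8 sigma.
print(li_ma_significance(200, 100, alpha=1.0))
```

The weak excesses typical of short GRBs in single AGILE-GRID maps yield low Li&Ma significances, which is the regime where the abstract reports the CNN gaining the most.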
Ground-based $\gamma$-ray observatories, such as the VERITAS array of imaging atmospheric Cherenkov telescopes, provide insight into very-high-energy (VHE, $E>100\,\mathrm{GeV}$) astrophysical transient events. Examples include the evaporation of primordial black holes, gamma-ray bursts and flaring blazars. Identifying such events with serendipitous locations and times of occurrence is difficult, so a robust search method is crucial. We present an implementation of a transient detection method for VERITAS based on deep-learning techniques. This data-driven approach significantly reduces the dependency on the characterization of the instrument response and on modelling of the expected transient signal. The response of the instrument is affected by various factors, such as the elevation of the source and the night-sky background. Studying these effects allows us to enhance the deep-learning method with additional parameters that capture their influence on the data, improving the performance and stability over a wide range of observational conditions. We illustrate our method on a historic flare of the blazar BL Lac detected by VERITAS in October 2016 and find promising performance for the detection of such a flare on timescales of minutes, comparing well with the VERITAS standard analysis.
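One simple way to realise the "additional parameters" idea described above is to concatenate auxiliary observing conditions (e.g. source elevation, night-sky-background level) with the features learned from the camera data before the final classification layer. The sketch below is a hypothetical illustration of that conditioning, not the VERITAS implementation; all shapes, weights, and variable names are assumptions.

```python
import numpy as np

def classify_with_aux(features, aux, w_hidden, w_out):
    """Concatenate learned features with auxiliary observing conditions,
    then apply one hidden layer and a sigmoid transient-vs-background score."""
    z = np.concatenate([features, aux])        # joint input vector
    h = np.maximum(w_hidden @ z, 0.0)          # hidden layer + ReLU
    return 1.0 / (1.0 + np.exp(-(w_out @ h)))  # scalar score in (0, 1)

rng = np.random.default_rng(3)
features = rng.normal(size=16)          # stand-in for a learned embedding of the data
aux = np.array([0.7, 0.2])              # e.g. normalised elevation and NSB level
w_hidden = rng.normal(size=(8, 18)) * 0.1
w_out = rng.normal(size=8) * 0.1
score = classify_with_aux(features, aux, w_hidden, w_out)
print(score)
```

Feeding the conditions in explicitly lets a single trained model remain stable across elevations and sky-brightness levels instead of requiring one model per observing regime.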