
Exploring gravitational-wave detection and parameter inference using Deep Learning methods

Added by Antonio Onofre
Publication date: 2020
Field: Physics
Language: English





We explore machine learning methods to detect gravitational waves (GW) from binary black hole (BBH) mergers using deep learning (DL) algorithms. The DL networks are trained with gravitational waveforms obtained from BBH mergers with component masses randomly sampled in the range from 5 to 100 solar masses and luminosity distances from 100 Mpc to, at least, 2000 Mpc. The GW signal waveforms are injected into public data from the O2 run of the Advanced LIGO and Advanced Virgo detectors, in time windows that do not coincide with those of known detected signals. We demonstrate that DL algorithms trained with GW signal waveforms at distances of 2000 Mpc still show high accuracy when detecting closer signals, within the ranges considered in our analysis. Moreover, by combining the results of the three-detector network into a single RGB image, the single-detector performance is improved by as much as 70%. Furthermore, we train a regression network to perform parameter inference on BBH spectrogram data and apply this network to the events of the GWTC-1 and GWTC-2 catalogs. Without significant optimization of our algorithms, we obtain results that are mostly consistent with those published by the LIGO-Virgo Collaboration. In particular, our predictions for the chirp mass are compatible (to within 3$\sigma$) with the official values for 90% of the events.
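As a rough illustration of the three-detector RGB encoding described above, the sketch below stacks one spectrogram per detector into the three channels of a single image. This is a minimal sketch, not the paper's actual pipeline: the sampling rate, window settings, and the rgb_spectrogram helper name are assumptions for illustration.

import numpy as np
from scipy.signal import spectrogram

def rgb_spectrogram(strains, fs=4096, nperseg=256, noverlap=224):
    # Stack per-detector spectrograms into the channels of one RGB image.
    channels = []
    for strain in strains:  # one whitened strain array per detector (e.g. H1, L1, V1)
        _, _, sxx = spectrogram(strain, fs=fs, nperseg=nperseg, noverlap=noverlap)
        img = np.log1p(sxx)  # compress the dynamic range
        img = (img - img.min()) / (img.max() - img.min() + 1e-12)  # scale to [0, 1]
        channels.append(img)
    return np.stack(channels, axis=-1)  # shape (n_freq, n_time, 3)

# Toy usage with white noise standing in for whitened detector data.
rng = np.random.default_rng(0)
image = rgb_spectrogram([rng.standard_normal(8192) for _ in range(3)])

The appeal of this encoding is that the combined image can be fed to any off-the-shelf image classifier, so the network sees all three detectors at once.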



Related research

Neil J. Cornish (2021)
Inferring the source properties of a gravitational wave signal has traditionally been very computationally intensive and time-consuming. In recent years, several techniques have been developed that can significantly reduce the computational cost while delivering rapid and accurate parameter inference. One of the most powerful of these techniques is the heterodyned likelihood, which uses a reference waveform to base-band the likelihood calculation. Here an efficient implementation of the heterodyned likelihood is presented that can be used for a wide range of signal types and for both ground-based and space-based interferometers. The computational savings relative to direct calculation of the likelihood vary between two and four orders of magnitude depending on the system. The savings are greatest for low mass systems such as neutron star binaries. The heterodyning procedure can incorporate marginalization over calibration uncertainties and the noise power spectrum.
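To make the base-banding idea concrete, here is a minimal sketch of a heterodyned likelihood in the relative-binning form. It assumes frequency-domain data d, a reference waveform h_ref, a one-sided PSD, and pre-chosen frequency bin edges; the quadratic-in-frequency term in (h|h) is dropped, and all names are illustrative rather than taken from the paper's implementation.

import numpy as np

def heterodyne_summary(d, h_ref, psd, freqs, edges, df):
    # Per-bin summary data, computed once from the reference waveform.
    A0, A1, B0, B1 = [], [], [], []
    for lo, hi in zip(edges[:-1], edges[1:]):
        sel = (freqs >= lo) & (freqs < hi)
        f, fc = freqs[sel], 0.5 * (lo + hi)
        w = 4.0 * df / psd[sel]
        dh = d[sel] * np.conj(h_ref[sel])
        hh = np.abs(h_ref[sel]) ** 2
        A0.append(np.sum(w * dh))
        A1.append(np.sum(w * dh * (f - fc)))
        B0.append(np.sum(w * hh))
        B1.append(np.sum(w * hh * (f - fc)))
    return tuple(np.asarray(x) for x in (A0, A1, B0, B1))

def log_likelihood(h, h_ref, freqs, edges, summary):
    # ln L = (d|h) - (h|h)/2, with the smooth ratio h/h_ref linearized in each bin.
    A0, A1, B0, B1 = summary
    idx = np.searchsorted(freqs, edges).clip(0, len(freqs) - 1)
    r = h[idx] / h_ref[idx]              # waveform ratio at the bin edges
    r0 = 0.5 * (r[1:] + r[:-1])          # constant term per bin
    r1 = np.diff(r) / np.diff(edges)     # linear slope per bin
    d_h = np.sum(np.real(A0 * np.conj(r0) + A1 * np.conj(r1)))
    h_h = np.sum(B0 * np.abs(r0) ** 2 + 2.0 * B1 * np.real(r0 * np.conj(r1)))
    return d_h - 0.5 * h_h

Because the summary data (A0, A1, B0, B1) are computed once, each subsequent likelihood evaluation only needs the waveform at the bin edges, which is where the orders-of-magnitude savings come from.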
We construct a Bayesian-inference deep learning machine for parameter estimation of gravitational wave events from binary black hole coalescences. The structure of our deep Bayesian machine adopts the conditional variational autoencoder scheme, conditioning on both the gravitational wave strains and the variations of the amplitude spectral density of the detector noise. We show that our deep Bayesian machine is capable of yielding posteriors compatible with those from the nested sampling method, and of being robust against noise outliers. We also apply our deep Bayesian machine to the LIGO/Virgo O3 events, and find that conditioning on the detector noise to counter its drift is relevant for events with medium signal-to-noise ratios.
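A minimal PyTorch sketch of the conditional-VAE idea follows. The layer sizes, the Gaussian decoder, and the class name are assumptions for illustration; the paper's actual architecture and conditioning scheme are more elaborate.

import torch
import torch.nn as nn

class DeepBayesCVAE(nn.Module):
    # Toy conditional VAE: condition on the strain and the noise ASD (hypothetical sizes).
    def __init__(self, strain_dim=512, asd_dim=128, param_dim=4, latent_dim=8, hidden=128):
        super().__init__()
        cond = strain_dim + asd_dim
        self.enc = nn.Sequential(nn.Linear(param_dim + cond, hidden), nn.ReLU(),
                                 nn.Linear(hidden, 2 * latent_dim))
        self.dec = nn.Sequential(nn.Linear(latent_dim + cond, hidden), nn.ReLU(),
                                 nn.Linear(hidden, 2 * param_dim))

    def loss(self, params, strain, asd):
        c = torch.cat([strain, asd], dim=-1)
        mu, logvar = self.enc(torch.cat([params, c], dim=-1)).chunk(2, dim=-1)
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)  # reparameterization trick
        out_mu, out_logvar = self.dec(torch.cat([z, c], dim=-1)).chunk(2, dim=-1)
        recon = 0.5 * ((params - out_mu) ** 2 * torch.exp(-out_logvar)
                       + out_logvar).sum(-1)                     # Gaussian reconstruction NLL
        kl = 0.5 * (mu ** 2 + logvar.exp() - logvar - 1.0).sum(-1)  # KL to N(0, 1)
        return (recon + kl).mean()

At inference time, drawing z from the standard-normal prior and decoding it together with the observed strain and ASD yields posterior samples over the source parameters.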
Folding uncertainty in theoretical models into Bayesian parameter estimation is necessary in order to make reliable inferences. A general means of achieving this is by marginalizing over model uncertainty using a prior distribution constructed using Gaussian process regression (GPR). As an example, we apply this technique to the measurement of chirp mass using (simulated) gravitational-wave signals from binary black holes that could be observed using advanced-era gravitational-wave detectors. Unless properly accounted for, uncertainty in the gravitational-wave templates could be the dominant source of error in studies of these systems. We explain our approach in detail and provide proofs of various features of the method, including the limiting behavior for high signal-to-noise ratio, where systematic model uncertainties dominate over noise errors. We find that the marginalized likelihood constructed via GPR offers a significant improvement in parameter estimation over the standard, uncorrected likelihood, both in our simple one-dimensional study and theoretically in general. We also examine the dependence of the method on the size of the training set used in the GPR, on the form of covariance function adopted, and on changes to the detector noise power spectral density.
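The core numerical step is simple to state: the waveform-model uncertainty enters the likelihood as an extra covariance term added to the noise covariance. Below is a simplified numpy sketch under strong assumptions: a time-domain residual, an RBF kernel over sample times, and hypothetical names throughout (the paper instead builds the GP over waveform differences across a training set of templates).

import numpy as np

def rbf(x1, x2, amp, scale):
    # Squared-exponential covariance, a common GPR kernel choice.
    return amp ** 2 * np.exp(-0.5 * ((x1[:, None] - x2[None, :]) / scale) ** 2)

def marginalized_log_likelihood(residual, noise_cov, t, amp, scale):
    # Gaussian log-likelihood with the GP model-uncertainty term added to the noise covariance.
    cov = noise_cov + rbf(t, t, amp, scale)
    L = np.linalg.cholesky(cov)
    a = np.linalg.solve(L, residual)  # whiten the residual
    return -0.5 * a @ a - np.log(np.diag(L)).sum() - 0.5 * len(t) * np.log(2 * np.pi)

The design point is visible in the single line cov = noise_cov + rbf(...): broadening the covariance is what de-weights frequencies (or times) where the waveform model is least trusted.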
We review detection methods that are currently in use or have been proposed to search for a stochastic background of gravitational radiation. We consider both Bayesian and frequentist searches using ground-based and space-based laser interferometers, spacecraft Doppler tracking, and pulsar timing arrays; and we allow for anisotropy, non-Gaussianity, and non-standard polarization states. Our focus is on relevant data analysis issues, and not on the particular astrophysical or early Universe sources that might give rise to such backgrounds. We provide a unified treatment of these searches at the level of detector response functions, detection sensitivity curves, and, more generally, at the level of the likelihood function, since the choice of signal and noise models and prior probability distributions are actually what define the search. Pedagogical examples are given whenever possible to compare and contrast different approaches. We have tried to make the article as self-contained and comprehensive as possible, targeting graduate students and new researchers looking to enter this field.
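For orientation, the workhorse frequentist statistic in this area is the cross-correlation of two detectors' data with an optimal filter. The sketch below shows its shape only: normalization constants are omitted, gamma stands for the overlap reduction function, omega_gw for the assumed spectral shape of the background, and all names are illustrative.

import numpy as np

def cross_correlation_statistic(d1, d2, freqs, gamma, omega_gw, psd1, psd2, df):
    # Unnormalized cross-correlation estimator with the standard optimal-filter weighting:
    # correlated signal power is up-weighted where both detectors are quiet and well aligned.
    q = gamma * omega_gw / (freqs ** 3 * psd1 * psd2)
    return np.real(np.sum(np.conj(d1) * d2 * q)) * df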
We seek to achieve the Holy Grail of Bayesian inference for gravitational-wave astronomy: using deep-learning techniques to instantly produce the posterior $p(\theta|D)$ for the source parameters $\theta$, given the detector data $D$. To do so, we train a deep neural network to take as input a signal + noise data set (drawn from the astrophysical source-parameter prior and the sampling distribution of detector noise), and to output a parametrized approximation of the corresponding posterior. We rely on a compact representation of the data based on reduced-order modeling, which we generate efficiently using a separate neural-network waveform interpolant [A. J. K. Chua, C. R. Galley & M. Vallisneri, Phys. Rev. Lett. 122, 211101 (2019)]. Our scheme has broad relevance to gravitational-wave applications such as low-latency parameter estimation and characterizing the science returns of future experiments. Source code and trained networks are available online at https://github.com/vallis/truebayes.
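A stripped-down version of this scheme, under the strong assumption of a single-Gaussian posterior approximation (the paper uses richer parametrizations and a reduced-order data representation), could look like the following PyTorch sketch; every name and dimension here is hypothetical.

import torch
import torch.nn as nn

class PosteriorNet(nn.Module):
    # Map a reduced data vector to (mean, log-sigma) of a Gaussian posterior approximation.
    def __init__(self, data_dim=64, param_dim=2, hidden=128):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(data_dim, hidden), nn.ReLU(),
                                 nn.Linear(hidden, 2 * param_dim))

    def forward(self, data):
        return self.net(data).chunk(2, dim=-1)

def loss(mu, log_sigma, theta):
    # Negative log of the Gaussian approximation evaluated at the true parameters.
    return (0.5 * ((theta - mu) / log_sigma.exp()) ** 2 + log_sigma).sum(-1).mean()

Training on (data, theta) pairs drawn from the source-parameter prior and the noise model drives the network output toward the true posterior $p(\theta|D)$, which is what makes instant amortized inference possible once training is done.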