Gravitational-wave signal-to-noise interpolation via neural networks


Abstract

Computing signal-to-noise ratios (SNRs) is one of the most common tasks in gravitational-wave data analysis. While a single SNR evaluation is generally fast, computing SNRs for an entire population of merger events can be time-consuming. We compute SNRs for aligned-spin binary black-hole mergers as a function of the (detector-frame) total mass, mass ratio, and spin magnitudes using selected waveform models and detector noise curves, and then we interpolate the SNRs in this four-dimensional parameter space with a simple neural network (a multilayer perceptron). The trained network can evaluate $10^6$ SNRs on a 4-core CPU within a minute with a median fractional error below $10^{-3}$. This corresponds to average speed-ups by factors in the range $[120, 7.5\times10^4]$, depending on the underlying waveform model. Our trained network (and source code) is publicly available at https://github.com/kazewong/NeuralSNR, and it can be easily adapted to similar multidimensional interpolation problems.
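For illustration, the sketch below shows the kind of interpolator the abstract describes: a small multilayer perceptron mapping the four intrinsic parameters (detector-frame total mass, mass ratio, and two aligned spin magnitudes) to an SNR value. The layer sizes, activation, and the choice to predict log-SNR are illustrative assumptions, not the published NeuralSNR architecture.

```python
import torch
import torch.nn as nn


class SNRInterpolator(nn.Module):
    """Toy MLP interpolating SNR over a 4D parameter space (sketch only)."""

    def __init__(self, hidden: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(4, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x has shape (batch, 4): [total_mass, mass_ratio, chi_1, chi_2],
        # assumed normalized to the training range.
        # The network predicts log10(SNR); exponentiate to recover the SNR.
        return 10.0 ** self.net(x).squeeze(-1)


# Batch evaluation of 10^6 parameter sets in a single forward pass,
# the use case highlighted in the abstract (placeholder random inputs).
model = SNRInterpolator()
params = torch.rand(1_000_000, 4)
with torch.no_grad():
    snrs = model(params)
print(snrs.shape)  # torch.Size([1000000])
```

A trained network of this kind replaces repeated waveform generation and noise-weighted inner products with a single vectorized forward pass, which is where the quoted speed-ups come from.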
