
Nested Sampling with Normalising Flows for Gravitational-Wave Inference

Posted by Michael Williams
Publication date: 2021
Research field: Physics
Paper language: English





We present a novel method for sampling iso-likelihood contours in nested sampling using a type of machine learning algorithm known as normalising flows, and incorporate it into our sampler nessai. Nessai is designed for problems where computing the likelihood is computationally expensive, so the cost of training a normalising flow is offset by the overall reduction in the number of likelihood evaluations. We validate our sampler on 128 simulated gravitational wave signals from compact binary coalescence and show that it produces unbiased estimates of the system parameters. Subsequently, we compare our results to those obtained with dynesty and find good agreement between the computed log-evidences whilst requiring 2.07 times fewer likelihood evaluations. We also highlight how the likelihood evaluation can be parallelised in nessai without any modifications to the algorithm. Finally, we outline diagnostics included in nessai and how these can be used to tune the sampler's settings.
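The core loop the abstract describes can be sketched as follows. This toy uses a two-dimensional Gaussian likelihood on a unit-square prior, and a padded bounding box around the live points stands in for the normalising flow that nessai actually trains; all parameter values are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
SIGMA = 0.1  # width of the toy Gaussian likelihood (illustrative)

def log_likelihood(x):
    return -0.5 * np.sum((x - 0.5) ** 2, axis=-1) / SIGMA**2

n_live, n_iter = 200, 1000
live = rng.uniform(0.0, 1.0, size=(n_live, 2))      # unit-cube prior
logL = log_likelihood(live)
terms = []

for i in range(n_iter):
    worst = int(np.argmin(logL))
    L_star = logL[worst]
    # deterministic shrinkage X_i = exp(-i/n_live); weight w_i = X_{i-1} - X_i
    log_w = -i / n_live + np.log1p(-np.exp(-1.0 / n_live))
    terms.append(L_star + log_w)
    # "train" a proposal on the current live points: a padded bounding box
    # stands in here for the normalising flow used by nessai
    lo, hi = live.min(axis=0), live.max(axis=0)
    pad = 0.1 * (hi - lo)
    lo, hi = np.maximum(lo - pad, 0.0), np.minimum(hi + pad, 1.0)
    while True:  # rejection-sample inside the iso-likelihood contour
        x_new = rng.uniform(lo, hi)
        if log_likelihood(x_new) > L_star:
            break
    live[worst], logL[worst] = x_new, log_likelihood(x_new)

# add the contribution of the remaining live points, then total the evidence
terms.extend(logL - n_iter / n_live - np.log(n_live))
log_evidence = float(np.logaddexp.reduce(np.array(terms)))
```

For this likelihood the analytic evidence is $2\pi\sigma^2 \approx 0.063$ (ln Z ≈ −2.77), which the sketch recovers to within the usual nested-sampling uncertainty.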


Read also

A central challenge in Gravitational Wave Astronomy is identifying weak signals in the presence of non-stationary and non-Gaussian noise. The separation of gravitational wave signals from noise requires good models for both. When accurate signal models are available, such as for binary Neutron star systems, it is possible to make robust detection statements even when the noise is poorly understood. In contrast, searches for un-modeled transient signals are strongly impacted by the methods used to characterize the noise. Here we take a Bayesian approach and introduce a multi-component, variable dimension, parameterized noise model that explicitly accounts for non-stationarity and non-Gaussianity in data from interferometric gravitational wave detectors. Instrumental transients (glitches) and burst sources of gravitational waves are modeled using a Morlet-Gabor continuous wavelet frame. The number and placement of the wavelets is determined by a trans-dimensional Reversible Jump Markov Chain Monte Carlo algorithm. The Gaussian component of the noise and sharp line features in the noise spectrum are modeled using the BayesLine algorithm, which operates in concert with the wavelet model.
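As an illustration of the wavelet frame itself (not the RJMCMC sampler), a Morlet-Gabor wavelet and a toy two-wavelet "glitch" might look like the following; every parameter value here is invented for the example.

```python
import numpy as np

def morlet_gabor(t, t0, f0, tau, amp, phase):
    """A single Morlet-Gabor wavelet: Gaussian envelope times a cosine."""
    return amp * np.exp(-((t - t0) / tau) ** 2) * np.cos(
        2 * np.pi * f0 * (t - t0) + phase)

t = np.linspace(0.0, 1.0, 4096)
# a toy "glitch" built from two wavelets; the trans-dimensional sampler
# would infer how many wavelets the data actually require
glitch = (morlet_gabor(t, 0.40, 120.0, 0.01, 1.0, 0.0)
          + morlet_gabor(t, 0.42, 180.0, 0.02, 0.5, 1.2))
```

Each wavelet contributes five parameters (centre time, frequency, damping time, amplitude, phase), so the model dimension grows and shrinks as wavelets are born and killed by the reversible-jump moves.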
Gravitational wave data from ground-based detectors is dominated by instrument noise. Signals will be comparatively weak, and our understanding of the noise will influence detection confidence and signal characterization. Mis-modeled noise can produce large systematic biases in both model selection and parameter estimation. Here we introduce a multi-component, variable dimension, parameterized model to describe the Gaussian-noise power spectrum for data from ground-based gravitational wave interferometers. Called BayesLine, the algorithm models the noise power spectral density using cubic splines for smoothly varying broad-band noise and Lorentzians for narrow-band line features in the spectrum. We describe the algorithm and demonstrate its performance on data from the fifth and sixth LIGO science runs. Once fully integrated into LIGO/Virgo data analysis software, BayesLine will produce accurate spectral estimation and provide a means for marginalizing inferences drawn from the data over all plausible noise spectra.
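A toy version of the two-component PSD model might look like this. Linear interpolation of the log-PSD (`np.interp`) stands in for BayesLine's cubic spline, and the knot values and line parameters are invented rather than fitted to LIGO data.

```python
import numpy as np

def psd_model(f, knots, log_s_knots, lines):
    """Smooth broad-band part plus narrow Lorentzian lines.

    lines: list of (f0, amplitude, width) triples, one per spectral line.
    """
    psd = np.exp(np.interp(f, knots, log_s_knots))  # spline stand-in
    for f0, amp, gamma in lines:
        psd += amp * gamma**2 / ((f - f0) ** 2 + gamma**2)
    return psd

f = np.linspace(20.0, 1024.0, 4000)                 # analysis band in Hz
knots = np.array([20.0, 60.0, 150.0, 400.0, 1024.0])
log_s = np.log(np.array([1e-42, 2e-44, 5e-46, 3e-46, 1e-45]))
lines = [(60.0, 1e-43, 0.2), (120.0, 3e-44, 0.2)]   # e.g. power-line harmonics
psd = psd_model(f, knots, log_s, lines)
```

In BayesLine the number and placement of both the spline knots and the Lorentzians are themselves sampled trans-dimensionally, rather than fixed as in this sketch.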
Rory Smith, Eric Thrane (2017)
Roughly every 2-10 minutes, a pair of stellar mass black holes merge somewhere in the Universe. A small fraction of these mergers are detected as individually resolvable gravitational-wave events by advanced detectors such as LIGO and Virgo. The rest contribute to a stochastic background. We derive the statistically optimal search strategy for a background of unresolved binaries. Our method applies Bayesian parameter estimation to all available data. Using Monte Carlo simulations, we demonstrate that the search is both safe and effective: it is not fooled by instrumental artefacts such as glitches, and it recovers simulated stochastic signals without bias. Given realistic assumptions, we estimate that the search can detect the binary black hole background with about one day of design sensitivity data versus $\approx 40$ months using the traditional cross-correlation search. This framework independently constrains the merger rate and black hole mass distribution, breaking a degeneracy present in the cross-correlation approach. The search provides a unified framework for population studies of compact binaries, which is cast in terms of hyper-parameter estimation. We discuss a number of extensions and generalizations including: application to other sources (such as binary neutron stars and continuous-wave sources), simultaneous estimation of a continuous Gaussian background, and applications to pulsar timing.
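The per-segment mixture idea behind such a search can be sketched with a toy duty-cycle estimate: each segment contains a weak signal with probability $\xi$, and the posterior on $\xi$ combines signal and noise likelihoods segment by segment. The Gaussian "signal" variance and segment counts below are invented.

```python
import numpy as np

rng = np.random.default_rng(2)
n_seg, n_samp = 200, 64
xi_true, sig_var = 0.1, 2.0                         # illustrative values
has_signal = rng.random(n_seg) < xi_true
var = 1.0 + sig_var * has_signal
data = rng.standard_normal((n_seg, n_samp)) * np.sqrt(var[:, None])

def log_gauss(d, v):
    """Per-segment Gaussian log-likelihood with total variance v."""
    return -0.5 * np.sum(d**2, axis=1) / v - 0.5 * n_samp * np.log(2 * np.pi * v)

logLn = log_gauss(data, 1.0)                        # noise-only, per segment
logLs = log_gauss(data, 1.0 + sig_var)              # signal-present, per segment
xis = np.linspace(0.001, 0.999, 500)
# log posterior (flat prior on xi): sum_i log[xi * Ls_i + (1 - xi) * Ln_i]
logpost = np.array([np.sum(np.logaddexp(np.log(x) + logLs,
                                        np.log(1 - x) + logLn))
                    for x in xis])
xi_map = float(xis[np.argmax(logpost)])
```

The real search marginalises over the binary parameters within each segment's signal likelihood, which is what lets it constrain the merger rate and mass distribution simultaneously.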
Third-generation (3G) gravitational-wave detectors will observe thousands of coalescing neutron star binaries with unprecedented fidelity. Extracting the highest precision science from these signals is expected to be challenging owing to both high signal-to-noise ratios and long-duration signals. We demonstrate that current Bayesian inference paradigms can be extended to the analysis of binary neutron star signals without breaking the computational bank. We construct reduced order models for $\sim 90\,\mathrm{minute}$ long gravitational-wave signals, covering the observing band ($5$-$2048\,\mathrm{Hz}$), speeding up inference by a factor of $\sim 1.3\times 10^4$ compared to the calculation times without reduced order models. The reduced order models incorporate key physics including the effects of tidal deformability, amplitude modulation due to the Earth's rotation, and spin-induced orbital precession. We show how reduced order modeling can accelerate inference on data containing multiple, overlapping gravitational-wave signals, and determine the speedup as a function of the number of overlapping signals. Thus, we conclude that Bayesian inference is computationally tractable for the long-lived, overlapping, high signal-to-noise-ratio events present in 3G observatories.
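The reduced-order idea can be sketched with an SVD basis: compress a training family of waveforms into a few basis vectors, then represent any member of the family by a short coefficient vector. The "waveforms" here are simple chirping sinusoids rather than binary-neutron-star templates, and the tolerance is arbitrary.

```python
import numpy as np

t = np.linspace(0.0, 1.0, 2048)
freqs = np.linspace(30.0, 60.0, 100)                # training family (toy)
training = np.array([np.sin(2 * np.pi * f0 * t * (1 + 0.1 * t))
                     for f0 in freqs])

# build an orthonormal reduced basis from the training set via SVD
U, s, Vt = np.linalg.svd(training, full_matrices=False)
energy = np.cumsum(s**2) / np.sum(s**2)
n_basis = int(np.searchsorted(energy, 1 - 1e-6)) + 1
basis = Vt[:n_basis]                                # rows are basis vectors

# project a waveform NOT in the training set onto the basis
h = np.sin(2 * np.pi * 45.3 * t * (1 + 0.1 * t))
h_rom = (basis @ h) @ basis                         # reduced-order reconstruction
mismatch = 1 - (h @ h_rom) / np.sqrt((h @ h) * (h_rom @ h_rom))
```

The inference speedup comes from evaluating likelihood inner products in the `n_basis`-dimensional space instead of over the full frequency grid; the paper's reduced order quadrature is a more sophisticated version of the same compression.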
We discuss the detection of gravitational-wave backgrounds in the context of Bayesian inference and suggest a practical definition of what it means for a signal to be considered stochastic: namely, that the Bayesian evidence favors a stochastic signal model over a deterministic signal model. A signal can further be classified as Gaussian-stochastic if a Gaussian signal model is favored. In our analysis we use Bayesian model selection to choose between several signal and noise models for simulated data consisting of uncorrelated Gaussian detector noise plus a superposition of sinusoidal signals from an astrophysical population of gravitational-wave sources. For simplicity, we consider co-located and co-aligned detectors with white detector noise, but the method can be extended to more realistic detector configurations and power spectra. The general trend we observe is that a deterministic model is favored for small source numbers, a non-Gaussian stochastic model is preferred for intermediate source numbers, and a Gaussian stochastic model is preferred for large source numbers. However, there is very large variation between individual signal realizations, leading to fuzzy boundaries between the three regimes. We find that a hybrid, trans-dimensional model comprised of a deterministic signal model for individual bright sources and a Gaussian-stochastic signal model for the remaining confusion background outperforms all other models in most instances.
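The evidence comparison the abstract describes can be sketched for the simplest pair of models: noise-only versus a Gaussian-stochastic signal of unknown variance, marginalised on a grid. The sinusoid population, prior range, and grid resolution are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 4096
t = np.arange(N)
# superpose many weak random-phase sinusoids: their sum approaches a
# Gaussian confusion background (the large-source-number regime)
signal = sum(0.05 * np.sin(2 * np.pi * rng.uniform(0.01, 0.4) * t
                           + rng.uniform(0, 2 * np.pi))
             for _ in range(200))
d = rng.standard_normal(N) + signal                 # detector noise has unit variance

def log_like(var_total):
    """White-Gaussian log-likelihood of the data for a given total variance."""
    return (-0.5 * np.sum(d**2) / var_total
            - 0.5 * N * np.log(2 * np.pi * var_total))

log_z_noise = log_like(1.0)                         # noise-only: no free parameters
# Gaussian-stochastic model: marginalise the unknown signal variance (flat prior)
vs = np.linspace(1e-3, 2.0, 800)
log_ls = np.array([log_like(1.0 + v) for v in vs])
m = log_ls.max()
log_z_stoch = m + np.log(np.sum(np.exp(log_ls - m)) * (vs[1] - vs[0])
                         / (vs[-1] - vs[0]))
log_bayes_factor = float(log_z_stoch - log_z_noise)
```

With 200 superposed sources the Gaussian-stochastic model is strongly favored, matching the large-source-number trend the abstract reports; with a handful of sources the same comparison would favor a deterministic model.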