
Conditional Noise Deep Learning for Parameter Estimation of Gravitational Wave Events

Added by Feng-Li Lin
Publication date: 2021
Fields: Physics
Language: English





We construct a Bayesian inference deep learning machine for parameter estimation of gravitational wave events from binary black hole coalescence. Our deep Bayesian machine adopts the conditional variational autoencoder scheme, conditioning on both the gravitational wave strains and the variations of the amplitude spectral density of the detector noise. We show that our deep Bayesian machine yields posteriors compatible with those from the nested sampling method, and is robust against noise outliers. We also apply our deep Bayesian machine to the LIGO/Virgo O3 events, and find that conditioning on the detector noise to counter its drifting is relevant for events with medium signal-to-noise ratios.
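The conditioning scheme described in the abstract can be sketched in miniature. The toy below replaces the paper's deep networks with random linear maps (all dimensions, names, and shapes are illustrative assumptions, not the authors' architecture), but it shows the two ingredients the abstract names: the strain and the noise ASD are fed into both the encoder and the decoder as conditioning inputs, and posterior samples are drawn by ancestral sampling through the latent space.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: strain segment, ASD vector, source parameters, latent dim.
N_STRAIN, N_ASD, N_PARAM, N_LATENT = 256, 128, 4, 8

# Toy linear "networks"; the real model uses deep encoders/decoders.
W_enc = rng.normal(0.0, 0.01, (N_LATENT * 2, N_STRAIN + N_ASD))
W_dec = rng.normal(0.0, 0.01, (N_PARAM * 2, N_LATENT + N_STRAIN + N_ASD))

def encode(strain, asd):
    """Map the conditioning inputs to a latent mean and log-variance."""
    h = W_enc @ np.concatenate([strain, asd])
    return h[:N_LATENT], h[N_LATENT:]

def decode(z, strain, asd):
    """Map a latent sample plus the conditions to a parameter mean/log-variance."""
    h = W_dec @ np.concatenate([z, strain, asd])
    return h[:N_PARAM], h[N_PARAM:]

def sample_posterior(strain, asd, n_draws=1000):
    """Draw approximate posterior samples by ancestral sampling."""
    mu_z, logvar_z = encode(strain, asd)
    draws = np.empty((n_draws, N_PARAM))
    for i in range(n_draws):
        z = mu_z + np.exp(0.5 * logvar_z) * rng.standard_normal(N_LATENT)
        mu_p, logvar_p = decode(z, strain, asd)
        draws[i] = mu_p + np.exp(0.5 * logvar_p) * rng.standard_normal(N_PARAM)
    return draws

strain = rng.standard_normal(N_STRAIN)
asd = np.abs(rng.standard_normal(N_ASD))
samples = sample_posterior(strain, asd)
print(samples.shape)  # (1000, 4)
```

Because the ASD is an explicit input, the same trained machine can, in principle, produce posteriors under different noise states, which is the mechanism the abstract uses to fight detector-noise drift.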




Related research

One of the main bottlenecks in gravitational wave (GW) astronomy is the high cost of performing parameter estimation and GW searches on the fly. We propose a novel technique based on Reduced Order Quadratures (ROQs), an application- and data-specific quadrature rule, to perform fast and accurate likelihood evaluations. These are the dominant cost in Markov chain Monte Carlo (MCMC) algorithms, which are widely employed in parameter estimation studies, and so ROQs offer a new way to accelerate GW parameter estimation. We illustrate our approach using a four-dimensional GW burst model embedded in noise. We build an ROQ for this model, and perform four-dimensional MCMC searches with both the standard and ROQ quadrature rules, showing that, for this model, the ROQ approach is around 25 times faster than the standard approach with essentially no loss of accuracy. The speed-up from using ROQs is expected to increase for more complex GW signal models and therefore has significant potential to accelerate parameter estimation of GW sources such as compact binary coalescences.
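The ROQ idea can be demonstrated on a toy one-parameter waveform family (the sinusoid family, grid sizes, and tolerances below are illustrative assumptions; production ROQs use greedy basis building rather than the SVD shortcut used here). A reduced basis is built from training waveforms, empirical-interpolation nodes pick a handful of frequencies, and the data-dependent weights are precomputed, so each likelihood-style overlap costs only `rank` waveform evaluations instead of the full grid:

```python
import numpy as np

rng = np.random.default_rng(1)

# Dense frequency grid and a toy one-parameter waveform family h(f; tau).
freqs = np.linspace(0.0, 1.0, 2000)
def waveform(tau):
    return np.sin(2.0 * np.pi * tau * freqs)

# 1) Reduced basis from a training set (SVD here; greedy in practice).
training = np.array([waveform(t) for t in np.linspace(1.0, 5.0, 200)])
_, s, Vt = np.linalg.svd(training, full_matrices=False)
energy = np.cumsum(s**2) / np.sum(s**2)
rank = int(np.searchsorted(energy, 1.0 - 1e-12)) + 1
basis = Vt[:rank]                          # rank x n_freq

# 2) Empirical-interpolation nodes: greedily pick `rank` frequencies.
nodes = [int(np.argmax(np.abs(basis[0])))]
for j in range(1, rank):
    coeffs = np.linalg.solve(basis[:j, nodes].T, basis[j, nodes])
    residual = basis[j] - coeffs @ basis[:j]
    nodes.append(int(np.argmax(np.abs(residual))))

# 3) Precompute data-dependent ROQ weights for the overlap <d, h> ~ sum d*h*df.
data = waveform(2.7) + 0.01 * rng.standard_normal(freqs.size)
df = freqs[1] - freqs[0]
V = basis[:, nodes]                        # rank x rank node matrix
weights = np.linalg.solve(V, basis @ data * df)

# The overlap now needs the waveform at only `rank` frequencies.
tau = 3.3
exact = np.sum(data * waveform(tau)) * df
roq = weights @ waveform(tau)[nodes]
err = abs(exact - roq)
print(err < 1e-3)  # True
```

The speed-up scales roughly as the ratio of grid size to basis rank, which is why the gain grows for long, dense-grid signal models such as compact binary coalescences.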
Folding uncertainty in theoretical models into Bayesian parameter estimation is necessary in order to make reliable inferences. A general means of achieving this is by marginalizing over model uncertainty using a prior distribution constructed using Gaussian process regression (GPR). As an example, we apply this technique to the measurement of chirp mass using (simulated) gravitational-wave signals from binary black holes that could be observed using advanced-era gravitational-wave detectors. Unless properly accounted for, uncertainty in the gravitational-wave templates could be the dominant source of error in studies of these systems. We explain our approach in detail and provide proofs of various features of the method, including the limiting behavior for high signal-to-noise ratio, where systematic model uncertainties dominate over noise errors. We find that the marginalized likelihood constructed via GPR offers a significant improvement in parameter estimation over the standard, uncorrected likelihood, both in our simple one-dimensional study and theoretically in general. We also examine the dependence of the method on the size of the training set used in the GPR, on the form of the covariance function adopted, and on changes to the detector noise power spectral density.
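The mechanism of a GPR-marginalized likelihood can be sketched with a scalar stand-in for the waveform difference (the kernel, length scale, and training points below are illustrative assumptions, not the paper's choices). The GP interpolates the model error between training points, and its predictive variance is added to the noise variance, so the likelihood automatically broadens where the template model is poorly constrained:

```python
import numpy as np

def rbf(a, b, scale=0.5, var=1.0):
    """Squared-exponential covariance between 1-D parameter points."""
    a, b = np.atleast_1d(a), np.atleast_1d(b)
    return var * np.exp(-0.5 * (a[:, None] - b[None, :])**2 / scale**2)

# Training set: parameter points with an accurate waveform available, and a
# toy scalar "model error" at each point (stand-in for waveform differences).
theta_train = np.linspace(-2.0, 2.0, 8)
delta_train = 0.3 * np.sin(2.0 * theta_train)
K = rbf(theta_train, theta_train) + 1e-10 * np.eye(theta_train.size)

def gp_predict(theta):
    """GP posterior mean and variance of the model error at theta."""
    k = rbf(theta, theta_train)
    mean = k @ np.linalg.solve(K, delta_train)
    var = rbf(theta, theta).diagonal() - np.einsum(
        'ij,ji->i', k, np.linalg.solve(K, k.T))
    return mean[0], var[0]

def marginalized_loglike(residual, theta, sigma_noise=0.1):
    """Gaussian log-likelihood with the GP uncertainty folded into the variance."""
    mean, var = gp_predict(theta)
    total_var = sigma_noise**2 + var
    return -0.5 * ((residual - mean)**2 / total_var
                   + np.log(2.0 * np.pi * total_var))

# Far from the training set the GP variance grows, broadening the likelihood;
# near it the GP pins down the model error.
_, var_near = gp_predict(0.0)
_, var_far = gp_predict(5.0)
print(var_near < var_far)  # True
```

In the high signal-to-noise limit `sigma_noise**2` becomes negligible and `total_var` is dominated by the GP term, which is the regime the abstract describes as systematics-dominated.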
Signal extraction out of background noise is a common challenge in high-precision physics experiments, where the measurement output is often a continuous data stream. To improve the signal-to-noise ratio of the detection, witness sensors are often used to independently measure background noises and subtract them from the main signal. If the noise coupling is linear and stationary, optimal techniques already exist and are routinely implemented in many experiments. However, when the noise coupling is non-stationary, linear techniques often fail or are sub-optimal. Inspired by the properties of the background noise in gravitational wave detectors, this work develops a novel algorithm to efficiently characterize and remove non-stationary noise couplings, provided there exist witnesses of the noise source and of the modulation. The algorithm is described in its most general formulation, and its efficiency is demonstrated with examples from the data of the Advanced LIGO gravitational wave observatory, where it improves the detector's gravitational-wave reach without introducing any bias in source parameter estimation.
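The failure of purely linear subtraction on a modulated coupling, and the fix when a modulation witness exists, can be shown in a few lines (the coupling model and coefficients below are illustrative assumptions, not the algorithm of the paper, which is far more general). The key step is regressing on the *product* of the noise witness and the modulation witness:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 10_000
t = np.arange(n) / 1000.0

signal = 1e-2 * np.sin(2.0 * np.pi * 3.0 * t)           # target signal
witness = rng.standard_normal(n)                        # fast noise witness
modulation = 1.0 + 0.5 * np.sin(2.0 * np.pi * 0.1 * t)  # slow modulation witness
data = signal + 0.8 * modulation * witness              # non-stationary coupling

# Linear (stationary) subtraction: least-squares fit against the witness alone.
c_lin = (data @ witness) / (witness @ witness)
resid_lin = data - c_lin * witness

# Bilinear subtraction: regress on witness * modulation, which matches the
# actual time-varying coupling.
regressor = modulation * witness
c_bil = (data @ regressor) / (regressor @ regressor)
resid_bil = data - c_bil * regressor

print(np.std(resid_bil) < np.std(resid_lin))  # True
```

The linear fit can only remove the average coupling, leaving a residual proportional to the modulation depth, while the bilinear regressor removes the coupling at all times.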
By listening to gravity in the low-frequency band, between 0.1 mHz and 1 Hz, the future space-based gravitational-wave observatory LISA will be able to detect tens of thousands of astrophysical sources from cosmic dawn to the present. The detection and characterization of all resolvable sources is a challenge in itself, but LISA data analysis will be further complicated by interruptions occurring in the interferometric measurements. These interruptions will be due to various causes occurring at various rates, such as laser frequency switches, high-gain antenna re-pointing, orbit corrections, or even unplanned random events. Extracting long-lasting gravitational-wave signals from gapped data raises problems such as noise leakage and increased computational complexity. We address these issues by using Bayesian data augmentation, a method that reintroduces the missing data as auxiliary variables in the sampling of the posterior distribution of astrophysical parameters. This provides a statistically consistent way to handle gaps while improving the sampling efficiency and mitigating leakage effects. We apply the method to the estimation of galactic binary parameters with different gap patterns, and we compare the results to the case of complete data.
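The data-augmentation idea, imputing the gap as auxiliary variables inside the sampler, can be illustrated with a deliberately simple Gibbs scheme (estimating a constant level in white noise; the model, gap placement, and chain settings are illustrative assumptions, not the LISA analysis). Each iteration alternates between drawing the missing samples given the current parameter and drawing the parameter given the now-complete series:

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy complete data: white noise around an unknown level mu.
mu_true, sigma, n = 2.0, 1.0, 500
data = mu_true + sigma * rng.standard_normal(n)
observed = np.ones(n, dtype=bool)
observed[200:260] = False            # a gap in the measurement

y = data.copy()
mu = 0.0
mu_chain = []
for step in range(2000):
    # 1) Augmentation step: draw the gap samples from their conditional
    #    distribution given the current parameter value.
    y[~observed] = mu + sigma * rng.standard_normal((~observed).sum())
    # 2) Parameter step: draw mu from its conditional given the complete
    #    series (flat prior -> Gaussian posterior).
    mu = y.mean() + sigma / np.sqrt(n) * rng.standard_normal()
    mu_chain.append(mu)

est = float(np.mean(mu_chain[500:]))
print(abs(est - mu_true) < 0.3)  # True
```

The stationary distribution of `mu` is its posterior given only the observed samples, so the gap is handled consistently rather than by zero-padding, which is the windowing-leakage problem the abstract refers to.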
Since the very first detection of gravitational waves from the coalescence of two black holes in 2015, Bayesian statistical methods have been routinely applied by LIGO and Virgo to extract the signal out of noisy interferometric measurements, obtain point estimates of the physical parameters responsible for producing the signal, and rigorously quantify their uncertainties. Different computational techniques have been devised depending on the source of the gravitational radiation and the gravitational waveform model used. Prominent sources of gravitational waves are binary black hole or neutron star mergers, the only objects that have been observed by detectors to date. Gravitational waves from core-collapse supernovae, rapidly rotating neutron stars, and the stochastic gravitational-wave background are also in the sensitivity band of the ground-based interferometers and are expected to be observable in future observing runs. As nonlinearities of the complex waveforms and the high-dimensional parameter spaces preclude analytic evaluation of the posterior distribution, posterior inference for all these sources relies on computer-intensive simulation techniques such as Markov chain Monte Carlo methods. A review of state-of-the-art Bayesian statistical parameter estimation methods is given for researchers in this cross-disciplinary area of gravitational wave data analysis.
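The simulation-based posterior inference these abstracts rely on reduces, in its simplest form, to a random-walk Metropolis sampler. The toy below (a one-parameter amplitude fit to a noisy sinusoidal "strain"; all settings are illustrative assumptions) shows the accept/reject core that real GW samplers elaborate with better proposals and parallel tempering:

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy inference problem: recover an amplitude A from noisy "strain" data.
t = np.linspace(0.0, 1.0, 200)
def template(A):
    return A * np.sin(2.0 * np.pi * 5.0 * t)

sigma = 0.5
data = template(1.5) + sigma * rng.standard_normal(t.size)

def log_posterior(A):
    """Gaussian likelihood with a flat prior on A."""
    r = data - template(A)
    return -0.5 * np.sum(r**2) / sigma**2

# Random-walk Metropolis sampler.
chain = []
A, logp = 0.0, log_posterior(0.0)
for _ in range(5000):
    prop = A + 0.1 * rng.standard_normal()
    logp_prop = log_posterior(prop)
    if np.log(rng.uniform()) < logp_prop - logp:   # accept/reject step
        A, logp = prop, logp_prop
    chain.append(A)

posterior = np.array(chain[1000:])                 # discard burn-in
print(abs(posterior.mean() - 1.5) < 0.2)  # True
```

The same structure scales to the high-dimensional CBC problem only with considerable effort, which is what motivates the accelerations (ROQ likelihoods, deep-learning surrogates) discussed in the other abstracts on this page.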
