
Improving gravitational-wave parameter estimation using Gaussian process regression

Posted by Christopher Moore
Publication date: 2015
Research field: Physics
Paper language: English





Folding uncertainty in theoretical models into Bayesian parameter estimation is necessary in order to make reliable inferences. A general means of achieving this is by marginalizing over model uncertainty using a prior distribution constructed using Gaussian process regression (GPR). As an example, we apply this technique to the measurement of chirp mass using (simulated) gravitational-wave signals from binary black holes that could be observed using advanced-era gravitational-wave detectors. Unless properly accounted for, uncertainty in the gravitational-wave templates could be the dominant source of error in studies of these systems. We explain our approach in detail and provide proofs of various features of the method, including the limiting behavior for high signal-to-noise, where systematic model uncertainties dominate over noise errors. We find that the marginalized likelihood constructed via GPR offers a significant improvement in parameter estimation over the standard, uncorrected likelihood, both in our simple one-dimensional study and theoretically in general. We also examine the dependence of the method on the size of the training set used for the GPR, on the form of covariance function adopted, and on changes to the detector noise power spectral density.
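The marginalization idea can be sketched in a toy one-dimensional setting: a GP is trained on the difference between accurate and approximate waveforms at a set of parameter points, and marginalizing over that GP prior shifts the approximate template by the GP mean and inflates the noise variance by the GP variance. All names, kernels, and numbers below are illustrative, not the paper's actual pipeline:

```python
import numpy as np

def rbf_kernel(x1, x2, scale=0.5, amp=1.0):
    # Squared-exponential covariance between 1-D parameter points.
    d = x1[:, None] - x2[None, :]
    return amp**2 * np.exp(-0.5 * (d / scale) ** 2)

def gpr_predict(x_train, dh_train, x_star, jitter=1e-8):
    # Standard GP conditional mean and variance for the waveform difference.
    K = rbf_kernel(x_train, x_train) + jitter * np.eye(len(x_train))
    ks = rbf_kernel(x_train, x_star)
    mean = ks.T @ np.linalg.solve(K, dh_train)
    var = rbf_kernel(x_star, x_star).diagonal() - np.sum(
        ks * np.linalg.solve(K, ks), axis=0)
    return mean, np.maximum(var, 0.0)

def marginalised_log_like(d, h_approx, x_train, dh_train, x, sigma_n=1.0):
    # Marginalising over the GP prior on model error shifts the approximate
    # template by the GP mean and inflates the noise variance by the GP
    # variance, broadening the likelihood where the model is least trusted.
    mu, var = gpr_predict(x_train, dh_train, np.atleast_1d(x))
    resid = d - (h_approx(x) + mu[0])
    s2 = sigma_n**2 + var[0]
    return -0.5 * resid**2 / s2 - 0.5 * np.log(2 * np.pi * s2)
```

At a training point the GP variance collapses and the marginalized likelihood reduces to the standard one; far from the training set the variance term dominates, which is the high-SNR limiting behavior discussed in the abstract.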




Read also

One of the main bottlenecks in gravitational wave (GW) astronomy is the high cost of performing parameter estimation and GW searches on the fly. We propose a novel technique based on Reduced Order Quadratures (ROQs), an application- and data-specific quadrature rule, to perform fast and accurate likelihood evaluations. These are the dominant cost in Markov chain Monte Carlo (MCMC) algorithms, which are widely employed in parameter estimation studies, and so ROQs offer a new way to accelerate GW parameter estimation. We illustrate our approach using a four-dimensional GW burst model embedded in noise. We build an ROQ for this model, and perform four-dimensional MCMC searches with both the standard and ROQ quadrature rules, showing that, for this model, the ROQ approach is around 25 times faster than the standard approach with essentially no loss of accuracy. The speed-up from using ROQs is expected to increase for more complex GW signal models and therefore has significant potential to accelerate parameter estimation of GW sources such as compact binary coalescences.
We construct a Bayesian inference deep learning machine for parameter estimation of gravitational wave events from binary black hole coalescences. The structure of our deep Bayesian machine adopts the conditional variational autoencoder scheme, conditioning on both the gravitational wave strains and the variations of the amplitude spectral density of the detector noise. We show that our deep Bayesian machine is capable of yielding posteriors compatible with those from the nested sampling method, and of being robust against noise outliers. We also apply our deep Bayesian machine to the LIGO/Virgo O3 events, and find that conditioning on the detector noise to counteract its drifting is relevant for events with medium signal-to-noise ratios.
By listening to gravity in the low frequency band, between 0.1 mHz and 1 Hz, the future space-based gravitational-wave observatory LISA will be able to detect tens of thousands of astrophysical sources from cosmic dawn to the present. The detection and characterization of all resolvable sources is a challenge in itself, but LISA data analysis will be further complicated by interruptions occurring in the interferometric measurements. These interruptions will be due to various causes occurring at various rates, such as laser frequency switches, high-gain antenna re-pointing, orbit corrections, or even unplanned random events. Extracting long-lasting gravitational-wave signals from gapped data raises problems such as noise leakage and increased computational complexity. We address these issues by using Bayesian data augmentation, a method that reintroduces the missing data as auxiliary variables in the sampling of the posterior distribution of astrophysical parameters. This provides a statistically consistent way to handle gaps while improving the sampling efficiency and mitigating leakage effects. We apply the method to the estimation of Galactic binary parameters with different gap patterns, and we compare the results to the case of complete data.
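Data augmentation of this kind is naturally expressed as a two-step Gibbs sampler: impute the gap from the current signal model plus a noise draw, then sample the parameters as if the data were complete. A toy sketch with a known-shape signal and a single amplitude parameter (the conjugate-normal update and all numbers are illustrative, not LISA's actual analysis):

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 500)
template = np.sin(2 * np.pi * 10 * t)     # toy Galactic-binary signal shape
sigma = 0.5                               # white-noise standard deviation
a_true = 1.3
y = a_true * template + rng.normal(0.0, sigma, t.size)

mask = np.ones(t.size, bool)
mask[200:260] = False                     # a measurement interruption (gap)

def gibbs_data_augmentation(n_iter=2000, burn=500):
    a = 0.0
    y_full = y.copy()
    samples = []
    for _ in range(n_iter):
        # Step 1: impute the gap from the current signal model + noise draw.
        y_full[~mask] = a * template[~mask] + rng.normal(
            0.0, sigma, (~mask).sum())
        # Step 2: draw the amplitude from its conditional given complete
        # data (conjugate normal under a flat prior on a).
        prec = template @ template / sigma**2
        mean = (template @ y_full) / sigma**2 / prec
        a = rng.normal(mean, 1.0 / np.sqrt(prec))
        samples.append(a)
    return np.array(samples[burn:])

post = gibbs_data_augmentation()
```

Marginally, the chain targets the posterior given only the observed samples, so gaps are handled consistently without ad hoc windowing, which is how the auxiliary-variable scheme mitigates noise leakage.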
Inspiraling binaries of compact objects are primary targets for current and future gravitational-wave observatories. Waveforms computed in General Relativity are used to search for these sources, and will probably be used to extract source parameters from detected signals. However, if a different theory of gravity happens to be correct in the strong-field regime, source-parameter estimation may be affected by a fundamental bias: that is, by systematic errors induced by the use of waveforms derived in the incorrect theory. If the deviations from General Relativity are not large enough to be detectable on their own and yet these systematic errors remain significant (i.e., larger than the statistical uncertainties in parameter estimation), fundamental bias cannot be corrected in a single observation, and becomes stealth bias. In this article we develop a scheme to determine in which cases stealth bias could be present in gravitational-wave astronomy. For a given observation, the answer depends on the detection signal-to-noise ratio and on the strength of the modified-gravity correction. As an example, we study three representative stellar-mass binary systems that will be detectable with second-generation ground-based observatories. We find that significant systematic bias can occur whether or not modified gravity can be positively detected, for correction strengths that are not currently excluded by any other experiment. Thus, stealth bias may be a generic feature of gravitational-wave detections, and it should be considered and characterized, using expanded models such as the parametrized post-Einstein framework, when interpreting the results of parameter-estimation analyses.
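The "systematic error versus statistical uncertainty" comparison at the heart of this criterion can be made concrete with a linear-signal estimate of the bias. A toy one-parameter sketch (white noise, h(t; a) = a*s(t), and a hypothetical modified-gravity correction dh missing from the template; none of these specifics come from the paper):

```python
import numpy as np

t = np.linspace(0.0, 1.0, 1000)
s = np.sin(2 * np.pi * 8 * t)        # template derivative dh/da
sigma_n = 1.0                        # white-noise standard deviation
# Hypothetical modified-gravity correction, partly degenerate with s.
dh = 0.1 * np.sin(2 * np.pi * 8 * t + 0.3)

fisher = np.dot(s, s) / sigma_n**2   # one-parameter Fisher "matrix"
sigma_stat = 1.0 / np.sqrt(fisher)   # statistical uncertainty on a
# Linear-signal estimate of the systematic shift: projection of the
# missing term onto the template derivative.
bias_sys = np.dot(s, dh) / sigma_n**2 / fisher
stealth = abs(bias_sys) > sigma_stat  # bias exceeds the statistical error
```

Note that the statistical error shrinks as 1/SNR while the fractional systematic shift does not, so at high SNR the bias condition is met first; if the residual dh is simultaneously too small to be detected on its own, the bias is "stealth" in the article's sense.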
Optical scatterometry is a method to measure the size and shape of periodic micro- or nanostructures on surfaces. For this purpose the geometry parameters of the structures are obtained by reproducing experimental measurement results through numerical simulations. We compare the performance of Bayesian optimization to different local minimization algorithms for this numerical optimization problem. Bayesian optimization uses Gaussian-process regression to find promising parameter values. We examine how pre-computed simulation results can be used to train the Gaussian process and to accelerate the optimization.
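The Bayesian-optimization loop described here fits a GP surrogate to the simulation results seen so far and picks the next simulation by maximizing an acquisition function. A minimal sketch with expected improvement on a one-dimensional stand-in for the scatterometry misfit (the objective, kernel scale, and seed points are all illustrative assumptions):

```python
import numpy as np
from math import erf

def kernel(a, b, scale=0.2):
    # Squared-exponential covariance over the geometry parameter.
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / scale) ** 2)

def gp_posterior(X, y, Xs, jitter=1e-6):
    # GP surrogate: predictive mean and standard deviation on a grid.
    K = kernel(X, X) + jitter * np.eye(len(X))
    ks = kernel(X, Xs)
    mu = ks.T @ np.linalg.solve(K, y)
    var = 1.0 - np.sum(ks * np.linalg.solve(K, ks), axis=0)
    return mu, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mu, sd, best):
    # EI acquisition for minimisation; Phi/phi are the normal cdf/pdf.
    z = (best - mu) / sd
    Phi = 0.5 * (1.0 + np.array([erf(v / np.sqrt(2.0)) for v in z]))
    phi = np.exp(-0.5 * z**2) / np.sqrt(2.0 * np.pi)
    return (best - mu) * Phi + sd * phi

def misfit(x):
    # Hypothetical misfit between measured and simulated signatures.
    return (x - 0.37) ** 2

grid = np.linspace(0.0, 1.0, 401)
X = np.array([0.1, 0.5, 0.9])        # pre-computed simulations seed the GP
y = misfit(X)
for _ in range(15):
    mu, sd = gp_posterior(X, y, grid)
    x_next = grid[np.argmax(expected_improvement(mu, sd, y.min()))]
    X = np.append(X, x_next)
    y = np.append(y, misfit(x_next))
```

Seeding `X`, `y` with pre-computed simulations, as the abstract suggests, costs nothing at optimization time but sharpens the surrogate from the first iteration onward.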