
Correlated magnetic noise in global networks of gravitational-wave interferometers: observations and implications

Submitted by Eric Thrane
Publication date: 2013
Research field: Physics
Paper language: English





One of the most ambitious goals of gravitational-wave astronomy is to observe the stochastic gravitational-wave background. Correlated noise in two or more detectors can introduce a systematic error, which limits the sensitivity of stochastic searches. We report on measurements of correlated magnetic noise from Schumann resonances at the widely separated LIGO and Virgo detectors. We investigate the effect of this noise on a global network of interferometers and derive a constraint on the allowable coupling of environmental magnetic fields to test mass motion in gravitational-wave detectors. We find that while correlated noise from global electromagnetic fields could be safely ignored for initial LIGO stochastic searches, it could severely impact Advanced LIGO and third-generation detectors.
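To illustrate how such correlations are typically quantified, here is a minimal, hypothetical sketch (synthetic data and scipy's coherence estimator, not the analysis code used in the paper) that measures the magnitude-squared coherence between two magnetometer channels sharing a Schumann-like component near 7.8 Hz.

```python
# Minimal sketch (not the authors' pipeline): estimating the coherence
# between two magnetometer channels, the basic diagnostic used to reveal
# correlated Schumann-resonance noise at widely separated sites.
import numpy as np
from scipy.signal import coherence

fs = 128.0                        # sample rate [Hz], hypothetical
t = np.arange(0, 3600, 1 / fs)    # one hour of data
rng = np.random.default_rng(0)

# Shared global field: fundamental Schumann mode near 7.8 Hz,
# plus independent local magnetic noise at each site.
schumann = np.sin(2 * np.pi * 7.8 * t)
site_a = schumann + rng.normal(scale=3.0, size=t.size)
site_b = 0.5 * schumann + rng.normal(scale=3.0, size=t.size)

# Magnitude-squared coherence; a peak near 7.8 Hz indicates the
# intercontinental correlation that could bias a stochastic search.
f, coh = coherence(site_a, site_b, fs=fs, nperseg=int(100 * fs))
peak = f[np.argmax(coh)]
print(f"coherence peaks at {peak:.1f} Hz with value {coh.max():.2f}")
```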




Read also

The recent discovery of merging black holes suggests that a stochastic gravitational-wave background is within reach of the advanced detector network operating at design sensitivity. However, correlated magnetic noise from Schumann resonances threatens to contaminate observation of a stochastic background. In this paper, we report on the first effort to eliminate intercontinental correlated noise from Schumann resonances using Wiener filtering. Using magnetometers as proxies for gravitational-wave detectors, we demonstrate as much as a factor of two reduction in the coherence between magnetometers on different continents. While much work remains to be done, our results constitute a proof-of-principle and motivate follow-up studies with a dedicated array of magnetometers.
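For illustration only, the sketch below shows Wiener-style subtraction with a single witness channel: a least-squares FIR filter trained on synthetic data. Every choice in it (sample rate, signal model, tap count) is a stand-in, not the configuration used in the study.

```python
# Minimal sketch (illustrative, not the paper's pipeline): a time-domain
# Wiener (least-squares FIR) filter that uses one magnetometer as a witness
# channel to subtract correlated Schumann-resonance noise from a target channel.
import numpy as np

fs = 128.0
t = np.arange(0, 600, 1 / fs)
rng = np.random.default_rng(1)

common = np.sin(2 * np.pi * 7.8 * t)               # shared Schumann-like field
witness = common + 0.3 * rng.normal(size=t.size)   # magnetometer channel
target = 0.8 * common + rng.normal(size=t.size)    # "detector" channel

# Build an FIR Wiener filter of n_taps coefficients by solving the
# least-squares problem between delayed witness samples and the target.
n_taps = 64
X = np.column_stack([np.roll(witness, k) for k in range(n_taps)])
X[:n_taps, :] = 0.0                                # drop wrapped-around samples
coeffs, *_ = np.linalg.lstsq(X, target, rcond=None)

prediction = X @ coeffs                            # correlated-noise estimate
cleaned = target - prediction                      # residual after subtraction

print("residual variance / original variance:",
      np.var(cleaned) / np.var(target))
```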
Advanced LIGO and the next generation of ground-based detectors aim to capture many more binary coalescences through improved sensitivity and duty cycle. Earthquakes have always been a limiting factor at low frequency, where neither the pendulum suspension nor the active controls provide sufficient isolation to the test mass mirrors. Several control strategies have been proposed to reduce the impact of teleseismic events by switching to a robust configuration with less aggressive feedback. The continental United States has witnessed a huge increase in the number of induced earthquake events, primarily associated with hydraulic-fracturing-related wastewater re-injection. Effects from these differ from teleseismic earthquakes primarily because of their depth, which is in turn linked to their triggering mechanism. In this paper, we discuss the impact of these low-magnitude regional earthquakes and explore ways to minimize the effect of induced seismicity on the detector.
We carried out a computer simulation of a large gravitational wave (GW) interferometer using the specifications of the LIGO instruments. We find that if, in addition to the carrier, a single sideband offset from the carrier by the fsr frequency (the free spectral range of the arm cavities) is injected, it is as sensitive to GW signals as the carrier. The amplitude of the fsr sideband signal in the DC region is generally much less subject to noise than the carrier, and this makes possible the detection of periodic signals with frequencies well below the so-called seismic wall.
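For reference, the fsr frequency mentioned above follows directly from the arm length; a quick back-of-the-envelope computation for a 4 km LIGO-style arm cavity:

```python
# Free spectral range of a Fabry-Perot arm cavity: f_fsr = c / (2L).
c = 299_792_458.0      # speed of light [m/s]
L = 4_000.0            # arm length [m], LIGO-scale
f_fsr = c / (2 * L)
print(f"f_fsr = {f_fsr / 1e3:.2f} kHz")   # about 37.47 kHz for a 4 km arm
```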
With the advent of gravitational wave astronomy, techniques to extend the reach of gravitational wave detectors are desired. In addition to the stellar-mass black hole and neutron star mergers already detected, many more are below the surface of the noise, available for detection if the noise is reduced enough. Our method (DeepClean) applies machine learning algorithms to gravitational wave detector data and data from on-site sensors monitoring the instrument to reduce the noise in the time series due to instrumental artifacts and environmental contamination. This framework is generic enough to subtract linear, non-linear, and non-stationary coupling mechanisms. It may also provide handles in learning about the mechanisms which are not currently understood to be limiting detector sensitivities. The robustness of the noise reduction technique in its ability to efficiently remove noise with no unintended effects on gravitational-wave signals is also addressed through software signal injection and parameter estimation of the recovered signal. It is shown that the optimal SNR of the injected signal is enhanced by $\sim 21.6\%$ and the recovered parameters are consistent with the injected set. We present the performance of this algorithm on linear and non-linear noise sources and discuss its impact on astrophysical searches by gravitational wave detectors.
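As a rough illustration of the witness-based subtraction idea (not the actual DeepClean architecture, data, or training recipe), the hedged sketch below trains a small 1D convolutional network to map a few synthetic witness channels to the noise they imprint on a strain-like channel, then subtracts the prediction.

```python
# Minimal sketch (assumptions: synthetic data, a toy 1D-CNN; this is NOT
# the DeepClean network): learn a witness-to-noise mapping, subtract it.
import torch
import torch.nn as nn

torch.manual_seed(0)
fs, dur, n_wit = 128, 64, 4
n_samples = fs * dur
witnesses = torch.randn(n_wit, n_samples)
# Strain = nonlinear function of witnesses + unrelated noise.
noise = torch.tanh(witnesses[0] * witnesses[1]) + 0.5 * witnesses[2]
strain = noise + 0.3 * torch.randn(n_samples)

model = nn.Sequential(
    nn.Conv1d(n_wit, 16, kernel_size=9, padding=4), nn.ReLU(),
    nn.Conv1d(16, 16, kernel_size=9, padding=4), nn.ReLU(),
    nn.Conv1d(16, 1, kernel_size=9, padding=4),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = witnesses.unsqueeze(0)          # shape (1, n_wit, n_samples)
y = strain.view(1, 1, -1)

for step in range(1000):            # short training loop for illustration
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(x), y)
    loss.backward()
    opt.step()

cleaned = strain - model(x).detach().squeeze()
print("residual / original variance:",
      (cleaned.var() / strain.var()).item())
```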
Computing signal-to-noise ratios (SNRs) is one of the most common tasks in gravitational-wave data analysis. While a single SNR evaluation is generally fast, computing SNRs for an entire population of merger events could be time consuming. We compute SNRs for aligned-spin binary black-hole mergers as a function of the (detector-frame) total mass, mass ratio and spin magnitudes using selected waveform models and detector noise curves, then we interpolate the SNRs in this four-dimensional parameter space with a simple neural network (a multilayer perceptron). The trained network can evaluate $10^6$ SNRs on a 4-core CPU within a minute with a median fractional error below $10^{-3}$. This corresponds to average speed-ups by factors in the range $[120,\,7.5\times10^4]$, depending on the underlying waveform model. Our trained network (and source code) is publicly available at https://github.com/kazewong/NeuralSNR, and it can be easily adapted to similar multidimensional interpolation problems.
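The interpolation step can be sketched with standard tools; the toy example below (hypothetical training grid and a made-up SNR law, unrelated to the released NeuralSNR code) fits a small multilayer perceptron over the four-dimensional parameter space and evaluates it in bulk.

```python
# Minimal sketch: interpolate precomputed SNRs with a small MLP.
# The parameter ranges and the "SNR" law here are illustrative stand-ins.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
n_train = 20_000
params = np.column_stack([
    rng.uniform(10, 200, n_train),    # detector-frame total mass [Msun]
    rng.uniform(0.1, 1.0, n_train),   # mass ratio q
    rng.uniform(-1, 1, n_train),      # chi_1z
    rng.uniform(-1, 1, n_train),      # chi_2z
])
# Toy stand-in for the expensive waveform-based SNR computation.
snr = 8.0 * (params[:, 0] / 60.0) ** 0.8 * (1 + 0.1 * params[:, 2])

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500, random_state=0),
)
model.fit(params, snr)

# Bulk evaluation: once trained, the network answers SNR queries far faster
# than recomputing waveforms for every point in the population.
print(model.predict(params[:5]))
print(snr[:5])
```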