
Noise Reduction in Gravitational-wave Data via Deep Learning

Added by Rich Ormiston
Publication date: 2020
Fields: Physics
Language: English





With the advent of gravitational wave astronomy, techniques to extend the reach of gravitational wave detectors are desired. In addition to the stellar-mass black hole and neutron star mergers already detected, many more lie beneath the noise floor, available for detection if the noise is reduced sufficiently. Our method (DeepClean) applies machine learning algorithms to gravitational wave detector data and to data from on-site sensors monitoring the instrument, in order to reduce the noise in the time series due to instrumental artifacts and environmental contamination. This framework is generic enough to subtract linear, non-linear, and non-stationary coupling mechanisms. It may also provide handles for learning about coupling mechanisms that are not currently understood to be limiting detector sensitivities. The robustness of the noise reduction technique, in its ability to efficiently remove noise with no unintended effects on gravitational-wave signals, is addressed through software signal injection and parameter estimation of the recovered signal. It is shown that the optimal signal-to-noise ratio (SNR) of the injected signal is enhanced by $\sim 21.6\%$ and that the recovered parameters are consistent with the injected set. We present the performance of this algorithm on linear and non-linear noise sources and discuss its impact on astrophysical searches by gravitational wave detectors.
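As a rough illustration of the regression idea behind DeepClean, the sketch below trains a small 1D convolutional network (in PyTorch) to predict the noise content of a strain series from witness channels alone, then subtracts that prediction. The architecture, channel count, synthetic data, and training settings are illustrative assumptions, not the published DeepClean configuration.

import torch
import torch.nn as nn

n_witness, n_samples = 8, 4096

# Synthetic stand-ins: witness channels and a strain series contaminated by
# a non-linear combination of those witnesses plus a weak "signal".
witnesses = torch.randn(1, n_witness, n_samples)
coupling = torch.tanh(witnesses.sum(dim=1, keepdim=True))
signal = 0.1 * torch.sin(torch.linspace(0, 60, n_samples)).reshape(1, 1, -1)
strain = signal + coupling

# 1D CNN mapping witnesses to a noise prediction; depths and kernel sizes
# are arbitrary choices for this toy example.
model = nn.Sequential(
    nn.Conv1d(n_witness, 16, kernel_size=9, padding=4), nn.ReLU(),
    nn.Conv1d(16, 16, kernel_size=9, padding=4), nn.ReLU(),
    nn.Conv1d(16, 1, kernel_size=9, padding=4),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(500):
    opt.zero_grad()
    # The witnesses carry no information about the astrophysical signal,
    # so minimizing the residual drives the network toward the noise only.
    loss = nn.functional.mse_loss(model(witnesses), strain)
    loss.backward()
    opt.step()

cleaned = strain - model(witnesses).detach()  # the signal should survive subtraction

The same logic extends to non-stationary couplings by training over many short data segments rather than a single batch.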

Related research


We introduce a signal processing model for signals in non-white noise, where the exact noise spectrum is a priori unknown. The model is based on a Student's t distribution and constitutes a natural generalization of the widely used normal (Gaussian) model. In this way, it allows for uncertainty in the noise spectrum, or more generally is also able to accommodate outliers (heavy-tailed noise) in the data. Examples are given pertaining to data from gravitational wave detectors.
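To see why a Student's t likelihood is more forgiving of heavy-tailed data than a Gaussian, consider the toy comparison below (not the paper's implementation): a single loud outlier is penalized quadratically by the Gaussian model but only logarithmically by the t model. The choice of df=4 degrees of freedom is arbitrary.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
data = rng.normal(0.0, 1.0, 1000)
data[500] = 8.0  # one glitch-like outlier

# Total log-likelihood of the same data under the two noise models.
logL_gauss = stats.norm.logpdf(data, loc=0.0, scale=1.0).sum()
logL_t = stats.t.logpdf(data, df=4, loc=0.0, scale=1.0).sum()

print(f"Gaussian log-likelihood:  {logL_gauss:.1f}")
print(f"Student-t log-likelihood: {logL_t:.1f}")
# The outlier alone costs the Gaussian model about 0.5 * 8**2 = 32 in
# log-likelihood, so a single sample can dominate the Gaussian fit.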
We present a new event trigger generator based on the Hilbert-Huang transform, named EtaGen ($\eta$Gen). It decomposes time-series data into several adaptive modes without imposing a priori bases on the data. The adaptive modes are used to find transients (excesses) in the background noise. A clustering algorithm is used to gather the excesses corresponding to a single event and to reconstruct its waveform. The performance of EtaGen is evaluated by how many injections in simulated LIGO data are found. EtaGen is viable as an event trigger generator when compared directly with the performance of Omicron, currently the best event trigger generator used in the LIGO Scientific Collaboration and Virgo Collaboration.
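A minimal sketch of the decompose-then-threshold idea is given below, using the third-party EMD-signal (PyEMD) package for the empirical mode decomposition at the heart of the Hilbert-Huang transform. The thresholding and nearest-neighbour clustering here are simplistic stand-ins for EtaGen's actual excess-finding and waveform reconstruction.

import numpy as np
from scipy.signal import hilbert
from PyEMD import EMD

rng = np.random.default_rng(1)
t = np.linspace(0, 4, 4096)
burst = 5 * np.exp(-((t - 2.0) / 0.05) ** 2) * np.sin(2 * np.pi * 60 * t)
data = rng.normal(0, 1, t.size) + burst

imfs = EMD().emd(data)                      # adaptive modes, no fixed basis
envelopes = np.abs(hilbert(imfs, axis=1))   # instantaneous amplitude per mode

# Flag samples whose envelope exceeds 5x the per-mode median ("excesses"),
# then merge neighbouring flagged samples into single candidate events.
excess = (envelopes > 5 * np.median(envelopes, axis=1, keepdims=True)).any(axis=0)
idx = np.flatnonzero(excess)
clusters = np.split(idx, np.where(np.diff(idx) > 64)[0] + 1)
for c in clusters:
    if c.size:
        print(f"candidate event near t = {t[c].mean():.3f} s")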
In this paper, we report on the construction of a deep Artificial Neural Network (ANN) to localize simulated gravitational wave signals in the sky with high accuracy. We have modelled the sky as a sphere and have considered cases where the sphere is divided into 18, 50, 128, 1024, 2048 and 4096 sectors. The sky direction of the gravitational wave source is estimated by classifying the signal into one of these sectors based on its right ascension and declination values for each of these cases. In order to do this, we have injected simulated binary black hole gravitational wave signals with component masses sampled uniformly between 30 and 80 solar masses into Gaussian noise and used the whitened strain values to obtain the input features for training our ANN. We compute input features such as the delays in arrival times, phase differences, and amplitude ratios at each of the three detectors (Hanford, Livingston, and Virgo) from the raw time-domain strain values as well as from analytic …
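The classification setup described above can be sketched as a small dense network mapping the per-detector features (arrival-time delays, phase differences, amplitude ratios for Hanford, Livingston, and Virgo) to one of the sky sectors. Feature dimensionality, layer sizes, and the random placeholder data below are assumptions for illustration, not the paper's architecture.

import torch
import torch.nn as nn

n_features = 9   # 3 time delays + 3 phase differences + 3 amplitude ratios
n_sectors = 18   # the coarsest sky division considered above

model = nn.Sequential(
    nn.Linear(n_features, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, n_sectors),   # logits over sky sectors
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Placeholder batch; in practice the features come from whitened injections.
x = torch.randn(256, n_features)
y = torch.randint(0, n_sectors, (256,))

for _ in range(200):
    opt.zero_grad()
    loss_fn(model(x), y).backward()
    opt.step()

predicted_sector = model(x).argmax(dim=1)  # each sector maps to an (RA, dec) patch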
The LIGO observatories detect gravitational waves by monitoring changes in the detectors' length down to below $10^{-19}\,\mathrm{m}/\sqrt{\mathrm{Hz}}$, a small fraction of the size of the atoms that make up the detector. To achieve this sensitivity, the detector and its environment need to be closely monitored. Beyond the gravitational wave data stream, LIGO continuously records hundreds of thousands of channels of environmental and instrumental data in order to monitor for possibly minuscule variations that contribute to the detector noise. A particularly challenging issue is the appearance in the gravitational wave signal of brief, loud noise artifacts called "glitches", which are environmental or instrumental in origin but can mimic true gravitational waves and therefore hinder sensitivity. Currently they are primarily identified by analysis of the gravitational wave data stream itself. Here we present a machine learning approach that can identify glitches by monitoring all environmental and detector data channels, a task that has not previously been pursued due to its scale and the number of degrees of freedom within gravitational-wave detectors. The presented method is capable of reducing the gravitational-wave detector network's false alarm rate and improving the LIGO instruments, consequently enhancing detection confidence.
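As a toy version of channel-based glitch identification, the sketch below trains a random forest on simple per-channel summary statistics for a handful of synthetic auxiliary channels; real detectors record hundreds of thousands of channels, and the features and model here are assumptions for illustration only.

import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(2)
n_events, n_channels = 400, 20

# One summary statistic (e.g., band-limited RMS) per event and channel.
features = rng.normal(0, 1, (n_events, n_channels))
labels = rng.integers(0, 2, n_events)     # 1 = glitch epoch, 0 = clean epoch
features[labels == 1, :3] += 2.0          # glitches couple into a few channels

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(features[:300], labels[:300])
print("held-out accuracy:", clf.score(features[300:], labels[300:]))

# Feature importances hint at which auxiliary channels witness the glitch,
# useful for tracing its environmental or instrumental origin.
top = np.argsort(clf.feature_importances_)[::-1][:3]
print("most informative channels:", top)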
Earth-based gravitational-wave detectors will be limited by quantum noise in a large part of their spectrum. The most promising technique to achieve a broadband reduction of such noise is the injection of a frequency-dependent squeezed vacuum state from the output port of the detector, with the squeeze angle rotated by reflection off a Fabry-Perot filter cavity. One of the most important parameters limiting the squeezing performance is the optical loss of the filter cavity. We report here the operation of a 300 m filter cavity prototype installed at the National Astronomical Observatory of Japan (NAOJ). The cavity is designed to obtain a rotation of the squeeze angle below 100 Hz. After achieving resonance of the cavity with a multi-wavelength technique, the round-trip losses have been measured to be between 50 ppm and 90 ppm. This result demonstrates that, with realistic assumptions on the input squeeze factor and on the other optical losses, a quantum noise reduction of at least 4 dB in the frequency region dominated by radiation pressure can be achieved.
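As a back-of-the-envelope check (not the paper's loss budget) of how optical losses cap detectable squeezing: with total detection efficiency eta, the measured quadrature variance relative to vacuum is V = eta * 10**(-S_in/10) + (1 - eta), since the lost fraction of the beam is replaced by unsqueezed vacuum. The injected squeezing level and loss values below are illustrative assumptions.

import math

def detected_squeezing_db(s_in_db: float, total_loss: float) -> float:
    # Lossy channel: the squeezed variance is attenuated by eta and topped
    # up with (1 - eta) of ordinary vacuum noise.
    eta = 1.0 - total_loss
    v = eta * 10 ** (-s_in_db / 10) + (1.0 - eta)
    return -10 * math.log10(v)

for loss in (0.05, 0.10, 0.20):
    print(f"total loss {loss:.0%}: {detected_squeezing_db(9.0, loss):.2f} dB detected")

Even 20% total loss leaves over 5 dB from 9 dB of injected squeezing, which makes the at-least-4-dB projection above plausible; the measured 50-90 ppm round-trip cavity loss would be a small contribution to such a budget.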