
Modelling coloured residual noise in gravitational-wave signal processing

Published by: Christian Röver
Publication date: 2010
Research language: English





We introduce a signal processing model for signals in non-white noise, where the exact noise spectrum is a priori unknown. The model is based on a Student's t distribution and constitutes a natural generalization of the widely used normal (Gaussian) model. In this way it allows for uncertainty in the noise spectrum and, more generally, can also accommodate outliers (heavy-tailed noise) in the data. Examples are given pertaining to data from gravitational-wave detectors.
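As a rough illustration of the model's central point (a minimal sketch, not the paper's actual derivation; the variable names and the fixed degrees-of-freedom parameter nu are illustrative), the Python snippet below compares a per-bin Gaussian log-likelihood for whitened residuals with its Student-t generalization, which tolerates misestimated noise levels and outliers:

    # A minimal sketch of the two likelihoods, assuming real whitened
    # residuals and one noise-variance (PSD) value per bin.
    import numpy as np
    from scipy.special import gammaln

    def gaussian_loglik(residual, psd):
        # Standard Gaussian log-likelihood for known noise variance.
        return np.sum(-0.5 * np.log(2 * np.pi * psd) - residual**2 / (2 * psd))

    def student_t_loglik(residual, psd, nu=4.0):
        # Student-t generalization; nu -> infinity recovers the Gaussian case.
        z2 = residual**2 / psd
        return np.sum(
            gammaln((nu + 1) / 2) - gammaln(nu / 2)
            - 0.5 * np.log(nu * np.pi * psd)
            - (nu + 1) / 2 * np.log1p(z2 / nu)
        )

    rng = np.random.default_rng(0)
    res = rng.standard_normal(1024)
    res[100] = 8.0                     # one loud outlier in otherwise clean noise
    psd = np.ones_like(res)
    print(gaussian_loglik(res, psd), student_t_loglik(res, psd))

The outlier costs the Gaussian model about 32 units of log-likelihood (8^2/2) but the Student-t model only about 7, which is the robustness to heavy-tailed noise the abstract refers to.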




Read also

With the advent of gravitational-wave astronomy, techniques to extend the reach of gravitational-wave detectors are desired. In addition to the stellar-mass black hole and neutron star mergers already detected, many more lie below the noise level, available for detection if the noise is reduced sufficiently. Our method (DeepClean) applies machine learning algorithms to gravitational wave detector data and data from on-site sensors monitoring the instrument to reduce the noise in the time series due to instrumental artifacts and environmental contamination. This framework is generic enough to subtract linear, non-linear, and non-stationary coupling mechanisms. It may also provide handles for learning about coupling mechanisms which are not currently understood to be limiting detector sensitivities. The robustness of the noise reduction technique, in its ability to efficiently remove noise with no unintended effects on gravitational-wave signals, is also addressed through software signal injection and parameter estimation of the recovered signal. It is shown that the optimal SNR of the injected signal is enhanced by $\sim 21.6\%$ and the recovered parameters are consistent with the injected set. We present the performance of this algorithm on linear and non-linear noise sources and discuss its impact on astrophysical searches by gravitational wave detectors.
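DeepClean itself is a trained neural network; as a hedged, minimal illustration of the underlying regress-and-subtract idea only (the channel count, couplings, and numbers below are made up), a purely linear version can be written as:

    # Linear witness regression: predict the instrumental contribution to the
    # strain channel from auxiliary sensors and subtract the prediction.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 4096
    witness = rng.standard_normal((n, 3))      # three hypothetical sensor channels
    coupling = np.array([0.8, -0.3, 0.5])      # unknown linear coupling to strain
    signal = 0.1 * np.sin(2 * np.pi * 50 * np.arange(n) / n)
    strain = signal + witness @ coupling + 0.05 * rng.standard_normal(n)

    # Fit the coupling coefficients by least squares, then subtract.
    coeff, *_ = np.linalg.lstsq(witness, strain, rcond=None)
    cleaned = strain - witness @ coeff
    print(f"residual RMS: {np.std(strain):.3f} -> {np.std(cleaned):.3f}")

DeepClean replaces the least-squares fit with a trained network so that non-linear and non-stationary couplings can also be captured.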
Computing signal-to-noise ratios (SNRs) is one of the most common tasks in gravitational-wave data analysis. While a single SNR evaluation is generally fast, computing SNRs for an entire population of merger events could be time consuming. We compute SNRs for aligned-spin binary black-hole mergers as a function of the (detector-frame) total mass, mass ratio and spin magnitudes using selected waveform models and detector noise curves, then we interpolate the SNRs in this four-dimensional parameter space with a simple neural network (a multilayer perceptron). The trained network can evaluate $10^6$ SNRs on a 4-core CPU within a minute with a median fractional error below $10^{-3}$. This corresponds to average speed-ups by factors in the range $[120, 7.5 \times 10^4]$, depending on the underlying waveform model. Our trained network (and source code) is publicly available at https://github.com/kazewong/NeuralSNR, and it can be easily adapted to similar multidimensional interpolation problems.
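The interpolation idea can be sketched with a generic multilayer perceptron; the toy SNR function, parameter ranges, and network size below are placeholders, not the trained model from the NeuralSNR repository:

    # Fit an MLP to a cheap stand-in for SNR(total mass, mass ratio, spin1,
    # spin2) and check the fractional interpolation error on held-out points.
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(2)
    lo, hi = [10, 0.1, -1, -1], [100, 1.0, 1, 1]

    def toy_snr(theta):
        # Placeholder for the real waveform-model SNR over the 4-D space.
        m, q, s1, s2 = theta.T
        return 8.0 * np.sqrt(m) * q * (1 + 0.1 * (s1 + s2))

    X_train = rng.uniform(lo, hi, size=(20000, 4))
    net = make_pipeline(StandardScaler(),
                        MLPRegressor(hidden_layer_sizes=(64, 64),
                                     max_iter=1000, random_state=0))
    net.fit(X_train, toy_snr(X_train))

    X_test = rng.uniform(lo, hi, size=(1000, 4))
    frac_err = np.abs(net.predict(X_test) - toy_snr(X_test)) / toy_snr(X_test)
    print("median fractional error:", np.median(frac_err))

Once trained, evaluating the network is a handful of matrix multiplications per point, which is where the quoted speed-ups over direct waveform-based SNR evaluation come from.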
We present a new event trigger generator based on the Hilbert-Huang transform, named EtaGen ($\eta$Gen). It decomposes time-series data into several adaptive modes without imposing a priori bases on the data. The adaptive modes are used to find transients (excesses) in the background noise. A clustering algorithm is used to gather excesses corresponding to a single event and to reconstruct its waveform. The performance of EtaGen is evaluated by how many injections in the LIGO simulated data are found. EtaGen is viable as an event trigger generator when compared directly with the performance of Omicron, which is currently the best event trigger generator used in the LIGO Scientific Collaboration and Virgo Collaboration.
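The adaptive-mode decomposition itself requires an empirical-mode-decomposition library, but the excess-finding and clustering steps can be sketched with plain scipy (the threshold and injected transient below are arbitrary, and a single band stands in for one adaptive mode):

    # Threshold the Hilbert envelope of a noise stream and cluster contiguous
    # above-threshold samples into candidate events.
    import numpy as np
    from scipy.signal import hilbert

    rng = np.random.default_rng(3)
    fs = 1024
    data = rng.standard_normal(8 * fs)
    data[3000:3050] += 8.0 * np.hanning(50)    # an injected transient

    envelope = np.abs(hilbert(data))           # instantaneous amplitude
    above = np.flatnonzero(envelope > 4 * np.median(envelope))

    # Merge samples separated by less than 0.1 s into a single event.
    events, start = [], above[0]
    for prev, cur in zip(above[:-1], above[1:]):
        if cur - prev > 0.1 * fs:
            events.append((start / fs, prev / fs))
            start = cur
    events.append((start / fs, above[-1] / fs))
    print("candidate events (start, end) in seconds:", events)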
Future ground-based gravitational-wave detectors are slated to detect black hole and neutron star collisions from the entire stellar history of the universe. To achieve the designed detector sensitivities, frequency noise from the laser source must be reduced below the level achieved in current Advanced LIGO detectors. This paper reviews the laser frequency noise suppression scheme in Advanced LIGO, and quantifies the noise coupling to the gravitational-wave readout. The laser frequency noise incident on the current Advanced LIGO detectors is $8 \times 10^{-5}~\mathrm{Hz/\sqrt{Hz}}$ at $1~\mathrm{kHz}$. Future detectors will require even lower incident frequency noise levels to ensure this technical noise source does not limit sensitivity. The frequency noise requirement for a gravitational wave detector with arm lengths of $40~\mathrm{km}$ is estimated to be $7 \times 10^{-7}~\mathrm{Hz/\sqrt{Hz}}$. To reach this goal a new frequency noise suppression scheme is proposed, utilizing two input mode cleaner cavities, and the limits of this scheme are explored. Using this scheme the frequency noise requirement is met, even in pessimistic noise coupling scenarios.
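For orientation, the two figures quoted above imply roughly two orders of magnitude of additional suppression:

    # Compare the quoted Advanced LIGO frequency noise with the 40 km requirement.
    current = 8e-5    # Hz/sqrt(Hz) at 1 kHz (quoted above)
    required = 7e-7   # Hz/sqrt(Hz) for 40 km arms (quoted above)
    print(f"additional suppression needed: ~{current / required:.0f}x")   # ~114x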
The Omicron software is a tool developed to perform a multi-resolution time-frequency analysis of data from gravitational-wave detectors: the LIGO, Virgo, and KAGRA detectors. Omicron generates spectrograms from whitened data streams, offering a visual representation of transient detector noises and gravitational-wave events. In addition, these events can be parameterized with an optimized resolution. They can be written to disk to conduct offline noise characterization and gravitational-wave event validation studies. Omicron is optimized to process, in parallel, thousands of data streams recorded by gravitational-wave detectors. The Omicron software plays an important role in vetting gravitational-wave detection candidates and the characterization of transient noise.
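Omicron's actual filters are tiled sine-Gaussians at several Q values; as a hedged sketch of the whiten-then-map step only, using scipy and made-up data:

    # Whiten a data stream against its own estimated spectrum, then form a
    # (single-resolution) spectrogram of the whitened series.
    import numpy as np
    from scipy.signal import welch, spectrogram

    rng = np.random.default_rng(4)
    fs = 4096
    data = rng.standard_normal(16 * fs)        # stand-in detector data

    freqs, psd = welch(data, fs=fs, nperseg=4 * fs)
    asd = np.sqrt(np.interp(np.fft.rfftfreq(data.size, 1 / fs), freqs, psd))
    asd[0] = asd[1]                            # avoid dividing by zero at DC
    white = np.fft.irfft(np.fft.rfft(data) / asd, n=data.size)

    # Omicron tiles many time-frequency resolutions; one fixed resolution here.
    f, t, Sxx = spectrogram(white, fs=fs, nperseg=fs // 4)
    print(Sxx.shape)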