A second generation of gravitational wave detectors will soon come online with the objective of measuring for the first time the tiny gravitational signal from the coalescence of black hole and/or neutron star binaries. In this communication, we propose a new time-frequency search method as an alternative to the matched filtering techniques usually employed to detect this signal. This method relies on a graph that encodes the time evolution of the signal and its variability by establishing links between coefficients in the multi-scale time-frequency decomposition of the data. We provide a proof of concept for this approach.
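As a toy illustration of the graph idea (our own sketch, not the authors' actual pipeline), the snippet below treats time-frequency coefficients as graph nodes, links each pixel to pixels in the next time bin within a bounded frequency jump, and extracts the connected path of maximal accumulated energy by dynamic programming; the `tf` array, the `max_jump` parameter, and the chi-squared noise map are illustrative assumptions.

```python
# Toy sketch (not the authors' actual pipeline): given a time-frequency map
# `tf` of shape (n_freq, n_time), link each pixel to pixels in the next time
# bin whose frequency index differs by at most `max_jump`, and search for the
# connected path of maximal accumulated energy by dynamic programming.
import numpy as np

def best_path_energy(tf, max_jump=2):
    n_freq, n_time = tf.shape
    # acc[f, t] = maximal energy of a path ending at pixel (f, t)
    acc = np.copy(tf)
    for t in range(1, n_time):
        for f in range(n_freq):
            lo, hi = max(0, f - max_jump), min(n_freq, f + max_jump + 1)
            acc[f, t] = tf[f, t] + acc[lo:hi, t - 1].max()
    return acc[:, -1].max()

rng = np.random.default_rng(0)
noise_map = rng.chisquare(2, size=(64, 128))   # noise-only pixel energies
print(best_path_energy(noise_map))             # path-energy detection statistic
```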
Gravitational waves are radiative solutions of space-time dynamics predicted by Einstein's theory of General Relativity. A world-wide array of large-scale and highly sensitive interferometric detectors constantly scrutinizes the geometry of the local space-time with the hope of detecting deviations that would signal an impinging gravitational wave from a remote astrophysical source. Finding the rare and weak signature of gravitational waves buried in non-stationary and non-Gaussian instrument noise is a particularly challenging problem. We give an overview of the data-analysis techniques and associated observational results obtained so far by Virgo (in Europe) and LIGO (in the US), along with the prospects offered by the upcoming advanced detectors.
Gravitational Wave (GW) burst detection algorithms typically rely on the hypothesis that the burst signal is locally stationary, that is, its frequency changes slowly with time. Under this assumption, the signal can be decomposed into a small number of wavelets with constant frequency. This justifies the use of a family of sine-Gaussian templates in the Omega pipeline, one of the algorithms used in LIGO-Virgo burst searches. However, there are plausible scenarios where the burst frequency evolves rapidly, such as in the merger phase of a binary black hole and/or neutron star coalescence. In those cases, the local stationarity of sine-Gaussians induces performance losses, due to the mismatch between the template and the actual signal. We propose an extension of the Omega pipeline based on chirplet-like templates. Chirplets incorporate an additional parameter, the chirp rate, to control the frequency variation. In this paper, we show that the Omega pipeline can easily be extended to include a chirplet template bank. We illustrate the method on a simulated data set, with a family of phenomenological binary black-hole coalescence waveforms embedded into Gaussian LIGO/Virgo-like noise. Chirplet-like templates result in an enhancement of the measured signal-to-noise ratio.
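For reference, a common chirplet parameterization (our notation; the paper's exact convention may differ) adds a chirp-rate term to the sine-Gaussian phase, so that setting the chirp rate to zero recovers the standard Omega template:

```latex
% Chirplet with Gaussian envelope of width \sigma around t_0, central
% frequency f_0, and chirp rate d controlling the linear frequency drift;
% d = 0 reduces to the sine-Gaussian used by the standard Omega templates.
\psi(t) \propto \exp\!\left(-\frac{(t-t_0)^2}{2\sigma^2}\right)
         \exp\!\left( i\left[ 2\pi f_0 (t-t_0) + \pi d\,(t-t_0)^2 \right] \right)
```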
Gravitational waves (GWs) are expected to play a crucial role in the development of multimessenger astrophysics. The combination of GW observations with other astrophysical triggers, such as those from gamma-ray and X-ray satellites, optical/radio telescopes, and neutrino detectors, allows us to decipher science that would otherwise be inaccessible. In this paper, we provide a broad review, from the multimessenger perspective, of the science reach offered by third-generation interferometric GW detectors, and by the Einstein Telescope (ET) in particular. We focus on cosmic transients and base our estimates on the results obtained by ET's predecessors GEO, LIGO, and Virgo.
The list of putative sources of gravitational waves potentially detectable by the ongoing worldwide network of large-scale interferometers has been growing continuously in recent years. For some of them, detection is made difficult by the lack of complete information about the expected signal. We concentrate on the case where the expected GW is a quasi-periodic frequency-modulated signal, i.e., a chirp. In this article, we address the question of detecting an a priori unknown GW chirp. We introduce a general chirp model and claim that it includes all physically realistic GW chirps. We produce a finite grid of template waveforms which samples the resulting set of possible chirps. If we followed the classical approach (used for the detection of inspiralling binary chirps, for instance), we would build a bank of quadrature matched filters comparing the data to each of the templates of this grid. Detection would then be achieved by thresholding the output, the maximum giving the template which best fits the data. In the present case, this exhaustive search is not tractable because of the very large number of templates in the grid. We show that the exhaustive search can be reformulated (using approximations) as a pattern search in the time-frequency plane. This motivates an approximate but feasible alternative solution which is clearly linked to the optimal one. [abridged version of the abstract]
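As a concrete illustration of the classical quadrature matched-filter bank described above (not of the paper's time-frequency alternative), a minimal sketch in white Gaussian noise could look as follows; the template construction and all parameter values are purely illustrative.

```python
# Minimal sketch of a quadrature matched-filter bank in white noise. Templates
# come in cosine/sine quadrature pairs; the statistic is maximized over phase
# and over the bank, and detection is declared by thresholding the maximum.
import numpy as np

def quadrature_snr2(x, h_cos, h_sin):
    # Squared SNR maximized over the unknown phase (unit-norm templates assumed).
    return np.dot(x, h_cos) ** 2 + np.dot(x, h_sin) ** 2

def bank_statistic(x, bank):
    # `bank` is a list of (h_cos, h_sin) pairs.
    return max(quadrature_snr2(x, hc, hs) for hc, hs in bank)

# Toy usage with a single linear-chirp template embedded in unit-variance noise.
rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 1024)
phase = 2 * np.pi * (50.0 * t + 40.0 * t ** 2)
hc, hs = np.cos(phase), np.sin(phase)
hc, hs = hc / np.linalg.norm(hc), hs / np.linalg.norm(hs)
x = rng.standard_normal(t.size) + 8.0 * hc
print(bank_statistic(x, [(hc, hs)]))
```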
We propose a monitoring indicator of the normality of the output of a gravitational wave detector. This indicator is based on the estimation of the kurtosis (i.e., the fourth-order statistical moment normalized by the variance squared) of the data selected in a sliding time window. We show how a low-cost (because recursive) implementation of this estimation is possible, and we illustrate the validity of the presented approach with a few examples using simulated random noise.
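A possible low-cost implementation in the spirit of this abstract (not necessarily the authors' exact recursion) keeps running sums of the first four powers of the data over a sliding window, so that each new sample costs O(1):

```python
# Sliding-window kurtosis with O(1) cost per sample, obtained by updating
# running sums of x, x^2, x^3, x^4 as samples enter and leave the window.
from collections import deque
import numpy as np

class SlidingKurtosis:
    def __init__(self, window):
        self.window = window
        self.buf = deque()
        self.s = np.zeros(4)  # running sums of x, x^2, x^3, x^4

    def update(self, x):
        powers = np.array([x, x**2, x**3, x**4])
        self.buf.append(powers)
        self.s += powers
        if len(self.buf) > self.window:
            self.s -= self.buf.popleft()
        n = len(self.buf)
        m1, m2, m3, m4 = self.s / n                      # raw moments
        mu2 = m2 - m1**2                                 # variance
        mu4 = m4 - 4*m1*m3 + 6*m1**2*m2 - 3*m1**4        # 4th central moment
        return mu4 / mu2**2 if mu2 > 0 else np.nan

rng = np.random.default_rng(2)
k = SlidingKurtosis(window=512)
for sample in rng.standard_normal(2048):
    kurt = k.update(sample)
print(kurt)  # close to 3 for stationary Gaussian noise
```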
We investigate the class of quadratic detectors (i.e., detection statistics that are bilinear functions of the data) for the detection of poorly modeled gravitational transients of short duration. We point out that all such detection methods are equivalent to passing the signal through a filter bank and linearly combining the output energies. Existing methods for the choice of the filter bank and of the weight parameters rely essentially on two ideas: (i) the use of the likelihood function based on a (possibly non-informative) statistical model of the signal and the noise, and (ii) the use of Monte-Carlo simulations to tune parametric filters for the best detection probability at a fixed false-alarm rate. We propose a third approach in which the filter bank is learned from a set of training data. By-products of this viewpoint are that, contrary to previous methods, (i) no explicit description of the probability density function of the data when the signal is present is required, and (ii) the filters we use are non-parametric. The learning procedure can be described as a two-step process: first, estimate the mean and covariance of the signal from the training data; second, find the filters which maximize a contrast criterion, referred to as the deflection, between the noise-only and signal-plus-noise hypotheses. The deflection is homogeneous to a signal-to-noise ratio and is computed from the quantities estimated in the first step. We apply this original method to the problem of detecting supernova core collapses, using the catalog of waveforms provided recently by Dimmelmeier et al. to train our algorithm. We expect such a detector to perform better on this particular problem, provided that the reference signals are reliable.
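For reference, the deflection of a detection statistic T is commonly defined as the contrast below; the abstract does not spell out its exact normalization, so this is the standard textbook form, with the means and variance under the two hypotheses estimated from the training data.

```latex
% Deflection (contrast) criterion for a detection statistic T between the
% noise-only hypothesis H_0 and the signal-plus-noise hypothesis H_1.
D(T) \;=\; \frac{\mathrm{E}[\,T \mid \mathcal{H}_1\,] - \mathrm{E}[\,T \mid \mathcal{H}_0\,]}
                {\sqrt{\mathrm{Var}[\,T \mid \mathcal{H}_0\,]}}
```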
It is known, from the experience gained with gravitational wave detector prototypes, that the interferometric output signal will be corrupted by a significant amount of non-Gaussian noise, a large part of which is essentially composed of long-term sinusoids with slowly varying envelopes (such as violin resonances in the suspensions, or main power harmonics) and short-term ringdown noise (which may emanate from servo control systems, electronics in a non-linear state, etc.). Since non-Gaussian noise components make the detection and estimation of the gravitational wave signature more difficult, a denoising algorithm based on adaptive filtering techniques (LMS methods) is proposed to separate and extract them from the stationary and Gaussian background noise. The strength of the method is that it does not require any precise model of the observed data: the signals are distinguished on the basis of their autocorrelation time. We believe that the robustness and simplicity of this method make it useful for data preparation and for the understanding of the first interferometric data. We present the detailed structure of the algorithm and its application to both simulated data and real data from the LIGO 40-meter prototype.
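A minimal LMS-based sketch in this spirit (a textbook adaptive line enhancer, not the exact algorithm of the paper) predicts each sample from samples older than a fixed delay: components with a long autocorrelation time are predictable and end up in the filter output, while the broadband background remains in the residual. All parameter values below are illustrative.

```python
# Textbook LMS adaptive line enhancer: predict x[n] from samples older than
# `delay`. Long-coherence components (sinusoids, slow ringdowns) are captured
# by the prediction; the residual keeps the broadband, short-correlation part.
import numpy as np

def lms_line_enhancer(x, n_taps=64, delay=16, mu=1e-3):
    w = np.zeros(n_taps)
    predicted = np.zeros_like(x)
    for n in range(n_taps + delay, len(x)):
        u = x[n - delay - n_taps:n - delay][::-1]   # delayed reference vector
        y = w @ u                                   # prediction of x[n]
        e = x[n] - y                                # prediction error
        w += 2 * mu * e * u                         # LMS weight update
        predicted[n] = y
    return predicted, x - predicted                 # (narrowband, residual)

# Toy usage: a 50 Hz "power line" sinusoid buried in white noise.
rng = np.random.default_rng(3)
t = np.arange(0, 4.0, 1.0 / 1024)
data = np.sin(2 * np.pi * 50 * t) + 0.5 * rng.standard_normal(t.size)
narrowband, residual = lms_line_enhancer(data)
```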
We propose an adaptive denoising scheme for poorly modeled non-Gaussian features in the gravitational wave interferometric data. Preliminary tests on real data show encouraging results.