
Coherent network analysis technique for discriminating gravitational-wave bursts from instrumental noise

Added by Patrick Sutton
Publication date: 2006
Field: Physics
Language: English





Existing coherent network analysis techniques for detecting gravitational-wave bursts simultaneously test data from multiple observatories for consistency with the expected properties of the signals. These techniques assume the output of the detector network to be the sum of a stationary Gaussian noise process and a gravitational-wave signal, and they may fail in the presence of transient non-stationarities, which are common in real detectors. In order to address this problem we introduce a consistency test that is robust against noise non-stationarities and allows one to distinguish between gravitational-wave bursts and noise transients. This technique does not require any a priori knowledge of the putative burst waveform.
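To make the idea concrete, here is a minimal sketch (my own illustration, assuming two co-aligned detectors and whitened data; it is not the authors' implementation) of the kind of consistency test described above. For co-aligned detectors a gravitational wave produces identical responses, so subtracting the two data streams cancels the signal; a glitch in one detector does not cancel, leaving its energy in the null combination.

```python
import numpy as np

def null_energy_ratio(h1, h2):
    """h1, h2: whitened strain segments from two co-aligned detectors.

    Returns E_null / E_incoherent. A coherent gravitational wave cancels
    in the null stream, driving the ratio well below 1; an instrumental
    glitch in a single detector leaves the ratio near 1.
    """
    null = (h1 - h2) / np.sqrt(2.0)                      # GW response cancels here
    e_null = np.sum(null ** 2)                           # energy surviving cancellation
    e_inc = 0.5 * (np.sum(h1 ** 2) + np.sum(h2 ** 2))    # total excess energy
    return e_null / e_inc
```

Comparing the null energy to the incoherent energy, rather than to an assumed stationary noise level, is one way such a test can remain meaningful when the noise is non-stationary.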



Related research

Searches for gravitational wave bursts that are triggered by the observation of astronomical events require a different mode of analysis than all-sky, blind searches. First, much more prior information is usually available in a triggered search, which can and should be used in the analysis. Second, since the data volume is usually small in a triggered search, it is also possible to use computationally more expensive algorithms for tasks such as data pre-processing that would consume significant computing resources in a high data-volume un-triggered search. From the statistical point of view, the reduction in the parameter-space search volume leads to higher sensitivity than an un-triggered search. We describe here a data analysis pipeline for triggered searches, called RIDGE, and present preliminary results for simulated noise and signals.
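As a hedged illustration of the statistical point above (toy numbers of my own, not from the paper): for a fixed total false-alarm probability, the per-trial threshold on a detection statistic grows with the number of independent trials searched, so the reduced sky area and time window of a triggered search lower the threshold and raise sensitivity.

```python
from scipy.stats import chi2

def threshold(p_fa_total, n_trials, dof=2):
    """Per-trial threshold on a chi-squared detection statistic (dof
    degrees of freedom) giving a total false-alarm probability
    p_fa_total over n_trials independent trials."""
    p_per_trial = 1.0 - (1.0 - p_fa_total) ** (1.0 / n_trials)
    return chi2.isf(p_per_trial, dof)

print(threshold(0.01, 1e6))   # untriggered all-sky search: many trials
print(threshold(0.01, 1e2))   # triggered search: fewer trials, lower threshold
```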
In the coming years gravitational-wave detectors will undergo a series of improvements that will increase their detection rate by about an order of magnitude. Routine detections of gravitational-wave signals will enable novel astrophysical and fundamental-theory studies, while simultaneously leading to an increase in the number of detections temporally overlapping with instrumentally- or environmentally-induced transients in the detectors (glitches), often of unknown origin. Indeed, this was the case for the very first detection by the LIGO and Virgo detectors of a gravitational-wave signal consistent with a binary neutron star coalescence, GW170817. A loud glitch in the LIGO-Livingston detector, about one second before the merger, hampered coincident detection (which was initially achieved solely with LIGO-Hanford data). Moreover, accurate source characterization depends on specific assumptions about the behavior of the detector noise that are rendered invalid by the presence of glitches. In this paper, we present the various techniques employed for the initial mitigation of the glitch to perform source characterization of GW170817, and we study the advantages and disadvantages of each mitigation method. We show that, despite the presence of instrumental noise transients louder than the one affecting GW170817, we are still able to produce unbiased measurements of the intrinsic parameters from simulated injections with properties similar to GW170817.
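One mitigation strategy in this family is gating: smoothly zeroing the data around the glitch with an inverted Tukey window so that the excised segment does not introduce spectral artifacts. The sketch below is a generic illustration under assumed parameters; the window placement and durations are hypothetical, not the values used for GW170817, and the paper also studies other methods such as model-based glitch subtraction.

```python
import numpy as np
from scipy.signal.windows import tukey

def gate(strain, sample_rate, t_glitch, half_width=0.5, taper=0.25):
    """Zero out `strain` within half_width seconds of t_glitch, rolling
    smoothly back to unity over `taper` seconds on each side."""
    n0 = int((t_glitch - half_width - taper) * sample_rate)
    n1 = int((t_glitch + half_width + taper) * sample_rate)
    alpha = taper / (half_width + taper)     # tapered fraction of the window
    window = 1.0 - tukey(n1 - n0, alpha)     # inverted Tukey: 0 in the middle
    out = strain.copy()
    out[n0:n1] *= window
    return out
```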
LIGO and Virgo recently completed searches for gravitational waves at their initial target sensitivities, and soon Advanced LIGO and Advanced Virgo will commence observations with even better capabilities. In the search for short-duration signals, such as coalescing compact binary inspirals or burst events, noise transients can be problematic. Interferometric gravitational-wave detectors are highly complex instruments, and, based on past experience, the data often contain a large number of noise transients that are not easily distinguishable from possible gravitational-wave signals. In order to perform a sensitive search for short-duration gravitational-wave signals it is important to identify these noise artifacts and to veto them. Here we describe such a veto, the bilinear-coupling veto, which makes use of an empirical model of the coupling of instrumental noise to the output strain channel of the interferometric gravitational-wave detector. In this method, we check whether the data from the output strain channel at the time of an apparent signal are consistent with the data from a bilinear combination of auxiliary channels. We discuss the results of applying this veto to recent LIGO data, and its possible utility with data from Advanced LIGO and Advanced Virgo.
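The following sketch conveys the flavor of such a consistency check. It is my own illustration, not the published veto's implementation: the channel roles, the simple product coupling, and the correlation statistic are all assumptions made for clarity.

```python
import numpy as np

def bilinear_veto(strain, aux_slow, aux_fast, threshold=0.8):
    """Veto a candidate if the strain around it correlates strongly with
    a pseudo-channel built as the product of two auxiliary channels
    (e.g. a slow angular-control signal modulating a fast seismic one)."""
    pseudo = aux_slow * aux_fast                          # empirical bilinear coupling
    pseudo = (pseudo - pseudo.mean()) / pseudo.std()
    s = (strain - strain.mean()) / strain.std()
    rho = abs(np.dot(s, pseudo)) / len(s)                 # Pearson correlation
    return rho > threshold                                # True -> veto the event
```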
coherent WaveBurst (cWB) is a highly configurable pipeline designed to detect a broad range of gravitational-wave (GW) transients in the data of the worldwide network of GW detectors. The algorithmic core of cWB is a time-frequency analysis with Wilson-Daubechies-Meyer wavelets aimed at the identification of GW events without prior knowledge of the signal waveform. cWB has been in active development since 2003, and it has been used to analyze all scientific data collected by the LIGO-Virgo detectors ever since. On September 14, 2015, the cWB low-latency search detected the first gravitational-wave event, GW150914, a merger of two black holes. In 2019, a public open-source version of cWB was released under the GPLv3 license.
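The Wilson-Daubechies-Meyer transform is specific to cWB and not available in standard libraries, so the sketch below substitutes an ordinary short-time Fourier spectrogram to illustrate the generic first step of such a pipeline: selecting loud time-frequency pixels without any assumption about the waveform. Treat it as a schematic, not as cWB's algorithm.

```python
import numpy as np
from scipy.signal import spectrogram

def excess_power_pixels(whitened, fs, percentile=99.9):
    """Return the time-frequency grid and a boolean mask of 'loud'
    pixels in a whitened data stream; clusters of True pixels form
    candidate triggers to be tested for coherence across detectors."""
    f, t, S = spectrogram(whitened, fs=fs, nperseg=int(fs) // 8)
    mask = S > np.percentile(S, percentile)   # waveform-agnostic selection
    return f, t, mask
```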
The detection and estimation of gravitational wave burst signals, with a priori unknown polarization waveforms, requires the use of data from a network of detectors. For determining how the data from such a network should be combined, approaches based on the maximum likelihood principle have proven useful. The most straightforward among these uses the global maximum of the likelihood over the space of all waveforms as both the detection statistic and the signal estimator. However, in the case of burst signals, a physically counterintuitive situation results: for two aligned detectors the statistic includes the cross-correlation of the detector outputs, as expected, but this term disappears even for an infinitesimal misalignment. This two-detector paradox arises from the inclusion of improbable waveforms in the solution space of the maximization. Such waveforms produce widely different responses in detectors that are closely aligned. We show that by penalizing waveforms that exhibit large signal-to-noise ratio (SNR) variability, as the corresponding source is moved on the sky, a physically motivated restriction is obtained that (i) resolves the two-detector paradox and (ii) leads to a better-performing statistic than the global maximum of the likelihood. Waveforms with high SNR variability turn out to be precisely the ones that are improbable in the sense mentioned above. The coherent network analysis method thus obtained can be applied to any network, irrespective of the number or the mutual alignment of detectors.
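A compact way to see the paradox is the standard likelihood construction (the notation here is mine and schematic; the penalized statistic in the paper is more elaborate):

```latex
% Detector outputs, k = 1, 2, with antenna responses F_k^{+,\times}:
%   x_k = F_k^{+} h_+ + F_k^{\times} h_\times + n_k .
% Maximizing the log-likelihood over unconstrained (h_+, h_\times) gives,
% for exactly co-aligned detectors (F_1 = F_2),
\[
  \Lambda_{\mathrm{aligned}} \;\propto\; \lVert x_1 + x_2 \rVert^2
  \;=\; \lVert x_1 \rVert^2 + \lVert x_2 \rVert^2
        + 2\,\langle x_1, x_2 \rangle ,
\]
% which retains the cross-correlation term, whereas for any nonzero
% misalignment the response matrix becomes invertible and the maximum
% jumps to
\[
  \Lambda_{\mathrm{misaligned}} \;\propto\;
  \lVert x_1 \rVert^2 + \lVert x_2 \rVert^2 ,
\]
% with no cross term: the two-detector paradox. Penalizing waveforms
% with large SNR variability removes the improbable solutions that
% cause this discontinuity.
```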