We describe a multivariate classifier for candidate events in a templated search for gravitational-wave (GW) inspiral signals from neutron-star--black-hole (NS-BH) binaries, in data from ground-based detectors where sensitivity is limited by non-Gaussian noise transients. The standard signal-to-noise ratio (SNR) and chi-squared test for inspiral searches use only properties of a single matched filter at the time of an event; instead, we propose a classifier using features derived from a bank of inspiral templates around the time of each event, and also from a search using approximate sine-Gaussian templates. The classifier thus extracts additional information from strain data to discriminate inspiral signals from noise transients. We evaluate a Random Forest classifier on a set of single-detector events obtained from realistic simulated advanced LIGO data, using simulated NS-BH signals added to the data. The new classifier detects a factor of 1.5 -- 2 more signals at low false positive rates as compared to the standard re-weighted SNR statistic, and does not require the chi-squared test to be computed. Conversely, if only the SNR and chi-squared values of single-detector events are available, Random Forest classification performs nearly identically to the re-weighted SNR.
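The gain from a multivariate statistic over a single ranking statistic can be illustrated with a minimal sketch using scikit-learn's RandomForestClassifier. All feature names and distributions below are synthetic stand-ins for per-event quantities (SNR, chi-squared, template-bank summaries), not the actual features of the search:

```python
# Toy Random Forest event classifier: signals vs. noise transients.
# Feature distributions are illustrative assumptions, not real data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_curve

rng = np.random.default_rng(42)
n = 5000

def toy_features(is_signal, n):
    """Synthetic per-event features: matched-filter SNR, chi-squared,
    and three stand-ins for template-bank summary statistics."""
    if is_signal:
        snr = rng.normal(9.0, 2.0, n)   # injections: louder on average
        chi2 = rng.gamma(2.0, 1.0, n)   # and well matched (low chi-squared)
    else:
        snr = rng.normal(7.0, 2.0, n)   # glitches: loud, but
        chi2 = rng.gamma(6.0, 2.0, n)   # poorly matched (high chi-squared)
    bank = rng.normal(float(is_signal), 1.0, (n, 3))
    return np.column_stack([snr, chi2, bank])

X = np.vstack([toy_features(True, n), toy_features(False, n)])
y = np.concatenate([np.ones(n), np.zeros(n)])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_tr, y_tr)

# Rank held-out events by the forest's signal probability and read off
# the detection rate at a fixed false-positive rate from the ROC curve.
scores = clf.predict_proba(X_te)[:, 1]
fpr, tpr, _ = roc_curve(y_te, scores)
print("TPR at FPR ~ 1%:", tpr[np.searchsorted(fpr, 0.01)])
```

In the actual search the features would be computed from the inspiral template bank and the sine-Gaussian templates around each event, and the comparison against the re-weighted SNR would be made at matched false-positive rates.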
A major challenge of any search for gravitational waves is to distinguish true astrophysical signals from those of terrestrial origin. Gravitational-wave experiments therefore make use of multiple detectors, considering only those signals which appear in coincidence in two or more instruments. It is unclear, however, how to interpret loud gravitational-wave candidates observed when only one detector is operational. In this paper, we demonstrate that the observed rate of binary black hole mergers can be leveraged in order to make confident detections of gravitational-wave signals with one detector alone. We quantify detection confidences in terms of the probability $P(S)$ that a signal candidate is of astrophysical origin. We find that, at current levels of instrumental sensitivity, loud signal candidates observed with a single Advanced LIGO detector can be assigned $P(S) \gtrsim 0.4$. In the future, Advanced LIGO may be able to observe single-detector events with confidences exceeding $P(S) \sim 90\%$.
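One standard way to construct such a probability (a sketch of the usual rate-weighted formulation; the paper's exact construction may differ) is

\[
P(S \mid x) \;=\; \frac{R_s\, f(x)}{R_s\, f(x) + R_n\, b(x)},
\]

where $x$ is the candidate's detection statistic, $f$ and $b$ are its probability densities under the signal and noise hypotheses, and $R_s$, $R_n$ are the astrophysical and noise event rates. A well-measured binary black hole merger rate pins down $R_s$, and hence the achievable $P(S)$ for a loud single-detector candidate.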
The discovery of the astrophysical events GW150914 and GW151226 has experimentally confirmed the existence of gravitational waves (GW) and demonstrated the existence of stellar-mass binary black hole systems. This finding marks the beginning of a new era that will reveal unexpected features of our universe. This work presents a basic introduction to the fundamental theory of GW emitted by inspiral binary systems and describes the scientific and technological efforts developed to measure these waves using the interferometer-based detector LIGO. Subsequently, the work presents a comprehensive data-analysis methodology based on the matched-filter algorithm, which aims to recover GW signals emitted by astrophysical inspiral binary systems. This algorithm was evaluated on freely available LIGO data containing injected GW waveforms. Experiments performed to assess detection accuracy recovered 85% of the injected GW signals.
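A minimal sketch of the core matched-filter operation, assuming white noise and a toy chirp in place of a physical inspiral waveform (real searches first whiten the data by the detector noise spectrum):

```python
# Frequency-domain matched filter over all time shifts of a template.
# The "chirp" below is an illustrative stand-in for an inspiral waveform.
import numpy as np

fs = 4096                      # sample rate (Hz)
t = np.arange(0, 4, 1 / fs)    # 4 s of data

# Toy chirp template with increasing frequency, normalized to unit norm.
template = np.sin(2 * np.pi * (30 * t + 20 * t**2))
template /= np.linalg.norm(template)

# Data: unit-variance white noise plus a weak, time-shifted injection.
rng = np.random.default_rng(1)
data = rng.normal(0, 1, t.size)
shift = 5000
data += 6 * np.roll(template, shift)

# Circular cross-correlation via FFTs; with unit-norm template and
# unit-variance noise, the output is directly the matched-filter SNR.
snr = np.fft.irfft(np.fft.rfft(data) * np.conj(np.fft.rfft(template)), t.size)
peak = np.argmax(np.abs(snr))
print(f"peak |SNR| = {abs(snr[peak]):.1f} at sample {peak} (injected at {shift})")
```

The peak of the filter output recovers the injection time, and candidate events are kept when the peak SNR crosses a threshold.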
Construction of the large-scale cryogenic gravitational-wave telescope KAGRA has been completed and the four sapphire test masses have been installed in cryostat vacuum chambers. It recently turned out that the sapphire substrate used for one of the input test masses shows a characteristic structure in its transmission map due to non-uniformity of the crystal. We performed an interferometer simulation to assess the influence of this non-uniformity using measured transmission/reflection maps.
Future space-borne gravitational-wave detectors will require a precise definition of calibration signals to ensure the achievement of their design sensitivity. The careful design of the test signals plays a key role in the correct understanding and characterisation of these instruments. In that sense, methods achieving optimal experiment designs must be considered complementary to the parameter-estimation methods used to determine the parameters describing the system. The relevance of experiment design is particularly significant for the LISA Pathfinder mission, which will spend most of its operation time performing experiments to characterise key technologies for future space-borne gravitational-wave observatories. Here we propose a framework to derive the optimal signals, in terms of minimum parameter uncertainty, to be injected into these instruments during the calibration phase. We compare our results with an alternative numerical algorithm which achieves an optimal input signal by iteratively improving an initial guess. We show that both approaches agree when applied to the LISA Pathfinder case.
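The underlying idea can be sketched with a toy one-parameter example: pick the injection frequency that maximizes the Fisher information of the measurement, and hence minimizes the Cramér-Rao bound on the parameter. Everything below (the single-pole system, noise level, and frequency grid) is an illustrative assumption, not the LISA Pathfinder model:

```python
# Fisher-information-based input design for a toy system: a single-pole
# low-pass response with unknown corner frequency theta.
import numpy as np

theta = 1e-3          # true corner frequency (Hz), nominal value
sigma = 1e-2          # measurement noise per data point

def amplitude(f, theta):
    """Amplitude response of the toy system at injection frequency f."""
    return 1.0 / np.sqrt(1.0 + (f / theta) ** 2)

def fisher(f, theta, sigma, eps=1e-9):
    """Scalar Fisher information for theta from one measurement at f,
    using a central-difference derivative of the response."""
    dH = (amplitude(f, theta + eps) - amplitude(f, theta - eps)) / (2 * eps)
    return (dH / sigma) ** 2

freqs = np.logspace(-5, -1, 400)   # candidate injection frequencies (Hz)
info = np.array([fisher(f, theta, sigma) for f in freqs])
best = freqs[np.argmax(info)]
print(f"optimal injection frequency ~ {best:.2e} Hz")
print(f"Cramer-Rao bound on theta:    {1 / np.sqrt(info.max()):.2e}")
```

For this toy response the information peaks near $f = \sqrt{2}\,\theta$; the realistic analogue replaces the scalar by a Fisher matrix over all system parameters and optimizes a scalar figure of merit such as its determinant.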
We carried out a computer simulation of a large gravitational-wave (GW) interferometer using the specifications of the LIGO instruments. We find that if, in addition to the carrier, a single sideband offset from the carrier by the FSR frequency (the free spectral range of the arm cavities) is injected, that sideband is as sensitive to GW signals as the carrier. The amplitude of the FSR sideband signal in the DC region is generally much less subject to noise than the carrier, which makes possible the detection of periodic signals with frequencies well below the so-called seismic wall.
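For reference, the free spectral range of a Fabry-Perot cavity of length $L$ is $c/2L$, so for the 4 km LIGO arm cavities

\[
f_{\mathrm{fsr}} = \frac{c}{2L} = \frac{3\times10^{8}\ \mathrm{m/s}}{2 \times 4\,000\ \mathrm{m}} \approx 37.5\ \mathrm{kHz}.
\]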