The analysis of gravitational wave data involves many model selection problems. The most important example is the detection problem: deciding whether the data are consistent with instrument noise alone, or with instrument noise plus a gravitational wave signal. The analysis of data from ground-based gravitational wave detectors is mostly conducted using classical statistics, and methods such as the Neyman-Pearson criterion are used for model selection. Future space-based detectors, such as the Laser Interferometer Space Antenna (LISA), are expected to produce rich data streams containing the signals of many millions of sources. Determining the number of resolvable sources, and the most appropriate description of each source, poses a challenging model selection problem that may best be addressed in a Bayesian framework. An important class of LISA sources are the millions of low-mass binary systems within our own galaxy, tens of thousands of which will be detectable. Not only is the number of sources unknown, but so is the number of parameters required to model the waveforms. For example, a significant subset of the resolvable galactic binaries will exhibit orbital frequency evolution, while a smaller number will have measurable eccentricity. In the Bayesian approach to model selection one needs to compute the Bayes factor between competing models. Here we explore various methods for computing Bayes factors in the context of determining which galactic binaries have measurable frequency evolution. The methods explored include a Reversible Jump Markov Chain Monte Carlo (RJMCMC) algorithm, Savage-Dickey density ratios, the Schwarz Bayesian Information Criterion (BIC), and the Laplace approximation to the model evidence. We find good agreement among all of the approaches.
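As a rough illustration of the Savage-Dickey density ratio for a nested comparison such as "no frequency evolution" versus "frequency evolution", the sketch below estimates the Bayes factor from posterior samples of the frequency derivative under the larger model. The variable names, the uniform prior, and the mock samples are assumptions for the example, not the analysis described above.

```python
# Hedged sketch: Savage-Dickey density ratio for a nested model comparison,
# assuming posterior samples of the frequency derivative "fdot" from an MCMC
# run of the larger model and a uniform prior on fdot. Names are illustrative.
import numpy as np
from scipy.stats import gaussian_kde

def savage_dickey_bayes_factor(fdot_samples, prior_lo, prior_hi, fdot_null=0.0):
    """Bayes factor (no-evolution / evolution) via the Savage-Dickey ratio.

    B_01 = p(fdot = fdot_null | data, M1) / p(fdot = fdot_null | M1),
    valid when model M0 is M1 with fdot fixed to fdot_null.
    """
    posterior_density = gaussian_kde(fdot_samples)(fdot_null)[0]
    prior_density = 1.0 / (prior_hi - prior_lo)   # uniform prior on fdot
    return posterior_density / prior_density

# Toy usage with synthetic "posterior" samples centred away from zero:
rng = np.random.default_rng(0)
samples = rng.normal(loc=3e-17, scale=1e-17, size=5000)   # mock fdot posterior
print(savage_dickey_bayes_factor(samples, -1e-16, 1e-16))
```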
Bayesian model selection provides a powerful and mathematically transparent framework for hypothesis testing, such as detection tests of gravitational waves emitted during the coalescence of binary systems using ground-based laser interferometers. Although its implementation is computationally intensive, we have developed an efficient probabilistic algorithm based on a technique known as nested sampling that makes Bayesian model selection applicable to follow-up studies of candidate signals produced by ongoing searches for inspiralling compact binaries. We discuss the performance of this approach, in terms of false alarm rate and detection probability, for restricted second post-Newtonian inspiral waveforms from non-spinning compact objects in binary systems. The results confirm that this approach is a viable tool for detection tests in current searches for gravitational wave signals.
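The following is a minimal sketch of a nested sampling evidence calculation on a toy one-dimensional problem (uniform prior, Gaussian likelihood). It is meant only to show how the evidence integral is accumulated from likelihood-ordered prior shrinkage, not the algorithm as implemented in the follow-up pipeline; all quantities are illustrative.

```python
# Hedged sketch: toy nested sampling on a unit-interval prior with a Gaussian
# likelihood, so the true evidence is ~1 (log evidence ~0). Not the production code.
import numpy as np

rng = np.random.default_rng(1)

def log_likelihood(x, mu=0.3, sigma=0.1):
    return -0.5 * ((x - mu) / sigma) ** 2 - 0.5 * np.log(2 * np.pi * sigma**2)

def nested_sampling(n_live=200, n_iter=1200):
    live = rng.uniform(0.0, 1.0, n_live)        # live points drawn from U(0, 1)
    live_logl = log_likelihood(live)
    log_z, log_x = -np.inf, 0.0                 # running log-evidence, log prior volume
    for i in range(n_iter):
        worst = np.argmin(live_logl)
        log_x_new = -(i + 1) / n_live           # expected shrinkage of the prior volume
        log_w = np.log(np.exp(log_x) - np.exp(log_x_new))
        log_z = np.logaddexp(log_z, live_logl[worst] + log_w)
        # replace the worst point by a prior draw with higher likelihood (rejection sampling)
        while True:
            x_new = rng.uniform(0.0, 1.0)
            if log_likelihood(x_new) > live_logl[worst]:
                break
        live[worst], live_logl[worst] = x_new, log_likelihood(x_new)
        log_x = log_x_new
    # contribution of the remaining live points
    log_z = np.logaddexp(log_z, np.log(np.mean(np.exp(live_logl))) + log_x)
    return log_z

print("log evidence ~", nested_sampling())      # close to 0 for this toy problem
```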
Several km-scale gravitational-wave detectors have been constructed worldwide. These instruments combine a number of advanced technologies to push the limits of precision length measurement. The core devices are laser interferometers of a new kind: developed from the classical Michelson topology, these interferometers integrate additional optical elements, which significantly change the properties of the optical system. Much of the design and analysis of these laser interferometers can be performed using well-known classical optical techniques; however, the complex optical layouts provide a new challenge. In this review we give a textbook-style introduction to the optical science required for the understanding of modern gravitational wave detectors, as well as other high-precision laser interferometers. In addition, we provide a number of examples for freely available interferometer simulation software and encourage the reader to use these examples to gain hands-on experience with the discussed optical methods.
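As a hands-on example of the classical optics the review starts from (and deliberately not the simulation software it refers to), the snippet below evaluates the textbook output power of an ideal, lossless Michelson interferometer as a function of the differential arm length; the laser wavelength and input power are placeholder values.

```python
# Hedged sketch: antisymmetric-port power of an ideal Michelson interferometer,
# P_out = P_in * sin^2(k * dL), with dL the differential arm length. Plain NumPy,
# not the interferometer simulation software mentioned in the review.
import numpy as np

wavelength = 1064e-9                         # Nd:YAG laser wavelength [m]
k = 2 * np.pi / wavelength
delta_L = np.linspace(0, wavelength, 200)    # differential arm length [m]

P_in = 1.0                                   # input power [W], placeholder
P_out = P_in * np.sin(k * delta_L) ** 2      # one full fringe every lambda/2

print("dark-fringe power:", P_out.min(), " bright-fringe power:", P_out.max())
```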
We study generic tests of strong-field General Relativity using gravitational waves emitted during the inspiral of compact binaries. Previous studies have considered simple extensions to the standard post-Newtonian waveforms that differ by a single term in the phase. Here we improve on these studies by (i) increasing the realism of injections and (ii) determining the optimal waveform families for detecting and characterizing such signals. We construct waveforms that deviate from those in General Relativity through a series of post-Newtonian terms, and find that these higher-order terms can affect our ability to test General Relativity, in some cases by making it easier to detect a deviation, and in some cases by making it more difficult. We find that simple single-phase post-Einsteinian waveforms are sufficient for detecting deviations from General Relativity, and there is little to be gained from using more complicated models with multiple phase terms. The results found here will help guide future attempts to test General Relativity with advanced ground-based detectors.
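For concreteness, a single-term phase deviation of the kind described above can be written, in the standard parameterized post-Einsteinian convention (the exact parameterization used in the study may differ), as

$$\tilde{h}(f) = \tilde{h}_{\mathrm{GR}}(f)\, e^{i \beta u^{b}}, \qquad u \equiv (\pi \mathcal{M} f)^{1/3},$$

where $\mathcal{M}$ is the chirp mass and $(\beta, b)$ set the magnitude and effective post-Newtonian order of the departure from General Relativity; multi-term models add a series $\sum_k \beta_k u^{b_k}$ to the phase.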
I will review the most recent and interesting results from gravitational wave detection experiments, concentrating on those from the LIGO Scientific Collaboration (LSC). I will outline the methodologies used in the searches, explain what can be said in the case of a null result, and describe what quantities may be constrained. I will compare these results with prior expectations and discuss their significance. Along the way I will outline the prospects for future improvements.
The main goal of the LISA Pathfinder (LPF) mission is to fully characterize the acceleration noise models and to test key technologies for future space-based gravitational-wave observatories similar to the eLISA concept. The data analysis team has developed complex three-dimensional models of the LISA Technology Package (LTP) experiment on-board LPF. These models are used for simulations, but more importantly, they will be used for parameter estimation purposes during flight operations. One of the tasks of the data analysis team is to identify the physical effects that contribute significantly to the properties of the instrument noise. A way of approaching this problem is to recover the essential parameters of an LTP model that fits the data. Thus, we want to define the simplest model that efficiently explains the observations. To do so, adopting a Bayesian framework, one has to estimate the so-called Bayes factor between two competing models. In our analysis, we use three different methods to estimate it: the Reversible Jump Markov Chain Monte Carlo method, the Schwarz criterion, and the Laplace approximation. They are applied to simulated LPF experiments where the most probable LTP model that explains the observations is recovered. The same type of analysis presented in this paper is expected to be followed during flight operations. Moreover, we explore how the output of the aforementioned methods correlates with the design of the experiment.
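Two of the evidence approximations named above, the Schwarz criterion (BIC) and the Laplace approximation, can be sketched generically as follows; the inputs (maximised log-likelihood, parameter count, posterior covariance) are placeholders, not LTP pipeline quantities.

```python
# Hedged sketch: approximate log evidences for model comparison. The numbers in
# the toy usage are invented solely to illustrate forming a log Bayes factor.
import numpy as np

def log_evidence_bic(max_log_like, n_params, n_data):
    """Schwarz criterion: log Z ~ log L_max - (k/2) ln N."""
    return max_log_like - 0.5 * n_params * np.log(n_data)

def log_evidence_laplace(log_post_at_map, covariance):
    """Laplace approximation: Gaussian expansion of the posterior at its peak.

    log Z ~ ln[L(theta_MAP) pi(theta_MAP)] + (d/2) ln(2 pi) + (1/2) ln det(C),
    with C the posterior covariance (inverse Hessian of -ln posterior).
    """
    d = covariance.shape[0]
    _, logdet = np.linalg.slogdet(covariance)
    return log_post_at_map + 0.5 * d * np.log(2 * np.pi) + 0.5 * logdet

# Toy comparison of two competing models: the log Bayes factor is the
# difference of the approximate log evidences.
logB = log_evidence_bic(-1200.0, 5, 10000) - log_evidence_bic(-1203.0, 3, 10000)
print("approximate log Bayes factor (model 1 vs model 2):", logB)
```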