The current gravitational wave detectors have identified a surprising population of heavy stellar-mass black holes, and an even larger population of coalescing neutron stars. The first observations have led to many dramatic discoveries and the confirmation of general relativity in very strong gravitational fields. The future of gravitational wave astronomy looks bright, especially if additional detectors with greater sensitivity, broader bandwidth, and better global coverage can be implemented. The first discoveries add impetus to gravitational wave detectors designed to operate in the nHz, mHz and kHz frequency bands. This paper reviews the century-long struggle that led to the recent discoveries, and reports on designs and possibilities for future detectors. The benefits of future detectors in the Asian region are discussed, including analysis of the benefits of a detector located in Australia.
Efficient parameter estimation is critical for gravitational-wave astronomy. In the case of compact binary coalescence, the high-dimensional parameter space demands efficient sampling techniques, such as Markov chain Monte Carlo (MCMC). A number of degeneracies effectively reduce the dimensionality of the parameter space and, when known, can make sampling algorithms more efficient through problem-specific improvements. In this paper we present an analytical description of a degeneracy involving the extrinsic parameters of a compact binary coalescence gravitational-wave signal when data from a three-detector network (such as Advanced LIGO/Virgo) are available. We use this new formula to construct a jump proposal, a mechanism by which a generic sampler can take advantage of the degeneracy. We show the resulting gain in efficiency for an MCMC sampler in the analysis of the gravitational-wave signal from a compact binary coalescence.
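The general idea of a degeneracy-aware jump proposal can be illustrated with a minimal sketch. Everything here is an assumption for illustration: a toy correlated 2-D Gaussian stands in for the actual likelihood, the direction `(1, 1)` stands in for the analytical degeneracy direction, and the sampler is a plain Metropolis-Hastings loop rather than the paper's production sampler.

```python
import numpy as np

# Toy target: a strongly correlated 2-D Gaussian standing in for a
# degeneracy between two extrinsic parameters (illustrative only).
rng = np.random.default_rng(0)
cov = np.array([[1.0, 0.99], [0.99, 1.0]])
cov_inv = np.linalg.inv(cov)

def log_post(x):
    """Unnormalized log-posterior of the toy target."""
    return -0.5 * x @ cov_inv @ x

def degeneracy_jump(x, scale=1.0):
    """Propose a move along the (assumed known) degeneracy direction (1, 1)."""
    direction = np.array([1.0, 1.0]) / np.sqrt(2.0)
    return x + scale * rng.normal() * direction

def isotropic_jump(x, scale=0.1):
    """Standard small isotropic Gaussian proposal."""
    return x + scale * rng.normal(size=2)

def mcmc(n_steps=20000, p_degen=0.5):
    """Metropolis-Hastings mixing the two symmetric proposals."""
    x = np.zeros(2)
    lp = log_post(x)
    chain, accepted = [], 0
    for _ in range(n_steps):
        prop = degeneracy_jump(x) if rng.random() < p_degen else isotropic_jump(x)
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:  # symmetric proposals
            x, lp = prop, lp_prop
            accepted += 1
        chain.append(x.copy())
    return np.array(chain), accepted / n_steps

chain, acc_rate = mcmc()
```

Because the degeneracy jump takes large steps along the direction where the posterior is wide, the chain mixes far faster than an isotropic proposal alone, which must take tiny steps to keep a reasonable acceptance rate in the narrow transverse direction.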
Gravitational waves are radiative solutions of space-time dynamics predicted by Einstein's theory of General Relativity. A world-wide array of large-scale, highly sensitive interferometric detectors constantly scrutinizes the geometry of local space-time with the hope of detecting deviations that would signal an impinging gravitational wave from a remote astrophysical source. Finding the rare and weak signature of gravitational waves buried in non-stationary, non-Gaussian instrument noise is a particularly challenging problem. We give an overview of the data-analysis techniques and associated observational results obtained so far by Virgo (in Europe) and LIGO (in the US), along with the prospects offered by the upcoming advanced detectors.
The field of transient astronomy has seen a revolution with the first gravitational-wave detections and the multi-messenger observations they enabled. Transformed by the first detections of binary black hole and binary neutron star mergers, computational demands in gravitational-wave astronomy are expected to grow by at least a factor of two over the next five years as the global network of kilometer-scale interferometers is brought to design sensitivity. With the increase in detector sensitivity, real-time delivery of gravitational-wave alerts will become increasingly important as an enabler of multi-messenger follow-up. In this work, we report a novel implementation and deployment of deep learning inference for real-time gravitational-wave data denoising and astrophysical source identification. This is accomplished using a generic Inference-as-a-Service model that is capable of adapting to the future needs of gravitational-wave data analysis. Our implementation allows seamless incorporation of hardware accelerators and also enables the use of commercial or private (dedicated) as-a-service computing. Based on our results, we propose a paradigm shift in low-latency and offline computing in gravitational-wave astronomy. Such a shift can address key challenges in peak usage, scalability, and reliability, and provide a data-analysis platform particularly optimized for deep learning applications. The achieved sub-millisecond latency will also be relevant for any machine-learning-based real-time control systems that may be invoked in the operation of near-future and next-generation ground-based laser interferometers, as well as the front-end collection, distribution, and processing of data from such instruments.
Broadband suppression of quantum noise below the Standard Quantum Limit (SQL) is becoming a top-priority problem for the future generation of large-scale terrestrial gravitational-wave detectors, as the interferometers of the Advanced LIGO project, designed to be quantum-noise-limited over almost the entire detection band, are phased in. To this end, among the various proposed methods of quantum noise suppression and signal amplification, the most elaborate approach involves a so-called *xylophone* configuration of two Michelson interferometers, each optimised for its own frequency band, with a combined broadband sensitivity well below the SQL. Albeit ingenious, it is a rather costly solution. We demonstrate that changing the optical scheme to a Sagnac interferometer with weak detuned signal recycling and frequency-dependent input squeezing performs almost as well as the xylophone at significantly lower cost. We also show that the Sagnac interferometer is more robust than an analogous Michelson interferometer to optical loss in the filter cavity used for frequency-dependent squeezed-vacuum injection, thereby reducing the building cost even further.
We describe a Bayesian formalism for analyzing individual gravitational-wave events in light of the rest of an observed population. This analysis reveals how the idea of a ``population-informed prior'' arises naturally from a suitable marginalization of an underlying hierarchical Bayesian model which consistently accounts for selection effects. Our formalism naturally leads to the presence of ``leave-one-out'' distributions constructed from subsets of the observed events. This differs from other approximations, also known as empirical Bayes methods, which effectively double-count one or more events. We design a double-reweighting post-processing strategy that uses only existing data products to reconstruct the resulting population-informed posterior distributions. Although the correction we highlight is an important conceptual point, we find it has a limited impact on the current catalog of gravitational-wave events. Our approach further allows us to study, for the first time in the gravitational-wave literature, correlations between the parameters of individual events and those of the population.
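The leave-one-out idea can be sketched as a simple reweighting exercise. All ingredients below are assumptions for illustration (a one-parameter Gaussian population, a flat default prior, a crude moment-matching population fit), not the paper's actual hierarchical model: the key point is only that the population used to reweight event *k* is fit while excluding event *k*, avoiding the double counting of empirical Bayes.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy setup: each "event" has posterior samples of a single parameter
# theta obtained under a flat default prior; the true population is
# modeled as a Gaussian N(mu, sigma). All values are illustrative.
events = [rng.normal(loc=m, scale=0.5, size=4000) for m in (0.8, 1.1, 0.9, 1.3)]

def fit_population(samples_list):
    """Crude population fit: moment-match a Gaussian to the event means."""
    means = np.array([s.mean() for s in samples_list])
    return means.mean(), max(means.std(ddof=1), 0.1)

def reweight(samples, mu, sigma):
    """Importance weights ~ population density / flat default prior."""
    w = np.exp(-0.5 * ((samples - mu) / sigma) ** 2)
    return w / w.sum()

def loo_informed_mean(k):
    """Population-informed mean of event k, with event k left out of the fit."""
    others = [s for i, s in enumerate(events) if i != k]
    mu, sigma = fit_population(others)  # fit excludes event k
    w = reweight(events[k], mu, sigma)
    return float(np.sum(w * events[k]))

loo_means = [loo_informed_mean(k) for k in range(len(events))]
```

Each reweighted mean is pulled from the event's raw posterior mean toward the population fit built from the *other* events, which is the leave-one-out structure the abstract describes; reusing the full-catalog fit for every event would instead count each event's own data twice.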