Time series analysis is ubiquitous in many fields of science including gravitational-wave astronomy, where strain time series are analyzed to infer the nature of gravitational-wave sources, e.g., black holes and neutron stars. It is common in gravitational-wave transient studies to apply a tapered window function to reduce the effects of spectral artifacts from the sharp edges of data segments. We show that the conventional analysis of tapered data fails to take into account covariance between frequency bins, which arises for all finite time series -- no matter the choice of window function. We discuss the origin of this covariance and show that as the number of gravitational-wave detections grows, and as we gain access to more high signal-to-noise ratio events, this covariance will become a non-negligible source of systematic error. We derive a framework that models the correlation induced by the window function and demonstrate this solution using both data from the first LIGO--Virgo transient catalog and simulated Gaussian noise.
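The bin-to-bin covariance described above can be illustrated with a short Monte Carlo sketch: for windowed white noise, the covariance between two Fourier bins is the DFT of the squared window at the bin separation, so any taper couples neighboring bins, while an untapered (rectangular) segment of white noise leaves them uncorrelated. This is a minimal illustration in NumPy (the Tukey window is hand-rolled here to avoid a SciPy dependency), not the paper's analysis framework:

```python
import numpy as np

rng = np.random.default_rng(0)
N, trials = 64, 20_000

def tukey(N, alpha=0.5):
    # hand-rolled Tukey (tapered-cosine) window
    n = np.arange(N)
    w = np.ones(N)
    edge = int(alpha * (N - 1) / 2)
    w[: edge + 1] = 0.5 * (1 + np.cos(np.pi * (2 * n[: edge + 1] / (alpha * (N - 1)) - 1)))
    w[N - edge - 1:] = w[: edge + 1][::-1]
    return w

def bin_correlation(window):
    # empirical correlation between two adjacent frequency bins of windowed white noise
    noise = rng.standard_normal((trials, N))
    ft = np.fft.rfft(noise * window, axis=1)
    cov = np.mean(ft[:, 5] * np.conj(ft[:, 6]))
    var = np.mean(np.abs(ft[:, 5]) ** 2)
    return np.abs(cov) / var

corr_tukey = bin_correlation(tukey(N))
corr_rect = bin_correlation(np.ones(N))
# tapering correlates neighboring bins; without a taper they stay uncorrelated
print(corr_tukey, corr_rect)
```

For the Tukey window the adjacent-bin correlation comes out around a quarter of the variance, while for the rectangular window it is consistent with zero, which is the covariance a diagonal (conventional) likelihood ignores.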
We combine hierarchical Bayesian modeling with a flow-based deep generative network in order to demonstrate that one can efficiently constrain numerical gravitational-wave (GW) population models at a previously intractable complexity. Existing techniques for comparing data to simulation, such as discrete model selection and Gaussian process regression, can only be applied efficiently to moderate-dimension data. This limits the number of observables (e.g., chirp mass, spins) and hyper-parameters (e.g., common envelope efficiency) one can use in a population inference. In this study, we train a network to emulate a phenomenological model with six observables and four hyper-parameters, use it to infer the properties of a simulated catalogue, and compare the results to the phenomenological model. We find that a 10-layer network can emulate the phenomenological model accurately and efficiently. Our machine enables simulation-based GW population inferences to take on data at a new complexity level.
Gravitational waves from compact binaries measured by the LIGO and Virgo detectors are routinely analyzed using Markov chain Monte Carlo sampling algorithms. Because evaluating the likelihood function requires generating millions of waveform models that link signal shapes to the source parameters, running Markov chains until convergence is typically expensive and requires days of computation. In this extended abstract, we provide a proof of concept that demonstrates how the latest advances in neural simulation-based inference can speed up the inference time by up to three orders of magnitude -- from days to minutes -- without impairing the performance. Our approach is based on a convolutional neural network modeling the likelihood-to-evidence ratio, which entirely amortizes the computation of the posterior. We find that our model correctly estimates credible intervals for the parameters of simulated gravitational waves.
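The likelihood-to-evidence ratio idea can be conveyed with a toy sketch: train any classifier to distinguish parameter-data pairs drawn from the joint distribution from pairs with the parameters shuffled (the product of marginals); the trained logit then estimates $\log p(x\mid\theta) - \log p(x)$, which is all the posterior needs up to the prior. The following stand-in uses a hand-featured logistic regression on a hypothetical one-dimensional Gaussian model, not the paper's convolutional network:

```python
import numpy as np

rng = np.random.default_rng(3)

# hypothetical toy model: theta ~ N(0, 1), x | theta ~ N(theta, 1)
n = 20_000
theta = rng.standard_normal(n)
x = theta + rng.standard_normal(n)

# positives: dependent (x, theta) pairs from the joint;
# negatives: theta shuffled, i.e. pairs from the product of marginals
theta_shuffled = rng.permutation(theta)

def features(x, t):
    # quadratic features suffice here because the true log ratio is quadratic
    return np.stack([x, t, x * t, x**2, t**2], axis=1)

X = np.vstack([features(x, theta), features(x, theta_shuffled)])
y = np.concatenate([np.ones(n), np.zeros(n)])

# logistic regression by gradient descent; the fitted logit approximates
# log r(x, theta) = log p(x | theta) - log p(x)
w, b = np.zeros(X.shape[1]), 0.0
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    w -= 0.1 * (X.T @ (p - y)) / len(y)
    b -= 0.1 * (p - y).mean()

accuracy = ((X @ w + b > 0) == y).mean()
# w[2] multiplies x*theta, whose coefficient is positive in the true log ratio
print(accuracy, w[2])
```

Because the classifier takes $(x, \theta)$ as input, a single training run amortizes inference: evaluating the ratio for a new observation or parameter value costs one forward pass, with no further likelihood evaluations.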
The gravitational waveform of a merging stellar-mass binary is described at leading order by a quadrupolar mode. However, the complete waveform includes higher-order modes, which encode valuable information not accessible from the leading-order mode alone. Despite this, the majority of astrophysical inferences so far obtained with observations of gravitational waves employ only the leading-order mode, because calculations with higher-order modes are often computationally challenging. We show how to efficiently incorporate higher-order modes into astrophysical inference calculations with a two-step procedure. First, we carry out Bayesian parameter estimation using a computationally cheap leading-order-mode waveform, which provides an initial estimate of binary parameters. Second, we weight the initial estimate using higher-order-mode waveforms in order to fold in the extra information from the full waveform. We use mock data to demonstrate the effectiveness of this method. We apply the method to each binary black hole event in the first gravitational-wave transient catalog, GWTC-1, to obtain posterior distributions and Bayesian evidence with higher-order modes. Performing Bayesian model selection on the events in GWTC-1, we find only a weak preference for waveforms with higher-order modes. We discuss how this method can be generalized to a variety of other applications.
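The reweighting step is standard importance sampling: samples drawn under the cheap likelihood receive weights equal to the ratio of the expensive ("higher-order mode") likelihood to the cheap one. The sketch below illustrates this with hypothetical one-dimensional Gaussian likelihoods standing in for the actual waveform likelihoods; when the two likelihoods are close, almost no efficiency is lost:

```python
import numpy as np

rng = np.random.default_rng(1)

# hypothetical stand-ins for the two waveform likelihoods (up to constants):
# "leading-order" posterior ~ N(0, 1); "higher-order-mode" posterior ~ N(0.1, 0.9^2)
def log_l_cheap(theta):
    return -0.5 * theta**2

def log_l_full(theta):
    return -0.5 * (theta - 0.1)**2 / 0.9**2

# Step 1: samples from the cheap-likelihood posterior (here drawn directly)
samples = rng.standard_normal(100_000)

# Step 2: importance weights fold in the extra waveform information
log_w = log_l_full(samples) - log_l_cheap(samples)
w = np.exp(log_w - log_w.max())
w /= w.sum()

# reweighted posterior mean, and the efficiency of the reweighting
mean_full = np.sum(w * samples)
n_eff = 1.0 / np.sum(w**2)
print(mean_full, n_eff / len(samples))
```

The effective sample size `n_eff` diagnoses whether the cheap posterior was a good enough starting point; a small ratio `n_eff / len(samples)` would signal that the two waveforms disagree too strongly for reweighting to be reliable.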
Searches for gravitational waves crucially depend on careful signal processing of noisy strain data from gravitational-wave detectors, which are known to exhibit significant non-Gaussian behavior. In this paper, we study two distinct non-Gaussian effects in the LIGO/Virgo data which reduce the sensitivity of searches: first, variations in the noise power spectral density (PSD) on timescales of more than a few seconds; and second, loud and abrupt transient `glitches' of terrestrial or instrumental origin. We derive a simple procedure to correct, at first order, the effect of the variation in the PSD on the search background. Given knowledge of the existence of localized glitches in particular segments of data, we also develop a method to insulate statistical inference from these glitches, so as to cleanly excise them without affecting the search background in neighboring data. We show the importance of applying these methods to the publicly available LIGO data, and measure an increase in the detection volume of at least $15\%$ from the PSD-drift correction alone, due to the improved background distribution.
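The flavor of the PSD-drift correction can be conveyed with a toy sketch: when the noise level drifts on timescales long compared to a template, normalizing each stretch of data by a locally estimated noise level, rather than by one global PSD, flattens the background of the detection statistic. This is only a schematic with white noise of drifting amplitude, not the paper's matched-filter pipeline:

```python
import numpy as np

rng = np.random.default_rng(2)

# toy strain: white Gaussian noise whose std drifts slowly across segments
n_seg, seg_len = 200, 4096
drift = 1.0 + 0.5 * np.sin(np.linspace(0, 6 * np.pi, n_seg))
data = rng.standard_normal((n_seg, seg_len)) * drift[:, None]

# uncorrected SNR-like statistic: a single noise level for the whole stretch
sigma_global = data.std()
snr_raw = np.abs(data).max(axis=1) / sigma_global

# drift correction: rescale by a locally estimated noise level per segment
sigma_local = data.std(axis=1)
snr_corr = np.abs(data).max(axis=1) / sigma_local

# the loudest background trigger is tamed once the drift is divided out
print(snr_raw.max(), snr_corr.max())
```

Without the correction, the loudest background triggers all come from the noisy stretches and inflate the false-alarm rate at high statistic values; with it, every segment contributes a comparable background, which is what improves the search's effective detection volume.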
Any abundance of black holes that was present in the early universe will evolve as matter, making up an increasingly large fraction of the total energy density as space expands. This motivates us to consider scenarios in which the early universe included an era that was dominated by low-mass ($M < 5\times 10^8$ g) black holes which evaporate prior to primordial nucleosynthesis. In significant regions of parameter space, these black holes will become gravitationally bound within binary systems, and undergo mergers before evaporating. Such mergers result in three potentially observable signatures. First, any black holes that have undergone one or more mergers will possess substantial angular momentum, causing their Hawking evaporation to produce significant quantities of high-energy gravitons. These products of Hawking evaporation are predicted to constitute a background of hot ($\sim$ eV--keV) gravitons today, with an energy density corresponding to $\Delta N_{\rm eff} \sim 0.01-0.03$. Second, these mergers will produce a stochastic background of high-frequency gravitational waves. And third, the energy density of these gravitational waves can be as large as $\Delta N_{\rm eff} \sim 0.3$, depending on the length of time between the mergers and evaporation. These signals are each potentially within the reach of future measurements.