Advanced LIGO and Virgo detected ten binary black hole mergers by the end of their second observing run. These mergers have already allowed constraints to be placed on the population distribution of black holes in the Universe, constraints that will only improve with more detections and increasing detector sensitivity. In this paper we develop techniques to measure the angular distribution of black hole mergers through their statistical N-point correlations, using hierarchical Bayesian inference. We apply these techniques to the special case of two-point angular correlations, using a Legendre polynomial basis on the sky. Building on the mixture-model formalism introduced in Ref. [1], we show how one can measure two-point correlations with no threshold on significance, allowing us to target the ensemble of sub-threshold binary black hole mergers not resolvable with the current generation of ground-based detectors. We also show how these methods can be used to correlate gravitational waves with other probes of large-scale angular structure, such as galaxy counts, and we validate both techniques through simulations.
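To make the Legendre-basis idea concrete, here is a minimal sketch of a pair-sum estimator of two-point angular correlations from a catalog of sky positions. This illustrates only the Legendre expansion; the paper's actual method is a hierarchical Bayesian mixture model over sub-threshold candidates, which is not reproduced here, and the function name is illustrative.

```python
# Sketch: Legendre coefficients of the two-point angular correlation
# estimated from merger sky positions (illustrative, not the paper's code).
import numpy as np
from scipy.special import eval_legendre

def legendre_correlation(ra, dec, ell_max=8):
    """Pair-sum estimator of Legendre coefficients from sky positions.

    ra, dec : arrays of right ascension / declination in radians.
    Returns w[ell] = mean of P_ell(cos theta_ij) over distinct pairs,
    which vanishes for ell >= 1 if the distribution is isotropic.
    """
    # Unit vectors on the sphere for every event.
    n = np.stack([np.cos(dec) * np.cos(ra),
                  np.cos(dec) * np.sin(ra),
                  np.sin(dec)], axis=-1)
    cos_theta = np.clip(n @ n.T, -1.0, 1.0)   # pairwise cos(separation)
    iu = np.triu_indices(len(ra), k=1)        # distinct pairs only
    return np.array([eval_legendre(ell, cos_theta[iu]).mean()
                     for ell in range(ell_max + 1)])

# Usage: an isotropic mock catalog should give w_ell ~ 0 for ell >= 1.
rng = np.random.default_rng(0)
ra = rng.uniform(0, 2 * np.pi, 500)
dec = np.arcsin(rng.uniform(-1, 1, 500))      # uniform on the sphere
print(legendre_correlation(ra, dec))
```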
In the past few years, approximate Bayesian Neural Networks (BNNs) have demonstrated the ability to produce statistically consistent posteriors on a wide range of inference problems at unprecedented speed and scale. However, any disconnect between the training set and the distribution of real-world objects can introduce bias when BNNs are applied to data. This is a common challenge in astrophysics and cosmology, where the unknown distribution of objects in our Universe is often the science goal. In this work, we incorporate BNNs with flexible posterior parameterizations into a hierarchical inference framework that allows for the reconstruction of population hyperparameters and removes the bias introduced by the training distribution. We focus on the challenge of producing posterior PDFs for strong gravitational lens mass-model parameters given Hubble Space Telescope (HST) quality single-filter, lens-subtracted, synthetic imaging data. We show that the posterior PDFs are sufficiently accurate (i.e., statistically consistent with the truth) across a wide variety of power-law elliptical lens mass distributions. We then apply our approach to test data sets whose lens parameters are drawn from distributions drastically different from the training set. We show that our hierarchical inference framework mitigates the bias introduced by an unrepresentative training set's interim prior. Simultaneously, given a sufficiently broad training set, we can precisely reconstruct the population hyperparameters governing our test distributions. Our full pipeline, from training to hierarchical inference on thousands of lenses, can be run in a day. The framework presented here will allow us to efficiently exploit the full constraining power of future ground- and space-based surveys.
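The core of removing the interim-prior bias is importance reweighting: per-object posterior samples drawn under the training (interim) prior are reweighted by the ratio of the population model to that prior. Below is a minimal sketch under the simplifying assumptions of a one-dimensional parameter, a Gaussian interim prior, and a Gaussian population model; all names are illustrative, not the paper's pipeline.

```python
# Sketch: hierarchical reweighting of per-object posterior samples to infer
# population hyperparameters Omega, removing the interim (training) prior.
import numpy as np
from scipy.stats import norm

def log_population_likelihood(samples, mu, sigma,
                              interim_mu=0.0, interim_sigma=2.0):
    """log L(mu, sigma) from per-object posterior samples.

    samples : (n_objects, n_samples) posterior draws for one parameter,
              obtained under a Gaussian interim prior N(interim_mu, interim_sigma).
    """
    log_pop = norm.logpdf(samples, mu, sigma)                  # p(theta | Omega)
    log_int = norm.logpdf(samples, interim_mu, interim_sigma)  # interim prior
    # Monte Carlo average of the weight ratio per object, then sum the logs.
    log_w = log_pop - log_int
    per_object = np.logaddexp.reduce(log_w, axis=1) - np.log(samples.shape[1])
    return per_object.sum()

# Toy usage: recover the population mean by a grid search.
rng = np.random.default_rng(1)
truths = rng.normal(0.5, 0.3, size=200)                        # true population
samples = truths[:, None] + rng.normal(0, 0.1, (200, 500))     # mock posteriors
mus = np.linspace(0.0, 1.0, 51)
best = mus[np.argmax([log_population_likelihood(samples, m, 0.3) for m in mus])]
print(f"recovered population mean ~ {best:.2f}")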
Future generations of gravitational-wave detectors will have the sensitivity to detect gravitational-wave events at redshifts far beyond those of any detectable electromagnetic sources. We show that if the observed event rate is greater than one event per year at redshifts $z > 40$, then either the probability distribution of primordial density fluctuations must be significantly non-Gaussian or the events originate from primordial black holes. The nature of the excess events can be determined from the redshift distribution of the merger rate.
We report the first plausible optical electromagnetic (EM) counterpart to a (candidate) binary black hole (BBH) merger. Detected by the Zwicky Transient Facility (ZTF), the EM flare is consistent with expectations for a kicked BBH merger in the accretion disk of an active galactic nucleus (AGN) and is unlikely ($<O(0.01\%)$) to be due to intrinsic variability of this source. The lack of color evolution implies that it is not a supernova and instead strongly suggests a constant-temperature shock. Other false-positive events, such as microlensing or a tidal disruption event, are ruled out or constrained to be $<O(0.1\%)$. If the flare is associated with S190521g, we find plausible values of: total mass $M_{\rm BBH} \sim 100\, M_\odot$, kick velocity $v_k \sim 200\,{\rm km}\,{\rm s}^{-1}$ at $\theta \sim 60^\circ$ in a disk with aspect ratio $H/a \sim 0.01$ (i.e., disk height $H$ at radius $a$) and gas density $\rho \sim 10^{-10}\,{\rm g}\,{\rm cm}^{-3}$. The merger could have occurred at a disk migration trap ($a \sim 700\, r_g$; $r_g \equiv G M_{\rm SMBH}/c^2$, where $M_{\rm SMBH}$ is the mass of the AGN supermassive black hole). The combination of parameters implies a significant spin for at least one of the black holes in S190521g. The timing of our spectroscopy prevents useful constraints on broad-line asymmetry due to an off-center flare. We predict a repeat flare in this source, due to a re-encounter with the disk, in $\sim 1.6\,{\rm yr}\,(M_{\rm SMBH}/10^{8}\,M_\odot)\,(a/10^{3}\,r_g)^{3/2}$.
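The repeat-flare scaling relation quoted above is easy to evaluate directly; the short sketch below does so for the abstract's fiducial values and for the migration-trap radius it quotes (the function name is illustrative).

```python
# Evaluate the repeat-flare timescale from the abstract's scaling relation,
# t ~ 1.6 yr (M_SMBH / 1e8 Msun) (a / 1e3 r_g)^{3/2}.
def repeat_flare_yr(m_smbh_1e8=1.0, a_over_1e3_rg=1.0):
    """Repeat-flare time in years; arguments are in the scaling units above."""
    return 1.6 * m_smbh_1e8 * a_over_1e3_rg ** 1.5

print(repeat_flare_yr())                    # fiducial values -> 1.6 yr
print(repeat_flare_yr(a_over_1e3_rg=0.7))   # at the a ~ 700 r_g trap -> ~0.94 yr
```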
We seek to achieve the Holy Grail of Bayesian inference for gravitational-wave astronomy: using deep-learning techniques to instantly produce the posterior $p(\theta|D)$ for the source parameters $\theta$, given the detector data $D$. To do so, we train a deep neural network to take as input a signal + noise data set (drawn from the astrophysical source-parameter prior and the sampling distribution of detector noise), and to output a parametrized approximation of the corresponding posterior. We rely on a compact representation of the data based on reduced-order modeling, which we generate efficiently using a separate neural-network waveform interpolant [A. J. K. Chua, C. R. Galley & M. Vallisneri, Phys. Rev. Lett. 122, 211101 (2019)]. Our scheme has broad relevance to gravitational-wave applications such as low-latency parameter estimation and characterizing the science returns of future experiments. Source code and trained networks are available online at https://github.com/vallis/truebayes.
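A minimal sketch of the training idea follows: a network maps the data $D$ to the parameters of an approximate posterior over $\theta$, trained by minimizing the negative log-likelihood of the true $\theta$ under that output distribution. A diagonal-Gaussian head stands in here for the paper's parametrized posterior, the reduced-order compression step is omitted, and all shapes and names are illustrative.

```python
# Sketch: train a network to output a parametrized posterior p(theta | D).
import torch
import torch.nn as nn

class PosteriorNet(nn.Module):
    def __init__(self, data_dim=128, theta_dim=4):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(data_dim, 256), nn.ReLU(),
                                  nn.Linear(256, 256), nn.ReLU())
        self.mean = nn.Linear(256, theta_dim)      # posterior mean
        self.log_std = nn.Linear(256, theta_dim)   # posterior log-width

    def forward(self, d):
        h = self.body(d)
        return self.mean(h), self.log_std(h)

def nll_loss(mean, log_std, theta):
    # Negative log of N(theta; mean, exp(log_std)^2), up to a constant.
    return (log_std + 0.5 * ((theta - mean) / log_std.exp()) ** 2).sum(-1).mean()

# Toy loop over simulated (signal + noise, theta) pairs drawn from the prior.
net = PosteriorNet()
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for step in range(1000):
    theta = torch.rand(64, 4)                                 # prior draws
    data = theta.repeat(1, 32) + 0.1 * torch.randn(64, 128)   # mock signal+noise
    mean, log_std = net(data)
    loss = nll_loss(mean, log_std, theta)
    opt.zero_grad(); loss.backward(); opt.step()
```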
In 2016, LIGO and Virgo announced the first observation of gravitational waves from a binary black hole merger, known as GW150914. To establish the confidence of this detection, large-scale scientific workflows were used to measure the event's statistical significance. These workflows used code written by the LIGO/Virgo Collaborations and were executed on the LIGO Data Grid. The codes are publicly available, but there has not yet been an attempt to directly reproduce the results, although several independent analyses have replicated the detection. We attempt to reproduce the result presented in the GW150914 discovery paper using publicly available code on the Open Science Grid. We show that we can reproduce the main result, but we cannot exactly reproduce the LIGO analysis, as the original data set used is not public. We discuss the challenges we encountered and make recommendations for scientists who wish to make their work reproducible.
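As a starting point for such a reproduction, the public strain data around GW150914 can be retrieved from GWOSC with standard open-source tools; the sketch below only fetches the input data, and the full matched-filter significance workflow on the Open Science Grid is not shown.

```python
# Sketch: fetch the open strain data around GW150914 from GWOSC.
from gwosc.datasets import event_gps
from gwpy.timeseries import TimeSeries

gps = event_gps("GW150914")                    # GPS time of the event
strain = TimeSeries.fetch_open_data("H1", gps - 16, gps + 16)
print(strain.sample_rate, strain.duration)     # e.g. 4096 Hz, 32 s
```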