We demonstrate that mechanical waves traveling on a torsional mechanical wave machine exhibit dispersion due to gravity and the discreteness of the medium. We also show that although the dispersion due to discreteness is negligible, the dispersion due to gravity is easily measured and can be shown to disappear in a zero-gravity environment.
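As a minimal sketch of the physics involved (our notation, under the standard assumptions of nearest-neighbor torsional coupling with rod spacing $a$ and continuum wave speed $c$, plus a pendulum-like gravitational restoring frequency $\omega_g$; these expressions are not taken from the measurement itself), the dispersion relation takes the form
\[
\omega^2(k) = \omega_g^2 + \frac{4c^2}{a^2}\sin^2\!\left(\frac{ka}{2}\right) \approx \omega_g^2 + c^2 k^2\left(1 - \frac{(ka)^2}{12}\right),
\]
so discreteness enters only at order $(ka)^2$, while gravity contributes a low-frequency cutoff $\omega_g \propto \sqrt{g}$ that makes the phase velocity frequency-dependent and vanishes in free fall.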
Machine learning has emerged as a popular and powerful approach for solving problems in astrophysics. We review applications of machine learning techniques for the analysis of ground-based gravitational-wave detector data. Examples include techniques for improving the sensitivity of Advanced LIGO and Advanced Virgo gravitational-wave searches, methods for fast measurements of the astrophysical parameters of gravitational-wave sources, and algorithms for reduction and characterization of non-astrophysical detector noise. These applications demonstrate how machine learning techniques may be harnessed to enhance the science that is possible with current and future gravitational-wave detectors.
We consider how an advanced civilization might build a radiator that sends gravitational-wave signals using small black holes. Micro black holes, centimeter-scale in size but with masses ranging from asteroids to planets, are manipulated with super-advanced instrumentality, possibly involving very large electromagnetic fields. The machine envisioned emits gravitational waves in the GHz frequency range. If the source-to-receiver distance is a characteristic galactic length, up to 10,000 light years, the masses involved are at least planetary in magnitude. To power this system we posit a very advanced civilization that has a Kerr black hole at its disposal and can extract energy from it via superradiance. Background gravitational radiation sets a limit on the dimensionless amplitude that can be measured at interstellar distances using a LIGO-like detector.
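To make the scaling concrete, here is a hedged back-of-the-envelope estimate using the standard quadrupole/chirp-mass formula for an equal-mass circular binary; the masses, frequency, and distance below are illustrative assumptions, not values taken from the proposal:

\begin{verbatim}
import math

# Constants (SI)
G  = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
c  = 2.998e8            # speed of light, m/s
LY = 9.461e15           # meters per light year

# Illustrative inputs (assumptions, not the paper's numbers)
m    = 6.0e24           # each micro black hole: ~1 Earth mass, kg
f_gw = 1.0e9            # emitted GW frequency, Hz (GHz band)
d    = 1.0e4 * LY       # source-to-receiver distance: 10^4 ly, m

# Earth-mass holes are indeed centimeter scale.
r_s = 2 * G * m / c**2
print(f"Schwarzschild radius: {r_s * 100:.2f} cm")

# Separation of an equal-mass binary radiating at f_gw
# (the GW frequency is twice the orbital frequency).
omega_orb = math.pi * f_gw
a = (G * 2 * m / omega_orb**2) ** (1 / 3)
print(f"orbital separation: {a * 100:.2f} cm")

# Leading-order strain via the chirp mass, M_c = m / 2^(1/5)
# for equal masses m1 = m2 = m.
M_c = m / 2 ** (1 / 5)
h = (4 / d) * (G * M_c / c**2) ** (5 / 3) * (math.pi * f_gw / c) ** (2 / 3)
print(f"strain at receiver: h ~ {h:.1e}")   # ~2e-23 for these inputs
\end{verbatim}

Running this gives a strain of order $10^{-23}$ at $10^4$ light years for Earth-mass holes, consistent with the abstract's conclusion that at least planetary masses are required at interstellar range.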
The quantum nature of the electromagnetic field imposes a fundamental limit on the sensitivity of optical precision measurements such as spectroscopy, microscopy, and interferometry. This so-called quantum limit is set by the zero-point fluctuations of the electromagnetic field, which constrain the precision with which optical signals can be measured. In the world of precision measurement, laser-interferometric gravitational-wave (GW) detectors are the most sensitive position meters ever operated, capable of measuring distance changes on the order of $10^{-18}$ m RMS, caused by GWs from astronomical sources, over kilometer-scale separations. The sensitivity of currently operational and future GW detectors is limited by quantum optical noise. Here we demonstrate a 44% improvement in the displacement sensitivity of a prototype GW detector with suspended, quasi-free mirrors at frequencies where the sensitivity is shot-noise-limited, achieved by injecting a squeezed state of light. This demonstration is a critical step toward implementing squeezing enhancement in large-scale GW detectors.
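For orientation, the arithmetic behind a figure like 44% (our reading of the number, using the textbook relation, not a quantity quoted from the experiment): injected squeezed vacuum with squeeze parameter $r$ rescales the shot-noise-limited displacement power spectral density as
\[
S_x^{\mathrm{sq}}(f) = e^{-2r}\,S_x^{\mathrm{shot}}(f),
\]
so if the 44% is read as a reduction of the amplitude noise floor, it corresponds to $e^{-r} \approx 0.56$, i.e. $r \approx 0.58$, or roughly $5\,\mathrm{dB}$ of effective squeezing, since $10\log_{10}e^{-2r} \approx -5\,\mathrm{dB}$.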
The LIGO observatories detect gravitational waves by monitoring changes in the detectors' arm lengths down to below $10^{-19}\,\mathrm{m}/\sqrt{\mathrm{Hz}}$, a small fraction of the size of the atoms that make up the detector. To achieve this sensitivity, the detector and its environment must be closely monitored. Beyond the gravitational-wave data stream, LIGO continuously records hundreds of thousands of channels of environmental and instrumental data in order to monitor for possibly minuscule variations that contribute to the detector noise. A particularly challenging issue is the appearance in the gravitational-wave signal of brief, loud noise artifacts called ``glitches'', which are environmental or instrumental in origin but can mimic true gravitational waves and therefore hinder sensitivity. Currently they are identified primarily by analysis of the gravitational-wave data stream itself. Here we present a machine learning approach that can identify glitches by monitoring \textit{all} environmental and detector data channels, a task that has not previously been pursued due to its scale and the number of degrees of freedom within gravitational-wave detectors. The presented method is capable of reducing the gravitational-wave detector network's false-alarm rate and of improving the LIGO instruments themselves, consequently enhancing detection confidence.
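The abstract does not name a specific model, but a minimal sketch of the general idea, classifying time segments of featurized auxiliary-channel data as glitchy or clean, might look like the following; the synthetic data, feature construction, and random-forest choice are all illustrative assumptions:

\begin{verbatim}
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for featurized auxiliary channels: one row per
# time segment, one column per channel statistic (e.g. band-limited RMS).
n_segments, n_channels = 5000, 200
X = rng.normal(size=(n_segments, n_channels))

# Pretend a handful of environmental channels couple into glitches.
coupled = rng.choice(n_channels, size=5, replace=False)
y = (X[:, coupled].sum(axis=1) + rng.normal(size=n_segments)) > 2.0

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te)))

# Feature importances point back at the channels (and thus detector
# subsystems) most associated with glitches: the diagnostic payoff.
top = np.argsort(clf.feature_importances_)[-5:]
print("most informative channels:", sorted(top.tolist()))
\end{verbatim}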
We report on advances in interpreting current and future gravitational-wave events in light of astrophysical simulations. A machine-learning emulator is trained on numerical population-synthesis predictions and inserted into a Bayesian hierarchical framework. In this case study, a modest but state-of-the-art suite of simulations of isolated binary stars is interpolated across two event parameters and one population parameter. The validation process of our pipelines highlights how omitting some of the event parameters might cause errors in estimating selection effects, which propagate as systematics to the final population inference. Using LIGO/Virgo data from O1 and O2, we infer that black holes in binaries are most likely to receive natal kicks with one-dimensional velocity dispersion $\sigma = 105^{+44}_{-21}$ km/s. Our results showcase potential applications of machine-learning tools in conjunction with population-synthesis simulations and gravitational-wave data.
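As a hedged illustration of the emulator-inside-hierarchical-likelihood pattern (a Gaussian-process stand-in with toy one-dimensional event and population parameters; the actual pipeline, simulations, and selection-effect treatment are not reproduced here):

\begin{verbatim}
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(1)

# Toy population-synthesis output: an (unnormalized) log-density
# log p(theta | lam) tabulated on a grid of one event parameter theta
# and one population parameter lam (e.g. a kick dispersion).
def toy_simulation(theta, lam):
    return -0.5 * ((theta - 0.3 * lam) / 0.2) ** 2   # assumed shape

lam_grid, theta_grid = np.linspace(0, 2, 15), np.linspace(0, 1, 25)
L, T = np.meshgrid(lam_grid, theta_grid, indexing="ij")
X_train = np.column_stack([L.ravel(), T.ravel()])
y_train = toy_simulation(T.ravel(), L.ravel())

# Emulator: interpolates simulation output continuously in (lam, theta).
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5))
gp.fit(X_train, y_train)

# Hierarchical log-likelihood: for each event, average the emulated
# density over that event's posterior samples (selection effects omitted).
def log_likelihood(lam, event_samples):
    total = 0.0
    for samples in event_samples:
        X = np.column_stack([np.full(len(samples), lam), samples])
        total += np.log(np.mean(np.exp(gp.predict(X))))
    return total

events = [rng.normal(0.35, 0.05, 100) for _ in range(3)]  # fake posteriors
lams = np.linspace(0.5, 1.8, 20)
best = max(lams, key=lambda lam: log_likelihood(lam, events))
print(f"maximum-likelihood population parameter: {best:.2f}")
\end{verbatim}

A real analysis would replace the toy density with population-synthesis predictions, include selection effects, and sample the full hierarchical posterior rather than maximizing on a grid.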