As new techniques exploiting the Earth's ambient seismic noise field are developed and applied, such as for the observation of temporal changes in seismic velocity structure, it is crucial to quantify the precision with which wave-type measurements can be made. This work uses array data from the Homestake mine in Lead, South Dakota, and from an array at Sweetwater, Texas, to consider two aspects that control this precision: the types of seismic wave contributing to the ambient noise field at microseism frequencies and the effect of array geometry. Both are quantified using measurements of wavefield coherence between stations in combination with Wiener filters. We find a strong seasonal change between body-wave and surface-wave content. Regarding the inclusion of underground stations, we quantify the lower limit to which the ambient noise field can be characterized and reproduced: the Wiener filters are about four times more successful in reproducing ambient noise waveforms when underground stations are included in the array, yielding predictions of seismic time series with less than a 1% residual, and are ultimately limited by the geometry and aperture of the array, as well as by temporal variations in the seismic field. We discuss the implications of these results for the geophysics community performing ambient seismic noise studies, as well as for the cancellation of seismic Newtonian gravity noise in ground-based, sub-Hz, gravitational-wave detectors.
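As a concrete illustration of the coherence-plus-Wiener-filter machinery described above, the sketch below builds a single-reference, frequency-domain Wiener filter that predicts one station's trace from another station's trace using their cross- and auto-spectra. The data, station pairing, sampling rate, and the use of numpy/scipy are assumptions for illustration only; it is a deliberately simplified, single-reference version of the kind of filtering the abstract describes.

```python
# Minimal sketch: frequency-domain Wiener filter predicting a target seismic
# trace from a reference trace via their cross- and auto-spectra.
# Assumptions (not from the paper): synthetic data, numpy/scipy, 100 Hz sampling.
import numpy as np
from scipy.signal import csd, welch

fs = 100.0                        # sampling rate [Hz] (assumed)
t = np.arange(0, 600, 1 / fs)     # ten minutes of data
common = np.random.randn(t.size)  # coherent microseism-like component
ref = common + 0.1 * np.random.randn(t.size)            # reference station
target = 0.8 * common + 0.1 * np.random.randn(t.size)   # station to predict

nper = 4096
f, P_rt = csd(ref, target, fs=fs, nperseg=nper)   # cross-spectrum ref -> target
_, P_rr = welch(ref, fs=fs, nperseg=nper)         # reference auto-spectrum

W = P_rt / P_rr                   # Wiener filter in the frequency domain

# Apply the filter to the reference trace and measure the residual.
REF = np.fft.rfft(ref)
freqs = np.fft.rfftfreq(ref.size, d=1 / fs)
W_interp = np.interp(freqs, f, W.real) + 1j * np.interp(freqs, f, W.imag)
prediction = np.fft.irfft(W_interp * REF, n=ref.size)

residual = np.var(target - prediction) / np.var(target)
print(f"relative residual power: {residual:.3f}")
```

The relative residual printed at the end plays the role of the waveform-reproduction residual quoted in the abstract; with several reference stations, the scalar filter W becomes a vector of filters estimated from the full cross-spectral matrix of the array.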
Many major oceanographic internal wave observational programs of the last four decades are reanalyzed in order to characterize the variability of the deep-ocean internal wavefield. The observations are discussed in the context of the universal spectral model proposed by Garrett and Munk. The Garrett and Munk model is a good description of wintertime conditions at Site-D on the continental rise north of the Gulf Stream. Elsewhere and at other times, significant deviations are noted in terms of amplitude, separability of the two-dimensional vertical wavenumber-frequency spectrum, and departure from the model's functional form. Subtle geographic patterns are apparent in deviations from the high-frequency and high-vertical-wavenumber power laws of the Garrett and Munk spectrum. Moreover, such deviations tend to co-vary: whiter frequency spectra are partnered with redder vertical wavenumber spectra. Attempts are made to interpret the variability in terms of the interplay between generation, propagation and nonlinearity using a statistical radiative balance equation. This process frames major questions for future research, with the insight that such integrative studies could constrain both observationally and theoretically based interpretations.
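For reference, the separable GM76 form against which the observations are compared can be written, in a common textbook parameterization (a sketch from the standard literature, not necessarily the exact normalization used in this work), as

```latex
E(\omega, j) = b^{2} N_{0} N \, E_{0}\, B(\omega)\, H(j), \qquad
B(\omega) = \frac{2}{\pi}\, \frac{f}{\omega \sqrt{\omega^{2} - f^{2}}}, \qquad
H(j) = \frac{(j^{2} + j_{*}^{2})^{-1}}{\sum_{j'=1}^{\infty} (j'^{2} + j_{*}^{2})^{-1}}
```

with canonical values E_0 of about 6.3x10^-5, b of about 1300 m, N_0 of about 5.2x10^-3 s^-1, and j_* = 3. The whiter frequency spectra and redder vertical wavenumber spectra noted above are departures from the omega^-2 and, equivalently, m^-2 high-frequency and high-wavenumber tails implied by B(omega) and H(j).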
The Virgo gravitational wave detector is an interferometer (ITF) with 3 km arms located near Pisa, Italy. From July to October 2010, Virgo performed its third science run (VSR3) in coincidence with the LIGO detectors. Despite the several techniques adopted to isolate the interferometer from the environment, seismic noise remains an important issue for Virgo. Vibrations produced by the detector infrastructure (such as air conditioning units, water chillers/heaters, and pumps) are found to affect Virgo's sensitivity, with the main coupling mechanisms being beam jitter and scattered-light processes. The Advanced Virgo (AdV) design seeks to reduce ITF couplings to environmental noise by having most vibration-sensitive components suspended and in vacuum, as well as by muffling and relocating loud machines. During June and July 2010, a Guralp-3TD seismometer was stationed at various locations around the Virgo site that host major infrastructure machines. Seismic data were examined using spectral and coherence analysis with seismic probes close to the detector. The primary aim of this study was to identify noisy machines which seismically affect the ITF environment and thus require mitigation attention. The analyzed machines are located at various distances from the experimental halls, ranging from 10 m to 100 m. An attempt is made to measure the attenuation of the emitted noise at the ITF and to correlate it with the distance from the source and with seismic attenuation models in soil.
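To make the spectral and coherence analysis step concrete, the sketch below computes the magnitude-squared coherence between a seismometer placed near a candidate machine and a probe near the detector, and flags frequency bands where the two are coherent. The signals, sampling rate, channel roles, and threshold are illustrative assumptions, not Virgo data.

```python
# Minimal sketch: magnitude-squared coherence between a seismometer near
# infrastructure machinery and a reference probe near the detector, used to
# flag frequency bands where machine vibration reaches the experimental halls.
# Data, channel roles, and threshold are illustrative assumptions.
import numpy as np
from scipy.signal import coherence

fs = 250.0                                    # sampling rate [Hz] (assumed)
t = np.arange(0, 3600, 1 / fs)                # one hour of data
machine_line = np.sin(2 * np.pi * 25.0 * t)   # e.g. a pump rotation line at 25 Hz
near_machine = machine_line + np.random.randn(t.size)
near_detector = 0.05 * machine_line + np.random.randn(t.size)  # attenuated copy

f, Cxy = coherence(near_machine, near_detector, fs=fs, nperseg=8192)

# Report bands whose coherence exceeds a (chosen) significance threshold.
threshold = 0.2
for fi, ci in zip(f, Cxy):
    if ci > threshold:
        print(f"{fi:7.2f} Hz  coherence = {ci:.2f}")
```

Persistent narrow-band peaks in the coherence, such as pump or chiller rotation lines, are the kind of signature that would mark a machine as needing mitigation attention.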
Most of the seismic inversion techniques currently proposed focus on robustness with respect to the background model choice or to inaccurate physical modeling assumptions, but are not suited to large-scale 3D applications. On the other hand, methods that are computationally feasible for industrial problems, such as full waveform inversion, are notoriously bogged down by local minima and require adequate starting models. We propose a novel solution that is both scalable and less sensitive to the starting model or to inaccurate physics when compared to full waveform inversion. The method is based on a dual (Lagrangian) reformulation of the classical wavefield reconstruction inversion, whose robustness with respect to local minima is well documented in the literature. Classical wavefield reconstruction inversion, however, is not suited to 3D, as it leverages expensive frequency-domain solvers for the wave equation. The proposed reformulation allows the deployment of state-of-the-art time-domain finite-difference methods and is computationally mature for industrial-scale problems.
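As background for the reformulation claim, the classical wavefield reconstruction inversion referred to above is commonly posed as a penalty problem that relaxes the wave equation to a soft constraint (notation follows the standard formulation in the literature, not necessarily this paper's dual variant):

```latex
\min_{m,\,u}\ \tfrac{1}{2}\,\|P u - d\|_{2}^{2}
  \;+\; \tfrac{\lambda^{2}}{2}\,\|A(m)\,u - q\|_{2}^{2}
```

Here m is the subsurface model, u the reconstructed wavefield, A(m) the discretized wave-equation operator, q the source, P the restriction to the receivers, d the observed data, and lambda the penalty parameter. Solving for u in this formulation is what typically requires the expensive frequency-domain solves; the dual (Lagrangian) reformulation avoids them, opening the door to time-domain finite-difference propagators.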
We study the estimation of parameters in a quantum metrology scheme based on entangled many-body Unruh-DeWitt detectors. It is found that the precision of the estimation of the Unruh effect can be enhanced via initial-state preparation and parameter selection. It is shown that the precision in the estimation of the Unruh temperature with a many-body probe is always better than the precision of two-probe strategies. The proper acceleration of Bob's detector and the interaction between the accelerated detector and the external field have significant influences on the precision of the Unruh-effect estimation. In addition, a probe state prepared with more excited atoms in the initial state is found to perform better than probe states with fewer excited atoms. However, different from the estimation of the Unruh temperature, the estimation of the effective coupling parameter of the accelerated detector requires more atoms in total but fewer excited atoms.
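The precision claims above are conventionally quantified through the quantum Cramér-Rao bound, stated here in its generic form (the specific probe states and Fisher information computed in the paper are not reproduced):

```latex
\operatorname{Var}(\hat{\theta}) \;\ge\; \frac{1}{\nu\, F_{Q}(\theta)}
```

where theta is the parameter of interest (the Unruh temperature or the effective coupling), nu the number of independent repetitions, and F_Q(theta) the quantum Fisher information of the probe state after the accelerated evolution. A larger F_Q means better attainable precision, which is the sense in which the many-body probe outperforms two-probe strategies.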
Achieving desirable receiver sampling in ocean-bottom acquisition is often not possible because of cost considerations. Assuming adequate source sampling is available, which is achievable by virtue of reciprocity and the use of modern randomized (simultaneous-source) marine acquisition technology, we are in a position to train convolutional neural networks (CNNs) to bring the receiver sampling to the same spatial grid as the dense source sampling. To accomplish this task, we form training pairs consisting of densely sampled data and artificially subsampled data, using a reciprocity argument and the assumption that the source-side sampling is dense. While this approach has successfully been used for the recovery of monochromatic frequency slices, its application in practice calls for wavefield reconstruction of time-domain data. Despite having the option to parallelize, the overall costs of this approach can become prohibitive if we decide to carry out the training and recovery independently for each frequency. Because different frequency slices share information, we propose to use transfer training to make our approach computationally more efficient by warm starting the training with CNN weights obtained from a neighboring frequency slice. If the two neighboring frequency slices share information, we expect the training to improve and converge faster. Our aim is to demonstrate this principle by carrying out a series of carefully selected experiments on a relatively large-scale five-dimensional synthetic data volume associated with wide-azimuth 3D ocean-bottom node acquisition. From these experiments, we observe that transfer training yields a significant speedup in training, especially at relatively higher frequencies where consecutive frequency slices are more correlated.
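A minimal sketch of the warm-starting step is given below, assuming a PyTorch setup: the CNN for one frequency slice is initialized from the weights trained on a neighboring slice rather than from a random initialization. The architecture, data, and hyperparameters are placeholders and not the network used in the paper.

```python
# Minimal sketch of transfer training: the CNN for frequency slice f2 is
# warm-started from the weights trained on neighboring slice f1 instead of
# starting from scratch. Network, data, and hyperparameters are placeholders.
import copy
import torch
import torch.nn as nn

class SliceCNN(nn.Module):
    """Toy CNN mapping subsampled receiver grids to densely sampled ones."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(2, 32, 3, padding=1), nn.ReLU(),   # 2 channels: real/imag
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 2, 3, padding=1),
        )
    def forward(self, x):
        return self.net(x)

def train_slice(model, pairs, epochs=5, lr=1e-3):
    """Train on (subsampled, dense) pairs for one frequency slice."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        for subsampled, dense in pairs:
            opt.zero_grad()
            loss = loss_fn(model(subsampled), dense)
            loss.backward()
            opt.step()
    return model

# Synthetic stand-ins for training pairs at two neighboring frequencies.
pairs_f1 = [(torch.randn(1, 2, 64, 64), torch.randn(1, 2, 64, 64)) for _ in range(8)]
pairs_f2 = [(torch.randn(1, 2, 64, 64), torch.randn(1, 2, 64, 64)) for _ in range(8)]

model_f1 = train_slice(SliceCNN(), pairs_f1)           # cold start at frequency f1
model_f2 = SliceCNN()
model_f2.load_state_dict(copy.deepcopy(model_f1.state_dict()))  # warm start at f2
model_f2 = train_slice(model_f2, pairs_f2, epochs=2)   # fewer epochs needed
```

In this setup only the initialization differs between the cold-start and warm-start runs, which is what allows any observed speedup to be attributed to the information shared between neighboring frequency slices.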