The Event Horizon Telescope recently produced the first images of a black hole. These images were synthesized by measuring the coherent correlation function of the complex electric field measured at telescopes located across the Earth. This correlation function corresponds to the Fourier transform of the image under the assumption that the source emits spatially incoherent radiation. However, black holes differ from standard astrophysical objects: in the absence of absorption and scattering, an observer sees a series of increasingly demagnified echoes of each emitting location. These echoes correspond to rays that orbit the black hole one or more times before reaching the observer. This multi-path propagation introduces spatial and temporal correlations into the electric field that encode properties of the black hole, irrespective of intrinsic variability. We explore the coherent temporal autocorrelation function measured at a single telescope. Specifically, we study the simplified toy problem of scalar field correlation functions $\langle \Psi(t) \Psi(0) \rangle$ sourced by fluctuating matter located near a Schwarzschild black hole. We find that the correlation function is peaked at times equal to integer multiples of the photon orbit period; the corresponding power spectral density vanishes like $\lambda/r_{\rm g}$, where $r_{\rm g} = G M / c^{2}$ is the gravitational radius of the black hole and $\lambda$ is the wavelength of radiation observed. For supermassive black holes observed at millimeter wavelengths, the power in echoes is suppressed relative to direct emission by $\sim 10^{-13}\,\lambda_{\rm mm}/M_{6}$, where $\lambda_{\rm mm} = \lambda/(1\,{\rm mm})$ and $M_{6} = M/(10^{6}\,M_{\odot})$. Consequently, detecting multi-path propagation near a black hole using the coherent electric field autocorrelation is infeasible with current technology.
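The quoted echo suppression can be checked directly: it is just the ratio $\lambda/r_{\rm g}$ evaluated for the fiducial mass and wavelength. A minimal sketch, using standard values for the physical constants (the constants and solar mass are assumptions of this sketch, not taken from the abstract):

```python
# Sketch: order-of-magnitude check of the echo suppression factor
# lambda / r_g for a 10^6 M_sun black hole observed at 1 mm.
# Constants are standard reference values; this is an illustrative
# estimate, not the paper's full calculation.

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
M_SUN = 1.989e30     # solar mass, kg

def echo_suppression(wavelength_m, mass_solar):
    """Return lambda / r_g, the quoted scaling of echo power."""
    r_g = G * mass_solar * M_SUN / c**2   # gravitational radius, m
    return wavelength_m / r_g

# 10^6 M_sun at 1 mm: expect a few times 10^-13, matching the abstract.
ratio = echo_suppression(1e-3, 1e6)
print(f"lambda / r_g = {ratio:.1e}")
```

The ratio scales linearly with wavelength and inversely with mass, which is exactly the $\lambda_{\rm mm}/M_{6}$ dependence quoted.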
The Event Horizon Telescope (EHT) has recently delivered the first resolved images of M87*, the supermassive black hole in the center of the M87 galaxy. These images were produced using 230 GHz observations performed in 2017 April. Additional observations are required to investigate the persistence of the primary image feature - a ring with azimuthal brightness asymmetry - and to quantify the image variability on event horizon scales. To address this need, we analyze M87* data collected with prototype EHT arrays in 2009, 2011, 2012, and 2013. While these observations do not contain enough information to produce images, they are sufficient to constrain simple geometric models. We develop a modeling approach based on the framework utilized for the 2017 EHT data analysis and validate our procedures using synthetic data. Applying the same approach to the observational data sets, we find the M87* morphology in 2009-2017 to be consistent with a persistent asymmetric ring of ~40 μas diameter. The position angle of the peak intensity varies in time. In particular, we find a significant difference between the position angle measured in 2013 and 2017. These variations are in broad agreement with predictions of a subset of general relativistic magnetohydrodynamic simulations. We show that quantifying the variability across multiple observational epochs has the potential to constrain the physical properties of the source, such as the accretion state or the black hole spin.
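The simplest geometric model of the kind fit to such sparse data is an infinitesimally thin ring, whose visibility amplitude on a baseline of length $u$ (in wavelengths) is $|J_0(\pi d u)|$ for angular diameter $d$. A minimal sketch, taking the ~40 μas diameter from the abstract (the baseline grid and the symmetric-ring simplification are illustrative assumptions; the actual models include asymmetry):

```python
# Sketch: visibility amplitude of a uniform thin ring, the simplest
# geometric model constrainable by sparse VLBI data. Azimuthal
# asymmetry is omitted here for brevity.
import numpy as np
from scipy.special import j0

UAS = np.pi / 180 / 3600 / 1e6           # microarcseconds -> radians

def ring_visibility(baseline_lambda, diameter_uas):
    """|V| for a uniform thin ring of the given angular diameter."""
    d = diameter_uas * UAS               # diameter in radians
    return np.abs(j0(np.pi * d * baseline_lambda))

# First null of J0 is at 2.4048, so a 40 uas ring has its first
# visibility null near u = 2.4048 / (pi * d) ~ 3.9 Glambda, well
# within Earth-sized baselines at 230 GHz.
u = np.linspace(0, 8e9, 1000)            # 0-8 Glambda, illustrative range
V = ring_visibility(u, 40.0)
u_null = u[np.argmin(V[:600])]           # locate the first null
print(f"first null near {u_null / 1e9:.1f} Glambda")
```

Locating such a null in the visibility amplitudes is what pins down the ring diameter even without enough baselines to form an image.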
High-resolution imaging of supermassive black holes is now possible, with new applications to testing general relativity and horizon-scale accretion and relativistic jet formation processes. Over the coming decade, the EHT will propose to add new strategically placed VLBI elements operating at 1.3mm and 0.87mm wavelength. In parallel, development of next-generation backend instrumentation, coupled with high throughput correlation architectures, will boost sensitivity, allowing the new stations to be of modest collecting area while still improving imaging fidelity and angular resolution. The goal of these efforts is to move from imaging static horizon scale structure to dynamic reconstructions that capture the processes of accretion and jet launching in near real time.
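The angular resolution gain from moving to shorter wavelength follows from the diffraction limit $\lambda/D$. A minimal sketch for the two wavelengths named above, assuming an Earth-diameter baseline (the baseline length is this sketch's assumption; real arrays approach but do not reach it):

```python
# Sketch: diffraction-limited resolution lambda / D for the two EHT
# observing wavelengths, on an assumed Earth-diameter baseline.
import math

RAD_TO_UAS = 180 / math.pi * 3600 * 1e6   # radians -> microarcseconds
EARTH_DIAMETER_M = 1.27e7                 # assumed maximum baseline, m

def resolution_uas(wavelength_m, baseline_m=EARTH_DIAMETER_M):
    """Diffraction-limited angular resolution in microarcseconds."""
    return wavelength_m / baseline_m * RAD_TO_UAS

for wl_mm in (1.3, 0.87):
    print(f"{wl_mm} mm -> {resolution_uas(wl_mm * 1e-3):.0f} uas")
```

Going from 1.3 mm to 0.87 mm sharpens the beam by the wavelength ratio, about a factor of 1.5.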
By linking widely separated radio dishes, the technique of very long baseline interferometry (VLBI) can greatly enhance angular resolution in radio astronomy. However, at any given moment, a VLBI array only sparsely samples the information necessary to form an image. Conventional imaging techniques partially overcome this limitation by making the assumption that the observed cosmic source structure does not evolve over the duration of an observation, which enables VLBI networks to accumulate information as the Earth rotates and changes the projected array geometry. Although this assumption is appropriate for nearly all VLBI, it is almost certainly violated for submillimeter observations of the Galactic Center supermassive black hole, Sagittarius A* (Sgr A*), which has a gravitational timescale of only ~20 seconds and exhibits intra-hour variability. To address this challenge, we develop several techniques to reconstruct dynamical images (movies) from interferometric data. Our techniques are applicable to both single-epoch and multi-epoch variability studies, and they are suitable for exploring many different physical processes including flaring regions, stable images with small time-dependent perturbations, steady accretion dynamics, or kinematics of relativistic jets. Moreover, dynamical imaging can be used to estimate time-averaged images from time-variable data, eliminating many spurious image artifacts that arise when using standard imaging methods. We demonstrate the effectiveness of our techniques using synthetic observations of simulated black hole systems and 7mm Very Long Baseline Array observations of M87, and we show that dynamical imaging is feasible for Event Horizon Telescope observations of Sgr A*.
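The ~20 second gravitational timescale quoted for Sgr A* is simply the light-crossing time of the gravitational radius, $t_{\rm g} = GM/c^{3}$. A minimal sketch (the Sgr A* mass of ~4.15 million solar masses is an assumption taken from published measurements, not from this abstract):

```python
# Sketch: the gravitational timescale t_g = G M / c^3 that makes the
# static-source assumption fail for Sgr A*. Mass value is an assumed
# literature figure, not from the abstract itself.
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # solar mass, kg

def gravitational_timescale(mass_solar):
    """Light-crossing time of the gravitational radius, in seconds."""
    return G * mass_solar * M_SUN / c**3

t_g = gravitational_timescale(4.15e6)
print(f"t_g for Sgr A* ~ {t_g:.0f} s")
```

Since source structure can change on tens of such timescales within a single night, Earth-rotation synthesis over hours mixes visibilities from effectively different images, which is what motivates dynamical imaging.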
Advanced LIGO and Advanced Virgo will be all-sky monitors for merging compact objects within a few hundred Mpc. Finding the electromagnetic counterparts to these events will require an understanding of the transient sky at low redshift (z<0.1). We performed a systematic search for extragalactic, low-redshift, transient events in the XMM-Newton Slew Survey. In a flux-limited sample, we found that highly variable objects comprised 10% of the sample, and that of these, 10% were spatially coincident with cataloged optical galaxies. This led to 4x10^-4 transients per square degree above a flux threshold of 3x10^-12 erg cm^-2 s^-1 [0.2-2 keV] which might be confused with LIGO/Virgo counterparts. This represents the first extragalactic measurement of the soft X-ray transient rate within the Advanced LIGO/Virgo horizon. Our search revealed six objects that were spatially coincident with previously cataloged galaxies, lacked evidence for optical AGNs, displayed high luminosities around 10^43 erg s^-1, and varied in flux by more than a factor of ten when compared with the ROSAT All-Sky Survey. At least four of these displayed properties consistent with previously observed tidal disruption events.
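The practical meaning of the measured surface density is the number of potentially confusing transients expected inside a gravitational-wave sky localization. A minimal sketch using the 4x10^-4 deg^-2 rate from the abstract (the localization areas are illustrative assumptions, not from the abstract):

```python
# Sketch: expected confusing soft X-ray transients per gravitational-wave
# sky localization, from the measured surface density above
# 3e-12 erg cm^-2 s^-1. Localization areas are illustrative.
RATE_PER_SQ_DEG = 4e-4   # transients per square degree, from the search

def expected_confusing_transients(localization_deg2):
    """Expected number of unrelated transients in the error region."""
    return RATE_PER_SQ_DEG * localization_deg2

for area in (10.0, 100.0, 1000.0):   # assumed range of localization areas
    n = expected_confusing_transients(area)
    print(f"{area:7.0f} deg^2 -> {n:.2f} expected confusing transients")
```

Even for localizations of hundreds of square degrees the expected contamination is well below one source, which is what makes soft X-ray follow-up tractable.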
The sensitivity of searches for astrophysical transients in LIGO data is generally limited by the presence of transient, non-Gaussian noise artifacts, which occur at a high enough rate that accidental coincidence across multiple detectors is non-negligible. Furthermore, non-Gaussian noise artifacts typically dominate over the background contributed from stationary noise. These glitches can easily be confused for transient gravitational-wave signals, and their robust identification and removal will help any search for astrophysical gravitational waves. We apply Machine Learning Algorithms (MLAs) to the problem, using data from auxiliary channels within the LIGO detectors that monitor degrees of freedom unaffected by astrophysical signals. The number of auxiliary-channel parameters describing these disturbances may also be extremely large; an area where MLAs are particularly well-suited. We demonstrate the feasibility and applicability of three very different MLAs: Artificial Neural Networks, Support Vector Machines, and Random Forests. These classifiers identify and remove a substantial fraction of the glitches present in two very different data sets: four weeks of LIGO's fourth science run and one week of LIGO's sixth science run. We observe that all three algorithms agree on which events are glitches to within 10% for the sixth science run data, and support this by showing that the different optimization criteria used by each classifier generate the same decision surface, based on a likelihood-ratio statistic. Furthermore, we find that all classifiers obtain similar limiting performance, suggesting that most of the useful information currently contained in the auxiliary channel parameters we extract is already being used.
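The classification setup can be sketched for one of the three MLAs named above, a Random Forest trained on per-event auxiliary-channel features. The feature layout and synthetic data below are entirely illustrative assumptions; the real analysis extracted many parameters per auxiliary channel from the detector data:

```python
# Sketch: glitch vs. clean classification from auxiliary-channel
# features with a Random Forest, one of the three MLAs compared in the
# abstract. Features and labels here are synthetic stand-ins.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Toy features: two summaries (e.g. peak significance and time offset)
# for each of N_AUX auxiliary channels around a candidate event.
N_AUX = 20
n_glitch, n_clean = 500, 500
X_glitch = rng.normal(1.0, 1.0, size=(n_glitch, 2 * N_AUX))  # active aux channels
X_clean = rng.normal(0.0, 1.0, size=(n_clean, 2 * N_AUX))    # quiet aux channels
X = np.vstack([X_glitch, X_clean])
y = np.concatenate([np.ones(n_glitch), np.zeros(n_clean)])   # 1 = glitch

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))
```

Because auxiliary channels are insensitive to astrophysical signals, a veto built from such a classifier removes instrumental glitches without biasing the gravitational-wave search itself.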
Gravitational wave astronomy relies on the use of multiple detectors, so that coincident detections may distinguish real signals from instrumental artifacts, and also so that relative timing of signals can provide the sky position of sources. We show that the comparison of instantaneous time-frequency and time-amplitude maps provided by the Hilbert-Huang Transform (HHT) can be used effectively for relative signal timing of common signals, to discriminate between the case of identical coincident signals and random noise coincidences, and to provide a classification of signals based on their time-frequency trajectories. The comparison is done with a chi-square goodness-of-fit method which includes contributions from both the instantaneous amplitude and frequency components of the HHT to match two signals in the time domain. This approach naturally allows the analysis of waveforms with strong frequency modulation.
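The instantaneous amplitude and frequency that feed the chi-square comparison can be sketched with a plain Hilbert transform. Note this is a simplification: the full HHT first decomposes the data into intrinsic mode functions via empirical mode decomposition; here a single chirp-like test signal is analyzed directly:

```python
# Sketch: instantaneous amplitude and frequency from the analytic
# signal, the quantities the HHT-based chi-square statistic compares.
# A single noiseless chirp stands in for a detector waveform.
import numpy as np
from scipy.signal import hilbert

fs = 4096.0                        # sample rate, Hz (illustrative)
t = np.arange(0, 1.0, 1 / fs)
f0, k = 50.0, 100.0                # chirp sweeping 50 -> 150 Hz
signal = np.sin(2 * np.pi * (f0 * t + 0.5 * k * t**2))

analytic = hilbert(signal)                         # analytic signal
amplitude = np.abs(analytic)                       # instantaneous amplitude
phase = np.unwrap(np.angle(analytic))
freq = np.diff(phase) / (2 * np.pi) * fs           # instantaneous frequency

# Away from edge effects, the recovered frequency tracks f0 + k*t,
# which is why the method handles strong frequency modulation.
mid = slice(len(t) // 4, 3 * len(t) // 4)
err = np.abs(freq[mid] - (f0 + k * t[:-1][mid]))
print(f"median frequency error: {np.median(err):.2f} Hz")
```

With these two time series in hand for each detector, a chi-square over amplitude and frequency differences compares the signals directly in the time domain, as the abstract describes.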