
A novel approach to visibility-space modelling of interferometric gravitational lens observations at high angular resolution

Added by Devon Powell
Publication date: 2020
Fields: Physics
Language: English





We present a new gravitational lens modelling technique designed to model high-resolution interferometric observations with large numbers of visibilities without the need to pre-average the data in time or frequency. We demonstrate the accuracy of the method using validation tests on mock observations. Using small data sets with $\sim 10^3$ visibilities, we first compare our approach with the more traditional direct Fourier transform (DFT) implementation and direct linear solver. Our tests indicate that our source inversion is indistinguishable from that of the DFT. Our method also infers lens parameters to within 1 to 2 per cent of both the ground truth and DFT, given a sufficiently high signal-to-noise ratio (SNR). When the SNR is as low as 5, both approaches lead to errors of several tens of per cent in the lens parameters and a severely disrupted source structure, indicating that this is related to the SNR and choice of priors rather than the modelling technique itself. We then analyse a large data set with $\sim 10^8$ visibilities and an SNR matching real global Very Long Baseline Interferometry observations of the gravitational lens system MG J0751+2716. The size of the data is such that it cannot be modelled with traditional implementations. Using our novel technique, we find that we can infer the lens parameters and the source brightness distribution with an RMS error of 0.25 and 0.97 per cent, respectively, relative to the ground truth.
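To make the baseline concrete, the sketch below (not the authors' implementation; the source grid, pixel scale and baseline distribution are illustrative assumptions) predicts model visibilities from a pixelised sky model with a direct DFT, the traditional approach referred to above. The dense phase matrix it builds scales as the number of visibilities times the number of source pixels, which is tractable for the $\sim 10^3$-visibility tests but would run to terabytes for $\sim 10^8$ visibilities, illustrating why an approach that avoids forming it is needed.

# A minimal sketch (not the authors' code) of the direct-DFT baseline:
# predict V(u, v) = sum_k I_k exp(-2*pi*i*(u*x_k + v*y_k)) from a pixelised
# sky model. Grid size, pixel scale and baselines are illustrative.
import numpy as np

def dft_visibilities(image, dx, uv):
    """image: (N, N) pixel fluxes; dx: pixel size in radians;
    uv: (M, 2) baseline coordinates in wavelengths."""
    n = image.shape[0]
    coords = (np.arange(n) - n // 2) * dx           # pixel centres in radians
    x, y = np.meshgrid(coords, coords, indexing="xy")
    xy = np.stack([x.ravel(), y.ravel()], axis=1)   # (N*N, 2) sky coordinates
    phase = -2j * np.pi * (uv @ xy.T)               # (M, N*N) dense phase matrix
    return np.exp(phase) @ image.ravel()            # (M,) model visibilities

# Small-test regime: a 64x64 toy source and 10^3 random baselines.
rng = np.random.default_rng(0)
img = np.zeros((64, 64))
img[30:34, 30:34] = 1.0                             # toy compact source
uv = rng.uniform(-1e6, 1e6, size=(1000, 2))         # baselines in wavelengths
vis = dft_visibilities(img, dx=np.deg2rad(0.05 / 3600), uv=uv)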




Read More

D. Rusin, M. Norbury, A. D. Biggs (2001)
We present a series of high resolution radio and optical observations of the CLASS gravitational lens system B1152+199 obtained with the Multi-Element Radio-Linked Interferometer Network (MERLIN), Very Long Baseline Array (VLBA) and Hubble Space Telescope (HST). Based on the milliarcsecond-scale substructure of the lensed radio components and precise optical astrometry for the lensing galaxy, we construct models for the system and place constraints on the galaxy mass profile. For a single galaxy model with surface mass density $\Sigma(r) \propto r^{-\beta}$, we find that $0.95 < \beta < 1.21$ at $2\sigma$ confidence. Including a second deflector to represent a possible satellite galaxy of the primary lens leads to slightly steeper mass profiles.
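As a hedged aside on the power-law profile above (the normalisations kappa0 and r0 below are placeholder values, not results from the modelling), the deflection angle of a circular lens with convergence $\kappa(r) \propto r^{-\beta}$ has a closed form for $\beta < 2$; the isothermal case $\beta = 1$ gives a constant deflection, which is the profile the quoted interval $0.95 < \beta < 1.21$ brackets.

# Hedged sketch of the deflection angle for a circular power-law lens with
# convergence kappa(r) = kappa0 * (r / r0)**(-beta); kappa0 and r0 are
# illustrative normalisations, not values inferred for B1152+199.
import numpy as np

def powerlaw_deflection(r, beta, kappa0=1.0, r0=1.0):
    """alpha(r) = (2 / r) * integral_0^r r' kappa(r') dr'
               = 2 * kappa0 * r0**beta * r**(1 - beta) / (2 - beta), for beta < 2."""
    return 2.0 * kappa0 * r0**beta * r**(1.0 - beta) / (2.0 - beta)

# beta = 1 (isothermal) yields the same deflection at all radii.
print(powerlaw_deflection(np.array([0.5, 1.0, 2.0]), beta=1.0))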
Since the very beginning of astronomy the location of objects on the sky has been a fundamental observational quantity that has been taken for granted. While precise two dimensional positional information is easy to obtain for observations in the electromagnetic spectrum, the positional accuracy of current and near future gravitational wave detectors is limited to between tens and hundreds of square degrees, which makes it extremely challenging to identify the host galaxies of gravitational wave events or to confidently detect any electromagnetic counterparts. Gravitational wave observations provide information on source properties and distances that is complementary to the information in any associated electromagnetic emission and that is very hard to obtain in any other way. Observing systems with multiple messengers thus has scientific potential much greater than the sum of its parts. A gravitational wave detector with higher angular resolution would significantly increase the prospects for finding the hosts of gravitational wave sources and triggering a multi-messenger follow-up campaign. An observatory with arcminute precision or better could be realised within the Voyage 2050 programme by creating a large baseline interferometer array in space and would have transformative scientific potential. Precise positional information of standard sirens would enable precision measurements of cosmological parameters and offer new insights on structure formation; a high angular resolution gravitational wave observatory would allow the detection of a stochastic background and resolution of the anisotropies within it; it would also allow the study of accretion processes around black holes; and it would have tremendous potential for tests of modified gravity and the discovery of physics beyond the Standard Model.
Samir Choudhuri (2014)
We present two estimators to quantify the angular power spectrum of the sky signal directly from the visibilities measured in radio interferometric observations. This is relevant for both the foregrounds and the cosmological 21-cm signal buried therein. The discussion here is restricted to the Galactic synchrotron radiation, the most dominant foreground component after point source removal. Our theoretical analysis is validated using simulations at 150 MHz, mainly for GMRT and also briefly for LOFAR. The Bare Estimator uses pairwise correlations of the measured visibilities, while the Tapered Gridded Estimator uses the visibilities after gridding in the uv plane. The former is very precise, but computationally expensive for large data volumes. The latter has a lower precision, but takes less computation time, which is proportional to the data volume. The latter also allows tapering of the sky response, leading to sidelobe suppression, a useful ingredient for foreground removal. Both estimators avoid the positive bias that arises due to the system noise. We consider amplitude and phase errors of the gain, and the w-term, as possible sources of errors. We find that the estimated angular power spectrum is exponentially sensitive to the variance of the phase errors but insensitive to amplitude errors. The statistical uncertainties of the estimators are affected by both amplitude and phase errors. The w-term does not have a significant effect at the angular scales of our interest. We propose the Tapered Gridded Estimator as an effective tool to observationally quantify both foregrounds and the cosmological 21-cm signal.
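A minimal sketch of the pairwise idea behind the Bare Estimator follows; the binning in $|u|$ and the normalisation are illustrative simplifications, not the estimator's exact definition. Averaging $V_i V_j^*$ over distinct visibility pairs only, with the $i = j$ self terms excluded, is what removes the positive noise bias mentioned above.

# Hedged sketch of a pairwise ("bare") visibility correlation: average
# V_i * conj(V_j) over distinct pairs whose baselines fall in the same |u|
# bin. Dropping the i = j terms avoids the positive noise bias; the binning
# and normalisation here are simplified, illustrative choices.
import numpy as np

def pairwise_correlation(vis, uv, bin_edges):
    """vis: (M,) complex visibilities; uv: (M, 2) baselines in wavelengths;
    bin_edges: |u| bin edges. Returns the mean pair correlation per bin."""
    umag = np.hypot(uv[:, 0], uv[:, 1])
    which = np.digitize(umag, bin_edges) - 1
    out = np.full(len(bin_edges) - 1, np.nan)
    for b in range(len(bin_edges) - 1):
        v = vis[which == b]
        if len(v) < 2:
            continue
        # sum over all pairs i != j, written as |sum V|^2 - sum |V|^2
        pair_sum = np.abs(v.sum())**2 - np.sum(np.abs(v)**2)
        out[b] = pair_sum / (len(v) * (len(v) - 1))
    return out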
Our aim is to evaluate fundamental parameters from the analysis of the electromagnetic spectra of stars. We may use $10^3$-$10^5$ spectra, each spectrum being a vector with $10^2$-$10^4$ coordinates. We thus face the so-called curse of dimensionality. We look for a method to reduce the size of this data space, keeping only the most relevant information. As a reference method, we use principal component analysis (PCA) to reduce dimensionality. However, PCA is an unsupervised method, so its subspace is not consistent with the parameter. We thus tested a supervised method based on Sliced Inverse Regression (SIR), which provides a subspace consistent with the parameter. It also shares analogies with factorial discriminant analysis: the method slices the database along the parameter variation and builds the subspace which maximizes the inter-slice variance, while standardizing the total projected variance of the data. Nevertheless, the performance of SIR was not satisfying in standard usage, because of the non-monotonicity of the unknown function linking the data to the parameter and because of noise propagation. We show that better performance can be achieved by selecting the most relevant directions for parameter inference. Preliminary tests are performed on synthetic pseudo-line profiles plus noise. Using one direction, we show that, compared to PCA, the error associated with SIR is 50% smaller on a non-linear parameter and 70% smaller on a linear parameter. Moreover, using a selected direction, the error is 80% smaller for a non-linear parameter and 95% smaller for a linear parameter.
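The slicing step described above can be written compactly. The sketch below is a generic SIR implementation under illustrative choices (number of slices, regularisation term), not the authors' pipeline: whiten the spectra with their total covariance, slice them along the sorted parameter, and take the leading eigenvectors of the covariance of the slice means.

# A generic, hedged sketch of Sliced Inverse Regression (SIR); the number of
# slices and the small regularisation term are illustrative choices.
import numpy as np

def sir_directions(X, y, n_slices=10, n_directions=1):
    """X: (n_samples, n_features) spectra; y: (n_samples,) stellar parameter.
    Returns n_directions SIR directions in the original feature space."""
    Xc = X - X.mean(axis=0)
    # Whiten with the total covariance (standardises the projected variance).
    cov = np.cov(Xc, rowvar=False) + 1e-8 * np.eye(X.shape[1])
    L = np.linalg.cholesky(cov)
    Z = np.linalg.solve(L, Xc.T).T
    # Slice along the sorted parameter and collect the slice means.
    slices = np.array_split(np.argsort(y), n_slices)
    means = np.array([Z[s].mean(axis=0) for s in slices])
    weights = np.array([len(s) for s in slices]) / len(y)
    # Inter-slice covariance of the means; its top eigenvectors span the
    # subspace that is most informative about the parameter.
    M = (means.T * weights) @ means
    vals, vecs = np.linalg.eigh(M)
    top = vecs[:, np.argsort(vals)[::-1][:n_directions]]
    # Map the directions back to the original (unwhitened) feature space.
    return np.linalg.solve(L.T, top)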
H. T. Intema (2014)
High-resolution astronomical imaging at sub-GHz radio frequencies has been available for more than 15 years, with the VLA at 74 and 330 MHz, and the GMRT at 150, 240, 330 and 610 MHz. Recent developments include wide-bandwidth upgrades for VLA and GMRT, and commissioning of the aperture-array-based, multi-beam telescope LOFAR. A common feature of these telescopes is the necessity to deconvolve the very many detectable sources within their wide fields-of-view and beyond. This is complicated by gain variations in the radio signal path that depend on viewing direction. One such example is phase errors due to the ionosphere. Here I discuss the inner workings of SPAM, a set of AIPS-based data reduction scripts in Python that includes direction-dependent calibration and imaging. Since its first version in 2008, SPAM has been applied to many GMRT data sets at various frequencies. Many valuable lessons were learned, and translated into various SPAM software modifications. Nowadays, semi-automated SPAM data reduction recipes can be applied to almost any GMRT data set, yielding good quality continuum images comparable with (or often better than) hand-reduced results. SPAM is currently being migrated from AIPS to CASA with an extension to handle wide bandwidths. This is aimed at providing users of the VLA low-band system and the upcoming wide-bandwidth GMRT with the necessary data reduction tools.
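As a generic, hedged illustration of why direction-dependent errors cannot be absorbed into a single gain solution (this is not SPAM code; the antenna count and phase scatter are arbitrary), each source direction sees its own ionospheric phase per antenna, so calibrating against one direction leaves residual phase errors towards the others.

# Toy illustration (not SPAM): per-antenna ionospheric phases that differ
# between source directions cannot all be removed by one global solution.
import numpy as np

rng = np.random.default_rng(1)
n_ant, n_dir = 8, 3
phases = rng.normal(scale=0.5, size=(n_ant, n_dir))   # radians, illustrative
i, j = np.triu_indices(n_ant, k=1)                     # baseline antenna pairs
corruption = np.exp(1j * (phases[i] - phases[j]))      # (n_baselines, n_dir)
# Calibrate perfectly against direction 0 only, then inspect the residual
# phase scatter towards each direction.
residual = np.angle(corruption * np.conj(corruption[:, [0]]))
print(residual.std(axis=0))   # ~0 for direction 0, nonzero for the others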