(abridged) Observations of Faraday rotation for extragalactic sources probe magnetic fields both inside and outside the Milky Way. Building on our earlier estimate of the Galactic contribution, we set out to estimate the extragalactic contributions. We discuss the problems involved; in particular, we point out that taking the difference between the observed values and the Galactic foreground reconstruction is not a good estimate for the extragalactic contributions. There is a degeneracy between the contributions to the observed values due to extragalactic magnetic fields and those due to observational noise, and we comment on the dangers of over-interpreting an estimate without taking its uncertainty information into account. To overcome these difficulties, we develop an extended reconstruction algorithm based on the assumption that the observational uncertainties are accurately described for a subset of the data, which breaks the degeneracy with the extragalactic contributions. We present a probabilistic derivation of the algorithm and demonstrate its performance using a simulation, yielding a high-quality reconstruction of the Galactic Faraday rotation foreground, a precise estimate of the typical extragalactic contribution, and a well-defined probabilistic description of the extragalactic contribution for each data point. We then apply this reconstruction technique to a catalog of Faraday rotation observations. We vary our assumptions about the data, showing that the dispersion of extragalactic contributions to observed Faraday depths is most likely lower than 7 rad/m^2, in agreement with earlier results, and that the extragalactic contribution to an individual data point is poorly constrained by the data in most cases.
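To make the degeneracy concrete, below is a minimal NumPy sketch (a toy of our own construction, not the algorithm described above): observed Faraday depths are modelled as Galactic foreground plus extragalactic term plus noise, and the residuals left after subtracting even a perfect foreground constrain only the sum of the extragalactic and noise variances, unless the noise level is accurately known for a subset of the sources. All variable names and numbers (sigma_e, sigma_n, the subset size) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model for observed Faraday depths (illustrative numbers, not from the paper):
#   d_i = g_i + e_i + n_i
# with Galactic foreground g_i, extragalactic contribution e_i, and noise n_i.
n_src = 20000
sigma_e = 6.0            # rad/m^2, hypothetical extragalactic dispersion
sigma_n = 12.0           # rad/m^2, hypothetical noise level
g = 30.0 * rng.standard_normal(n_src)   # stand-in for the Galactic foreground
e = sigma_e * rng.standard_normal(n_src)
n = sigma_n * rng.standard_normal(n_src)
d = g + e + n

# Even with a perfect foreground model, the residuals constrain only the
# sum sigma_e^2 + sigma_n^2 -- this is the degeneracy discussed above.
r = d - g
print("residual variance    :", r.var())
print("sigma_e^2 + sigma_n^2:", sigma_e**2 + sigma_n**2)

# If the noise variance is accurately known for a subset of the data,
# the extragalactic dispersion can be estimated by subtraction.
subset = slice(0, 5000)                 # sources with trusted error bars
sigma_e2_est = r[subset].var() - sigma_n**2
print("estimated sigma_e    :", np.sqrt(max(sigma_e2_est, 0.0)))
```
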
Response calibration is the process of inferring how strongly the measured data depend on the signal one is interested in. It is essential for any quantitative signal estimation on the basis of the data. Here, we investigate self-calibration methods for linear signal measurements and a linear dependence of the response on the calibration parameters. The common practice is to augment an external calibration solution, obtained with a known reference signal, with an internal calibration on the unknown measurement signal itself. Contemporary self-calibration schemes try to find a self-consistent solution for signal and calibration by exploiting redundancies in the measurements. This can be understood as maximizing the joint probability of signal and calibration. However, these schemes do not take into account the full uncertainty structure of this joint probability around its maximum. Better schemes -- in the sense of minimal squared error -- can therefore be designed by accounting for asymmetries in the uncertainty of signal and calibration. We argue that at least a systematic correction of the common self-calibration scheme should be applied in many measurement situations in order to properly treat the uncertainty of the signal on which one calibrates. Otherwise the calibration solutions suffer from a systematic bias, which consequently distorts the signal reconstruction. Furthermore, we argue that non-parametric, signal-to-noise filtered calibration should provide more accurate reconstructions than the common bin averages, and we provide a new, improved self-calibration scheme. We illustrate our findings with a simplistic numerical example.
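As a rough illustration of this bias, the sketch below (our own toy, not the scheme derived in the paper) alternates a Wiener-filter signal step with a least-squares gain step for a redundant linear measurement d[i, j] = g[i] * s[j] + n[i, j]; toggling the signal-uncertainty term in the gain step shows the kind of systematic correction argued for above. All symbols and parameter values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy redundant measurement: K gain parameters observe the same signal pixels,
#   d[i, j] = g[i] * s[j] + n[i, j],  with s ~ N(0, S) and n ~ N(0, N).
# All symbols and numbers are illustrative, not the notation of the paper.
K, n_pix, S, N = 10, 2000, 1.0, 0.1
g_true = 1.0 + 0.1 * rng.standard_normal(K)
s = np.sqrt(S) * rng.standard_normal(n_pix)
d = g_true[:, None] * s[None, :] + np.sqrt(N) * rng.standard_normal((K, n_pix))

def selfcal(d, include_uncertainty, n_iter=20):
    """Alternate a Wiener-filter signal step with a least-squares gain step."""
    g = np.ones(K)                              # start from unit gains
    for _ in range(n_iter):
        # Signal step: posterior mean m and variance D given the current gains.
        D = 1.0 / (np.sum(g**2) / N + 1.0 / S)
        m = D * (g @ d) / N
        # Gain step: the classical scheme divides by sum(m^2) alone; adding the
        # posterior variance n_pix * D treats the uncertainty of the signal on
        # which one calibrates and removes the systematic bias.
        denom = np.sum(m**2) + (n_pix * D if include_uncertainty else 0.0)
        g = (d @ m) / denom
    return g

ratio_classical = np.mean(selfcal(d, include_uncertainty=False) / g_true)
ratio_corrected = np.mean(selfcal(d, include_uncertainty=True) / g_true)
print("mean estimated/true gain, classical selfcal    :", ratio_classical)
print("mean estimated/true gain, uncertainty-corrected:", ratio_corrected)
```
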
NIFTY, Numerical Information Field Theory, is a software package designed to enable the development of signal inference algorithms that operate regardless of the underlying spatial grid and its resolution. Its object-oriented framework is written in Python, although it accesses libraries written in Cython, C++, and C for efficiency. NIFTY offers a toolkit that abstracts discretized representations of continuous spaces, fields in these spaces, and operators acting on fields into classes. In this way, the correct normalization of operations on fields is taken care of automatically, without requiring attention from the user. This allows for an abstract formulation and programming of inference algorithms, including those derived within information field theory. Thus, NIFTY permits its users to rapidly prototype algorithms in 1D and then apply the developed code to higher-dimensional, real-world problems. The set of spaces on which NIFTY operates comprises point sets, n-dimensional regular grids, spherical spaces, their harmonic counterparts, and product spaces constructed as combinations of those. The functionality and versatility of the package are demonstrated by a Wiener filter code example that runs successfully without modification regardless of the space on which the inference problem is defined.
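For orientation, the following plain-NumPy sketch solves the kind of Wiener filter problem such a demo addresses, on a one-dimensional regular grid; it does not use NIFTY or reproduce its API, and the power spectrum, noise level, and variable names are illustrative choices of ours.

```python
import numpy as np

rng = np.random.default_rng(2)

# Wiener filter on a 1-D regular grid, written in plain NumPy (illustrative
# stand-in; NIFTY itself would express this independently of the grid choice).
n_pix, noise_var = 1024, 0.5
k = np.fft.rfftfreq(n_pix, d=1.0 / n_pix)     # harmonic modes of the grid
power = 1000.0 / (k + 1.0) ** 3               # assumed signal power spectrum

# Draw a signal with this spectrum by filtering white noise in harmonic space.
s = np.fft.irfft(np.fft.rfft(rng.standard_normal(n_pix)) * np.sqrt(power), n_pix)

# Data model: d = R(s) + n with a unit response and white Gaussian noise.
d = s + np.sqrt(noise_var) * rng.standard_normal(n_pix)

# Wiener filter m = (S^-1 + R^T N^-1 R)^-1 R^T N^-1 d; for a unit response and
# a statistically homogeneous signal it is diagonal in harmonic space.
wiener = power / (power + noise_var)
m = np.fft.irfft(np.fft.rfft(d) * wiener, n_pix)

print("residual variance of the data          :", np.var(d - s))
print("residual variance of the reconstruction:", np.var(m - s))
```
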