Quantization bias for digital correlators


Abstract

In radio interferometry, the quantization process introduces a bias in the magnitude and phase of the measured correlations, which translates into errors in the measured source brightness and position on the sky, affecting both system calibration and image reconstruction. In this paper we investigate the biasing effect of quantization on the measured correlation between complex-valued inputs with a circularly symmetric Gaussian probability density function (PDF), the typical case in radio astronomy applications. We start by calculating the correlation between the input and the quantization error, and its effect on the quantized variance, first for a real-valued quantizer with a zero-mean Gaussian input and then for a complex-valued quantizer with a circularly symmetric Gaussian input. We demonstrate that this input-error correlation is always negative for a quantizer with an odd number of levels, whereas for an even number of levels it is positive in the low signal level regime. In both cases there is an optimal interval of input signal levels for which the input-error correlation is very weak and the model of additive uncorrelated quantization noise provides a very accurate approximation. We then determine the conditions under which the magnitude and phase of the measured correlation have negligible bias with respect to the unquantized values: the magnitude bias is negligible only if both unquantized inputs are optimally quantized (i.e., when the uncorrelated quantization error model is valid), while the phase bias is negligible when (1) at least one of the inputs is optimally quantized, or (2) the correlation coefficient between the unquantized inputs is small. Finally, we determine the implications of these results for radio interferometry.
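To make these claims concrete, the following Monte Carlo sketch (not from the paper) illustrates the qualitative behaviour described above. It assumes a uniform quantizer with unit step: midtread (a level at zero) for an odd number of levels, midrise (levels straddling zero) for an even number, applied independently to the real and imaginary parts in the complex case. The helper names `quantize` and `cquantize`, the step size, the signal levels, and the choice of a 4-level quantizer are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def quantize(x, n_levels, step):
    """Uniform quantizer (assumed form): midtread for odd n_levels
    (an output level at zero), midrise for even n_levels."""
    half = (n_levels - 1) / 2.0
    idx = np.round(x / step) if n_levels % 2 else np.floor(x / step) + 0.5
    return step * np.clip(idx, -half, half)

def cquantize(z, n_levels, step):
    """Complex quantizer: real and imaginary parts quantized independently."""
    return quantize(z.real, n_levels, step) + 1j * quantize(z.imag, n_levels, step)

N, step = 1_000_000, 1.0

# 1) Sign of the input-error correlation E[x e] for a real zero-mean
#    Gaussian input x and quantization error e = q(x) - x.
for n_levels in (3, 4):
    for sigma in (0.1, 0.5, 1.0):
        x = sigma * rng.standard_normal(N)
        e = quantize(x, n_levels, step) - x
        print(f"{n_levels} levels, sigma/step = {sigma}: E[x e] = {np.mean(x * e):+.4f}")

# 2) Quantized correlation coefficient of two circularly symmetric complex
#    Gaussian inputs, compared with the prediction of the additive
#    uncorrelated-noise model, rho * sigma^2 / (sigma^2 + sigma_e^2).
rho = 0.5
for sigma in (0.1, 0.5):  # per-component standard deviation
    g = lambda: rng.standard_normal(N) + 1j * rng.standard_normal(N)
    z1, z2 = sigma * g(), sigma * g()
    x, y = z1, rho * z1 + np.sqrt(1 - rho**2) * z2  # E[x y*] proportional to rho
    xq, yq = cquantize(x, 4, step), cquantize(y, 4, step)
    rho_q = np.mean(xq * yq.conj()) / np.sqrt(np.mean(abs(xq)**2) * np.mean(abs(yq)**2))
    var_x, var_e = np.mean(abs(x)**2), np.mean(abs(xq - x)**2)
    pred = rho * var_x / (var_x + var_e)
    print(f"sigma/step = {sigma}: |rho_q| = {abs(rho_q):.3f}, model prediction = {pred:.3f}")
```

With these assumed parameters, the first loop should show E[x e] negative at every level for the 3-level (odd) quantizer but positive at low signal level for the 4-level (even) quantizer, with both passing near zero in an intermediate interval; in that same interval the measured quantized correlation coefficient agrees closely with the uncorrelated-noise prediction, while at low signal level it deviates strongly from it.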
