A general purpose fitting model for one-dimensional astrometric signals is developed within a maximum likelihood framework, and its performance is evaluated by simulation over a set of realistic image instances. The fit quality is analysed as a function of the number of terms used for the signal expansion, and it is assessed in terms of astrometric error rather than RMS discrepancy with respect to the input signal. The tuning of the function basis to the statistical characteristics of the signal ensemble is discussed, as is the sensitivity of the fit to a priori knowledge of the source spectra. Some implications of the current results for calibration and data reduction are discussed, in particular with respect to Gaia.
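As a minimal sketch of the kind of procedure the abstract describes, the snippet below fits a simulated one-dimensional signal by expanding it over a small function basis and maximising the likelihood of the pixel data. The Gauss-Hermite basis, the Gaussian noise model, the pixel grid, and all names (`hermite_basis`, `neg_log_like`) are illustrative assumptions, not taken from the paper; under Gaussian noise the expansion coefficients are linear and solvable by least squares, leaving the photocentre location as the single nonlinear parameter.

```python
# Illustrative sketch (not the authors' code): maximum-likelihood fit of a
# one-dimensional signal expanded over an assumed Gauss-Hermite basis.
import numpy as np
from numpy.polynomial.hermite import hermval
from scipy.optimize import minimize_scalar

def hermite_basis(x, x0, sigma, n_terms):
    """Gauss-Hermite functions centred at x0: columns of the design matrix."""
    u = (x - x0) / sigma
    cols = []
    for n in range(n_terms):
        c = np.zeros(n + 1)
        c[n] = 1.0  # select the n-th Hermite polynomial
        cols.append(hermval(u, c) * np.exp(-0.5 * u**2))
    return np.column_stack(cols)

def neg_log_like(x0, x, y, sigma, n_terms):
    """Negative log-likelihood under Gaussian noise: for a trial location x0,
    the linear coefficients follow from a least-squares solve."""
    A = hermite_basis(x, x0, sigma, n_terms)
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
    r = y - A @ coeffs
    return 0.5 * np.sum(r**2)

# Simulated "image instance": a smooth signal sampled on a pixel grid, plus noise.
rng = np.random.default_rng(1)
x = np.arange(-10.0, 10.0, 1.0)                      # pixel coordinates
true_x0 = 0.3
truth = 100.0 * np.exp(-0.5 * ((x - true_x0) / 1.5) ** 2)
y = truth + rng.normal(0.0, 1.0, size=x.size)

# Estimate the photocentre by minimising the negative log-likelihood over x0.
res = minimize_scalar(neg_log_like, bounds=(-2.0, 2.0), method="bounded",
                      args=(x, y, 1.5, 4))
print(f"estimated location: {res.x:.4f} (truth {true_x0})")
```

In this setting, the astrometric error quoted in the abstract would be the dispersion of the estimated location over many simulated instances, which is a different quality metric from the RMS of the fit residuals against the input signal.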