Continuously monitored atomic spin ensembles allow, in principle, for real-time sensing of external magnetic fields beyond classical limits. Within the linear-Gaussian regime, thanks to the phenomenon of measurement-induced spin squeezing, they attain a quantum-enhanced scaling of sensitivity both as a function of time, $t$, and of the number of atoms involved, $N$. In our work, we rigorously study how such conclusions, based on Kalman filtering methods, change when inevitable imperfections are taken into account, in the form of collective noise as well as stochastic fluctuations of the field in time. We prove that even an infinitesimal amount of noise prevents the error from being arbitrarily reduced by simply increasing $N$, and forces it to eventually follow a classical-like behaviour in $t$. However, we also demonstrate that, thanks to the presence of noise, in most regimes the model based on a homodyne-like continuous measurement actually achieves the ultimate sensitivity allowed by the decoherence, thus yielding the optimal quantum enhancement. We are able to do so by constructing a noise-induced lower bound on the error that stems from a general method of classically simulating a noisy quantum evolution during which the stochastic parameter to be estimated -- here, the magnetic field -- is encoded. The method naturally extends to schemes beyond the linear-Gaussian regime, in particular to ones involving feedback or active control.
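
To illustrate the linear-Gaussian filtering setting invoked above, the sketch below runs a textbook scalar Kalman filter that tracks a stochastically fluctuating field from a noisy, homodyne-like measurement record. It is a minimal illustration only: the Ornstein-Uhlenbeck field model, the effective coupling `g`, and all noise strengths are assumptions chosen for the example, and the sketch does not reproduce the full atomic-ensemble dynamics or the noise-induced bounds analysed in the paper.

```python
import numpy as np

# Minimal sketch: Kalman filtering of a fluctuating field B(t) from a noisy,
# homodyne-like measurement record in the linear-Gaussian regime.
# All parameters below are illustrative assumptions, not values from the paper.

rng = np.random.default_rng(0)

dt = 1e-3      # time step
steps = 5000   # number of measurement samples
kappa = 1.0    # field relaxation rate (assumed Ornstein-Uhlenbeck model)
q = 0.5        # field diffusion strength (process noise)
g = 10.0       # effective coupling of the field to the measured record
r = 2.0        # measurement-noise variance per unit time

# Simulate the "true" field and the record y_k = g * B_k * dt + noise
B_true = np.zeros(steps)
y = np.zeros(steps)
for k in range(1, steps):
    B_true[k] = B_true[k-1] - kappa * B_true[k-1] * dt + np.sqrt(q * dt) * rng.normal()
    y[k] = g * B_true[k] * dt + np.sqrt(r * dt) * rng.normal()

# Discrete-time Kalman filter for the scalar Ornstein-Uhlenbeck model
A = 1.0 - kappa * dt   # state transition
Q = q * dt             # process-noise variance
H = g * dt             # measurement map
R = r * dt             # measurement-noise variance

B_est, P = 0.0, 1.0    # prior mean and variance of the field estimate
estimates, variances = [], []
for k in range(steps):
    # prediction step
    B_est = A * B_est
    P = A * P * A + Q
    # update step with the new measurement sample
    K = P * H / (H * P * H + R)        # Kalman gain
    B_est = B_est + K * (y[k] - H * B_est)
    P = (1.0 - K * H) * P
    estimates.append(B_est)
    variances.append(P)

print(f"final conditional variance: {variances[-1]:.3e}")
```

In this toy model the conditional variance `P` saturates at a finite, noise-determined value rather than decreasing indefinitely, which mirrors, at the classical level, the abstract's point that noise eventually forces the estimation error onto a classical-like behaviour in time.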