The impact of measurement imperfections on quantum metrology protocols has been largely ignored, even though they are inherent to any sensing platform in which the detection process exhibits noise that can neither be eradicated nor translated onto the sensing stage and interpreted as decoherence. In this work, we approach this issue in a systematic manner. Focussing first on pure states, we demonstrate how the form of the quantum Fisher information must be modified to account for noisy detection, and propose tractable methods for its approximate evaluation. We then show that in canonical scenarios involving $N$ probes with local measurements undergoing readout noise, the optimal sensitivity dramatically changes its behaviour depending on whether global or local control operations are allowed to counterbalance the measurement imperfections. In the former case, we prove that the ideal sensitivity (e.g. the Heisenberg scaling) can always be recovered in the asymptotic limit of large $N$, while in the latter the readout noise fundamentally constrains the quantum enhancement of sensitivity to a constant factor. We illustrate our findings with the example of an NV centre measured via the repetitive readout procedure, as well as with schemes involving spin-1/2 probes whose two-outcome measurements are affected by bit-flip errors, for which we identify the input states and control unitary operations sufficient to attain the ultimate asymptotic precision.