We present a new `supercalibration' technique for measuring systematic distortions in the wavelength scales of high-resolution spectrographs. By comparing spectra of `solar twin' stars or asteroids with a reference laboratory solar spectrum, distortions in the standard thorium--argon calibration can be tracked with $\sim$10 m s$^{-1}$ precision over the entire optical wavelength range on scales of both echelle orders ($\sim$50--100~\AA) and entire spectrograph arms ($\sim$1000--3000~\AA). Using archival spectra from the past 20 years we have probed the supercalibration history of the VLT--UVES and Keck--HIRES spectrographs. We find that systematic errors in their wavelength scales are ubiquitous and substantial, with long-range distortions typically varying by $\pm$200 m s$^{-1}$ per 1000~\AA. We apply a simple model of these distortions to simulated spectra that characterize the large UVES and HIRES quasar samples which previously indicated possible evidence for cosmological variations in the fine-structure constant, $\alpha$. The spurious deviations in $\alpha$ produced by the model closely match important aspects of the VLT--UVES quasar results at all redshifts and partially explain the HIRES results, though not self-consistently at all redshifts. That is, the apparent ubiquity, size and general characteristics of the distortions are capable of significantly weakening the evidence for variations in $\alpha$ from quasar absorption lines.
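A long-range wavelength-scale distortion of the kind described above can be sketched as a velocity shift that grows linearly with wavelength. This is only a minimal illustration: the purely linear form, the function name, the pivot wavelength, and the $\pm$200 m s$^{-1}$ per 1000~\AA\ amplitude used as the default slope are assumptions for illustration, not the paper's exact model.

```python
# Minimal sketch of a linear long-range velocity distortion applied to a
# wavelength scale. Assumptions: purely linear form, pivot at 5000 Angstroms,
# slope of 200 m/s per 1000 Angstroms (the typical amplitude quoted above).
C = 299_792_458.0  # speed of light, m/s


def distort(wavelengths_A, slope_ms_per_1000A=200.0, pivot_A=5000.0):
    """Shift each wavelength by v(lambda) = slope * (lambda - pivot) / 1000.

    wavelengths_A: wavelengths in Angstroms
    slope_ms_per_1000A: distortion amplitude in m/s per 1000 Angstroms
    pivot_A: wavelength at which the distortion is zero
    """
    out = []
    for lam in wavelengths_A:
        v = slope_ms_per_1000A * (lam - pivot_A) / 1000.0  # velocity shift, m/s
        out.append(lam * (1.0 + v / C))  # non-relativistic Doppler shift
    return out


# Blueward lines shift to shorter wavelengths, redward lines to longer ones,
# mimicking a miscalibrated wavelength solution.
print(distort([4000.0, 5000.0, 6000.0]))
```

Because different transitions sit at different wavelengths, such a distortion shifts them by different velocities, which is exactly how it can masquerade as (or mask) a shift in $\alpha$ measured from relative line positions.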