The use of mathematical models in the sciences often involves the estimation of unknown parameter values from data. Sloppiness provides information about the uncertainty of this task. In this paper, we develop a precise mathematical foundation for sloppiness and rigorously define its key concepts, such as the `model manifold', in relation to concepts of structural identifiability. We redefine sloppiness conceptually as a comparison between the premetric on parameter space induced by measurement noise and a reference metric. This opens up the possibility of alternative quantifications of sloppiness, beyond the standard use of the Fisher Information Matrix, which assumes that parameter space is equipped with the usual Euclidean metric and that the measurement error is infinitesimal. Applications include parametric statistical models, explicit time-dependent models, and ordinary differential equation models.
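As background for the standard quantification mentioned above, a minimal sketch of the Fisher-Information-based diagnosis of sloppiness (not the redefinition proposed in this paper) is the following. For a statistical model with density $p(x \mid \theta)$, the Fisher Information Matrix is
\[
\mathcal{I}(\theta)_{ij} \;=\; \mathbb{E}_{x \sim p(\cdot \mid \theta)}\!\left[\frac{\partial \log p(x \mid \theta)}{\partial \theta_i}\,\frac{\partial \log p(x \mid \theta)}{\partial \theta_j}\right],
\]
and a model is commonly called sloppy at $\theta$ when the eigenvalues of $\mathcal{I}(\theta)$ span many orders of magnitude, so that some parameter combinations are well constrained by data while others are poorly constrained.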