We address the use of asymptotic incompatibility (AI) to assess the quantumness of a multiparameter quantum statistical model. AI is a recently introduced measure which quantifies the difference between the Holevo and the symmetric logarithmic derivative (SLD) scalar bounds, and can be evaluated using only the SLD operators of the model. First, we analytically evaluate the AI of the most general quantum statistical models involving two-level (qubit) and single-mode Gaussian continuous-variable quantum systems, and prove that AI is a simple monotonic function of the state purity. Then, we numerically investigate the same problem for qudits ($d$-dimensional quantum systems, with $2 < d \leq 4$), showing that, while AI is not in general a function of purity, our numerical evidence is sufficient to conclude that the maximum amount of AI is achievable only for quantum statistical models characterized by a purity larger than $\mu_{\sf min} = 1/(d-1)$. In addition, upon parametrizing qudit states as thermal (Gibbs) states, numerical results suggest that, once the spectrum of the Hamiltonian is fixed, the AI measure is in one-to-one correspondence with the fictitious temperature parameter $\beta$ characterizing the family of density operators. Finally, by studying in detail the definition and properties of the AI measure, we find that: i) given a quantum statistical model, one can readily identify the maximum number of asymptotically compatible parameters; ii) the AI of a quantum statistical model bounds from above the AI of any sub-model obtained by fixing one or more of the original unknown parameters (or functions thereof), leading to possibly useful bounds on the AI of models involving noisy quantum dynamics.
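For the reader's convenience, the following is a minimal sketch of the quantities involved, assuming the standard definitions from the multiparameter quantum estimation literature; the symbols $F$, $D$, $W$, $C^{S}$ and $C^{H}$ below are not introduced explicitly in this abstract.
% Sketch under standard assumptions (not spelled out in the abstract):
% L_mu are the SLD operators of the model rho_theta, F is the SLD quantum Fisher
% information matrix, D the incompatibility (mean Uhlmann curvature) matrix,
% W a positive weight matrix, C^S and C^H the SLD and Holevo scalar bounds.
\begin{align}
  F_{\mu\nu} &= \tfrac{1}{2}\,\mathrm{Tr}\!\big[\rho_{\boldsymbol\theta}\,\{L_\mu, L_\nu\}\big],
  &
  D_{\mu\nu} &= -\tfrac{i}{2}\,\mathrm{Tr}\!\big[\rho_{\boldsymbol\theta}\,[L_\mu, L_\nu]\big],
  \\
  R &= \big\|\, i\, F^{-1} D \,\big\|_\infty \in [0,1],
  &
  C^{S}(W) &\le C^{H}(W) \le (1+R)\, C^{S}(W),
\end{align}
with $C^{S}(W) = \mathrm{Tr}\big[W F^{-1}\big]$; here $R = 0$ corresponds to an asymptotically compatible (classical-like) model, while $R = 1$ signals maximal asymptotic incompatibility.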