The ratio of alpha elements to iron-peak elements, for example [Mg/Fe], is a commonly applied indicator of the galaxy star formation timescale (SFT), since the two groups of elements are produced mainly by different types of supernovae that explode on different timescales. However, it is insufficient to consider only [Mg/Fe] when estimating the SFT, because the [Mg/Fe] yield of a stellar population depends on its metallicity. Therefore, it is possible for galaxies with different SFTs and, at the same time, different total metallicities to have the same [Mg/Fe]. This effect has not been properly taken into account in previous studies. In this study, we assume the galaxy-wide stellar initial mass function (gwIMF) to be canonical and invariant. We demonstrate that our computation code reproduces the SFT estimates of previous studies when only the [Mg/Fe] observational constraint is applied. We then demonstrate that once both the metallicity and the [Mg/Fe] observations are considered, a more severe downsizing relation is required. This means that either low-mass ellipticals have longer SFTs ($> 4$ Gyr for galaxies with masses below $10^{10}$ M$_\odot$) or massive ellipticals have shorter SFTs ($\approx 200$ Myr for galaxies more massive than $10^{11}$ M$_\odot$) than previously thought. This modification increases the difficulty of reconciling such SFTs with other observational constraints. We show that applying different stellar yield modifications does not relieve this formation timescale problem. The unrealistically short SFTs required by [Mg/Fe] and the total metallicity would be prolonged if a variable gwIMF were assumed. Since a systematically varying gwIMF has been suggested by various observations, this could present a natural solution to the problem.
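For clarity, [Mg/Fe] here follows the standard bracket notation for a logarithmic abundance ratio relative to the solar value,
\[
[\mathrm{Mg/Fe}] \equiv \log_{10}\!\left(\frac{N_{\mathrm{Mg}}}{N_{\mathrm{Fe}}}\right)_{\star} - \log_{10}\!\left(\frac{N_{\mathrm{Mg}}}{N_{\mathrm{Fe}}}\right)_{\odot},
\]
where $N_{\mathrm{Mg}}$ and $N_{\mathrm{Fe}}$ are the number densities of magnesium and iron atoms in the stellar population and in the Sun, respectively; the total metallicity quoted above is expressed in an analogous solar-normalised form.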