Metallicity is one of the crucial factors that determine stellar evolution. To characterize the properties of stellar populations, one needs to know the fraction of stars forming at different metallicities. Knowing how this fraction evolves over time is necessary, e.g., to estimate the occurrence rates of stellar-evolution-related phenomena (e.g. double compact object mergers, gamma-ray bursts). Such theoretical estimates can be confronted with observational limits to validate the assumptions about the evolution of the progenitor system leading to a certain transient. However, to perform this comparison correctly, one needs to know the uncertainties in the assumed star formation history and chemical evolution of the Universe. We combine empirical scaling relations and other observational properties of star-forming galaxies to construct the distribution of the cosmic star formation rate density over metallicity and redshift. We address the uncertainty of this distribution due to currently unresolved questions, such as the absolute metallicity scale, the flattening of the star formation--mass relation, and the low-mass end of the galaxy mass function. We find that the fraction of stellar mass formed at metallicities <10% solar (>solar) since z=3 varies by ~18% (~26%) between the extreme cases considered in our study. This uncertainty stems primarily from the differences between the mass--metallicity relations obtained with different methods. We confront our results with local observations of core-collapse supernovae. Our model is publicly available.
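Schematically, the construction described above can be summarized as follows (a sketch in illustrative notation, not necessarily the exact formulation used in the model; the symbols $\psi$, $\phi$ and $Z_{\mathrm{MZR}}$ stand for assumed forms of the star formation--mass relation, the galaxy mass function and the mass--metallicity relation, respectively):
\[
  \frac{\partial\,\mathrm{SFRD}}{\partial \log Z}(z, Z)
  \;\propto\;
  \int \psi(M_{*}, z)\,\phi(M_{*}, z)\,
  p\!\left(\log Z \,\middle|\, M_{*}, z\right)\,\mathrm{d}M_{*},
\]
where $\psi(M_{*}, z)$ is the star formation rate of a galaxy of stellar mass $M_{*}$ at redshift $z$ from the star formation--mass relation, $\phi(M_{*}, z)$ is the galaxy mass function, and $p(\log Z \mid M_{*}, z)$ is the metallicity distribution implied by the mass--metallicity relation (in the simplest case a delta function at $Z_{\mathrm{MZR}}(M_{*}, z)$). Varying the assumed $\psi$, $\phi$ and $Z_{\mathrm{MZR}}$ within their observational uncertainties yields the spread between the extreme cases quoted above.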