Noise dominates every aspect of near-term quantum computers, rendering it exceedingly difficult to carry out even small computations. In this paper we are concerned with the modelling of noise in NISQ computers. We focus on three error groups that represent the main sources of noise during a computation and present quantum channels that model each source. We engineer a noise model that combines all three noise channels and simulates the evolution of the quantum computer using its calibrated error rates. We run various experiments with our model, showcasing its behaviour compared to other noise models and to an IBM quantum computer. We find that our model approximates the quantum computer's behaviour more closely than the other models do. Following this, we use a genetic algorithm to optimize the parameters of our noise model, bringing its behaviour even closer to that of the quantum computer. Finally, a comparison between the pre- and post-optimization parameters reveals that, according to our model, certain operations can be more or less erroneous than the hardware-calibrated parameters indicate.