We use a set of 45 simulated clusters with a wide mass range ($8\times 10^{13} < M_{500}~[\mathrm{M}_{\odot}]~< 2\times 10^{15}$) to investigate the effect of varying hydrodynamics flavours on cluster mass estimates. The cluster zooms were simulated using the same cosmological models as the BAHAMAS and C-EAGLE projects, leading to differences in both the hydrodynamic solvers and the subgrid physics, but still producing clusters that broadly match observations. At the same mass resolution as BAHAMAS, for the most massive clusters ($M_{500} > 10^{15}$ M$_{\odot}$), we find that changes in the SPH method produce the greatest differences in the final halo, while the subgrid models dominate at lower masses. By calculating the mass of all the clusters using different permutations of the pressure, temperature and density profiles, constructed from either the true simulated data or mock spectroscopic data, we find that the spectroscopic temperature introduces a bias in the hydrostatic mass estimates that increases with cluster mass, regardless of the SPH flavour used. For the most massive clusters, the mass estimated from spectroscopic density and temperature profiles can be as low as 50 per cent of the true mass, compared with $\sim 90$ per cent for low-mass clusters. When a correction for non-thermal pressure is included, the spectroscopic hydrostatic mass estimates are less biased on average and the mass dependence of the bias is reduced, although the scatter in the measurements increases.
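For context, the hydrostatic mass estimates referred to above follow from the standard assumption of hydrostatic equilibrium; a minimal statement of the estimator (the specific profile parametrisations and the form of the non-thermal pressure correction adopted in this work are not reproduced here) is
\begin{equation}
M_{\rm HSE}(<r) = -\frac{r^{2}}{G\,\rho_{\rm gas}(r)}\,\frac{\mathrm{d}P_{\rm therm}}{\mathrm{d}r},
\end{equation}
which, using the ideal-gas relation $P_{\rm therm} = k_{\rm B} T \rho_{\rm gas} / (\mu m_{\rm p})$, can be written in terms of the density and temperature profiles as
\begin{equation}
M_{\rm HSE}(<r) = -\frac{k_{\rm B} T(r)\, r}{G \mu m_{\rm p}} \left[ \frac{\mathrm{d}\ln \rho_{\rm gas}}{\mathrm{d}\ln r} + \frac{\mathrm{d}\ln T}{\mathrm{d}\ln r} \right].
\end{equation}
A non-thermal pressure correction of the kind described amounts to replacing $P_{\rm therm}$ with the total pressure $P_{\rm therm} + P_{\rm nth}$ in the first expression.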