Informed MCMC methods have been proposed as scalable solutions to Bayesian posterior computation on high-dimensional discrete state spaces. We study a class of MCMC schemes called informed importance tempering (IIT), which combines importance sampling with informed local proposals. We obtain spectral gap bounds for IIT estimators, which demonstrate the remarkable scalability of IIT samplers for unimodal target distributions. The theoretical insights developed in this note provide guidance on the choice of informed proposals in model selection and on the use of importance sampling within MCMC methods.
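To make the IIT recipe mentioned above concrete, the sketch below illustrates the generic scheme on a toy binary state space: from the current state, a neighbour is proposed with probability proportional to a balancing function of the target ratio, the move is always accepted, and each visited state receives an importance weight proportional to the reciprocal of the local normalizing constant. The single-bit-flip neighbourhood, the square-root balancing function, and the independent-bit toy target are illustrative assumptions, not choices taken from this note.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy unimodal target on {0,1}^p: pi(x) proportional to exp(x . theta).
# (Illustrative choice; any function giving log pi up to a constant would do.)
p = 10
theta = np.linspace(-2.0, 2.0, p)

def flip_log_ratios(x):
    """log pi(y)/pi(x) for every single-bit-flip neighbour y of x."""
    # Flipping bit j changes the log target by +theta[j] (0 -> 1) or -theta[j] (1 -> 0).
    return np.where(x == 0, theta, -theta)

def iit_sample(n_iter, h=np.sqrt):
    """Informed importance tempering with balancing function h (default h(u) = sqrt(u))."""
    x = rng.integers(0, 2, size=p)
    states, weights = [], []
    for _ in range(n_iter):
        ratios = np.exp(flip_log_ratios(x))  # pi(y)/pi(x) over the neighbourhood
        q = h(ratios)                        # informed proposal weights h(pi(y)/pi(x))
        Z = q.sum()                          # local normalizing constant Z_h(x)
        j = rng.choice(p, p=q / Z)           # always move: no accept/reject step
        states.append(x.copy())
        weights.append(1.0 / Z)              # importance weight: pi(x)/[pi(x) Z_h(x)] up to a constant
        x[j] ^= 1                            # flip the chosen bit
    return np.array(states), np.array(weights)

# Self-normalized IIT estimate of E_pi[number of active bits].
states, w = iit_sample(20_000)
est = np.sum(w * states.sum(axis=1)) / np.sum(w)
exact = np.sum(1.0 / (1.0 + np.exp(-theta)))  # closed form for the independent-bit target
print(f"IIT estimate: {est:.3f}   exact: {exact:.3f}")
```

With a balancing function satisfying h(u) = u h(1/u), the always-accepting informed walk is reversible with respect to pi(x) Z_h(x), so weighting each state by 1/Z_h(x) and self-normalizing recovers expectations under pi.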