We propose a new cosmological test of gravity that uses the observed mass fraction of X-ray emitting gas in massive galaxy clusters. The cluster gas fraction, believed to be a fair sample of the average baryon fraction in the Universe, is a well-understood observable that has previously been used mainly to constrain the background cosmology. In some modified gravity models, such as $f(R)$ gravity, the gas temperature in a massive cluster is determined by the effective mass of that cluster, which can be larger than its true mass. On the other hand, the X-ray luminosity is determined by the true gas density, which in both modified gravity and $\Lambda$CDM models depends mainly on $\Omega_{\rm b}/\Omega_{\rm m}$ and hence the true total cluster mass. As a result, the standard practice of combining gas temperatures and X-ray surface brightnesses of clusters to infer their gas fractions can, in modified gravity models, lead to a larger value of $\Omega_{\rm b}/\Omega_{\rm m}$ (in $f(R)$ gravity this can be $1/3$ larger) than that inferred from other observations such as the CMB. A quick calculation shows that the Hu-Sawicki $n=1$ $f(R)$ model with $|\bar{f}_{R0}| = 3\sim5\times10^{-5}$ is in tension with the gas fraction data of the 42 clusters analysed by Allen et al. (2008). We also discuss the implications for other modified gravity models.
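As a schematic illustration of where the $1/3$ figure comes from (a sketch assuming the fully unscreened limit of $f(R)$ gravity, in which the fifth force enhances gravity by a factor of $4/3$; the subscripts "inferred" and "true" are labels introduced here only for illustration, and the detailed modelling is given in the main text): the effective mass that sets the gas temperature exceeds the true mass by this factor, so a GR-based analysis shifts the cluster-inferred baryon fraction by up to the same amount,
\[
  M_{\rm eff} \simeq \frac{4}{3}\,M_{\rm true}
  \quad\Longrightarrow\quad
  \left(\frac{\Omega_{\rm b}}{\Omega_{\rm m}}\right)_{\rm inferred}
  \simeq \frac{4}{3}\left(\frac{\Omega_{\rm b}}{\Omega_{\rm m}}\right)_{\rm true}
  = \left(1+\frac{1}{3}\right)\left(\frac{\Omega_{\rm b}}{\Omega_{\rm m}}\right)_{\rm true},
\]
i.e. up to $1/3$ larger than the value inferred from observations, such as the CMB, that are insensitive to this effective-mass enhancement.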