Mismatch between X-ray and emission-weighted temperatures in galaxy clusters: cosmological implications


Abstract

The thermal properties of hydrodynamical simulations of galaxy clusters are usually compared to observations by relying on the emission-weighted temperature T_ew, instead of the spectroscopic X-ray temperature T_spec, which is the quantity obtained from actual observational data. In a recent paper, Mazzotta et al. show that, if the cluster is thermally complex, T_ew fails to reproduce T_spec, and they propose a new formula, the spectroscopic-like temperature T_sl, which approximates T_spec to better than a few per cent. By analyzing a set of hydrodynamical simulations of galaxy clusters, we find that T_sl is lower than T_ew by 20-30 per cent. As a consequence, the normalization of the M-T_sl relation from the simulations is larger than the observed one by about 50 per cent. If the masses of simulated clusters are estimated under the same assumptions of hydrostatic equilibrium and a beta-model gas density profile that are often adopted for observed clusters, then the normalization of the M-T relation decreases by about 40 per cent and its scatter is significantly reduced. Based on this result, we conclude that using the observed M-T relation to infer the amplitude of the power spectrum from the X-ray temperature function could bias sigma_8 low by 10-20 per cent. This may alleviate the tension between the value of sigma_8 inferred from the cluster number density and the values obtained from the cosmic microwave background and large-scale structure.
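
The two temperature definitions compared above are weighted averages over the simulated gas distribution. The following is a minimal sketch, assuming gas elements characterized by electron density n [cm^-3], temperature T [keV], and volume V [cm^3] taken from a simulation snapshot; the bremsstrahlung scaling Lambda(T) proportional to T^(1/2) used for T_ew and the weight w = n^2 T^(-3/4) used for T_sl follow the forms quoted by Mazzotta et al., but the function names and array layout are illustrative assumptions, not the paper's actual analysis pipeline.

    import numpy as np

    def t_ew(n, T, V):
        # Emission-weighted temperature: each gas element is weighted by
        # its X-ray emissivity n^2 * Lambda(T), here with the bremsstrahlung
        # approximation Lambda(T) proportional to T^(1/2).
        w = n**2 * np.sqrt(T) * V
        return np.sum(w * T) / np.sum(w)

    def t_sl(n, T, V):
        # Spectroscopic-like temperature (Mazzotta et al. 2004): weight
        # w = n^2 / T^(3/4), which down-weights hot gas relative to T_ew
        # and mimics the result of a single-temperature spectral fit.
        w = n**2 * T**(-0.75) * V
        return np.sum(w * T) / np.sum(w)

Because the T_sl weight scales as T^(-3/4) while the T_ew weight scales as T^(1/2), cool dense gas contributes relatively more to T_sl; this is why T_sl comes out 20-30 per cent lower than T_ew in thermally complex clusters.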
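
The observational-style mass estimate mentioned above combines hydrostatic equilibrium with an isothermal beta-model, rho_gas proportional to [1 + (r/r_c)^2]^(-3*beta/2), which yields the standard result M(<r) = 3 beta k T r^3 / (G mu m_p (r^2 + r_c^2)). A sketch of this estimator follows; the function name, the mean molecular weight mu = 0.59, and the example parameter values are illustrative assumptions.

    # Physical constants in cgs units
    G = 6.674e-8        # gravitational constant [cm^3 g^-1 s^-2]
    M_P = 1.6726e-24    # proton mass [g]
    KEV = 1.6022e-9     # 1 keV in erg
    MSUN = 1.989e33     # solar mass [g]
    KPC = 3.0857e21     # 1 kpc in cm

    def m_beta(T_keV, beta, r_kpc, rc_kpc, mu=0.59):
        # Hydrostatic mass within radius r for an isothermal beta-model:
        # M(<r) = 3 beta k T r^3 / (G mu m_p (r^2 + r_c^2))
        r = r_kpc * KPC
        rc = rc_kpc * KPC
        kT = T_keV * KEV
        m = 3.0 * beta * kT * r**3 / (G * mu * M_P * (r**2 + rc**2))
        return m / MSUN

    # Example: a 6 keV cluster with beta = 0.7 and r_c = 250 kpc, evaluated
    # at r = 1500 kpc, gives roughly 7e14 solar masses.
    print(m_beta(6.0, 0.7, 1500.0, 250.0))

Because this estimator assumes an isothermal gas in strict hydrostatic equilibrium, applying it to simulated clusters reproduces the biases of the observational procedure, which is what lowers the M-T normalization by about 40 per cent in the comparison described above.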
