To investigate the spatial distribution of the ICM temperature in galaxy clusters in a quantitative way and probe the underlying physics, we analyze the X-ray spectra of a sample of 50 galaxy clusters observed with the Chandra ACIS instrument over the past 15 years, and measure their radial temperature profiles out to $0.45r_{500}$. We construct a physical model that takes into account the effects of gravitational heating, thermal history (such as radiative cooling, AGN feedback, and thermal conduction), and the work done via gas compression, and fit it to the observed temperature profiles with Bayesian regression. In all cases our model provides an acceptable fit at the 68% confidence level. To further validate the model, we select nine clusters that have been observed with both Chandra (out to $\gtrsim 0.3r_{500}$) and Suzaku (out to $\gtrsim 1.5r_{500}$), fit their Chandra spectra with our model, and compare the extrapolation of the best fits with the Suzaku measurements. The extrapolated model profiles agree well with the Suzaku results in seven of the nine clusters; in the remaining two, the difference between model and observation is possibly caused by local thermal substructures. Our study also implies that for most of the clusters the assumption of hydrostatic equilibrium is safe out to at least $0.5r_{500}$, and that any non-gravitational interaction between dark matter and its luminous counterpart is consistent with zero.