We consider a free-surface thin film placed on a thermally conductive substrate and exposed to an external heat source, in a setup where heat absorption depends on the local film thickness. Our focus is on modeling film evolution while the film is molten. The evolving film modifies the local heat flow, which in turn may influence the film surface evolution through thermal variation of the film's material properties. The thermal conductivity of the substrate plays an important role in determining the heat flow and the temperature field both in the evolving film and in the substrate itself. To reach a tractable formulation, we use asymptotic analysis to develop a novel thermal model that is accurate, computationally efficient, and accounts for heat flow in both the in-plane and out-of-plane directions. We apply this model to metal films of nanoscale thickness heated and melted by laser pulses, a setup commonly used for self- and directed assembly of various metal geometries via dewetting while the films are in the liquid phase. We find that thermal effects play an important role; in particular, including the temperature dependence of the metal viscosity significantly modifies the time scale of the evolution. In contrast, the Marangoni (thermocapillary) effect turns out to be insignificant in the considered setup.