We investigate a species-selective cooling process for a trapped $\mathrm{SU}(N)$ Fermi gas that exploits entropy redistribution during adiabatic loading of an optical lattice. Using a high-temperature expansion of the Hubbard model, we show that when a subset $N_A < N$ of the single-atom levels experiences a stronger trapping potential in a certain region of space, the dimple, the cooling improves compared to that of an $\mathrm{SU}(N_A)$ Fermi gas alone. We show that optimal performance is achieved when all atomic levels experience the same potential outside the dimple, and we quantify the cooling for various $N_A$ by evaluating the final entropy densities and temperatures as functions of the initial entropy. Furthermore, considering ${}^{87}\mathrm{Sr}$ and ${}^{173}\mathrm{Yb}$ for specificity, we provide a quantitative discussion of how the state-selective trapping can be achieved with readily available experimental techniques.
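As an illustrative sketch of the kind of expression that enters such entropy calculations (the zeroth-order, atomic-limit term of a high-temperature expansion, not the authors' full result; the symbols $z$, $s$, and $\beta$ are introduced here only for this illustration), the single-site grand partition function and entropy per site of the $\mathrm{SU}(N)$ Hubbard model with on-site interaction $U$ and chemical potential $\mu$ read
\begin{align}
  z &= \sum_{n=0}^{N} \binom{N}{n}\, e^{\beta\mu n - \beta U n(n-1)/2}, \\
  s &= \ln z - \beta\,\frac{\partial \ln z}{\partial \beta},
\end{align}
where $\beta = 1/k_B T$ and the binomial factor counts the degenerate ways of placing $n$ fermions among the $N$ internal levels. Hopping corrections in powers of $\beta t$ would then supply the lattice contributions underlying the quantitative results summarized above.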