Quantum control can be implemented by varying the system Hamiltonian. According to the adiabatic theorem, a slowly changing Hamiltonian approximately keeps the system in its instantaneous ground state throughout the evolution, provided the initial state is a ground state. In this paper we treat this process as an interpolation between the initial and final Hamiltonians. We use the mean value of a single operator to measure the distance between the final state and the ideal ground state; this measure can be taken as the error of the adiabatic approximation. We prove that, under certain conditions, this error can be estimated precisely for an arbitrarily given interpolating function. This error estimate can serve as a guideline for inducing adiabatic evolution. According to our calculation, in many cases the adiabatic approximation error is not proportional to the average speed of variation of the system Hamiltonian or to the inverse of the energy gap. In particular, we apply this analysis to an example for which the applicability of the adiabatic theorem is questionable.
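As a minimal sketch of this setup (the symbols $H_i$, $H_f$, $f(s)$, $T$, and the observable $A$ are illustrative notation introduced here, not the paper's own definitions), the interpolating Hamiltonian and the error measure may be written as
\[
H(s) \;=\; \bigl(1 - f(s)\bigr)\,H_i \;+\; f(s)\,H_f, \qquad f(0)=0,\quad f(1)=1,\quad s = t/T,
\]
\[
\delta \;=\; \bigl|\langle \psi(T)\,|\,A\,|\,\psi(T)\rangle \;-\; \langle \phi_g\,|\,A\,|\,\phi_g\rangle\bigr|,
\]
where $|\psi(T)\rangle$ is the state produced by the actual evolution over total time $T$ and $|\phi_g\rangle$ is the ground state of the final Hamiltonian $H_f$.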