In this paper, we present some new insights into the classical gradient method (GM) and revisit the recently proposed fractional order gradient method (FOGM). It is proven that the FOGM possesses a super convergence capability and a faster convergence rate around the extreme point than the conventional GM. The asymptotic convergence properties of the conventional GM and the FOGM are also discussed. To achieve both a super convergence capability and an even faster convergence rate, a novel switching FOGM is proposed. Moreover, we extend the obtained conclusions to a more general case by introducing the concepts of p-order Lipschitz continuous gradient and p-order strong convexity. Numerous simulation examples are provided to validate the effectiveness of the proposed methods.
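To give a rough feel for the switching idea, the following is a minimal one-dimensional sketch, not the exact scheme analyzed in the paper: it assumes a common fractional-order-style step of the form grad(x_k)|x_k - x_{k-1}|^(1-alpha)/Gamma(2-alpha) and a simple rule that switches to the classical gradient step once successive iterates are close; the objective, step size mu, order alpha, and tolerance switch_tol are illustrative choices.

```python
import math

def grad(x):
    """Gradient of the toy objective f(x) = (x - 3)**2, minimized at x* = 3."""
    return 2.0 * (x - 3.0)

def switching_fogm(x0, mu=0.1, alpha=1.2, switch_tol=1e-3, iters=100):
    """Illustrative switching scheme (an assumption, not the paper's exact method):
    use a fractional-order-style step while successive iterates are far apart,
    then fall back to the classical gradient step near the extreme point."""
    x_prev, x = x0, x0 + 1e-2          # two starting points for the increment term
    c = 1.0 / math.gamma(2.0 - alpha)  # scaling constant in the assumed fractional step
    for _ in range(iters):
        if abs(x - x_prev) > switch_tol:
            # fractional-order-style step (illustrative form)
            step = mu * c * grad(x) * abs(x - x_prev) ** (1.0 - alpha)
        else:
            # classical gradient step near the extreme point
            step = mu * grad(x)
        x_prev, x = x, x - step
    return x

print(switching_fogm(x0=10.0))  # approaches the minimizer x* = 3
```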