Many engineering problems involve learning hidden dynamics from indirect observations, where the physical processes are described by systems of partial differential equations (PDEs). Gradient-based optimization methods are considered scalable and efficient for learning hidden dynamics. However, one of the most time-consuming and error-prone tasks is deriving and implementing the gradients, especially in systems of PDEs where gradients from different systems must be correctly integrated together. To this end, we present a novel technique, called intelligent automatic differentiation (IAD), that leverages the modern machine learning tool $\texttt{TensorFlow}$ to compute gradients automatically and conduct optimization efficiently. Moreover, IAD allows us to integrate specially designed state adjoint method codes to achieve better performance. Numerical tests demonstrate the feasibility of IAD for learning hidden dynamics in complicated systems of PDEs; additionally, by incorporating custom-built state adjoint method codes in IAD, we significantly accelerate both the forward and inverse simulations.
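To illustrate the mechanism described above, the following is a minimal sketch (not the authors' implementation) of how a gradient-based inversion can combine TensorFlow's automatic differentiation with a hand-written adjoint gradient supplied through `tf.custom_gradient`. The forward map `solve_pde_step`, its cheap analytic surrogate, and the quadratic data-misfit loss are hypothetical stand-ins for an actual PDE solver and observation model.

```python
import tensorflow as tf

@tf.custom_gradient
def solve_pde_step(theta):
    # Forward "PDE solve" (stand-in: a cheap analytic surrogate).
    u = tf.sin(theta) + 0.1 * theta**2

    def adjoint_gradient(du):
        # Hand-derived adjoint/gradient of the forward map, supplied to
        # TensorFlow instead of letting it trace through the solver internals.
        return du * (tf.cos(theta) + 0.2 * theta)

    return u, adjoint_gradient

# Observed data and the unknown physical parameter to be recovered.
u_obs = tf.constant(1.2)
theta = tf.Variable(0.5)

optimizer = tf.keras.optimizers.Adam(learning_rate=0.05)
for _ in range(200):
    with tf.GradientTape() as tape:
        loss = (solve_pde_step(theta) - u_obs) ** 2   # data-misfit objective
    grads = tape.gradient(loss, [theta])              # AD chains through the custom adjoint
    optimizer.apply_gradients(zip(grads, [theta]))
```

In this pattern, TensorFlow assembles the end-to-end gradient of the objective while the user-supplied adjoint code handles the expensive solver step, which is the kind of integration the abstract attributes to IAD.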