Reduced Order Modeling (ROM) for engineering applications has been a major research focus over the past few decades, motivated by the unprecedented physical insight into turbulence offered by high-fidelity CFD. The primary goal of a ROM is to model the key physics and features of a flow field without solving the full Navier-Stokes (NS) equations. This is accomplished by projecting the high-dimensional dynamics onto a low-dimensional subspace, typically using a dimensionality reduction technique such as Proper Orthogonal Decomposition (POD) coupled with Galerkin projection. In this work, we demonstrate a deep learning-based approach to building a ROM from the POD basis of canonical DNS datasets, for turbulent flow control applications. We find that a type of Recurrent Neural Network, the Long Short-Term Memory (LSTM) network, which has primarily been used for problems such as speech modeling and language translation, shows attractive potential for modeling the temporal dynamics of turbulence. Additionally, we introduce the Hurst exponent as a tool to study LSTM behavior for non-stationary data, and uncover useful characteristics that may aid ROM development for a variety of applications.
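As a minimal sketch (not the authors' implementation), the two building blocks named above can be illustrated as follows: extracting a POD basis and its temporal coefficients from a snapshot matrix via the SVD, and estimating the Hurst exponent of a coefficient time series with rescaled-range (R/S) analysis. The snapshot matrix `X` below is random placeholder data, not a DNS dataset, and all function names are illustrative.

```python
import numpy as np

def pod_temporal_coefficients(X, n_modes):
    """POD of a snapshot matrix X (n_points x n_snapshots).

    Returns the leading `n_modes` spatial modes and the temporal
    coefficients a(t) obtained by projecting the mean-subtracted
    snapshots onto those modes.
    """
    X_mean = X.mean(axis=1, keepdims=True)
    X_fluct = X - X_mean                      # subtract the temporal mean field
    U, S, Vt = np.linalg.svd(X_fluct, full_matrices=False)
    modes = U[:, :n_modes]                    # spatial POD modes
    coeffs = modes.T @ X_fluct                # temporal coefficients, (n_modes, n_snapshots)
    return modes, coeffs

def hurst_rescaled_range(series, min_chunk=8):
    """Estimate the Hurst exponent of a 1-D series by R/S analysis.

    The series is split into chunks of increasing size; the slope of
    log(R/S) versus log(chunk size) gives the Hurst exponent H.
    H ~ 0.5 indicates an uncorrelated process, H > 0.5 persistence.
    """
    series = np.asarray(series, dtype=float)
    n = len(series)
    sizes = np.unique(np.logspace(np.log10(min_chunk), np.log10(n // 2), 10).astype(int))
    rs_values = []
    for size in sizes:
        rs_chunk = []
        for i in range(n // size):
            chunk = series[i * size:(i + 1) * size]
            dev = np.cumsum(chunk - chunk.mean())   # cumulative deviation from the mean
            r = dev.max() - dev.min()               # range of cumulative deviations
            s = chunk.std()
            if s > 0:
                rs_chunk.append(r / s)
        rs_values.append(np.mean(rs_chunk))
    slope, _ = np.polyfit(np.log(sizes), np.log(rs_values), 1)
    return slope

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((1000, 400))            # placeholder "snapshots"
    modes, coeffs = pod_temporal_coefficients(X, n_modes=10)
    print("Hurst exponent of first POD coefficient:",
          hurst_rescaled_range(coeffs[0]))
```

In a ROM of the kind described here, the sequences of temporal coefficients would serve as training data for an LSTM that advances them in time, in place of a Galerkin-projected system of ODEs.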