A merger of two optimization frameworks is introduced: SEquential Subspace OPtimization (SESOP) and MultiGrid (MG) optimization. At each iteration of the algorithm, search directions implied by the coarse-grid correction (CGC) process of MG are added to the low-dimensional search spaces of SESOP, which include the (preconditioned) gradient and search directions involving the previous iterates (so-called history). The resulting accelerated technique is called SESOP-MG. The asymptotic convergence factor of the two-level version of SESOP-MG (dubbed SESOP-TG) is studied via Fourier mode analysis for linear problems, i.e., optimization of quadratic functionals. Numerical tests on linear and nonlinear problems demonstrate the effectiveness of the approach.
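To make the iteration concrete, the following is a minimal sketch (not the authors' implementation) of one possible two-level SESOP-TG step for the quadratic case f(x) = (1/2) x^T A x - b^T x. The test problem (1D Poisson), the prolongation P (linear interpolation), the Galerkin coarse operator P^T A P, and the choice of subspace (CGC direction, negative gradient, and one history direction x_k - x_{k-1}) are illustrative assumptions; in the quadratic case the subspace minimization reduces to a small linear system.

```python
# Illustrative sketch of a two-level SESOP-TG iteration for a quadratic
# functional; problem setup and parameter choices are assumptions, not
# the paper's configuration.
import numpy as np

def poisson_1d(n):
    """Fine-grid 1D Poisson matrix (illustrative test problem)."""
    return 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)

def linear_interpolation(n_fine):
    """Prolongation P from the coarse grid, assuming n_fine = 2*n_coarse + 1."""
    n_coarse = (n_fine - 1) // 2
    P = np.zeros((n_fine, n_coarse))
    for j in range(n_coarse):
        i = 2 * j + 1            # fine-grid index of coarse point j
        P[i, j] = 1.0
        P[i - 1, j] += 0.5
        P[i + 1, j] += 0.5
    return P

def sesop_tg(A, b, x0, n_iter=20):
    """Sketch of two-level SESOP-MG (SESOP-TG) for f(x) = 0.5 x'Ax - b'x."""
    P = linear_interpolation(A.shape[0])
    A_c = P.T @ A @ P                              # Galerkin coarse-grid operator
    x, x_prev = x0.copy(), x0.copy()
    for _ in range(n_iter):
        r = b - A @ x                              # residual = negative gradient
        d_cgc = P @ np.linalg.solve(A_c, P.T @ r)  # coarse-grid correction direction
        dirs = [d_cgc, r]                          # CGC + (negative) gradient
        if np.linalg.norm(x - x_prev) > 0:
            dirs.append(x - x_prev)                # history direction
        D = np.column_stack(dirs)
        # Exact minimization of f over the affine subspace x + span(D):
        # (D^T A D) alpha = D^T r
        alpha = np.linalg.lstsq(D.T @ A @ D, D.T @ r, rcond=None)[0]
        x_prev, x = x, x + D @ alpha
    return x

if __name__ == "__main__":
    n = 2**7 - 1
    A = poisson_1d(n)
    b = np.random.default_rng(0).standard_normal(n)
    x = sesop_tg(A, b, np.zeros(n))
    print("residual norm:", np.linalg.norm(b - A @ x))
```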