Nesterov's Accelerated Gradient and Momentum as approximations to Regularised Update Descent


Abstract

We present a unifying framework for adapting the update direction in gradient-based iterative optimization methods. As natural special cases we re-derive classical momentum and Nesterov's accelerated gradient method, lending a new intuitive interpretation to the latter algorithm. We show that a new algorithm, which we term Regularised Gradient Descent, can converge more quickly than either Nesterov's algorithm or the classical momentum algorithm.
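For reference, below is a minimal sketch of the two classical baselines the abstract mentions: heavy-ball momentum and Nesterov's accelerated gradient, written in their standard velocity form. The objective `f`, step size `lr`, and momentum coefficient `mu` are illustrative choices, and the paper's Regularised Gradient Descent algorithm itself is not reproduced here.

```python
import numpy as np

def momentum_step(theta, v, grad_f, lr=0.01, mu=0.9):
    """Classical (heavy-ball) momentum: gradient evaluated at the current point."""
    v_new = mu * v - lr * grad_f(theta)
    return theta + v_new, v_new

def nesterov_step(theta, v, grad_f, lr=0.01, mu=0.9):
    """Nesterov's accelerated gradient: gradient evaluated at the look-ahead point."""
    v_new = mu * v - lr * grad_f(theta + mu * v)
    return theta + v_new, v_new

# Toy example (not from the paper): minimise the quadratic f(x) = 0.5 * x^T A x.
A = np.diag([1.0, 10.0])
grad_f = lambda x: A @ x

theta, v = np.array([5.0, 5.0]), np.zeros(2)
for _ in range(100):
    theta, v = nesterov_step(theta, v, grad_f)
print(theta)  # approaches the minimiser [0, 0]
```

The only difference between the two steps is where the gradient is evaluated: at the current iterate for momentum, and at the look-ahead point `theta + mu * v` for Nesterov's method, which is the distinction the paper's framework reinterprets.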
