New Hybrid Conjugate Gradient Method with Global Convergence Properties for Unconstrained Optimization


Abstract

The nonlinear conjugate gradient (CG) method plays an important role in solving large-scale unconstrained optimization problems. In this paper, we propose a new modification of the CG coefficient β_k that satisfies the sufficient descent condition and possesses the global convergence property under the strong Wolfe line search. Numerical results show that the new method is more efficient than the other CG formulas tested.
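To make the setting concrete, the sketch below shows the generic nonlinear CG iteration x_{k+1} = x_k + α_k d_k, d_{k+1} = -g_{k+1} + β_k d_k that the abstract refers to. The paper's new β_k is not given in the abstract, so the code substitutes the classical Dai–Yuan coefficient as a stand-in, and the strong Wolfe step length is found with a crude bisection rather than a production line search; both choices are illustrative assumptions, not the authors' method.

```python
import numpy as np

def strong_wolfe_step(f, grad, x, d, c1=1e-4, c2=0.1, alpha=1.0, max_iter=50):
    """Crude bisection search for a step length satisfying the strong
    Wolfe conditions (illustrative stand-in for a production line search)."""
    f0 = f(x)
    g0 = grad(x).dot(d)          # initial directional derivative (< 0 for descent)
    lo, hi = 0.0, np.inf
    for _ in range(max_iter):
        s = grad(x + alpha * d).dot(d)
        if f(x + alpha * d) > f0 + c1 * alpha * g0:
            hi = alpha           # Armijo (sufficient decrease) fails: step too long
        elif s < c2 * g0:
            lo = alpha           # slope still too negative: step too short
        elif s > -c2 * g0:
            hi = alpha           # slope too positive: overshot the minimizer
        else:
            return alpha         # both strong Wolfe conditions hold
        alpha = 0.5 * (lo + hi) if np.isfinite(hi) else 2.0 * lo
    return alpha

def cg_dai_yuan(f, grad, x0, tol=1e-6, max_iter=500):
    """Nonlinear CG with the Dai-Yuan coefficient
    beta_k = ||g_{k+1}||^2 / (d_k^T (g_{k+1} - g_k))  (stand-in formula)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                       # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        a = strong_wolfe_step(f, grad, x, d)
        x_new = x + a * d
        g_new = grad(x_new)
        beta = g_new.dot(g_new) / d.dot(g_new - g)
        x, g = x_new, g_new
        d = -g + beta * d        # new search direction
    return x
```

For example, on the convex quadratic f(x) = x₁² + 10x₂², `cg_dai_yuan(f, grad, [3.0, -2.0])` drives the gradient norm below the tolerance in a handful of iterations.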

