NUMERICAL SIMULATIONS OF SOME NONLINEAR CONJUGATE GRADIENT METHODS WITH INEXACT LINE SEARCHES
In this paper, we compare seven nonlinear conjugate gradient methods (Hestenes-Stiefel, Fletcher-Reeves, Polak-Ribière-Polyak, Conjugate Descent, Liu-Storey, Dai-Yuan and Hager-Zhang) under two inexact line searches (the Armijo rule and the strong Wolfe conditions).
The Scilab simulations show that the strong Wolfe line search accelerates the convergence of the methods cited above more than the Armijo rule does, and that the Hager-Zhang and Dai-Yuan methods are, in general, more efficient than the other methods studied in this article.
The results are obtained by implementing the algorithms in Scilab and running them on well-known test problems.
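To illustrate the kind of algorithm being compared, the sketch below implements a nonlinear conjugate gradient iteration with the Polak-Ribière-Polyak update (restarted when it fails to produce a descent direction) and an Armijo backtracking line search. This is a generic illustration in Python rather than the paper's Scilab code; the parameter values (`c1`, `rho`, tolerances) and the Rosenbrock test function are assumptions chosen for the example, not taken from the paper.

```python
import numpy as np

def armijo_backtracking(f, grad, x, d, alpha0=1.0, c1=1e-4, rho=0.5, max_iter=60):
    """Armijo rule: shrink alpha until f(x + alpha d) <= f(x) + c1 alpha g'd."""
    fx = f(x)
    g_dot_d = grad(x) @ d
    alpha = alpha0
    for _ in range(max_iter):
        if f(x + alpha * d) <= fx + c1 * alpha * g_dot_d:
            break
        alpha *= rho
    return alpha

def prp_cg(f, grad, x0, tol=1e-6, max_iter=2000):
    """Nonlinear CG with the PRP+ beta and Armijo backtracking (a sketch)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = armijo_backtracking(f, grad, x, d)
        x = x + alpha * d
        g_new = grad(x)
        # Polak-Ribière-Polyak formula, truncated at zero (PRP+)
        beta = max(g_new @ (g_new - g) / (g @ g), 0.0)
        d = -g_new + beta * d
        if g_new @ d >= 0:  # safeguard: restart with steepest descent
            d = -g_new
        g = g_new
    return x

# Example test problem (assumed for illustration): the Rosenbrock function
def rosen(x):
    return 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2

def rosen_grad(x):
    return np.array([
        -400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0]**2),
    ])

x_star = prp_cg(rosen, rosen_grad, [-1.2, 1.0])
```

Swapping the `beta` formula changes the method (e.g. Fletcher-Reeves uses `(g_new @ g_new) / (g @ g)`), while the line-search routine is where an Armijo rule would be replaced by a strong Wolfe search.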
Keywords: nonlinear conjugate gradient methods, unconstrained optimization, inexact line search.