Article

A globally convergent version of the Polak-Ribière conjugate gradient method

Mathematical Programming (Impact Factor: 1.98). 09/1997; 78(3):375-391. DOI: 10.1007/BF02614362
Source: DBLP

ABSTRACT In this paper we propose a new line search algorithm that ensures global convergence of the Polak-Ribière conjugate gradient method for the unconstrained minimization of nonconvex differentiable functions. In particular, we show that with this line search every limit point produced by the Polak-Ribière iteration is a stationary point of the objective function. Moreover, we define adaptive rules for the choice of the parameters such that the first stationary point along a search direction is eventually accepted when the algorithm converges to a minimum point with a positive definite Hessian matrix. Under strong convexity assumptions, the known global convergence results are recovered as a special case. From a computational point of view, we may expect that an algorithm incorporating the step-size acceptance rules proposed here will retain the good features of the Polak-Ribière method while avoiding pathological situations.
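The abstract does not reproduce the paper's adaptive acceptance rules, so the following is only a minimal sketch of the standard Polak-Ribière iteration paired with a plain Armijo backtracking line search. The function and parameter names (pr_conjugate_gradient, c1, shrink) are illustrative, and the descent-restart safeguard stands in for the paper's stronger conditions, which are what actually guarantee global convergence.

```python
import numpy as np

def pr_conjugate_gradient(f, grad, x0, tol=1e-6, max_iter=1000,
                          c1=1e-4, shrink=0.5):
    # Polak-Ribiere conjugate gradient with Armijo backtracking.
    # Illustrative sketch only; not the acceptance rules of the paper.
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # first search direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        if g @ d >= 0:
            d = -g  # restart if d is not a descent direction
        # Armijo backtracking: shrink t until sufficient decrease holds
        t, fx, slope = 1.0, f(x), g @ d
        while f(x + t * d) > fx + c1 * t * slope and t > 1e-16:
            t *= shrink
        x_new = x + t * d
        g_new = grad(x_new)
        # Polak-Ribiere coefficient: beta = g_new^T (g_new - g) / ||g||^2
        beta = g_new @ (g_new - g) / (g @ g)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Example: minimize the Rosenbrock function from the usual start point.
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
    200 * (x[1] - x[0]**2),
])
x_star = pr_conjugate_gradient(f, grad, np.array([-1.2, 1.0]))
```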

  • ABSTRACT: In this paper, we propose two modified conjugate gradient methods, which produce a sufficient descent direction at every iteration. The theoretical analysis shows that the algorithms are globally convergent under suitable conditions. The numerical results show that both algorithms are efficient on the given test problems from the Matlab library.
    2014 26th Chinese Control And Decision Conference (CCDC); 05/2014
  • ABSTRACT: In this paper, we propose a new super-memory gradient method for unconstrained optimization problems. Global convergence and a linear convergence rate are proved under mild conditions. The method uses the current and previous iterative information to generate a new search direction and uses a Wolfe line search to define the step-size at each iteration (see the sketch after this list). It has a simple structure and avoids the computation and storage of matrices, which makes it suitable for large-scale optimization problems. Numerical experiments show that the new algorithm is effective in practical computation in many situations.
    2010 Second International Conference on Computational Intelligence and Natural Computing (CINC); 09/2010
  • ABSTRACT: Modified Hestenes–Stiefel, Polak–Ribière–Polyak and Liu–Storey conjugate gradient methods are developed using some new techniques. The proposed methods can generate sufficient descent directions without any line search. Under some conditions, global convergence results for the methods are established when the Wolfe or Armijo line search is used. Moreover, the r-linear convergence rate of the methods is analyzed. Numerical comparisons are given with some existing conjugate gradient methods on the unconstrained optimization problems in the CUTEr library.
    Computational & Applied Mathematics 07/2013; 33(2):333-347. (Impact Factor: 0.41)
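Two of the works above define the step-size via Wolfe line search conditions. As a point of reference, here is a minimal check of the (weak) Wolfe conditions; the helper name satisfies_wolfe and the constants c1, c2 are illustrative defaults, not taken from any of the cited papers.

```python
import numpy as np

def satisfies_wolfe(f, grad, x, d, t, c1=1e-4, c2=0.9):
    # Weak Wolfe conditions for a step of length t along direction d:
    #  (i) sufficient decrease (Armijo): f(x + t d) <= f(x) + c1 t g^T d
    # (ii) curvature:                    grad(x + t d)^T d >= c2 g^T d
    # Typical constants satisfy 0 < c1 < c2 < 1.
    slope = grad(x) @ d  # directional derivative at x
    sufficient_decrease = f(x + t * d) <= f(x) + c1 * t * slope
    curvature = grad(x + t * d) @ d >= c2 * slope
    return sufficient_decrease and curvature
```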
