Conference Paper

A Globally Convergent Improved Hybrid Method of Quasi Newton and Conjugate Gradient Method

Article
Full-text available
In this paper, two modifications of a spectral quasi-Newton algorithm of BFGS type are proposed. In the first algorithm, named SQNEI, a spectral parameter is introduced at a step of the BFGS algorithm that differs from previously presented algorithms. The second algorithm, SQNEv-Iv, modifies both the position and the value of this parameter. In the SQNEI and SQNEv-Iv methods, the parameter enters the search direction after the approximate Hessian matrix has been updated. Both methods are shown to be effective under some assumptions. Moreover, the sufficient descent property, as well as global and superlinear convergence, is proved for SQNEv-Iv and SQNEI. Both methods are superior to the standard BFGS (QNBFGS) and a previous spectral quasi-Newton method (SQNLC), and SQNEv-Iv outperforms SQNEI when it converges to the solution; the two modified methods thus compete to be the more efficient method in terms of fewer iterations and less CPU time. Finally, numerical results are presented for the four algorithms on a list of test problems with an inexact line search satisfying the Armijo condition.
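As a rough illustration only (the exact parameter placement is specific to SQNEI and SQNEv-Iv and is defined in the paper), a spectral quasi-Newton search direction typically scales the quasi-Newton step by a spectral parameter $\theta_k$:

\[
d_k = -\theta_k B_k^{-1} g_k, \qquad \theta_k = \frac{s_{k-1}^T s_{k-1}}{s_{k-1}^T y_{k-1}},
\]

where $B_k$ is the BFGS approximation of the Hessian, $g_k = \nabla f(x_k)$, $s_{k-1} = x_k - x_{k-1}$, and $y_{k-1} = g_k - g_{k-1}$; the Barzilai-Borwein-type value of $\theta_k$ shown here is only one common choice.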
Article
Full-text available
The standard BFGS method is a famous quasi-Newton method for solving optimization problems. For convex functions, the study of the convergence of BFGS is relatively mature, while its global convergence for nonconvex functions with inexact line searches still remains to be settled. In this paper, a modified weak Wolfe-Powell (MWWP) line search and a projection technique are used. The proposed BFGS method is proven to converge globally for nonconvex functions under appropriate assumptions. Numerical experiments indicate that the proposed algorithm has certain advantages over other similar algorithms and estimates the parameter of the nonlinear Muskingum model effectively.
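For reference, the classical weak Wolfe-Powell conditions on the step size $\alpha_k$ along a descent direction $d_k$ are (the MWWP line search of the paper is a modification of these, whose exact form is not reproduced here):

\[
f(x_k + \alpha_k d_k) \le f(x_k) + \delta \alpha_k g_k^T d_k, \qquad
g(x_k + \alpha_k d_k)^T d_k \ge \sigma g_k^T d_k,
\]

with constants $0 < \delta < \sigma < 1$.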
Article
Full-text available
The traditional BFGS algorithm has been proved very efficient and is convergent for convex nonlinear optimization problems. However, for nonconvex nonlinear optimization problems, it is known that the BFGS algorithm may fail to converge. This paper proposes a robust BFGS algorithm in the sense that the algorithm converges superlinearly to a local minimum under some mild assumptions for both convex and nonconvex nonlinear optimization problems. Numerical tests on the CUTEst test set are reported to demonstrate the merit of the proposed robust BFGS algorithm. The results show that the robust BFGS algorithm is very efficient and effective.
Article
Full-text available
Conjugate gradient methods and quasi-Newton (QN) methods are both well-known approaches for solving unconstrained optimization problems. In this paper, we propose a new conjugate gradient method, denoted the Wan, Asrul and Mustafa (WAM) method. The WAM method is then combined with the QN method to produce a new hybrid search direction, QN-WAM. Based on numerical results, the proposed hybrid method proves to be more efficient than the original quasi-Newton method and other hybrid methods.
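As a loose sketch of how such hybrids are often constructed (the specific QN-WAM combination is defined in the paper itself), a hybrid search direction can mix the quasi-Newton and conjugate gradient directions, for example

\[
d_k =
\begin{cases}
-B_k^{-1} g_k, & k = 0,\\[2pt]
-B_k^{-1} g_k + \beta_k d_{k-1}, & k \ge 1,
\end{cases}
\]

where $B_k$ is the quasi-Newton approximation of the Hessian and $\beta_k$ is the conjugate gradient coefficient (here the WAM coefficient); other hybrids instead switch between, or take convex combinations of, the two directions.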
Preprint
Full-text available
p class="MsoNormal" style="text-align: justify;"> In this work we propose and analyze a hybrid conjugate gradient (CG) method in which the parameter is computed as a linear combination between Hager-Zhang [HZ] and Dai-Liao [DL] parameters. We use this proposed method to modify BFGS method and to prove the positive definiteness and QN-conditions of the matrix. Theoretical trils confirm that the new search directions aredescent directions under some conditions, as well as, the new search directions areglobally convergent using strong Wolfe conditions. The numerical experiments show that the proposed method is promising and outperforms alternative similar CG-methods using Dolan-Mor'e performance profile. </p
Article
Full-text available
In this paper, we consider an unconstrained optimization problem and propose a new family of modified BFGS methods to solve it. As is known, the classic BFGS method is not always globally convergent for nonconvex functions. To overcome this difficulty, we introduce a new modified weak Wolfe-Powell line search technique. Under this new technique, we prove global convergence of the new family of modified BFGS methods, and of the classic BFGS method, for nonconvex functions. Furthermore, all members of this family have error order at least $o(\Vert s \Vert^{5})$. Our results from numerical experiments on 77 standard unconstrained problems indicate that the algorithms developed in this paper are promising and more effective than some similar algorithms.
Article
Full-text available
A hybrid of the quasi-Newton and conjugate gradient methods combines the search directions of the two methods to produce a new algorithm for solving unconstrained optimization problems. In this study, a modified hybrid quasi-Newton method is presented in which a new conjugate gradient coefficient is employed in the search direction. Based on the numerical results, the proposed hybrid method proved to be robust in comparison with the original quasi-Newton method and other hybrid methods.
Article
Full-text available
The conjugate gradient method plays an important role in solving large-scale problems, and the quasi-Newton method is known as one of the most efficient methods for solving unconstrained optimization problems. Therefore, in this paper, a new hybrid of the conjugate gradient method and the quasi-Newton method for solving optimization problems is suggested. The Broyden family formula is used as the Hessian approximation in both the hybrid method and the quasi-Newton method. Our numerical analysis provides strong evidence that the Broyden-CG method is more efficient than the ordinary Broyden method. Furthermore, we also prove that the new algorithm is globally convergent and satisfies the sufficient descent condition.
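For context, the Broyden family of Hessian-approximation updates referred to here is usually written as

\[
B_{k+1} = B_k - \frac{B_k s_k s_k^T B_k}{s_k^T B_k s_k} + \frac{y_k y_k^T}{y_k^T s_k} + \phi_k\,(s_k^T B_k s_k)\, v_k v_k^T, \qquad
v_k = \frac{y_k}{y_k^T s_k} - \frac{B_k s_k}{s_k^T B_k s_k},
\]

with $s_k = x_{k+1} - x_k$, $y_k = g_{k+1} - g_k$, and a scalar $\phi_k$ ($\phi_k = 0$ gives BFGS, $\phi_k = 1$ gives DFP); the particular member used in the Broyden-CG method is specified in the paper.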
Article
Full-text available
In this paper we present a new line search method, known as the HBFGS method, which uses the search direction of the conjugate gradient method together with quasi-Newton updates. The Broyden-Fletcher-Goldfarb-Shanno (BFGS) update is used as an approximation of the Hessian for both methods. The new algorithm is compared with the BFGS method in terms of iteration counts and CPU time. Our numerical analysis provides strong evidence that the proposed HBFGS method is more efficient than the ordinary BFGS method. Besides, we also prove that the new algorithm is globally convergent.
Article
Full-text available
Test functions are important to validate and compare the performance of optimization algorithms. There have been many test or benchmark functions reported in the literature; however, there is no standard list or set of benchmark functions. Ideally, test functions should have diverse properties so that they can be truly useful for testing new algorithms in an unbiased way. For this purpose, we have reviewed and compiled a rich set of 175 benchmark functions for unconstrained optimization problems with diverse properties in terms of modality, separability, and valley landscape. This is by far the most complete set of functions in the literature so far, and it can be expected that this set will be used for validating new optimization algorithms in the future.
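As one well-known example of the kind of benchmark collected in such sets, the generalized Rosenbrock (valley) function is

\[
f(x) = \sum_{i=1}^{n-1} \left[\, 100\,(x_{i+1} - x_i^2)^2 + (1 - x_i)^2 \,\right], \qquad x \in \mathbb{R}^n,
\]

which is nonseparable, has a long narrow curved valley, and attains its global minimum $f(x^{*}) = 0$ at $x^{*} = (1, \dots, 1)$.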
Article
Full-text available
We study the global convergence properties of the restricted Broyden class of quasi-Newton methods, when applied to a convex objective function. We assume that the line search satisfies a standard sufficient decrease condition and that the initial Hessian approximation is any positive definite matrix. We show global and superlinear convergence for this class of methods, except for DFP. This generalizes Powell’s well-known result for the BFGS method. The analysis gives us insight into the properties of these algorithms; in particular it shows that DFP lacks a very desirable self-correcting property possessed by BFGS.
Book
Numerical Optimization presents a comprehensive and up-to-date description of the most effective methods in continuous optimization. It responds to the growing interest in optimization in engineering, science, and business by focusing on the methods that are best suited to practical problems. For this new edition the book has been thoroughly updated throughout. There are new chapters on nonlinear interior methods and derivative-free methods for optimization, both of which are used widely in practice and the focus of much current research. Because of the emphasis on practical methods, as well as the extensive illustrations and exercises, the book is accessible to a wide audience. It can be used as a graduate text in engineering, operations research, mathematics, computer science, and business. It also serves as a handbook for researchers and practitioners in the field. The authors have strived to produce a text that is pleasant to read, informative, and rigorous - one that reveals both the beautiful nature of the discipline and its practical side.
Article
The BFGS update formula is shown to have an important property that is independent of the algorithmic context of the update, and that is relevant to both constrained and unconstrained optimization. The BFGS method for unconstrained optimization, using a variety of line searches, including backtracking, is shown to be globally and superlinearly convergent on uniformly convex problems. The analysis is particularly simple due to the use of some new tools introduced in this paper.
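For reference, one standard way to write the BFGS update discussed here, in its inverse-Hessian form as used with line-search methods, is

\[
H_{k+1} = \left(I - \rho_k s_k y_k^T\right) H_k \left(I - \rho_k y_k s_k^T\right) + \rho_k s_k s_k^T, \qquad \rho_k = \frac{1}{y_k^T s_k},
\]

where $s_k = x_{k+1} - x_k$ and $y_k = \nabla f(x_{k+1}) - \nabla f(x_k)$; the update preserves positive definiteness whenever $y_k^T s_k > 0$.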
Article
Although quasi-Newton algorithms generally converge in fewer iterations than conjugate gradient algorithms, they have the disadvantage of requiring substantially more storage. An algorithm will be described which uses an intermediate (and variable) amount of storage and which demonstrates convergence which is also intermediate, that is, generally better than that observed for conjugate gradient algorithms but not so good as in a quasi-Newton approach. The new algorithm uses a strategy of generating a form of conjugate gradient search direction for most iterations, but it periodically uses a quasi-Newton step to improve the convergence. Some theoretical background for a new algorithm has been presented in an earlier paper; here we examine properties of the new algorithm and its implementation. We also present the results of some computational experience.
Article
Recently, Hager and Zhang (2005) [11] proposed a new conjugate gradient method which generates a sufficient descent direction satisfying $g_k^T d_k \le -\tfrac{7}{8}\,\|g_k\|^2$; this property is independent of the line search used. In this paper, we present a modification of this method such that the sufficient descent direction satisfies $g_k^T d_k = -\|g_k\|^2$; this property is also independent of the line search used. Under appropriate conditions, we prove that the proposed method is globally convergent. Moreover, we give a sufficient condition for the global convergence of the proposed general method. The numerical results show that the proposed method is efficient.
Article
We propose performance profiles (distribution functions for a performance metric) as a tool for benchmarking and comparing optimization software. We show that performance profiles combine the best features of other tools for performance evaluation.
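As a brief summary of the construction (following the standard Dolan-Moré definition), for a solver $s \in S$ and problem $p \in P$ one forms the performance ratio and profile

\[
r_{p,s} = \frac{t_{p,s}}{\min\{t_{p,s'} : s' \in S\}}, \qquad
\rho_s(\tau) = \frac{1}{|P|}\,\bigl|\{\, p \in P : r_{p,s} \le \tau \,\}\bigr|,
\]

where $t_{p,s}$ is the chosen performance metric (e.g. CPU time or iteration count); $\rho_s(\tau)$ is the fraction of problems that solver $s$ solves within a factor $\tau$ of the best solver.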
An introduction optimization test functions collection
  • N Andrei
Combining Quasi-Newton and Steepest Descent Directions
  • L Han
  • M Neumann
A globalization of L-BFGS for nonconvex unconstrained optimization
  • F Mannel
On recent developments in BFGS methods for unconstrained optimization
  • P E Gill
  • J Runnoe
Numerical Optimization: Springer Series in Operations Research
  • Nocedal