
Mehiddin Al-Baali, PhD
Sultan Qaboos University (SQU) · Department of Mathematics and Statistics (DOMAS)
Small- and large-scale unconstrained numerical optimization, algorithms, and software.
About
94 Publications
29,449 Reads
1,308 Citations (since 2017)
Introduction
Limited memory quasi-Newton methods for large-scale unconstrained optimization and nonlinear least squares problems.
Additional affiliations
September 1992 - June 1997
UAE University
Position: Professor (Assistant)
September 1986 - July 2016
Publications (94)
It is well known that conjugate gradient methods are useful for solving large-scale unconstrained nonlinear optimization problems. In this paper, we consider combining the best features of two conjugate gradient methods. In particular, we give a new conjugate gradient method, based on the hybridization of the useful DY (Dai-Yuan) and HZ (Hager-Zha...
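For reference, with gradient \(g_k\), search direction \(d_k\), and \(y_k = g_{k+1} - g_k\), the two parent update parameters are standard in the literature (the paper's particular hybridization is not reproduced here):

\[
\beta_k^{DY} = \frac{\|g_{k+1}\|^2}{d_k^T y_k}, \qquad
\beta_k^{HZ} = \frac{1}{d_k^T y_k}\left(y_k - 2\,\frac{\|y_k\|^2}{d_k^T y_k}\,d_k\right)^{\!T} g_{k+1},
\]

and the next direction is \(d_{k+1} = -g_{k+1} + \beta_k d_k\).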
Very recently, Al-Saidi and Al-Baali (2021) have illustrated the usefulness of the scaled technique for the conjugate gradient methods when introduced to the Fletcher-Reeves method for unconstrained optimization. In this paper, we study the behaviour of the scaled Dai-Yuan method with several choices for the scaling parameter. Some numerical result...
This paper introduces a scaling parameter to the Fletcher-Reeves (FR) nonlinear conjugate gradient method. The main aim is to improve its theoretical and numerical properties when applied with inexact line searches to unconstrained optimization problems. We show that the sufficient descent and global convergence properties of Al-Baali for the FR me...
We consider some diagonal quasi-Newton methods for solving large-scale unconstrained optimization problems. A simple and effective approach for diagonal quasi-Newton algorithms is presented by proposing new updates of diagonal entries of the Hessian. Moreover, we suggest employing an extra BFGS update of the diagonal updating matrix and use its dia...
This book gathers selected, peer-reviewed contributions presented at the Fifth International Conference on Numerical Analysis and Optimization (NAO-V), which was held at Sultan Qaboos University, Oman, on January 6-9, 2020. Each chapter reports on developments in key fields, such as numerical analysis, numerical optimization, numerical linear algeb...
The recent coronavirus disease 2019 (COVID-19) outbreak is a research topic of high importance due to its fast spread and high rate of infection across the world. In this paper, we test certain optimal models for forecasting daily new cases of COVID-19 in Oman. The forecasting is based on solving a certain nonlinear least-squares optimization problem that...
This report provides some information, the programme, and abstracts of the talks related to the Fifth International Conference on Numerical Analysis and Optimization: Theory, Algorithms, Applications, and Technology (NAOV-2020). The conference was held during 6-9 January 2020 and organized by the Department of Mathematics, Sultan Qaboos Univer...
A simple modification technique is introduced to the limited memory BFGS (L-BFGS) method for solving large-scale nonlinear least-squares problems. The L-BFGS method computes a Hessian approximation of the objective function implicitly as the outcome of updating a basic matrix, \(H_k^0\) say, in terms of a number of pair vectors which are available...
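For context, the implicit product \(H_k g\) referred to above is conventionally computed by the two-loop recursion over the stored pairs \((s_i, y_i)\). The following is a generic sketch of that classical scheme with \(H_k^0 = \gamma I\), not the modified method of the paper:

```python
import numpy as np

def lbfgs_two_loop(g, s_list, y_list, gamma):
    """Classical L-BFGS two-loop recursion: returns H_k @ g, where H_k is the
    inverse-Hessian approximation obtained by applying the stored (s_i, y_i)
    pairs (ordered oldest first) to the basic matrix H_k^0 = gamma * I."""
    q = g.copy()
    rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: newest pair to oldest.
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        a = rho * np.dot(s, q)
        alphas.append(a)
        q -= a * y
    r = gamma * q  # apply the basic matrix H_k^0
    # Second loop: oldest pair to newest (alphas reversed to match order).
    for (s, y, rho), a in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        b = rho * np.dot(y, r)
        r += (a - b) * s
    return r  # the search direction is then d_k = -lbfgs_two_loop(g_k, ...)
```

With the common choice \(\gamma = s_{k-1}^T y_{k-1} / y_{k-1}^T y_{k-1}\), this reproduces the standard L-BFGS search direction.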
We introduce a class of positive definite preconditioners for the solution of large symmetric indefinite linear systems or sequences of such systems, in optimization frameworks. The preconditioners are iteratively constructed by collecting information on a reduced eigenspace of the indefinite matrix by means of a Krylov-subspace solver. A spectral...
A simple modification technique is introduced to the limited memory BFGS (L-BFGS) method for solving large-scale nonlinear least-squares problems. The L-BFGS method computes a Hessian approximation of the objective function implicitly as the outcome of updating a basic matrix in terms of a number of pair vectors which are available from most recent...
This special edited book series of Springer Proceedings in Mathematics and Statistics contains 13 selected keynote papers presented at the Fourth International Conference on Numerical Analysis and Optimization, held in January 2017, at Sultan Qaboos University, Muscat, Oman. The conference highlights novel and advanced applications of recent resear...
In this paper, we deal with matrix-free preconditioners for nonlinear conjugate gradient (NCG) methods. In particular, we review proposals based on quasi-Newton updates, and either satisfying the secant equation or a secant-like equation at some of the previous iterates. Conditions are given proving that, in some sense, the proposed preconditioners...
Preface.
This special issue contains selected papers presented at the Fourth International Conference on Numerical Analysis and Optimization: Theory, Methods, Applications and Technology Transfer (NAOIV-2017), held during January 2-5, 2017, at Sultan Qaboos University (SQU), Muscat, Oman. More information is available at https://conference.squ.edu...
In this paper we propose the use of damped techniques within Nonlinear Conjugate Gradient (NCG) methods. Damped techniques were introduced by Powell and recently reproposed by Al-Baali and, till now, only applied in the framework of quasi-Newton methods. We extend their use to NCG methods in large-scale unconstrained optimization, aiming at possibly...
A new conjugate gradient method for unconstrained optimization is proposed by applying the Powell symmetrical technique in a sense to be defined. Using the Wolfe line search conditions, the global convergence property of the method is also obtained, on the basis of the spectral analysis of the conjugate gradient iteration matrix and the Zoutendijk...
We deal with the design of parallel algorithms by using variable partitioning techniques to solve nonlinear optimization problems. We propose an iterative solution method that is very efficient for separable functions, our scope being to discuss its performance for general functions. Experimental results on an illustrative example have suggested so...
Preface.
This Special Issue contains some selected papers presented at the Third International Conference on Numerical Analysis and Optimization: Theory, Methods, Applications and Technology Transfer (NAOIII-2014), held during January 5-9, 2014 at Sultan Qaboos University (SQU), Muscat, Oman. The conference was sponsored by SQU, The Research Counc...
Recently, Al-Baali (2014) has extended the damped-technique in the modified BFGS method of Powell (1978) for Lagrange constrained optimization functions to the Broyden family of quasi-Newton methods for unconstrained optimization. Appropriate choices for the damped-parameter, which maintain the global and superlinear convergence property of these m...
This special edited book series of Springer Proceedings in Mathematics and Statistics contains 13 selected papers presented at the Third International Conference on Numerical Analysis and Optimization: Theory, Methods, Applications and Technology Transfer (NAOIII-2014) held during January 5–9, 2014, at Sultan Qaboos University (SQU), Muscat, Oman....
The Barzilai and Borwein (BB) gradient method has attracted a lot of attention since it performs much better than the classical steepest descent method. In this paper, we analyze a positive BB-like gradient stepsize and discuss its possible uses. Specifically, we present an analysis of the positive stepsize for two-dimensional strictly convex q...
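For reference, with \(s_{k-1} = x_k - x_{k-1}\) and \(y_{k-1} = g_k - g_{k-1}\), the two classical BB stepsizes for the gradient iteration \(x_{k+1} = x_k - \alpha_k g_k\) are

\[
\alpha_k^{BB1} = \frac{s_{k-1}^T s_{k-1}}{s_{k-1}^T y_{k-1}}, \qquad
\alpha_k^{BB2} = \frac{s_{k-1}^T y_{k-1}}{y_{k-1}^T y_{k-1}};
\]

both are positive whenever \(s_{k-1}^T y_{k-1} > 0\) (as on strictly convex quadratics), and \(\alpha_k^{BB2} \le \alpha_k^{BB1}\) by the Cauchy-Schwarz inequality. Which positive BB-like variant the paper analyzes is not detailed in this summary.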
This paper extends the technique used in the damped BFGS method of Powell [Algorithms for nonlinear constraints that use Lagrange functions, Math. Program. 14 (1978), 224–248] to the Broyden family of quasi-Newton methods with applications to unconstrained optimization problems. Appropriate conditions on the damped technique are proposed to enforce...
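Powell's damped technique replaces \(y_k\) in the update by a convex combination with \(B_k s_k\); in its original form (with threshold 0.2) it reads

\[
\hat y_k = \phi_k y_k + (1 - \phi_k) B_k s_k, \qquad
\phi_k = \begin{cases} 1, & s_k^T y_k \ge 0.2\, s_k^T B_k s_k, \\[4pt] \dfrac{0.8\, s_k^T B_k s_k}{s_k^T B_k s_k - s_k^T y_k}, & \text{otherwise}, \end{cases}
\]

so that \(s_k^T \hat y_k \ge 0.2\, s_k^T B_k s_k > 0\) and the updated matrix remains positive definite. The conditions proposed in the paper generalize this choice to the whole Broyden family.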
This third international conference featured some of the recent research developments in theory, algorithms, and advanced applications in engineering, science and medicine to facilitate cross-fertilization among various key sectors of pure scientific and applied knowledge. The participants have presented their novel results and discussed further ne...
This edited volume presents a collection of carefully refereed articles covering the latest advances in Automorphic Forms and Number Theory, that were primarily developed from presentations given at the 2012 “International Conference on Automorphic Forms and Number Theory,” held in Muscat, Sultanate of Oman. The present volume includes original res...
Quasi-Newton methods were introduced by Charles Broyden in 1965 [16] as an alternative to Newton's method for solving nonlinear algebraic systems; in 1970 Broyden [17] extended them to nonlinear unconstrained optimization as a generalization of the DFP method, which was proposed by Davidon in 1959 [28] and investigated by Fletcher and Powell in 19...
The limited-memory L-BFGS method of Nocedal for large-scale unconstrained optimization will be considered. On each iteration of this method a fixed number, say m, of updates is usually employed. Since the number of function and gradient evaluations required to solve an optimization problem is usually decreased, while the cost of updates is increase...
In 2011, we extended the damped BFGS method of Powell (1978), which is useful for solving constrained optimization problems that use Lagrange functions (see for example the books of Fletcher, 1987, and Nocedal and Wright, 1999), to unconstrained optimization. Further extension of this method to the limited memory L-BFGS method of Nocedal for lar...
This paper aims to extend a certain damped technique, suitable for the Broyden–Fletcher–Goldfarb–Shanno (BFGS) method, to the limited memory BFGS method in the case of large-scale unconstrained optimization. It is shown that the proposed technique maintains the global convergence property on uniformly convex functions for the limited memory...
Sultan Qaboos University Journal for Science (Special Issue), Volume 17(2) 2012 (M. Al-Baali and A. Purnama, Eds.). Available online at http://web.squ.edu.om/squjs/Archives/Volume17-part(2).htm
Techniques for obtaining safely positive definite Hessian approximations with self-scaling and modified quasi-Newton updates are combined to obtain 'better' curvature approximations in line search methods for unconstrained optimization. It is shown that this class of methods, like the BFGS method, has the global and superlinear convergence for conv...
The Broyden family of quasi-Newton methods for unconstrained optimization will be considered. It is well-known that if a member of this family is defined sufficiently close to the robust BFGS method, then useful convergence properties are usually obtained. These properties will be extended to other members of the family provided that the updating p...
A class of damped quasi-Newton methods for nonlinear optimization has recently been proposed by extending the damped-technique of Powell for the BFGS method to the Broyden family of quasi-Newton methods. It has been shown that this damped class has the global and superlinear convergence property that a restricted class of 'undamped' methods has for...
We consider a family of damped quasi-Newton methods for solving unconstrained optimization problems. This family resembles that of Broyden with line searches, except that the change in gradients is replaced by a certain hybrid vector before updating the current Hessian approximation. This damped technique modifies the Hessian approximations so that...
Sultan Qaboos University Journal for Science (Special Issue), Volume 17(1) 2012 (M. Al-Baali and A. Purnama, Eds.). Available online at http://web.squ.edu.om/squjs/Archives/Volume17-part(1).htm
In this talk we will extend the technique in the damped BFGS method of Powell (Algorithms for nonlinear constraints that use Lagrange functions, Mathematical Programming, 14: 224–248, 1978) to a family of quasi-Newton methods with applications to unconstrained optimization problems. An appropriate way for defining this damped-technique will be con...
The purpose of this conference is to bring together experts in Optimization and Numerical Analysis in the hope of promoting scientific exchange and discussing possibilities of further cooperation, networking, and mobility of senior and young researchers and research students. The participants presented their results and discussed further ne...
Recently, several modification techniques have been introduced to the line search BFGS method for unconstrained optimization. These modifications replace the vector of the difference in gradients of the objective function, appearing in the BFGS updating formula, by other modified choices so that certain features are obtained. This paper measures th...
The purpose of this conference is to bring together experts in Optimization and Numerical Analysis in the hope of promoting scientific exchange and discussing possibilities of further cooperation, networking, and mobility of senior and young researchers and research students. The participants presented their results and discussed further...
Quasi-Newton methods are among the most practical and efficient iterative methods for solving unconstrained minimization problems. In this paper we give an overview of some of these methods with focus primarily on the Hessian approximation updates and modifications aimed at improving their performance.
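As a concrete instance of the Hessian-approximation updates surveyed, the BFGS formula, with \(s_k = x_{k+1} - x_k\) and \(y_k = g_{k+1} - g_k\), is

\[
B_{k+1} = B_k - \frac{B_k s_k s_k^T B_k}{s_k^T B_k s_k} + \frac{y_k y_k^T}{s_k^T y_k},
\]

which satisfies the secant equation \(B_{k+1} s_k = y_k\) and preserves positive definiteness whenever \(s_k^T y_k > 0\); the DFP update is its dual, obtained by interchanging the roles of \(B_k\) and its inverse (equivalently, of \(s_k\) and \(y_k\)).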
This article uses certain conditions for the global and superlinear convergence of the two-parameter self-scaling Broyden family of quasi-Newton algorithms for unconstrained optimization to derive a wide interval for self-scaling updates. Numerical testing shows that such algorithms not only accelerate the convergence of the (unscaled) methods from...
This paper studies some possible combinations of the best features of the quasi-Newton symmetric rank-one (SR1), BFGS and extra updating BFGS algorithms for solving nonlinear unconstrained optimization problems. These combinations depend on switching between the BFGS and SR1 updates so that certain desirable properties are imposed. The presented nu...
Low storage quasi-Newton algorithms for large-scale nonlinear least-squares problems are considered with "better" modified Hessian approximations defined implicitly in terms of a set of vector pairs. The modification technique replaces one vector of each pair, namely the difference in the gradients of the objective function, by a superior choice in...
This paper studies recent modifications of the limited memory BFGS (L-BFGS) method for solving large scale unconstrained optimization problems. Each modification technique attempts to improve the quality of the L-BFGS Hessian by employing (extra) updates in a certain sense. Because at some iterations these updates might be redundant or worsen the q...
This paper attempts to combine the best features of certain extra-updating BFGS method and self-scaling BFGS method. It describes an algorithm similar to the BFGS method, except that extra self-scaling updates are employed at some iterations. The BFGS Hessian is scaled and updated a number of times, depending on the information of the first-order d...
Some theoretical and numerical properties of a restricted class of self-scaling BFGS methods for unconstrained nonlinear optimization are described. It is illustrated that several of these methods converge globally and superlinearly on convex problems if the scaling parameters satisfy certain conditions. It is also explained, in particular, that an...
This paper considers employing extra updates for the BFGS method for unconstrained optimization. The usual BFGS Hessian is updated a number of times, depending on the information of the first order derivatives, to obtain a new Hessian approximation at each iteration. Two approaches are proposed. One of them has the same properties of global and sup...
In this paper we consider two alternative choices for the factor used to scale the initial Hessian approximation, before updating by a member of the Broyden family of updates for quasi-Newton optimization methods. By extensive computational experiments carried out on a set of standard test problems from the CUTE collection, using efficient implemen...
This paper considers simple modifications of the limited memory BFGS (L-BFGS) method for large scale optimization. It describes algorithms in which alternating ways of re-using a given set of stored difference vectors are outlined. The proposed algorithms resemble the L-BFGS method, except that the initial Hessian appro...
Self-scaling quasi-Newton methods for unconstrained optimization depend upon updating the Hessian approximation by a formula which depends on two parameters (say, \(\tau\) and \(\theta\)) such that \(\tau = 1\), \(\theta = 0\), and \(\theta = 1\) yield the unscaled Broyden family, the BFGS update, and the DFP update, respectively. In previous work, conditions were obtained on these parameters that...
This paper studies the convergence properties of algorithms belonging to the class of self-scaling (SS) quasi-Newton methods for unconstrained optimization. This class depends on two parameters, say \(\tau_k\) and \(\theta_k\), for which the choice \(\tau_k = 1\) gives the Broyden family of unscaled methods, where \(\theta_k = 1\) corresponds to the well known DFP method. We propose s...
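In one common notation (the symbols \(\tau_k\), \(\theta_k\) above are reconstructions of parameters dropped in extraction), the self-scaling Broyden family updates

\[
B_{k+1} = \tau_k \left( B_k - \frac{B_k s_k s_k^T B_k}{s_k^T B_k s_k} + \theta_k\, w_k w_k^T \right) + \frac{y_k y_k^T}{s_k^T y_k}, \qquad
w_k = (s_k^T B_k s_k)^{1/2} \left( \frac{y_k}{s_k^T y_k} - \frac{B_k s_k}{s_k^T B_k s_k} \right),
\]

so that \(\tau_k = 1\) recovers the unscaled Broyden family, within which \(\theta_k = 0\) gives the BFGS update and \(\theta_k = 1\) the DFP update.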
An analysis is given of preconditioned nonlinear conjugate gradient methods in which the preconditioning matrix is the exact Hessian matrix at each iteration (or a nearby matrix). It is shown that the order of convergence of certain preconditioned methods is less than that of Newton's method when exact line searches are used, and an example is given.
This paper is concerned with the choice of parameters in the self-scaling Broyden family of formulae, defined on the basis of minimizing two types of measure functions. One measure is the \(\ell_2\) condition number of a certain matrix, while the other one is suggested by Dennis and Wolkowicz, which has some useful properties and acts like the for...
Simple modifications to the limited memory BFGS (L-BFGS) method for large scale optimization are described. The L-BFGS method resembles the BFGS method, except that at each iteration the Hessian approximation is defined as the outcome of updating a given diagonal matrix, D say, by using the BFGS formula and information from the last user's numbe...
Textbook in Arabic.
In this paper, we extend the switching BFGS/DFP algorithm of Fletcher and the switching BFGS/SR1 algorithm of Al-Baali to a class of switching type algorithms proposed within the Broyden family of quasi-Newton methods for unconstrained optimization. We propose some members of this class, which switch among the BFGS, the SR1 and other desirable meth...
In this paper we consider minimizing a certain measure for self-scaling variable metric algorithms. We show that the minimization yields an explicit formula for a new class of self-scaling algorithms. Experiments with some algorithms are provided. We summarize some numerical results which are required to solve a set of standard test problems. We...
In this paper, we propose new members of the Broyden family of quasi-Newton methods. We develop, on the basis of well-known least-change results for the BFGS and DFP updates, a measure for the Broyden family which seeks to take into account the change in both the Hessian approximation and its inverse. The proposal is then to choose the formula whic...
Recently, there has been an increasing interest in the self-scaling Broyden family of formulae. These formulae are usually defined by replacing the approximate Hessian matrix B by τB for some scaling parameter τ. It is clear that if B is replaced by (1/τ)B in a self-scaling formula, then a member of the Broyden family follows. The author will show t...
The line search subproblem in unconstrained optimization is concerned with finding an acceptable steplength which satisfies certain standard conditions. Prototype algorithms are described which guarantee finding such a step in a finite number of operations. This is achieved by first bracketing an interval of acceptable values and then reducing this...
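The standard acceptability conditions meant here are typically the Wolfe conditions, \(f(x_k + \alpha d_k) \le f(x_k) + \rho\,\alpha\, g_k^T d_k\) and \(g(x_k + \alpha d_k)^T d_k \ge \sigma\, g_k^T d_k\) with \(0 < \rho < \sigma < 1\). Below is a minimal bracket-then-bisect sketch of the idea, assuming these (weak) Wolfe conditions; the paper's prototype algorithms, with their finite-termination guarantees and interval-reduction rules, are more refined:

```python
import numpy as np

def wolfe_line_search(f, grad, x, d, rho=1e-4, sigma=0.9,
                      alpha=1.0, max_iter=50):
    """Minimal bracket/bisection search for a step satisfying the weak
    Wolfe conditions. A sketch of the bracket-and-reduce idea only."""
    lo, hi = 0.0, np.inf
    f0, slope0 = f(x), grad(x) @ d
    assert slope0 < 0.0, "d must be a descent direction"
    for _ in range(max_iter):
        if f(x + alpha * d) > f0 + rho * alpha * slope0:
            hi = alpha                       # step too long: shrink bracket
        elif grad(x + alpha * d) @ d < sigma * slope0:
            lo = alpha                       # step too short: grow bracket
        else:
            return alpha                     # both Wolfe conditions hold
        alpha = 2.0 * alpha if hi == np.inf else 0.5 * (lo + hi)
    return alpha                             # fallback after max_iter
```

Once Armijo first fails, the step is bracketed in [lo, hi] and bisection reduces the interval; until then the step is doubled, mirroring the bracketing phase described above.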
We consider Newton-like line search descent methods for solving non-linear least-squares problems. The basis of our approach is to choose a method, or parameters within a method, by minimizing a variational measure which estimates the error in an inverse Hessian approximation. In one approach we consider sizing methods and choose sizing parameters...
If an inexact line search which satisfies certain standard conditions is used, then it is proved that the Fletcher-Reeves method has a descent property and is globally convergent in a certain sense.
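For reference, the Fletcher-Reeves parameter is \(\beta_k^{FR} = \|g_k\|^2 / \|g_{k-1}\|^2\), and the standard conditions referred to are the strong Wolfe conditions, whose curvature part requires \(|g(x_k + \alpha_k d_k)^T d_k| \le -\sigma\, g_k^T d_k\). As commonly cited, the paper's result is that for \(\sigma < 1/2\) every FR direction is a descent direction and the method is globally convergent in the stated sense.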