Warren L Hare

  • Ph.D.
  • Professor (Full) at University of British Columbia - Okanagan

About

152
Publications
47,824
Reads
3,124
Citations
Current institution
University of British Columbia - Okanagan
Current position
  • Professor (Full)
Additional affiliations
January 2012 - present
University of British Columbia - Okanagan
Education
September 2000 - April 2004
Simon Fraser University
Field of study
  • Mathematics (Optimization)
September 1998 - March 2000
University of Alberta
Field of study
  • Mathematics (Optimization)

Publications

Publications (152)
Preprint
Full-text available
A polytope is inscribable if there is a realization where all vertices lie on a sphere. In this paper, we provide a necessary and sufficient condition for a polytope to be inscribable. Based on this condition, we characterize the problem of determining inscribability as a minimum rank optimization problem using slack matrices. We propose an...
Article
Full-text available
Model-based derivative-free optimization (DFO) methods are an important class of DFO methods that are known to struggle with solving high-dimensional optimization problems. Recent research has shown that incorporating random subspaces into model-based DFO methods has the potential to improve their performance on high-dimensional problems. However,...
Preprint
Full-text available
Gradient approximations are a class of numerical approximation techniques that are of central importance in numerical optimization. In derivative-free optimization, most of the gradient approximations, including the simplex gradient, centred simplex gradient, and adapted centred simplex gradient, are in the form of simplex derivatives. Owing to mac...
Article
Full-text available
Background. Modern radiation therapy technologies aim to enhance radiation dose precision to the tumor and utilize hypofractionated treatment regimens. Verifying the dose distributions associated with these advanced radiation therapy treatments remains an active research area due to the complexity of delivery systems and the lack of suitable three-...
Article
Full-text available
The cosine measure was introduced in 2003 to quantify the richness of a finite positive spanning set of directions in the context of derivative-free directional methods. A positive spanning set is a set of vectors whose nonnegative linear combinations span the whole space. The present work extends the definition of the cosine measure. In particular, t...
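The cosine measure just described can be probed numerically. The sketch below is an illustrative Monte Carlo estimate under stated assumptions (directions stored as rows, `numpy` available; `cosine_measure_estimate` is a hypothetical helper, not a method from the paper). Sampling unit vectors only bounds the true minimum from above.

```python
import numpy as np

def cosine_measure_estimate(D, n_samples=100_000, seed=0):
    """Monte Carlo over-estimate of the cosine measure of a set of
    directions D (rows): cm(D) = min over unit u of max over d of cos(u, d).
    Sampling u only gives an upper bound on the true minimum."""
    rng = np.random.default_rng(seed)
    D = np.asarray(D, dtype=float)
    Dn = D / np.linalg.norm(D, axis=1, keepdims=True)   # normalize directions
    U = rng.normal(size=(n_samples, D.shape[1]))
    U /= np.linalg.norm(U, axis=1, keepdims=True)       # random unit vectors
    # For each sampled u, its best-aligned direction in D; then the worst u.
    return (U @ Dn.T).max(axis=1).min()

# Minimal positive basis {e1, e2, -e1-e2} in the plane.
D = [[1, 0], [0, 1], [-1, -1]]
print(cosine_measure_estimate(D))
```

For this set the estimate lands near cos(67.5°) ≈ 0.383, the cosine of half the largest angular gap between consecutive directions.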
Preprint
Full-text available
Model-based derivative-free optimization (DFO) methods are an important class of DFO methods that are known to struggle with solving high-dimensional optimization problems. Recent research has shown that incorporating random subspaces into model-based DFO methods has the potential to improve their performance on high-dimensional problems. However,...
Article
Full-text available
Modern advancements in radiation therapy require paralleled advancements in the dosimetric tools used to verify dose distributions. Optical computed tomography (CT) imaged radiochromic gel dosimeters provide comprehensive, tissue equivalent, 3D dosimetric information with high spatial resolution and low imaging times. Traditional CT image reconstru...
Article
Full-text available
Optical computed tomography (CT) is one of the leading modalities for imaging gel dosimeters. In previous research, it was shown that a design could significantly reduce the volume of the refractive index baths that are commonly found in optical CT systems. The proposed scanner has been manufactured and is in the process of being commissioned. The rays...
Article
Full-text available
Objective: Optical computed tomography (CT) is one of the leading modalities for imaging gel dosimeters used in the verification of complex radiotherapy treatments. In previous work, a novel fan-beam optical CT scanner design was proposed that could significantly reduce the volume of the refractive index baths that are commonly found in optical CT...
Preprint
Derivative-free algorithms seek the minimum of a given function based only on function values queried at appropriate points. Although these methods are widely used in practice, their performance is known to worsen as the problem dimension increases. Recent advances in developing randomized derivative-free techniques have tackled this issue by worki...
Article
Full-text available
This work presents a novel matrix-based method for constructing an approximation Hessian using only function evaluations. The method requires less computational power than interpolation-based methods and is easy to implement in matrix-based programming languages such as MATLAB. As only function evaluations are required, the method is suitable for u...
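The paper's matrix-based construction is only summarized above, but the baseline it improves on can be sketched: a plain central-difference Hessian built from function evaluations alone (a generic textbook formula, not the paper's method).

```python
import numpy as np

def fd_hessian(f, x, h=1e-4):
    """Generic central-difference Hessian approximation using only
    function values: H[i, j] approximates d^2 f / (dx_i dx_j)."""
    x = np.asarray(x, dtype=float)
    n = x.size
    H = np.empty((n, n))
    E = np.eye(n) * h                      # E[i] is the step h*e_i
    for i in range(n):
        for j in range(n):
            H[i, j] = (f(x + E[i] + E[j]) - f(x + E[i] - E[j])
                       - f(x - E[i] + E[j]) + f(x - E[i] - E[j])) / (4 * h * h)
    return H

# Quadratic f(x) = x0^2 + 3*x0*x1 + 5*x1^2 has constant Hessian [[2,3],[3,10]].
f = lambda x: x[0]**2 + 3*x[0]*x[1] + 5*x[1]**2
print(fd_hessian(f, [1.0, -2.0]))   # ≈ [[2, 3], [3, 10]]
```

This costs 4n² evaluations; the appeal of a matrix-based construction is doing comparable work with less per-entry bookkeeping.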
Article
The centred simplex gradient (CSG) is a popular gradient approximation technique in derivative-free optimization. Its computation requires a perfectly symmetric set of sample points and is known to provide an accuracy of $\mathcal{O}(\varDelta^2)$, where $\varDelta$ is the radius of the sampling set. In this paper, we consider the situation wher...
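A minimal sketch of the centred simplex gradient over a symmetric sample set (assuming an invertible direction matrix; this is the standard construction, not the paper's extension):

```python
import numpy as np

def centred_simplex_gradient(f, x0, S):
    """Centred simplex gradient over the symmetric sample set {x0 +/- s_i}.
    S holds the directions s_i as columns and must be invertible; the
    approximation error is O(Delta^2) in the sampling radius Delta."""
    x0 = np.asarray(x0, dtype=float)
    d = np.array([(f(x0 + s) - f(x0 - s)) / 2.0 for s in S.T])
    return np.linalg.solve(S.T, d)          # solve S^T g = d

f = lambda x: np.sin(x[0]) + x[0] * x[1]**2
x0 = np.array([0.5, 1.0])
g = centred_simplex_gradient(f, x0, 0.01 * np.eye(2))
print(g)   # close to the true gradient (cos(0.5) + 1, 1.0)
```

With radius 0.01 the components match the analytic gradient to roughly four digits, consistent with the quadratic error bound.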
Article
Full-text available
Nonconvex minimization algorithms often benefit from the use of second-order information as represented by the Hessian matrix. When the Hessian at a critical point possesses negative eigenvalues, the corresponding eigenvectors can be used to search for further improvement in the objective function value. Computing such eigenpairs can be computation...
Preprint
Full-text available
Positive spanning sets span a given vector space by nonnegative linear combinations of their elements. These have attracted significant attention in recent years, owing to their extensive use in derivative-free optimization. In this setting, the quality of a positive spanning set is assessed through its cosine measure, a geometric quantity that exp...
Article
Full-text available
Background Gel dosimeters are a potential tool for measuring the complex dose distributions that characterize modern radiotherapy. A prototype tabletop solid‐tank fan‐beam optical CT scanner for readout of gel dosimeters was recently developed. This scanner does not have a straight raypath from source to detector, thus images cannot be reconstructe...
Article
Full-text available
A derivative-free optimization (DFO) method is an optimization method that does not make use of derivative information in order to find the optimal solution. It is advantageous for solving real-world problems in which the only information available about the objective function is the output for a specific input. In this paper, we develop the framew...
Article
Full-text available
This paper examines a calculus-based approach to building model functions in a derivative-free algorithm. This calculus-based approach can be used when the objective function considered is defined via more than one blackbox. Two versions of a derivative-free trust-region method are implemented. The first version builds model functions by using a ca...
Article
Full-text available
The properties of positive bases make them a useful tool in derivative-free optimization and an interesting concept in mathematics. The notion of the cosine measure helps to quantify the quality of a positive basis. It provides information on how well the vectors in the positive basis uniformly cover the space considered. The number of vectors in a...
Article
Full-text available
This work investigates the asymptotic behaviour of the gradient approximation method called the generalized simplex gradient (GSG). This method has an error bound that at first glance seems to tend to infinity as the number of sample points increases, but with some careful construction, we show that this is not the case. For functions in finite dim...
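The generalized simplex gradient itself has a compact least-squares form; a sketch, assuming `numpy` and directions stored as columns:

```python
import numpy as np

def generalized_simplex_gradient(f, x0, S):
    """Generalized simplex gradient: a least-squares gradient estimate via
    the Moore-Penrose pseudoinverse, valid for any number of sample
    directions (columns of S), over- or under-determined alike."""
    x0 = np.asarray(x0, dtype=float)
    d = np.array([f(x0 + s) - f(x0) for s in S.T])
    return np.linalg.pinv(S.T) @ d

rng = np.random.default_rng(1)
f = lambda x: x @ np.array([2.0, -1.0, 0.5]) + 3.0   # affine: exact recovery
S = 0.1 * rng.normal(size=(3, 7))                    # 7 directions in R^3
print(generalized_simplex_gradient(f, np.zeros(3), S))
# recovers the slope (2, -1, 0.5) of the affine function
```

For an affine objective the recovery is exact whenever the directions span the space, regardless of how many sample points are used.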
Article
Model-based methods are popular in derivative-free optimization (DFO). In most of them, a single model function is built to approximate the objective function. This is generally based on the assumption that the objective function is one black box. However, some real-life and theoretical problems show that the objective function may consist of sever...
Preprint
Full-text available
A derivative-free optimization (DFO) method is an optimization method that does not make use of derivative information in order to find the optimal solution. It is advantageous for solving real-world problems in which the only information available about the objective function is the output for a specific input. In this paper, we develop the framew...
Preprint
Full-text available
This paper examines a calculus-based approach to build model functions in a derivative-free algorithm. This calculus-based approach can be used when the objective function considered is defined via more than one blackbox. Two versions of a derivative-free trust-region method are implemented. The first version builds model functions using a calculus...
Preprint
Full-text available
Advancements in radiation therapy technologies are characterized by personalized treatment plans and increased conformality of the radiation dose to the tumour. Gel dosimeters are a potential tool for measuring these complex dose distributions. Here we develop a method to reduce the storage size of optical CT system matrices through use of polar c...
Preprint
Full-text available
This work presents a novel matrix-based method for constructing an approximation Hessian using only function evaluations. The method requires less computational power than interpolation-based methods and is easy to implement in matrix-based programming languages such as MATLAB. As only function evaluations are required, the method is suitable for u...
Preprint
Full-text available
The centred simplex gradient (CSG) is a popular gradient approximation technique in derivative-free optimization. Its computation requires a perfectly symmetric set of sample points and is known to provide an accuracy of O($\Delta^2$) where $\Delta$ is the radius of the sampling set. In this paper, we consider the situation where the set of sample...
Preprint
Full-text available
Nonconvex minimization algorithms often benefit from the use of second-order information as represented by the Hessian matrix. When the Hessian at a critical point possesses negative eigenvalues, the corresponding eigenvectors can be used to search for further improvement in the objective function value. Computing such eigenpairs can be computation...
Article
A new convex quadratically‐constrained quadratic programming (QCQP) model is proposed for modeling side‐slope volumes in the minimization of earthwork operations to compute the vertical alignment of a resource road while satisfying design and safety constraints. The new QCQP model is convex but nonlinear; it is compared to a state‐of‐the‐art mixed...
Article
Full-text available
We consider the problem of minimizing an objective function that is provided by an oracle. We assume that while the optimization problem seeks a real-valued solution, the oracle is capable of accepting complex-valued input and returning complex-valued output. We explore using complex-variables in order to approximate gradients and Hessians within a...
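The classical complex-step derivative is the simplest instance of such a complex-valued oracle; a sketch of the standard technique (not necessarily the paper's estimator):

```python
import numpy as np

def complex_step_gradient(f, x, h=1e-20):
    """Complex-step derivative: if the oracle accepts complex input and f is
    real-analytic, Im f(x + i*h*e_j) / h approximates df/dx_j without the
    subtractive cancellation that plagues finite differences."""
    x = np.asarray(x, dtype=complex)
    g = np.empty(x.size)
    for j in range(x.size):
        xp = x.copy()
        xp[j] += 1j * h                 # tiny purely imaginary perturbation
        g[j] = f(xp).imag / h
    return g

f = lambda x: np.exp(x[0]) * np.sin(x[1])
print(complex_step_gradient(f, [0.3, 1.2]))
# matches (exp(0.3)*sin(1.2), exp(0.3)*cos(1.2)) to machine precision
```

Because there is no subtraction of nearly equal values, the step h can be taken absurdly small with no loss of accuracy.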
Article
Full-text available
This paper presents a new algorithm to build feasible solutions to a MILP formulation of the vertical alignment problem in road design. This MILP involves a large number of special ordered sets of type 2 (SOS2) variables used to describe piecewise linear functions. The principle of the algorithm is to successively solve LPs adapted from the MILP by replaci...
Article
Full-text available
This manuscript defines a bi-objective optimization model to find road profiles that optimize the road construction cost and the vehicle operating costs, specifically the fuel consumption. The research implements and validates the formula for the fuel consumption cost. It further presents and examines a variety of well-known methods: three classic...
Article
Full-text available
Optical computed tomography (CT) is one of the leading modalities for imaging gel dosimeters. There exist many prototype designs, as well as some commercial optical CT scanners that have showcased the value that gel dosimeters can provide to improve 3D dose verification for radiation treatments. However, due to factors including image accuracy, sca...
Article
Full-text available
Purpose A system matrix can be built in order to account for the refractions in an optical computed tomography (CT) system. In order to utilize this system matrix, iterative methods are employed to solve the image reconstruction problem. The purpose of this study is to compare potential iterative algorithms to solve this image reconstruction proble...
Preprint
Full-text available
The properties of positive bases make them a useful tool in derivative-free optimization (DFO) and an interesting concept in mathematics. The notion of the \emph{cosine measure} helps to quantify the quality of a positive basis. It provides information on how well the vectors in the positive basis uniformly cover the space considered. The number of...
Preprint
Full-text available
Model-based methods are popular in derivative-free optimization (DFO). In most of them, a single model function is built to approximate the objective function. This is generally based on the assumption that the objective function is one blackbox. However, some real-life and theoretical problems show that the objective function may consist of severa...
Article
Full-text available
Changing weather patterns may impose increased risk to the creditworthiness of financial institutions in the agriculture sector. To reduce the credit risk caused by climate change, financial institutions need to update their agricultural lending portfolios to consider climate change scenarios. In this paper we introduce a framework to compute the o...
Preprint
Full-text available
Purpose: A system matrix can be built in order to account for the refractions in an optical computed tomography (CT) system. In order to utilize this system matrix, iterative methods are employed to solve the image reconstruction problem. The purpose of this study is to compare potential algorithms to solve this image reconstruction...
Preprint
Full-text available
This work investigates the asymptotic behaviour of the gradient approximation method called the generalized simplex gradient (GSG). This method has an error bound that at first glance seems to tend to infinity as the number of sample points increases, but with some careful construction, we show that this is not the case. For functions in finite dim...
Article
Full-text available
Using the Moore–Penrose pseudoinverse this work generalizes the gradient approximation technique called the centred simplex gradient to allow sample sets containing any number of points. This approximation technique is called the generalized centred simplex gradient. We develop error bounds and, under a full-rank condition, show that the error boun...
Article
Full-text available
Optical computed tomography (CT) is one of the leading modalities for imaging gel dosimeters for 3D radiation dosimetry. There exist multiple scanner designs that have showcased excellent 3D dose verification capabilities of optical CT gel dosimetry. However, due to multiple experimental and reconstruction based factors there is currently no single...
Article
Full-text available
Variational Analysis studies mathematical objects under small variations. With regards to optimization, these objects are typified by representations of first-order or second-order information (gradients, subgradients, Hessians, etc). On the other hand, Derivative-Free Optimization studies algorithms for continuous optimization that do not use firs...
Preprint
Full-text available
This work introduces the nested-set Hessian approximation, a second-order approximation method that can be used in any derivative-free optimization routine that requires such information. It is built on the foundation of the generalized simplex gradient and proved to have an error bound that is on the order of the maximal radius of the two sets use...
Article
Full-text available
We consider the question of numerically approximating the derivative of a smooth function using only function evaluations. In particular, we examine the regression gradient, the generalized simplex gradient and the generalized centered simplex gradient, three numerical techniques based on using function values at a collection of sample points to co...
Article
Full-text available
Originally developed in 1954, positive bases and positive spanning sets have been found to be a valuable concept in derivative-free optimization (DFO). The quality of a positive basis (or positive spanning set) can be quantified via the cosine measure and convergence properties of certain DFO algorithms are intimately linked to the value of this me...
Preprint
Full-text available
Using the Moore–Penrose pseudoinverse, this work generalizes the gradient approximation technique called centred simplex gradient to allow sample sets containing any number of points. This approximation technique is called the \emph{generalized centred simplex gradient}. We develop error bounds and, under a full-rank condition, show that the error...
Article
Inter-story isolation (ISI) is a passive control technique that involves the placement of seismic isolation devices between stories. Multiple isolation layers installed at different story levels can reduce the response of the isolation devices, while maintaining control of the primary building. A multi-objective optimization (MOO) study is conducte...
Preprint
Full-text available
Originally developed in 1954, positive bases and positive spanning sets have been found to be a valuable concept in derivative-free optimization (DFO). The quality of a positive basis (or positive spanning set) can be quantified via the {\em cosine measure} and convergence properties of certain DFO algorithms are intimately linked to the value of t...
Article
Full-text available
When building a road, it is critical to select a vertical alignment which ensures design and safety constraints. Finding such a vertical alignment is not necessarily a feasible problem, and the models describing it generally involve a large number of variables and constraints. This paper is dedicated to rapidly proving the feasibility or the infeas...
Preprint
Full-text available
In this paper, we examine the framework to estimate financial risk called conditional-value-at-risk (CVaR) and examine models to optimize portfolios by minimizing CVaR. We note that total risk can be a function of multiple risk factors combined in a linear or nonlinear forms. We demonstrate that, when using CVaR, several common nonlinear models can...
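A plain sample-based CVaR estimator illustrates the risk measure being minimized (a minimal discrete sketch under the usual conventions, not the paper's optimization framework):

```python
import numpy as np

def cvar(losses, alpha=0.95):
    """Sample conditional value-at-risk: VaR is the alpha-quantile of the
    loss distribution, and CVaR averages the losses at or beyond it."""
    losses = np.sort(np.asarray(losses, dtype=float))
    var = np.quantile(losses, alpha)        # value-at-risk at level alpha
    tail = losses[losses >= var]            # worst (1 - alpha) tail
    return tail.mean()

losses = np.array([1.0, 2.0, 3.0, 4.0, 100.0])
print(cvar(losses, alpha=0.8))   # → 100.0: the worst 20% of this tiny sample
```

Averaging over the tail rather than reporting a single quantile is what makes CVaR coherent, and is also what makes it amenable to the linearizations the paper discusses.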
Preprint
Full-text available
Portfolio optimization is the process of choosing the best investment decision across a set of financial instruments or assets. Investors seek to maximize their (expected) returns, but higher expected return usually means taking on more risk. So, investors are faced with a trade-off between risk and expected return. Most researchers have considered...
Preprint
Full-text available
Originally developed in 1954, positive bases and positive spanning sets have been found to be a valuable concept in derivative-free optimization (DFO). The quality of a positive basis (or positive spanning set) can be quantified via the cosine measure and convergence properties of certain DFO algorithms are intimately linked to the value of this me...
Preprint
Full-text available
This paper presents a new algorithm to build feasible solutions to a MILP formulation of the vertical alignment problem in road design. This MILP involves a large number of SOS2 variables used to describe piecewise linear functions. The principle of the algorithm is to successively solve LPs adapted from the MILP by replacing the SOS2 constraints...
Article
Full-text available
Multi-fidelity algorithms for solving the horizontal alignment problem in road design are considered. A multi-fidelity surrogate model is built and quantile regression is used to understand its accuracy at various fidelity levels. Two algorithms are compared: a generalized pattern search algorithm with adaptive precision control, and a trust-region...
Preprint
Prox-regularity is a generalization of convexity that includes all C2, lower-C2, strongly amenable and primal-lower-nice functions. The study of prox-regular functions provides insight on a broad spectrum of important functions. Parametrically prox-regular (para-prox-regular) functions are a further extension of this family, produced by adding a pa...
Preprint
The NC-proximal average is a parametrized function used to continuously transform one proper, lsc, prox-bounded function into another. Until now, it has been defined for two functions. The purpose of this article is to redefine it so that any finite number of functions may be used. The layout generally follows that of [11], extending those results...
Article
Full-text available
In Variational Analysis, VU-theory provides a set of tools that is helpful for understanding and exploiting the structure of nonsmooth functions. The theory takes advantage of the fact that at any point, the space can be separated into two orthogonal subspaces: one that describes the direction of nonsmoothness of the function, and the other on whic...
Preprint
Full-text available
Engineering design problems are often multi-objective in nature, which means trade-offs are required between conflicting objectives. In this study, we examine the multi-objective algorithms for the optimal design of reinforced concrete structures. We begin with a review of multi-objective optimization approaches in general and then present a more f...
Article
Engineering design problems are often multi-objective in nature, which means trade-offs are required between conflicting objectives. In this study, we examine the multi-objective algorithms for the optimal design of reinforced concrete structures. We begin with a review of multi-objective optimization approaches in general and then present a more f...
Article
Full-text available
Proximal gradient methods have been found to be highly effective for solving minimization problems with non-negative constraints or L1-regularization. Under suitable nondegeneracy conditions, it is known that these algorithms identify the optimal sparsity pattern for these types of problems in a finite number of iterations. However, it is not known...
Preprint
Full-text available
The $\mathcal{VU}$-algorithm is a superlinearly convergent method for minimizing nonsmooth, convex functions. At each iteration, the algorithm works with a certain $\mathcal{V}$-space and its orthogonal $\mathcal{U}$-space, such that the nonsmoothness of the objective function is concentrated on its projection onto the $\mathcal{V}$-space, and on the $\math...
Preprint
Full-text available
We consider the question of numerically approximating the derivative of a smooth function using only function evaluations. In particular, we examine the regression gradient, the generalized simplex gradient and the generalized centered simplex gradient, three numerical techniques based on using function values at a collection of sample points to co...
Article
Full-text available
We consider the challenge of numerically comparing optimization algorithms that employ random-restarts under the assumption that only limited test data is available. We develop a bootstrapping technique to estimate the incumbent solution of the optimization problem over time as a stochastic process. The asymptotic properties of the estimator are ex...
Preprint
Full-text available
In Variational Analysis, VU-theory provides a set of tools that is helpful for understanding and exploiting the structure of nonsmooth functions. The theory takes advantage of the fact that at any point, the space can be separated into two orthogonal subspaces, one that describes the direction of nonsmoothness of the function, and the other on whic...
Preprint
Full-text available
Simplex gradients, essentially the gradient of a linear approximation, are a popular tool in derivative-free optimization (DFO). In 2015, a product rule, a quotient rule and a sum rule for simplex gradients were introduced by Regis [14]. Unfortunately, those calculus rules only work under a restrictive set of assumptions. The purpose of this paper...
Article
Full-text available
The subdifferential of a function is a generalization for nonsmooth functions of the concept of gradient. It is frequently used in variational analysis, particularly in the context of nonsmooth optimization. The present work proposes algorithms to reconstruct a polyhedral subdifferential of a function from the computation of finitely many direction...
Article
Full-text available
Locating proximal points is a component of numerous minimization algorithms. This work focuses on developing a method to find the proximal point of a convex function at a point, given an inexact oracle. Our method assumes that exact function values are at hand, but exact subgradients are either not available or not useful. We use approximate subgra...
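For intuition, the proximal point has a closed form in simple cases; the sketch below computes it for the L1 norm (the soft-thresholding map), whereas the paper targets a general convex f known only through an inexact oracle:

```python
import numpy as np

def prox_l1(x, t):
    """Proximal point of f(y) = ||y||_1 at x with step t:
    argmin_y ||y||_1 + ||y - x||^2 / (2 t).
    The minimizer is the componentwise soft-thresholding of x."""
    x = np.asarray(x, dtype=float)
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

print(prox_l1([3.0, -0.5, 1.2], t=1.0))
# zeroes the small middle entry and shrinks the others toward 0 by t
```

When no closed form exists, algorithms like the one in the paper approximate this minimizer iteratively from (possibly inexact) oracle information.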
Chapter
Full-text available
Derivative-free optimization (DFO) is the mathematical study of the optimization algorithms that do not use derivatives. One branch of DFO focuses on model-based DFO methods, where an approximation of the objective function is used to guide the optimization algorithm. Historically, model-based DFO has often assumed that the objective function is sm...
Preprint
Proximal gradient methods have been found to be highly effective for solving minimization problems with non-negative constraints or L1-regularization. Under suitable nondegeneracy conditions, it is known that these algorithms identify the optimal sparsity pattern for these types of problems in a finite number of iterations. However, it is not known...
Chapter
For a full appreciation of the material in this book, it is assumed that the reader has followed courses in Multivariate Calculus and Linear Algebra. This chapter contains definitions, notation, and results that, although fairly common in mathematics literature, are not necessarily covered in standard multivariate calculus and linear algebra course...
Chapter
Section 1.4 enumerated some common features of target applications of BBO. A recurring difficulty in such applications originates from the fact that the evaluation of the objective and constraint functions is computationally expensive. In some situations, a surrogate optimization problem is available. That is, a problem that is considerably cheaper...
Chapter
While the line-search-based method of Chapter 10 is quick and easy to understand, it does not exploit the full power of a model function. In essence, the model-based descent algorithm for unconstrained optimization only uses gradient approximations to confirm descent directions. This can be seen in the convergence analysis of Chapter 10, where only...
Chapter
As mentioned in Chapter 6, the generalised pattern search (GPS) algorithm is an advancement on the CS algorithm first introduced in Chapter 3. The GPS algorithm follows the same basic logic as the CS algorithm, namely it iteratively searches a collection of points seeking improvement over the incumbent solution. In order to increase the flexibil...
Chapter
In 1965, John Nelder and Roger Mead published a short (6 pages) note on a new method for unconstrained minimisation. The resulting algorithm has become one of the most widely applied and researched optimization algorithms in the world. Originally, Nelder and Mead titled their method the “Simplex Method”, as the method hinged around using function e...
Chapter
Chapter 3 introduced a first practical DFO algorithm for unconstrained optimization, the coordinate search (CS) algorithm. While it was proven to converge to first order in some circumstances (see Theorem 3.4), it was also noted that the algorithm can fail on very simple nondifferentiable convex functions (see Example 3.3). The CS algorithm is a...
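The CS algorithm referred to above can be sketched in a few lines (a minimal unconstrained variant with opportunistic polling; the book's version carries additional mesh and parameter details omitted here):

```python
import numpy as np

def coordinate_search(f, x0, step=1.0, tol=1e-6, max_iter=10_000):
    """Basic coordinate search (CS): poll the incumbent along +/- each
    coordinate direction; move on improvement, halve the step on failure."""
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    for _ in range(max_iter):
        improved = False
        for i in range(x.size):
            for sign in (+1.0, -1.0):
                y = x.copy()
                y[i] += sign * step
                fy = f(y)
                if fy < fx:                 # accept any improving poll point
                    x, fx, improved = y, fy, True
        if not improved:
            step /= 2.0                     # refine the mesh
            if step < tol:
                break
    return x, fx

x, fx = coordinate_search(lambda z: (z[0] - 1)**2 + (z[1] + 2)**2, [5.0, 5.0])
print(x, fx)   # converges to (1, -2) with objective value 0
```

On this smooth quadratic CS converges cleanly; the failure mode noted in Example 3.3 arises when descent is only available between the coordinate directions.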
Chapter
Through this book, and in all areas of optimization, it should always be remembered that whenever derivatives are available, they should be used. Derivatives, or more generally gradients, provide powerful information about a function. In unconstrained optimization, descent directions and first order optimality hinge on these basic objects of multiv...
Chapter
Chapter 3 proposed the coordinate search (CS) algorithm for unconstrained optimization. The algorithm worked based on local exploration around the incumbent solution in the positive and negative orthogonal coordinate directions. In Chapter 7, this algorithm was expanded to allow a broader use of search directions, resulting in the generalised patte...
Chapter
In this chapter, we explore some naive approaches to solving BBO problems, and explain why they are not acceptable. This will lead to a better understanding of why DFO is preferred for real applications and provide some foundational material that is used throughout this book.
Chapter
In this introductory chapter, we present a high-level description of optimization, blackbox optimization, and derivative-free optimization. We introduce some basic optimization notation used throughout this book, and some of the standard classifications of optimization problems. We end with three examples where blackbox optimization problems have a...
Chapter
Throughout this book, we have considered the general problem of minimising a multivariate objective function f over a constraint set \(\Omega \subseteq \mathbb{R}^{n}\). Let us now be more specific about the nature of the variables and of the constraint set.
Chapter
Optimization algorithms for blackbox functions can be broadly split into two categories: heuristic and non-heuristic. A heuristic is any approach that, while supported by some argument of why it should succeed, does not include a guarantee of success. In the framework of blackbox optimization, we take this statement to mean that the algorithm does...
Chapter
Model-based methods in DFO proceed from the idea that if it is possible to build a “good” model of the true objective function, then information from the model can be used to guide the optimization. In Chapter 9, we studied several methods for constructing model functions using only objective function evaluations. We also defined fully linear, as a...
Chapter
There are situations in which the optimization problem is driven by more than one objective function. Typically, these objectives are conflicting; for example, one may wish to maximise the solidity of a structure, while minimising its weight. In such a situation, one desires to take into account the relative tradeoffs between pairs of solutions.
Article
Full-text available
Benchmarking of optimization algorithms is a complicated task that involves many subtle considerations to yield a fair and unbiased evaluation. In this paper, we systematically review the benchmarking process of optimization algorithms, and discuss the challenges of fair comparison. We provide suggestions for each step of the comparison process and...
Article
Full-text available
Derivative-free optimization (DFO) is the mathematical study of the optimization algorithms that do not use derivatives. One branch of DFO focuses on model-based DFO methods, where an approximation of the objective function is used to guide the optimization algorithm. Proving convergence of such methods often applies an assumption that the approxim...
Preprint
Computing explicitly the $\epsilon$-subdifferential of a proper function amounts to computing the level set of a convex function, namely the conjugate minus a linear function. The resulting theoretical algorithm is applied to the class of (convex univariate) piecewise linear-quadratic functions for which existing numerical libraries allow practi...
Preprint
Comparing, or benchmarking, optimization algorithms is a complicated task that involves many subtle considerations to yield a fair and unbiased evaluation. In this paper, we systematically review the benchmarking process of optimization algorithms, and discuss the challenges of fair comparison. We provide suggestions for each step of the compari...
Article
Full-text available
The VU-algorithm is a superlinearly convergent method for minimizing nonsmooth convex functions. At each iteration, the algorithm separates R^n into the V-space and the orthogonal U-space, such that the nonsmoothness of the objective function is concentrated on its projection onto the V-space, and on the U-space the projection is smooth. This struc...
Article
Full-text available
Computing explicitly the ε-subdifferential of a proper function amounts to computing the level set of a convex function, namely the conjugate minus a linear function. The resulting theoretical algorithm is applied to the class of (convex univariate) piecewise linear–quadratic functions for which existing numerical libraries allow...
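For reference, the ε-subdifferential mentioned above has the standard definition: for a proper convex function f on R^n, a point x̄ with f(x̄) finite, and ε ≥ 0,

```latex
\partial_{\varepsilon} f(\bar{x})
  = \{\, s \in \mathbb{R}^n :
      f(x) \ge f(\bar{x}) + \langle s,\, x - \bar{x} \rangle - \varepsilon
      \ \text{ for all } x \in \mathbb{R}^n \,\},
```

and by the Fenchel–Young inequality this is exactly the level set \(\{\, s : f^*(s) - \langle s, \bar{x} \rangle \le \varepsilon - f(\bar{x}) \,\}\) of the conjugate minus a linear function, as described in the abstract.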
Article
Full-text available
The vertical alignment optimization problem for road design aims to generate a vertical alignment of a new road with a minimum cost, while satisfying safety and design constraints. We present a new model called multi-haul quasi network flow (MH-QNF) for vertical alignment optimization that improves the accuracy and reliability of previous mixed int...
Preprint
The vertical alignment optimization problem for road design aims to generate a vertical alignment of a new road with a minimum cost, while satisfying safety and design constraints. We present a new model called multi-haul quasi network flow (MH-QNF) for vertical alignment optimization that improves the accuracy and reliability of previous mixed int...
Book
This book is designed as a textbook, suitable for self-learning or for teaching an upper-year university course on derivative-free and blackbox optimization. The book is split into 5 parts and is designed to be modular; any individual part depends only on the material in Part I. Part I of the book discusses what is meant by Derivative-Free and Bla...
Article
Full-text available
Positive bases, which play a key role in understanding derivative-free optimization methods that use a direct search framework, are positive spanning sets that are positively linearly independent. The cardinality of a positive basis in R^n has been established to be between n+1 and 2n (with both extremes existing). The lower bound is immedia...
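The two cardinality extremes mentioned above can be written down directly; a minimal sketch in R^3 follows, where the numerical spot-check uses the equivalent characterization that a positive spanning set contains an ascent direction for every nonzero vector:

```python
import numpy as np

n = 3
# Minimal positive basis (n + 1 vectors): the coordinate directions
# plus the negative of their sum.
minimal = np.vstack([np.eye(n), -np.ones((1, n))])
# Maximal positive basis (2n vectors): all signed coordinate directions.
maximal = np.vstack([np.eye(n), -np.eye(n)])

# Spot-check positive spanning: for every nonzero v, some direction d
# in the set satisfies d . v > 0.
rng = np.random.default_rng(0)
for v in rng.standard_normal((100, n)):
    assert (minimal @ v).max() > 0
    assert (maximal @ v).max() > 0
```

In a direct-search method, this property guarantees that whenever the current point is not stationary, at least one poll direction makes an acute angle with the negative gradient.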
Article
Full-text available
Optimization of three-dimensional road alignments is a nonlinear non-convex optimization problem. The development of models that fully optimize a three-dimensional road alignment problem is challenging due to numerous factors involved and complexities in the geometric specification of the alignment. In this study, we developed a novel bi-objective...
Article
Full-text available
Many theoretical and experimental studies have used heuristic methods to investigate the dynamic behaviour of the passive coupling of adjacent structures. However, few papers have used optimization techniques with guaranteed convergence in order to increase the efficiency of the passive coupling of adjacent structures. In this paper, the combined p...
Preprint
Optimization of three-dimensional road alignments is a nonlinear non-convex optimization problem. The development of models that fully optimize a three-dimensional road alignment problem is challenging due to numerous factors involved and complexities in the geometric specification of the alignment. In this study, we developed a novel bi-objective...
Article
Full-text available
Constrained blackbox optimization is a difficult problem, with most approaches coming from the mathematical programming literature. The statistical literature is sparse, especially in addressing problems with nontrivial constraints. This situation is unfortunate because statistical methods have many attractive properties: global scope, handling noi...
Article
Full-text available
Finding an optimal alignment connecting two end-points in a specified corridor is a complex problem that requires solving three interrelated sub-problems, namely the horizontal alignment, vertical alignment and earthwork optimization problems. In this research, we developed a novel bi-level optimization model combining those three problems. In the...
