
Hans-Georg Beyer
Prof. Dr.
Fachhochschule Vorarlberg · Research Center Business Informatics
About
206 Publications · 38,874 Reads
10,648 Citations
Additional affiliations: September 2004 - present
Publications (206)
A first order progress rate is derived for the intermediate multi-recombinative Evolution Strategy (μ/μI, λ)-ES on the highly multimodal Rastrigin test function. The progress is derived within a linearized model applying the method of so-called noisy order statistics. To this end, the mutation-induced variance of...
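For reference, the Rastrigin test function in N dimensions underlying this analysis is commonly written as follows (A = 10 is the usual choice; the paper's exact parametrization may differ):
$$ f(\mathbf{y}) = A\,N + \sum_{i=1}^{N} \left( y_i^2 - A \cos(2\pi y_i) \right). $$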
An evolution strategy design is presented that allows for an evolution on general quadratic manifolds. That is, it covers elliptic, parabolic, and hyperbolic equality constraints. The peculiarity of the presented algorithm design is that it is an interior point method. It evaluates the objective function only for feasible search parameter vectors a...
This work concerns the design of matrix adaptation evolution strategies for black-box optimization under nonlinear equality constraints. First, constraints in form of elliptical manifolds are considered. For those constraints, an algorithm is proposed that evolves itself on that manifold while optimizing the objective function. The specialty about...
This paper concerns the theoretical analysis of a multi-recombinative meta-ES with repair by projection applied to a conically constrained problem. Using theoretical results for the mean value dynamics and steady state considerations of the inner ES, approximate closed-form expressions for the mean value dynamics and the steady state behavior of th...
This paper presents the multi-recombinative constraint active matrix adaptation evolution strategy (Constraint Active-MA-ES). It extends the MA-ES recently introduced by Beyer and Sendhoff in order to handle constrained black-box optimization problems. The active covariance matrix adaptation approach for constraint handling similar to the method pr...
A theoretical performance analysis of the (μ/μI,λ)-σ-Self-Adaptation Evolution Strategy (σSA-ES) is presented considering a conically constrained problem. Infeasible offspring are repaired using projection onto the boundary of the feasibility region. Closed-form approximations are used for the one-generation progress of the evolution strategy. Appr...
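As a rough illustration of the algorithm family analyzed in these papers, the following is a minimal sketch of one generation of a (μ/μI, λ)-σSA-ES with log-normal mutation-strength self-adaptation; the repair hook and the default learning parameter are generic placeholders, not the paper's conical projection operator or its exact settings.

```python
import numpy as np

def sigma_sa_es_generation(y, sigma, f, repair, mu=3, lam=10, tau=None):
    """One generation of a (mu/mu_I, lambda)-sigma-SA-ES (illustrative sketch, minimization).

    y, sigma : parental centroid and mutation strength
    f        : objective function
    repair   : maps an infeasible search point back to the feasible region
    """
    n = len(y)
    tau = tau if tau is not None else 1.0 / np.sqrt(2.0 * n)      # a common learning parameter choice
    offspring = []
    for _ in range(lam):
        s = sigma * np.exp(tau * np.random.randn())               # log-normal sigma self-adaptation
        x = repair(y + s * np.random.randn(n))                    # isotropic mutation, then repair
        offspring.append((f(x), x, s))
    offspring.sort(key=lambda t: t[0])                            # rank offspring by fitness
    best = offspring[:mu]                                         # truncation selection: mu best
    y_new = np.mean([x for _, x, _ in best], axis=0)              # intermediate recombination of points
    sigma_new = np.mean([s for _, _, s in best])                  # intermediate recombination of sigmas
    return y_new, sigma_new
```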
The paper presents the theoretical performance analysis of a hierarchical Evolution Strategy (meta-ES) variant for mutation strength control on a conically constrained problem. Infeasible offspring are repaired by projection onto the boundary of the feasibility region. Closed-form approximations are used for the one-generation progress of the lower...
The Rotated Klee-Minty Problem represents an advancement of the well-known linearly constrained Klee-Minty Problem that was introduced to illustrate the worst case running time of the Simplex algorithm. Keeping the linearity as well as the complexity of the original Klee-Minty Problem, the Rotated Klee-Minty Problem remedies potential biases with r...
Theoretical analyses of evolution strategies are indispensable for gaining a deep understanding of their inner workings. For constrained problems, rather simple problems are of interest in the current research. This work presents a theoretical analysis of a multi-recombinative evolution strategy with cumulative step size adaptation applied to a con...
A theoretical performance analysis of the $(\mu/\mu_I,\lambda)$-$\sigma$-Self-Adaptation Evolution Strategy ($\sigma$SA-ES) is presented considering a conically constrained problem. Infeasible offspring are repaired using projection onto the boundary of the feasibility region. Closed-form approximations are used for the one-generation progress of t...
A thorough theoretical analysis of evolution strategies with constraint handling is important for the understanding of the inner workings of evolution strategies applied to constrained problems. Simple problems are of interest for the first analyses. To this end, the behavior of the (1,λ)-σ-Self-Adaptation Evolution Strategy applied to a conically...
The development, assessment, and comparison of randomized search algorithms heavily rely on benchmarking. Regarding the domain of constrained optimization, the small number of currently available benchmark environments bears no relation to the vast number of distinct problem features. The present paper advances a proposal of a scalable linear const...
This paper applies an evolution strategy (ES) that evolves rays to single-objective real-valued constrained optimization problems. The algorithm is called Ray-ES. It was proposed as an ad hoc optimization approach for dealing with the unconstrained real-parameter optimization problem class called HappyCat. To our knowledge, the application of the R...
Benchmarking plays an important role in the development of novel search algorithms as well as for the assessment and comparison of contemporary algorithmic ideas. This paper presents common principles that need to be taken into account when considering benchmarking problems for constrained optimization. Current benchmark environments for testing Ev...
By combination of successful constraint handling techniques known within the context of Differential Evolution with the recently suggested Matrix Adaptation Evolution Strategy (MA-ES), a new Evolution Strategy for constrained optimization is presented. The novel MA-ES variant is applied to the benchmark problems specified for the CEC 2018 competi...
This paper addresses the development of a covariance matrix self-adaptation evolution strategy (CMSA-ES) for solving optimization problems with linear constraints. The proposed algorithm is referred to as Linear Constraint CMSA-ES (lcCMSA-ES). It uses a specially built mutation operator together with repair by projection to satisfy the constraints....
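The elementary building block behind repair by projection can be illustrated for a single linear inequality constraint a·x ≤ b; the lcCMSA-ES itself projects onto the intersection of all linear constraints, so this is only a hedged sketch of the basic idea.

```python
import numpy as np

def project_onto_halfspace(x, a, b):
    """Euclidean projection of x onto the halfspace {z : a.z <= b}."""
    x = np.asarray(x, dtype=float)
    a = np.asarray(a, dtype=float)
    violation = a @ x - b
    if violation <= 0.0:
        return x                                  # already feasible, nothing to repair
    return x - (violation / (a @ a)) * a          # move along a to the constraint boundary
```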
The development, assessment, and comparison of randomized search algorithms heavily rely on benchmarking. Regarding the domain of constrained optimization, the number of currently available benchmark environments bears no relation to the number of distinct problem features. The present paper advances a proposal of a scalable linear constrained opti...
This paper addresses the analysis of covariance matrix self-adaptive Evolution Strategies (CMSA-ES) on a subclass of quadratic functions subject to additive Gaussian noise: the noisy ellipsoid model. To this end, it is demonstrated that the dynamical systems approach from the context of isotropic mutations can be transferred to ES that also control...
Regarding the noisy ellipsoid model with additive Gaussian noise, the population control covariance matrix self-adaptation Evolution Strategy (pcCMSA-ES) by Hellwig and Beyer was empirically observed to exhibit a convergence rate (CR) close to the theoretical lower bound of -1 for all comparison-based direct search algorithms. The present paper pr...
The Covariance Matrix Adaptation Evolution Strategy (CMA-ES) is a popular method to deal with nonconvex and/or stochastic optimization problems when the gradient information is not available. Being based on the CMA-ES, the recently proposed Matrix Adaptation Evolution Strategy (MA-ES) provides a rather surprising result that the covariance matrix a...
The standard Covariance Matrix Adaptation Evolution Strategy (CMA-ES) comprises two evolution paths, one for the learning of the mutation strength and one for the rank-1 update of the covariance matrix. In this paper it is shown that one can approximately transform this algorithm in such a manner that one of the evolution paths and the covariance...
A steady state analysis of the optimization quality of a classical self-adaptive Evolution Strategy (ES) on a class of robust optimization problems is presented. A novel technique for calculating progress rates for non-quadratic noisy fitness landscapes is presented. This technique yields asymptotically exact results in the infinite population size...
According to a theorem by Astete-Morales, Cauwet, and Teytaud, "simple Evolution Strategies (ES)" that optimize quadratic functions disturbed by additive Gaussian noise of constant variance can only reach a simple regret log-log convergence slope ≥ −1/2 (lower bound). Our paper presents a population size controlled ES that is able to perform better...
According to a theorem by Astete-Morales, Cauwet, and Teytaud, “simple Evolution Strategies (ES)” that optimize quadratic functions disturbed by additive Gaussian noise of constant variance can only reach a simple regret log-log convergence slope \(\ge -1/2\) (lower bound). In this paper a population size controlled ES is presented that is able to...
The ability of a hierarchically organized evolution strategy (meta evolution strategy) with isolation periods of length one to optimally control its mutation strength is investigated on convex-quadratic functions (referred to as ellipsoid model). Applying the dynamical systems analysis approach a first step towards the analysis of the meta evolutio...
This paper analyzes the multi-recombinant self-adaptive evolution strategy (ES), denoted as (μ/μI, λ)-σSA-ES on the convex-quadratic function class under the influence of noise, which is referred to as noisy ellipsoid model. Asymptotically exact progress rate and self-adaptation response measures are derived (i.e., for N → ∞, N - search space dimens...
The behavior of the (μ/μI, λ)-Evolution Strategy (ES) with cumulative step-size adaptation (CSA) on the ellipsoid model is investigated using dynamical systems analysis. At first a nonlinear system of difference equations is derived that describes the mean-value evolution of the ES. This system is successively simplified to finally allow f...
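For orientation, cumulative step-size adaptation for intermediate recombination is typically written with a search path s and the σ-update below, where ⟨z⟩ denotes the average of the μ selected offsprings' standardized mutation vectors; the constants c_σ and d_σ and the exact normalization vary between publications, so this is only the generic form:
\[
\mathbf{s}^{(g+1)} = (1 - c_\sigma)\,\mathbf{s}^{(g)} + \sqrt{\mu\, c_\sigma (2 - c_\sigma)}\;\langle \mathbf{z} \rangle^{(g)},
\qquad
\sigma^{(g+1)} = \sigma^{(g)} \exp\!\left( \frac{c_\sigma}{d_\sigma}\left( \frac{\|\mathbf{s}^{(g+1)}\|}{E\|\mathcal{N}(\mathbf{0},\mathbf{I})\|} - 1 \right) \right).
\]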
The optimization behavior of the self-adaptation (SA) evolution strategy (ES) with intermediate multirecombination [(μ/μI, λ)-σSA-ES] using isotropic mutations is investigated on convex-quadratic functions (referred to as ellipsoid model). An asymptotically exact quadratic progress rate formula is deriv...
Scenario-based optimization is a problem class often occurring in finance, planning and control. While the standard approach is usually based on linear stochastic programming, this paper develops an Evolution Strategy (ES) that can be used to treat nonlinear planning problems arising from Value at Risk (VaR)-constraints and not necessarily proporti...
The convergence behaviors of so-called natural evolution strategies (NES) and of the information-geometric optimization (IGO) approach are considered. After a review of the NES/IGO ideas, which are based on information geometry, the implications of this philosophy w.r.t. optimization dynamics are investigated considering the optimization performan...
In this paper, the parameters of a multiconductor transmission line (MTL) are optimized using the constraint covariance matrix self-adaptation evolution strategy (cCMSA-ES). The cCMSA-ES performance scaling behavior is experimentally investigated. Practically relevant optimization results are reported for MTL configurations with 3, 4,⋯, 9 conductor...
This paper investigates strategy parameter control by Meta-ES using the noisy sphere model. The fitness noise considered is normally distributed with constant noise variance. An asymptotical analysis concerning the mutation strength and the population size is presented. It allows for the prediction of the Meta-ES dynamics. An expression describing...
This paper describes the algorithm's engineering of a covariance matrix self-adaptation evolution strategy (CMSA-ES) for solving a mixed linear/nonlinear constrained optimization problem arising in portfolio optimization. While the feasible solution space is defined by the (probabilistic) simplex, the nonlinearity comes in by a cardinality constrai...
In the above-named article [ibid., vol 16, no 4, pp. 578-596, Aug. 2012] an error occurred during the print production process that resulted in the incorrect display of Greek characters within many of the figures in the printed publication. However, the electronic PDF version available on IEEE Xplore was not affected and all of the article's figures...
A new class of simple and scalable test functions for unconstrained real-parameter optimization will be proposed. Even though these functions have only one minimizer, they yet appear difficult to be optimized using standard state-of-the-art EAs such as CMA-ES, PSO, and DE. The test functions share properties observed when evolving at the edge of fe...
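A sketch of the HappyCat function as it is commonly stated in the literature (α = 1/8 being the standard choice; the exact constants should be checked against the original paper):

```python
import numpy as np

def happycat(x, alpha=0.125):
    """HappyCat test function (common formulation; global minimum f = 0 at x = (-1, ..., -1))."""
    x = np.asarray(x, dtype=float)
    n = x.size
    r2 = x @ x                                                   # squared Euclidean norm
    return ((r2 - n) ** 2) ** alpha + (0.5 * r2 + x.sum()) / n + 0.5
```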
This paper investigates mutation strength control using Meta-ES on the sharp ridge. The asymptotical analysis presented allows for the prediction of the dynamics in ridge as well as in radial direction. Being based on this analysis the problem of the choice of population size λ and isolation parameter γ will be tackled. Remarkably, the qua...
To theoretically compare the behavior of different algorithms, compatible performance measures are necessary. Thus in the first part, an analysis approach, developed for evolution strategies, was applied to simultaneous perturbation stochastic approximation on the noisy sphere model. A considerable advantage of this approach is that convergence res...
This chapter considers local progress and the dynamical systems approach. The approach can be used for a quantitative analysis of the behavior of evolutionary algorithms with respect to the question of convergence and the working mechanism of these algorithms. Results obtained so far for evolution strategies on various fitness functions are describ...
This paper presents a performance comparison of 4 direct search strategies in continuous search spaces using the noisy sphere as test function. While the results of the Evolution Strategy (ES), Evolutionary Gradient Search (EGS), Simultaneous Perturbation Stochastic Approximation (SPSA) considered are already known from literature, Implicit Filteri...
This paper studies the performance of multi-recombinative evolution strategies using isotropically distributed mutations with cumulative step length adaptation when applied to optimising cigar functions. Cigar functions are convex-quadratic objective functions that are characterised by the presence of only two distinct eigenvalues of their Hessian,...
This paper describes the implementation and the results for CMA-EGS on the BBOB 2010 testbed. The CMA-EGS is a hybrid strategy which combines elements from gradient search and evolutionary algorithms. The paper describes the algorithm used and the experimental setup. The strategy is able to solve 17 of 30 functions in at least one dimensionality.
This paper describes the implementation and the results for CMA-EGS on the BBOB 2010 noiseless function testbed. The CMA-EGS is a hybrid strategy which combines elements from gradient search and evolutionary algorithms. The paper describes the algorithm used and the experimental setup. The strategy is able to solve 11 of 24 test functions for at le...
This paper presents the result for Simultaneous Perturbation Stochastic Approximation (SPSA) on the BBOB 2010 noisy testbed. SPSA is a stochastic gradient approximation strategy which uses random directions for the gradient estimate. The paper describes the steps performed by the strategy and the experimental setup. The chosen setup represents a ra...
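The core of SPSA is a two-evaluation gradient estimate along a random ±1 direction; the sketch below shows only this estimate and omits the gain sequences and the full iteration schedule of the BBOB setup.

```python
import numpy as np

def spsa_gradient(f, x, c=0.1):
    """Simultaneous perturbation gradient estimate from two function evaluations.

    All coordinates are perturbed at once by a random +-1 (Rademacher) vector;
    the same two evaluations yield an estimate of every partial derivative.
    """
    x = np.asarray(x, dtype=float)
    delta = np.random.choice([-1.0, 1.0], size=x.size)     # random perturbation direction
    f_plus = f(x + c * delta)
    f_minus = f(x - c * delta)
    return (f_plus - f_minus) / (2.0 * c * delta)           # componentwise division by delta_i
```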
In this paper, first results on the analysis of self-adaptive evolution strategies (ES) with intermediate multirecombination on the elliptic model are presented. Equations describing the ES behavior on the ellipsoid will be derived using a deterministic approach and experimentally verified. A relationship between newly obtained formulae for the ell...
This paper presents the result for Simultaneous Perturbation Stochastic Approximation (SPSA) on the BBOB 2010 noiseless testbed. SPSA is a stochastic gradient approximation strategy which uses random directions for the gradient estimate. The paper describes the steps performed by the strategy and the experimental setup. The chosen setup represents...
This paper investigates the behavior of (μ/μI, λ)-σSA-ES on a class of positive definite quadratic forms. After introducing the fitness environment and the strategy, the self-adaptation mechanism is analyzed with the help of the self-adaptation response function. Afterward, the steady state of the strategy is analyzed. The dynamical equations for...
Cigar functions are convex quadratic functions that are characterised by the presence of only two distinct eigenvalues of their Hessian, the smaller one of which occurs with multiplicity one. Their ridge-like topology makes them a useful test case for optimisation strategies. This paper extends previous work on modelling the behaviour of evolution...
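A cigar function of this kind can be written, e.g., as
\[
f(\mathbf{y}) = y_1^2 + \xi \sum_{i=2}^{N} y_i^2, \qquad \xi \gg 1,
\]
so that the Hessian has the small eigenvalue 2 with multiplicity one and the large eigenvalue 2ξ with multiplicity N-1; the axis orientation and the value of ξ used in the cited analyses may differ.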
This paper introduces simple control rules for the mutation strength and the parental population size using the Meta-ES approach. An in-depth analysis is presented on the mutation strength control using the sphere model. A heuristic formula for the outer mutation parameter will be proposed based on the theoretical analysis. Finally, a new evolution...
This work is concerned with a weighted recombination method for Evolution Strategies (ES) on a class of positive definite quadratic forms (PDQF). In particular, the λopt-ES and the λopt-CSA-ES will be analyzed. A characteristic of both strategies is the use of weighted recombination of all offspring within an iteration step. After obtaining equatio...
This paper presents an analysis of the performance of the (μ/μI, λ)-ES with isotropic mutations and cumulative step length adaptation on the noisy parabolic ridge. Several forms of dependency of the noise strength on the distance from the ridge axis are considered. Closed form expressions are derived that describe the mutation strength and the progres...
The covariance matrix adaptation evolution strategy (CMA-ES) rates among the most successful evolutionary algorithms for continuous parameter optimization. Nevertheless, it is plagued with some drawbacks like the complexity of the adaptation process and the reliance on a number of sophisticatedly constructed strategy parameter formulae for whic...
This paper presents a performance analysis of the recently proposed σ-self-adaptive weighted recombination evolution strategy (ES) with scaled weights. The steady state behavior of this ES is investigated for the non-noisy and noisy case, and formulas for the optimal choice of the learning parameter are derived allowing the strategy to reach maxima...
This paper proposes the σ-self-adaptive weighted multirecombination evolution strategy (ES) and presents a performance analysis of this newly engineered ES. The steady state behavior of this strategy is investigated on the sphere model and a formula for the optimal choice of the learning parameter is derived allowing the ES to reach maximal perform...
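Schematically, weighted multirecombination updates the parental state by a weighted sum of the rank-ordered mutation vectors; with z_{k;λ} the mutation vector of the k-th best of the λ offspring and w_k the recombination weights, the generic update reads (the cited ES additionally scales the weights and self-adapts σ):
\[
\mathbf{y}^{(g+1)} = \mathbf{y}^{(g)} + \sigma^{(g)} \sum_{k=1}^{\lambda} w_k\, \mathbf{z}_{k;\lambda}.
\]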
This paper considers self-adaptive (μ/μI, λ)-evolution strategies on the noisy sharp ridge. The evolution strategy (ES) is treated as a dynamical system using the so-called evolution equations to model the ES's behavior. The approach requires the determination of the one-generational expected changes of the state variables - the progress mea...
This paper analyzes the behavior of the (μ/μI, λ)-ES on a class of noisy positive definite quadratic forms (PDQFs). First the equations for the normalized progress rates are derived and then analyzed for constant normalized noise strength and constant (non-normalized) noise strength. Since in the latter case the strategy is not abl...
This paper reviews the state-of-the-art in robust design optimization – the search for designs and solutions which are immune with respect to production tolerances, parameter drifts during operation time, model sensitivities and others. Starting with a short glimpse of Taguchi’s robust design methodology, a detailed survey of approaches to robust op...
In this paper, we empirically analyze the convergence behavior of evolutionary algorithms (evolution strategies - ES and genetic algorithms - GA) for two noisy optimization problems which belong to the class of functions with noise induced multi-modality (FNIMs). Although both functions are qualitatively very similar, the ES is only able to converge...
In this chapter, we will give an overview over self-adaptive methods in evolutionary algorithms. Self-adaptation in its purest meaning is a state-of-the-art method to adjust the setting of control parameters. It is called self-adaptive because the algorithm controls the setting of these parameters itself - embedding them into an individual's genome...
In this paper, the behavior of intermediate (μ/μI, λ)-ES with self-adaptation is considered for two classes of ridge functions: the sharp and the parabolic ridge. Using a step-by-step approach to describe the system’s dynamics, we will investigate the underlying causes for the different behaviors of the ES on these function types and the effects o...
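The two ridge classes referred to here are commonly parametrized as
\[
F(\mathbf{y}) = y_1 - d \left( \sum_{i=2}^{N} y_i^2 \right)^{\alpha/2},
\qquad \alpha = 1 \;(\text{sharp ridge}), \quad \alpha = 2 \;(\text{parabolic ridge}),
\]
where sign and scaling conventions (maximization vs. minimization, the constant d) differ between papers.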
The canonical versions of the ES are denoted by \[ (\mu/\rho, \lambda)\mbox{-ES} \quad \mbox{and} \quad (\mu/\rho + \lambda)\mbox{-ES}, \] respectively. Here \(\mu\) denotes the number of parents, \(\rho \leq \mu\) the mixing number (i.e., the number of parents involved in the procreation of an offspring), and \(\lambda\) the number of offspring. T...
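A minimal sketch of the selection difference encoded by the comma and plus notation (individuals are taken to be (fitness, vector) pairs, minimization assumed):

```python
def select(parents, offspring, mu, comma=True):
    """Truncation selection distinguishing comma from plus strategies.

    comma=True  -> (mu/rho, lambda)-ES: the mu best are taken from the offspring only.
    comma=False -> (mu/rho + lambda)-ES: parents compete with their offspring.
    """
    pool = list(offspring) if comma else list(parents) + list(offspring)
    return sorted(pool, key=lambda ind: ind[0])[:mu]
```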
This paper investigates the self-adaptation behavior of (1, λ)-evolution strategies (ES) on the noisy sphere model. To this end, the stochastic system dynamics is approximated on the level of the mean value dynamics. Being based on this “microscopic” analysis, the steady state behavior of the ES for the scaled noise scenario and the const...
This paper proposes and analyzes a class of test functions for evolutionary robust optimization, the "functions with noise-induced multimodality" (FNIMs). After a motivational introduction gleaned from a real-world optimization problem, the robust optimizer properties of this test class are investigated with respect to different robustness measures...
Most studies concerned with the effects of noise on the performance of optimization strategies, in general, and on evolutionary approaches, in particular, have assumed a Gaussian noise model. However, practical optimization strategies frequently face situations where the noise is not Gaussian. Noise distributions may be skew or biased, and outliers...
The paper presents the asymptotical analysis of a technique for improving the convergence of evolution strategies (ES) on noisy fitness data. This technique, which may be called “Mutate large, but inherit small”, is discussed in light of the EPP (evolutionary progress principle). The derivation of the progress rate formula is sketched, its prediction...
The probabilities for generating improved solutions under two forms of selection under Gaussian mutation are studied. The results indicate that, under some simplifying assumptions, there can be advantage to retaining offspring that are of lesser value than the parent that generates them. The limitations of the analysis are identified, as well as di...
Evolutionary algorithms are frequently applied to dynamic optimization problems in which the objective varies with time. It is desirable to gain an improved understanding of the influence of different genetic operators and of the parameters of a strategy on its tracking performance. An approach that has proven useful in the past is to mathematicall...
This paper presents first results of an analysis of the σ-self-adaptation mechanism on the sharp ridge for non-recombinative (1, λ) evolution strategies (ES). To analyze the ES's evolution, we consider the so-called evolution equations which describe the one-generation change. Neglecting stochastic perturbations and considering only the mean value...
In this paper, we propose two evolutionary strategies for the optimization of problems with actuator noise as encountered in robust optimization, where the design or objective parameters are subject to noise: the ROSAES and the ROCSAES. Both algorithms use a control rule for increasing the population size when the residual error to the optimizer st...
This paper presents first results on the analysis of self-adaptive (μ/μI, λ)-evolution strategies (ES). Applying a deterministic approach to model the evolution of the ES, equations describing the stationary state behavior of the normalized mutation strength and of the progress rate are derived. The analysis provides a deeper insight as t...
Differential-geometric methods are applied to derive steady state conditions for the (μ/μI, λ)-ES on the general quadratic test function disturbed by fitness noise of constant strength. A new approach for estimating the expected final fitness deviation observed under such conditions is presented. The theoretical results obtained are compared with...
Noise is a common problem encountered in real-world optimization. Although it is folklore that evolution strategies perform well in the presence of noise, even their performance is degraded. One effect on which we will focus in this paper is the reaching of a steady state that deviates from the actual optimal solution. The quality gain is a local p...
Quality evaluations in optimization processes are frequently noisy. In particular evolutionary algorithms have been shown to cope with such stochastic variations better than other optimization algorithms. So far mostly additive noise models have been assumed for the analysis. However, we will argue in this paper that this restriction must be relaxe...
In optimization tasks that deal with real-world applications noise is very common, leading to degradation of the performance of Evolution Strategies. We will consider the quality gain of a (1,λ)-ES under noisy fitness evaluations for arbitrary fitness functions. The equation developed will be applied to several test functions to check its predictiv...
In this chapter, we will analyse the influence of noise on the search behaviour of evolutionary algorithms. We will introduce different classes of functions which go beyond the simple additive noise model. The first function demonstrates a trade-off between an expectation and a variance based measure for the evaluation of the quality in the context...