
An Improved Diversity Mechanism for Solving

Constrained Optimization Problems using a

Multimembered Evolution Strategy

Efrén Mezura-Montes and Carlos A. Coello Coello

CINVESTAV-IPN

Evolutionary Computation Group (EVOCINV)

Electrical Engineering Department

Computer Science Section

Av. IPN No. 2508, Col. San Pedro Zacatenco

México D.F. 07300, MÉXICO

emezura@computacion.cs.cinvestav.mx

ccoello@cs.cinvestav.mx

Abstract. This paper presents an improved version of a simple evolution strategy

(SES) to solve global nonlinear optimization problems. Like its previous version, the approach does not require the use of a penalty function, it does not require the

deﬁnition by the user of any extra parameter (besides those used with an evolu-

tion strategy), and it uses some simple selection criteria to guide the process to the

feasible region of the search space. Unlike its predecessor, this new version uses

a multimembered Evolution Strategy (ES) and an improved diversity mechanism

based on allowing infeasible solutions close to the feasible region to remain in the

population. This new version was validated using a well-known set of test func-

tions. The results obtained are very competitive when comparing the proposed

approach against the previous version and other approaches representative of the

state-of-the-art in constrained evolutionary optimization. Moreover, its computa-

tional cost (measured in terms of the number of ﬁtness function evaluations) is

lower than the cost required by the other techniques compared.

1 Introduction

Evolutionary algorithms (EAs) have been successfully used to solve different types of

optimization problems [1]. However, in their original form, they lack an explicit mech-

anism to handle the constraints of a problem. This has motivated the development of a

considerable number of approaches to incorporate constraints into the ﬁtness function

of an EA [2,3]. Particularly, in this paper we are interested in the general nonlinear

programming problem, in which we want to find x that optimizes f(x) subject to gi(x) ≤ 0, i = 1, ..., n, and hj(x) = 0, j = 1, ..., p, where x is the vector of decision variables x = [x1, x2, ..., xr]^T, n is the number of inequality constraints and p is the number of equality constraints (in both cases, constraints may be linear or nonlinear).

The most common approach adopted to deal with constrained search spaces is the

use of penalty functions [4]. When using a penalty function, the amount of constraint

violation is used to punish or “penalize” an infeasible solution so that feasible solu-

tions are favored by the selection process. Nonetheless, the main drawback of penalty

functions is that they require a careful ﬁne tuning of the penalty factors that accurately

estimates the degree of penalization to be applied, so that the feasible region can be approached efficiently [3].
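To make this drawback concrete, a minimal static-penalty sketch (for illustration only; this is not the method proposed in this paper, and the factor r = 1000 is a hypothetical value that would need hand-tuning per problem) looks like this:

```python
def penalized_fitness(f, constraints, x, r=1000.0):
    # Classic static penalty (illustrative; the approach in this paper
    # deliberately avoids penalty functions). `r` is a hypothetical
    # penalty factor that must be tuned by hand.
    violation = sum(max(0.0, g(x)) for g in constraints)
    return f(x) + r * violation  # minimization: violations worsen fitness

# toy problem: minimize x^2 subject to g(x) = 1 - x <= 0 (i.e., x >= 1)
f = lambda x: x * x
g = lambda x: 1.0 - x
print(penalized_fitness(f, [g], 0.5))  # infeasible: 0.25 + 1000*0.5 = 500.25
print(penalized_fitness(f, [g], 1.0))  # feasible: plain f(x) = 1.0
```

A poorly chosen r either swamps the objective or lets infeasible points win, which is exactly the tuning burden described above.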

The algorithm presented in this paper is an improved version of two previous ap-

proaches. The ﬁrst version [5] was based on a (µ+ 1) Evolution Strategy coupled with

three simple selection criteria based on feasibility to guide the search to the feasible

region of the search space. A second version of the approach was proposed in [6] but

now using a (1 + λ)-ES and adding a diversity mechanism which consisted of allowing

solutions with a good value of the objective function to remain as a new starting point

in the next generation of the search, regardless of feasibility. The version presented in

this paper still uses the self-adaptive mutation mechanism of an ES, but we now adopt

a multimembered (µ+λ)-ES to explore constrained search spaces. This mechanism is

combined with the same three simple selection criteria used before to guide the search

towards the global optima of constrained optimization problems [6]. However, we now

add an improved diversity mechanism which, although simple, provides a signiﬁcant

improvement in terms of performance. The idea of this mechanism is to allow the in-

dividual with both the lowest amount of constraint violation and the best value of the

objective function to be selected for the next generation. This solution can be chosen

with equal (50%) probability either from the parents or from the offspring population. With

the combination of the above elements, the algorithm ﬁrst focuses on reaching the fea-

sible region of the search space. After that, it is capable of moving over the feasible

region as to reach the global optimum. The infeasible solutions that remain in the pop-

ulation are then used to sample points in the boundaries between the feasible and the

infeasible regions. Thus, the main focus of this paper is to show how a multimembered

ES coupled with the previously described diversity mechanism, has a highly compet-

itive performance in constrained problems when compared with respect to algorithms

representative of the state-of-the-art in the area.

This paper is organized as follows: In Section 2 a description of previous approaches

based on similar ideas to our own is provided. Section 3 includes the description of

the diversity mechanism that we propose. Then, in Section 4, we present the results

obtained and also a comparison against the previous version and state-of-the-art algo-

rithms. Such results are discussed in Section 5. Finally, in Section 6 we provide some

conclusions and possible paths for future research.

2 Related Work

The hypothesis that originated this work is the following: (1) The self-adaptation mech-

anism of an ES helps to sample the search space well enough as to reach the feasible

region reasonably fast and (2) the mere addition of simple selection criteria based on

feasibility to an ES should be enough to guide the search in such a way that the global

optimum can be approached efﬁciently.

The three simple selection criteria used are the following:

1. Between two feasible solutions, the one with the highest fitness value wins (assuming a maximization problem).

2. If one solution is feasible and the other one is infeasible, the feasible solution wins.

3. If both solutions are infeasible, the one with the lowest sum of constraint violation

is preferred.
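These three criteria reduce to a binary comparator. A minimal sketch, assuming inequality constraints gi(x) ≤ 0 and a dict representation of our own choosing (not from the paper):

```python
def violation(g_values):
    # Sum of constraint violation for inequality constraints g_i(x) <= 0.
    return sum(max(0.0, g) for g in g_values)

def better(a, b):
    # Winner of a binary comparison under the three feasibility-based
    # criteria (maximization assumed, as in the text). Each solution is a
    # dict with 'f' (fitness) and 'g' (constraint values).
    va, vb = violation(a['g']), violation(b['g'])
    feas_a, feas_b = va == 0.0, vb == 0.0
    if feas_a and feas_b:          # rule 1: both feasible -> higher fitness
        return a if a['f'] >= b['f'] else b
    if feas_a != feas_b:           # rule 2: feasible beats infeasible
        return a if feas_a else b
    return a if va <= vb else b    # rule 3: lower total violation wins

x = {'f': 5.0, 'g': [-1.0]}   # feasible
y = {'f': 9.0, 'g': [0.3]}    # infeasible, but with higher fitness
print(better(x, y)['f'])      # -> 5.0 (rule 2 applies)
```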

The use of these criteria has been explored by other authors. Jiménez and Verdegay [7] proposed an approach similar to a min-max formulation used in multiobjective

optimization combined with tournament selection. The rules used by them are similar to

those adopted in this work. However, Jiménez and Verdegay's approach lacks an explicit

mechanism to avoid the premature convergence produced by the random sampling of

the feasible region because their approach is guided by the ﬁrst feasible solution found.

Deb [8] used the same tournament rules previously indicated in his approach. However,

Deb proposed to use niching as a diversity mechanism, which introduces some extra

computational time (niching has time complexity O(N²)). In Deb’s approach, feasible

solutions are always considered better than infeasible ones. This contradicts the idea

of allowing infeasible individuals to remain in the population. Therefore, this approach

will have difﬁculties in problems in which the global optimum lies on the boundary

between the feasible and the infeasible regions.

Motivated by the fact that some of the most recent and competitive approaches to

incorporate constraints into an EA use an ES (see for example [9,10]), we proposed [5]

a Simple (µ + 1) Evolution Strategy (SES) to solve constrained optimization problems, in which one child, created from µ mutations of the current solution, competes against it

and the better one is selected as the new current solution. This approach is based on the

two mechanisms previously indicated.

However, the approach in [5] tended to get trapped in local optima. Thus,

in order to improve the quality and robustness of the results, a diversity mechanism was

added in [6]. In this case, a (1 + λ)-ES was adopted and the diversity mechanism consisted of allowing solutions with a good value of the objective function to remain as a

new starting point for the search at each generation, regardless of feasibility. Addition-

ally, we introduced a self-adaptive parameter called Selection Ratio (Sr), which refers

to the percentage of selections that will be performed in a deterministic way (as used

in the ﬁrst version of the SES [5] where the child replaces the current solution based

on the three selection criteria previously indicated). In the remaining 1 − Sr selections,

there were two choices: (1) either the individual (out of the λ) with the best value of

the objective function would replace the current solution (regardless of its feasibility)

or (2) the best parent (based on the three selection criteria) would replace the current

solution. Both options are given a 50% probability each. The results improved, but for

some test problems no feasible solutions could be found and for other functions the

statistical results did not show enough robustness.
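That Sr-based replacement rule of the previous version [6] can be sketched as follows; the naming and representation are ours, and the comparator `better` stands for the three feasibility criteria above:

```python
import random

def sr_replacement(current, offspring, s_r, better):
    # Sketch of the Selection Ratio (Sr) rule of the second SES version [6].
    # With probability s_r the replacement is deterministic, using the
    # feasibility-based comparator `better`; otherwise, with 50% probability
    # each, either the offspring with the best raw objective value replaces
    # the current solution (feasibility ignored), or the best parent is kept.
    if random.random() < s_r:
        winner = current
        for child in offspring:
            winner = better(winner, child)
        return winner
    if random.random() < 0.5:
        return max(offspring, key=lambda s: s['f'])  # objective only
    return current  # keep the parent of the (1 + lambda)-ES

better_f = lambda a, b: a if a['f'] >= b['f'] else b  # placeholder comparator
cur = {'f': 1.0}
kids = [{'f': 0.5}, {'f': 2.0}]
print(sr_replacement(cur, kids, 1.0, better_f))  # deterministic branch
```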

3 The New Diversity Mechanism

The two previous versions of the algorithm [5,6] are based on a single-membered ES

and they lack the explorative power to sample large search spaces. Thus, we decided to

re-evaluate the use of a (µ+λ)-ES to solve this limitation, but in this case, improving

the diversity mechanism implemented in the second version of our approach [6] and

eliminating the use of the self-adaptive Sr parameter. The new version of the SES is based on the same concepts as its predecessors, as discussed before.

The detailed features of the improved diversity mechanism are the following: At

each generation, we allow the infeasible individual with the best value of the objec-

tive function and with the lowest amount of constraint violation to survive for the next

generation. We call this solution the best infeasible solution. In fact, there are two best infeasible solutions at each generation: one from the µ parents and one from the λ offspring. With probability 0.03, the selection process will choose a best infeasible individual, taken with equal (50%) probability from either the parents or the offspring.

Therefore, the same best infeasible solution can be copied more than once into the

next population. However, this is a desired behavior because a few copies of this so-

lution will allow its recombination with several solutions in the population, especially

with feasible ones. Recombining feasible solutions with infeasible solutions in promis-

ing areas (based on the good value of the objective function) and close to the boundary

of the feasible region will allow the ES to reach global optimum solutions located in

the boundary of the feasible region of the search space (which are known to be the most

difﬁcult solutions to reach). See Figure 1.

Fig. 1. Diagram that illustrates the idea of searching the boundaries with the new diversity mechanism proposed in this paper (the figure marks the feasible region, its boundaries, the feasible solutions, the best infeasible solution, and a possible crossover between them).

When the selection process occurs, the best individuals among the parents and off-

spring are selected based on the three selection criteria previously indicated. The se-

lection process will pick feasible solutions with a better value of the objective func-

tion ﬁrst, followed by infeasible solutions with a lower constraint violation. However,

3 times out of every 100 picks, the best infeasible solution (from either the parents or the offspring population, with 50% probability each) is copied into the population for the next generation. The pseudocode is listed in Figure 2.

We chose the value of 3 based on the previous version [6], which used a population of just 3 offspring. With this low number of solutions, the approach provided good results.

function selection()
    For i = 1 to µ Do
        If flip(0.97)
            Select the best individual, based on the selection criteria, from the union
            of the parents and offspring populations; add it to the population for the
            next generation and delete it from this union.
        Else
            If flip(0.5)
                Select the best infeasible individual from the parents population and
                add it to the population for the next generation.
            Else
                Select the best infeasible individual from the offspring population and
                add it to the population for the next generation.
            End If
        End If
    End For
End

Fig. 2. Pseudocode of the selection procedure with the diversity mechanism incorporated. flip(P) is a function that returns TRUE with probability P.
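The selection procedure of Fig. 2 can be rendered as the following sketch. The solution representation (dicts with objective value 'f' and inequality-constraint values 'g'), the fallback when no infeasible individual exists, and the tie-breaking inside best_infeasible are our own assumptions, since the paper does not spell them out:

```python
import random

def violation(sol):
    return sum(max(0.0, g) for g in sol['g'])

def is_feasible(sol):
    return violation(sol) == 0.0

def best_by_criteria(pool):
    # Best individual under the three feasibility rules (maximization):
    # feasible beats infeasible; among feasible, higher f; among
    # infeasible, lower total violation.
    return max(pool, key=lambda s: (is_feasible(s),
                                    s['f'] if is_feasible(s) else -violation(s)))

def best_infeasible(pool):
    # Infeasible individual with lowest violation, then best objective;
    # this lexicographic tie-breaking is one plausible reading of the text.
    infeas = [s for s in pool if not is_feasible(s)]
    return max(infeas, key=lambda s: (-violation(s), s['f'])) if infeas else None

def selection(parents, offspring, mu):
    # Python rendering of Fig. 2; requires mu <= len(parents) + len(offspring).
    union = parents + offspring
    next_pop = []
    for _ in range(mu):
        if random.random() < 0.97:
            pick = best_by_criteria(union)
            union.remove(pick)
        else:
            pool = parents if random.random() < 0.5 else offspring
            pick = best_infeasible(pool)
            if pick is None:                    # no infeasible individual:
                pick = best_by_criteria(union)  # fall back (our assumption)
                union.remove(pick)
        next_pop.append(pick)
    return next_pop
```

Note that a best infeasible pick is not deleted from the pool, so the same solution can be copied more than once, which is exactly the intended behavior described above.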

4 Experiments and Results

To evaluate the performance of the proposed approach we used the 13 test functions

described in [9]. The test functions chosen contain characteristics that are representative

of what can be considered “difﬁcult” global optimization problems for an evolutionary

algorithm. Their expressions are provided in the Appendix at the end of this paper.

To get an estimate of the ratio between the feasible region and the entire search

space for these problems, a ρ metric (as suggested by Michalewicz and Schoenauer [2]) was computed using the expression ρ = |F|/|S|, where |F| is the number of feasible solutions and |S| is the total number of randomly generated solutions. In this work, |S| = 1,000,000 random solutions.
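The ρ estimate is plain Monte Carlo sampling and can be reproduced as follows; the toy constraint below (a unit disk, whose true ρ is π/4 ≈ 0.785) is our own example, not one of the 13 test problems:

```python
import random

def estimate_rho(constraints, bounds, samples=100000, seed=1):
    # Monte Carlo estimate of rho = |F|/|S| (Michalewicz & Schoenauer):
    # fraction of uniformly random points in the box `bounds` that satisfy
    # every inequality constraint g_i(x) <= 0.
    rng = random.Random(seed)
    feasible = 0
    for _ in range(samples):
        x = [rng.uniform(lo, hi) for lo, hi in bounds]
        if all(g(x) <= 0.0 for g in constraints):
            feasible += 1
    return feasible / samples

# toy check: unit disk inside the [-1, 1]^2 box -> rho close to pi/4
g = lambda x: x[0] ** 2 + x[1] ** 2 - 1.0
rho = estimate_rho([g], [(-1.0, 1.0), (-1.0, 1.0)], samples=20000)
print(round(rho, 2))  # close to 0.79
```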

The different values of ρ for each of the functions chosen are shown in Table 4, where n is the number of decision variables, LI is the number of linear inequalities, NI the number of nonlinear inequalities, LE is the number of linear equalities and NE is the number of nonlinear equalities. We performed 30 independent runs for each test function. The learning-rate values were calculated using the formulas proposed by Schwefel [12] (where n is the number of decision variables of the problem): τ = (√(2√n))^−1 and τ0 = (√(2n))^−1. In order to favor finer movements in the search space (as we observed

in the previous versions of the approach, where only one sigma value was used and improvements increased when it took values close to zero), we decided

to experiment with just a percentage of the quantity obtained by the formula proposed

by Schwefel [12]. We initialized the sigma values for all the individuals in the initial

Statistical Results of the New SES with the Improved Diversity Mechanism

Problem Optimal Best Mean Median Worst St. Dev.

g01 −15.00 −15.00 −15.00 −15.00 −15.00 0

g02 0.803619 0.803601 0.785238 0.792549 0.751322 1.67E-2

g03 1.00 1.00 1.00 1.00 1.00 2.09E-4

g04 −30665.539 −30665.539 −30665.539 −30665.539 −30665.539 0

g05 5126.498 5126.599 5174.492 5160.198 5304.167 50.05E+0

g06 −6961.814 −6961.814 −6961.284 −6961.814 −6952.482 1.85E+0

g07 24.306 24.327 24.475 24.426 24.843 1.32E-1

g08 0.095825 0.095825 0.095825 0.095825 0.095825 0

g09 680.630 680.632 680.643 680.642 680.719 1.55E-2

g10 7049.25 7051.90 7253.05 7253.60 7638.37 136.0E+0

g11 0.75 0.75 0.75 0.75 0.75 1.52E-4

g12 1.00 1.00 1.00 1.00 1.00 0

g13 0.053950 0.053986 0.166385 0.061873 0.468294 1.76E-1

Table 1. Statistical results obtained by our SES for the 13 test functions with 30 inde-

pendent runs. A result in boldface means global optimum solution found.

Best Result Mean Result Worst Result

Problem Optimal NEW SES OLD NEW SES OLD NEW SES OLD

g01 −15.00 −15.00 −15.00 −15.00 −15.00 −15.00 −15.00

g02 0.803619 0.803601 0.803569 0.785238 0.769612 0.751322 0.702322

g03 1.00 1.00 1.00 1.00 1.00 1.00 1.00

g04 −30665.539 −30665.539 −30665.539 −30665.539 −30665.539 −30665.539 −30665.539

g05 5126.498 5126.599 - 5174.492 - 5304.167 -

g06 −6961.814 −6961.814 −6961.814 −6961.284 −6961.814 −6952.482 −6961.814

g07 24.306 24.327 24.314 24.475 24.419 24.843 24.561

g08 0.095825 0.095825 0.095825 0.095825 0.095784 0.095825 0.095473

g09 680.630 680.632 680.669 680.643 680.810 680.719 681.199

g10 7049.25 7051.90 7057.04 7253.05 10771.42 7638.37 16375.27

g11 0.75 0.75 0.75 0.75 0.75 0.75 0.76

g12 1.00 1.00 1.00 1.00 1.00 1.00 1.00

g13 0.053950 0.053986 0.053964 0.166385 0.264135 0.468294 0.544346

Table 2. Comparison of results between the new SES and the old one proposed in [6].

“-” means no feasible solutions were found. A result in boldface means a better value

obtained by our new approach.

Best Result Mean Result Worst Result

Problem Optimal New SES HM New SES HM New SES HM

g01 −15.00 −15.00 −14.7886 −15.00 −14.7082 −15.00 −14.6154

g02 0.803619 0.803601 0.79953 0.785238 0.79671 0.751322 0.79119

g03 1.00 1.00 0.9997 1.00 0.9989 1.00 0.9978

g04 −30665.539 −30665.539 −30664.5 −30665.539 −30655.3 −30665.539 −30645.9

g05 5126.498 5126.599 - 5174.492 - 5304.167 -

g06 −6961.814 −6961.814 −6952.1−6961.284 −6342.6−6952.482 −5473.9

g07 24.306 24.327 24.620 24.475 24.826 24.843 25.069

g08 0.095825 0.095825 0.0958250 0.095825 0.0891568 0.095825 0.0291438

g09 680.63 680.632 680.91 680.643 681.16 680.719 683.18

g10 7049.25 7051.90 7147.9 7253.05 8163.6 7638.37 9659.3

g11 0.75 0.75 0.75 0.75 0.75 0.75 0.75

g12 1.00 1.00 0.999999 1.00 0.999134 1.00 0.991950

g13 0.053950 0.053986 NA 0.166385 NA 0.468294 NA

Table 3. Comparison of the new version of the SES with respect to the Homomorphous

Maps (HM) [11]. “-” means no feasible solutions were found. A result in boldface

means a better value obtained by our new approach.

Problem n Function ρLI NI LE NE

g01 13 quadratic 0.0003% 9 0 0 0

g02 20 nonlinear 99.9973% 1 1 0 0

g03 10 nonlinear 0.0026% 0 0 0 1

g04 5 quadratic 27.0079% 0 6 0 0

g05 4 nonlinear 0.0000% 2 0 0 3

g06 2 nonlinear 0.0057% 0 2 0 0

g07 10 quadratic 0.0000% 3 5 0 0

g08 2 nonlinear 0.8581% 0 2 0 0

g09 7 nonlinear 0.5199% 0 4 0 0

g10 8 linear 0.0020% 3 3 0 0

g11 2 quadratic 0.0973% 0 0 0 1

g12 3 quadratic 4.7697% 0 9³ 0 0

g13 5 nonlinear 0.0000% 0 0 1 2

Table 4. Values of ρ for the 13 test problems chosen.

population with only 40% of the value obtained by the following formula (where n is the number of decision variables): σi(0) = 0.4 × (∆xi/√n), where ∆xi is approximated with the expression (suggested in [9]) ∆xi ≈ xi^u − xi^l, where xi^u and xi^l are the upper and lower limits of the decision variable i. For the experiments we used the following parameters: (100 + 300)-ES, number of generations = 800, number of objective function evaluations = 240,000.
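The learning rates and initial stepsizes above can be computed directly; a sketch (the 0.4 factor is the 40% reduction described in the text, and passing a different factor would cover the per-problem variants):

```python
import math

def learning_rates(n):
    # Schwefel's settings: tau = (sqrt(2*sqrt(n)))^-1, tau0 = (sqrt(2n))^-1,
    # where n is the number of decision variables.
    tau = 1.0 / math.sqrt(2.0 * math.sqrt(n))
    tau0 = 1.0 / math.sqrt(2.0 * n)
    return tau, tau0

def initial_sigmas(bounds, factor=0.4):
    # sigma_i(0) = factor * (delta_x_i / sqrt(n)), with
    # delta_x_i ~ x_i^u - x_i^l (upper minus lower bound of variable i).
    n = len(bounds)
    return [factor * (hi - lo) / math.sqrt(n) for lo, hi in bounds]

# example: 2 decision variables, both in [0, 10]
print(learning_rates(2))
print(initial_sigmas([(0.0, 10.0), (0.0, 10.0)]))
```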

To increase the exploitation feature of the global crossover operator we combine

discrete and intermediate crossover. Each gene in the chromosome can be processed

with either of these two crossover operators with 50% probability. This operator is applied to both the strategy parameters (sigma values) and the decision variables of the

problem. Note that we do not use correlated mutation. To deal with equality constraints,

a parameterless dynamic mechanism originally proposed in ASCHEA [10] and used in

[5] and in [6] is adopted. The tolerance value ε is decreased with respect to the current generation using the following expression: ε(t + 1) = ε(t)/1.00195. The initial ε(0) was set to 0.001. For problem g13, ε(0) was set to 3.0 and, in consequence, the factor to decrease the tolerance value was modified to ε(t + 1) = ε(t)/1.0145. Also, for

problems g03 and g13 the initial stepsize required a more dramatic reduction. It was defined as 0.01 (just a 5% instead of the 40%) for g03 and 0.05 (a 2.5% instead of the 40%) for g13. These two test functions seem to provide better results

with very smooth movements. It is important to note that these two problems share the

following features: moderately high dimensionality (ﬁve or more decision variables),

nonlinear objective function, one or more equality constraints, and moderate size of the

search space (based on the range of the decision variables). These common features

suggest that for this type of problem, ﬁner movements provide a better sampling of the

search space using an evolution strategy.
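The combined crossover operator and the dynamic tolerance for equality constraints described above can be sketched as follows; the function names and representation are ours:

```python
import random

def combined_crossover(p1, p2, rng=random):
    # Global crossover combining discrete and intermediate recombination:
    # each gene is handled by one of the two operators with 50% probability.
    # The same scheme is applied to the decision variables and to the
    # strategy parameters (sigma values).
    child = []
    for a, b in zip(p1, p2):
        if rng.random() < 0.5:
            child.append(a if rng.random() < 0.5 else b)  # discrete: copy one parent's gene
        else:
            child.append(0.5 * (a + b))                   # intermediate: average the genes
    return child

def tolerance_schedule(eps0=0.001, divisor=1.00195, generations=800):
    # Dynamic tolerance for equality constraints |h_j(x)| <= eps(t):
    # eps(t+1) = eps(t)/divisor (eps0 = 3.0 and divisor = 1.0145 for g13).
    eps = [eps0]
    for _ in range(generations):
        eps.append(eps[-1] / divisor)
    return eps

child = combined_crossover([0.0, 0.0, 0.0], [1.0, 1.0, 1.0])
print(all(c in (0.0, 0.5, 1.0) for c in child))          # -> True
print(tolerance_schedule(3.0, 1.0145, 800)[-1] < 1e-4)   # -> True
```

By the end of the run the tolerance has shrunk by several orders of magnitude, so solutions must satisfy the equality constraints ever more tightly as the search progresses.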

The statistical results of this new version of the SES with the improved diversity

mechanism are summarized in Table 1. The comparison of the improved version against

the previous one [6] is presented in Table 2. We compared our approach against the pre-

vious version of the SES [6] in Table 2 and against three state-of-the-art approaches:

the Homomorphous Maps (HM) [11] in Table 3, Stochastic Ranking (SR) [9] in Table

5 and the Adaptive Segregational Constraint Handling Evolutionary Algorithm (AS-

CHEA) [10] in Table 6.

The Homomorphous Maps performs a homomorphous mapping between an n-

dimensional cube and the feasible search region (either convex or non-convex). The

main idea of this approach is to transform the original problem into another (topolog-

ically equivalent) function that is easier to optimize by the EA. Both the Stochastic Ranking and ASCHEA are based on a penalty function approach. SR sorts the indi-

viduals in the population in order to assign them a rank value. However, based on the

value of a user-deﬁned parameter, the comparison between two adjacent solutions will

be performed using only the objective function. The remaining comparisons will be per-

formed using only the penalty value (the sum of constraint violation). ASCHEA uses

three combined mechanisms: (1) an adaptive penalty function, (2) a constraint-driven recombination that forces the selection of a feasible individual to recombine with an infeasible

one and (3) a segregational selection based on feasibility which maintains a balance

between feasible and infeasible solutions in the population. ASCHEA also requires a

niching mechanism to improve the diversity in the population. Each mechanism requires

the deﬁnition by the user of extra parameters.

Best Result Mean Result Worst Result

Problem Optimal New SES SR New SES SR New SES SR

g01 −15.00 −15.00 −15.000 −15.00 −15.000 −15.00 −15.000

g02 0.803619 0.803601 0.803515 0.785238 0.781975 0.751322 0.726288

g03 1.00 1.00 1.000 1.00 1.000 1.00 1.000

g04 −30665.539 −30665.539 −30665.539 −30665.539 −30665.539 −30665.539 −30665.539

g05 5126.498 5126.599 5126.497 5174.492 5128.881 5304.165 5142.472

g06 −6961.814 −6961.814 −6961.814 −6961.284 −6875.940 −6952.482 −6350.262

g07 24.306 24.327 24.307 24.475 24.374 24.843 24.642

g08 0.095825 0.095825 0.095825 0.095825 0.095825 0.095825 0.095825

g09 680.63 680.632 680.630 680.643 680.656 680.719 680.763

g10 7049.25 7051.90 7054.316 7253.05 7559.192 7638.37 8835.655

g11 0.75 0.75 0.750 0.75 0.750 0.75 0.750

g12 1.00 1.00 1.00 1.00 1.00 1.00 1.00

g13 0.053950 0.053986 0.053957 0.166385 0.057006 0.468294 0.216915

Table 5. Comparison of our new version of the SES with respect to Stochastic Ranking

(SR) [9]. A result in boldface means a better value obtained by our new approach.

5 Discussion of Results

As described in Table 1, our approach was able to ﬁnd the global optimum in seven test

functions (g01, g03, g04, g06, g08, g11 and g12) and it found solutions very close to

the global optimum in the remaining six (g02, g05, g07, g09, g10, g13). Compared with

its previous version [6] (Table 2) this new diversity mechanism improved the quality of

the results in problems g02, g05, g09 and g10. Also, the robustness of the results was

better in problems g02, g05, g08, g09, g10 and g13.

Best Result Mean Result Worst Result

Problem Optimal New SES ASCHEA New SES ASCHEA New SES ASCHEA

g01 −15.0 −15.00 −15.0 −15.00 −14.84 −15.00 NA

g02 0.803619 0.803601 0.785 0.785238 0.59 0.751322 NA

g03 1.00 1.00 1.0 1.00 0.99989 1.00 NA

g04 −30665.539 −30665.539 30665.5 −30665.539 30665.5 −30665.539 NA

g05 5126.498 5126.599 5126.5 5174.492 5141.65 5304.167 NA

g06 −6961.814 −6961.814 −6961.81 −6961.284 −6961.81 −6952.482 NA

g07 24.306 24.327 24.3323 24.475 24.66 24.843 NA

g08 0.095825 0.095825 0.095825 0.095825 0.095825 0.095825 NA

g09 680.630 680.632 680.630 680.643 680.641 680.719 NA

g10 7049.25 7051.90 7061.13 7253.05 7193.11 7638.37 NA

g11 0.75 0.75 0.75 0.75 0.75 0.75 NA

g12 1.00 1.00 NA 1.00 NA 1.00 NA

g13 0.053950 0.053986 NA 0.166385 NA 0.468294 NA

Table 6. Comparison of our new version of the SES with respect to ASCHEA [10].

NA = Not Available. A result in boldface means a better value obtained by our new

approach.

When compared with respect to the three state-of-the-art techniques previously in-

dicated, we found the following: Compared with the Homomorphous Maps (Table 3)

the new SES found a better “best” solution in ten problems (g01, g02, g03, g04, g05,

g06, g07, g09, g10 and g12) and a similar “best” result in other two (g08 and g11).

Also, our technique reached better “mean” and “worst” results in ten problems (g01,

g03, g04, g05, g06, g07, g08, g09, g10 and g12). A “similar” mean and worst result

was found in problem g11. The Homomorphous maps found a “better” mean and worst

result in function g02. No comparisons were made with respect to function g13 because

such results were not available for HM.

With respect to Stochastic Ranking (Table 5), our approach was able to ﬁnd a better

“best” result in functions g02 and g10. In addition, it found a “similar” best solution in

seven problems (g01, g03, g04, g06, g08, g11 and g12). Slightly better “best” results

were found by SR in the remaining functions (g05, g07, g09 and g13). The new SES

found better “mean” and “worst” results in four test functions (g02, g06, g09 and g10).

It also provided similar “mean” and “worst” results in six functions (g01, g03, g04, g08,

g11 and g12). Finally, SR found again just slightly better “mean” and “worst” results in

functions g05, g07 and g13.

Compared against the Adaptive Segregational Constraint Handling Evolutionary

Algorithm (Table 6), our algorithm found “better” best solutions in three problems (g02,

g07 and g10) and it found “similar” best results in six functions (g01, g03, g04, g06,

g08, g11). ASCHEA found slightly “better” best results in function g05 and g09. Ad-

ditionally, the new SES found “better” mean results in four problems (g01, g02, g03

and g07) and it found “similar” mean results in three functions (g04, g08 and g11).

ASCHEA surpassed our mean results in four functions (g05, g06, g09 and g10). We did

not compare the worst results because they were not available for ASCHEA. We did

not perform comparisons with respect to ASCHEA using functions g12 and g13 for the

same reason. As we can see, our approach showed a very competitive performance with

respect to these three state-of-the-art approaches.

Our approach can deal with moderately constrained problems (g04), highly con-

strained problems, problems with low (g06, g08), moderate (g09) and high (g01, g02,

g03, g07) dimensionality, with different types of combined constraints (linear, non-

linear, equality and inequality) and with very large (g02), very small (g05 and g13) or

even disjoint (g12) feasible regions. Also, the algorithm is able to deal with large search

spaces (based on the intervals of the decision variables) with a very small feasible region

(g10). Furthermore, the approach can ﬁnd the global optimum in problems where such

optimum lies on the boundaries of the feasible region (g01, g02, g04, g06, g07, g09).

This behavior suggests that the mechanism of maintaining the best infeasible solution

helps the search to sample the boundaries of the feasible region.

Besides still being a very simple approach, it is worth noting that our algorithm

does not require the ﬁne-tuning of any extra parameters (other than those used with an

evolution strategy) since the only parameters required by the approach have remained

ﬁxed in all cases. In contrast, the Homomorphous maps require an additional parameter

(called v) which has to be found empirically [11]. Stochastic ranking requires the deﬁni-

tion of a parameter called Pf, whose value has an important impact on the performance

of the approach [9]. ASCHEA also requires the deﬁnition of several extra parameters,

and in its latest version, it uses niching, which is a process that also has at least one

additional parameter [10].

The computational cost, measured in terms of the number of fitness function evaluations (FFE), is lower for our algorithm than for the other approaches with which it was compared. This is an additional (and important)

advantage, mainly if we wish to use this approach for solving real-world problems. Our

new approach performed 240,000 FFE, the previous version required 330,000 FFE,

the Stochastic Ranking performed 350,000 FFE, the Homomorphous Maps performed

1,400,000 FFE, and ASCHEA required 1,500,000 FFE.

6 Conclusions and Future Work

An improved diversity mechanism, added to a multimembered Evolution Strategy combined with selection criteria based on feasibility, was proposed to solve (rather efficiently) constrained optimization problems. The proposed approach does not require

the use of a penalty function and it does not require the ﬁne-tuning of any extra parame-

ters (other than those required by an evolution strategy), since they assume ﬁxed values.

The proposed approach uses the self-adaptation mechanism of a multimembered ES to

sample the search space in order to reach the feasible region and it uses three simple

selection criteria based on feasibility to guide the search towards the global optimum.

Moreover, the proposed technique adopts a diversity mechanism which consists of al-

lowing infeasible solutions close to the boundaries of the feasible region to remain in

the next population. This approach is very easy to implement and its computational cost

(measured in terms of the number of ﬁtness function evaluations) is considerably lower

than the cost reported by other three constraint-handling techniques which are repre-

sentative of the state-of-the-art in evolutionary optimization. Despite its lower compu-

tational cost, the proposed approach was able to match (and even improve) on the results

obtained by the other algorithms with respect to which it was compared.

As part of our future work, we plan to evaluate the rate at which our algorithm

reaches the feasible region. This is an important issue when dealing with real-world

applications, since in highly constrained search spaces, reaching the feasible region

may be a rather costly task. Additionally, we have to perform more experiments in

order to establish which of the three mechanisms of the approach (diversity mechanism,

combined crossover or the reduced stepsize) is mandatory or if only their combined

effect makes the algorithm work.

Acknowledgments

The first author acknowledges support from the Mexican Consejo Nacional de Ciencia y Tecnología (CONACyT) through a scholarship to pursue graduate studies at CINVESTAV-IPN. The second author acknowledges support from CONACyT through project number 34201-A.

Appendix: Test Functions

1. g01: Minimize: f(x) = 5 ∑_{i=1}^{4} xi − 5 ∑_{i=1}^{4} xi² − ∑_{i=5}^{13} xi subject to:
g1(x) = 2x1 + 2x2 + x10 + x11 − 10 ≤ 0, g2(x) = 2x1 + 2x3 + x10 + x12 − 10 ≤ 0,
g3(x) = 2x2 + 2x3 + x11 + x12 − 10 ≤ 0, g4(x) = −8x1 + x10 ≤ 0,
g5(x) = −8x2 + x11 ≤ 0, g6(x) = −8x3 + x12 ≤ 0, g7(x) = −2x4 − x5 + x10 ≤ 0,
g8(x) = −2x6 − x7 + x11 ≤ 0, g9(x) = −2x8 − x9 + x12 ≤ 0,
where the bounds are 0 ≤ xi ≤ 1 (i = 1, ..., 9), 0 ≤ xi ≤ 100 (i = 10, 11, 12) and 0 ≤ x13 ≤ 1. The global optimum is at x* = (1, 1, 1, 1, 1, 1, 1, 1, 1, 3, 3, 3, 1), where f(x*) = −15. Constraints g1, g2, g3, g4, g5 and g6 are active.
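As a quick sanity check of the g01 definition, the objective and constraints can be evaluated at the reported optimum (the vector layout below is our own representation):

```python
def g01_objective(x):
    # f(x) = 5*sum(x1..x4) - 5*sum(x1^2..x4^2) - sum(x5..x13)
    return 5 * sum(x[:4]) - 5 * sum(v * v for v in x[:4]) - sum(x[4:13])

def g01_constraints(x):
    # the nine inequality constraints g1..g9, each required to be <= 0
    x1, x2, x3, x4, x5, x6, x7, x8, x9, x10, x11, x12, x13 = x
    return [2*x1 + 2*x2 + x10 + x11 - 10, 2*x1 + 2*x3 + x10 + x12 - 10,
            2*x2 + 2*x3 + x11 + x12 - 10, -8*x1 + x10, -8*x2 + x11,
            -8*x3 + x12, -2*x4 - x5 + x10, -2*x6 - x7 + x11,
            -2*x8 - x9 + x12]

x_star = [1] * 9 + [3, 3, 3] + [1]                    # reported optimum
print(g01_objective(x_star))                          # -> -15
print(all(g <= 0 for g in g01_constraints(x_star)))   # -> True (feasible)
```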

2. g02: Maximize: f(x) = |(∑_{i=1}^{n} cos⁴(xi) − 2 ∏_{i=1}^{n} cos²(xi)) / √(∑_{i=1}^{n} i·xi²)| subject to:
g1(x) = 0.75 − ∏_{i=1}^{n} xi ≤ 0, g2(x) = ∑_{i=1}^{n} xi − 7.5n ≤ 0,
where n = 20 and 0 ≤ xi ≤ 10 (i = 1, ..., n). The global maximum is unknown; the best reported solution is [9] f(x*) = 0.803619. Constraint g1 is close to being active (g1 = −10⁻⁸).

3. g03: Maximize: f(x) = (√n)^n Π_{i=1}^{n} xi subject to: h(x) = Σ_{i=1}^{n} xi^2 − 1 = 0 where n = 10 and 0 ≤ xi ≤ 1 (i = 1, ..., n). The global maximum is at xi∗ = 1/√n (i = 1, ..., n) where f(x∗) = 1.
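A quick numerical check (an illustrative sketch, not from the paper) confirms that setting every xi = 1/√n satisfies the equality constraint and yields f = 1:

```python
import math

# g03: f(x) = (sqrt(n))^n * prod(x_i), subject to h(x) = sum(x_i^2) - 1 = 0
def g03(x):
    n = len(x)
    f = math.sqrt(n) ** n
    for xi in x:
        f *= xi
    h = sum(xi ** 2 for xi in x) - 1
    return f, h

n = 10
f, h = g03([1 / math.sqrt(n)] * n)
```

Up to floating-point round-off, f is 1 and h is 0 at this point.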

4. g04: Minimize: f(x) = 5.3578547x3^2 + 0.8356891x1x5 + 37.293239x1 − 40792.141 subject to:
g1(x) = 85.334407 + 0.0056858x2x5 + 0.0006262x1x4 − 0.0022053x3x5 − 92 ≤ 0
g2(x) = −85.334407 − 0.0056858x2x5 − 0.0006262x1x4 + 0.0022053x3x5 ≤ 0
g3(x) = 80.51249 + 0.0071317x2x5 + 0.0029955x1x2 + 0.0021813x3^2 − 110 ≤ 0
g4(x) = −80.51249 − 0.0071317x2x5 − 0.0029955x1x2 − 0.0021813x3^2 + 90 ≤ 0
g5(x) = 9.300961 + 0.0047026x3x5 + 0.0012547x1x3 + 0.0019085x3x4 − 25 ≤ 0
g6(x) = −9.300961 − 0.0047026x3x5 − 0.0012547x1x3 − 0.0019085x3x4 + 20 ≤ 0
where: 78 ≤ x1 ≤ 102, 33 ≤ x2 ≤ 45, 27 ≤ xi ≤ 45 (i = 3, 4, 5). The optimum solution is x∗ = (78, 33, 29.995256025682, 45, 36.775812905788) where f(x∗) = −30665.539. Constraints g1 and g6 are active.
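The objective can be verified at the listed optimum with a short sketch (the helper name g04_f is ours, not the paper's):

```python
# g04: quadratic objective; the reported optimum reproduces f(x*) = -30665.539
def g04_f(x):
    x1, x2, x3, x4, x5 = x
    return (5.3578547 * x3 ** 2 + 0.8356891 * x1 * x5
            + 37.293239 * x1 - 40792.141)

f_star = g04_f((78, 33, 29.995256025682, 45, 36.775812905788))
```

Evaluating f_star reproduces the reported optimum value to the stated precision.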

5. g05: Minimize: f(x) = 3x1 + 0.000001x1^3 + 2x2 + (0.000002/3)x2^3 subject to:
g1(x) = −x4 + x3 − 0.55 ≤ 0, g2(x) = −x3 + x4 − 0.55 ≤ 0
h3(x) = 1000 sin(−x3 − 0.25) + 1000 sin(−x4 − 0.25) + 894.8 − x1 = 0
h4(x) = 1000 sin(x3 − 0.25) + 1000 sin(x3 − x4 − 0.25) + 894.8 − x2 = 0
h5(x) = 1000 sin(x4 − 0.25) + 1000 sin(x4 − x3 − 0.25) + 1294.8 = 0
where 0 ≤ x1 ≤ 1200, 0 ≤ x2 ≤ 1200, −0.55 ≤ x3 ≤ 0.55, and −0.55 ≤ x4 ≤ 0.55. The best known solution is x∗ = (679.9453, 1026.067, 0.1188764, −0.3962336) where f(x∗) = 5126.4981.

6. g06: Minimize: f(x) = (x1 − 10)^3 + (x2 − 20)^3 subject to: g1(x) = −(x1 − 5)^2 − (x2 − 5)^2 + 100 ≤ 0, g2(x) = (x1 − 6)^2 + (x2 − 5)^2 − 82.81 ≤ 0 where 13 ≤ x1 ≤ 100 and 0 ≤ x2 ≤ 100. The optimum solution is x∗ = (14.095, 0.84296) where f(x∗) = −6961.81388. Both constraints are active.
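A minimal sketch (ours, not the paper's) evaluating g06 at the rounded optimum: because both constraints are active there, their values are near zero, so the check below tests |g| small rather than the sign, which round-off in the published x∗ can flip:

```python
# g06: cubic objective with two nonlinear inequality constraints;
# at the optimum both constraints are active (evaluate to ~0)
def g06(x):
    x1, x2 = x
    f = (x1 - 10) ** 3 + (x2 - 20) ** 3
    g1 = -(x1 - 5) ** 2 - (x2 - 5) ** 2 + 100
    g2 = (x1 - 6) ** 2 + (x2 - 5) ** 2 - 82.81
    return f, g1, g2

f, g1, g2 = g06((14.095, 0.84296))
```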

7. g07: Minimize: f(x) = x1^2 + x2^2 + x1x2 − 14x1 − 16x2 + (x3 − 10)^2 + 4(x4 − 5)^2 + (x5 − 3)^2 + 2(x6 − 1)^2 + 5x7^2 + 7(x8 − 11)^2 + 2(x9 − 10)^2 + (x10 − 7)^2 + 45
subject to: g1(x) = −105 + 4x1 + 5x2 − 3x7 + 9x8 ≤ 0
g2(x) = 10x1 − 8x2 − 17x7 + 2x8 ≤ 0, g3(x) = −8x1 + 2x2 + 5x9 − 2x10 − 12 ≤ 0
g4(x) = 3(x1 − 2)^2 + 4(x2 − 3)^2 + 2x3^2 − 7x4 − 120 ≤ 0
g5(x) = 5x1^2 + 8x2 + (x3 − 6)^2 − 2x4 − 40 ≤ 0
g6(x) = x1^2 + 2(x2 − 2)^2 − 2x1x2 + 14x5 − 6x6 ≤ 0
g7(x) = 0.5(x1 − 8)^2 + 2(x2 − 4)^2 + 3x5^2 − x6 − 30 ≤ 0
g8(x) = −3x1 + 6x2 + 12(x9 − 8)^2 − 7x10 ≤ 0
where −10 ≤ xi ≤ 10 (i = 1, ..., 10). The global optimum is x∗ = (2.171996, 2.363683, 8.773926, 5.095984, 0.9906548, 1.430574, 1.321644, 9.828726, 8.280092, 8.375927) where f(x∗) = 24.3062091. Constraints g1, g2, g3, g4, g5 and g6 are active.

8. g08: Maximize: f(x) = sin^3(2πx1) sin(2πx2) / (x1^3 (x1 + x2)) subject to: g1(x) = x1^2 − x2 + 1 ≤ 0, g2(x) = 1 − x1 + (x2 − 4)^2 ≤ 0 where 0 ≤ x1 ≤ 10 and 0 ≤ x2 ≤ 10. The optimum solution is located at x∗ = (1.2279713, 4.2453733) where f(x∗) = 0.095825.
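A short sketch (ours, not the paper's) evaluating g08 at the reported optimum; both constraints are strictly satisfied there:

```python
import math

# g08: maximize sin^3(2*pi*x1) * sin(2*pi*x2) / (x1^3 * (x1 + x2))
def g08(x):
    x1, x2 = x
    f = (math.sin(2 * math.pi * x1) ** 3 * math.sin(2 * math.pi * x2)
         / (x1 ** 3 * (x1 + x2)))
    g1 = x1 ** 2 - x2 + 1
    g2 = 1 - x1 + (x2 - 4) ** 2
    return f, g1, g2

f, g1, g2 = g08((1.2279713, 4.2453733))
```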

9. g09: Minimize: f(x) = (x1 − 10)^2 + 5(x2 − 12)^2 + x3^4 + 3(x4 − 11)^2 + 10x5^6 + 7x6^2 + x7^4 − 4x6x7 − 10x6 − 8x7
subject to: g1(x) = −127 + 2x1^2 + 3x2^4 + x3 + 4x4^2 + 5x5 ≤ 0, g2(x) = −282 + 7x1 + 3x2 + 10x3^2 + x4 − x5 ≤ 0
g3(x) = −196 + 23x1 + x2^2 + 6x6^2 − 8x7 ≤ 0, g4(x) = 4x1^2 + x2^2 − 3x1x2 + 2x3^2 + 5x6 − 11x7 ≤ 0
where −10 ≤ xi ≤ 10 (i = 1, ..., 7). The global optimum is x∗ = (2.330499, 1.951372, −0.4775414, 4.365726, −0.6244870, 1.038131, 1.594227) where f(x∗) = 680.6300573. Two constraints are active (g1 and g4).
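Again as an illustrative sketch (ours, not the paper's), g09 can be evaluated at the reported optimum; g1 and g4 come out near zero (active) and g2, g3 clearly negative:

```python
# g09: polynomial objective with four inequality constraints
def g09(x):
    x1, x2, x3, x4, x5, x6, x7 = x
    f = ((x1 - 10) ** 2 + 5 * (x2 - 12) ** 2 + x3 ** 4 + 3 * (x4 - 11) ** 2
         + 10 * x5 ** 6 + 7 * x6 ** 2 + x7 ** 4
         - 4 * x6 * x7 - 10 * x6 - 8 * x7)
    g = [
        -127 + 2 * x1 ** 2 + 3 * x2 ** 4 + x3 + 4 * x4 ** 2 + 5 * x5,   # g1
        -282 + 7 * x1 + 3 * x2 + 10 * x3 ** 2 + x4 - x5,                # g2
        -196 + 23 * x1 + x2 ** 2 + 6 * x6 ** 2 - 8 * x7,                # g3
        4 * x1 ** 2 + x2 ** 2 - 3 * x1 * x2 + 2 * x3 ** 2
        + 5 * x6 - 11 * x7,                                             # g4
    ]
    return f, g

x_star = (2.330499, 1.951372, -0.4775414, 4.365726,
          -0.6244870, 1.038131, 1.594227)
f, g = g09(x_star)
```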

10. g10: Minimize: f(x) = x1 + x2 + x3 subject to: g1(x) = −1 + 0.0025(x4 + x6) ≤ 0
g2(x) = −1 + 0.0025(x5 + x7 − x4) ≤ 0, g3(x) = −1 + 0.01(x8 − x5) ≤ 0
g4(x) = −x1x6 + 833.33252x4 + 100x1 − 83333.333 ≤ 0
g5(x) = −x2x7 + 1250x5 + x2x4 − 1250x4 ≤ 0
g6(x) = −x3x8 + 1250000 + x3x5 − 2500x5 ≤ 0
where 100 ≤ x1 ≤ 10000, 1000 ≤ xi ≤ 10000 (i = 2, 3), 10 ≤ xi ≤ 1000 (i = 4, ..., 8). The global optimum is: x∗ = (579.19, 1360.13, 5109.92, 182.0174, 295.5985, 217.9799, 286.40, 395.5979), where f(x∗) = 7049.25. Constraints g1, g2 and g3 are active.

11. g11: Minimize: f(x) = x1^2 + (x2 − 1)^2 subject to: h(x) = x2 − x1^2 = 0 where: −1 ≤ x1 ≤ 1, −1 ≤ x2 ≤ 1. The optimum solution is x∗ = (±1/√2, 1/2) where f(x∗) = 0.75.
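The two symmetric optima are easy to verify directly (a minimal sketch, not from the paper):

```python
import math

# g11: both optima (+1/sqrt(2), 1/2) and (-1/sqrt(2), 1/2)
# give f = 0.75 and satisfy h(x) = 0
def g11(x):
    x1, x2 = x
    f = x1 ** 2 + (x2 - 1) ** 2
    h = x2 - x1 ** 2
    return f, h

results = [g11((s / math.sqrt(2), 0.5)) for s in (1, -1)]
```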

12. g12: Maximize: f(x) = (100 − (x1 − 5)^2 − (x2 − 5)^2 − (x3 − 5)^2)/100 subject to: g1(x) = (x1 − p)^2 + (x2 − q)^2 + (x3 − r)^2 − 0.0625 ≤ 0 where 0 ≤ xi ≤ 10 (i = 1, 2, 3) and p, q, r = 1, 2, ..., 9. The feasible region of the search space consists of 9^3 disjoint spheres. A point (x1, x2, x3) is feasible if and only if there exist p, q, r such that the above inequality holds. The global optimum is located at x∗ = (5, 5, 5) where f(x∗) = 1.
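The "there exist p, q, r" feasibility test translates into a search over the 9^3 sphere centers; a minimal sketch (the helper names g12_f and g12_feasible are ours, not the paper's):

```python
# g12: a point is feasible iff it lies inside at least one of the 9^3
# spheres of radius 0.25 centered at the integer points (p, q, r)
def g12_f(x):
    x1, x2, x3 = x
    return (100 - (x1 - 5) ** 2 - (x2 - 5) ** 2 - (x3 - 5) ** 2) / 100

def g12_feasible(x):
    x1, x2, x3 = x
    return any((x1 - p) ** 2 + (x2 - q) ** 2 + (x3 - r) ** 2 <= 0.0625
               for p in range(1, 10)
               for q in range(1, 10)
               for r in range(1, 10))

print(g12_f((5, 5, 5)))        # 1.0
print(g12_feasible((5, 5, 5)))  # True
```

For example, (5.5, 5.5, 5.5) is infeasible: its squared distance to every sphere center is at least 0.75 > 0.0625.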

13. g13: Minimize: f(x) = e^(x1x2x3x4x5) subject to: h1(x) = x1^2 + x2^2 + x3^2 + x4^2 + x5^2 − 10 = 0
h2(x) = x2x3 − 5x4x5 = 0, h3(x) = x1^3 + x2^3 + 1 = 0 where −2.3 ≤ xi ≤ 2.3 (i = 1, 2) and −3.2 ≤ xi ≤ 3.2 (i = 3, 4, 5). The optimum solution is x∗ = (−1.717143, 1.595709, 1.827247, −0.7636413, −0.763645) where f(x∗) = 0.0539498.
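A final sketch (ours, not the paper's) checks that the reported optimum satisfies the three equality constraints to the precision of the published digits:

```python
import math

# g13: exponential objective with three equality constraints
def g13(x):
    x1, x2, x3, x4, x5 = x
    f = math.exp(x1 * x2 * x3 * x4 * x5)
    h1 = x1 ** 2 + x2 ** 2 + x3 ** 2 + x4 ** 2 + x5 ** 2 - 10
    h2 = x2 * x3 - 5 * x4 * x5
    h3 = x1 ** 3 + x2 ** 3 + 1
    return f, (h1, h2, h3)

f, h = g13((-1.717143, 1.595709, 1.827247, -0.7636413, -0.763645))
```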

References

1. Bäck, T.: Evolutionary Algorithms in Theory and Practice. Oxford University Press, New York (1996)
2. Michalewicz, Z., Schoenauer, M.: Evolutionary Algorithms for Constrained Parameter Optimization Problems. Evolutionary Computation 4 (1996) 1–32
3. Coello Coello, C.A.: Theoretical and Numerical Constraint Handling Techniques used with Evolutionary Algorithms: A Survey of the State of the Art. Computer Methods in Applied Mechanics and Engineering 191 (2002) 1245–1287
4. Smith, A.E., Coit, D.W.: Constraint Handling Techniques—Penalty Functions. In Bäck, T., Fogel, D.B., Michalewicz, Z., eds.: Handbook of Evolutionary Computation. Oxford University Press and Institute of Physics Publishing (1997)
5. Mezura-Montes, E., Coello Coello, C.A.: A Simple Evolution Strategy to Solve Constrained Optimization Problems. In Cantú-Paz, E., Foster, J.A., Deb, K., Davis, L.D., Roy, R., Reilly, U.M.O., Beyer, H.G., Standish, R., Kendall, G., Wilson, S., Harman, M., Wegener, J., Dasgupta, D., Potter, M.A., Schultz, A.C., Dowsland, K.A., Jonoska, N., Miller, J., eds.: Proceedings of the Genetic and Evolutionary Computation Conference (GECCO'2003), Chicago, Illinois. Springer Verlag, Lecture Notes in Computer Science Vol. 2723 (2003) 640–641
6. Mezura-Montes, E., Coello Coello, C.A.: Adding a Diversity Mechanism to a Simple Evolution Strategy to Solve Constrained Optimization Problems. In: Proceedings of the Congress on Evolutionary Computation 2003 (CEC'2003). Volume 1, Canberra, Australia. IEEE Service Center, Piscataway, New Jersey (2003) 6–13
7. Jiménez, F., Verdegay, J.L.: Evolutionary Techniques for Constrained Optimization Problems. In Zimmermann, H.J., ed.: 7th European Congress on Intelligent Techniques and Soft Computing (EUFIT'99), Aachen, Germany, Verlag Mainz (1999). ISBN 3-89653-808-X
8. Deb, K.: An Efficient Constraint Handling Method for Genetic Algorithms. Computer Methods in Applied Mechanics and Engineering 186 (2000) 311–338
9. Runarsson, T.P., Yao, X.: Stochastic Ranking for Constrained Evolutionary Optimization. IEEE Transactions on Evolutionary Computation 4 (2000) 284–294
10. Hamida, S.B., Schoenauer, M.: ASCHEA: New Results Using Adaptive Segregational Constraint Handling. In: Proceedings of the Congress on Evolutionary Computation 2002 (CEC'2002). Volume 1. IEEE Service Center, Piscataway, New Jersey (2002) 884–889
11. Koziel, S., Michalewicz, Z.: Evolutionary Algorithms, Homomorphous Mappings, and Constrained Parameter Optimization. Evolutionary Computation 7 (1999) 19–44
12. Schwefel, H.P.: Evolution and Optimal Seeking. John Wiley & Sons Inc., New York (1995)