Evolutionary Algorithms
An Introduction
"[G]enetic algorithms are based on a biological metaphor: They view learning as a competition among a population of evolving candidate
problem solutions. A 'fitness' function evaluates each solution to decide whether it will contribute to the next generation of solutions. Then,
through operations analogous to gene transfer in sexual reproduction, the algorithm creates a new population of candidate solutions."
Matthias Trapp
Theoretical Ecology Group, University of Potsdam
Stanislaw Lem Workshop on Evolution, 10–14 October 2005, Lviv
Agenda
Introduction
Structure of an EA
Genetic Operators
Classification
Implementation
Discussion
Introduction
Evolutionary Algorithms - Introduction
Motivation - The Problem(s)
•Global optimization problem:
–Function has many local optima
–Function changes over time
–Function has many parameters
→ very large search space
•Combinatorial problems / Data Mining
•Classical NP-hard problems:
–TSP
–SAT
• …
min{ f(x) : x ∈ M }
Overview Application Domains
EA/EC application domains: Optimization, Automatic Programming, Machine Learning, Economics, Operations Research, Ecology, Population Genetics, Social Systems
Evolution and Problem Solving
•Algorithm = Automated Problem Solver
•Broad Scope: Natural Computing
•Family of algorithms that mimic
natural processes:
–Neural Networks
–Simulated Annealing
–DNA Computing
–Evolutionary Algorithms
Evolution vs. Problem Solving
Environment → Problem
Individual → Candidate Solution
Fitness → Quality
Approximation → Optimization
Evolutionary Algorithms
•EAs are adaptive heuristic search algorithms
•Metaphor: trial and error (a.k.a. generate and test)
•EAs are inspired by Darwin's theory of evolution:
problems are solved by an evolutionary process
resulting in a best (fittest) solution (survivor) from
a population of solution candidates
•EAs have been successfully applied to a wide range of
problems:
Aircraft Design, Routing in Communications Networks,
Tracking Windshear, Game Playing, Robotics, Air Traffic Control,
Design, Scheduling, Machine Learning, Pattern Recognition, Job Shop
Scheduling, VLSI Circuit Layout, Strike Force Allocation, Market Forecasting,
Egg Price Forecasting, Design of Filters and Barriers, Data-Mining, User-Mining, Resource
Allocation, Path Planning, Theme Park Tours …
Characteristics
Differences from classical algorithms/optimization methods:
•EAs search a set of possible solutions in parallel
•EAs do not require derivative information
•EAs use probabilistic transition rules
•EAs are generally straightforward to apply
•EAs provide a number of potential solutions
•EAs are able to apply self-adaptation
Structure of an EA
EA Components
•Representation mechanism (definition of individuals)
•Evaluation function (or fitness function)
•Population as container data structure
•Parent/Survivor selection mechanism
•Variation operators (Recombination, Mutation)
•Initialization procedure / Termination condition
[Diagram: Problem → (Encoding of Problem) → Evolutionary Search (Implementation) → Solution(s); inputs: coding of solutions, objective function, genetic operators, problem-specific knowledge]
General Schema EA
Evolutionary Search (Flow Chart Model)
[Flow chart: Initialization → Population → Parent Selection → Parents → Recombination → Mutation → Offspring → Survivor Selection → back to Population; Termination exits the loop. Legend: data vs. activity/control flow]
General Schema EA
Evolutionary Search (Pseudo Code)
procedure EA {
  t = 0;
  Initialize(Pop(t));
  Evaluate(Pop(t));
  while (!TerminationCondition(Pop(t)))
  {
    Parents(t) = ParentSelection(Pop(t));
    Offspring(t) = Recombination(Parents(t));
    Offspring(t) = Mutation(Offspring(t));
    Evaluate(Offspring(t));
    Pop(t+1) = Replace(Pop(t), Offspring(t));
    t = t + 1;
  }
}
Representation x = E(D(x))
•Mapping: Problem context → problem solving space:
–Phenotype space P (candidate solutions, individuals)
–Genotype space G (chromosomes, individuals)
–Encoding E : P → G
–Decoding D : G → P
•Encoding: Technical representation of individuals
–GA: Binary encoding: (1110110000100) = 7556
–ES: Valued vectors: (ABDJEIFJDHDIE) || (136578924)
–EP: Finite state machines
–GP: Tree of objects (LISP):
(IF_THEN_ELSE (> x 0) (SetX (* (* x 3) (- 4 y))) (SetY (+ x (- y 1))))
[Figure: finite-state machine acceptor for {0,1}*1, states s0–s3]
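The GA binary encoding above maps a bit string to an integer phenotype. A minimal C++ sketch of such a decoder (the function name is illustrative, not from the slides):

```cpp
#include <cstdint>
#include <string>

// Decode a GA bit-string genotype (most significant bit first) into an
// unsigned integer phenotype, e.g. "1110110000100" -> 7556.
uint64_t decode_binary(const std::string& bits) {
    uint64_t value = 0;
    for (char b : bits)
        value = (value << 1) | (b == '1' ? 1u : 0u);  // shift in one bit at a time
    return value;
}
```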
Population P(t) = {x_1^t, ..., x_n^t}
•Multi-set of genotypes = unit of evolution
•Invariants:
–Population Size n:
•static (common)
•dynamic (rare)
–Non-overlapping (Simple GA):
•entire population is replaced each generation
–Overlapping (Steady-State GA):
•few individuals are replaced each generation
•Sometimes associated with spatial structure
•Diversity: number of different solutions in P(t)
•Multi-Population approaches (Pohlheim, 1995)
Genetic Operators
Selection/Sampling Operators
•Distinguish between parent and survivor selection
•Typically probabilistic; work on population level
•Use fitness assignment of solution candidates
•Role: pushing quality improvement
•Generational selection vs. steady-state selection
•Common selection methods include:
Elitist Selection
Roulette Wheel Selection
Tournament Selection
Scaling Selection
Rank Selection
Fitness-proportionate Selection
Hierarchical Selection
Boltzmann Selection
Remainder stochastic sampling
Stochastic uniform sampling
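Of the methods listed, roulette-wheel (fitness-proportionate) selection is the classic one: each individual is drawn with probability proportional to its fitness. A C++ sketch, assuming non-negative fitness values with a positive total (names are illustrative):

```cpp
#include <cstddef>
#include <random>
#include <vector>

// Roulette-wheel selection: spin a "wheel" whose slot sizes are the
// fitness values, and return the index the spin lands on.
std::size_t roulette_select(const std::vector<double>& fitness, std::mt19937& rng) {
    double total = 0.0;
    for (double f : fitness) total += f;
    std::uniform_real_distribution<double> dist(0.0, total);
    double spin = dist(rng);
    double acc = 0.0;
    for (std::size_t i = 0; i < fitness.size(); ++i) {
        acc += fitness[i];
        if (spin <= acc) return i;     // spin fell into slot i
    }
    return fitness.size() - 1;         // numerical safety fallback
}
```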
Mutation Operator m : G → G
•Unary operator, always stochastic
•Bit-strings: bit-flips, e.g. (00101) → (10101)
•Tree:
–Sub tree destructive
–Sub tree/Node swap
•List:
–Generative/Destructive
–Node/Sequence Swap
•Array:
–Destructive
–Element Flip/Swap
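The bit-flip mutation for bit strings can be sketched as follows: each bit is flipped independently with probability pm (a hypothetical helper, not from the slides; the earlier example uses pm = 1/n):

```cpp
#include <random>
#include <string>

// Uniform bit-flip mutation on a bit-string genotype: flip each bit
// independently with probability pm.
std::string mutate_bitflip(std::string bits, double pm, std::mt19937& rng) {
    std::uniform_real_distribution<double> coin(0.0, 1.0);
    for (char& b : bits)
        if (coin(rng) < pm)                // flip this bit with probability pm
            b = (b == '0') ? '1' : '0';
    return bits;
}
```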
Recombination c : G × … × G → G
•Inherit genotype traits, typically stochastic
•Often binary operator: Offspring = Sex(Mum, Dad)
•Bit-Strings:
–k-Point Recombination
–Uniform Recombination
•Genetic Programming:
(seldom used)
–1-Point example: A = 01101, B = 10110, cut after position 3 → A' = 01110
–Uniform: A'[i] = A[i] if M[i] = 1, B[i] otherwise (M a random bit mask)
[Figure: GP subtree crossover, e.g. parents (X+1) and (X·2) yield offspring X+(X·2)]
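The two bit-string recombination operators named above can be sketched in C++; the mask-based uniform rule follows the A'[i] formula, with the cut point and mask passed in explicitly for clarity (in practice both are drawn at random):

```cpp
#include <cassert>
#include <cstddef>
#include <string>

// One-point crossover: child takes A's bits before the cut point and
// B's bits from the cut point on, e.g. A=01101, B=10110, cut=3 -> 01110.
std::string one_point(const std::string& a, const std::string& b, std::size_t cut) {
    assert(a.size() == b.size() && cut <= a.size());
    return a.substr(0, cut) + b.substr(cut);
}

// Uniform crossover with an explicit mask M:
// child[i] = A[i] if M[i] == '1', else B[i].
std::string uniform_cx(const std::string& a, const std::string& b,
                       const std::string& mask) {
    std::string child(a);
    for (std::size_t i = 0; i < a.size(); ++i)
        child[i] = (mask[i] == '1') ? a[i] : b[i];
    return child;
}
```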
A Simple Example: EA for the Knapsack Problem
Representation: {0,1}^n
Recombination: 1-point crossover
Recombination probability: 70%
Mutation: uniform bit-flip
Mutation probability: p_m = 1/n
Parent selection: best out of two random
Survivor selection: generational
Population size: 500
Number of offspring: 500
Initialization: random
Termination condition: no improvement in last 25 generations
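The "best out of two random" parent selection in the table is a binary tournament: draw two individuals uniformly at random and keep the fitter one. A C++ sketch, assuming higher fitness is better (names are illustrative):

```cpp
#include <cstddef>
#include <random>
#include <vector>

// Binary tournament selection: pick two individuals at random and
// return the index of the fitter one.
std::size_t tournament2(const std::vector<double>& fitness, std::mt19937& rng) {
    std::uniform_int_distribution<std::size_t> pick(0, fitness.size() - 1);
    std::size_t i = pick(rng);
    std::size_t j = pick(rng);
    return fitness[i] >= fitness[j] ? i : j;   // keep the winner
}
```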
Effects of Genetic Operators
•Selection alone will tend to fill the population with
copies of the best individual
•Selection and crossover operators will tend to cause
the algorithm to converge on a good but sub-optimal
solution
•Mutation alone induces a random walk through the
search space.
•Selection and mutation create a parallel, noise-
tolerant, hill-climbing algorithm
Termination Conditions
•Discovery of an optimal solution (to precision ε > 0),
•Discovery of an optimal or near-optimal solution,
•Convergence on a single solution or a set of similar solutions,
•A user-specified threshold has been reached,
•A maximum number of generations has been evaluated,
•EA detects that the problem has no feasible solution
→ often a disjunction of several conditions
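Such a disjunction of termination conditions might be combined as follows; this is a hypothetical sketch (state fields, parameter names, and the maximizing-fitness assumption are not from the slides):

```cpp
// Bookkeeping the EA loop would update each generation.
struct TerminationState {
    int generation = 0;
    int generations_since_improvement = 0;
    double best_fitness = 0.0;
};

// Stop when ANY condition holds: fitness threshold reached,
// generation budget spent, or search stagnated too long.
bool should_terminate(const TerminationState& s,
                      double fitness_threshold,
                      int max_generations,
                      int stagnation_limit) {
    return s.best_fitness >= fitness_threshold
        || s.generation >= max_generations
        || s.generations_since_improvement >= stagnation_limit;
}
```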
Classification
Classification - Overview
Genetic Algorithms (GA)
Evolutionary Strategies (ES)
Evolutionary Programming (EP)
Genetic Programming (GP)
Evolutionary Computation
•1948 Alan Turing: "genetical or evolutionary search"
•After 1950: the idea to simulate evolution
in order to solve engineering and design problems
Box, 1957
Friedberg, 1958
Bremermann, 1962
Genetic Algorithms (GA)
•By Holland (1975), USA
•Concerned with developing robust adaptive systems
•Initially as abstraction of biological evolution
•Use of bit-strings for solution representation
•First EA which uses recombination
•Recombination seen as main operator
•Very successful for combinatorial optimization problems
Evolutionary Strategies (ES)
•By Rechenberg (1973) and Schwefel (1981), Germany
•Parameter optimization of real-valued functions
•Accentuation on mutation
•Selection (μ parents, λ offspring):
–(μ, λ): choose the μ fittest of λ > μ offspring
–(μ + λ): choose the μ fittest of all μ + λ solutions
•Recombination (u, v parent vectors, w child vector):
–w_i = (u_i + v_i) / 2 (intermediary recombination)
–w_i = u_i or v_i (discrete recombination)
•More soon… (Implementation Example)
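The two ES recombination variants, intermediary (component-wise mean of the parents) and discrete (random per-component choice between the parents), can be sketched as follows (function names are illustrative):

```cpp
#include <cstddef>
#include <random>
#include <vector>

// Intermediary recombination: w_i = (u_i + v_i) / 2.
std::vector<double> intermediary(const std::vector<double>& u,
                                 const std::vector<double>& v) {
    std::vector<double> w(u.size());
    for (std::size_t i = 0; i < u.size(); ++i)
        w[i] = (u[i] + v[i]) / 2.0;
    return w;
}

// Discrete recombination: w_i = u_i or v_i, chosen by a fair coin.
std::vector<double> discrete(const std::vector<double>& u,
                             const std::vector<double>& v,
                             std::mt19937& rng) {
    std::bernoulli_distribution coin(0.5);
    std::vector<double> w(u.size());
    for (std::size_t i = 0; i < u.size(); ++i)
        w[i] = coin(rng) ? u[i] : v[i];
    return w;
}
```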
Evolutionary Programming (EP)
•By Fogel, Owens, and Walsh (1966), USA
•Application: artificial intelligence
•Initially for the evolution of finite-state machines
•Using mutation and selection
•Later applied mainly to real-valued functions
•Strong similarity to evolutionary strategies
Example: Prediction of binary cycles
Genetic Programming (GP)
•Koza (1992), developed to simulate special functions
•Application: Function fitting f(x)
•Using parse trees of terminals and non-terminals:
•Assumptions:
–Completeness
–Closure
•Problems:
–Variable count
–Variable types
fitness = Σ_i | targetfunction(x_i) − program(x_i) |
[Figure: parse tree for max(x², x + 3·y)]
T = {x, y, 3},
N = {max,+,·}
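The GP fitness above accumulates the error between the target function and the evolved program over a set of sample points; a C++ sketch assuming the sum-of-absolute-differences reading of the formula (names are illustrative):

```cpp
#include <cmath>
#include <functional>
#include <vector>

// GP fitness as accumulated error over sample points:
// sum of |target(x_i) - program(x_i)|; 0 means a perfect fit
// on the samples.
double gp_fitness(const std::function<double(double)>& target,
                  const std::function<double(double)>& program,
                  const std::vector<double>& samples) {
    double error = 0.0;
    for (double x : samples)
        error += std::abs(target(x) - program(x));
    return error;
}
```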
Implementation and Software

// Rank the population by fitness using insertion sort (ascending:
// lower fitness is better here), then record best/worst statistics.
void ga::rank(void) {
    fitness_struct temp;
    int pos;
    calc_fitness();                        // evaluate all individuals
    for (int pass = 1; pass < POP_SIZE; ++pass) {
        temp = rankings[pass];
        pos = pass;
        while ((pos > 0) && (temp.fitness < rankings[pos-1].fitness)) {
            rankings[pos] = rankings[pos-1];
            --pos;
        }
        rankings[pos] = temp;
    }
    best_sol  = rankings[0].fitness;
    worst_sol = rankings[POP_SIZE-1].fitness;
    if (best_sol < best_overall)           // track best across all generations
        best_overall = best_sol;
    if (worst_sol > worst_overall)         // and worst across all generations
        worst_overall = worst_sol;
}
Another Simple Example
•Search space: M = [−50, 50] ⊂ IR, G = P = M^5
•Evolutionary strategy
•Solution candidate (no encoding necessary): x = ⟨x_0, x_1, x_2, x_3, x_4⟩ ∈ G
•Fitness function f : P → M:
f(x) = |x_0 − x_1| + |x_1 − x_2| + |x_2 − x_3| + |x_3 − x_4| + |x_4 − x_0|
•Parent selection: elitist
•Recombination: 1-point, fixed
•Non-overlapping population
→ Hybrid approach: evolutionary strategy and genetic program
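The example fitness can be read as the sum of absolute differences between cyclically neighbouring parameters x_0, ..., x_4; this cyclic reading is an assumption, since the slide's formula is garbled in the source:

```cpp
#include <array>
#include <cmath>
#include <cstddef>

// Example fitness: sum of |x_i - x_{i+1}| over cyclically
// neighbouring parameters (wrapping from x_4 back to x_0).
double example_fitness(const std::array<double, 5>& x) {
    double f = 0.0;
    for (std::size_t i = 0; i < 5; ++i)
        f += std::abs(x[i] - x[(i + 1) % 5]);
    return f;
}
```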
Applying Self-Adaptation
•Evolution of the Evolution:
Self-adaptation = specific on-line parameter calibration technique
•Random number from Gaussian distribution
with zero mean and standard deviation σ
•Mutation operator:
x_i' = x_i + N(0, σ), i ∈ {0, ..., 4}
•Extending the candidate representation:
⟨x_0, ..., x_4, σ⟩ → ⟨x_0', ..., x_4', σ'⟩, where
σ' = σ · e^(N(0, τ)) and x_i' = x_i + N(0, σ')
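The self-adaptive mutation can be sketched as follows: the strategy parameter σ is perturbed log-normally first (so it stays positive), then the object variables are perturbed using the new σ. The exact update form and the learning-rate constant τ are assumptions based on standard evolution strategies:

```cpp
#include <array>
#include <cmath>
#include <random>

// Candidate with object variables x_0..x_4 plus its own
// mutation step size sigma (the "evolution of the evolution").
struct Individual {
    std::array<double, 5> x;
    double sigma;
};

Individual self_adaptive_mutate(Individual ind, double tau, std::mt19937& rng) {
    std::normal_distribution<double> gauss(0.0, 1.0);
    ind.sigma *= std::exp(tau * gauss(rng));   // sigma' = sigma * e^{N(0, tau)}
    std::normal_distribution<double> step(0.0, ind.sigma);
    for (double& xi : ind.x)
        xi += step(rng);                        // x_i' = x_i + N(0, sigma')
    return ind;
}
```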
Working of an EA
•Distinct search phases:
–Exploration
–Exploitation
•Trade-off between exploration and exploitation:
–inefficient broad search vs. premature narrowing
of the search focus
•Premature Convergence: “climbing the wrong hill”
–Losing diversity → convergence to a local optimum
–Techniques to prevent this well-known effect
•"Any-time" behaviour
[Plot: best value in population over time t]
Multiobjective EA
•Multiobjective GA (MOGA)
(Fonseca and Fleming, 1993),
•Niched Pareto GA (NPGA)
(Horn and Nafpliotis, 1993),
•Nondominated Sorting GA (NSGA-II)
(Deb et al., 2000),
•Strength Pareto EA (SPEA)
(Zitzler and Thiele, 1998),
•Strength Pareto EA (SPEA2)
(Zitzler et al., 2001)
Parallel Implementation of EA
•Subpopulations on MIMD machines (Belew and Booker, 1991)
•Decreases execution time
•Migration Model
–Unrestricted migration
–Ring migration
–Neighbourhood migration
•Global Model (Worker/Farmer)
•Diffusion Model:
–handles every individual separately
–selects the mating partner in a local neighbourhood
–diffusion of information takes place
API Comparison
Name      Language    Licence    Target
PGAPack   Fortran/C   Freeware   All
EO        C++         GNU LGPL   All
GALib     C++         BSD        Lnx, Win
GAGS      C++         Freeware   Lnx, Win
JAGA      Java        Freeware   All
JGAP      Java        Freeware   All
Discussion
EA Advantages
+ Applicable to a wide range of problems
+ Useful in areas without good problem specific
techniques
+ No explicit assumptions about the search space
necessary
+ Easy to implement
+ Any-time behaviour
"[Evolution] is a good designer of complex structures that are well
adapted to a given environment or task."
EA Disadvantages
–Problem representation must be robust
–No general guarantee of finding an optimum
–No solid theoretical foundations (yet)
–Parameter tuning: a trial-and-error process
(but self-adaptive variants exist in evolution strategies)
–Sometimes high memory requirements
–Implementation: high degree of freedom
Summary
•EAs are different from classical algorithms
•Less effort to develop an EA that:
–Delivers acceptable solutions,
–In acceptable running time,
–At low cost in manpower and time
•EAs are distributable (Belew and Booker, 1991):
–Subpopulations on MIMD,
–Via network
•EAs are easy to implement
"In order to make evolutionary computing work well, there must
be a programmer who sets the parameters right."
"An EA is the second-best algorithm for any problem."
Sources
•Spears, W. M., De Jong, K. A., Bäck, T., Fogel, D. B., and de
Garis, H. (1993). "An Overview of Evolutionary Computation."
Proceedings of the European Conference on Machine Learning,
vol. 667, pp. 442–459.
•Eiben, A. E. "Evolutionary Computing: The Most Powerful
Problem Solver in the Universe?"
•Michalewicz, Z. (1999). Genetic Algorithms + Data Structures
= Evolution Programs. Springer. ISBN 3-540-60676-9.
•Fogel, L. J., Owens, A. J., and Walsh, M. J. (1966). Artificial
Intelligence Through Simulated Evolution. Wiley.
•Koza, J. R. (1992). Genetic Programming: On the Programming of
Computers by Means of Natural Selection. MIT Press.