Design Tool Development - eCAADe 29 307
Evolutionary Developmental Design for Non-Programmers
Patrick Janssen1, Cihat Basol2, Kian Wee Chen3
1,2,3National University of Singapore
1patrick@janssen.name, 2cihatbasol@gmail.com, 3chenkianwee@gmail.com
Abstract. Evolutionary developmental design (Evo-Devo-Design) is a design method that
combines complex developmental techniques with evolutionary optimisation techniques.
In order to use such methods, the problem-specific developmental and evaluation procedures
typically need to be defined using some kind of textual programming language. This paper
reports on an alternative approach, in which designers can use Visual Dataflow Modelling
(VDM) instead of textual programming. This research describes how Evo-Devo-Design
problems can be defined using the VDM approach, and how they can subsequently be run using
a Distributed Execution Environment (called Dexen) on multiple computers in parallel. A
case study is presented, where the Evo-Devo-Design method is used to evolve designs for a
house, optimised for daylight, energy consumption, and privacy.
Keywords. Evolutionary; developmental; design; performance; optimisation.
INTRODUCTION
Evolutionary design is loosely based on the
neo-Darwinian model of evolution through natural
selection. A population of individuals is maintained
and an iterative process applies a number of evo-
lutionary steps that create, transform, and delete
individuals in the population. Each individual rep-
resents a design variant, and has a genotype rep-
resentation and a phenotype expression: the geno-
type representation encodes information that can
be used to create a model of the design, while the
phenotype expression is the actual design model.
The individuals in the population are evaluated
relative to one another, and on the basis of these
evaluations, new individuals are created using ‘ge-
netic operators’ such as crossover and mutation.
The process is continued through numerous gen-
erations so as to ensure that the population as a
whole evolves and adapts.
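The generational loop described above can be sketched in a few lines of Python. This is a generic illustration, not Dexen's code: the `develop` and `evaluate` callables stand in for the problem-specific procedures discussed later, and truncation selection is used purely for brevity.

```python
import random

def evolve(pop_size, generations, genome_len, develop, evaluate):
    # Random initial population of normalised genotypes.
    population = [[random.random() for _ in range(genome_len)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        # Rank by fitness (lower is better in this toy example).
        ranked = sorted(population, key=lambda g: evaluate(develop(g)))
        parents = ranked[:pop_size // 2]           # truncation selection
        children = []
        while len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, genome_len)  # one-point crossover
            child = a[:cut] + b[cut:]
            child[random.randrange(genome_len)] = random.random()  # mutation
            children.append(child)
        population = children
    return min(population, key=lambda g: evaluate(develop(g)))
```

Run, for example, with an identity `develop` and the sum of the genes as the score to minimise; over successive generations the best genotype's score falls well below that of a random one.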
Evolutionary design differs from other types
of evolutionary approaches (such as genetic
algorithms) in that it includes a complex
developmental step that generates a phenotype by applying
the genes in the genotype (Frazer 1995, Bentley
and Kumar 1999, Stanley and Miikkulainen 2003,
Janssen 2004, Hornby 2005, Kowaliw and Banzhaf
2011). We therefore refer to this as evolutionary
developmental design, or Evo-Devo-Design. For
designers, the developmental step is crucially im-
portant, since it delineates the search space of
possible designs. The Evo-Devo-Design method is
able to augment the traditional process of design
exploration, in which typically only a small number
of options will be considered. The advantage of
Evo-Devo-Design is that it is able to automatically
develop and evaluate large populations of design
variants. This method has proved to be well suited
to design processes that are typically divergent and
exploratory (Janssen 2004).
One of the key drawbacks of such advanced dig-
ital design methods has been the need for designers
to write and develop their own customised software
tools. This has severely limited the general appli-
cability of such methods. This paper describes an
alternative approach, whereby designers can apply
Evo-Devo-Design methods without having to write
any code. The authors have developed a Distribut-
ed EXecution ENvironment (Dexen) for population
based multi-objective optimisation algorithms. Such
algorithms include hill climbing, simulated anneal-
ing and evolutionary algorithms [1,7,8]. In this paper,
we will focus on using Dexen for Evo-Devo-Design.
Following this introduction, section two gives
an overview of the Dexen architecture. Section three
focuses on how non-programmers can use Dexen
for Evo-Devo-Design. Section four reports on a case-
study experiment using Dexen to evolve a house
design.
DEXEN SYSTEM ARCHITECTURE
The two main goals of Dexen are speed and
flexibility. Speed is an issue since design optimisa-
tion problems typically require complex simulations
that can be prohibitively slow. Flexibility is an issue
since design optimisation problems typically require
highly customised evolutionary steps, often requir-
ing the integration of existing simulation programs.
In order to achieve these goals, Dexen has been de-
signed with two key features. First, for speed, Dexen
is designed to run on multiple computers in parallel.
Second, for flexibility, Dexen provides an end-user
programming model that allows users to encapsulate
the problem-specific aspects within a few key
scripts.
Dexen is based on a previous multi-objective
evolutionary developmental design environment
called EDDE (Janssen 2004, Janssen et al 2005, Jans-
sen 2009). Dexen has been developed with a fun-
damentally different type of architecture to achieve
improvements in both speed and flexibility.
The process of running a population based
optimisation problem within Dexen is described
as a job. The blueprint for the job is referred to as
a job definition or (in the case of design jobs) the
design schema (Janssen 2004). The schema de-
fines a set of computational procedures, which are
referred to as tasks. When a job is run, the tasks
will be executed by Dexen. Each task will act on
entities in the population called individuals. An
individual represents a complete solution to the
problem being optimised.
For design optimisation jobs, the schema will
typically include three tasks: development, evalu-
ation, and feedback. Development will generate a
model of the design. Evaluation will evaluate some
performance criteria of the model. Finally, feedback
will use the results from evaluation to generate or
modify individuals. If the algorithm being used is an
evolutionary algorithm, then feedback will kill some
low performance individuals, and generate some
new individuals using crossover and mutation.
Dexen has been designed for two levels of user,
which we refer to as general users and specialist us-
ers. General users are assumed to have the required
programming skills to develop their own schemas
from scratch. Specialist users may not have the re-
quired programming skills, but will instead be able
to create schemas by using automated schema gen-
erators. Specialist users may typically have advanced
knowledge and skills in their domain of interest.
Different schema generators can be created for
various areas of specialisation. Each schema gen-
erator will target specific software tools. Currently, a
schema generator has been developed focusing on
architectural design using the Sidefx Houdini soft-
ware, to be discussed in more detail in section 4.
Dexen components
Dexen consists of four main types of compo-
nents: one server, and multiple clients, masters and
slaves. Each of these components can run on sepa-
rate machines, thereby allowing the computation to
be distributed between multiple machines.
• The server is the core of the system, and all other components connect to the server.
• Each client provides a user interface for an end user to start, stop, and monitor the progress of jobs. When a user starts a job, they need to use the client to upload the schema for that job. This schema will include a set of tasks that need to be executed.
• Each master manages one job, including the population of individuals associated with that job. A user may start multiple jobs, in which case Dexen will create multiple masters.
• Slaves execute the user-defined tasks associated with a particular job. Typically, many slaves will be running in parallel. Dexen will automatically assign slaves to masters to execute tasks without requiring any action from the user.
A Dexen population consists of a set of individ-
uals, each of which can become a complete solution
to the problem being optimised. Initially, when indi-
viduals are first created, they contain only the basic
parameters (or genes) for a particular solution. As
individuals are processed, they may accumulate ad-
ditional information, and they thereby change their
state.
For example, for an evolutionary schema, an
individual’s state includes its genotype, phenotype,
and performance scores. The individual starts life
with only a time of birth and a genotype. The de-
velopment task creates a phenotype. One or more
evaluation tasks calculate the performance scores.
Finally, the feedback task kills some existing individuals
and generates some new individuals (which will
only have a genotype).
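This accumulation of state can be pictured as a simple class; the attribute names here are illustrative only, not Dexen's actual data model.

```python
import time

class Individual:
    """An individual that accumulates state as tasks process it:
    it starts with only a birth time and a genotype, then gains a
    phenotype and performance scores."""
    def __init__(self, genotype):
        self.born = time.time()
        self.genotype = genotype
        self.phenotype = None   # set later by the development task
        self.scores = {}        # filled in by the evaluation tasks

ind = Individual([0.2, 0.7, 0.5])
ind.phenotype = "model-of-design"   # output of the development task
ind.scores["daylight"] = 78.0       # one evaluation task's result
```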
The Dexen population is therefore a heteroge-
neous population that contains individuals in differ-
ent states. For example, some may only have geno-
types, some may also have phenotypes, and some
may also have performance scores.
A schema must define two types of tasks: one
master task and one or more slave tasks. The mas-
ter task will usually be used to configure various
settings and to initialize the population. Initialization
typically consists of creating a set of new individuals
to start the optimization process. Each slave task will
then process individuals from the population.
Each slave task performs a specific user-defined
procedure, and as a result it requires individuals in
a particular state. For example, an evaluation task
may need an individual that already has a pheno-
type, but that does not yet have an evaluation score.
Individuals that do not meet these criteria need to
be rejected. A filtering process therefore has to take
place in order to discover which individuals in the
population match which slave task. In order to do
this, each slave task is assigned a user-defined
boolean function, referred to as the filter function. This
is used to decide if a particular individual is valid for
processing by that task.
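A filter function of this kind is just a boolean predicate over an individual's state. The following sketch, using hypothetical attribute names rather than Dexen's API, shows a filter for a daylight evaluation task: it accepts only individuals that already have a phenotype but no daylight score yet.

```python
from types import SimpleNamespace

def evaluation_filter(ind):
    """Accept individuals with a phenotype but no daylight score."""
    return ind.phenotype is not None and "daylight" not in ind.scores

# Three individuals in different states: only the middle one matches.
fresh     = SimpleNamespace(phenotype=None,    scores={})
developed = SimpleNamespace(phenotype="model", scores={})
evaluated = SimpleNamespace(phenotype="model", scores={"daylight": 80.0})
ready = [i for i in (fresh, developed, evaluated) if evaluation_filter(i)]
```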
EVO-DEVO-DESIGN FOR NON-PROGRAMMERS
For the general user, writing a schema involves
defining the tasks that will be executed by Dexen.
The user needs to define one master task, and one or
more slave tasks. The programming model that has
been defined for these tasks is both simple and pow-
erful. The schema has to be written in Python, and a
basic understanding of object-oriented programming is required.
However, for users that are non-programmers,
writing a schema may be difficult. Such users may
be architects and engineers who are experts in their
own field, but who may not have the required pro-
gramming skills needed to write the schema code.
For such users, schema generators can be used in
order to automate the process of creating schemas.
Schema generators are implemented as part of the
client and run on the user’s local computer.
Schema generators target specific software ap-
plications. The user will be required to define the
problem specific aspects of their schemas in some
format that will not require them to write code. For
example, in a design scenario, the user would be re-
quired to define the design development and one
or more design evaluation procedures. The schema
generator can then be used to generate all the nec-
essary Python code to wrap these core procedures.
In order to define the core procedures, a design-
er could use the Visual Dataflow Modelling approach
(VDM). VDM allows users to program by visually link-
ing together graphical nodes with wires. The nodes
and wires are arranged by users to create complex
networks through which data can flow. Each node
represents a function, and the wires represent the
data inputs and outputs for the function (Woodbury
2010, Janssen and Chen 2011).
The Houdini schema generator
In order to demonstrate this approach, a sche-
ma generator has been developed for a 3D CAD and
animation software, called SideFX Houdini.
For development, a Houdini network that gen-
erates a phenotype from a set of genes is required.
The phenotype will be some kind of model of the
design variant. For evaluation, a Houdini network
that generates an evaluation score from a phenotype
is required. A simulation program may be used
in order to perform the evaluation. If more than one
criterion needs to be evaluated, then multiple
networks can be created.
The Houdini schema generator provides a set of
Houdini nodes that the development and evaluation
network must use. These nodes are used to define
the start and end points of each network, and the
user can then create any type of network between
these two points. The Python code generated by
the schema generator will assume that these special
nodes are present and will read and write data from
these nodes. For development, a genotype node and a
phenotype node are provided. For evaluation, a phenotype
node and an evaluation score node are provided.
The user also needs to set some basic param-
eters in a settings file for the schema generator. The
parameters that can be set include the following:
• The optimisation algorithm to be used. Options include hill climbing, simulated annealing, or an evolutionary algorithm.
• The population size, the maximum number of births, and the input sizes for all tasks, including feedback.
• The settings for the feedback task, including the ranking and selection algorithms to use for the birth and death of individuals.
• The names of the Houdini files in which the development and evaluation networks are defined.
• The structure of the genotype, including the types of genes. (For example, genes can be integers, floats, or strings.) The length of the genotype is assumed to remain constant.
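The paper does not show the settings file itself, so as an illustration only, the parameters above could be captured in a structure like the following Python dictionary. All key names are hypothetical; the numeric values follow the case study reported later in the paper.

```python
# Hypothetical settings for the schema generator; key names are
# illustrative, not the actual format used by Dexen.
settings = {
    "algorithm": "evolutionary",   # or "hill_climbing", "simulated_annealing"
    "population_size": 100,
    "max_births": 10000,
    "input_sizes": {"development": 1, "evaluation": 1, "feedback": 20},
    "feedback": {"ranking": "pareto", "selection": "fitness_proportional"},
    "houdini_files": {
        "development": "develop.hip",
        "evaluation": ["energy.hip", "daylight.hip", "privacy.hip"],
    },
    "genotype": {"length": 55, "types": ["float"] * 55},  # fixed length
}
```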
In order to generate the schema, the user places
the settings file and the Houdini files in a single fold-
er, and then uses the client to execute the schema
generator script. This will result in the Python files
being automatically generated for the schema, and
being placed in the same folder.
The generator will create the Python code for
the master task, and each of the slave tasks. For the
development and evaluation tasks, Python wrap-
pers will be generated for the Houdini files. For the
feedback task, a simple feedback procedure will
be generated. In this procedure, the individuals re-
ceived by the feedback task will be ranked, the low
performance individuals will be killed, and the high
performance individuals will be used as parents for
breeding new individuals.
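The generated feedback procedure can be sketched as follows. This is a toy version, not the code the generator actually emits: it assumes individuals arrive as (genotype, score) pairs with lower scores being better, kills the worst, and breeds replacements from the top-ranked individuals via crossover and per-gene mutation.

```python
import random

def feedback(batch, breed_count=2, mutation_rate=0.1):
    """Rank a batch, kill the worst, breed replacements from the best."""
    ranked = sorted(batch, key=lambda ind: ind[1])
    survivors = ranked[:len(ranked) - breed_count]       # worst are killed
    children = []
    for _ in range(breed_count):
        (ga, _), (gb, _) = random.sample(ranked[:4], 2)  # fit parents
        cut = random.randrange(1, len(ga))               # one-point crossover
        child = ga[:cut] + gb[cut:]
        for i in range(len(child)):                      # per-gene mutation
            if random.random() < mutation_rate:
                child[i] = random.random()
        children.append((child, None))                   # not yet evaluated
    return survivors + children
```

The new individuals carry no score, so (as described earlier) they will next be picked up by the development and evaluation tasks.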
The user may then upload this schema to the
server to start running the job.
A CASE STUDY
In the case study, a Houdini schema was de-
veloped for a free-standing house in a residential
setting. Three performance criteria were defined:
minimization of energy consumption, maximiza-
tion of daylight, and maximization of privacy. A
number of Houdini files were created, and the
Houdini schema generator was then used to
automatically generate the Python code for the
schema.
Slave tasks
In total, four Houdini files were created, one for
each slave task: the development task, the energy
evaluation task, the daylight evaluation task, and the
privacy evaluation task. Each Houdini file contains a
network of nodes that define a problem specific pro-
cedure to be executed by Dexen.
The Houdini development network maps the
genotype to the phenotype. The network starts with
a Dexen genotype node and ends with a Dexen phe-
notype node.
The genotype in this case consists of 55 real
valued genes, each in the range 0.0 to 1.0. The phe-
notype is a three dimensional model of the house,
saved in the Houdini format. The model is shown in
Fig. 1.
The house is spread over three floors, and has
a living room, dining room, a kitchen, and four bed-
rooms. A stair-core gives access to all three floors.
The living room, dining room, and kitchen are always
located on the ground floor. In addition, one of these
spaces on the ground floor will be a double height
space. The bedrooms are all located on the upper
floors. Service spaces such as bathrooms and store
rooms are not included. A typical (randomly generated)
house is shown in Fig 2, and the genotype for
this example is shown in Fig 1.
Conceptually, the developmental process can
be thought of as a process that transforms an initial
simple model into a final complex model. The initial
model consists of 12 equal spaces. On each floor, four
spaces are packed together around a centre point
so that they meet in the middle. The model of the
house (i.e. the phenotype) is generated as follows:
• The programmes are assigned to the spaces using 9 genes. The programmes are subject to various constraints. For example, for the ground floor level, the stair-core and living room must be adjacent to one another, and the living room and dining room must be adjacent to one another. Each programme also has a required area.
• The size of each space is defined using 9 genes. Since the area is already known, the genes only need to assign the proportion of the spaces. The three stair-core spaces are also constrained to all have the same size, so that they stack on top of each other.
• The windows are inserted using 12 genes. Each space can have a window in either of its two outward-facing walls. The two possible window types are strip window or fully glazed. Each room must have at least one window, but cannot have two fully glazed windows.
• The sun shades are defined using 24 genes. Sun shades are only added to walls that have windows. The genes control the depth of each of the sun shades.
• The orientation of the building is defined using one gene. The building is first placed in the centre of the site, orientated so that the stair-core is facing the road. The gene is then used to rotate the building by a certain amount.
In many cases, the genes are mapped to some
other values. For example, the sun shade genes are
mapped to a dimension from 0 to 2 meters, and the
orientation gene is mapped to an angle from -45 to
45 degrees. In some cases, the genes can also be
mapped to a set of discrete variables. For example,
for the window genes, each gene is mapped to one
of seven possible window pair choices.
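The two kinds of gene mapping described above can be sketched as follows. The function names are illustrative, not from the paper's implementation, and the window pair labels are invented stand-ins for the seven valid combinations (each room has at least one window and never two fully glazed ones).

```python
def map_range(gene, lo, hi):
    """Map a normalised gene in [0.0, 1.0] onto a continuous range."""
    return lo + gene * (hi - lo)

def map_choice(gene, options):
    """Map a normalised gene onto one of a set of discrete options."""
    index = min(int(gene * len(options)), len(options) - 1)
    return options[index]

depth = map_range(0.5, 0.0, 2.0)      # sun shade depth in metres
angle = map_range(0.5, -45.0, 45.0)   # building orientation in degrees
window_pairs = ["strip/none", "strip/strip", "strip/glazed",
                "glazed/none", "glazed/strip", "none/strip",
                "none/glazed"]        # illustrative labels only
pair = map_choice(0.99, window_pairs)
```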
Fig 1. An example of a house generated using the Houdini developmental network.
On level 2, a situation can arise where one
of the bedrooms is diagonally opposite the stair-core.
In such a case, the bedroom would become
inaccessible. As a result, if this situation arises, then
the spaces are offset in order to create an adjacency
between the stair-core and the diagonal bedroom.
This situation can be seen in the example shown in
Fig. 1.
After the main geometry of the house has been
generated, all dimensions are then snapped to a
constructional grid. In this case, this grid was set to
0.3 meters. This final step ensures that there are no
awkward dimensions. This also means that the area
of the rooms will not exactly match the required ar-
eas for each programme. However, since the devia-
tion is small, this is seen as being acceptable.
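The final snapping step amounts to rounding every dimension to the nearest multiple of the 0.3 m grid; a minimal sketch (the function name is illustrative):

```python
def snap_to_grid(value, grid=0.3):
    """Snap a dimension (in metres) to the nearest multiple of the
    constructional grid, as in the final step of development."""
    return round(value / grid) * grid

snapped = snap_to_grid(4.07)   # snaps to the nearest 0.3 m multiple, 4.2
```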
Each Houdini evaluation network uses the
phenotype (generated by the development task) to
calculate an evaluation score. Each network starts
with a Dexen phenotype node, and ends with a
Dexen evaluation score node. In addition, custom
nodes have been developed to actually perform
each type of evaluation. These custom evalua-
tion nodes provide the user with a simple method
of running the required simulations. The custom
nodes can be inserted into the Houdini network to
perform the simulation. The input into the custom
node will be the Houdini geometry, and the output
will be the simulation results. The custom nodes
will also have a set of simulation parameters that
can be set by the user.
For energy and daylighting evaluation, the
EnergyPlus and Radiance simulation programs are
used respectively. The custom nodes will read the
Houdini geometry, generate the text-based input
file, execute the simulation program, read the text-
based results file, and finally import the results back
into Houdini. The EnergyPlus node calculates the
energy required to keep the house within a certain
temperature range using an ideal load air system.
The Radiance node calculates the percentage of
floor area that has a daylight level of higher than 300
Lux for a standard overcast sky condition. At an early
design stage, these are seen as good indicators of
the relative performance of the design with respect
to energy consumption and daylighting.
For privacy, a custom node is used that calcu-
lates the privacy level of each window based on the
relative position and orientation of other windows of
neighbouring houses. (See Fig. 2.) A value of 100%
indicates total privacy, while 0% indicates no privacy.
This calculation is performed inside Houdini, so in
this case, no external simulation program is required.
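The paper does not give the privacy formula, so the following is only a plausible toy formulation under stated assumptions: each neighbouring window within range and within this window's view cone adds exposure weighted by distance, and privacy is 100% minus the accumulated exposure.

```python
import math

def window_privacy(win, others, max_dist=20.0):
    """Toy privacy score for one window (0-100%). Windows are
    (x, y, facing_angle_deg) tuples; this is NOT the paper's formula."""
    exposure = 0.0
    x, y, facing = win
    for ox, oy, _ofacing in others:
        d = math.hypot(ox - x, oy - y)
        if d == 0 or d > max_dist:
            continue
        # Angle from this window to the other, relative to its facing.
        bearing = math.degrees(math.atan2(oy - y, ox - x))
        if abs((bearing - facing + 180) % 360 - 180) < 90:  # in view cone
            exposure += 1.0 - d / max_dist
    return max(0.0, 100.0 * (1.0 - exposure))

# A window facing a close neighbour scores lower than an isolated one.
exposed  = window_privacy((0, 0, 0), [(5, 0, 180)])
isolated = window_privacy((0, 0, 0), [(100, 0, 180)])
```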
RESULTS
The schema generator settings file was used to
set the key parameters for the job. The optimisation
algorithm was set to use an evolutionary algorithm,
and the ranking algorithm was set to use Pareto
ranking. The population size was set to 100, and the
maximum number of births was set to 10,000. The
input sizes for all tasks was set to 1, except for feed-
back, for which the input size was set to 20.
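Pareto ranking rests on a dominance test between score vectors. A minimal sketch, not Dexen's implementation: energy is sign-flipped so that all three objectives are maximised, and the front is the set of non-dominated individuals.

```python
def dominates(a, b):
    """True if score vector a Pareto-dominates b: no objective worse,
    at least one strictly better (all objectives maximised)."""
    return (all(x >= y for x, y in zip(a, b))
            and any(x > y for x, y in zip(a, b)))

def pareto_front(scores):
    """Return the indices of the non-dominated score vectors."""
    return [i for i, a in enumerate(scores)
            if not any(dominates(b, a)
                       for j, b in enumerate(scores) if j != i)]

# Objectives: (-energy, daylight, privacy), all to be maximised.
scores = [(-120, 70, 60), (-110, 80, 70), (-115, 78, 66), (-110, 80, 70)]
front = pareto_front(scores)
```

Note that two individuals with identical scores do not dominate each other, so both stay on the front.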
The job was executed on a cluster of 20 stan-
dard desktop PCs and was run overnight. The job
took approximately 7 hours to complete.
The Pareto graphs for the results are shown in
Fig. 3. Since there are three performance criteria, two
Pareto graphs are shown, one plotting energy against
privacy and another plotting energy against daylight.
The Pareto front is plotted on both of these two graphs.
The Pareto graphs show how the individuals generated
by the evolutionary process become denser closer to the Pareto front.
Fig 2. The house on the site, surrounded by five other houses.
The individuals in the initial population were mostly far away from
the Pareto front. Through inheritance of favourable
genes, the population as a whole gradually evolved,
with individuals in the population gradually becom-
ing optimised for the selected performance criteria.
In total, there are 52 individuals on the Pareto
front. Of these individuals, most were born at the end
of the evolutionary process. (Out of the 52 individu-
als on the Pareto front, 40 were born during the last
1000 births. However, individual 13 actually turned
out to be one of the best, and survived all the way
until the end.) These individuals represent different
trade-offs between energy, daylight, and privacy.
From this Pareto optimal set, individuals that had an
energy score of more than 115 kWh, or a daylighting
score of less than 75%, or a privacy score of less than
65% are eliminated. This then leaves 25 individuals,
from which the designer can select a preferred de-
sign. One of the best individuals is shown in Fig 4.
CONCLUSIONS
Initial experiments using Dexen have shown that
the use of schema generators lowers the threshold
for non-programmers to start using advanced opti-
misation techniques. In fact, it is now possible to run
complex optimisation algorithms using only graphi-
cal CAD tools.
Dexen also achieves its two main goals of
speed and flexibility. In terms of speed, the distrib-
uted master-slave architecture means that Dexen
can easily be deployed on compute clusters, and
as a result, large and complex optimisation jobs
that would otherwise take days to run can now
be completed overnight. In terms of flexibility,
Dexen allows a wide variety of optimisation
problems to be defined.
Fig 3. Two Pareto graphs
plotting energy against
privacy and energy against
daylight. Each point repre-
sents a design. Larger black
and white circles represent
designs on the Pareto front.
White circles represent designs
where energy < 115 kWh, daylight > 75%, and
privacy > 65%.
Fig 4. Four examples of
evolved design variants for
a suburban detached house.
The examples shown were
randomly selected from a
population of 100 individu-
als, after approximately 8000
births.
REFERENCES
Bentley, P and Kumar, S 1999, Three ways to grow
designs: A comparison of embryogenies for an
evolutionary design problem, in Proceedings of
the 1999 conference on Genetic and evolutionary
computation, Morgan Kaufmann, pp 35–43.
Caldas, L 2001, An Evolution-Based Generative Design
System: Using Adaptation to Shape Architectural
Form, Doctoral dissertation, Massachusetts Insti-
tute of Technology.
Frazer, J. H. (1995) An Evolutionary Architecture, AA
Publications, London, UK.
Hornby, G 2005, Measuring, enabling and comparing
modularity, regularity and hierarchy in evolu-
tionary design, in Proceedings of the 2005 confer-
ence on Genetic and evolutionary computation,
vol 2, pp 1729-1736.
Janssen, PHT 2004, A design method and a computation-
al architecture for generating and evolving building
designs. Doctoral dissertation, School of Design
Hong Kong Polytechnic University (October 2004).
Janssen, PHT, Frazer, JH, and Tang, MX 2005, Genera-
tive Evolutionary Design: A Framework for Gen-
erating and Evolving Three-Dimensional Build-
ing Models, in Proceedings of the 3rd International
Conference on Innovation in Architecture, Engi-
neering and Construction (AEC 2005), pp 35-45.
Janssen, PHT 2009, An evolutionary system for de-
sign exploration, in Proceedings of the CAAD Fu-
tures ‘09, pp. 259-272.
Janssen, PHT and Chen, KW 2011, Visual Dataflow Mod-
elling: A Comparison of Three Systems, in Proceed-
ings of the CAAD Futures ‘11, (to be published).
Janssen, PHT, Chen, KW, and Basol, C 2011, Iterative
Virtual Prototyping: Linking Houdini with Radi-
ance and EnergyPlus, in Proceedings of the CAAD
Futures ‘11, (to be published).
Kowaliw, T and Banzhaf, W 2011, Mechanisms for
Complex Systems Engineering through Artificial
Development, to appear in Morphogenetic Engineering:
Toward Programmable Complex Systems, Springer-Verlag.
Mitchell, M 1996, An Introduction to Genetic Algorithms,
MIT Press, Cambridge, MA.
Shea, K 1997, Essays of Discrete Structures: Purposeful
Design of Grammatical Structures by Directed
Stochastic Search, Doctoral dissertation, Carn-
egie Mellon University, Pittsburgh, PA.
Stanley KO and Miikkulainen, R 2003, A taxonomy for
artificial embryogeny, in Artificial Life, 9(2):93–
130.
Woodbury, R 2010, Elements of Parametric Design,
Routledge, NY.
... Evolutionary developmental design (Evo-Devo-Design) is a design method that combines complex developmental techniques with an evolutionary optimization technique (Janssen et al. 2011). In the architectural field, Evo-Devo approach is often used in design generation. ...
... In contrast, Evo-Devo approach also has been applied to performance-based building design optimization, where façade design is treated as a developmental change to create design variants with high diversity and environmental responsiveness. Janssen has presented design approaches using Evo-Devo approach involving façade development with a simple shading and fenestration pattern (Janssen et al. 2011). Although this can be of greater relevance to practice, the approach simply integrates façade design into a linear workflow but without additional design consideration such as comparing different façade schemes. ...
Conference Paper
Full-text available
One of the problems lies in performance-based architectural design optimization is the separation of building massing design and facade design. The separation of design processes significantly weakens the synergizing of building massing and facades for more progressive performance improvement. In order to overcome this weakness, this paper presents a performance-based design optimization workflow combining facade with building massing design using an Evo-Devo approach. This workflow enables architects to make a rapid design exploration of different façade design schemes incorporating building massing design optimization. For demonstration, a case study is presented to show how this approach can facilitate early-stage architectural design exploration, and how the combination of the two factors can outperform the results produced by separated design processes.
... In order to support this exploratory process, researchers have proposed the use of populationbased optimisation algorithms to search for a set of well-performing design variants (Caldas 2008;Flager et al. 2009;Janssen et al. 2011;Lin and Gerber, 2014;Turrin et al. 2011). Such algorithms optimise a population of design variants based on a set of performance objectives. ...
... A basic Euclidian distance-based clustering algorithm is sufficient. For this research, k-means analysis (Hartigan 1975) is used. It is one of the most common Euclidean distance-based algorithm used in data mining. ...
Conference Paper
Full-text available
In order to support exploration in the early stages of the design process, researchers have proposed the use of population-based multi-objective optimisation algorithms. This paper focuses on analysing the resulting population of design variants in order to gain insights into the relationship between architectural features and design performance. The proposed analysis method uses a combination of k-means clustering and Archetypal Analysis in order to partition the population of design variants into clusters and then to extract exemplars for each cluster. The results of the analysis are then visualised as a set of charts and as design models. A demonstration of the method is presented that explores how self-shading geometry, envelope materials, and window area affect the overall performance of a simplified building type. The demonstration shows that although it is possible to derive general knowledge linking architectural features to design performance, the process is still not straightforward. The paper ends with a discussion on how the method can be further improved.
... The actual design and planning are still led by human inspirations and experiences. Since designers normally only propose a few alternatives (Janssen et al., 2011), the author postulates that as complexity increases, humans' limited cognitive ability becomes the constraint. In fact, the complexity of human designed objects is still nowhere near that of natural organisms (French, 1994). ...
... In fact, the complexity of human designed objects is still nowhere near that of natural organisms (French, 1994). Therefore, it would be valuable to create computer systems that can generate and compare different design options (Janssen et al., 2011). This research explores how principles derived from nature can help computer algorithms generate complex designs for residential development. ...
Thesis
Full-text available
This research presents an algorithm (COULD – COmputational Urban Layout Design) for generating residential development plans. Inspired by developmental biology, COULD can grow a plan from scratch or improve existing ones. The universal modules are the basic building blocks, which play similar roles to biological cells. They are ‘genetically identical’ with full developmental potential but will change physical form according to their local context. The ability to adapt is maintained throughout the developmental process during which modules sort out their dependencies through interactions. The use of generative genotype and its separation from phenotype give COULD characteristics of an emergent system.
... This paper proposes a design method for architects to explore low exergy design in the early design stage. The proposed design method combines integrated optimisation systems such as GENE_ARCH (Caldas, 2006), ParaGen (Turrin et al., 2011) and DEXEN (Janssen et al., 2011) with low exergy design evaluation tools such as the DPV (Schlueter and Thesseling, 2008). ...
Conference Paper
Full-text available
Evolutionary algorithms have been widely used in building performance optimisation over the past ten years. This paper presents the use of a multi-objective ant colony algorithm as a possible alternative to the multi-objective evolutionary algorithm. The multi-objective optimisation of a semi-transparent building-integrated photovoltaic (BIPV) facade is used as the proof of concept. The design of semi-transparent BIPV facades affects a wide range of factors, including solar heat gain and daylight penetration into the rooms of the building. Results from the experiments conducted show that the multi-objective ant colony algorithm can speed up the multi-objective optimisation process but does not perform as well as the multi-objective evolutionary algorithm.
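Both the evolutionary and ant colony algorithms compared above rank candidate facades by Pareto dominance. A minimal dominance filter, assuming all objectives are minimised and using made-up objective vectors, might look like this:

```python
def dominates(a, b):
    """a dominates b if a is no worse in every objective
    and strictly better in at least one (minimisation)."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def pareto_front(population):
    """Keep only the non-dominated objective vectors."""
    return [p for p in population
            if not any(dominates(q, p) for q in population if q is not p)]

# hypothetical (solar heat gain, daylight deficit) scores for four variants
scores = [(3.0, 1.0), (1.0, 3.0), (2.0, 2.0), (3.0, 3.0)]
front = pareto_front(scores)  # (3.0, 3.0) is dominated by (2.0, 2.0)
```

Here the first three variants trade off the two objectives against each other and so all survive, while the last is strictly worse than (2.0, 2.0) and is filtered out.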
Article
Simulation-Based Multi-Objective Optimization (SBMOO) methods are being increasingly used in conceptual architectural design. They mostly focus on the solving, rather than the re-formulation, of a Multi-Objective Optimization (MOO) problem. However, Optimization Problem Re-Formulation (Re-OPF) is necessary for treating ill-defined conceptual architectural design as an iterative exploration process. The paper proposes an innovative SBMOO method which builds in a dynamic and interactive Re-OPF phase. This Re-OPF phase, as the main novelty of the proposed method, aims at achieving a realistic MOO model (i.e., a parametric geometry-simulation model which includes important objectives, constraints, and design variables). The proposed method is applied to the conceptual design of a top-daylighting system, focusing on divergent concept generation. The integration of software tools Grasshopper and modeFRONTIER is adopted to support this application. The main finding from this application is that the proposed method can help to achieve quantitatively better and qualitatively more diverse Pareto solutions.
Article
A platform for experimenting with population-based design exploration algorithms is presented, called Dexen. The platform has been developed in order to address the needs of two distinct groups of users, loosely labeled as researchers and designers. Whereas the researchers group focuses on creating and testing customized toolkits, the designers group focuses on applying these toolkits in the design process. A platform is required that is scalable and extensible: scalable to allow computationally demanding population-based exploration algorithms to be executed on distributed hardware within reasonable time frames, and extensible to allow researchers to easily implement their own customized toolkits consisting of specialized algorithms and user interfaces. In order to address these requirements, a three-tier client–server system architecture has been used that separates data storage, domain logic, and presentation. This separation allows customized toolkits to be created for Dexen without requiring any changes to the data or logic tiers. In the logic tier, Dexen uses a programming model in which tasks only communicate through data objects stored in a key-value database. The paper ends with a case study experiment that uses a multicriteria evolutionary algorithm toolkit to explore alternative configurations for the massing and façade design of a large residential development. The parametric models for developing and evaluating design variants are described in detail. A population of design variants is evolved, a number of which are selected for further analysis. The case study demonstrates how evolutionary exploration methods can be applied to a complex design scenario without requiring any scripting.
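The programming model described above, in which tasks never call one another and communicate only through data objects held in a key-value store, can be sketched in miniature. This is a toy illustration under assumed names (develop_task, evaluate_task, a plain dict standing in for the database), not Dexen's actual API:

```python
# Toy sketch: tasks coupled only through a shared key-value store.
store = {}

def develop_task(store):
    # Reads genotypes and writes phenotypes; knows nothing of other tasks.
    for key, genes in list(store.items()):
        if key.startswith("genotype:"):
            ident = key.split(":", 1)[1]
            # stand-in for building a design model from the genotype
            store["phenotype:" + ident] = sum(genes)

def evaluate_task(store):
    # Reads phenotypes and writes scores; also fully decoupled.
    for key, model in list(store.items()):
        if key.startswith("phenotype:"):
            ident = key.split(":", 1)[1]
            # stand-in for running a performance simulation
            store["score:" + ident] = model * 0.5

store["genotype:a"] = [1, 2, 3]
store["genotype:b"] = [4, 5]
develop_task(store)
evaluate_task(store)
```

Because the tasks share only the store, each could in principle run on a different machine polling the same database, which is what makes this style of model easy to parallelize.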
Conference Paper
Full-text available
Designers interested in applying evo-devo-design methods for performance-based multi-objective design exploration have typically faced two main hurdles: it's too hard and too slow. An evo-devo-design method is proposed that effectively overcomes the hurdles of skill and speed by leveraging two key technologies: computational workflows and cloud computing. In order to tackle the skills hurdle, Workflow Systems are used that allow users to define computational workflows using visual programming techniques. In order to tackle the speed hurdle, cloud computing infrastructures are used in order to allow the evolutionary process to be parallelized. We refer to the proposed method as Evo-Devo In The Sky (EDITS). This paper gives an overview of both the EDITS method and the implementation of a software environment supporting the EDITS method. Finally, a case study is presented of the application of the EDITS method.
Conference Paper
Full-text available
Visual programming languages enable users to create computer programs by manipulating graphical elements rather than by entering text. The difference between textual languages and visual languages is that most textual languages use a procedural programming model, while most visual languages use a dataflow programming model. When visual programming is applied to design, it results in a new modelling approach that we refer to as 'visual dataflow modelling' (VDM). Recently, VDM has become increasingly popular within the design community, as it can accelerate the iterative design process, thereby allowing larger numbers of design possibilities to be explored. Furthermore, it is now also becoming an important tool in performance-based design approaches, since it may potentially enable the closing of the loop between design development and design evaluation. A number of CAD systems now provide VDM interfaces, allowing designers to define form-generating procedures without having to resort to scripting or programming. However, these environments have certain weaknesses that limit their usability. This paper will analyse these weaknesses by comparing and contrasting three VDM environments: McNeel Grasshopper, Bentley Generative Components, and Sidefx Houdini. The paper will focus on five key areas:
* Conditional logic allows rules to be applied to geometric entities that control how they behave. Such rules will typically be defined as if-then-else conditions, where an action will be executed if a particular condition is true. A more advanced version of this is the while loop, where the action within the loop will be repeatedly executed while a certain condition remains true.
* Local coordinate systems allow geometric entities to be manipulated relative to some convenient local point of reference. These systems may be either two-dimensional or three-dimensional, using Cartesian, cylindrical, or spherical systems. Techniques for mapping geometric entities from one coordinate system to another also need to be considered.
* Duplication includes three types: simple duplication, endogenous duplication, and exogenous duplication. Simple duplication consists of copying some geometric entity a certain number of times, producing identical copies of the original. Endogenous duplication consists of copying some geometric entity by applying a set of transformations that are defined as part of the duplication process. Lastly, exogenous duplication consists of copying some geometric entity by applying a set of transformations that are defined by some other external geometry.
* Part-whole relationships allow geometric entities to be grouped in various ways, based on the fundamental set-theoretic concept that entities can be members of sets, and sets can be members of other sets. Ways of aggregating data into both hierarchical and non-hierarchical structures, and ways of filtering data based on these structures, need to be considered.
* Spatial queries include relationships between geometric entities such as touching, crossing, overlapping, or containing. More advanced spatial queries include various distance-based queries, sorting queries (e.g. sorting all entities based on position), and filtering queries (e.g. finding all entities within a certain distance of a point).
For each of these five areas, a simple benchmarking test case has been developed. For example, for conditional logic, the test case consists of a simple room with a single window, subject to a condition: the window should always be in the longest north-facing wall. If the room is rotated or its dimensions changed, then the window must re-evaluate itself and possibly change position to a different wall. For each benchmarking test case, visual programs are implemented in each of the three VDM environments. The visual programs are then compared and contrasted, focusing on two areas: first, the types of constructs used in each of these environments; second, the cognitive complexity of the visual programming task in each of these environments.
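As a point of comparison for the conditional-logic benchmark above (the window that must relocate to the longest north-facing wall when the room is rotated or resized), here is the same rule in plain Python. Representing each wall as a (length, facing-in-degrees) pair, with 0 degrees meaning due north, is a simplifying assumption:

```python
def longest_north_facing_wall(walls):
    """walls: list of (length, facing_degrees) pairs, 0 = due north.
    A wall 'faces north' if its outward normal is within 45 degrees of north."""
    north = [w for w in walls
             if min(w[1] % 360, 360 - w[1] % 360) <= 45]
    return max(north, key=lambda w: w[0]) if north else None

# a rectangular room: two 6 m walls facing north/south, two 4 m walls east/west
walls = [(6.0, 0), (6.0, 180), (4.0, 90), (4.0, 270)]
host = longest_north_facing_wall(walls)          # the 6 m north wall

# rotate the room 90 degrees: the window must move to a different wall
rotated = [(length, (facing + 90) % 360) for length, facing in walls]
host_after = longest_north_facing_wall(rotated)  # now a 4 m wall faces north
```

In a VDM environment the same re-evaluation would be triggered automatically by the dataflow graph whenever the room parameters change, which is exactly what the benchmark tests.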
Article
Full-text available
In this chapter, we argue that artificial development is an appropriate means of approaching complex systems engineering. Artificial development works via the inclusion of mechanisms that enhance the evolvability of a design space. Two of these mechanisms, regularities and adaptive feedback with the environment, are discussed. We concentrate on the less explored of the two: adaptive feedback. A concrete example is presented and applied to a simple artificial problem resembling vasculogenesis. It is shown that the use of a local feedback function substantively improves the efficacy of a machine learner on the problem. Further, inclusion of this adaptive feedback eliminates the sensitivity of the machine learner to a system parameter previously shown to correspond to problem hardness.
Conference Paper
Full-text available
This paper describes a comprehensive framework for generative evolutionary design. The key problem that is identified is generating alternative designs that vary in a controlled manner. Within the proposed framework, the design process is split into two phases: in the first phase, the design team develops and encodes the essential and identifiable character of the designs to be generated and evolved; in the second phase, the design team uses an evolutionary system to generate and evolve designs that incorporate this character. This approach allows design variability to be carefully controlled. In order to verify the feasibility of the proposed framework, a generative process capable of generating controlled variability is implemented and demonstrated.
Conference Paper
Full-text available
This paper explores the use of growth processes, or embryogenies, to map genotypes to phenotypes within evolutionary systems. Following a summary of the significant features of embryogenies, the three main types of embryogenies in Evolutionary Computation are then identified and explained: external, explicit and implicit. An experimental comparison between these three different embryogenies and an evolutionary algorithm with no embryogeny is performed. The problem set to the four evolutionary systems is to evolve tessellating tiles. In order to assess the scalability of the embryogenies, the problem is increased in difficulty by enlarging the size of tiles to be evolved. The results are surprising, with the implicit embryogeny outperforming all other techniques by showing no significant increase in the size of the genotypes or decrease in accuracy of evolution as the scale of the problem is increased.
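An explicit embryogeny of the kind compared above can be sketched as a genotype of growth instructions executed step by step from a seed cell. The instruction set and grid representation below are invented for illustration and are not the encoding used in the paper:

```python
# Toy explicit embryogeny: a genotype of moves grows a tile from a seed cell.
MOVES = {"N": (0, 1), "S": (0, -1), "E": (1, 0), "W": (-1, 0)}

def develop(genotype, seed=(0, 0)):
    """Grow a set of occupied grid cells by walking the instruction list."""
    cells = {seed}
    x, y = seed
    for gene in genotype:
        dx, dy = MOVES[gene]
        x, y = x + dx, y + dy
        cells.add((x, y))  # each growth step deposits a new cell
    return cells

tile = develop("EENN")  # an L-shaped tile of five cells
```

The genotype here is a direct, explicit recipe; an implicit embryogeny of the kind that performed best in the paper's experiments would instead evolve local rules whose repeated application grows the phenotype.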
Conference Paper
Full-text available
For computer-automated design systems to scale to complex designs they must be able to produce designs that exhibit the characteristics of modularity, regularity and hierarchy -- characteristics that are found in both man-made and natural designs. Here we claim that these characteristics are enabled by implementing the attributes of combination, control-flow and abstraction in the representation. To support this claim we use an evolutionary algorithm to evolve solutions to different sizes of a table design problem using five different representations, each with different combinations of modularity, regularity and hierarchy enabled, and show that the best performance happens when all three of these attributes are enabled. We also define metrics for modularity, regularity and hierarchy in design encodings and demonstrate that high fitness values are achieved with high values of modularity, regularity and hierarchy, and that there is a positive correlation between increases in fitness and increases in the measured values of modularity, regularity and hierarchy.
Conference Paper
This paper proposes a software and hardware architecture for an evolutionary design exploration system for building design. First, the typical architecture for such systems is described. Certain limitations of this typical architecture are identified. The proposed architecture is then described and the advantages of this architecture are highlighted. Finally, the development of systems implementing this architecture is discussed.
Article
This dissertation dwells in the interstitial spaces between the fields of architecture, environmental design and computation. It introduces a Generative Design System that draws on evolutionary concepts to incorporate adaptation paradigms into the architectural design process. The initial aim of the project focused on helping architects improve the environmental performance of buildings, but the final conclusions of the thesis transcend this realm to question the process of incorporating computational generative systems in the broader context of architectural design. The Generative System [GS] uses a Genetic Algorithm as the search and optimization engine. The evaluation of solutions in terms of environmental performance is done using DOE2.1E. The GS is first tested within a restricted domain, where the optimal solution is previously known, to allow for the evaluation of the system's performance in locating high quality solutions. Results are very satisfactory and provide confidence to extend the GS to complex building layouts. Comparative studies using other heuristic search procedures like Simulated Annealing are also performed. The GS is then applied to an existing building by Alvaro Siza, to study the system's behavior in a complex architectural domain, and to assess its capability for encoding language constraints, so that solutions generated may be within certain design intentions. An extension to multicriteria problems is presented, using a Pareto-based method.
Article
In "An Evolutionary Architecture", John Frazer presents an overview of his work over the past 30 years, attempting to develop a theoretical basis for architecture using analogies with nature's processes of evolution and morphogenesis. Frazer's vision of the future of architecture is to construct organic buildings: thermodynamically open systems which are more environmentally aware and sustainable physically, sociologically and economically. The range of topics which Frazer discusses is a good illustration of the breadth and depth of the evolutionary design problem.
Environmental Modelling. One of the first topics dealt with is the importance of environmental modelling within the design process. Frazer shows how environmental modelling is often misused or misinterpreted by architects, with particular reference to solar modelling. From the discussion given it would seem that simplification of the environmental models is the prime culprit resulting in misinterpretation and misuse. The simplifications are understandable given the amount of information needed for accurate modelling. By simplifying the model of the environmental conditions the architect is able to make informed judgments within reasonable amounts of time and effort. Unfortunately the simplifications result in errors which compound and cause the resulting structures to fall short of their anticipated performance. Frazer obviously believes that the computer can be a great aid in the harnessing of environmental modelling data, provided that the same simplifying assumptions are not made and that better models and interfaces are possible.
Physical Modelling. Physical modelling has played an important role in Frazer's research, leading to the construction of several novel machine-readable interactive models, ranging from lego-like building blocks to beermat cellular automata and wall partitioning systems. Ultimately this line of research has led to the Universal Constructor and the Universal Interactor.
The Universal Constructor. The Universal Constructor features on the cover of the book. It consists of a base plug-board, called the "landscape", on top of which "smart" blocks, or cells, can be stacked vertically. The cells are individually identified and can communicate with neighbours above and below. Cells communicate with users through a bank of LEDs displaying the current state of the cell. The whole structure is machine readable and so can be interpreted by a computer. The computer can interpret the states of the cells as either colour or geometrical transformations, allowing a wide range of possible interpretations. The user interacts with the computer display through direct manipulation of the cells. The computer can communicate with, and even direct the actions of, the user through feedback with the cells to display various states. The direct manipulation of the cells encourages experimentation by the user and demonstrates basic concepts of the system.
The Universal Interactor. The Universal Interactor is a whole series of experimental projects investigating novel input and output devices. All of the devices speak a common binary language and so can communicate through a mediating central hub. The result is that input from, say, a body-suit can be used to drive the output of a sound system, or vice versa. The Universal Interactor opens up many possibilities for expression when using a CAD system that may at first seem very strange. However, some of these feedback systems may prove superior in the hands of skilled technicians to more standard devices. Imagine how a musician might be able to devise structures by playing melodies which express their character. Of course the interpretation of input in this form poses a difficult problem which will take a great deal of research to solve. The Universal Interactor has been used to provide environmental feedback to affect the development of evolving genetic codes.
The feedback given by the Universal Interactor has been used to guide the selection of individuals from a population.
Adaptive Computing. Frazer completes his introduction to the range of tools used in his research by giving a brief tour of adaptive computing techniques, covering topics including cellular automata, genetic algorithms, classifier systems and artificial evolution.
Cellular Automata. As previously mentioned, Frazer has done some work using cellular automata in both physical and simulated environments. Frazer discusses how surprisingly complex behaviour can result from the simple local rules executed by cellular automata. Cellular automata are also capable of computation, in fact able to perform any computation possible by a finite state machine. Note that this does not mean that cellular automata are capable of general computation, as this would require the construction of a Turing machine, which is beyond the capabilities of a finite state machine.
Genetic Algorithms. Genetic algorithms were first presented by Holland and have since become an important tool for many researchers in various areas. They were originally developed for problem-solving and optimization problems with clearly stated criteria and goals. Frazer fails to mention one of the most important differences between genetic algorithms and other adaptive problem-solving techniques, such as neural networks: genetic algorithms have the advantage that criteria can be clearly stated and controlled within the fitness function. The learning by example which neural networks rely upon does not afford this level of control over what is to be learned.
Classifier Systems. Holland went on to develop genetic algorithms into classifier systems. Classifier systems are more focussed upon the problem of learning appropriate responses to stimuli than on searching for solutions to problems. Classifier systems receive information from the environment and respond according to rules, or classifiers.
Successful classifiers are rewarded, creating a reinforcement learning environment. Obviously, the mapping between classifier systems and the cybernetic view of organisms sensing, processing and responding to environmental stimuli is strong. It would seem that a central process similar to a classifier system would be appropriate at the core of an organic building, learning appropriate responses to environmental conditions over time.
Artificial Evolution. Artificial evolution traces its roots back to the Biomorph program described by Dawkins in his book "The Blind Watchmaker". Essentially, artificial evolution requires that a user supplements the standard fitness function in genetic algorithms to guide evolution. The user may provide selection pressures which are unquantifiable in a stated problem and thus provide a means for dealing with ill-defined criteria. Frazer notes that solving problems with ill-defined criteria using artificial evolution seriously limits the scope of problems that can be tackled: the reliance upon user interaction reduces the practical size of populations and the duration of evolutionary runs.
Coding Schemes. Frazer goes on to discuss the encoding of architectural designs and their subsequent evolution, introducing two major systems, the Reptile system and the Universal State Space Modeller.
Blueprint vs. Recipe. Frazer points out the inadequacies of using standard "blueprint" design techniques in developing organic structures. Using a "recipe" to describe the process of constructing a building is presented as an alternative. Recipes for construction are discussed with reference to the analogous process description given by DNA to construct an organism.
The Reptile System. The Reptile System is an ingenious construction set capable of producing a wide range of structures using just two simple components.
Frazer saw the advantages of this system for rule-based and evolutionary systems in the compactness of its structure descriptions. Compactness was essential for the early computational work, when computer memory and storage space were scarce. However, compact representations such as those described form very rugged fitness landscapes which are not well suited to evolutionary search techniques. Structures are created from an initial "seed" or minimal construction, for example a compact spherical structure. The seed is then manipulated using a series of processes or transformations, for example stretching, shearing or bending. The structure grows according to the transformations applied to it. Obviously, the transformations could be a predetermined sequence of actions which would always yield the same final structure given the same initial seed. Alternatively, the series of transformations applied could be environmentally sensitive, resulting in forms which were also sensitive to their location. The idea of taking a geometrical form as a seed and transforming it using a series of processes to create complex structures is similar in many ways to the early work of Latham creating large morphological charts. Latham went on to develop his ideas into the "Mutator" system, which he used to create organic artworks.
Generalising the Reptile System. Frazer has proposed a generalised version of the Reptile System to tackle more realistic building problems, generating the seed or minimal configuration from design requirements automatically. From this starting point (or set of starting points) solutions could be evolved using artificial evolution. Quantifiable and specific aspects of the design brief define the formal criteria which are used as a standard fitness function. Non-quantifiable criteria, including aesthetic judgments, are evaluated by the user. The proposed system would be able to learn successful strategies for satisfying both formal and user criteria.
In doing so the system would become a personalised tool of the designer: a personal assistant able to anticipate aesthetic judgements and other criteria by employing previously successful strategies. Ultimately, this is a similar concept to Negroponte's "Architecture Machine", which he proposed would be a computer system so personalised as to be almost unusable by other people.
The Universal State Space Modeller. The Universal State Space Modeller is the basis of Frazer's current work. It is a system which can be used to model any structure, hence the universal claim in its title. The datastructure underlying the modeller is a state space of scaleless logical points, called motes. Motes are arranged in a close-packing sphere arrangement, which makes each one equidistant from its twelve neighbours. Any point can be broken down into a self-similar tetrahedral structure of logical points, giving the state space a fractal nature which allows modelling at many different levels at once. Each mote can be thought of as analogous to a cell in a biological organism. Every mote carries a copy of the architectural genetic code, in the same way that each cell within an organism carries a copy of its DNA. The genetic code of a mote is stored as a sequence of binary "morons" which are grouped together into spatial configurations which are interpreted as the state of the mote. The developmental process begins with a seed. The seed develops through cellular duplication according to the rules of the genetic code. In the beginning the seed develops mainly in response to the internal genetic code, but as the development progresses the environment plays a greater role. Cells communicate by passing messages to their immediate twelve neighbours. However, a cell can also send messages directed at remote cells, without knowledge of their spatial relationship. During development, cells take on specialised functions, including environmental sensors or producers of raw materials.
The resulting system is process driven, without presupposing the existence of a construction set to use. The datastructure can be interpreted in many ways to derive various phenotypes. The resulting structure is a by-product of the cellular activity during development and in response to the environment. As such, the resulting structures have much in common with living organisms, which are also the emergent result, or by-product, of local cellular activity.
Primordial Architectural Soups. To conclude, Frazer presents some of the most recent work done, evolving fundamental structures using limited raw materials, an initial seed and massive feedback. Frazer proposes to go further and do away with the need for an initial seed, starting instead with a primordial soup of basic architectural concepts. The research is attempting to evolve the starting conditions and evolutionary processes without any preconditions. Is there enough time to evolve a complex system from the basic building blocks which Frazer proposes? The computational complexity of the task being embarked upon is not discussed. There is an implicit assumption that the "superb tactics" of natural selection are enough to cut through the complexity of the task. However, Kauffman has shown how self-organisation plays a major role in the early development of replicating systems which we may call alive. Natural selection requires a solid basis upon which it can act. Is the primordial soup which Frazer proposes of the correct constitution to support self-organisation? Kauffman suggests that one of the most important attributes of a primordial soup capable of self-organisation is a complex network of catalysts, together with the controlling mechanisms to stop the reactions from going supracritical. Can such a network be provided from primitive architectural concepts? What does it mean to have a catalyst in this domain?
Conclusion. Frazer shows some interesting work in both the areas of evolutionary design and self-organising systems. It is obvious from his work that he sympathizes with the opinions put forward by Kauffman that the order found in living organisms comes from both external evolutionary pressure and internal self-organisation. His final remarks underline this by paraphrasing the words of Kauffman, that life is always to be found on the edge of chaos. By the "edge of chaos" Kauffman is referring to the area within the ordered regime of a system close to the "phase transition" to chaotic behaviour. Unfortunately, Frazer does not demonstrate that the systems he has presented have the necessary qualities to derive useful order at the edge of chaos. He does not demonstrate, as Kauffman does repeatedly, that there exists a "phase transition" between the ordered and chaotic regimes of his systems. Nor does he make any study of the relationship between the useful forms generated by his work and the phase transition regions of his systems, should they exist. If we are to find an organic architecture, in more than name alone, it is surely to reside close to the phase transition of the construction system of which it is built. Only there, if we are to believe Kauffman, are we to find useful order together with environmentally sensitive and thermodynamically open systems which can approach the utility of living organisms.