Architectural Science Review
ISSN: 0003-8628 (Print) 1758-9622 (Online) Journal homepage: https://www.tandfonline.com/loi/tasr20
Beyond geometric complexity: a critical review
of complexity theory and how it relates to
architecture engineering and construction
Evangelos Pantazis & David Jason Gerber
To cite this article: Evangelos Pantazis & David Jason Gerber (2019): Beyond geometric
complexity: a critical review of complexity theory and how it relates to architecture engineering and
construction, Architectural Science Review, DOI: 10.1080/00038628.2019.1659750
To link to this article: https://doi.org/10.1080/00038628.2019.1659750
Published online: 17 Sep 2019.
Beyond geometric complexity: a critical review of complexity theory and how it
relates to architecture engineering and construction
Evangelos Pantazis^a and David Jason Gerber^a,b
^a Viterbi School of Engineering – Civil and Environmental Engineering, University of Southern California, Los Angeles, CA, USA; ^b School of Architecture, University of Southern California, Los Angeles, CA, USA
ABSTRACT
The wide application of digital design and the advances of digital fabrication and robotic processes have facilitated the materialization of bespoke geometries. In turn, this has raised the issue of how architects can reduce design complexity using computational techniques. This paper presents a survey of complexity theory inclusive of work from disciplines ranging from cybernetics to systems and information theory. We synthesize a taxonomy of different definitions of complexity and of ways of managing design complexity by decomposing its different levels as they relate to the fields of architecture, engineering and construction. Our hypothesis is that by reviewing the literature on complexity theory, which appears to be highly fragmented, we can help designers build a better understanding of its underlying principles. Designers can thus develop a more systemic approach towards the use of digital design tools and make use of concepts from the field of complexity theory, such as abstraction, adaptation and self-organization, in order to devise novel computational design methods. Such methods can enable designers to deal with design problems holistically and manage design complexity in the contemporary digital design context.
ARTICLE HISTORY
Received 6 November 2018
Accepted 14 August 2019
KEYWORDS
Complexity theory; design
complexity; holistic design;
complex adaptive systems;
computational design
1. Introduction
It is arguable that complexity is one of the most challenging
intellectual, scientific and technological topics of the twenty-first century (Wolfram 2002). The complexity of building design nowadays can be witnessed in the mixed-use superstructures towering over modern cities in the United Arab Emirates, while geometric complexity has been celebrated in many sophisticated cultural and residential buildings around the world. From an architectural point of view, it is remarkable to observe the evolution of building construction in recent years. Next to the iconic stone cathedrals and buildings of past centuries now stand prismatic and freeform steel structures, digitally designed and fabricated, and equipped with building systems capable of responding dynamically to environmental changes (Figure 1). Undoubtedly, advancements in information
technologies have played a big role in the transformation of
the built environment from being the materialization of draw-
ings into constructed form as it was introduced in the Renais-
sance, into becoming the materialization of digital information
(Mitchell 2005). The introduction of embedded building systems
and sensor technologies are turning buildings from static struc-
tures into complex cyber physical systems that can sense and
respond to climatic or temporal changes of the environment
(Malkawi 2005; Rahman 2010).
Our environment is becoming increasingly complex, and complexity theory and computational methods have therefore been radically influencing research in nearly all disciplines, in both the sciences and the humanities (Bundy 2007). A good example from
CONTACT Evangelos Pantazis epantazi@usc.edu Viterbi School of Engineering – Civil and Environmental Engineering, University of Southern California,
Kaprelian Hall 3620 S Vermont Ave, Los Angeles, CA 90089, USA
the field of sciences is the paradigmatic shift that occurred in fields such as physics and biology as a direct result of adopting notions of complexity and using computational methods as a primary tool for simulating and modelling natural processes. Reductionist models have successively been modified or replaced as the predominant research paradigm in recent years. To clarify, the mechanistic worldview of nature, which relies on the continuous top-down reduction of a whole into its parts, is being replaced by the correlation of local interactions and the identification of patterns that can bring the parts into equilibrium as an emergent property of the overall system (Figure 2). For example, scientists and biologists have closely investigated Complex Adaptive Systems (CAS) such as termite colonies; by tracking how termites forage and collectively build their habitats, they have developed mathematical models in order to understand how the complex geometry of their habitat (termite mounds) is correlated with (a) the environmental conditions of specific locations, (b) the type and amount of available building material and (c) the termites' body dimensions and locomotion (Perna and Theraulaz 2017).
Even though information technologies have radically
changed the way we design and construct buildings, unlike the
fields of biology and physics, architectural design thinking has
not been greatly affected by computational thinking. Taking into consideration the implications of the complexity sciences for architecture, the objective of this literature survey is twofold: (1) to provide a taxonomy which can help designers better understand how complexity can be decomposed into multiple levels
© 2019 Informa UK Limited, trading as Taylor & Francis Group
Figure 1. Timeline illustrating the increasing complexity in building structures in terms of the design approach and building paradigm used.
Figure 2. Diagram showing the paradigmatic shift from reductionist to complex models and their relationship to architecture.
by considering it from different perspectives and (2) to illus-
trate the potential benefits of developing computational design
methods, which are based upon the principles of redundancy,
self-organization and adaptation rather than only relying on
reductionist approaches.
1.1. A brief review of complexity in architecture
As early as the 1960s, a group of architects and researchers
such as Gordon Pask, Cedric Price, Nicholas Negroponte and
Christopher Alexander foresaw radical computational advancements and firmly believed that design tools and the architectural approach had to move from being purely descriptive and prescriptive, in the sense of producing blueprints, towards being predictive, capable of generating better-performing designs by harnessing the potential of Artificial Intelligence (AI) and computational design methods (Frazer 1995; Negroponte 1970; Pask 1969). They addressed the necessity of approaching the design process from a 'systems thinking' perspective and introduced terms such as control, communication and feedback, as well as emergence and evolution, for reconsidering the
architectural design process. Additionally, a number of archi-
tectural theorists have written about complexity in architec-
ture, among them Eisenman (Eisenman 1993), Jencks (Jencks
1997) and Lynn (Lynn and Kelly 1999), but attempts to address
the notion of complexity theory in architecture have rarely
ventured beyond rather diagrammatic and iconic applications.
Despite the automation that digital tools have brought, and their capacity to model and realize complex geometries and to reduce the time and cost of designing and producing buildings, in most cases they have followed the traditional architectural top-down planning model. However, in recent years the realization of complex projects has demonstrated the shortcomings of current digital design approaches: despite the wide application of powerful digital design tools, the design process remains largely manual and in many cases inefficient. By contrast,
research work from the fields of biology and computer science
suggests that by employing simple bottom-up rules and stochastic processes, phenomena of self-organization can be exploited
and the process of how complex systems achieve order can be
better studied (Crutchfield 1994). Therefore, issues relating to (a) the changing role of the architect from a mere author of a design object to the conductor of design rules, and (b) the conception of the computer as a collaborative partner (by means of artificial intelligence) instead of a simple drafting tool, remain current although they were raised in the early days of computation (Gero 1996; Scheurer 2010; Shea, Aish, and Gourtovaia 2005).
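The bottom-up principle invoked above can be illustrated with a minimal sketch (an illustration added here, not taken from the paper): in Wolfram's elementary cellular automaton Rule 30, each cell obeys one simple local rule, yet repeated application produces a complex, hard-to-predict global pattern.

```python
def rule30_step(cells):
    """One update of Wolfram's Rule 30 on a ring of cells:
    new cell = left XOR (centre OR right), using only the local neighbourhood."""
    n = len(cells)
    return [cells[(i - 1) % n] ^ (cells[i] | cells[(i + 1) % n]) for i in range(n)]

# A single 'seed' cell; order and complexity emerge purely from the local rule.
row = [0] * 31
row[15] = 1
for _ in range(15):
    print("".join("#" if c else "." for c in row))
    row = rule30_step(row)
```

The ring (modulo) boundary is a simplifying assumption; any boundary condition works for small demonstrations.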
1.2. Direction of architectural design
To date, each domain in the construction industry, which mainly comprises Architecture, Engineering and Construction (AEC), still provides largely independent solutions, outcomes that are already defined before being passed from one discipline to another (Rahman 2010). This lack of disciplinary integration, along with inefficiencies in the design process, has resulted in a built environment that accounts for 40% of global energy consumption and up to 30% of global greenhouse gas (GHG) emissions (Leicht, Abdelkarim, and Messner 2010). Static building models, which do not consider the parameter of time (the building's life cycle), are still primarily used (Figure 2). As a result, up to 46% of energy consumption is locked in for long periods due to the life span of buildings (Block 2009).
As the concerns about the buildings’ environmental footprint
have increased, the environmental performance standards have
become more stringent. This has direct consequences for architectural designers and engineers, who now bear an important responsibility to employ design and creativity to reduce design complexity and achieve more coherent and sustainable solutions.
Recent work from both academia and practice shows
that this prospect is achievable by employing computational
design methods that can help manage complexity in the design
process.1
Despite the increasing interest in complexity in the fields of architecture and engineering, and the importance of developing a good understanding of it, a unified definition of the term remains to date virtually nonexistent. The observation that there is no unifying framework to support the investigation of multiple levels of complexity within the fields of Architecture, Engineering and Construction (AEC) is a core motivation of this work. Another motivation is that although digital design tools have been widely adopted in recent years, current design methodologies could be characterized as computer-based rather than computational (Von Bülow 2007). Designers lack a holistic understanding of the different levels of complexity and have thus, to a large extent, used digital design methods to generate solely geometric complexity.
1.3. Overview of the paper
The breadth of this topic extends far beyond the limits of this literature review, though a significant attempt was made to address complexity comprehensively through the perspective of building design and to make this study as self-contained as possible. In the following section, a holistic overview of the term
‘complexity’ is synthesized by bringing together definitions that
range from scientific domains to AEC. Underlying assumptions of
complexity theory which stem from the tightly related fields of
General Systems Theory (GST), Cybernetics, Information Theory
(IT) and Complex Adaptive Systems (CAS) are brought together
in order to provide a taxonomy of existing definitions of com-
plexity. Specifically, this paper is organized into five sections. In
Section 2, the definition of complexity is discussed alongside
its main sources and measures. In Section 3, the most essential
sources of complexity are presented and the important aspects
of existing methodologies that address it are showcased, while
in Section 4 the most common approaches for dealing with
complexity are identified and directions for future research are
provided. Lastly, in Section 5, a summary and conclusion on the
reviewed material is presented.
2. Background and theoretical framework
Terms for complexity have existed in everyday language since antiquity; however, the idea of treating it as a coherent scientific concept is quite new (Wolfram 2002). The multiplicity of definitions of complexity is an impediment to developing a clear understanding and indicates the lack of a unifying framework.
In fact, many complexity definitions researched in this paper rep-
resent variations of a few underlying schemes (Crutchfield 1994;
Feldman and Crutchfield 1998). A historical analog to the problem of defining and measuring complexity is the problem of describing electromagnetism before Maxwell's equations. In
the context of electromagnetism, quantities such as electric and
magnetic forces that arose in different experimental contexts
were originally considered as fundamentally different (Lloyd
2001). It is now well understood that electricity and magnetism are in fact closely related aspects of the same fundamental quantity, the electromagnetic field. Similarly, researchers in biology,
computer science and engineering have been faced with issues
of complexity but due to the prevailing concept of reduction-
ism, they have considered them within the bounds of their own
discipline. Providing a general overview (taxonomy) of complex-
ity and its implications for architectural design is considered
necessary for being able to handle the increasing complexity
of AEC.
In order to review the literature on such a broad topic and draw conclusions useful to the design computing community, the authors implemented the methodology illustrated in Figure 3a. Due to its multifaceted nature, the topic cannot be fully covered within the extents of this paper. Therefore, for an in-depth analysis of
complexity we point the reader to the work of S. Wolfram and
work from the Santa Fe Institute (Kauffman 1991; Li and Vitányi 2013;
Simon 1996; Wolfram 2002).
2.1. Methodology
Our bibliographic research (ca. 260 publications) was orga-
nized into two levels: on a ‘local’ level, the literature within
the architectural computer-aided design communities (Cumin-
cad, CAADria, eCAADe, ACADia) was queried based on keywords
relating to complexity (i.e. complexity theory, design complex-
ity, architectural complexity, etc.) and main references and key
terms were extracted. On a ‘global’ level, the largest available
corpus of digitalized books (ca. 10,000 publications dating from
1910 to 2010) was queried using N-Gram viewer for different
types of complexity (i.e. architectural complexity) and related
key terms (i.e. feedback, emergence, self-organization) which
were extracted from our local bibliographic search.
N-Gram Viewer is an online graphing tool that uses a probabilistic Markov model to predict combinations of characters in a database (a collection of digitized books) given an input sequence of characters, and charts annual counts of words and possible combinations accordingly (i.e. database = Google Books, input sample sequence = design complexity, 1-gram sequences: design, complexity; 2-gram sequence: design complexity, ...) (Weiss 2015). The extent of the 'global' search was
constrained by the last public release of the Google Books database, which dates back to 2012. This search is used as a
form of validation to highlight how the types of complexity
and related terms which were extracted from publications in
the architectural communities appear in the global literature
(Google 2018). The results of our local analysis of the surveyed
literature are illustrated in Figure 3b. The appearance of the word
complexity in publications from different disciplines during the
past 100 years is plotted.
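The word-level counting that underlies such n-gram queries can be sketched in a few lines (a hedged illustration; the mini-corpus below is purely hypothetical and merely stands in for the digitized-book database):

```python
from collections import Counter

def ngram_counts(corpus, n=2):
    """Count word-level n-grams across a list of sentences."""
    counts = Counter()
    for sentence in corpus:
        words = sentence.lower().split()
        for i in range(len(words) - n + 1):
            counts[tuple(words[i:i + n])] += 1
    return counts

# Hypothetical mini-corpus standing in for the Google Books database.
corpus = [
    "design complexity in architecture",
    "managing design complexity",
    "complexity theory and design complexity",
]

bigrams = ngram_counts(corpus, n=2)
print(bigrams[("design", "complexity")])  # frequency of the 2-gram "design complexity": 3
```

The real N-Gram Viewer additionally normalizes counts per publication year; this sketch shows only the raw counting step.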
Figure 3. (a) Diagram illustrating the research methodology implemented for this literature survey. (b) Number of publications inclusive of the term complexity in architecture and related domains reviewed by the authors, based on a library built from querying the databases of CUMINCAD, USC Digital Library and JSTOR. (c) Plots of the appearance of key terms relating to complexity (above) and different types of complexity (below) as n-grams in the Google Books library (25,000,000 publications).
From the graph, one can clearly observe that there has been
a constant trend of publications relating to complexity from
1935 to 2010 in different scientific fields including math, physics
and biology but also cybernetics, information theory and com-
puter science. In the architectural literature, apart from Venturi’s
famous book ‘Complexity and Contradiction’ that addressed
complexity as a reaction to the uniformity and reductionism
imposed by modernism (Venturi 1977), there were very few pub-
lications dealing with the topic in a rigorous and scientific way.
However, in the early 1990s, a significant increase in publica-
tions dealing with complexity in architecture and engineering
can be clearly observed in Figure 3b. The literature survey indicates that this increase was a direct consequence of the wide application of digital design tools in practice on the one hand, and the founding of the Santa Fe Institute (SFI) in 1984 on the other. SFI was the first institution dedicated to providing a common ground for studying complexity theory, and it demonstrated how complexity can be successfully applied across different scientific and engineering disciplines to solve real-world problems (Gell-Mann 1992). In Figure 3c, the key terms which appear in
the global literature (with regard to the term complexity) are
extracted and plotted. The graph shows there has been a steady
increase in the use of these terms since the 1940s, while before that date most of them, except for structural complexity and adaptation, were almost absent from the literature. In the following sections, the main sources of complexity are described
along with the core underlying assumptions of complexity in
order to better understand the term.
2.2. Sources of complexity
The main source of complexity is undoubtedly Nature, where
in many cases complex phenomena occur through the repeti-
tion of simple rules (Goldenfeld and Kadanoff 1999). Complexity arises whenever one or more of the following six attributes are found in a system: (1) a high number of parts, relationships and degrees of freedom, (2) multiple states/communication types, (3) broken symmetry (differential growth), (4) emergent properties, (5) non-linearity and (6) a lack of robustness (Yates 1978).
Consequently, the number of components in a building sys-
tem, the tight coupling on multiple levels (social, structural,
functional, geometrical) of all connected elements, the estab-
lishment of specific hierarchies among them and the non-linear
character of their interactions significantly increase the complexity of such a system.
Living systems, such as organisms and coevolving ecosystems, are the paramount examples of organized complexity (Holland 1992a).
For example, the genomic systems of a higher metazoan cell
encode on the order of 10,000 to 100,000 structural and regulatory genes whose jointly orchestrated activity constitutes the developmental program underlying the ontogeny of a fertilized egg (Kauffman 1993). However, apart from examples in nature and
human life (e.g. behavioural, social and environmental sciences),
instances of systems with characteristics of organized complex-
ity are also abundant in applied fields such as architecture and
engineering (Klir 1985). Jane Jacobs stated that an essential qual-
ity shared by all living cities is the high degree of organized
complexity (Jacobs 1961). Pask, emphasizing the level of
organized complexity embodied in buildings, regarded them
not as ‘machines for living’ but as evolving environments with
which the inhabitants cooperate and in which they perform their
mental processes (Pask 1969). Pask considered architects as ‘sys-
tem designers’ rather than ‘geometers and master builders’ and
identified that there is a demand for a system-oriented think-
ing in order to respond to the complex nature of architectural
design. The design of a building or a city entails another level of complexity, one which goes beyond analysing and understanding how such complex systems function: the complexity of creating something that did not previously exist. A number of researchers have recently adopted a more
systemic approach and suggest that living cities and inhab-
ited buildings should be considered as complex holistic systems
(Menges 2013; Salingaros 2000). In Figure 4, a building is decom-
posed into components which interact dynamically giving rise to
phenomena across scales. The components are related through hierarchies that span from socio-economic and cultural relationships down to geometrical form and the basic structure of materials.
2.3. Underlying assumptions of complexity
To be able to study complexity beyond the context of architec-
tural design, the association of complexity theory with a number
of underlying principles which are not considered within the
classic scientific paradigm has to be taken into account (Crutch-
field 1994; Dent 1999). Classic science is based upon the assump-
tions that (a) an entity can be divided into component parts
and that cumulative explanation of the parts and their relations
can fully explain the entity (reductionism), (b) phenomena can
be studied objectively (objectivity), which means that if differ-
ent observers look at the same phenomena in the same way
they will create similar descriptions and finally, (c) there is lin-
ear causality between phenomena, which means a cause leads
to one or multiple effects in a linear fashion from the initiation to
the finalization of a process (Dent 1999).
However, a number of 'younger' scientific disciplines such as IT,
GST and Cybernetics, which mainly appeared in the second half
of twentieth century and were not based upon the aforemen-
tioned assumptions, became the cornerstones of complexity
theory. They introduced an alternative paradigm to that of clas-
sic science, one that is non-deterministic and non-linear (Fischer
2014). Unlike the classical scientific approach, this new worldview is
based on three different fundamental assumptions: (a) an entity
can be best understood by considering it in its entirety (holism),
and it has characteristics that belong to the system (as a whole)
and do not belong to any of its parts, (b) the observer is not inde-
pendent from the phenomena and the observers’ experiences
add to the perceived reality (subjectivity) and (c) there exists cir-
cular causality (feedback), which means that cause and effect
in different phenomena is not always linear and that there is
dynamic (non-linear) exchange between action and experience
(Maturana and Varela 1987). In Figure 5, the term complexity is decomposed into multiple levels based on this worldview.
The seminal work of Alan Turing and J. von Neumann laid the foundation of this new worldview by relating complexity to the bulk of information exchange, which was defined as the length of the shortest algorithmic description for executing a given task (Turing 1936; Von Neumann 1951). Shannon formalized a general theory of communication as it relates to the amount of information exchanged between the feedback mechanisms of different systems for the accomplishment of a given task (Shannon 1948), and he is considered the father of Information Theory (IT).
Along this path, N. Wiener and R. Fisher viewed communication systems as stochastic processes (Fisher 1956; Wiener 1961) and helped define IT mathematically by introducing the concepts of disorder and entropy from thermodynamics. Stochastic processes have since been central in modelling and solving complex
problems with unknown structure and boundaries and there-
fore are of great interest for design exploration and evolutionary
computation (Oxman 2008).
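Shannon's measure can be made concrete with a minimal sketch (an illustration added here, not from the paper): the entropy H = -Σ p_i log2 p_i over a message's symbol frequencies quantifies, in bits per symbol, the uncertainty (disorder) the message carries.

```python
import math
from collections import Counter

def shannon_entropy(message):
    """Shannon entropy H = -sum(p * log2 p) over symbol frequencies, in bits/symbol.
    Written as sum(p * log2(1/p)) to avoid a negative zero for uniform messages."""
    counts = Counter(message)
    total = len(message)
    return sum((c / total) * math.log2(total / c) for c in counts.values())

print(shannon_entropy("aaaa"))  # 0.0 bits: a fully predictable message carries no information
print(shannon_entropy("abab"))  # 1.0 bit: two equiprobable symbols
print(shannon_entropy("abcd"))  # 2.0 bits: four equiprobable symbols
```

Higher entropy means more disorder and hence more bits needed, on average, to communicate each symbol.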
General Systems Theory (GST) acknowledged the similarity
of principles which apply to systems, regardless of the nature
of their parts or the relations and ‘forces’ between them (Von
Bertalanffy 1973). The particles of a gas in a container are a clear example of a physical system in this sense, while self-organized assemblies of organisms, such as a beehive, an anthill or a human community, can be considered typical examples of complex adaptive systems (Rapoport 1986). Different systems can be
characterized based on four fundamental components namely:
structure, behaviour, communication and hierarchy (levels of
organization) (Gerard 1958). Expanding on GST, Wiener introduced Cybernetics and focused on the relationships among a system's components (feedback) and the manipulation of the hierarchies (control) that exist within them, rather than analysing each of the components in isolation (Wiener 1961).
By embracing non-linearity via circular causality (feedback) and
by introducing concepts such as ‘forward looking search’ in sys-
tem design, cybernetics contributed to the holistic understand-
ing of complex natural phenomena (Holland 1992b).
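The circular causality described above can be sketched with a toy negative-feedback loop (a hypothetical illustration, not from the paper): a thermostat-like controller repeatedly senses the deviation from a setpoint and acts to reduce it, so cause and effect form a loop rather than a one-way chain.

```python
def thermostat(temp, setpoint=21.0, gain=0.3, steps=40):
    """Negative feedback: each cycle senses the error and acts to reduce it,
    driving the system toward equilibrium at the setpoint."""
    for _ in range(steps):
        error = setpoint - temp   # sense (the feedback signal)
        temp += gain * error      # act (control action proportional to the error)
    return temp

print(round(thermostat(5.0), 2))  # converges toward the 21.0 setpoint
```

With each cycle the remaining error shrinks by a factor of (1 - gain), so the equilibrium is reached exponentially fast; the gain value here is an arbitrary choice for the sketch.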
While the significance of GST is that it offered a conceptual
vehicle that acknowledges principles which apply to systems
in general, regardless of the nature of their components or the
relations and ‘forces’ between them (Fischer 2014), Cybernet-
ics offered a conceptual vehicle on which all systems may be
ordered and understood based on concepts such as behaviour,
feedback and hierarchy (Heylighen and Joslyn 2001).
2.4. Defining complexity
Based on the set of assumptions described in the previous
section, Weaver identifies three types of complexity: organized simplicity, disorganized complexity and organized complexity (Weaver 1948). In Figure 6, a taxonomy of the different classes and subclasses of complexity that have been surveyed is provided.
Organized simplicity applies mostly to 'designed physical systems' such as the ones engineers were modelling in the nineteenth century (i.e. the mechanical loom). Disorganized complexity applies to both physical and artificial systems, whose behaviour is almost impossible to predict (i.e. the motion of millions of particles in a gas container) (Klir 1985). Organized complexity is
Figure 4. Diagram illustrating the main characteristics of complex systems with regard to building design.
Figure 5. The term complexity disassembled into different levels based on where it applies (second level), where it arises (third and fourth levels) and key properties in addressing and managing it (fifth level).
encountered in complex adaptive systems (i.e. a beehive) and
designed abstract systems (i.e. a building) and is therefore inter-
esting to designers, architects and engineers. By surveying the
scientific and architecture literature, it is possible to distinguish
four main types of organized complexity which are typical in
such systems, namely structural (also organizational), probabilistic (also deterministic), algorithmic and computational (Schuh and Eversheim 2004; Suh 2005a, 2005b) (Figure 7).
Figure 6. A taxonomy of different complexity types based on whether they regard complexity as relative or as an absolute quantity. The taxonomy is built upon definitions from the fields of General Systems Theory, Cybernetics, Information Theory and Computer Science.
2.4.1. Complexity as an absolute quantity
In biology, a living organism can be classified as structurally com-
plex, because it has many different working parts, each formed
by variations in the implementation of the same genetic cod-
ing (Goldenfeld and Kadanoff 1999). If we consider an organism
as a system, probabilistic complexity is the sum of the interrela-
tionships, interactions and interconnectivity of parts within the
system and between the system and its environment (Stefan
Wrona 2001).
Based on the definition of a Turing Machine, Kolmogorov
defined algorithmic complexity as the length of the description provided to a computer system in order to complete a task (Kolmogorov 1965). This highly compressed description of
the regularities in the observed system, also called a ‘schema’,
can be used to define the complexity of a system or artificial
intelligence (AI) computing machine (Minsky 1961). Algorithmic
complexity is also called descriptive, or Kolmogorov, complexity in the literature, depending on the scientific community, and is defined as the universally shortest description of an object or process (Chaitin 1990). If we consider a computer with specific
hardware and software specifications, then algorithmic com-
plexity is defined as the length of the shortest program that
describes all the necessary steps for performing a process, i.e.
print a string. Algorithmic complexity in many cases fails to meet
our intuitive feeling about what is complex and what is not. For
instance, consider if we compare Aristotle’s works to an equally
long passage written by proverbial monkeys, the latter is likely to
be more random and therefore have much greater complexity.
Bennet introduced ‘logical depth’ as a way to extend algorithmic
complexity and averaged the number of steps over the rele-
vant programs using a natural weighting procedure that heavily
favours shorter program (Bennett 1995; Lloyd and Pagels 1988).
Suppose you want to do a task of trivial algorithmic complexity,
such as print message with only 0s, then the depth is very small.
But if the example with the random passage from the monkeys
is considered, the algorithmic complexity is really high, but the
depth is low, since the shortest program is ‘Print’ followed by
the input message in the form of a ‘string’ of characters. In the
field of mathematical problem solving, computational complex-
ity is defined as the difficulty of executing a task in terms of
computational resources (Cover and Thomas 2012) (Figure 8).
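The contrast between a regular message and a 'monkey-typed' one can be made tangible with a rough sketch (an illustrative assumption added here, not a method from the paper): since Kolmogorov complexity is uncomputable, a general-purpose compressor is often used as a crude upper bound on the length of a message's shortest description.

```python
import random
import zlib

def compressed_length(s):
    """Length in bytes of a zlib-compressed string: a crude, computable
    upper bound on algorithmic (Kolmogorov) complexity."""
    return len(zlib.compress(s.encode(), 9))

regular = "0" * 10_000  # trivial description: 'print 10,000 zeros'
random.seed(42)         # fixed seed so the sketch is reproducible
monkeys = "".join(random.choice("abcdefgh ") for _ in range(10_000))  # 'monkey-typed' text

print(compressed_length(regular))  # tiny: highly regular, very short description
print(compressed_length(monkeys))  # large: nearly incompressible, high algorithmic complexity
```

The compressor only approximates the shortest description from above, but the gap between the two results illustrates why a random passage has far greater algorithmic complexity than an equally long regular one.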
In computer science, computational complexity is the amount
of computational effort that goes into solving a decision
problem starting from a given problem formulation (Traub,
Włodzimierz Wasilkowski, and Woźniakowski 1983). Within this
classification, non-deterministic polynomial time complexity
(NP) is one of the most fundamental complexity classes and
is defined as the set of all decision problems for which the
instances where the answer is yes have efficiently verifiable
proofs of the fact that the answer is indeed 'yes' (Barton, Berwick, and Ristad 1987; Horgan 1995). In other words, computational complexity describes how the time required to solve a
problem using a currently known algorithm increases as the size
of the problem increases. Depending on that relationship, prob-
lems are classified as Polynomial Time (P), Non-Deterministic
Polynomial Time (NP), NP-Complete or NP-Hard, which describes
ARCHITECTURAL SCIENCE REVIEW 9
Figure 7. The design to construction process as it relates to complexity and to a formal design model which includes iterative cycles between: synthesis, analysis and
evaluation (above). The design cycles, decomposed into four different domains. The [UA], [FR], [DP], [PP] are the characteristic vectors of each domain. During the design
process, we go from the domain on the left to the one on the right. The process is iterative in the sense that as the designers go over the stages of synthesis-analysis and
evaluation they can go back to the domain on the left based on ideas generated in the right domain (below).
whether a problem can be solved and how quickly. For NP-
complete problems for instance, although a solution can be
verified as correct there is no known way to solve the problem
efficiently (Cobham 1965). A famous example of a NP-complete
problem is that of a travelling sales person (TSP) which was for-
mulated in the 1800s by W.R. Hamilton and states: Given a list
of cities and the distances between each pair of cities, what is
the shortest possible route that visits each city and returns to the
origin city? (Rosenkrantz, Stearns, and Lewis 1974).
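A minimal brute-force sketch of the TSP makes the NP asymmetry concrete: finding the best tour examines (n−1)! orderings, while checking a proposed tour against a bound is linear in n. The coordinates below are invented for illustration:

```python
from itertools import permutations
from math import dist

# Brute-force TSP: solving examines (n-1)! tours, yet verifying that a
# proposed tour beats a bound takes only O(n): the asymmetry behind NP.
cities = [(0, 0), (2, 1), (1, 4), (4, 3), (3, 0)]  # invented coordinates

def tour_length(order):
    """Total length of the closed tour visiting cities in the given order."""
    return sum(dist(cities[a], cities[b])
               for a, b in zip(order, order[1:] + order[:1]))

def solve_brute_force():
    rest = range(1, len(cities))
    return min(([0] + list(p) for p in permutations(rest)), key=tour_length)

best = solve_brute_force()                                # 4! = 24 candidate tours
print(best, round(tour_length(best), 2))
assert tour_length(best) <= tour_length([0, 1, 2, 3, 4])  # verification is cheap
```

At five cities the enumeration is trivial; at fifty it is already astronomically beyond reach, which is exactly why the problem is classed as NP-hard.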
2.4.2. Complexity as a relative quantity
Mitchell defined architectural complexity in a digital context as
the ratio of added design content to the added construction con-
tent (Mitchell 2005). In Mitchell’s definition, design content is
defined as the joint product of the information already encoded
in a computer-aided design system and the information added,
in response to conditions and requirements of the context at
hand, by the designer. The construction content of a building
is defined by Mitchell as the length of the sequence that starts
with the fabrication description of a component and ends with
the assembly of the whole building (Mitchell 1990). Per the def-
initions in Section 2.4.1, Mitchell’s definition overlaps with that
of algorithmic complexity. Design content refers to the length of
the description necessary for describing to a computer system a
set of instructions to create a 3d geometry. Construction content
refers to the length of the description necessary to generate toolpaths
for the fabrication and on-site assembly of the design content.
In Mitchell’s definition, the designer, by operating a Computer-
Aided Design (CAD) system, handles the complexity of defining
the architectural shape; Mitchell’s definition therefore does
not appropriately capture the computational complexity of
coming up with ‘a design’ (i.e. the architect’s decision making
during the design process).
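Mitchell's ratio can be illustrated, very crudely, by proxying both contents with description lengths in characters. Both strings below are invented placeholders, not Mitchell's own formulation:

```python
# Mitchell's complexity ratio, proxied (as a crude assumption of this sketch)
# by description lengths: added design content over added construction content.
added_design_content = "for i in 0..39: place panel, rotate i*2.5 deg"
added_construction_content = "\n".join(
    f"fabricate panel {i}; rotate {i * 2.5} deg; hoist; fix" for i in range(40))

complexity_ratio = len(added_design_content) / len(added_construction_content)
print(round(complexity_ratio, 3))  # << 1: a short rule drives much construction
```

A low ratio here corresponds to a design whose construction effort dwarfs the design effort encoded in the CAD description.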
In engineering, Suh introduced axiomatic design and divided
complexity into two domains, namely functional and physical.
The functional domain includes a set of constraints, attributes
Figure 8. Diagram illustrating different types of complexity (architectural, engineering, algorithmic, etc.) mapped onto the different phases of a design to construction
process.
and desires coming from the user, expressed as a set of functional
requirements that a design object needs to fulfil. The physical
domain includes a set of design parameters and a set of
fabrication and construction processes (Suh 1990; Suh 2005b, 2005a).
In the physical domain, the complexity of an object is related to
the coupling of design parameters and available construction
processes and therefore can be
described as an absolute quantity. Within the functional domain,
complexity is regarded as a measure of uncertainty in achiev-
ing a set of goals defined by a set of functional requirements
which are coupled with a set of design parameters. Accord-
ing to Suh, a design is considered complex when its proba-
bility of success is low: that is when the information content
required to satisfy a number of functional requirements by a
number of design parameters is high. With the definition of
engineering complexity, Suh provided a tool to view the complexity
of designed and engineered systems from a scientific rather than a
purely empirical standpoint, and aimed at creating a higher level
of abstraction in order to enable designers to synthesize and
operate complex systems without making them overly complex
per se.
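Suh's information-content measure can be sketched numerically: the information content of satisfying a functional requirement with probability p is I = log2(1/p), and for independent requirements the contents add, so a low joint probability of success means a complex design. The requirement names and probabilities below are invented:

```python
from math import log2, prod

# Suh's measure: information content I = log2(1/p), where p is the probability
# of satisfying a functional requirement; for independent requirements the
# contents add up, so low joint probability of success = high complexity.
def information_content(p_success: float) -> float:
    return log2(1.0 / p_success)

# Invented functional requirements and success probabilities:
fr_probabilities = {"daylighting": 0.9, "acoustics": 0.7, "clear_span": 0.5}
total_bits = sum(information_content(p) for p in fr_probabilities.values())
joint_p = prod(fr_probabilities.values())
print(round(total_bits, 3), round(joint_p, 3))  # more bits, less likely success
```

Note how the additivity in bits matches the multiplicativity of the joint probability: total_bits equals information_content(joint_p) up to floating-point error.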
Lastly, in the field of construction, complexity is divided into
technological and organizational complexity and is a function
of the size and uncertainty of the project (Baccarini 1996).
The two types of complexity are classified in terms of differ-
entiation and interdependency and thus organizational com-
plexity by differentiation refers to the number and diversity
of parts involved in a construction process, while organiza-
tional complexity by interdependency refers to the degree of
interactions between a given project’s elements (Morris and
Hough 1987). Technological complexity by differentiation refers
to the range of construction tasks while by interdependency
refers to relationships between a network of regulations and
tasks, teams, technologies and construction activities.
2.5. Measures of complexity
As can be observed in the section above, contemporary
researchers in information theory, biology, engineering and
computer science have separately developed different definitions
and measures of complexity, yet there is also considerable overlap,
as they were asking the same questions about complexity
within their own disciplines. By reviewing the literature, we
can conclude that the most frequent questions that appear in
the literature across disciplines for quantifying complexity of an
object, an organism, a problem, a process or even an investment
are the following:
(1) How hard is it to describe?
(2) How hard is it to create?
(3) What is its degree of organization?
The difficulty of description (i.e. logical depth) can be mea-
sured in bits (i.e. effective complexity) while the difficulty of cre-
ation (i.e. design content) can be measured in time and energy
(i.e. entropy). Entropy can be considered as a measure of ran-
domness of operations in the design process and can be used
for measuring the potential of generative design systems to
generate novel solutions based on a set of design decisions
(Gero and Sosa 2008). Lastly the difficulty of organization can be
subdivided into two groups: one which measures the difficulty of
describing an organizational structure and another which relates
to the amount of information shared between the parts of a system
as the result of its structure. For an extensive mathematical
analysis of complexity measures, we point the reader to the work
of the Santa Fe Institute (SFI) (Gell-Mann and Lloyd 1996; Levin 1976).
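The entropy measure mentioned above can be sketched for a hypothetical generative design system: a system fixated on one outcome has zero entropy, while one that spreads its decisions evenly over several schemes has maximal entropy. The scheme names are invented:

```python
from collections import Counter
from math import log2

# Shannon entropy over the outcomes of a (hypothetical) generative design
# system: the more evenly its decisions spread over distinct schemes, the
# higher the entropy, i.e. the greater its generative potential.
def shannon_entropy(outcomes) -> float:
    counts = Counter(outcomes)
    n = len(outcomes)
    return 0.0 - sum((c / n) * log2(c / n) for c in counts.values())

fixated = ["scheme_A"] * 8                                     # no exploration
diverse = ["scheme_A", "scheme_B", "scheme_C", "scheme_D"] * 2
print(shannon_entropy(fixated))  # 0.0 bits
print(shannon_entropy(diverse))  # 2.0 bits: four equally likely schemes
```

This is one concrete way to read Gero and Sosa's use of entropy as the potential of a generative system to produce novel solutions.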
3. Complexity in architecture, engineering and
construction (AEC)
Admittedly, the design, construction and management of a
building is indeed a convoluted problem involving multiple
disciplines, and it is therefore hard to come up with an absolute
definition, as well as a measure, of complexity. Following W.
Mitchell’s and N. Suh’s definitions, design complexity in AEC will
be examined under the scope of two domains. The former is
the virtual/functional domain, which is directly related to two
levels of complexity: the complexity of design problems and the
complexity of design processes. The latter is the real/physical
domain, which is directly related to building construction
(fabrication and on-site or off-site assembly) and Building Sys-
tems Integration (BSI).
3.1. Decomposing complexity in the AEC
We will consider the design to construction process holistically
and will investigate how complexity arises within these subtopics
and possible ways in which it could be addressed. In doing so, we aim to
clarify architectural complexity and translate achievements from
other fields for design purposes.
3.1.1. The complexity of design problems
In design, as well as in science, given a specific problem one
has to deal with many interconnected variables, often deriving
from functional requirements. In contrast to science, in the
design world problems are ‘wicked’, which means that there exists
no clear formulation containing all the information the problem-solving
mechanism needs for understanding and solving the
problem (Rittel and Webber 1973). Unfortunately, there is a lack
of clarity in many of these parts, which increases the complexity of
such problems. Through the act of designing, architects
and designers face two different aspects of complexity (Glanville
2007,2001). One aspect relates to the lack of complete informa-
tion for the design problem, which makes the formulation of a
universal design solution difficult (Suh 2005a). The other aspect
relates to the fact that the target is to create something new,
which means that the solution is not specified. The paradox in
the field of architecture is that if we have yet to specify new and
innovative design procedures, how can the resulting outcomes
be innovative?
Architectural design has a long history of addressing complex
programmatic requirements through a series of steps without
a specific design target (Terzidis 2006). Unlike other fields such
as engineering, where the target is to solve a particular prob-
lem in the best possible way, architectural design problems,
because of this novelty aspect, are open-ended, in flux, and
uncertain, and therefore ill-structured (Rittel and Webber 1973). For
instance, the task of designing a house leans towards the side
of ill-structured problems and NP-Hard problems; the amount
of uncertainty involved makes the specification of the prob-
lem hard and thus the solution becomes complex (Simon 1977).
Simon supported the idea that the degree of complexity of any
given problem critically depends on the description of the prob-
lem. Holland described optimization problems in domains as
broad and diverse as ecology, evolution, psychology, economic
planning as well as artificial intelligence and by abstracting from
the specific field he examined commonalities relating only to
complexity and uncertainty of such problems (Frazer 1995;
Holland 1992a). Although designing a house is hard to describe
even as a multi-objective optimization problem, the design
methodology required to approach such a problem using com-
putational means, can share common features of adaptation and
self-organization with an optimization problem in biology such
as the construction of ant hills (Theraulaz and Bonabeau 1995).
3.1.2. Complexity in design process
Although there is no well-formulated consensus model of the
design process in architecture, a typical model has emerged
in which the following features take place: (a) the assumption
that most design problems are ill-defined (wicked) problems
by definition, (b) the recognition of the importance of pre-
structures, presuppositions or proto-models as the origins of
solution concepts, (c) the emphasis on a conjecture–analysis
cycle in which the designer and the other participants refine
their understanding of both the solution and the problem in
parallel, and finally (d) the display of the essential spiral and non-
linear characteristics (Cross and Roozenburg 1992). Despite the
fact that the use of computational tools offers an opportunity
to formalize design process there are no formal architectural
design methods that follow the above model in a systematic
way.
The design process can be considered as one in which
the designer (i.e. architect, engineer) navigates through an ill-
defined problem domain and employs various strategies to elaborate
the problem description. She then iteratively generates
and evaluates design alternatives and, after a number of iterations,
i.e. when given a time constraint, proposes a solution
(Gero 1996). In computational terms, the design process
can be described as a purposeful (not random), constrained,
decision making, exploratory and learning activity. Decision
making implies a set of variables that relate to the problem
definition and context. Search is the common process used in
decision making. Exploration here is akin to changing the prob-
lem space within which the decision making occurs. Learning
implies the restructuring of knowledge based on the
presupposition–conjecture–analysis–evaluation cycle (Gero 2000).
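The loop just described can be caricatured in a few lines: a purposeful, constrained generate-and-evaluate cycle that stops at a time budget and proposes the best alternative found so far. The objective (floor area close to 120 m²) and the parameter bounds are invented for this sketch:

```python
import random

# The design process as a computational caricature: a purposeful, constrained,
# iterative generate-and-evaluate loop that stops at a time budget and
# proposes the best alternative found so far.
random.seed(1)
problem_space = {"width": (3.0, 12.0), "depth": (3.0, 12.0)}  # invented bounds

def generate():                                   # conjecture
    return {k: random.uniform(lo, hi) for k, (lo, hi) in problem_space.items()}

def evaluate(design):                             # analysis: invented objective
    return abs(design["width"] * design["depth"] - 120.0)

best = None
for _ in range(200):                              # the "time constraint"
    candidate = generate()
    if best is None or evaluate(candidate) < evaluate(best):
        best = candidate                          # evaluation refines the proposal
print({k: round(v, 2) for k, v in best.items()}, round(evaluate(best), 3))
```

Real design processes also restructure the problem space itself between iterations (exploration and learning), which this fixed-space sketch deliberately omits.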
The ill-structured nature of the design problems, the existence of
changing contextual factors and the engagement of the human
factor do not allow the clear definition of the solution space to
be explored and therefore increase the complexity of the design
process. Non-linearity and the amount of interconnected design
parameters between the conjecture-analysis cycles also increase
the complexity. In an attempt to improve the latter, research
and professional practice have focused more on automating
traditional ‘manual’ methods of production using computer-
aided design and algorithmic design tools (Gero 1996; Scheurer
2007). Current parametric design systems have facilitated the
design and management of non-standard geometries and at
first sight seem to reduce complexities of the design process,
at least in terms of algorithmic complexity. This is easily mea-
sured if we consider that the printout of a code for a parametric
model together with a table of all the parameter sets is much
shorter than all the workshop drawings (Scheurer 2010). How-
ever, there are complexities relating to the description of the
problem and the definition of efficient design strategies which
remain largely unresolved. For instance, in biological systems the
blueprint of an organism, that is its genetic code, is considered
as a set of instructions which depend on an environmental
context for their interpretation and manifestation and is
subject to evolution and adaptation. In architecture, although
digital design tools were developed to streamline the production
of the blueprints of buildings, less focus has been put on
formalizing the encoding process. Although architectural design
processes have been computer-based for over 20 years, only
recently has there been rigorous research towards the adaptation
of computational methods for design exploration (Von
Bülow 2007). In order to leverage the power of computation,
more emphasis should be put on how design abstractions can be
formally described to computers algorithmically so that similar
to biology, evolutionary and learning mechanisms can be used
in order to extend the cognitive capacity of designers and there-
fore explore new design schemes or evolve existing ones based
on previous knowledge and/or experience.
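Scheurer's comparison, cited above, can be sketched as description lengths: the parametric recipe plus its parameter set stays constant, while the explicit listing it generates (standing in for the workshop drawings) grows with the element count. The recipe string and all numbers are invented:

```python
# Scheurer's point made concrete: a parametric description (recipe + parameter
# set) stays short, while the explicit output it generates grows with the
# number of elements.  Recipe and parameters are invented for the sketch.
recipe = "panel(i) = base rotated by i * twist, for i in 0..n_panels-1"
parameters = {"n_panels": 500, "twist": 1.2}

explicit_listing = [
    f"panel {i}: rotated {i * parameters['twist']:.1f} deg"
    for i in range(parameters["n_panels"])]

parametric_description = len(recipe) + len(str(parameters))
explicit_description = sum(len(line) for line in explicit_listing)
print(parametric_description, explicit_description)  # gap widens with n_panels
```

Doubling n_panels doubles the explicit description but leaves the parametric one essentially untouched, which is the sense in which parametric systems reduce algorithmic complexity.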
3.1.3. Complexity in construction
Construction projects are invariably complex and are becoming
increasingly more so, due to the fragmented nature of the industry
and its capability to both generate and collect a large amount of
data (Bennett 1991; Soibelman et al. 2008). Building construction
is typically characterized by the engagement of multiple, sepa-
rate and diverse organizations such as architects, engineers, con-
sultants and contractors for a finite period of time. On a higher
level, organizational complexity in a project is a function of the
project’s size and its uncertainty and increases when the number
and level of differentiation of all the contributing organizations
increase (Beyer and Trice 1979). The differentiation can be either
vertical, referring to the level of detail the activities of a project
might entail, or horizontal. Horizontal differentiation refers to the
number of formal units such as departments, groups and so on
involved or to the way the tasks are structured in terms of labour
subdivision and the level of specialization required for each task
(Gidado 1993). For instance, the organizational complexity of a
project can increase if the number of different occupational spe-
cializations utilized to accomplish a project is high, or when
specialists are working at different times during the project
life cycle and/or at geographically separated offices. The
dynamic and distributed character of the
construction environment increases the complexity as a result of
the required amount and types of information exchange among
all contributing parts (i.e. designers, engineers, contractors)
(Teicholz 2000). The multitude of different disciplines, the lack of
integrated frameworks as well as the reliance on classification
methods conducted by human protocols hinders this communi-
cation exchange and has caused inefficiencies, project cost and
time overruns. Furthermore, the quality and maintainability are
reduced, design intent is diminished, and the efficient access of
objects and information in a timely manner is prohibited (Caldas
and Soibelman 2003; Halfawy and Froese 2005).
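The organizational side might be sketched numerically, with differentiation as the number of contributing units and interdependency, as a simplifying assumption of this sketch, as the density of their dependency graph. The units and links below are invented:

```python
# Baccarini's two components sketched: differentiation = number/diversity of
# contributing units; interdependency = degree of interaction, taken here (a
# simplifying assumption) as the density of a dependency graph.
units = ["architect", "structural", "MEP", "facade", "contractor"]  # invented
dependencies = {("architect", "structural"), ("architect", "MEP"),
                ("architect", "facade"), ("structural", "contractor"),
                ("facade", "contractor"), ("MEP", "contractor")}

differentiation = len(units)
max_links = differentiation * (differentiation - 1) // 2
interdependency = len(dependencies) / max_links
print(differentiation, interdependency)  # 5 units, 6 of 10 possible links
```

Adding a unit raises differentiation linearly but grows the possible interactions quadratically, which is one reason project complexity scales so sharply with team size.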
On a lower level, complexity in construction occurs when
dealing with the fabrication of non-standard geometries and
non-repetitive assembly methods. The fabrication process relates
to the manipulation of raw material for the production of discrete
elements, while the assembly process refers to the combination
of discrete elements into systems (Mitchell 2005). Complexity
in fabrication can be described as the length of the translation
of a specific geometry or shape description into a sequence
of instructions for a Computer Numerically Controlled (CNC)
machine or a robotic arm that will fabricate such geometry. Addi-
tionally, complexity in the assembly process can be described as
the number and diversity of steps required to combine discrete
fabricated elements into a structure.
Consequently, if expressed in terms of algorithmic complex-
ity, the number and descriptive intricacy of elements and steps
for their fabrication and assembly increase construction com-
plexity. This can be easily measured for off-site construction
processes. However, the non-linear character and uncertainty of
the on-site construction process, due to the dynamically changing
environment of the construction site and errors which appear in
the construction process cannot be easily described in terms of
algorithmic complexity but can be better described relatively in
terms of entropy.
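Read as description length, fabrication complexity might be sketched as follows, using an invented pseudo-instruction vocabulary: a repetitive geometry collapses into a loop, while a bespoke one needs a distinct instruction per element:

```python
# Fabrication complexity read as description length, with an invented
# pseudo-instruction vocabulary: 50 identical panels collapse into a loop,
# 50 bespoke panels need one distinct instruction each.
repetitive = [("cut", 1200, 600)] * 50
bespoke = [("cut", 1200 + 7 * i, 600 - 3 * i) for i in range(50)]

def instruction_description(instructions) -> str:
    """Shortest obvious description: a loop if every instruction is identical."""
    unique = set(instructions)
    if len(unique) == 1:
        return f"repeat {len(instructions)}x: {next(iter(unique))}"
    return "; ".join(map(str, instructions))

print(len(instruction_description(repetitive)))  # short: one instruction + count
print(len(instruction_description(bespoke)))     # grows with the element count
```

This only captures the off-site, deterministic part; the entropy-like uncertainty of the on-site process noted above is outside what a fixed instruction list can express.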
3.1.4. Extending the definition of architectural design
complexity
Based on the types of complexity described above, it is obvi-
ous that in order to encapsulate the complexity of architec-
tural design holistically in a digital context it is important to
consider the definition of architectural complexity beyond the
realm of architecture. Thus Mitchell’s terms of design and
construction content are decomposed in more detail and are
extended with concepts from engineering. We adopt Suh’s concept
of dividing complexity into the functional and physical domains
and integrate it with Mitchell’s definition of complexity. Following
Suh’s definition of engineering complexity, the design content
lies in the virtual/functional domain and can further be
subdivided into architectural design and engineering
design content. Architectural design content includes construct-
ing a design model which combines constraints, environmental
conditions and design intentions (design approach) put forward
by the occupants/stakeholders/decision makers with a set of
functional requirements (FRs). This design model maps a set of
constraints and attributes to the set of functional requirements
and couples them with a set of design parameters by consider-
ing process variables. The engineering content includes finding
the shortest description for coupling the functional parameters
with design parameters by considering process variables. The
complexity of construction content lies in the physical domain
because it is the mapping of design parameters to process vari-
ables such as available resources (material), building technology,
construction activities, time and cost.
The complexity of construction content is extended to not
only include the number of operations necessary to realize a
design but also to encompass: (a) the level of differentiation of tasks,
(b) the interdependency among the function of the tasks, (c)
the degree of labour skills each task requires and (d) the level
of uncertainty to complete a task. Risk and uncertainty of how
design parameters are coupled with a set of process variables
(building paradigm) can be considered as a measure of complexity
in the physical domain. The complexity of the design content
in the virtual domain lies in (a) the amount of design decisions
required to formulate a design model and (b) the uncertainty
of satisfying the functional requirements given a set of design
parameters.
3.2. Tools for handling complexity
The survey of the literature across different disciplines indicates
that the main research tools for managing complexity include (a)
abstraction, (b) modularity and (c) the idea of scalability. Abstraction
can be used as a tool to handle data by treating them
as generic entities that we can compare, encapsulate and draw
generalizations upon. Modularity can be used as a concept
which enables the development of functionally specific components
that are specialized in solving particular problem aspects.
Lastly, multi-scalability is a concept that allows one to formally
express features and principles (rules) that may be present across
different levels but have a completely different effect according
to the specificities of each scale. For instance, what can be a
fundamental rule at one scale may, at a much larger scale, reveal
itself to be a frozen accident. In order to emphasize the
importance of abstraction, modularity and scalability we con-
sider an example from physics by Ferreira (Ferreira 2001). Sup-
pose you want to describe the number of particles of a given
entity with a mathematical function
\left\{-\sum_{i}\frac{\hbar^{2}\nabla_{i}^{2}}{2m_{e}}-\sum_{j}\frac{\hbar^{2}\nabla_{j}^{2}}{2m_{n}}+\frac{e^{2}}{4\pi\varepsilon_{0}}\sum_{i_{1}<i_{2}}\frac{1}{|r_{i_{1}}-r_{i_{2}}|}+\frac{z^{2}e^{2}}{4\pi\varepsilon_{0}}\sum_{j_{1}<j_{2}}\frac{1}{|R_{j_{1}}-R_{j_{2}}|}-\frac{ze^{2}}{4\pi\varepsilon_{0}}\sum_{i,j}\frac{1}{|r_{i}-R_{j}|}\right\}\psi=E\psi
In the above equation, which describes matter at an atomic
level, i (and j), which can vary between 1 and 10^20, represent the
number of particles in a human body. Thus it is complicated
to solve this equation for ψ, the wave function for the particles.
It is already impossible to solve this equation analytically
for the helium atom (i = 2 and j = 1). So how can we proceed
when faced with such problems? Although unable to solve
this equation analytically, physicists use abstraction to be able
to explain the dynamics of larger particles. They position them-
selves at different levels, for example, at the molecular level
and develop models from there, considering the characteris-
tics of molecules that they are able to observe, as mass, charge,
poles, etc. Equations and simulation models in physics are often
solved by approximation using large amounts of computer
resources and power. With the introduction of computation,
physicists were able to computationally simulate and predict
molecular behaviours which could not be observed otherwise.
In many scientific disciplines, computation has been considered
as an exploratory rather than a neutral tool and has advanced
their mathematization (Kotnik 2010). However, in architecture,
despite the widespread use of digital design models and the
capacity to develop complex building models across scales,
digital building models are still not used to predict occupant
behaviours and building performance, but rather serve as descriptive
models used to represent and communicate an idea more
easily and quickly, instead of substantially affecting current design
thinking.
Up until the early 2000s, the majority of digital design tools
merely automated and mechanized data handling within the
design process. Recently we have seen the introduction of
computational tools that promote design exploration and attempt
to extend the designer’s intellect by correlating data in novel
ways (Von Bülow 2007). The generalization
of geometry via computational tools and methods
requires a higher level of formalization of design thinking but
also provides new forms of creative expression. Suh, who
introduced the axiomatic design approach, argues that via design
formalization and the systematic incorporation of scientific
principles in design, there is the potential to inform empiricism and
intuition in design and to evaluate the complexity of a design
problem holistically (Suh 1990). Kotnik states that the consideration
of digital design as computable functions offers an opportunity
to systematize design knowledge and compare existing meth-
ods in design (Kotnik 2010). This is because mathematical
functions make the relation between cause and effect explicit, and
therefore connecting methods of digital design with the con-
cept of computational functions offers a platform for directly
transferring formal mathematical concepts into architecture.
However, there are two sides regarding this transfer of math-
ematical concepts in contemporary digital practices. One side
is connected with performance-based design techniques which
are directly related to optimization problems. This side, although
quite prevalent nowadays, can easily narrow design thinking
down to the parametric manipulation of optimization routines. The
other side of this transfer of mathematical concepts lies in the
very idea of the algorithmic description and offers the possi-
bility of precisely controlling the relations between the functional
requirements and design parameters of architectural elements
in unique ways. The formalization of design thinking can-
not replace the design process but can act as a framework for
exchanging knowledge between fields of science and design as
well as for a more systemic examination of contemporary design
practices.
The tools for managing complexity outlined in this section
offer a high-level framework for managing the multiple levels
of complexity that are included in building design. Following
the axiomatic design approach, design can be broadly defined
as the creation of a synthesized solution which satisfies a set of
perceived needs through a mapping process between functional
requirements, which exist in the functional domain, and
design parameters, which exist in the physical domain. Through
this perspective, concepts such as self-organization, autonomy,
topology, holism and entropy from the field of complexity the-
ory can be used to correlate the multiple levels of complexity
across the different fields of AEC.
4. A holistic digital design approach for managing
building complexity
The analysis in the previous sections shows that the complex-
ity of creating a building design description (added design
content) lies in the functional domain and is conditioned by
the (a) time, (b) information exchange (bits) and (c) energy (i.e.
entropy) required to come up with a design proposal and (d)
level of uncertainty (i.e. % probability) of fulfilling a set of design
goals defined by functional requirements and selected design
parameters. On the other hand, the complexity of constructing a
given design description (construction content) lies in the phys-
ical domain and can be measured as an absolute quantity with
field specific dimensions. Thus the construction content defined
by Mitchell is extended to not only include the (a) description
of a sequence of fabrication and construction activities, but
also (b) the range of construction activities, (c) the level inter-
dependency between activities and existing resources, (d) the
technological sophistication required and (e) the level of risk in
delivering it.
By embracing design complexity at multiple levels, and particularly
in both the physical and functional domains, we can deter-
mine the complexity of a design to construction process by con-
sidering not only geometric complexity and available technolo-
gies (i.e. construction methods, digital fabrication) but also envi-
ronmental parameters (location, orientation, azimuth), building
performance data (heating and cooling loads), dynamic user
behaviour (space occupancy) and specific construction tasks.
Practical application of complexity theory may be found in many
disparate disciplines and has included modelling approaches
with both systemic scientific and utilitarian objectives. It is not
operationally meaningful to view complexity as an intrinsic
property of an object (i.e. a building); instead it is better to
consider it holistically and to assume that the level of complexity
arises from, or exists in, abstractions of the world. In the field of
software engineering, practitioners have managed to deal with complex
problems and new technology by breaking complexity down into
structured (computable) situations.
4.1. Complexity and complex adaptive systems (CAS)
One of the most important characteristics of complex non-linear
systems is that they cannot be successfully analysed by deter-
mining in advance a set of components and properties that can
be studied in isolation linearly. Instead it is necessary to con-
sider the system as a whole, even if that means taking a crude
look, and then allow possible simplifications to emerge from the
dynamic interactions of the elements. This makes it obvious that
the study of CAS has a lot in common with the design process.
CAS are particularly interesting for managing design complexity
because it has been shown that a model or schema in relatively
few dimensions explores a gigantic strategy space, far
from any optimum or equilibrium. Think, for example, of a
computer learning to play chess. In the not so distant past, chess was
an unsolved problem, as ‘go’ largely still is today, and adaptive
learning was necessary for chess, since adaptation was not completed.
Reinforcement learning models have nowadays made great progress in
developing computer ‘go’ players, and similar approaches
could be helpful for design problems as well. To
better understand how CAS can be utilized in the design pro-
cess, we need to relate CAS to artificial systems which do not
share the same properties. How does a complex system operate?
How does it engage in passive learning about its environment,
in prediction of the future impacts of the environment, and in
prediction of how the environment will react to its behaviour?
One may ask how CAS differs from a system like turbulent
flow in a fluid, a complex phenomenon but not one that is likely
to be adaptive. Yet in the turbulent flow there are eddies that
give rise to smaller eddies and so on, and certain eddies that have
properties which allow them to survive in the flow and have off
springs, while others do not and die out. Why is such turbulent
flow not regarded as an evolutionary system?
The answer lies in the way information about the environ-
ment is recorded. In CAS, it is not merely listed in what com-
puter scientists call a look-up table. Instead, the regularities of
the experience are encapsulated in highly compressed form
as a model or theory (i.e. schema). Such a schema is usually
approximate, sometimes wrong, but it may be adaptive if it can
make useful predictions including interpolations and extrapola-
tions and sometimes generalizations to situations very different
from those previously encountered. It is important to note that
the adaptive process need not always be extremely effective
in achieving apparent success at the level of prediction and
behaviour, which corresponds in biological evolution to the
‘phenotypic’ level of the organism as opposed to the ‘genotypic’ level
of the DNA. In computational design research and academic cir-
cles today, most researchers have yet to take a ‘crude look at the
whole’. Instead a lot of computational design research is focused
on specialization (i.e. performance-based design, robotics) and it
is taken for granted that serious work can be done only by looking
at one or a few aspects of a building (i.e. geometry). Yet every
architect needs to make decisions as if all aspects of a design sce-
nario have been considered with all interactions among them
(design engineering, construction). However, if there are only
disconnected specialists to consult, then the collation of their results does not always yield a fair picture of the whole, and the design outcome is not always coherent. If the complexity
that arises from the interaction of these specializations itself is
not managed, the potential of digital media and computation is
not fully leveraged.
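The distinction between a look-up table and a compressed schema can be illustrated with a toy sketch (the data and function names here are invented purely for illustration, not drawn from the paper): an agent that memorizes its experience verbatim can only answer for situations it has already seen, whereas an agent that compresses the regularities of its experience into an approximate model can interpolate and extrapolate, sometimes wrongly.

```python
# Toy illustration: look-up table vs. compressed schema (illustrative only).
# Experience: noisy observations of some environmental regularity.
experience = {0.0: 0.1, 1.0: 2.1, 2.0: 3.9, 3.0: 6.2}  # situation -> outcome

def lookup_agent(x):
    """Memorizes experience verbatim; fails on any unseen situation."""
    return experience.get(x)  # None if this exact situation was never seen

def fit_schema(data):
    """Compress experience into an approximate linear model y ~ a*x + b
    (least squares): two numbers stand in for the whole table."""
    n = len(data)
    mx = sum(data) / n                      # mean of situations (dict keys)
    my = sum(data.values()) / n             # mean of outcomes
    a = sum((x - mx) * (y - my) for x, y in data.items()) / \
        sum((x - mx) ** 2 for x in data)
    b = my - a * mx
    return lambda x: a * x + b

schema_agent = fit_schema(experience)

print(lookup_agent(1.5))       # prints None: situation never encountered
print(schema_agent(1.5))       # interpolation from the compressed model
print(schema_agent(10.0))      # extrapolation, approximate and possibly wrong
```

The schema is "sometimes wrong" in exactly the sense the text describes: it trades exact recall for the ability to make useful predictions about situations outside the recorded experience.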
4.2. Towards new kinds of abstractions - 'A building as a fine-tuned orchestra'
To clarify the above statements, let us consider an example of
how the principles of CAS mentioned above, together with tools such as abstraction and modularity, can come together to
produce coherent design solutions. For reasons of simplicity, we
will draw a metaphor between delivering a building and delivering a symphonic concert. Let us consider, for instance, an orchestra, viewed with our 'abstraction hats' on (see Figure 9). The orchestra is an instance of
hierarchy and abstraction. It is analogous to many things we see
every day, including IT systems, games, factories, cars or build-
ings and cities. They all have a common thread: collections and layers of rules, specifications, archetypes and abstractions. In an orchestra, the conductor leads the system.
She is not playing an instrument, but she is directing the
events and acting as quality control. She is the highest level in
a hierarchy of abstractions. If we look at the musical score, we see embedded instructions, much as in drawings. The notes are written,
but they are not the sounds.
Figure 9. A diagram showing how structural hierarchies and different layers of abstraction can come together to produce a complex outcome (i.e. a concert, a building).
They encapsulate the rules of the music: the sequence, pitch, volume and timing. But nothing happens until we get to the executive layer: the individual musicians
and their instruments. The score contains high-level sequential
instructions necessary to execute in specific ways the lower level
instructions required to get the intended result. The musicians
do that job and are performing a role analogous to the designers.
The musician (designers) will execute parts (automation) within
the score (high-level commands) on a specific instrument (oper-
ating system/design tool). The musician translates the encoded
information into sounds (instrument/tool) necessary to have
the audience appreciate the outcome (business function), all
managed by the top layer (orchestration) and intended by the
composer (IT service owner) to please the audience (business
owner). These abstractions enable useful hierarchies. Abstrac-
tions use interfaces between layers and are based on trusted
behaviours. The system works because each layer is aware of the
rules agreed between the adjacent layers. There is a common
language between those layers: music. A transformation hap-
pens within each layer when something more specific is added.
If the conductor had to play all the instruments herself, producing the outcome would be impossible. Here we see specialization and
abstraction work to form a useful hierarchy, from general case
to specific. There are technical elements, encoding elements and
transformational elements. Without these structures, we would
not have an orchestra. Abstract layers link together via a com-
mon language and internally take care of translation. While we
are all aware of this example on the surface, it is worth examin-
ing it in the light of our new awareness. This idea is as natural
to us as breathing, but the sophistication of the abstractions
would not have happened without recognizing the incremen-
tal disciplines involved. Humans built this musical result after
years of evolved thinking. It is an example where far more has
been realized because of imposed structural rules. It has yielded
far greater returns than if we simply sat the entire group of
musicians in a room and shouted, ‘Play!’. Instead, consider if
we sat the musicians down and had them agree to a set of
operating principles over an agreed common target. Then we
filled the agreed structure with the rules of musical notation
(software) that we enabled and tested a priori, and we executed
the sequence within them. The result is robust, recognizable
and repeatable and can deliver far more impact than random
sound, or even one man trying to play each instrument individ-
ually. If we draw an analogy between the delivery of an opera and a building project, we observe that many complexities and issues arise in the latter because the 'musicians' of the construction industry (i.e. architects, engineers and constructors), although they agree to a set of operating principles (project brief, contracts, etc.), sit in different rooms, each using their own notation and playing on their own.
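The layered structure of the metaphor can be sketched in code (all class and method names here are illustrative inventions, not an actual AEC toolchain): each layer knows only the interface agreed with its adjacent layer, and a transformation happens within each layer as something more specific is added.

```python
# Illustrative sketch of hierarchical abstraction layers with agreed interfaces.
# Each layer only speaks the 'language' shared with the layer next to it.

class Instrument:                 # executive layer: produces concrete output
    def __init__(self, name):
        self.name = name
    def play(self, note):
        return f"{self.name} sounds {note}"

class Musician:                   # translation layer: encoded notation -> sound
    def __init__(self, instrument):
        self.instrument = instrument
    def perform(self, notes):
        return [self.instrument.play(n) for n in notes]

class Conductor:                  # top layer: directs, does not play
    def __init__(self, musicians):
        self.musicians = musicians
    def concert(self, score):
        # score: high-level instructions, one part per musician
        return [m.perform(part) for m, part in zip(self.musicians, score)]

score = [["C4", "E4"], ["G3"]]    # encoded rules: sequence and pitch
orchestra = Conductor([Musician(Instrument("violin")),
                       Musician(Instrument("cello"))])
for part in orchestra.concert(score):
    print(part)
```

Note that the `Conductor` never calls `Instrument` directly: as in the metaphor, each layer trusts the behaviour agreed with its neighbour, and removing any layer breaks the chain from high-level intent to concrete result.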
5. Conclusion
This study reviews the literature from the fields of science, archi-
tecture, engineering and complexity theory and critically evalu-
ates complexity from the perspective of design computing for
architecture. The survey shows that the lack of a unifying the-
ory for complexity has hindered research efforts and has resulted
in researchers from different disciplines coming up with similar
definitions of complexity because they were considering com-
plexity solely within the bounds of their domain. In the field
of architecture, the lack of science-based approaches towards
design has resulted in approaching complexity diagrammati-
cally and predominantly through the perspective of geometry.
Although the work of designers such as A. Gaudí and F. Otto has materially demonstrated that we are rapidly moving towards other levels of complexity, the absence of more rigorous computational design methods and the lack of communication with other scientific disciplines have resulted in approaching complexity diagrammatically and solely through a top-down reductionist
approach. To provide a common basis for designers and architects, a taxonomy of complexity definitions and measures is provided, which can help build a better understanding of complexity going forward. Based on this taxonomy, which regards buildings as cyber-physical systems, architectural complexity can be captured if it is considered both as an absolute and as a relative quantity.
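Following the algorithmic-information view covered in the taxonomy (cf. Kolmogorov complexity; Li and Vitányi 2013), one crude, purely illustrative way to treat complexity both as an absolute and as a relative quantity is via compressed length: the compressed size of a description proxies its absolute complexity, while the extra bits needed to describe one design given another proxies its complexity relative to what is already known. The 'design descriptions' below are invented strings, not real building data.

```python
import zlib

def C(s: bytes) -> int:
    """Compressed length: a crude proxy for absolute (algorithmic) complexity."""
    return len(zlib.compress(s, 9))

def C_relative(target: bytes, reference: bytes) -> int:
    """Extra compressed bytes needed for `target` once `reference` is known:
    a crude proxy for complexity relative to an observer's prior model."""
    return max(0, C(reference + target) - C(reference))

repetitive = b"column beam slab " * 64   # highly regular 'design description'
bespoke = bytes(range(256)) * 4          # far less compressible description

print(C(repetitive), C(bespoke))         # absolute: the bespoke one compresses worse

variant = repetitive + b"column beam slab curved-roof "
print(C_relative(variant, repetitive))   # small: mostly already described
print(C_relative(variant, bespoke))      # relative to an unrelated reference
```

Real measures of effective complexity are far subtler (Gell-Mann and Lloyd 1996), but the sketch captures the conclusion's point: the same artefact can be simple relative to one body of prior knowledge and complex relative to another, while its absolute description length stays fixed.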
As computational design and digital construction platforms are changing the way we construct our buildings, the clarification of underlying complexity concepts such as holism, feedback, hierarchy, emergence and entropy, to name a few, is considered essential knowledge for enabling architects to manage design complexity. Such conceptual tools are considered
necessary for developing more formal design strategies which
can operate across multiple scales and disciplines, and thus can
manage complexity. If we consider 'coding' as a new kind of geometry in the hands of the architect, one that allows them to develop not one-off drawings but computable models that support design exploration, then designers need to develop new kinds of design abstractions which are based on systemic analysis of buildings and a deep understanding of natural form-making processes (i.e. nest building) rather than the figurative replication of their form (i.e. the free-form shape of a nest).
Based on our review, we can conclude that abstractions which are useful for design have the following characteristics:
(a) are robust, (b) depend on well-formed rules, (c) need evolu-
tion to refine them, (d) facilitate useful hierarchies, (e) take care of
translation between layers, (f) use common interfaces between
layers, (g) can manage complexity and (h) may yield or empower
results that exceed expectation. Summarizing, computational techniques and unified frameworks across the different disciplines in the AEC industry can help understand design problems holistically and reduce the complexity of architectural design
and planning of automated construction processes by effec-
tively coupling functional requirements with design parameters
early in the design phase. The current analogy which exists in digital design between the designer (user) and the machine (digital tool) should be reversed. In this analogy, the designer acts like an apprentice who uses a specific language (Python, C++, Java) or interface to pose questions to the master (computer) and respectfully awaits the answer. In order to promote designers' creativity, computational design tools need to be conceived not as drafting aids but as the user's/designer's apprentice, which, when given a set of specifications, is capable of generating proposals (design alternatives) that the user/designer (master craftsman) can evaluate and critique. In such a situation,
the tool is expected to build knowledge through the interac-
tion with the user and the processing of multiple data sets. By
tracing evolutionary patterns in existing typologies (i.e. vernacu-
lar structures) and combining them with occupants’ behaviours,
environmental conditions and building paradigms, simple yet
not simplified generative design systems and robust distributed
construction systems can be devised for exploring larger and
more meaningful solution spaces.
Note
1. A. Pottinger, lead structural engineer from Buro Happold, commenting on the successful completion of the Louvre Abu Dhabi, a challenging project both in terms of architecture and engineering which was completed in 2017, states: 'One of the absolutely overriding things we had to do was to find simplicity amongst the complexity. If we didn't do that the project wouldn't be buildable.'
Acknowledgements
This material is based upon work supported by the National Science Foundation under grant number 1231001. Any opinions, findings and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation. We would like to give special thanks to all the professors, researchers and friends that contributed to this project; specifically, to L. Soibelman, B. Becerik-Gerber, I. Bertsatos, and S. Panagiotopoulos for their contributions in reviewing, commenting and reflecting on this work.
Disclosure statement
No potential conflict of interest was reported by the authors.
References
Baccarini, David. 1996. “The Concept of Project Complexity—A Review.”
International Journal of Project Management 14 (4): 201–204.
Barton, G. E., R. C. Berwick, and E. S. Ristad. 1987. Computational Complexity and Natural Language. Cambridge, MA: MIT Press.
Bennett, John. 1991. International Construction Project Management: General Theory and Practice. New Hampshire, USA: Butterworth-Heinemann.
Bennett, Charles H. 1995. “Logical Depth and Physical Complexity.” In The
Universal Turing Machine A Half-Century Survey, edited by Rolf Herken,
227–257. Yorktown Heights, USA: Oxford University Press.
Beyer, Janice M., and Harrison M. Trice. 1979. “A Reexamination of the Rela-
tions Between Size and Various Components of Organizational Complex-
ity.” Administrative Science Quarterly 24 (1): 48–64.
Block, Philippe. 2009. "Thrust Network Analysis: Exploring Three-Dimensional Equilibrium." PhD diss., Massachusetts Institute of Technology.
Bundy, Alan. 2007. “Computational Thinking is Pervasive.” Journal of Scientific
and Practical Computing 1 (2): 67–69.
Caldas, Carlos H., and Lucio Soibelman. 2003. "Automating Hierarchical Document Classification for Construction Management Information Systems." Automation in Construction 12 (4): 395–406.
Chaitin, Gregory J. 1990. Information, Randomness & Incompleteness: Papers on Algorithmic Information Theory. Vol. 8, Series in Computer Science. New York, USA: World Scientific.
Cobham, Alan. 1965. The Intrinsic Computational Difficulty of Functions.
Paper presented at the Logic, Methodology and Philosophy of Science:
Studies in Logic and the Foundations of Mathematics.
Cover, Thomas M., and Joy A. Thomas. 2012. Elements of Information Theory. John Wiley & Sons.
Cross, Nigel, and Norbert Roozenburg. 1992. “Modelling the Design Process
in Engineering and in Architecture.” Journal of Engineering Design 3 (4):
325–337. doi:10.1080/09544829208914765.
Crutchfield, James P. 1994. “The Calculi of Emergence: Computation, Dynam-
ics and Induction.” Physica D: Nonlinear Phenomena 75 (1): 11–54.
Dent, Eric B. 1999. “Complexity Science: A Worldview Shift.” Emergence 1 (4):
5–19.
Eisenman, Peter. 1993. “Folding in Time, the Singularity of Rebstock.” Archi-
tectural Design 1 (102): 22–25.
Feldman, David P, and Jim Crutchfield. 1998. “A Survey of Complexity Mea-
sures.” Santa Fe Institute, USA 11.
Ferreira, Pedro. 2001. "Tracing Complexity Theory." In Research Seminar in Engineering Systems, 1–26. Pittsburgh, PA: Carnegie Mellon University.
Fischer, Thomas. 2014. Wiener’s Refiguring of a Cybernetic Design Theory.
Paper presented at the 2014 IEEE Conference on Norbert Wiener in the
21st Century (21CW).
Fisher, Ronald A. 1956. “Statistical Methods and Scientific Inference”.
Frazer, John H. 1995. “An Evolutionary Architecture.” Themes.
Gell-Mann, Murray. 1992. Complexity and Complex Adaptive Systems. Paper presented at the Proceedings of the Santa Fe Institute Studies in the Sciences of Complexity, California Institute of Technology, Pasadena.
Gell-Mann, Murray, and Seth Lloyd. 1996. “Information Measures, Effective
Complexity, and Total Information.” Complexity 2 (1): 44–52.
Gerard, R.W. 1958. “Concepts and Principles of Biology. Initial Working
Paper.” Behavioral Science 3 (1): 95–102.
Gero, John S. 1996. “Creativity, Emergence and Evolution in Design.”
Knowledge-Based Systems 9 (7): 435– 448. doi:10.1016/S0950-7051(96)
01054-4.
Gero, John S. 2000. “Computational Models of Innovative and Creative
Design Processes.” Technological Forecasting and Social Change 64 (2-3):
183–196. doi:10.1016/S0040-1625(99)00105-5.
Gero, John S., and Ricardo Sosa. 2008. “Complexity Measures as a Basis
for Mass Customization of Novel Designs.” Environment and Planning B:
Planning and Design 35 (1): 3–15. doi:10.1068/b32106.
Gidado, Kassim. 1993. “Numerical Index of Complexity in Building Construc-
tion to Its Effect on Production Time.” University of Brighton.
Glanville, Ranulph. 2001. “An Intelligent Architecture.” Convergence: The
International Journal of Research Into New Media Technologies 7 (2): 12–24.
Glanville, Ranulph. 2007. “Designing Complexity.” Performance Improvement
Quarterly 20 (2): 75–96. doi:10.1111/j.1937-8327.2007.tb00442.x.
Goldenfeld, Nigel, and Leo P. Kadanoff. 1999. “Simple Lessons from Complex-
ity.” Science 284 (5411): 87–89. doi:10.2307/2899140.
Google. 2018. “NGram Viewer Database.” Accessed 17 October 2018.
http://storage.googleapis.com/books/ngrams/books/datasetsv2.html.
Halfawy, Mahmoud, and Thomas Froese. 2005. “Building Integrated Architec-
ture/Engineering/Construction Systems Using Smart Objects: Methodol-
ogy and Implementation 1.” Journal of Computing in Civil Engineering 19
(2): 172–181.
Heylighen, Francis, and Cliff Joslyn. 2001. “Cybernetics and Second Order
Cybernetics.” Encyclopedia of Physical Science & Technology 4:
155–170.
Holland, John H. 1992a. Adaptation in Natural and Artificial Systems: An Introductory Analysis with Applications to Biology, Control, and Artificial Intelligence. Ann Arbor: A Bradford Book.
Holland, John H. 1992b. Adaptation in Natural and Artificial Systems: An Introductory Analysis with Applications to Biology, Control, and Artificial Intelligence. Vol. 1. Cambridge, MA: MIT Press.
Horgan, John. 1995. “From complexity to perplexity.” Scientific American.
Jacobs, Jane. 1961. The Death and Life of Great American Cities. Vintage.
Jencks, Charles. 1997. Architecture of the Jumping Universe. Academy Editions.
Kauffman, Stuart A. 1991. “Antichaos and Adaptation.” Scientific American
265 (2): 78–84.
Kauffman, Stuart A. 1993. The Origins of Order: Self-Organization and Selection in Evolution. Oxford University Press.
Klir, George. 1985. Architecture of Systems Problem Solving. New York: Plenum Press.
Kolmogorov, Andrei Nikolaevich. 1965. “Three Approaches to the Definition
of the Concept “Quantity of Information”.” Problems of Information Trans-
lation 1 (1): 3–11.
Kotnik, Toni. 2010. “Digital Architectural Design as Exploration of Com-
putable Functions.” International Journal of Architectural Computing 8 (1):
1–16.
Leicht, Robert M, P.M. Abdelkarim, and John I Messner. 2010. Gaining end
user involvement through virtual reality mock-ups: a medical facility case
study. Paper presented at the Proceedings of the CIB W.
Levin, Leonid A. 1976. “Different Measures of Complexity of Finite Objects
(Axiomatic Description).” DOKLADY AKADEMII NAUK SSSR 227 (4):
804–807.
Li, Ming, and Paul Vitányi. 2013. An Introduction to Kolmogorov Complexity and Its Applications. Springer Science & Business Media.
Lloyd, Seth. 2001. “Measures of Complexity: a Nonexhaustive List.” IEEE Con-
trol Systems Magazine 21 (4): 7–8.
Lloyd, Seth, and Heinz Pagels. 1988. “Complexity as Thermodynamic Depth.”
Annals of Physics 188 (1): 186–213.
Lynn, Greg, and Therese Kelly. 1999. Animate Form. Vol. 1. New York: Princeton Architectural Press.
Malkawi, Ali. 2005. “Performance Simulation: Research and Tools.” In Perfor-
mative Architecture: Beyond Instrumentality, edited by Branko Kolarevic,
and Ali Malkawi, 85–96. New York: Spon Press.
Maturana, Humberto R., and Francisco J. Varela. 1987. The Tree of Knowledge: The Biological Roots of Human Understanding. Boston, MA: New Science Library/Shambhala Publications.
Menges, Achim. 2013. “Morphospaces of Robotic Fabrication.” In RobArch,
edited by Sigrid Brell-Cokcan and Johanness Braumann. Springer.
Minsky, Marvin. 1961. “Steps Toward Artificial Intelligence.” Proceedings of
the IRE 49 (1): 8– 30.
Mitchell, William J. 1990. The Logic of Architecture: Design, Computation, and Cognition. Cambridge, MA: MIT Press.
Mitchell, William J. 2005. “Constructing Complexity.” In Computer Aided Archi-
tectural Design Futures, 41–50. Springer.
Morris, Peter W. G., and G. H. Hough. 1987. The Anatomy of Major Projects. Wiley.
Negroponte, Nicholas. 1970. The Architecture Machine: Towards a More Human Environment. Cambridge, MA: MIT Press.
Oxman, Rivka. 2008. “Performance-based Design: Current Practices and
Research Issues.” International Journal of Architectural Computing 6 (1):
1–17. doi:10.1260/147807708784640090.
Pask, Gordon. 1969. “The Architectural Relevance of Cybernetics.” Architec-
tural Design 39 (9): 494–496.
Perna, Andrea, and Guy Theraulaz. 2017. “When Social Behaviour is Moulded
in Clay: on Growth and Form of Social Insect Nests.” Journal of Experimen-
tal Biology 220 (1): 83–91.
Rahman, Mahadev. 2010. “Complexity in Building Design.” In Re-inventing
Construction, edited by Ilka Ruby and Andreas Ruby, 440. Berlin: Ruby
Press.
Rapoport, Anatol. 1986. General System Theory: Essential Concepts & Applications. Vol. 10. CRC Press.
Rittel, Horst W. J., and Melvin M. Webber. 1973. "Dilemmas in a General Theory of Planning." Policy Sciences 4 (2): 155–169.
Rosenkrantz, Daniel J, Richard Edwin Stearns, and Philip M Lewis. 1974.
Approximate Algorithms for the Traveling Salesperson Problem. Paper
presented at the 15th Annual Symposium on Switching and Automata
Theory (SWAT 1974).
Salingaros, Nikos. 2000. “Complexity and Urban Coherence.” Journal of Urban
Design 5: 291–316.
Scheurer, Fabian. 2007. “Getting Complexity Organised Using Self-
Organisation in Architectural Construction.” Automation in Construction
(16): 78–85.
Scheurer, Fabian. 2010. “Materialising Complexity.” Architectural Design 80
(4): 86–93. doi:10.1002/ad.1111.
Schuh, Giinther, and W. Eversheim. 2004. “Release-Engineering—An
Approach to Control Rising System-Complexity.” CIRP Annals-
Manufacturing Technology 53 (1): 167–170.
Shannon, C.E. 1948. “A Mathematical Theory of Communication.” Mobile
Computing and Communication Review (SIGMOBILE) 5 (1): 3–55. doi:10.
1145/584091.584093.
Shea, Kristina, Robert Aish, and Marina Gourtovaia. 2005. “Towards Inte-
grated Performance-Driven Generative Design Tools.” Automation in Con-
struction 14 (2): 253–264. doi:10.1016/j.autcon.2004.07.002.
Simon, Herbert A. 1977. “The Structure of ill-Structured Problems.” In Models
of Discovery, 304–325. Springer.
Simon, Herbert A. 1996. The Sciences of the Artificial. MIT Press.
Soibelman, Lucio, Jianfeng Wu, Carlos Caldas, Ioannis Brilakis, and Ken-
Yu Lin. 2008. “Management and Analysis of Unstructured Construc-
tion Data Types.” Advanced Engineering Informatics 22 (1):
15–27.
Wrona, Stefan, and Adam Gorczyca. 2001. Complexity in Architecture - How CAAD Can Be Involved to Deal with It. Paper presented at AVOCAAD - Added Value of Computer Aided Architectural Design, Hogeschool voor Wetenschap en Kunst - Departement Architectuur Sint-Lucas, Campus Brussel.
Suh, Nam P. 1990. The Principles of Design. Vol. 990. New York: Oxford University Press.
Suh, Nam P. 2005a. “Complexity in Engineering.” CIRP Annals - Manufacturing
Technology 54 (2): 46–63. doi:10.1016/S0007-8506(07)60019-5.
Suh, Nam P. 2005b. Complexity: Theory and Application. New York: Oxford University Press.
Teicholz, Paul. 2000. Vision of future practice. Paper presented at the
Berkeley-Stanford Workshop on Defining a Research Agenda for AEC
Process/Product Development in.
Terzidis, Kostas. 2006. Algorithmic Architecture. Oxford; Burlington, MA: Architectural Press.
Theraulaz, Guy, and Eric Bonabeau. 1995. “Coordination in Distributed Build-
ing.” Science 269 (5224): 686.
Traub, Joseph Frederick, Grzegorz Włodzimierz Wasilkowski, and Henryk Woźniakowski. 1983. Information, Uncertainty, Complexity. Addison-Wesley Publishing Company, Advanced Book Program/World Science Division.
Turing, Alan Mathison. 1936. "On Computable Numbers, with an Application to the Entscheidungsproblem." Proceedings of the London Mathematical Society s2-42 (1): 230–265.
Venturi, Robert. 1977. Complexity and Contradiction in Architecture. Vol. 1. New York, USA: The Museum of Modern Art.
Von Bertalanffy, Ludwig. 1973. “The Meaning of General System The-
ory.” General System Theory: Foundations, Development, Applications 1 (1):
30–53.
Von Bülow, Peter. 2007. “An intelligent genetic design tool (IGDT) applied to
the exploration of architectural trussed structural systems.” Ph.D. Disser-
tation, University of Stuttgart.
Von Neumann, John, and A. H. Taub. 1951. “The General and Logical Theory
of Automata.” John Von Neumann: Collected Works 1 (1): 288–326.
Weaver, Warren. 1948. “Science and Complexity.” American Scientist 36 (536):
449–456.
Weiss, Andrew. 2015. “Google N-Gram Viewer.” The Complete Guide to
Using Google in Libraries: Instruction, Administration, and Staff Productivity
1: 183.
Wiener, Norbert. 1961. Cybernetics or Control and Communication in the Animal and the Machine. Vol. 25. MIT Press.
Wolfram, Stephen. 2002. A New Kind of Science. Vol. 5. Champaign: Wolfram Media.
Yates, Frances E. 1978. “Complexity and the Limits to Knowledge.” American
Journal of Physiology-Regulatory, Integrative and Comparative Physiology
235 (5): R201–R2R4.