All content in this area was uploaded by Nelson Silva on Aug 03, 2015
Content may be subject to copyright.
Master Thesis
(Dissertação de Mestrado)
Master in Design and Product Development Engineering
(Mestrado em Engenharia da Concepção e Desenvolvimento de Produto)
Optimal Forms
Generative Modeling Techniques in Optimization
Nelson de Jesus Silvério da Silva
Scientific Adviser: Dr. Nuno Alves
(Escola Superior de Tecnologia e Gestão do Instituto Politécnico de Leiria, Portugal)
Scientific Co-Adviser: Dr. Eva Eggeling
(Fraunhofer, Austria)
Leiria, July 2011
Report submitted to the Polytechnic Institute of Leiria in partial fulfillment of the
requirements for the degree of Master in Design and Product Development
Engineering (Mestrado em Engenharia da Concepção e Desenvolvimento de
Produto).
ISBN:
© Polytechnic Institute of Leiria
The Jury
President
Members (Vogais)
To My Family
Acknowledgments
To my Scientific Adviser, Prof. Dr. Nuno Alves (Vice-director of the CDRSP
Research Centre), for the constant attention, motivation and ideas given
throughout the course of this thesis and for helping me achieve success.
Thank you also for your friendship and wholehearted smile.
To my Scientific Co-Adviser, Dr. Eva Eggeling (Business Unit Manager of
Visual Computing, Fraunhofer Austria Research GmbH), thank you so much
for the personal support and encouragement, for your warm friendship, and
for the professional support that made the realization of this thesis possible.
To M.Sc. Torsten Ullrich (Researcher at Fraunhofer, Austria), for his priceless
knowledge, contributions and help and, of course, his sincere friendship, all of
which were crucial to the development of this innovative work.
To Prof. Dr. Paulo Bártolo (Director of CDRSP Research Centre) and Prof. Dr.
Helena Bártolo (CDRSP Research Centre) for all the support throughout the
course of the thesis and for receiving me at the CDRSP Research Centre.
To Prof. Dr. Dieter Fellner (Director of the Fraunhofer Institute for Computer
Graphics Research - IGD) for giving me the opportunity to develop this work
with the CGV Group and Fraunhofer Austria and for allowing me to have
access to all the materials and valuable information produced within the
CGV/Fraunhofer group.
To M.Sc. Volker Settgast (Researcher at Fraunhofer Austria) for all the great
tips about Autodesk Maya and rendering in general, and for his invaluable
friendship.
To Dr. Christina Lemke (Architect with projects in Germany and Spain, urban
planner, construction biologist & ecologist), who kindly gave me access to her
published PhD thesis work.
To my wife, for her patience, for always being there, supporting and
encouraging me, and for understanding all the time I wasn't around.
To ALL of you who made this thesis possible, with your understanding,
motivation, support and encouragement.
This master thesis was only achievable thanks to the good relationship and
close partnership established between CDRSP and Fraunhofer Austria.
Thank you all so much…
Keywords
Procedural, Optimization, Evolutionary, Algorithm, Simulation,
Building
Abstract
The generative modeling paradigm is a shift from static models to
flexible models. A generative model describes a modeling process
using functions, methods and operators; the result is an
algorithmic description of the construction process. Each
evaluation of such an algorithm creates a model instance that
depends on its input parameters (width, height, radius, orientation,
etc.). These values are normally chosen according to aesthetic
aspects and style. In this study, the model's parameters are
instead generated automatically according to an objective function:
a generative model is optimized over its parameters, and in this
way the best solution for a constrained problem is determined.
The field of application is energy and architecture. Besides the
establishment of an overall framework design, this work consists
of the identification of different building shapes and their main
parameters, the creation of an algorithmic description for these
main shapes, and the formulation of an objective function that
reflects a building's energy consumption (solar energy, heating
and insulation). This work also covers the conception of an
optimization pipeline that combines an energy calculation tool with
a geometric scripting engine, and reviews state-of-the-art
developments related to architecture and procedural modeling.
The major contribution of this work is a set of methods that lead
to automated and optimized 3D shape generation for the projected
building (based on the desired conditions and according to specific
constraints); this will help in the construction of real buildings that
consume less energy, contributing to a more sustainable world.
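The parameter-driven optimization described above can be sketched in plain JavaScript (the scripting language used with Euclides later in this thesis). This is only an illustrative toy, not the framework's actual code: the box shape, the parameter bounds, the envelope-area objective (a crude stand-in for the admittance-based energy simulation performed in Ecotect) and the random search (a stand-in for Differential Evolution) are all assumptions made for the example.

```javascript
// Illustrative sketch: a generative "box building" whose depth is derived
// from a fixed volume, and an objective (envelope surface area, a crude
// stand-in for heat loss) minimized by naive random search.
const VOLUME = 1000; // m^3, fixed, as in the thesis case studies

// Generative model: input parameters -> model instance
function boxBuilding(width, height) {
  const depth = VOLUME / (width * height); // constraint: constant volume
  return { width, height, depth };
}

// Objective function: envelope area (four walls + flat roof), to minimize
function envelopeArea(b) {
  return 2 * b.height * (b.width + b.depth) + b.width * b.depth;
}

// Naive random search over an assumed parameter domain (the real
// framework uses the Differential Evolution algorithm instead)
function optimize(evaluations) {
  let best = null;
  for (let i = 0; i < evaluations; i++) {
    const w = 2 + Math.random() * 28; // width in [2, 30] m
    const h = 2 + Math.random() * 10; // height in [2, 12] m
    const candidate = boxBuilding(w, h);
    const cost = envelopeArea(candidate);
    if (best === null || cost < best.cost) best = { candidate, cost };
  }
  return best;
}

const result = optimize(5000);
console.log(result.candidate, result.cost.toFixed(1));
```

Every evaluation instantiates a new model from its parameters, exactly as in the generative paradigm; swapping the objective for a real energy simulation and the search loop for an evolutionary algorithm yields the pipeline developed in Chapter IV.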
Table of Contents
Acknowledgments ......................................................................................................................................... vii
Keywords ........................................................................................................................................................ ix
Abstract .......................................................................................................................................................... ix
Table of Contents ....................................................................................................................................... xi
Figures List ............................................................................................................................................... xvii
Tables List ................................................................................................................................................ xxv
Abbreviations and Acronyms .................................................................................................................. xxix
I. Introduction ......................................................................................................................................... 1
1 The Problematic .................................................................................................................................. 1
1.1 Thesis Structure ............................................................................................................................... 4
II. State of the Art ................................................................................................................................. 7
2 Overview ............................................................................................................................................. 7
2.1 Parametric and Procedural Modeling .............................................................................................. 8
2.1.1 Plugins for existent 3D Software: Blender, 3D Studio Max and Others ....................................... 9
2.1.2 CityEngine .................................................................................................................................. 10
2.1.3 Bentley – MicroStation Extension: GenerativeComponents (GC).............................................. 13
2.1.4 Rhinoceros and Grasshopper ..................................................................................................... 14
2.1.5 Generative Modeling Language (GML) ...................................................................................... 16
2.1.6 Euclides Framework and JavaScript ........................................................................................... 18
2.1.7 Autodesk Revit Architecture 2012 ............................................................................................. 19
2.1.8 Project Vasari and Project Nucleus ............................................................................................ 21
2.1.9 Autodesk Adaptive Components ............................................................................................... 23
2.2 Evolutionary Architecture and the Use of Algorithms in Optimization of Problems ..................... 25
2.2.1 Differential Evolution (DE) ......................................................................................................... 28
2.2.2 Pros and Cons of using Evolutionary Algorithms (EAs) .............................................................. 38
2.2.3 Other Evolutionary Based Algorithms - DE/EDA and Hybrid-DE ................................................ 39
2.3 Advanced Rendering, Visualization and Interaction Techniques in Architecture ......................... 40
2.3.1 Multitouch (MTT) ....................................................................................................................... 40
2.3.2 Virtual and Augmented Reality .................................................................................................. 43
2.3.3 Computer Generated Holography ............................................................................................. 47
2.3.4 Advanced Rendering .................................................................................................................. 50
2.3.5 Virtual World Interactivity ......................................................................................................... 53
2.4 “A World Full of Sensors” .............................................................................................................. 57
2.4.1 Sensors Feed Information into Virtual Worlds .......................................................................... 57
2.4.2 Remote Monitoring of Persons Inside Buildings ........................................................................ 60
2.4.3 Kinetic, Responsive Performative and Adaptive Architecture ................................................... 61
2.5 Reverse Engineering and Rapid Prototyping ................................................................................. 62
2.5.1 Reverse Engineering................................................................................................................... 63
2.5.2 Rapid Prototyping ...................................................................................................................... 68
2.5.3 Rapid Prototyping Techniques ................................................................................................... 69
2.5.4 Personal Fabrication and Future Manufacturing ....................................................................... 76
2.6 Building Information Modeling (BIM) and Automated Construction of Buildings ........................ 78
2.6.1 Building Information Modeling (BIM) ........................................................................................ 78
2.6.2 Automated Construction of Buildings ........................................................................................ 79
III. Simulation Tools in Architecture ..................................................................................................... 81
3 Outline ............................................................................................................................................... 81
3.1 EnergyPlus and DesignBuilder ....................................................................................................... 82
3.2 Autodesk Ecotect Analysis ............................................................................................................. 84
3.2.1 Short Comparison between Autodesk Ecotect and EnergyPlus ................................................. 87
3.3 Ansys: AirFlow ............................................................................................................................... 88
3.4 Sustainability Tools in Architecture – Comparison Studies/Audits ............................................... 90
IV. A Global Optimization Framework ................................................................................................. 91
4 Problematic of “Form Follows Energy” and the Pursuit of Solutions................................................ 91
4.1 Answer to the Problematic: Optimal Forms - A Global Optimization Framework ........................ 97
4.2 Identification of Essential Forms Used in the Real World ............................................................. 98
4.2.1 Procedural Shape Generation .................................................................................................... 99
4.2.2 Code Writing Using Euclides and JavaScript ............................................................................ 101
4.3 Simulation Tools Integration ....................................................................................................... 102
4.3.1 Simulation in Ecotect and the Admittance Method................................................................. 104
4.3.2 Initial Manual Workflow Tests ................................................................................................. 105
4.4 Differential Evolution................................................................................................................... 107
4.5 The Developed Global Optimization Framework ........................................................................ 110
4.6 Case Studies and Presentation of Results.................................................................................... 112
4.6.1 Case Study 1 – Classic Shape Building Optimization ................................................................ 113
4.6.2 Case Study 1 - Presentation of Results..................................................................................... 116
4.6.3 Case Study 2 – Cube Shape Building Optimization .................................................................. 126
V. Conclusions and Future Work ....................................................................................................... 131
5 Summary ......................................................................................................................................... 131
5.1 Final Conclusions ......................................................................................................................... 132
5.2 Future Work................................................................................................................................. 133
REFERENCES ............................................................................................................................................ 135
Figures List
FIG. II-1 – SUICIDATOR CITY GENERATOR (SCG) FOR BLENDER 9
FIG. II-2 – PROCEDURAL/PARAMETRIC EXAMPLES CREATED IN CITYENGINE 10
FIG. II-3 – CITYENGINE IDE AND THE RULE EDITOR CAPABILITIES 12
FIG. II-4 – GENERATIVECOMPONENTS (GC) IDE, BENTLEY MICROSTATION 13
FIG. II-5 – RHINOCEROS IS USED IN MULTIPLE FIELDS, INCLUDING ARCHITECTURE 14
FIG. II-6 – VORONOI EXAMPLES, CREATED USING RHINOCEROS AND GRASSHOPPER, BY ATSUO NAKAJIMA (TOKYO, JAPAN) 15
FIG. II-7 - PARAMETRIC STRATEGIES ACHIEVED USING RHINO AND GRASSHOPPER. (CREATED BY THE AUTHOR OF THIS MASTER
THESIS) 15
FIG. II-8 - CREATION OF A SIMPLE HOUSE MODEL USING GML, THE EXTRUDE OPERATOR IS REPEATEDLY APPLIED TO THE GROUND
POLYGON. TO CREATE THE ROOF, THE COMBINED OPERATOR COLLAPSE-MID IS APPLIED TO THE FACECW AND FACECCW
EDGES OF THE EDGE RETURNED BY THE EXTRUDE OPERATION. 16
FIG. II-9 – PARAMETERIZATION/CONFIGURATION OF A CHAIR WITH GML 17
FIG. II-10 – GOTHIC STYLE BUILDING GENERATED WITH GML 17
FIG. II-11 – CONFIGURATION OF DIFFERENT WHEEL RIM STYLES USING GML 17
FIG. II-12 – EXAMPLE OF A 3D APPLICATION CREATED FOR THIS THESIS USING EUCLIDES AND JAVASCRIPT. IT ALLOWS THE
CONTROL OF SEVERAL SHAPE PARAMETERS ON THE “CLASSIC BUILDING FORM EXAMPLE”. 18
FIG. II-13 – ENERGY CONSUMPTION STUDY USING AUTODESK REVIT 19
FIG. II-14 – CONCEPTUAL DESIGN IN AUTODESK REVIT ARCHITECTURE 20
FIG. II-15 – SUN STUDIES USING PROJECT “VASARI” 21
FIG. II-16 - PANEL STUDY USING REVIT AND VASARI 22
FIG. II-17 - USING REVIT, VASARI AND NUCLEUS PHYSICS FOR A PANEL STUDY, PLUS ANALYSIS 22
FIG. II-18 – ADAPTIVE COMPONENTS IN AUTODESK REVIT 23
FIG. II-19 – ADAPTIVE PANEL EXAMPLE IN AUTODESK REVIT 24
FIG. II-20 – ANOTHER ADAPTIVE PANEL EXAMPLE, BUILT USING ADAPTIVE COMPONENTS 24
FIG. II-21 – IMAGE TAKEN FROM THE BOOK “THE SELFISH GENE” BY RICHARD DAWKINS 25
FIG. II-22 – SEVERAL VISIONS RELATED TO EVOLUTIONARY ARCHITECTURE AND BIOMIMETIC 26
FIG. II-23 – EVOLUTIONARY EXAMPLES TAKEN FROM THE BOOK “AN EVOLUTIONARY ARCHITECTURE” BY JOHN FRASER [23] 26
FIG. II-24 – DYNAMIC GEOMETRY COMPUTATION, “SHANGHAI TOWER - GEOMETRY GENERATE AND RENDERING” (MICHAEL
PENG) 27
FIG. II-25 – ECOLOGICAL HOUSE OF THE FUTURE (EUGENE TSUI) 27
FIG. II-26 – THE GLOBAL OPTIMIZATION PROBLEM (EXAMPLE IN MATLAB). SEARCH FOR THE HIGHEST MOUNTAIN PEAK AMONG A
NEIGHBORHOOD OF OTHER HIGH MOUNTAIN PEAKS. 28
FIG. II-27 - OBSERVATION, ANALYSIS AND COMPUTATION OF BRANCHING PATTERNS IN NATURAL SYSTEMS (BY EVAN GREENBERG
[29]) 29
FIG. II-28 – SIMPLE EA’S STEPS 32
FIG. II-29 – GENERAL EVOLUTIONARY ALGORITHM: I: INITIALIZATION, F(X): EVALUATION, ?: STOPPING CRITERION, SE: SELECTION,
CR: CROSS-OVER, MU: MUTATION, RE: REPLACEMENT, X*: OPTIMUM. AUTHOR: JOHANN "NOJHAN" DRÉO 33
FIG. II-30 - DE OPTIMIZATION PERFORMANCE (PEDERSEN, M. [31]) ON SEVERAL DIFFERENT PROBLEMS USING THE DE/RAND/1/BIN
ALGORITHM. PLOTS SHOW THE MEAN FITNESS ACHIEVED OVER 50 OPTIMIZATION RUNS 34
FIG. II-31 – 3D ANAGLYPH AND ACTIVE STEREO VISUALIZATION ON A MULTITOUCH DI TABLE (MTT4ALL MULTITOUCH TABLE
WAS BUILT BY THE AUTHOR OF THIS MASTER THESIS) 41
FIG. II-32 – INITIAL MTT4ALL SCALE MODEL (LEFT) AND FINAL MTT4ALL FUNCTIONAL PROTOTYPE IN USE, RUNNING THE
FRAUNHOFER VIRTUALDESK APPLICATION, DEVELOPED BY THE AUTHOR OF THIS MASTER THESIS FOR FRAUNHOFER AUSTRIA
(RIGHT) 42
FIG. II-33 – FRAUNHOFER, MULTITOUCH ARCHITECTURE VISUALIZATION – MESSE FRANKFURT GMBH (USING THE
INSTANTREALITY FRAMEWORK) 43
FIG. II-34 – ANAGLYPH VISUALIZATION (CREATED BY THE AUTHOR OF THIS MASTER THESIS) 44
FIG. II-35 – ACTIVE STEREO VISUALIZATION (CREATED BY THE AUTHOR OF THIS MASTER THESIS) 44
FIG. II-36 – AN AUGMENTED REALITY SYSTEM DEVELOPED BY FRAUNHOFER (MONITOR + CAMERA + VIRTUAL REALITY
SOFTWARE) 45
FIG. II-37 – DAVE, CGV AUSTRIA: IMAGES ARE PROJECTED ON THE BACK-PROJECTION SIDE WALLS AND ON THE FLOOR FROM
ABOVE; MIRRORS ARE USED TO REDUCE THE SPACE NEEDED 45
FIG. II-38 – CGV AUSTRIA, THE DAVE, A 3D IMMERSIVE SYSTEM (EXPLORING THE NATIONAL LIBRARY OF VIENNA) 46
FIG. II-39 – THE SWEETHOME3D (MODELING APPLICATION) OUTPUT IS TRANSFERRED WITH A WEB SERVICE TO AN OPENSG
CAVE APPLICATION WHICH LETS THE USER WALK THROUGH A 3D REPRESENTATION OF THE HOUSE PLAN 46
FIG. II-40 – HEYEWALL (HIGH RESOLUTION MULTITOUCH SCREEN) 47
FIG. II-41 – COMPUTER GENERATED HOLOGRAPHY. A COMPUTER CALCULATES A HOLOGRAPHIC FRINGE PATTERN FOR DISPLAY BY
THE SPATIAL LIGHT MODULATOR (SLM), WHICH DIFFRACTS LASER LIGHT TO YIELD AN INTERACTIVE, TRUE 3D IMAGE 48
FIG. II-42 – TRADESHOW (A HOLOGRAM SYSTEM) 49
FIG. II-43 – ALTHOUGH HE WAS IN MELBOURNE, TELSTRA'S CHIEF TECHNOLOGY OFFICER, HUGH BRADLOW (RIGHT), MAKES HIS
PRESENCE FELT AT A CONFERENCE IN ADELAIDE (PHOTO: TELSTRA) 49
FIG. II-44 – TOUCHABLE HOLOGRAPHY INTERACTION SYSTEM. AN AERIAL IMAGING SYSTEM, A NON-CONTACT TACTILE DISPLAY
AND A WIIMOTE-BASED HAND-TRACKING SYSTEM ARE COMBINED. IN THIS FIGURE, THE ULTRASOUND IS RADIATED FROM
ABOVE AND THE USER FEELS AS IF A RAIN DROP HITS HIS PALM 50
FIG. II-45 – 3D PHOTOREALISTIC RENDERING (CREATED BY HARCHI, AN ARCHITECTURE COMPANY BASED IN PORTUGAL) 51
FIG. II-46 – IPL/CDRSP FUTURE BUILDING (RENDERED IN AUTODESK MAYA 2011 BY THE AUTHOR OF THIS MASTER THESIS) 52
FIG. II-47 – MEDIUM QUALITY RENDERING OF A FACTORY INSTALLATION (CREATED IN DEEP EXPLORATION BY THE AUTHOR OF
THIS THESIS) 52
FIG. II-48 – HIGH QUALITY REAL-TIME INTERACTIVE RENDERING (WWW.ICREATE3D.COM) 53
FIG. II-49 – MANIPULATION OF VR DATA, PROVIDED BY MEMPHIS [1] (TICIVIEW VR SYSTEM). IGI/FRAUNHOFER RESEARCH
GROUP, 2007, SEOUL - SOUTH KOREA, (THE AUTHOR OF THIS MASTER THESIS WAS A MEMBER IN THE TEAM RESPONSIBLE
FOR THE DEVELOPMENT OF THIS SYSTEM) 55
FIG. II-50 – ADVANCED INTERACTIVE VISUALIZATION (USING IVIEWER), STARTING FROM LEFT TO RIGHT: (A) SKYSCRAPER; (B) AN
APARTMENT INSIDE THE SKYSCRAPER (WWW.ICREATE3D.COM) 55
FIG. II-51 – VIRTUAL FACTORY SIMULATION/SERIOUS GAME THAT WILL ALLOW A COMPANY TO GIVE TRAINING TO USERS, (THIS
PROJECT WAS CREATED AT CDRSP BY THE AUTHOR OF THIS MASTER THESIS) 56
FIG. II-52 – SCREENSHOT OF THE TRICORDER DEVICE SHOWING THE FLOORPLAN OF A LAB, OVERLAID WITH PLUG ICONS TO
REPRESENT SOUND, LIGHT, CURRENT CONSUMPTION, MOTION AND VIBRATION. AVERAGE DATA FROM ALL SENSORS IS ALSO
DISPLAYED [52] 58
FIG. II-53 – A PORTAL IN SECOND LIFE SHOWS SENSOR DATA OVER TIME 59
FIG. II-54 – A VIRTUAL DATAPOND IN THE VIRTUAL ATRIUM (LEFT) AND A REAL DATAPOND IN THE REAL MEDIA LAB ATRIUM
(RIGHT) 60
FIG. II-55 – DIFFERENT REPRESENTATIONS OF A PERSON DETECTION IN 3D, STARTING FROM LEFT TO RIGHT: (A) BILLBOARD WITH
A THIN COLORED SURROUNDING LINE, (B) ADDITIONAL GEOMETRY, (C, D) OVERLAY MARKER WHICH IS NOT OCCLUDED BY
THE SCENE 60
FIG. II-56 – A MODEL WHICH IS A KINETIC PAVILION THAT REACTS ON WEATHER DATA 61
FIG. II-57 – PHASES OF THE REVERSE ENGINEERING PROCESS 63
FIG. II-58 – REVERSE ENGINEERING - CLASSIFICATION TECHNIQUES FOR 3D DATA ACQUISITION 64
FIG. II-59 – STEINBICHLER COMET 5 PHOTOGRAMMETRY 3D SCAN EQUIPMENT AT CDRSP REVERSE ENGINEERING LABORATORY 65
FIG. II-60 – A 3D SCAN OF A REAL GRAPHITE ELECTRODE USED IN MOLDS INDUSTRY. (3D SCAN AND ANALYSIS/INSPECTION DONE
BY THE AUTHOR OF THIS MASTER THESIS, USING STEINBICHLER COMET 5 EQUIPMENT, AND STEINBICHLER COMET PLUS AND
COMET INSPECT SOFTWARE) 65
FIG. II-61 – “SANTUÁRIO DO SENHOR DA PEDRA”, ÓBIDOS, PORTUGAL 66
FIG. II-62 – POINTS OBTAINED FOR THE “SANTUÁRIO DO SENHOR DA PEDRA” BUILDING 66
FIG. II-63 – 3D CLOUD OF POINTS FOR THE “SANTUÁRIO DO SENHOR DA PEDRA” BUILDING 66
FIG. II-64 - A) STL AND B) 3D MODEL 67
FIG. II-65 – SCALE MODEL OF “SANTUÁRIO DO SENHOR DA PEDRA” OBTAINED BY RAPID PROTOTYPING 67
FIG. II-66 – MAIN COMPONENTS OF A STEREO-LITHOGRAPHY MACHINE 69
FIG. II-67 – PROTOTYPES PRODUCED USING STEREO-LITHOGRAPHY 70
FIG. II-68 – SIMPLIFIED FDM PROCESS 71
FIG. II-69 – SCALE MODELS OBTAINED USING FDM 72
FIG. II-70 – SCHEME OF THE LOM PROCESS 73
FIG. II-71 – SCALE MODEL FOR OPORTO MUSIC HOUSE, PRODUCED USING LOM (PORTUGAL) 73
FIG. II-72 – SCHEME OF THE SELECTIVE LASER SINTERING (SLS) 74
FIG. II-73 - 3D PRINTING PROCESS 75
FIG. II-74 – COMPLETE SCALE MODEL OBTAINED USING THE 3D PRINTING PROCESS 75
FIG. II-75 – COMBINING SEVERAL TECHNOLOGIES/PROCESSES 76
FIG. II-76 - A BRUSH MADE IN A 3D PRINTER, USING TWO DIFFERENT MATERIALS, PRINTED SIMULTANEOUSLY INTO A SINGLE,
UNASSEMBLED FUNCTIONAL OBJECT (OBJET INC.) 77
FIG. II-77 – BIM VIRTUAL INFORMATION (VIRTUAL SIMULTANEOUS VISUALIZATION OF SIX DIFFERENT PHASES OF AN ONGOING
BUILDING PROJECT, CREATED USING GRAPHISOFT ARCHICAD PLATFORM) 79
FIG. II-78 – VISION FOR AN AUTOMATED SYSTEM FOR AUTONOMOUS CONSTRUCTION OF BUILDINGS (BEHROKH KHOSHNEVIS) 80
FIG. II-79 – A REAL PROTOTYPE FOR AN AUTOMATED SYSTEM THAT WILL ALLOW THE CREATION OF BUILDINGS 80
FIG. III-1 – ENERGYPLUS SIMULATION ZONES 82
FIG. III-2 – DESIGNBUILDER AND ITS BUILDINGS ENERGY EFFICIENCY RATING 83
FIG. III-3 - WORKING IN ENERGYPLUS-MODE INSIDE ECOTECT, WHEN DEFINING OPERATIONAL SCHEDULES 84
FIG. III-4 - INTERNAL DAYLIGHT FACTORS SHOWN OVER A STANDARD WORKING PLANE 85
FIG. III-5 - OVERLAYING A SUN-PATH ON THE MODEL VIEW 85
FIG. III-6 - ANNUAL CUMULATIVE SOLAR RADIATION OVER THE EXTERNAL SURFACES 86
FIG. III-7 – COLOURED CONTOURS OF THERMAL COMFORT IN A CONFERENCE ROOM PREDICTED FOR A PARTICULAR VENTILATION
SYSTEM DESIGN (ANSYS, INC. PROPRIETARY) 88
FIG. III-8 – ANSYS CFD MODELLING OF REGIONAL FLOW PATTERNS NEAR CAPE SHOPPING CENTRE (STEPHAN SCHMITT &
THOMAS KINGSLEY; QFINSOFT, SA) 89
FIG. IV-1 – DIFFERENT POSSIBLE BUILDING FORMS, ALL WITH 1000 M3 OF VOLUME, ALLOWING THE COMPARISON OF
RESULTS OBTAINED WITH DIFFERENT FORMS (PHD THESIS OF CHRISTINA LEMKE [88]) 92
FIG. IV-2 – DEFINITION OF AN ELEMENTARY VOLUME, ACCORDING TO DEPECKER, P., ET AL. [91] 93
FIG. IV-3 – FLOW FIELD AT A STREET INTERSECTION WITH A TALL BUILDING, ILLUSTRATING EXCHANGES BETWEEN THE STREETS AND
ADDITIONAL MIXING PROCESSES DUE TO THE LARGE BUILDING 94
FIG. IV-4 – VIEW OF GREENHOUSE SHAPES IN E-W ORIENTATION 96
FIG. IV-5 – 3D MODELS THAT REPRESENT REAL WORLD FACTORIES 98
FIG. IV-6 - SELECTED BASIC FORMS INSPIRED BY REAL-WORLD BUILDING SHAPES, STARTING FROM LEFT TO RIGHT: (A) CUBE, (B)
CLASSIC AND (C) CYLINDER, CREATED USING EUCLIDES AND RENDERED USING DEEP EXPLORATION 98
FIG. IV-7 – TESTS FOR CREATING DIFFERENT 3D SHAPES (AND CONTROLLING THEIR PARAMETERS) USING PARAMETRIC EQUATIONS 99
FIG. IV-8 – DEFINITION OF PARAMETRIC EQUATIONS IN MAPLE 14 100
FIG. IV-9 - OPTIMIZED CODE GENERATION PRODUCED BY MAPLE 14 100
FIG. IV-10 – PIECE OF JAVASCRIPT CODE TO GENERATE THE 3D CYLINDER SHAPE (CODE ADAPTED FROM PARAMETRIC EQUATIONS
AND MAPLE 14) 101
FIG. IV-11 – RESULTING JAVASCRIPT/EUCLIDES INTERFACE THAT ALLOWS THE CONTROL OF EACH SHAPE PARAMETER 101
FIG. IV-12 - 3D SHAPE GENERATION IN EUCLIDES (1000 M3 OF VOLUME); FOLLOWED BY THERMAL ANALYSIS; AND
PRESENTATION OF THE ANALYZED SHAPE. OTHER TYPES OF ANALYSIS CAN BE PERFORMED AS WELL. 105
FIG. IV-13 – ANOTHER 3D SHAPE GENERATION IN EUCLIDES (MAINTAINING 1000 M3); FOLLOWED BY THERMAL ANALYSIS; AND
PRESENTATION OF THE ANALYZED SHAPE 106
FIG. IV-14 – EXAMPLE LIST FOR MATERIALS THAT CAN BE USED INSIDE AUTODESK ECOTECT 109
FIG. IV-15 - OVERVIEW OF “OPTIMAL FORMS”, THE DEVELOPED AND PROPOSED GLOBAL OPTIMIZATION FRAMEWORK 110
FIG. IV-16 – THE GLOBAL OPTIMIZATION FRAMEWORK RUNNING AUTONOMOUSLY (SIMULATION IN ECOTECT FOLLOWED BY 3D
SHAPE GENERATION IN ORDER TO EVOLVE A POPULATION OF NEW BUILDINGS WITH DIFFERENT PARAMETERS USING THE
DIFFERENTIAL EVOLUTION ALGORITHM) 111
FIG. IV-17 – THE ALGORITHM CHOOSES DIFFERENT INDIVIDUALS (BUILDINGS) FOR ANALYSIS, NOT STOPPING AT THE LOCAL
MINIMA FOUND ALONG THE OPTIMIZATION PROCESS AND THUS LEAVING ROOM TO JUMP OVER THOSE SAME LOCAL
MINIMA 116
FIG. IV-18 – IN THIS RUN A COMPLETELY DIFFERENT INDIVIDUAL (BUILDING SHAPE) WAS GENERATED, BUT ITS ADMITTANCE
VALUE WAS VERY HIGH AND OTHER, BETTER INDIVIDUALS WERE FOUND 117
FIG. IV-19 – A NEW OPTIMIZATION RUN, THIS TIME USING A DIFFERENT VALUE FOR CROSSOVER (0.6) 118
FIG. IV-20 – OTHER, "TIGHTER" CONSTRAINTS WERE CHOSEN, AND THE RESULTS WERE SLIGHTLY WORSE THAN IN THE INITIAL
ATTEMPTS (RUNS 1, 2 AND 3), WHERE A WIDER SEARCH DOMAIN WAS USED 119
FIG. IV-21 – THE OPTIMIZATION RUNS (1, 2, 3 AND 4) ARE PLOTTED HERE SIMULTANEOUSLY 120
FIG. IV-22 – BY ALLOWING THE ALGORITHM TO GENERATE BUILDINGS THAT COULD USE A DIFFERENT ORIENTATION (LESS
CONSTRAINED REGARDING THE ORIENTATION OF THE BUILDING), MORE EVALUATIONS WERE NEEDED, BUT A MUCH BETTER
RESULT WAS OBTAINED 121
FIG. IV-23 – A NEW CONSECUTIVE OPTIMIZATION RUN (USING EXACTLY THE SAME VALUES) IN ORDER TO CHECK IF THE BEHAVIOR
WAS CONSISTENT FROM RUN TO RUN 122
FIG. IV-24 – BY ALLOWING THE ALGORITHM TO SEARCH FOR SOLUTIONS IN AN ORIENTATION DOMAIN BETWEEN 0° AND 360°,
AND BECAUSE THE BUILDING DOES NOT HAVE DOORS OR WINDOWS YET, THE ALGORITHM FOUND A GOOD SOLUTION BY
ORIENTING THE BUILDING IN A DIFFERENT DIRECTION (WHEN COMPARED TO RUNS 5 AND 5_1) 123
FIG. IV-25 – BY PLOTTING ALL THE INDIVIDUALS GENERATED BY THE GLOBAL OPTIMIZATION FRAMEWORK FOR RUNS (5, 5_1 AND
6) IT IS POSSIBLE TO CHECK THE CONSISTENCY OF THE RESULTS OBTAINED IN THESE MORE COMPLETE OPTIMIZATION RUNS 124
FIG. IV-26 – CASE STUDY 1 (FINAL CLASSIC BUILDING SHAPE) – DAILY AND ANNUAL SUN PATHS + SHADOWS AND DAYLIGHT
LEVELS AT 14:00, FOR THE 8TH OF SEPTEMBER (BASED ON SPECIFIC CLIMATE/WEATHER FILES FOR COIMBRA,
PORTUGAL); WIDTH = 20 M; HEIGHT = 6 M; ROOF ANGLE = 60°; VOLUME = 1000 M3; ORIENTATION: 122.10° 125
FIG. IV-27 – AT EVALUATION 289 THE OPTIMIZATION FRAMEWORK HAD ALREADY ACHIEVED A VERY GOOD RESULT (COMPARED
TO THE FINAL RESULT), BUT BECAUSE THE STOP CRITERION USED WAS 1000 EVALUATIONS OR DX = 1.0, THE
OPTIMIZATION CONTINUED UNTIL DX = 1.0 WAS REACHED FOUR CONSECUTIVE TIMES 128
FIG. IV-28 - CASE STUDY 2 (FINAL CUBE BUILDING SHAPE) – DAILY AND ANNUAL SUN PATHS + SHADOWS AND TOTAL
RADIATION LEVELS AT 14:00, FOR THE 8TH OF SEPTEMBER (BASED ON SPECIFIC CLIMATE/WEATHER FILES FOR
COIMBRA, PORTUGAL); WIDTH = 10 M; HEIGHT = 6 M; VOLUME = 1000 M3; ORIENTATION: 121.86° 129
Tables List
TABLE III-1 - CHARACTERISTICS OF TWO DIFFERENT SIMULATION TOOLS [83] ................................................................. 87
TABLE IV-1 – ACTIVITY LEVEL IN AUTODESK ECOTECT ............................................................................................... 108
TABLE IV-2 - SEVERAL RUNS USING THE DEVELOPED FRAMEWORK TO OPTIMIZE THE DESIGN OF A "CLASSIC SHAPE FORM"
BUILDING USING THE DIFFERENTIAL EVOLUTION (DE) ALGORITHM. THE TABLE ALSO SHOWS THE DIFFERENT
CONSTRAINTS AND PARAMETERS USED IN THE OPTIMIZATION PROCESS .................................... 114
TABLE IV-3 - RESULTS OBTAINED IN EACH OF THE OPTIMIZATION RUNS (ACCORDING TO TABLE IV-2).................................. 115
TABLE IV-4 – AN OPTIMIZATION RUN USING THE DEVELOPED FRAMEWORK TO OPTIMIZE THE DESIGN OF A "CUBE SHAPE
FORM" BUILDING USING THE DIFFERENTIAL EVOLUTION (DE) ALGORITHM. THE TABLE ALSO SHOWS THE DIFFERENT
CONSTRAINTS AND PARAMETERS USED IN THE OPTIMIZATION PROCESS ....................... 127
TABLE IV-5 - RESULTS OBTAINED IN EACH OF THE OPTIMIZATION RUNS (ACCORDING TO TABLE IV-4).................................. 128
Abbreviations
and Acronyms
A
ANM: Annealed Nelder and Mead strategy · 37
AR: Augmented Reality · 42, 44, 53, 54, 55
ASA: Adaptive Simulated Annealing · 37
ASHRAE: American Society of Heating, Refrigerating
and Air Conditioning Engineers · 87
B
BGA: Breeder Genetic Algorithm · 37
BIM: Building Information Modeling · 11, 19, 21, 78, 79
BLAST: Building Loads Analysis and System
Thermodynamics · 82, 83
C
CAD: Computer Aided Design · 8, 10, 14, 61, 63, 65, 78,
79, 125, 133
CAM: Computer Aided Manufacturing · 14
CAVE: Cave Automatic Virtual Environment · 42, 45, 46
CC: Contour Crafting (Automated Construction System)
· 80
CDRSP: Centre for Rapid and Sustainable Development
of the Product · vii, 2, 7, 41, 43, 52, 56, 65, 68
CEP: Complex Event Processing · 59
CFD: Computational Fluid Dynamics · 88, 89, 133
CGA: Computer Generated Architecture (shape
grammar) · 11
CGH: Computer Generated Holography · 47, 48, 49
CIBSE: Chartered Institution of Building Services
Engineers · 87, 104, 105, 141
CNC: Computer Numeric Control · 14
CPU: Central Processing Unit · 51, 52
CR: Crossover · 34
CSG: Constructive Solid Geometry · 16
CT: Computed Tomography · 51
D
DAVE: Definitely Affordable Virtual Environment
(Immersive VR System Developed by Fraunhofer) ·
45, 46
DDE: Dynamic Data Exchange · 87
DE: Differential Evolution · 4, 28, 32, 33, 34, 35, 36, 37,
39, 107, 112, 114, 127
E
EA: Evolutionary Algorithm · 28, 32, 38
EASY: Evolutionary Algorithm with Soft Genetic
Operators · 37
EDA: Estimation of Distribution Algorithm · 39
ES: Evolutionary Strategies · 37
ESRI: Environmental Systems Research Institute · 10
ESTG: Superior School of Technology and Management
· 2
Euclides: Fraunhofer JavaScript Procedural Modeler ·
2, 4, 5, 18, 81, 97, 98, 100, 101, 102, 103, 105, 106,
107, 109, 112
F
FAR: Floor Area Ratio · 12
FCT: Portuguese Foundation for Science and
Technology · 7
FDM: Fused Deposition Modeling · 68, 70, 71, 72
G
GA: Genetic Algorithm · 30
GC: Bentley Microstation GenerativeComponents · 13
GFA: Gross Floor Area · 12
GIS: Geographic Information Systems · 10, 40
GML: Generative Modeling Language · 16, 17, 18
GPU: Graphics Processing Unit · 51, 52
H
HDE: Hybrid Differential Evolution · 39
HSM: High Speed Machining · 68
HVAC: Heating, Ventilation and Air Conditioning · 82,
83
I
ICEO: IEEE Competition on Evolutionary Optimization ·
37
IEEE: Institute of Electrical and Electronics Engineers ·
37
IPL: Polytechnic Institute of Leiria · 2, 52
L
LUA: Lua, a lightweight multi-paradigm programming language designed as a
scripting language with extensible semantics · 87, 102, 103, 141
M
MTT: Multitouch · 40
MTT4ALL: Multitouch Table developed by the Author
of this master thesis · 41, 42
N
NP: Number of Elements in Each Generation · 34, 36,
37
NURBS: Non-Uniform Rational B-Spline · 14, 16
O
OLED: Organic Light-Emitting Diode · 58
P
PDM: Product Data Management · 8
PLM: Product Lifecycle Management · 8
R
Rhino: (a.k.a. Rhinoceros), a commercial NURBS-based 3D modeling tool,
developed by Robert McNeel & Associates · 14, 15
RICS: Royal Institution of Chartered Surveyors · 90, 140
S
SCG: Suicidator City Generator · 9
SDE: Stochastic Differential Equations · 37
SLM: Spatial Light Modulator · 48
SLS: Selective Laser Sintering process · 68, 69, 73, 74
STL: A file format native to stereolithography CAD software · 67
T
Tabletops: Horizontal Interactive Displays · 40, 41
V
VEs: Virtual Environments · 53
VR: Virtual Reality · 42, 43, 53, 54, 55
VRML: Virtual Reality Modeling Language · 43
I. Introduction
1 The Problematic
The adjustment of architectural forms to local and specific solar radiation conditions is a
fundamental study that architects must always conduct. When discussing energy
consumption and the harnessing of solar power in buildings, important questions come
into play, such as the real relation between a building's form and its energy behavior, or
how to find the right building shape for a specific location and its weather conditions
throughout the year. Several studies have been published that attempt to answer and
demonstrate these and other important questions. Form follows energy, but exactly how
this happens is difficult to demonstrate without automated tools and models; otherwise,
one must analyze the energy dependence between form and volume manually. Studies of
this kind attempt to simultaneously adapt a building's form in order to increase the
potential areas for the reception of solar radiation, while at the same time looking for ways
to reduce thermal loss (here the admittance method, well known to architects, is useful),
taking into account the need to design for specific locations and specific weather conditions.
This research work aims to examine the theoretical concepts associated with the problem of
"Form Follows Energy", as pointed out in studies by several researchers. The present study
also discusses emergent methods based on evolutionary algorithms and environmental
simulation tools, and it targets the development of new design methods that allow the
construction of sustainable, optimized buildings using digital technologies. To this end, an
automated tool and an optimization framework were created that allow the optimization of
3D shapes (buildings), taking into account the geo-location and specific weather
conditions, by running automated simulations and making autonomous changes and
optimizations using evolutionary algorithms.
For the creation of this work, a strong collaboration between the Polytechnic Institute of
Leiria/Superior School of Technology and Management/Centre for Rapid and Sustainable
Product Development (IPL/ESTG/CDRSP) and Fraunhofer Austria was established.
The main research objectives of this work can be listed as follows:
(i) To give an overview of state of the art technologies and techniques
currently employed in the architecture field, regarding simulation and
analysis, visualization, rendering, virtual interaction, rapid prototyping,
reverse engineering and automated construction;
(ii) To evaluate common shapes used in real world buildings, with focus on
greenhouses forms as a practical example case study;
(iii) To deduce the necessary parametric equations (required for the use of
computer graphics in the creation of procedural 3D models) for the several
identified common building shapes, and to extract the essential parameters
(height, width, length, orientation, roof angle…) of those fundamental
shapes, in order to achieve a fully parametrically defined 3D model; this
allows the use of Euclides (a JavaScript procedural modeler);
(iv) To research the possibility of programmatic integration with commonly
used simulation packages and tools, in order to simulate how the different
shapes of buildings influence energy consumption throughout the life of
real buildings, with the final purpose of developing a tool capable of
running automated simulations;
(v) To make use of evolutionary algorithms in order to perform autonomous
and automatic optimization of 3D shapes, based on automated simulations
and the well-known admittance method, always employing tools and
methods that are widely accepted in the architecture field. The final goal
is the creation of a global optimization framework for the automatic
generation of optimized 3D building forms that takes into account the
weather conditions of the specific location;
(vi) To present and explain the importance of the results achieved with the
developed global optimization framework, pointing out new directions in
sustainable architecture design;
Presently, this study is applied to architecture and sustainability. It must be noted,
however, that this problem of integrating evolutionary architecture and simulation tools
can be extended to other domains, such as the industrial or the medical field, which could
also benefit from the autonomous generation of different 3D shapes, as well as the
self-governing optimization of those same 3D forms.
1.1 Thesis Structure
The thesis is divided into five chapters, which develop in accordance with the identified
research objectives. This first chapter (Introduction), in addition to listing the key
objectives, briefly describes the context of the research.
The contents of the remaining chapters are summarized as follows:
State of the Art
Reviews the latest work developed around procedural modeling, visualization
techniques, digital fabrication and reverse engineering. It also presents and
describes a JavaScript framework, named “Euclides”, used for the easy creation
of 3D procedural and parametric shapes; this was the procedural framework used
in this thesis for the creation of all the necessary parameterized 3D building forms.
Lastly, a brief explanation of how evolutionary algorithms work is given.
Moreover, a specific evolutionary algorithm (Differential Evolution, DE) is
described, as well as the reasons why this particular algorithm was chosen in this
thesis for the development of an automated tool for the optimization of 3D shapes
(building forms). Other evolutionary algorithms are pointed out as plausible
alternatives to be implemented within the optimization framework in future work,
in order to tackle other problems;
Simulation Tools in Architecture
Gives an overview of different interests in simulation, in particular those related to
the problem of architecture and energy consumption in buildings. Some
simulation tools/packages are presented, together with the reasons for selecting a
particular tool for the work presented here;
A Global Optimization Framework
Explains the work developed throughout this thesis on the problem of how a
building's form affects its energy consumption on a daily basis throughout its entire
life. The methodologies used for choosing the parameters that control a specific 3D
shape, as well as the tools used to deduce parametric equations and “mathematical
code”, are described in order to show how these 3D parametric models were
generated using Euclides and JavaScript. The general concept of the developed
optimization framework is then explained: a practical overview of the work is
given, every framework component is presented in more detail, and the achieved
results are presented and clarified. Finally, a case study is presented in which the
problem of automatic optimization is extremely relevant, and the results obtained
are presented and explained;
Conclusions and Future Work
Provides an overall summary of the thesis and points out further progress paths and
improvement options for the autonomous global optimization framework that was
developed and presented in this thesis;
II. State of the Art
2 Overview
A review of the state of the art is presented, covering current work focused on procedural,
parametric and adaptive architecture modeling. A short description of evolutionary
algorithms is given. Innovative methods for the visualization and presentation of
architecture projects are also presented, as well as several techniques for rapid prototyping
and reverse engineering. These methods and techniques are essential to capture 3D
geometry, to achieve more complete results on any architecture project (e.g. production of
scale models for simulation in wind tunnels, virtual simulation, building control…) and to
present the achieved results to final customers (rendering, interactivity…). The author of
this thesis was a research member of the CDRSP Research Centre and earned a
scholarship, on the topic “Build-it-Green”, from the Portuguese Foundation for Science
and Technology (FCT). This topic is closely related to the architecture subject, and some
of the work developed at CDRSP by the author was focused on these same areas and was
conducted throughout the realization of this master thesis.
This state of the art review aims to present a short explanation of each product,
methodology or recent development. For more insightful details, the corresponding
references should be consulted.
2.1 Parametric and Procedural Modeling
Parametric Computer Aided Design (CAD) modeling nowadays assumes an important
role in the definition of 3D models. There are several active attempts to collect all the
information about a product or about the different parts that compose a product.
Information platforms like Product Data Management (PDM) or Product Lifecycle
Management (PLM) [1] offer a way to gather the distributed data that is vital for
efficient product management. However, there is some “intelligent” information that
must be captured with each part and product assembly, such as parametric information
(e.g. width, height, volume, orientation, length, relations between parts, formulas…).
Semantic methods, using ontologies, also try to present solutions for problems like the
relationship between different, yet related, 3D geometric information [2]. Procedural
modeling can be viewed as the use of different techniques in computer graphics to create
(generate) 3D models and textures from sets of rules. L-systems, fractals, and generative
modeling are procedural modeling techniques, since they apply algorithms to produce
scenes. The set of rules may be embedded in the algorithm and configurable by
parameters, or completely separated from the evaluation engine. The output is then
called procedural content, which can be used in computer games or films, be uploaded to
the internet while requiring much less bandwidth, or be edited manually by the user [3].
Procedural models often exhibit database amplification, meaning that large scenes can be
generated from a much smaller set of rules. If the employed algorithm produces the
same output every time, the output need not be stored; often, it is sufficient to start the
algorithm with the same random seed to achieve the same result. Although all modeling
techniques on a computer require algorithms to manage and store data at some point,
procedural modeling focuses on creating a model from a rule set.
Procedural modeling is often applied when it would be too cumbersome to create a 3D
model using generic 3D modelers, or when more specialized tools are required; this is
often the case for plants, architecture or landscapes [4].
2.1.1 Plugins for Existing 3D Software: Blender, 3D Studio Max and Others
There are many plugins available on the internet for commonly used 3D modeling
software, such as Autodesk 3D Studio Max, Autodesk Maya or the open-source modeling
software Blender, that allow the automatic generation of terrain, buildings or even entire
cities in a procedural way.
These plugins permit the creation of 3D models according to rules and custom parameters
specified by the user, and they can also be customized through the use of scripting
languages such as Python or MEL. Suicidator City Generator (SCG) is a good example
of such a plugin for Blender: it is a program written in the Python programming language
that runs inside the Blender environment [5].
It is not the purpose of this work to explain in detail how these plugins work; however,
they must be mentioned here as an existing and possible path for the creation of generative
components in today's 3D modeling software packages.
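The principle these plugins follow (user parameters plus rules yield geometry) can be illustrated with a small, self-contained sketch. This is hypothetical: it uses no Blender API at all, whereas a real plugin such as SCG would emit Blender meshes; the grid spacing and lot sizes are invented for the example.

```python
import random

# Hypothetical sketch of rule-plus-parameter city generation, in the spirit of
# plugins like SCG (a real plugin would create meshes through the Blender API).

def generate_city(blocks_x, blocks_y, max_floors, floor_height=3.0, seed=1):
    """Return one axis-aligned box (x, y, width, depth, height) per lot."""
    rng = random.Random(seed)                  # seeded: same city on every run
    buildings = []
    for i in range(blocks_x):
        for j in range(blocks_y):
            floors = rng.randint(1, max_floors)
            buildings.append({
                "x": i * 12.0, "y": j * 12.0,  # 12 m lot grid
                "width": 10.0, "depth": 10.0,  # 2 m street gap between lots
                "height": floors * floor_height,
            })
    return buildings

city = generate_city(4, 3, max_floors=8)
print(len(city))   # 12 buildings on a 4 x 3 grid
```

Exposing `blocks_x`, `blocks_y` and `max_floors` as user-editable controls is exactly the role the plugin's parameter panel plays inside the host application.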
Fig. II-1 – Suicidator City Generator (SCG) for Blender
2.1.2 CityEngine
CityEngine (now acquired by ESRI) is one of the most successful and powerful examples
of procedural modeling. It is standalone software that provides a unique conceptual
design and modeling solution for the efficient creation of 3D cities and buildings, aimed at
professional users in entertainment, architecture, urban planning, Geographic Information
Systems (GIS) and general 3D content production [4]. CityEngine was also tested in this
master thesis study.
Fig. II-2 – Procedural/parametric examples created in CityEngine
The key highlights of CityEngine include [6]:
GIS/CAD Data Support and OpenStreet Map Import
CityEngine supports industry-standard formats such as ESRI Shapefile or DXF,
which allow the import/export of any geo-spatial/vector data, such as parcels,
building footprints with arbitrary attributes, or line data used to create street
networks. To copy real cities or to efficiently create an urban environment for a
design, it is possible to use data from OpenStreetMap: geospatial data of real cities
can be downloaded and directly imported into CityEngine;
11
Dynamic City Layouts and Street Networks Patterns
An intuitive toolset is provided to interactively design, edit and modify urban
layouts consisting of (curved) streets, blocks and parcels. Street construction or
block subdivision is controlled via parametric interfaces, giving immediate visual
feedback; CityEngine offers unique street grow tools to quickly design and
construct urban layouts. Street patterns such as grid, organic or circular are
available, and the topography of the terrain is taken into account;
Rule-based Modeling Core
Procedural modeling based on Computer Generated Architecture rules (CGA shape
grammar) offers unlimited possibilities to control mass, geometry assets,
proportions, or texturing of buildings or streets on a city-wide scale. We can define
our own rules using custom textures/models in the node- or text-based rule editor;
Facade Wizard and Parametric Modeling Interface
One can quickly create rules out of an image or a textured mass model with this
simple and easy-to-use visual facade authoring tool. The resulting facade rules are
size-independent, contain level-of-detail and can be extended with, e.g., detailed
window assets. A convenient interface to interactively control specific street or
building parameters such as the height or age (defined by the rules) is provided and
with the live mode, parameter modifications invoke the automatic regeneration of
the 3D model;
Map-Controlled City Modeling and Reporting (Building Information Modeling -
BIM for Cities)
Any parameter of the buildings and streets can be controlled globally via image
maps (for example the building heights or the land use-mix); this allows for
intuitive city modeling and quick changes on a city-wide scale. Furthermore,
terrains can be imported, aligned, and exported. Customized rule-based reports can
be generated to analyze the urban design, e.g. to automatically calculate quantities
such as Gross Floor Area (GFA) and Floor Area Ratio (FAR). Reports are updated
automatically and instantaneously and can be made for whole city parts;
Industry-Standard 3D Formats
CityEngine supports Collada, Autodesk FBX, 3DS, Wavefront OBJ and e-on
software's Vue, which allow for flawless 3D data exchange; FBX and Collada
support asset instancing, multiple UV-sets, grouping and binary encoding;
furthermore, scenes can also be exported to RenderMan RIB or Mental Ray MI
format. Textures can be collected during (batch) export;
Python
Allows streamlining repetitive or pipeline-specific tasks with the integrated Python
scripting interface (e.g. write out arbitrary meta-data or instancing information for
each building, import FBX cameras, etc...). CityEngine is also available for
Windows (32/64 bits), Mac OSX (64 bits), and Linux (32/64 bits).
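The core idea of the rule-based modeling listed above, recursively splitting a mass into floors and tiles, can be sketched as follows. This is a simplified, hypothetical illustration; real CGA shape grammars in CityEngine are far more expressive, and the "resize to fill" splitting policy shown here is just one invented choice:

```python
# Simplified sketch of a CGA-style split rule: a facade is split vertically
# into floors, and each floor horizontally into window tiles. Pieces are
# resized so an integer number of them fills the available extent.

def split(total, part):
    """Split `total` into as many `part`-sized pieces as fit, resized to fill."""
    n = max(1, round(total / part))
    return [total / n] * n

def facade(width, height, floor_h=3.0, tile_w=2.0):
    floors = []
    for fh in split(height, floor_h):              # Facade -> split(y) -> Floor*
        floors.append((fh, split(width, tile_w)))  # Floor  -> split(x) -> Tile*
    return floors

floors = facade(11.0, 10.0)
print(len(floors), len(floors[0][1]))   # 3 floors, 6 window tiles per floor
```

In a real grammar each resulting tile would itself be rewritten by further rules (window, wall, door), which is what gives the approach its city-wide reach from a small rule file.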
Fig. II-3 – CityEngine IDE and the Rule Editor Capabilities
2.1.3 Bentley – MicroStation Extension:
GenerativeComponents (GC)
Designers have always wanted to innovate. Indeed, innovation is widely regarded as a
trophy awaiting creative professionals who successfully explore endless design
alternatives to ultimately arrive at the most efficient solution; this process can be
incredibly time-consuming, as each alternative must be thoroughly modeled and assessed.
Using existing tools, a minor change to a design may require a major update to the model,
restricting the number of design alternatives considered by the team due to time
constraints. GenerativeComponents is an associative parametric modeling system used by
architects and engineers to automate design processes and accelerate design iterations. As
an extension of MicroStation, GenerativeComponents builds on proven technologies and
delivers a significant advantage to users as they rapidly explore a broad range of design
alternatives. With its hybrid approach, designers who use GenerativeComponents can
model geometry, capture relationships, and generate forms using scripts and/or direct
manipulation, for unrivalled creative flexibility [7].
Fig. II-4 – GenerativeComponents (GC) IDE, Bentley MicroStation
This combination of accelerated iteration, flexible modeling, and automated process,
means that a GenerativeComponents design can be highly efficient, benefiting from a
combination of intuition and logic [7].
2.1.4 Rhinoceros and Grasshopper
Rhino (a.k.a. Rhinoceros) is a stand-alone, commercial NURBS-based 3D modeling tool,
developed by Robert McNeel & Associates. The software is commonly used for industrial
design, architecture, marine design, jewelry design, automotive design, CAD / CAM, rapid
prototyping, reverse engineering as well as the multimedia and graphic design industries.
Fig. II-5 – Rhinoceros is used in multiple fields, including architecture
Rhino specializes in free-form non-uniform rational B-spline (NURBS) modeling.
Plug-ins developed by McNeel include Flamingo (raytrace rendering), Penguin
(non-photorealistic rendering), Bongo and Brazil (advanced rendering). Over one
hundred third-party plugins are available, including rendering plug-ins for Maxwell
Render, V-Ray and many other engines. Additional plugins for CAM and CNC milling
are available as well, allowing for tool-path generation directly in Rhino. Like many
other modeling applications, Rhino also features a scripting language based on Visual
Basic, and an SDK that allows reading and writing Rhino files directly. Rhino gained its
popularity in architectural design in part because of the Grasshopper plug-in for
computational design; many new avant-garde architects are using parametric modeling
tools like Grasshopper. Rhino's increasing popularity is based on its diversity,
multi-disciplinary functions, low learning curve, relatively low cost, and its ability to
import and export over 30 file formats, which allows Rhino to act as a “converter” tool
between programs in a design workflow. The combination of Rhino and Grasshopper is
well suited to creating all kinds of parametric studies and developments in any field, and
many other plugins are available (rendering, math, physics, kinematics…).
Fig. II-6 – Voronoi Examples, created using Rhinoceros and Grasshopper, by Atsuo Nakajima (Tokyo,
Japan)
For designers who are exploring new shapes using generative algorithms, Grasshopper is a
graphical algorithm editor tightly integrated with Rhino's 3D modeling tools [8]. In Fig.
II-7, by using Grasshopper, a building and its structural supports are generated and
calculated using only two splines created initially in Rhino.
Fig. II-7 - Parametric Strategies achieved using Rhino and Grasshopper. (Created by the author of
this master thesis)
Unlike RhinoScript, Grasshopper requires no knowledge of programming or scripting, but
still allows designers to build form generators from the simple to the remarkable [8].
2.1.5 Generative Modeling Language (GML)
Traditionally, 3D objects and virtual worlds are defined by lists of geometric primitives:
cubes and spheres in a Constructive Solid Geometry (CSG) tree, NURBS patches, a set of
implicit functions, a soup of triangles, or just a cloud of points.
The term “generative modeling” describes a paradigm change in shape description: the
generalization from objects to operations. A shape is described by a sequence of
processing steps, rather than just the end result of applying operations; shape design
becomes rule design. This approach is very general, and it can be applied to any shape
representation that provides a set of generating functions, the “elementary shape
operators”. Its effectiveness has been demonstrated, e.g., in the field of procedural mesh
generation, with Euler operators as a complete and closed set of generating functions for
meshes, operating on the half-edge level [9].
Fig. II-8 - Creation of a simple house model using GML, the extrude operator is repeatedly applied to
the ground polygon. To create the roof, the combined operator collapse-mid is applied to the faceCW
and faceCCW edges of the edge returned by the extrude operation.
Generative modeling gains its efficiency through the possibility of creating high-level
shape operators from low-level shape operators. Any sequence of processing steps can be
grouped together to create a new “combined operator”, which may use elementary
operators as well as other combined operators. Concrete values can easily be replaced by
parameters, which makes possible the separation of data from operations: the same
processing sequence can be applied to different input data sets, and the same data can be
used to produce different shapes by applying different combined operators from, e.g., a
library of domain-dependent modeling operators. This makes possible the creation of very
complex objects from only a few high-level input parameters, such as a style library [2].
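The separation of operations from data described above can be sketched as plain function composition. This is a hypothetical Python analogy, not GML syntax: elementary operators become functions on a point list, and a combined operator is their composition, reusable on different input data sets:

```python
# Hypothetical analogy to GML's combined operators: elementary shape operators
# are plain functions on a profile (a list of 2D points), and a "combined
# operator" is their composition. This mimics the idea, not GML's syntax.

def translate(dx, dy):
    return lambda pts: [(x + dx, y + dy) for x, y in pts]

def scale(s):
    return lambda pts: [(x * s, y * s) for x, y in pts]

def combine(*ops):
    """Group a sequence of elementary operators into one combined operator."""
    def combined(pts):
        for op in ops:
            pts = op(pts)
        return pts
    return combined

# One combined operator, reusable on different input data sets:
place_large = combine(scale(2.0), translate(10.0, 0.0))

square = [(0, 0), (1, 0), (1, 1), (0, 1)]
triangle = [(0, 0), (1, 0), (0.5, 1)]
print(place_large(square))
print(place_large(triangle))
```

Replacing the concrete values `2.0` and `10.0` by parameters would give a small "style" of placements, which is the mechanism a GML style library scales up to whole building facades.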
Fig. II-9 – Parameterization/Configuration of a Chair with GML
GML is a concrete implementation of the generative approach. Its main feature is that it is
a full functional programming language that can nevertheless be used efficiently as a file
format for low-level shape descriptions. Only 25 kilobytes of GML code for a Gothic
window style library are sufficient to generate connected manifold control meshes for a
variety of windows [10].
Fig. II-10 – Gothic Style Building generated with GML
The GML comes with an integrated visualization engine. Thus, it can also be seen as a
viewer with an integrated modeler that overcomes the usual separation of 3D modeling
from interactive visualization. Curved parts are represented as subdivision surfaces that,
within 1-2 seconds, unfold to seven million vertices after four steps of recursive
refinement. The surface is adaptively displayed at interactive rates using optimized
methods for culling and per-face per-frame multi-resolution rendering [11].
Fig. II-11 – Configuration of Different Wheel Rim Styles using GML
2.1.6 Euclides Framework and JavaScript
Providing easy access to programming languages that are difficult to use directly will
dramatically increase their adoption. GML [9] is such a language and can be described as
similar to Adobe's PostScript. A major drawback of all PostScript dialects is their
unintuitive reverse Polish notation, which makes both reading and writing a burdensome
task. According to Strobl et al. [12], a language should offer a structured and intuitive
syntax in order to increase efficiency and avoid frustration during the creation of code. To
overcome this issue, Strobl et al. [12] propose a new approach that translates JavaScript
code to GML automatically. Within the last few years, generative modeling techniques
have gained attention, especially in the context of cultural heritage. Because a generative
model describes a rather ideal object and not a real one, generative techniques are a basis
for object description and classification. This procedural knowledge differs from other
kinds of knowledge, such as declarative knowledge, in a significant way: it can be applied
to a task. This similarity to algorithms is reflected in the way generative models are
designed: they are programmed. In order to make generative modeling accessible to
cultural heritage experts, Schinko et al. [13] created a generative modeling framework
which accounts for their special needs. The result is a generative modeler called Euclides,
based on an easy-to-use scripting language (JavaScript). The generative model meets the
demands of documentation standards and fulfills sustainability conditions, and its
integrated meta-modeler approach makes it independent of hardware, software and
platforms.
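The readability gap of reverse Polish notation, and the idea of translating a structured infix syntax into it automatically, can be illustrated with a small sketch. This is hypothetical and is not how Euclides translates JavaScript to GML; it is merely an analogy based on Dijkstra's shunting-yard algorithm, with invented operand names:

```python
# Hypothetical illustration: convert an infix expression into reverse Polish
# notation (postfix), the form used by PostScript dialects such as GML.
# Based on Dijkstra's shunting-yard algorithm; not part of Euclides itself.

PRECEDENCE = {"+": 1, "-": 1, "*": 2, "/": 2}

def to_rpn(tokens):
    """Convert a list of infix tokens to postfix (RPN) order."""
    output, stack = [], []
    for tok in tokens:
        if tok in PRECEDENCE:
            # Pop operators of equal or higher precedence first.
            while (stack and stack[-1] in PRECEDENCE
                   and PRECEDENCE[stack[-1]] >= PRECEDENCE[tok]):
                output.append(stack.pop())
            stack.append(tok)
        elif tok == "(":
            stack.append(tok)
        elif tok == ")":
            while stack[-1] != "(":
                output.append(stack.pop())
            stack.pop()  # discard the "("
        else:  # operand
            output.append(tok)
    while stack:
        output.append(stack.pop())
    return output

# "(width + 2) * height" becomes "width 2 + height *" in RPN
print(" ".join(to_rpn(["(", "width", "+", "2", ")", "*", "height"])))
```

Even for this tiny expression, the postfix form obscures the grouping a reader sees at a glance in the infix version, which is precisely why an automatic translator from a structured syntax lowers the barrier to such languages.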
Fig. II-12 – Example of a 3D application created for this thesis using Euclides and JavaScript. It allows
the control of several shape parameters on the “Classic Building form example”.
2.1.7 Autodesk Revit Architecture 2012
In recent years, Autodesk has made a strong effort to incorporate new technologies (e.g.
multitouch) and new functionalities (physics, energy analysis, parametric design) in
existing products like Autodesk Revit 2012 or Maya 2012, making a strong contribution
to the development of truly innovative products.
Autodesk Revit Architecture can be used to create massing designs; explore design
alternatives based on qualitative and quantitative feedback; and help address various
environmental, constructability, and aesthetic concerns that can arise during project
realization [14].
Fig. II-13 – Energy Consumption Study using Autodesk Revit
In the early stages of a design, visualizing a concept in 3D enhances a designer's ability to
communicate ideas. Analyzing these ideas yields the ability to predict and optimize the
real-world performance of the built project.
These attributes form a core value of the Building Information Modeling (BIM) process,
which Revit Architecture software is purpose-built to support.
In Autodesk Revit Architecture, users have access to a robust collection of easy-to-use
modeling tools that facilitate design conceptualization, visualization, and communication.
This software supports several new modeling operations, including adaptive, component-
driven geometry, robust UV grid manipulation and increased schedule functionality
through reporting parameters. In addition, Revit users on Autodesk Subscription can now
access tools that enable them to better assess the impact of their early design decisions on
energy consumption and carbon emissions without leaving the Revit environment. In order
to clearly illustrate a complete workflow using the conceptual design and analysis tools
and to address the new features introduced with the previous release [14]:
The Project requirements section outlines the criteria that will drive the building
design;
The Parametric Massing Design section describes the steps taken to explore
massing design alternatives informed by qualitative and quantitative feedback;
The Site and Environmental Analysis section addresses the impact of building mass
and orientation on energy consumption and overshadowing;
The custom “Panelization” section uses the mass design options generated in the
first section as the basis for informed panel studies.
Fig. II-14 – Conceptual Design in Autodesk Revit Architecture
2.1.8 Project Vasari and Project Nucleus
Autodesk Project Vasari is an easy-to-use, expressive design tool for creating building
concepts, built on the same technology as the Autodesk Revit platform.
Project Vasari goes further, with integrated analysis for energy and carbon, providing
design insight where the most important design decisions are made. And when it is time to
move the design to production, the Vasari design data can simply be brought into the
Autodesk Revit platform for BIM, ensuring clear execution of design intent.
Project Vasari is still under development and is primarily intended to reduce building
energy loads, not to replace more detailed analysis tools. It is able to produce
conceptual models using both geometric and parametric modeling functionality, and the
designs can be analyzed using the built-in energy modeling and analysis features. The tool
depends on Green Building Studio (Autodesk's green-building analysis web service) for
many of its energy-related input parameters [15].
Fig. II-15 – Sun Studies using project “Vasari”
Project Vasari is focused on conceptual building design using both geometric and
parametric modeling. It supports performance-based design via integrated energy modeling
and analysis features. This new technology preview is now available as a free download
and trial on Autodesk Labs.
Project Nucleus integrates the Nucleus simulation engine from Autodesk Maya into
Autodesk Revit Architecture and Project Vasari. It allows designers to experiment with
"form-finding" in the conceptual design phase by simulating forces directly in Revit
Architecture and Project Vasari (the latest technology preview of Project Vasari already
includes the Project Nucleus functionality).
Fig. II-16 - Panel Study using Revit and Vasari
Project Nucleus can simulate a wide range of physical phenomena in real time, like wind,
gravity, constraints, and collisions. These forces can help architects generate free-form
shapes, many of which would be impossible to model by hand [16].
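The kind of force-based form-finding described above can be sketched, in a highly simplified and purely hypothetical form, as dynamic relaxation of a particle-spring system: gravity loads the particles, springs resist stretching, and damped integration settles the chain into an equilibrium shape. All constants (spring stiffness, damping, time step) are invented for the example; the real Nucleus engine is far more general (cloth, wind, collisions):

```python
# Hypothetical sketch of force-based form-finding (dynamic relaxation) for a
# hanging chain of particles connected by springs, fixed at both endpoints.

def relax_chain(n=5, iterations=2000, k=50.0, rest=1.0,
                gravity=-0.5, damping=0.9, dt=0.02):
    """Return equilibrium (x, y) positions of a chain fixed at both ends."""
    xs = [float(i) for i in range(n)]           # initial horizontal layout
    ys = [0.0] * n
    vx = [0.0] * n
    vy = [0.0] * n
    for _ in range(iterations):
        fx = [0.0] * n
        fy = [gravity] * n                      # gravity on every particle
        for i in range(n - 1):                  # spring between i and i+1
            dx, dy = xs[i+1] - xs[i], ys[i+1] - ys[i]
            length = (dx*dx + dy*dy) ** 0.5
            f = k * (length - rest)             # Hooke's law
            fx[i] += f * dx / length; fy[i] += f * dy / length
            fx[i+1] -= f * dx / length; fy[i+1] -= f * dy / length
        for i in range(1, n - 1):               # endpoints stay fixed
            vx[i] = (vx[i] + fx[i] * dt) * damping
            vy[i] = (vy[i] + fy[i] * dt) * damping
            xs[i] += vx[i] * dt
            ys[i] += vy[i] * dt
    return list(zip(xs, ys))

shape = relax_chain()
# interior particles sag below the fixed endpoints, forming a catenary-like arc
print(shape)
```

Inverting the sagging shape yields a compression-only arch, which is the classical use of such "hanging" form-finding in architecture.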
Fig. II-17 - Using Revit, Vasari and Nucleus Physics for a Panel Study, plus Analysis
2.1.9 Autodesk Adaptive Components
Adaptive geometry can be sized and positioned in the context where it is used. When you
designate under-constrained geometry as adaptive, you specify the geometric elements
that are allowed to change, while controlling the elements that you want to remain at a
fixed size or position [17].
"Adaptivity" is the functionality within Inventor that allows the size of a part or feature to
be determined by setting a relationship to another part in an assembly. Basically,
"adaptivity" is a special way to add constraints. These constraints differ from regular
constraints in that they are driven from a separate file, which can be an assembly file or
another part within the assembly file.
A good example of "adaptivity" is constraining a shaft to a hole in another part. If set up
correctly, when the size of the hole changes, the diameter of the shaft updates as well.
"Adaptivity" is normally used during the initial design phase of a model, when changes are
made rapidly and many parts are affected.
Fig. II-18 – Adaptive Components in Autodesk Revit
Once a design is released and its parts become standard parts, available for use in other
designs, "adaptivity" should be removed to eliminate the possibility of inadvertently
changing a released design. Removing "adaptivity" also improves performance.
Fig. II-19 – Adaptive Panel example in Autodesk Revit
As with any other constraint, forethought should be given to how a design may
change before "adaptivity" is applied. If a part is not likely to change, it is better to apply
normal (non-adaptive) constraints. "Adaptivity" should be used only when absolutely
necessary [17].
Fig. II-20 – Another Adaptive Panel example, built using Adaptive Components
2.2 Evolutionary Architecture and the Use of
Algorithms in Optimization of Problems
The first references to this field of computation, Evolutionary Solvers or Genetic
Algorithms [18], can be found in the early 1960s, when Lawrence J. Fogel published the
revolutionary paper "On the Organization of Intellect" [19], which steered the first
endeavors into evolutionary computing. The early 1970s saw further ventures, with
important work produced by Ingo Rechenberg, John Henry Holland and others [20].
Evolutionary Computation did not gain popularity beyond the programming world until
Richard Dawkins (one of my favorite authors) published the book "The Blind
Watchmaker" in 1986 [21], which was accompanied by a small program that generated an
apparently endless stream of body plans called "biomorphs", based on human selection.
Fig. II-21 – Image taken from the book “The Selfish Gene” by Richard Dawkins
After the 1980s, the advent of the personal computer made it possible for individuals
without government funding to apply evolutionary principles to personal projects, and the
term "Evolutionary Computing" became common jargon. It is well known today, but it is
still very much a programmer's tool (by programmers and for programmers) [18, 22].
Existing applications that apply evolutionary logic are either aimed at solving specific
problems, or are generic libraries that allow other programmers to develop their own
software [21].
Fig. II-22 – Several visions related to Evolutionary Architecture and Biomimetic
One of the most important works ever published about Evolutionary Architecture is
John Fraser's book "An Evolutionary Architecture" [23]. In the book's introduction one
can read: "…in this book the author investigates the fundamental form-generating
processes in architecture, considering architecture as a form of artificial life, and proposing
a genetic representation in a form of DNA-like code-script, which can then be subject to
developmental and evolutionary processes in response to the user and the environment.
The aim of an evolutionary architecture is to achieve in the built environment the
symbiotic behavior and metabolic balance found in the natural environment. To do so, it
operates like an organism, in a direct analogy with the underlying design process of
nature".
Fig. II-23 – Evolutionary examples taken from the book “An Evolutionary Architecture” by John
Fraser [23]
Gordon Pask also wrote, in his foreword to this same book: "The book also proposes a
fundamental change in practice… 'The role of the architect here, I think, is not so much to
design a building or city as to catalyze them: to act that they may evolve'. Promising
sustainable design methods are unquestionably emerging through the use of evolutionary
computation and environmental simulation tools, as this is indeed an essential need in
today's architecture world".
Fig. II-24 – Dynamic Geometry Computation, “Shanghai Tower - Geometry Generate and Rendering”
(Michael Peng)
Eugene Tsui, in his work and book "Evolutionary Architecture: Nature as a Basis for
Design" [24], as well as Javier Senosiain, Michael Paulin, William McDonough, Renzo
Piano and many other architects, incorporate ecological and sustainable principles in their
projects, but also integrate an understanding that construction requires "a holistic
approach to form, materials and efficiency, with Nature becoming the infallible mentor in
the creation of a comfortable and symbiotic world" [25].
Fig. II-25 – Ecological House of the Future (Eugene Tsui)
2.2.1 Differential Evolution (DE)
Differential Evolution (DE) [26] has been very successful in solving the global continuous
optimization problem [27]. It mainly uses distance and direction information from the
current population to guide its further search. The global optimization problem arises in
almost every field of science, engineering and business, and an enormous amount of effort
has been devoted to solving it. The major challenge of global continuous optimization is
that the problems to be optimized may have many local optima (Sun, J., et al. [27]).
Fig. II-26 – The Global Optimization Problem (example in Matlab): the search for the highest mountain
peak among a neighborhood of other high mountain peaks.
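To make the difficulty concrete, the Rastrigin function (a standard multimodal benchmark, used here purely as an illustration and not the function shown in the figure) surrounds a single global minimum with a regular grid of local minima:

```python
import math

def rastrigin(x):
    """Rastrigin function: a classic multimodal benchmark with its
    global minimum of 0 at the origin and local minima near every
    integer coordinate."""
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi)
                             for xi in x)

# Sampling a few points in one dimension shows several "valleys":
# the function dips near every integer, but only x = 0 attains 0.
values = {x: rastrigin([x]) for x in [-2.0, -1.0, 0.0, 1.0, 2.0]}
best = min(values, key=values.get)
```

A greedy local search started near x = 2 would stop in the nearby valley and never reach the global minimum, which is exactly the trap that population-based methods such as DE are designed to escape.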
Evolutionary Algorithms (EAs) mimic the evolution of a biological population that adapts
to a changing environment: they find the optimum of an optimization problem by evolving
a population of candidate solutions. Differential Evolution (DE) is one of the most
successful EAs for the global continuous optimization problem. Several examples of
problem solving using DE have already been presented in the past, particularly those
presented by the creators of the DE algorithm (Price, K. S., et al. [28]).
Evan Greenberg [29] discusses in his master thesis a natural behavior called "branching"
that occurs in natural systems for functional reasons. The branching logic for each specific
system is quite different due to environmental and mathematical factors. In the
computation of branching systems, these mathematical factors can be incorporated easily
into the coding of each system. Nevertheless, the environmental components deserve
further consideration in the simulation of these natural systems. Through the engine of
genetic algorithms based on evolutionary developmental theory, the specific logics
observed and analyzed in branching patterns of river systems or trees, can be simulated and
optimized in a digital environment.
Fig. II-27 - Observation, Analysis and Computation of Branching Patterns in Natural Systems (by
Evan Greenberg [29])
There are some biological terminologies which are used in evolutionary algorithm
implementations, such as:
Individual: an autonomous piece characterized by a chromosome. In this case, one
possible solution to the design problem;
Population: a group of individuals;
Population Size: the number of individuals in a population;
Gene: a functional block of DNA;
Allele: a possible value of a gene;
Chromosome: strings of DNA. In this case, a list of parameters;
Locus: the place of a gene in a chromosome.
In evolutionary algorithms there are also three different types of operators: Selection,
Crossover, and Mutation. After initializing the parameters, these three operators are
iterated until the results satisfy the terminal criteria defined.
Each step of the algorithms is explained as follows (Kawakita, G. [30]):
Initialization - In this step, some parameters, including the population size, the
number of generations and so on, are entered. After that, the genotype individuals
of the first generation are randomly generated. The population size is particularly
significant: the longer the chromosome, the bigger the population size should be.
Additionally, a bigger population size requires a longer calculation time until
convergence, while small population sizes may result in premature and undesirable
convergence;
Evaluation - Fitness scores are calculated for further selection of fitter
chromosomes. One of the most important aspects in this step is the Fitness Function
which calculates the fitness measurement of each individual. This operation is
deeply related to the efficiency of the whole Genetic Algorithm (GA) flow;
therefore, it needs to be determined carefully;
Selection - The fitter chromosomes in the population are basically selected for
reproduction. As in biological evolution, the fitter chromosomes are more likely to
be selected and reproduced in each generation. Meanwhile, lower fitness
chromosomes are also possibly selected, but with a lower probability. This
probabilistic selection depends on the selection method. There are several types of
selection such as elite selection, roulette selection, tournament selection, etc. Each
selection type has advantages and disadvantages. For instance, in elite selection, the
fitter chromosomes are certainly selected in order; however, premature
convergence is highly possible;
Crossover - Crossover roughly mimics the genetic operation of biological
recombination between two chromosomes. The fitter chromosomes are chosen by
the selection operator; however, selection alone is not effective enough to evolve
the population. The crossover operator encourages more variation by exchanging
genes between two chromosomes;
Mutation - The mutation operator randomly flips or changes genes in a
chromosome between alleles, generally with a very low probability. Chromosomes
generated by the crossover operator are basically copies of the parent
chromosomes. Therefore, premature convergence possibly occurs. Chromosomes
that have been mutated help to avoid premature convergence. Generally speaking,
the mutation rate should be 1/L, where L is the length of chromosome. Moreover, if
the mutation rate is too big, the algorithm becomes similar to a random search;
Terminal Criterion - In this step, the conditions required to terminate the algorithm
are evaluated. If the process is regarded as complete, the fittest individual in the
generation is output as one of the possible optimum solutions.
The general conditions of convergence in evolutionary algorithms are as follows:
If the fittest score in the population satisfies a certain target – star gene;
If the average fitness score in the population satisfies a certain target –
population improvement;
If the increase or decrease of fitness scores in the population falls below a
certain value – convergence;
If the number of generations exceeds the defined value – finite iteration.
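The steps above can be condensed into a minimal sketch (an illustration only, using elite selection, one-point crossover and the 1/L mutation rate mentioned above; the function and parameter names are this sketch's own, not from any cited implementation):

```python
import random

def evolve(fitness, length=16, pop_size=30, generations=100, target=None, seed=1):
    """Minimal genetic algorithm over bit-string chromosomes, following the
    steps above: initialization, evaluation, selection, crossover, mutation
    and a terminal criterion."""
    rng = random.Random(seed)
    mutation_rate = 1.0 / length               # the 1/L rule of thumb
    # Initialization: random genotype individuals of the first generation.
    pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):               # finite-iteration criterion
        scored = sorted(pop, key=fitness, reverse=True)  # evaluation
        if target is not None and fitness(scored[0]) >= target:
            break                              # fittest score reached the target
        parents = scored[: pop_size // 2]      # elite selection
        pop = []
        while len(pop) < pop_size:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, length)     # one-point crossover
            child = a[:cut] + b[cut:]
            child = [g ^ 1 if rng.random() < mutation_rate else g
                     for g in child]           # bit-flip mutation
            pop.append(child)
    return max(pop, key=fitness)

# "OneMax" toy problem: fitness is simply the number of 1-bits, so the
# optimum is the all-ones chromosome.
best = evolve(fitness=sum, target=16)
```

Note how the premature-convergence risks discussed above appear directly in the parameters: a smaller `pop_size` or a mutation rate far from 1/L makes the sketch converge early or degenerate into random search.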
Fig. II-28 – Simple EA’s steps
As with every algorithm, there are several different variations of the differential
evolution algorithm. In order to classify the different variants, the notation DE/x/y/z was
introduced, where:
x specifies the vector to be mutated which can be “rand” (a randomly chosen
population vector) or “best” (the vector of lowest cost from the current population);
y is the number of difference vectors used;
z denotes the crossover scheme. Example: “bin” (Crossover due to independent
binomial experiments).
Using this notation, the basic DE strategy described here can be written as
DE/rand/1/bin, but there are other variants, e.g. DE/best/2/bin.
The DE algorithm works (in general) as follows (Price, K. S., et al. [28]):
1. The DE algorithm maintains a population of N points in every generation, where
each point is a potential solution and N is a control parameter;
Then the algorithm evolves and improves the population iteratively:
2. In each generation, a new population is generated based on the current population;
3. To generate descendants for the new population, the algorithm extracts distance and
direction information from the current population members and adds random
deviation for achieving diversity;
4. If an offspring has a lower objective function value than a predetermined
population member, it will replace this population member;
5. This evolution process continues until a stopping criterion is met (e.g., the current
best objective function value is smaller than a given value or the number of
generations is equal to a given maximum value).
Fig. II-29 – General Evolutionary Algorithm: i: initialization, f(X): evaluation, ?: stopping criterion,
Se: selection, Cr: cross-over, Mu: mutation, Re: replacement, X*: optimum. Author: Johann "nojhan"
Dréo
The optimization method known as Differential Evolution (DE) has several parameters that
determine its behavior and efficacy in optimizing a given problem. The selection of good
parameters for DE is an important question, discussed by Pedersen, M. [31]; this paper
gives a list of good choices of parameter values for various optimization scenarios,
intended as an easy aid when choosing the values that achieve the best results. These
interrelated parameters are: the crossover probability (CR), for which a good initial value
is usually 0.9 or 1.0, to check if a quick solution is possible; the number of fitness
evaluations; the number of elements in each generation (NP), for which a good value lies
between 5*D and 10*D (D = dimension of the problem), although NP must be at least 4 to
ensure that DE has enough mutually different vectors to work on; the differential weight
(F), for which a good initial value is usually 0.5; the number of variables of the problem
(problem dimensions); and the size of the domain of each variable of the problem.
Fig. II-30 - DE optimization performance (Pedersen, M. [31]) on several different problems using the
DE/rand/1/bin algorithm. Plots show the mean fitness achieved over 50 optimization runs
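These rules of thumb can be gathered into a small helper (a sketch of the guidelines above, not Pedersen's tuned per-scenario values; the function name is hypothetical):

```python
def de_default_parameters(dimensions):
    """Rule-of-thumb starting control parameters for DE/rand/1/bin,
    following the guidelines summarized above."""
    return {
        "NP": max(4, 10 * dimensions),  # population size in [5*D, 10*D], never below 4
        "F": 0.5,                       # differential weight: a common starting point
        "CR": 0.9,                      # crossover probability: try 0.9-1.0 first
    }
```

For an 8-dimensional problem this yields NP = 80, F = 0.5 and CR = 0.9, a reasonable first attempt before any problem-specific tuning.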
In simple terms, optimization is the attempt to maximize a system‟s desirable properties
while simultaneously minimizing its undesirable characteristics. What these properties are
and how effectively they can be improved depends on the problem at hand (Price, K. S., et
al. [32]).
The optimization method known as Differential Evolution (DE) was originally
introduced by Storn and Price [32] and offers a way of optimizing a problem without using
its gradient. This is particularly useful if the gradient is difficult or even impossible to
derive (Pedersen, M. [31]). DE maintains a population of agents which are iteratively
combined and updated using simple formulae to form new agents. The parameters that
determine DE's behavior and efficacy must be chosen accordingly [31, 32], and small
changes to the DE implementation can cause dramatic changes in the parameter values that
yield good optimization performance. The parameters given by Pedersen [31] have been
tuned for the DE/rand/1/bin algorithm; if another DE implementation is chosen, different
parameter values must be selected.
According to Storn and Price [26], the DE algorithm (DE/rand/1/bin) was demonstrated to
converge faster and with more certainty than many other acclaimed global optimization
methods. DE also requires fewer control variables; it is robust, easy to use and lends itself
to parallel computation. Storn and Price [26] compared and tested DE against other
optimization methods on minimization problems. The DE algorithm uses simultaneous
search vectors to help escape local minima (rather than relying only on the usual greedy
criterion, where a new parameter vector is accepted if and only if it reduces the value of
the cost function).
DE was also compared to Simulated Annealing [33] (annealing relaxes the greedy
criterion by occasionally permitting an uphill move, allowing a parameter vector to climb
out of a local minimum; in the long run, however, this method also reduces to a greedy
criterion).
Usually, users demand that a practical minimization technique fulfill four
requirements:
1. Ability to handle non-differentiable, nonlinear and multimodal cost functions;
2. Parallelizability to cope with computation intensive cost functions;
3. Ease of use, i.e. only a few control variables are needed to steer the minimization.
These variables should also be robust and easy to choose;
4. Good convergence properties, i.e. consistent convergence to the global minimum in
consecutive independent trials.
Once more, according to Storn and Price [26], the DE algorithm was designed to fulfill all
four of the above requirements. To fulfill requirement (1) DE was designed to be a
stochastic (random, non-deterministic) direct search method. Requirement (2) is useful for
computationally demanding optimizations; DE fulfills it by using a vector population in
which the stochastic perturbation of the population vectors can be done independently. To
satisfy requirement (3) (contrary to basic local minimization methods based on the
annealing concept), DE's self-organizing scheme takes the difference vector of two
randomly chosen population vectors to perturb an existing vector; the perturbation is done
for every population vector and, according to the authors, this is the crucial idea of DE.
Finally, regarding requirement (4), good convergence properties are mandatory for a good
minimization algorithm and can only be shown by extensive testing under various
conditions.
DE generates new parameter vectors by adding the weighted difference between two
population vectors to a third vector (mutation). The mutated vector's parameters are then
mixed (crossover) with the parameters of another predetermined vector (the target vector)
to produce the so-called trial vector. If the trial vector yields a lower cost function value
than the target vector, the trial vector replaces the target vector in the following generation
(selection). Each population vector has to serve once as the target vector, so that NP
competitions take place in one generation.
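The mutation, crossover and selection scheme just described can be sketched as follows (an illustrative DE/rand/1/bin implementation, not the code used by Storn and Price; the function name, signature and bound handling are this sketch's own):

```python
import random

def differential_evolution(cost, bounds, np_pop=20, f=0.5, cr=0.9,
                           generations=200, seed=1):
    """Sketch of DE/rand/1/bin: each target vector is perturbed by the
    weighted difference of two random vectors added to a third (mutation),
    mixed per coordinate with probability CR (binomial crossover), and
    replaced by the trial vector only if the trial is better (selection)."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(np_pop)]
    for _ in range(generations):
        for i in range(np_pop):            # every vector serves once as target
            r1, r2, r3 = rng.sample([j for j in range(np_pop) if j != i], 3)
            j_rand = rng.randrange(dim)    # guarantee at least one mutated gene
            trial = []
            for j in range(dim):
                if rng.random() < cr or j == j_rand:
                    v = pop[r1][j] + f * (pop[r2][j] - pop[r3][j])
                else:
                    v = pop[i][j]
                lo, hi = bounds[j]
                trial.append(min(max(v, lo), hi))   # keep within the domain
            if cost(trial) <= cost(pop[i]):         # greedy one-to-one selection
                pop[i] = trial
    return min(pop, key=cost)

# Minimizing the sphere function, a standard smoke test.
sphere = lambda x: sum(v * v for v in x)
best = differential_evolution(sphere, bounds=[(-5.0, 5.0)] * 3)
```

The one-to-one replacement means the population never worsens, while the random difference vectors keep enough diversity to climb out of local basins, which is the behavior the text attributes to DE.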
In one DE variant (DE/best/2/bin), the usage of two difference vectors seems to improve
the diversity of the population if the number of population vectors NP is high enough.
The DE algorithm participated in the First International IEEE Competition on Evolutionary
Optimization (1st ICEO), where it proved to be the fastest evolutionary algorithm.
Encouraged by these results, Storn and Price [26] tested DE against other preeminent
minimization methods: annealing methods, i.e., the Annealed Nelder and Mead strategy
(ANM) and Adaptive Simulated Annealing (ASA); Evolutionary Strategies (ESs)/Genetic
Algorithms (GAs), i.e., the Breeder Genetic Algorithm (BGA) and the Evolutionary
Algorithm with Soft Genetic Operators (EASY); and finally Stochastic Differential
Equations (SDE). In the test against the ANM and ASA methods, DE was the only strategy
that could find all global minima of the test suite, and in most cases it found the minimum
in the least number of function evaluations. In the test against BGA and EASY, DE
performed favorably and needed the least number of function evaluations. DE also
exhibited superior performance compared to SDE's reported best results: none of the 1000
DE trial runs failed to find the global minimum, and the DE control variable settings could
remain the same for most of the test functions (an indication of DE's robustness). As
demonstrated by the authors, DE outperformed most of the other mentioned minimization
approaches in terms of the number of function evaluations required to locate a global
minimum of the test functions.
DE is also easier to use, as it requires only a few robust control variables which can be
drawn from a well-defined numerical interval; the DE method of self-organization is quite
remarkable. Future research should include a mathematical convergence proof, like the one
that already exists for Simulated Annealing, and should also determine why DE converges
so well.
2.2.2 Pros and Cons of using Evolutionary
Algorithms (EAs)
EAs can be characterized by some features such as coded parameters, global heuristic
search, fitness-based selection, and so on [26].
Pros of using Evolutionary Algorithms
o They are a very general search method, and are not specific to any particular
application;
o EAs are very effective at avoiding becoming trapped in local optima;
o They are remarkably flexible and able to tackle a wide variety of problems.
There are classes of problems which are by definition beyond the reach of
even the best solver implementation, and others that are just very difficult
to solve by any algorithm, but the majority of the problems encountered on
a daily basis fall into the evolutionarily solvable category;
o Evolutionary Algorithms are also quite forgiving. They will happily chew
on problems that have been under- or over-constrained or otherwise poorly
formulated;
o Because the run-time process is progressive, intermediate answers can be
harvested at practically any time. Unlike many dedicated algorithms,
Evolutionary Solvers can give us a never-ending stream of answers, where
newer answers are generally of higher quality than older ones. So even a
prematurely aborted run will yield something which could be called a
result; it might not be a very good result, but it will be a result that can be
used.
Cons of using Evolutionary Algorithms
o They are fitness-based, and it is necessary to calculate fitness scores for
each individual in every generation; therefore, processing loads are very
high. In order to produce good solutions, bigger population sizes are
generally necessary, and a bigger population size causes higher processing
loads;
o The most serious disadvantage is the complexity and variety of initial
parameters. Satisfactory solutions require accurate input of initial setup
parameters such as population size, mutation rate and so on; however, these
change depending on the objectives. Therefore, it is somewhat complicated
and unpredictable for beginners to determine accurate setup parameters,
not only with this algorithm but with any other algorithm as well.
2.2.3 Other Evolutionary Based Algorithms - DE/EDA
and Hybrid-DE
In a paper by Sun, J., et al. [27], a combination of Differential Evolution (DE) and
Estimation of Distribution Algorithm (EDA), called DE/EDA, is presented for the global
continuous optimization problem. DE/EDA combines global information extracted by
EDA with differential information obtained by DE to create promising solutions; in this
way, both global and local information are used to guide the further search. DE/EDA was
compared with the best versions of the DE algorithm and of the EDA on several commonly
used test problems. According to Sun, J., et al. [27], the experimental results demonstrate
that DE/EDA outperforms the DE algorithm or the EDA alone. In another proposal, the
Hybrid Differential Evolution (HDE) algorithm [34] is presented; according to its authors,
it is a simple population-based, stochastic function minimizer that constitutes an extension
of the original DE algorithm introduced by Storn and Price [26].
2.3 Advanced Rendering, Visualization and Interaction
Techniques in Architecture
Nowadays, an important factor in any project, and more specifically in architecture, is the
graphical presentation of the project. Presently it is possible to find many fantastic options
to present final data and information to customers (or for discussion at an early phase
of the project), from simple presentations of 3D models to online internet services (e.g.
Microsoft Bing Maps, Google Earth, GIS…), 3D game engines, serious games, virtual
and augmented reality, professional rendering and holograms.
Moreover, more advanced interaction with information can be achieved through the use
of multitouch, augmented reality, interactive visualizations, dashboards, game controllers,
face/hand recognition, haptic devices, airborne ultrasound sensors or brain control. All
these technologies can sometimes be a replacement for other production techniques,
like prototyping, rapid prototyping or small production runs, where the only purpose is to
show or demonstrate the functionality of a product. It is not the purpose of this work to
explain all these technologies in detail, but a description of the technologies considered
state of the art is presented, some of which were used by the author of this work
throughout his master course.
2.3.1 Multitouch (MTT)
Multitouch designates a set of interaction techniques that allow users to control graphical
applications with several fingers. Multitouch devices consist of a touch surface (e.g., a
computer display, table, wall or touchpad), as well as software that recognizes multiple
simultaneous touch points, in contrast to standard touchscreens (e.g. single-touch
touchpads, ATM machines), which recognize only one touch point [35]. Tabletops
(a.k.a. Horizontal Interactive Displays) are fascinating interfaces with unique features. In
everyday work, and in educational and entertainment environments, tabletops can provide
a stunning experience of intuitive interaction by utilizing direct touches and gestures,
which is ideal for small-group collaboration.
Recent developments of various technologies such as display and multi-touch technologies
open up new possibilities to enrich interaction on horizontal interactive displays. The book
"Tabletops - Horizontal Interactive Displays" (Müller-Tomfelde, C. [36]) brings together
current research in the domain of tabletops; it integrates and summarizes findings from the
most important international tabletop research teams, provides a state-of-the-art overview,
and allows for the discussion of emerging and future directions in tabletop research and
technology.
In the first year of this master course, the author of this thesis participated in the
development of a project called "MTT4ALL" [37]. This project aimed at the development
of a horizontal multitouch device, employing all the knowledge acquired during the
master course (including topics like sustainability, temperature simulation, mold
construction and energy efficiency…); the equipment was also to include new
functionalities, like the use of active 3D stereo.
Fig. II-31 – 3D Anaglyph and Active Stereo Visualization on a Multitouch DI Table (MTT4ALL
multitouch table was built by the author of this master thesis)
In the second year of the master course, the author of this thesis (with support from
CDRSP) completed this project on his own, building a fully functional prototype
according to the previously projected plans. This multitouch equipment is now part of the
virtual reality lab at CDRSP (which the author of this thesis also helped to create) and is
actively used in regular demonstrations at the CDRSP virtual lab.
Fig. II-32 – Initial MTT4ALL scale model (left) and final MTT4ALL functional prototype in use,
running the Fraunhofer virtualDesk application, developed by the author of this master thesis for
Fraunhofer Austria (right)
In order to validate the architecture of the planned exhibition halls, the Frankfurt fair used
the instantReality player to interactively explore the architectural 3D models. Starting
from the architectural 2D plans, a high-quality 3D model of the booth area was
generated by Mainfeld. This model is usually visualized using "instantPlayer" within a
CAVE environment or on a big wall screen.
A walk mode was therefore implemented, realizing intuitive navigation through the
VR model. In contrast to the free 3D flying mode, the walk mode avoids the penetration of
walls and supports a walking simulation on floors and stairs. The intuitive navigation
algorithm gives special attention to users with no experience in 3D and computer graphics.
This architecture visualization supported the decision-making process of the architectural
planning. VR visualization of architecture has proved to be a valuable planning tool.
The instantReality framework provides a comprehensive set of features to support both
classic Virtual Reality (VR) and advanced Augmented Reality (AR) equally well. The goal
was to provide a very simple application interface while still including the latest research
results in the fields of highly realistic rendering, 3D user interaction and fully immersive
display technology.
The system design includes various industry standards, like VRML and X3D, to allow an
easy application development and deployment [38].
Fig. II-33 – Fraunhofer, Multitouch Architecture Visualization – Messe Frankfurt GmbH (using the
instantReality Framework)
2.3.2 Virtual and Augmented Reality
One particular type of visualization is Virtual Reality (VR), where the visual
representation of information is presented using an immersive display device (e.g. a stereo
projector for achieving stereoscopy). VR is also characterized by the use of a spatial
metaphor, where some aspect of the information is represented in three dimensions so that
humans can explore the information as if it were present (where instead it is remote), sized
appropriately (where instead it is on a much smaller or larger scale than humans can
sense directly), or had shape (where instead it might be completely abstract) [39].
It is possible to see, in the next images, two examples of stereoscopy created by the author
of this thesis at the CDRSP Research Centre. A 3D stereo visualization enables users to
feel immersed (as if they were inside the simulation/virtual scene).
In Fig. II-34, we can see a passive anaglyph visualization that can be experienced by
wearing common cyan/red glasses.
Fig. II-34 – Anaglyph Visualization (created by the author of this master thesis)
In Fig. II-35, an active stereo visualization can be perceived by wearing the nVIDIA
Vision active stereo glasses.
Fig. II-35 – Active Stereo Visualization (created by the author of this master thesis)
Augmented Reality (AR) combines real and virtual, is interactive in real time and is
registered in 3D (Azuma, R. [40]). In other words, AR is a paradigm shift in accessing,
understanding and experiencing information, as it combines the digital world and the real
world around us (through real-time video capture using cameras).
With Augmented Reality it is possible to create a more natural experience. The company
"metaio" [41] is without a doubt a reference in this field. Fraunhofer also has several
research teams working on this topic; in Fig. II-36, we can see an augmented reality
system developed by Fraunhofer IGD that allows viewing the "Old Roman Forum ruins"
[42]. The installation uses only a monitor, a camera to film the real environment and
virtual reality software to superimpose virtual objects as if they were really there (by
recognition of a giant printed photo placed on the walls of the museum).
Fig. II-36 – An Augmented Reality system developed by Fraunhofer (Monitor + Camera + Virtual
Reality Software)
DAVE [43] stands for Definitely Affordable Virtual Environment; it is an
immersive projection environment, a four-sided CAVE. Affordable means that, by mostly
using standard hardware components, it is possible to greatly reduce costs compared
to other commercial systems.
Fig. II-37 – DAVE, CGV Austria: images are projected on the back-projection side walls and on the
floor from above; mirrors are used to reduce the space needed
Also, by exchanging just the graphics cards every couple of years, and the PCs less
frequently, the latest graphics hardware can be used while spending less money.
Optimally, a normal room could be used without constructional changes.
Fig. II-38 – CGV Austria, The DAVE, a 3D Immersive System (exploring the National Library of
Vienna)
In 2005 a new version of the DAVE [43] was built in Graz, Austria. This type of immersive display is the best way to explore virtual worlds; however, it requires a large room and some effort. An alternative technology is the Head Mounted Display, which is uncomfortable and typically has a much lower resolution.
Fig. II-39 – The SweetHome3D (modeling application) output is transferred with a web service to an
OpenSG CAVE application which lets the user walk through a 3D representation of the house plan
Recently, a new version of the HEyeWall (a large tiled rear-projection setup) was built at Fraunhofer. With dimensions of 4 m x 2 m, this new HEyeWall version is one of the largest multi-touch screens in the world. With a resolution of approximately 8 megapixels (4000 x 2000), a pixel has a size of 1 mm². While the authors (Lancelle, M., et al. [43]) have already implemented soft-edge blending, the team is currently working on better hiding the seams that are still visible in homogeneous regions. A preview of the results is shown in the photo below.
Fig. II-40 – HEyeWall (High Resolution Multitouch Screen)
2.3.3 Computer Generated Holography
Computer Generated Holography (CGH) is a powerful technology suitable for a wide range of display types. Although CGH-based display systems are currently too expensive for many applications, they will become a viable alternative in the near future (Slinger, C., et al., [44]).
Invented in 1947 by Dennis Gabor, holography (from the Greek “holos”, meaning whole) is a 3D display technique that uses interference and diffraction to record and reconstruct optical wave fronts. Holography's unique ability to accurately generate both the amplitude and the phase of light waves enables applications beyond those limited by the light-manipulation capabilities of lens- or mirror-based systems.
Computer-generated holography is an emerging technology, made possible by increasingly powerful computers, that avoids the interferometric recording step of conventional hologram formation. Instead (Fig. II-41), a computer calculates a holographic fringe pattern that is then used to set the optical properties of a spatial light modulator, such as a liquid-crystal micro display. The Spatial Light Modulator (SLM) then diffracts the readout light wave, in a manner similar to a standard hologram, to yield the desired optical wavefront [44].
Fig. II-41 – Computer generated holography. A computer calculates a holographic fringe pattern for
display by the Spatial Light Modulator (SLM), which diffracts laser light to yield an interactive, true
3D image
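The fringe calculation can be illustrated by sampling the interference of unit-amplitude point sources on the hologram plane (a toy sketch of the principle, not a production CGH algorithm; the grid size, pixel pitch and wavelength below are illustrative values):

```python
import cmath
import math

def fringe_pattern(points, grid_n, pitch, wavelength):
    """Sample the intensity of superposed spherical waves from unit-amplitude
    3D point sources on a grid_n x grid_n hologram plane at z = 0."""
    k = 2 * math.pi / wavelength  # wavenumber
    pattern = []
    for iy in range(grid_n):
        row = []
        for ix in range(grid_n):
            x, y = ix * pitch, iy * pitch
            # Superpose one spherical-wave phase term per object point.
            field = sum(
                cmath.exp(1j * k * math.sqrt((x - px) ** 2 + (y - py) ** 2 + pz ** 2))
                for px, py, pz in points
            )
            row.append(abs(field) ** 2)  # intensity sampled at this pixel
        pattern.append(row)
    return pattern

# Two points 10 mm behind a 1 um-pitch plane, 633 nm readout light.
p = fringe_pattern([(0.0, 0.0, 0.01), (5e-6, 0.0, 0.01)], 8, 1e-6, 633e-9)
```

In a real system this pattern would be quantized and written to the SLM, which then diffracts the readout beam.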
Compared to conventional holographic approaches, CGH does not rely on the availability of specialized holographic recording materials; it can synthesize optical wave fronts without having to record a physical manifestation of them (for example, it can generate 3D images of nonexistent objects); and it offers unprecedented wavefront control by making it easy to store, manipulate, transmit and replicate holographic data.
Although CGH-based display systems can be built today, their high cost makes them impractical for many applications. However, as computing power increases and optical hardware costs decrease, CGH displays will become a viable alternative in the near future.
The main advantages of CGH are its flexible control of light, which makes it suitable for a wide range of display types, including 2D, stereoscopic, autostereoscopic, volumetric and true 3D imaging.
Fig. II-42 – Tradeshow (A Hologram System)
CGH-based display technology can produce systems with unique characteristics
impossible to achieve with conventional approaches [44].
Fig. II-43 – Although he was in Melbourne, Telstra's chief technology officer, Hugh Bradlow (right), makes his presence felt at a conference in Adelaide (Photo: Telstra)
Mid-air displays, which project floating images in free space, have been seen in science fiction movies for several decades. Recently they have attracted a lot of attention as promising technologies in the field of digital signage and home TV, and many types of holographic displays have been proposed and developed. You can see a virtual object as if it were really hovering in front of you, but that amazing experience breaks down the moment you reach for it, because you feel no sensation on your hand.
The objective is to add tactile feedback to a hovering image in 3D free space. One of the biggest issues is how to provide the tactile sensation. Although tactile sensation by nature requires contact with objects, the presence of a stimulator in the workspace spoils the appearance of the holographic images. Therefore some kind of remotely controllable tactile sensation is needed, and that is achieved with an original tactile display. A paper by Hoshi, T., et al. [45] explains the technologies employed for “Touchable Holography”.
Fig. II-44 – Touchable Holography Interaction System. An aerial imaging system, a non-contact tactile
display and a Wiimote-based hand-tracking system are combined. In this figure, the ultrasound is
radiated from above and the user feels as if a rain drop hits his palm
2.3.4 Advanced Rendering
Rendering is the process of generating an image from a model (or models in what
collectively could be called a scene file), by means of computer programs. A scene file
contains objects in a strictly defined language or data structure. It would contain geometry,
viewpoint, texture, lighting, and shading information as a description of the virtual scene.
The data contained in the scene file is then passed to a rendering program to be processed
and outputted to a digital image or raster graphics image file [46].
The term “rendering” may be by analogy with an “artist's rendering” of a scene. Though the technical details of rendering methods vary, the general challenges to overcome in producing a 2D image from a 3D representation stored in a scene file are outlined as the graphics pipeline along a rendering device, such as a Graphics Processing Unit (GPU) [46].
A GPU is a purpose-built device that assists a CPU in performing complex rendering calculations. If a scene is to look relatively realistic and predictable under virtual lighting, the rendering software should solve the rendering equation. The rendering equation does not account for all lighting phenomena, but it is a general lighting model for computer-generated imagery. “Rendering” is also used to describe the process of calculating effects in a video editing file to produce the final video output [46].
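For reference, the rendering equation (Kajiya's hemispherical form; standard notation, as it is not written out in [46]) states that outgoing radiance equals emitted radiance plus the integral of reflected incoming radiance:

```latex
L_o(\mathbf{x}, \omega_o) = L_e(\mathbf{x}, \omega_o)
  + \int_{\Omega} f_r(\mathbf{x}, \omega_i, \omega_o)\,
    L_i(\mathbf{x}, \omega_i)\,(\omega_i \cdot \mathbf{n})\,\mathrm{d}\omega_i
```

Here $L_o$ is the outgoing radiance at point $\mathbf{x}$ in direction $\omega_o$, $L_e$ the emitted radiance, $f_r$ the reflectance function (BRDF), $L_i$ the incoming radiance and $\mathbf{n}$ the surface normal; practical renderers approximate this integral, e.g. by ray tracing.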
Fig. II-45 – 3D Photorealistic Rendering (created by Harchi, an architecture company based in
Portugal)
Rendering is a very intensive process that consumes huge amounts of computer resources
(memory, CPU…). Recently nVIDIA released CUDA, a parallel computing architecture
that enables dramatic increases in computing performance by harnessing the power of the
GPU.
Software developers, scientists and researchers are finding broad-ranging uses for CUDA, including real-time image and video processing, computational biology and chemistry, fluid dynamics simulation, CT image reconstruction, seismic analysis, ray tracing and much more.
Fig. II-46 – IPL/CDRSP future building (Rendered in Autodesk Maya 2011 by the author of this
master thesis)
Computing is evolving from “central processing” on the CPU to “co-processing” on the CPU and GPU. To enable this new computing paradigm, nVIDIA invented the CUDA parallel computing architecture, which now ships in Tesla, Quadro and GeForce GPUs. For example, an nVIDIA Quadro FX 4800 graphics card contains 192 CUDA parallel processor cores; if a rendering application is prepared to use the GPU, the performance of the rendering platform will be radically improved.
Fig. II-47 – Medium Quality Rendering of a factory installation (created in Deep Exploration by the
author of this thesis)
Founded in Wales, UK, in 2003, iCreate [47] is an established company delivering high-quality illustration and animation to architectural firms, property developers, creative agencies, house-building companies and urban planners.
The company iCreate creates architectural visualizations and 3D architectural renderings, and has so far developed impressive work on the planning and development of new real estate properties, bringing new buildings to life in 3D virtual reality, CGIs, 3D flythroughs, photorealistic renderings and interactive 3D models. They are specialists in presenting new developments in their best light before they are actually built, helping to secure planning permission, to sell property “off plan” and to involve communities in regeneration and town planning.
Fig. II-48 – High Quality Real-time Interactive Rendering (www.iCreate3d.com)
2.3.5 Virtual World Interactivity
Virtual reality (VR) systems must exploit additional input and output options in order to make interaction in Virtual Environments (VEs) more intuitive and to increase the user's immersion in the virtual world. When developing VR applications, developers should be able to put more effort into modeling advanced interaction and system behavior. Many systems and tools for developing virtual reality applications have been proposed to achieve this goal; however, no de facto standard is available (Steinicke, F., et al., [48]).
Johannes Behr and Dirk Reiners [49] presented interesting additional material and tutorials
to support and simplify different aspects of the VR/AR application development process.
Most of the literature defines some basic elements which are critical but variable for all
VR/AR applications [49]:
Virtual Content
The content of the medium defines the virtual world: the imaginary space, manifested through a medium, comprising any collection of objects in a space and the relations and rules of the corresponding simulation.
Immersion
Being mentally and physically immersed are important conditions for VR applications, where physical immersion is a defining characteristic of virtual reality and mental immersion is probably the goal of most media creators. Therefore it is important that there is a synthetic stimulation of the body's senses via the use of technology. This does not imply that all senses, or the entire body, are immersed. Most systems focus on vision and sound; some include touch and haptics, generally known as force feedback. Other senses are much harder to stimulate with a computer and are only used in a few specific research environments.
Sensory feedback
Sensory feedback is an essential ingredient of virtual reality. The VR system provides direct sensory feedback to the participants based on their physical position. In most cases it is the visual sense that receives the most feedback, as it is also the sense that brings most of the information from the environment into the human system.
Interactivity
For VR to seem authentic, it must respond to user actions, that is, it must be interactive. The system must produce sensory feedback according to the user's actions. Most systems give visual feedback at an update rate of at least 30 frames per second; more is desirable, and immersion is lost when the time lag between user actions and “sensable” reactions exceeds 100 ms. There is no single interaction and navigation method or device that defines VR or AR, and no Window, Icon, Menu, 2D-Pointer combination defining a single metaphor; every application designer is free to choose whatever is most attractive or appropriate for the current set of goals. This gives the developer a lot of freedom, but on the other hand it also calls for a new set of development tools, standards and methods.
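The update-rate and latency figures above translate into a simple per-frame time budget, sketched here (the function names are ours, for illustration, not from [49]):

```python
def frame_budget_ms(update_rate_hz):
    """Time in milliseconds available to render one frame at the given rate."""
    return 1000.0 / update_rate_hz

def immersion_ok(lag_ms, max_lag_ms=100.0):
    """True while the action-to-reaction lag stays within the tolerated bound."""
    return lag_ms <= max_lag_ms

budget = frame_budget_ms(30)      # the 30 fps minimum mentioned above
print(round(budget, 1))           # -> 33.3
print(immersion_ok(2 * budget))   # two frames of pipeline lag, still under 100 ms -> True
```

At 30 Hz each frame has roughly 33 ms of budget, so even a short rendering pipeline of a few frames approaches the 100 ms perceptual limit.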
Enrico Gobbetti and Riccardo Scateni wrote a report [50] providing a short survey of the field of virtual reality, highlighting application domains, technological requirements and currently available solutions.
Fig. II-49 – Manipulation of VR data, provided by MEMPHIS [1] (TiciView VR system).
IGI/Fraunhofer Research group, 2007, Seoul - South Korea, (the author of this master thesis was a
member in the team responsible for the development of this system)
Also in this area of full interaction, iCreate3D [47] has developed many impressive interactive 3D applications for architecture, as seen in Fig. II-50.
Fig. II-50 – Advanced Interactive Visualization (using iViewer), from left to right: (a) a skyscraper; (b) an apartment inside the skyscraper (www.icreate3D.com)
At CDRSP, the author of this master thesis also created a Virtual Factory simulation, in which a user can “navigate” inside a factory as if he were a maintenance technician. This serious game will allow interaction with different machines and the simulation of fire drills in virtual scenarios (very similar to a real environment).
Fig. II-51 – Virtual Factory simulation/serious game that will allow a company to give training to
users, (this project was created at CDRSP by the author of this master thesis)
2.4 “A World Full of Sensors”
An increasing interest in incorporating sensing technologies into buildings at an early stage of project development is to be expected. These sensors can help monitor the movement of crowds inside a building, prevent fires, warn about imminent earthquakes and monitor the health of persons and machines; they can also be integrated into walls to monitor the degradation of the building.
Sensors can be used in the initial development phases to help engineers test and choose the right materials, or choose between different design options. Sensors can help reduce energy consumption by detecting where people are and activating lights only when needed, as well as by monitoring the temperature inside buildings. Sensors can even track persons and objects inside a building. This sensing problem should be treated as an important part of any new building project [51]. Next, some interesting and innovative works in the field of building monitoring are presented.
2.4.1 Sensors Feed Information into Virtual Worlds
Environmental monitoring presents many challenges to wireless sensor networks, such as the need to collect and process large volumes of data before displaying the information to the user in an easy-to-understand format. A very interesting project called Tricorder is, according to Lifton, J., et al. [52], a prototype for a mobile sensor network browser.
There are, in fact, several recent attempts to achieve this objective and, more surprisingly, new projects combining truly emergent technologies, based on augmented reality, multitouch and social networks, can be expected. This increase in the presence of more and more sensors in the surrounding environment leads to the need for new software applications that allow users to easily find, “see” and interact with this huge number of sensors.
Fig. II-52 – Screenshot of the Tricorder device showing the floor plan of a lab, overlaid with Plug icons representing sound, light, current consumption, motion and vibration. Average data from all sensors is also displayed [52]
It is already possible to foresee a near-future reality (so-called Pervasive Mixed Reality) in which people will walk, aided by augmented reality, through real environments full of sensors, e.g. an oil platform, a shopping mall or a city. People will be equipped with mobile devices, with or without augmented reality Organic Light-Emitting Diode (OLED) glasses, and will be able to request information about those sensors. The sensors will respond and, by pointing the camera in any direction, the user will be able to locate and see the hidden sensors installed in the surrounding environment. In this way, it will be possible to immediately identify and locate the various buildings, equipment, services or products. At the same time, the mobile device will receive valuable data from those sensors/active tags. This information can then be instantly represented in a much more understandable and useful way through augmented reality, and finally the user can interact with it in a natural and meaningful way.
SensAR [53] is, according to the authors, a prototype augmented reality interface designed for monitoring environmental information. The inputs are sound and temperature data located inside a networked environment. Users can visualize 3D and textual representations of environmental information in real time using a lightweight handheld computer. IBM also launched a new 3D Data Center in OpenSim, which allows the creation of virtual data centers aimed at IT professionals. This 3D application should let professionals monitor data centers more effectively over long distances, adding visual information about the real environment to the virtual world through a network of sensors (Llewelyn, G., [54]).
Another recent development, on ubiquitous sensor portals, is so-called X-Reality, also known as Dual Reality or Cross Reality. Here the main intention is to mix the real world and the virtual world by connecting “location-specific 3D animated constructs” in virtual worlds to in-building sensor installations. IBM also wants to make use of this right now, in order to “use high fidelity 3D virtual replicas of real plants or factories to remotely browse and influence industrial processes in real-time”.
In one project, MIT recently created a cross-reality environment called “ShadowLab”, which is in fact a Second Life map of the Media Lab's third floor, animated by data collected from a network of 35 “smart, sensor-laden power strips” (a.k.a. PLUGs) [55].
Fig. II-53 – A Portal in Second Life shows sensor data over time
Landay, J., et al. [56] define cross-reality as the union between ubiquitous sensor/actuator networks and shared online virtual worlds, a place where collective human perception meets the machine's view of pervasive computing; the authors also state that five of their articles expand on aspects of this theme and should be read as well. The Media Lab also says that the team plans to explore ways to bridge networked electronic sensors and human perception, through “cross reality” implementations that render and manifest phenomena between the real world and virtual environments via densely embedded sensor and actuator networks (Coleman, B., [57]).
Complex Event Processing (CEP) is a fundamental functionality for cross-reality
environments. With CEP, raw sensor data generated in the real world can be transformed
into more significant information that has some real meaning for the virtual world.
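A minimal illustration of this idea (with hypothetical thresholds and event names, not taken from [58]): raw temperature readings are turned into discrete events via threshold crossing with hysteresis, which is the kind of higher-level information a virtual world can consume.

```python
def detect_events(readings, high=30.0, low=28.0):
    """Turn a raw temperature stream into discrete events using hysteresis:
    emit 'OVERHEAT' when crossing `high` upward and 'NORMAL' when falling
    back below `low`. Returns (sample_index, event_name) pairs."""
    events, alarmed = [], False
    for i, t in enumerate(readings):
        if not alarmed and t >= high:
            events.append((i, "OVERHEAT"))
            alarmed = True
        elif alarmed and t <= low:
            events.append((i, "NORMAL"))
            alarmed = False
    return events

stream = [25.0, 29.5, 31.2, 30.4, 27.9, 26.0]
print(detect_events(stream))  # -> [(2, 'OVERHEAT'), (4, 'NORMAL')]
```

The hysteresis band (30 °C up, 28 °C down) prevents a noisy reading near the threshold from flooding the virtual world with spurious events.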
Lifton, J., et al. [58] presented DejaVu, a general-purpose event processing system built at ETH Zurich. SmartRFLib, a new cross-reality application, uses DejaVu and enables real-time event detection over RFID data streams feeding a virtual library in Second Life [59].
Fig. II-54 – A Virtual DataPond in the Virtual Atrium (left) and a real DataPond in the real Media
Lab Atrium (right)
2.4.2 Remote Monitoring of Persons Inside Buildings
In another project, developed by Graz University of Technology, the Fraunhofer Institute for Computer Graphics Research and Darmstadt Technical University, the problem of monitoring persons inside buildings is discussed. With a new approach for showing a large number of detection results in real time, using various levels of abstraction, this system allows the information of eight surveillance cameras to be viewed in one 3D interface (Settgast, V., et al., [60]).
Fig. II-55 – Different representations of a person detection in 3D, starting from left to right: (a)
Billboard with a thin colored surrounding line, (b) additional geometry, (c, d) overlay marker which is
not occluded by the scene
2.4.3 Kinetic, Responsive, Performative and Adaptive Architecture
Smart architecture is becoming a buzzword in architecture and its associated disciplines. Nevertheless, it is not clear what is included in smart architecture and how it relates to, or differs from, similar camps such as responsive architecture, performative architecture, kinetic architecture and adaptive architecture. A paper by Senagala, M. [61] poses essential and critical questions about smart architecture from a complex-adaptive-systems point of view. This work also illustrates the attributes of smart architecture with a number of seemingly unrelated, yet conceptually connected, design developments.
In an experiment (Elsacker, E. and Y. Bontinckx [62]), a relationship between a CAD model and a weather-sensing system connected to actuators allows for an autonomous system that changes the form of a pavilion in reaction to weather conditions.
Fig. II-56 – A model of a kinetic pavilion that reacts to weather data
This project is oriented towards the development of a new kind of pavilion capable of acting upon changing weather conditions, human movement or human moods/mindsets. Its shape has been made dependent on ecological choices and on parameters extracted from the pavilion's surroundings (through environmental “sensing”). Just like every other organism, this new prototype changes itself when parameters take different values (Elsacker, E. and Y. Bontinckx [62]).
2.5 Reverse Engineering and Rapid Prototyping
The creation of “scale models” aims to better show the final result of a project. It is therefore necessary to take into account the materials chosen for the real project (the materials used in the real building), which will help in selecting the right rapid prototyping technology in order to achieve a realistic mockup. The different rapid prototyping techniques allow the construction of scale models with different and complex structures; with layer-by-layer construction, all geometries can be obtained with more precision than with manual techniques. Depending on the objective of the scale model, different properties are preferred: there are scale models for presentation purposes, intended only to show how a building will look (as a whole, or as an individual building); a scale model can represent the interior or the exterior of a building; and there are also scale models used for simulation and testing purposes (functional prototypes). These “additive” techniques are linked to the requirements of scale model construction (Ahrens, C. H., et al., [63]).
The main advantages of using rapid prototyping techniques are:
High precision when reproducing a real building as a scale model
Production of complex geometries
A low cost versus high production quality ratio
A wide choice of materials, allowing a good reproduction and a good final result
Reverse engineering can also be combined with rapid prototyping techniques for the creation of city scale models for urban rehabilitation, with a very high precision and good quality that would be nearly impossible to achieve with manual techniques (Alves, N. M. and P. Bártolo, [64]).
2.5.1 Reverse Engineering
Nowadays, the urban rehabilitation of old buildings or historic places with a high associated cultural value is very interesting and desirable. The creation of scale models representing existing buildings using reverse engineering allows realistic and reliable reproductions (which can be edited in CAD software in order to repair damaged areas). An understanding of how to combine reverse engineering techniques with rapid prototyping is therefore fundamental (Alves, N. M. and P. Bártolo, [65]).
The reverse engineering process comprises several phases, as seen in Fig. II-57. The process is not completely sequential as shown in the figure; several iterations and overlaps can occur. Ideally, models obtained by reverse engineering should show the same geometric properties as the original model; in practice, however, these models are approximations, due to errors and imprecisions (numeric errors) that occur during the reconstruction process (Alves, N., et al., [66]).
Fig. II-57 – Phases of the Reverse Engineering Process
The data acquisition process represents the first phase of the reverse engineering process. This phase refers to the gathering of the numeric data necessary for the reconstruction of the 3D geometry of the object. It is a critical phase of the whole process, because it is directly related to the geometric quality of the obtained model.
(The phases in Fig. II-57: Data Acquisition, Pre-Processing, Shape Reconstruction and, finally, the CAD Model.)
Several methods for capturing digital data have been developed, as seen in Fig. II-58. Each method relies on a specific interaction process with the object. There are several nondestructive methods, based on techniques with and without direct surface contact, and destructive methods, which rely on removing material using slicing machines and abrasive tomography.
Fig. II-58 – Reverse Engineering - Classification techniques for 3D Data Acquisition
Each method has advantages as well as disadvantages, requiring a careful selection of the reverse engineering method [64, 66, 67]. The classification in Fig. II-58 can be summarized as follows:
Using contact: destructive techniques (object slicing) and non-destructive techniques (CMM with contact probe; articulated arm with contact probe);
Without contact: transmissive techniques (computer-assisted tomography) and reflective techniques; the reflective techniques can be non-optical (sonar, microwave radar) or optical, the latter being either active (triangulation, image radar, active stereoscopy, Moiré interferometry, holographic interferometry) or passive (passive stereoscopy, movement, shadowing, contour/silhouette, focus/defocus, textures).
The objective of pre-processing is the treatment of the data obtained by acquisition; it includes mechanisms for selecting the points that define the model and for eliminating points associated with measurement errors. This phase also includes the use of smoothing (“softening”) techniques. The 3D reconstruction phase aims to obtain digital models with high precision. Several strategies have been proposed, notably segment-and-fit methods and the Delaunay triangulation algorithm.
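The smoothing step of pre-processing can be illustrated with a minimal moving-average filter over an ordered profile of scan points (a pure-Python sketch for illustration only; real systems use far more elaborate filters):

```python
def smooth(points, window=3):
    """Moving-average smoothing of an ordered profile of (x, y, z) scan
    points: each point is replaced by the mean of its neighborhood."""
    half = window // 2
    out = []
    for i in range(len(points)):
        lo, hi = max(0, i - half), min(len(points), i + half + 1)
        n = hi - lo
        out.append(tuple(sum(p[c] for p in points[lo:hi]) / n for c in range(3)))
    return out

# A profile whose z coordinate alternates due to measurement noise.
noisy = [(0.0, 0.0, 0.0), (1.0, 0.0, 1.0), (2.0, 0.0, 0.0), (3.0, 0.0, 1.0)]
smoothed = smooth(noisy)  # interior z values are averaged over 3 neighbors
```

Averaging suppresses high-frequency measurement noise at the cost of rounding off sharp features, which is why the window size must be chosen with care.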
Fig. II-59 – Steinbichler Comet 5 Photogrammetry 3D scan equipment at CDRSP Reverse Engineering
Laboratory
After completing a 3D scan of a part, it can also be useful to compare the acquired 3D data with a reference, e.g. the original CAD model, or to compare differences between two manufactured parts. This comparison (analysis and inspection) will show the differences between the reference part and the reverse-engineered part, allowing the detection of errors in the newly produced part.
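The comparison step can be sketched as a brute-force nearest-point deviation between a scanned point set and a reference set (illustrative only; inspection software such as Comet Inspect performs registration and surface-based deviation analysis, not this naive point matching):

```python
def deviations(scan, reference):
    """For each scanned point, the distance to the nearest reference point
    (brute force). Returns (per-point deviations, max deviation, mean)."""
    def dist(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b)) ** 0.5
    devs = [min(dist(p, q) for q in reference) for p in scan]
    return devs, max(devs), sum(devs) / len(devs)

reference = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
scan = [(0.0, 0.0, 0.1), (1.0, 0.0, 0.0)]  # first point is 0.1 off in z
devs, worst, mean = deviations(scan, reference)
print(round(worst, 3))  # -> 0.1
```

The maximum and mean deviations are the kind of figures reported by inspection tools, often color-coded over the part's surface.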
Fig. II-60 – A 3D scan of a real graphite electrode used in molds industry. (3D scan and
analysis/inspection done by the author of this master thesis, using Steinbichler Comet 5 equipment,
and Steinbichler Comet Plus and Comet Inspect software)
In a case study (created by Dr. Nuno Alves [66]), a scale model of a building with high cultural value, the “Senhor da Pedra” in Óbidos, Portugal (one of the main Baroque-style constructions in Portugal, finished in 1747; see Fig. II-61), was produced with the use of reverse engineering and rapid prototyping techniques.
Fig. II-61 – “Santuário do Senhor da Pedra”, Óbidos, Portugal
Several photos of this building were taken from different perspectives; then two “point images” were obtained (see Fig. II-62).
Fig. II-62 – Points obtained for the “Santuário do Senhor da Pedra” building
Next, after overlaying these two “point images”, a cloud of three-dimensional points corresponding to the studied building was extracted (see Fig. II-63).
Fig. II-63 – 3D cloud of points for the “Santuário do Senhor da Pedra” building
After obtaining the cloud of points, an STL file is created and the 3D model is obtained.
Fig. II-64 - a) STL and b) 3D model
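The structure of the STL format produced at this stage can be illustrated with a minimal ASCII STL serializer (a sketch; the facet normal is computed from the cross product of two triangle edges, and the solid name is merely illustrative):

```python
def to_ascii_stl(name, triangles):
    """Serialize triangles ((v1, v2, v3) of (x, y, z) tuples) into a minimal
    ASCII STL string, computing each facet normal from the edge cross product."""
    def sub(a, b):
        return tuple(u - v for u, v in zip(a, b))
    def cross(a, b):
        return (a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0])
    lines = ["solid %s" % name]
    for v1, v2, v3 in triangles:
        n = cross(sub(v2, v1), sub(v3, v1))
        length = sum(c * c for c in n) ** 0.5 or 1.0  # guard degenerate facets
        n = tuple(c / length for c in n)
        lines.append("  facet normal %g %g %g" % n)
        lines.append("    outer loop")
        for v in (v1, v2, v3):
            lines.append("      vertex %g %g %g" % v)
        lines.append("    endloop")
        lines.append("  endfacet")
    lines.append("endsolid %s" % name)
    return "\n".join(lines)

tri = (((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)),)
stl = to_ascii_stl("senhor_da_pedra", tri)
```

Real reverse engineering pipelines generate such facets by triangulating the point cloud; the rapid prototyping machine then slices this mesh into layers.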
Finally, a scale model (Fig. II-65) was produced using rapid prototyping. The resulting scale model contains precise details and complex geometries, and it is a truly realistic scale model (Alves, N., et al., [66]).
Fig. II-65 – Scale model of “Santuário do Senhor da Pedra” obtained by rapid prototyping
2.5.2 Rapid Prototyping
The production technologies must be flexible and allow the rapid processing of a wide range of materials, so that they can replicate real materials as closely as possible at several levels (texture, color and the capacity to reflect/transmit light). In a project developed by CDRSP called “MaquetaPlus” [68], in which the author of this thesis also participated, several existing technologies and materials were identified, such as prototyping technologies based on the addition of material (e.g. SL, SLS, FDM) and prototyping technologies based on the subtraction of material (e.g. HSM), which allow the processing of different feedstocks such as ceramic, metallic and plastic materials.
Nowadays, the diversity of technologies and materials is so large that selecting the right rapid prototyping method is difficult but crucial; otherwise, rapid prototyping will be considered an expensive and unattractive method. Selecting the right technology will allow the production of scale models with the expected properties at the first attempt. Current rapid prototyping processes can be grouped by the state or initial form of the materials used in the production. In this sense, there are three main categories: processes based on liquid materials, solid materials and powder particles.
Liquid based: the raw material used in the production is in an initially liquid state before being processed. In this category there are technologies that require the action of UV light (e.g. stereolithography, SL) or a liquid resin jet followed by the action of UV light (e.g. inkjet printing, IJP);
Solid based: the raw material is in a solid state, such as a filament or sheet. Some of these processes first melt the material before using it (e.g. fused deposition modeling, FDM); others cut a thin slice of material (e.g. laminated object manufacturing, LOM, or the paper lamination technology, PLT);
Powder based: the raw material is in powder form before being processed. A laser can be used in its processing (e.g. Selective Laser Sintering, SLS, or Laser Engineered Net Shaping, LENS), or a binding agent can be applied by a jet head (e.g. 3D Printing) [63, 69];
2.5.3 Rapid Prototyping Techniques
Stereolithography is a layer-by-layer construction process in which the prototypes are built through the polymerization of a liquid resin by the action of a laser, as seen in Fig. II-66. After the creation of each layer, the platform goes down, so that the solidified layer can be uniformly covered by another layer of liquid resin with the help of a blade, and a new cycle begins. This process is repeated until all the necessary layers are done and the prototype is finished [63].
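The layer-by-layer logic translates directly into a simple build estimate (a sketch with illustrative figures; the 0.125 mm layer thickness and the per-layer time are assumptions, not data for any specific machine):

```python
import math

def layer_count(part_height_mm, layer_thickness_mm):
    """Number of layers (and hence cure/recoat cycles) needed to build
    a part of the given height."""
    return math.ceil(part_height_mm / layer_thickness_mm)

def build_time_s(part_height_mm, layer_thickness_mm, seconds_per_layer):
    """Rough build-time estimate: layers times a fixed per-layer time
    (laser scan + platform drop + recoat)."""
    return layer_count(part_height_mm, layer_thickness_mm) * seconds_per_layer

print(layer_count(50.0, 0.125))  # a 50 mm part at 0.125 mm layers -> 400
```

Halving the layer thickness doubles the number of cycles, which is the basic trade-off between surface quality and build time in all layered processes.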
(Fig. II-66 labels, top to bottom: laser, mirror, construction platform, resin recoater, elevator, part being built.)
Fig. II-66 – Main components of a Stereo-lithography machine
After the construction of the physical model, it is still necessary to consider an additional “post-curing” phase, through which a higher global density and better mechanical properties are achieved in the produced model. This final polymerization is done with the help of UV radiation, heat or an additional chemical agent.
The time necessary for this extra “post-curing” step depends on several factors, such as the type of resin, the construction parameters, the construction strategy (and construction speed) and the strength of the laser beam. This process has high costs, both for the equipment and for the necessary materials, which leads to a high cost for the final prototype, although with an excellent final precision. The process also allows only a small variety of raw materials (resins); it allows construction in two different colors (only translucent colors, using a special resin). The production of these prototypes requires supports, production runs at an average speed, and the mechanical resistance is considered to be medium [63]. All the prototypes need to be “post-cured” in a special oven (see Fig. II-67).
Fig. II-67 – Prototypes produced using Stereo-lithography
The extrusion process, initially developed by Stratasys (founded by S. Scott Crump) under
the designation "Fused Deposition Modeling – FDM", consists of a layer-by-layer
production using an extruded thermoplastic material.
This process (Fig. II-68) uses a special extrusion head that melts a filament of
thermoplastic material while moving along the X and Y axes. Besides this main extrusion
head, there is an additional one that extrudes a support material, which can be wax or a
water-soluble material. The melted material is deposited over the previous layer, where it
solidifies.
Fig. II-68 – Simplified FDM process (material roll, extrusion heads, material filament, model and platform)
After the deposition of each layer, the platform moves down just enough to allow a new
layer to be deposited, and the process repeats until the prototype is finished
(Ahrens, C. H., et al. [63]).
The distance between filaments is determined by the diameter of the extrusion jet. This jet
cannot be exchanged during the production process, so it should be carefully
chosen so that the material viscosity is not affected, avoiding changes in the spacing
between the deposited filaments. The temperature inside the chamber where the prototype
is produced should be kept slightly below the fusion temperature of the material being
deposited; in this way, only a small amount of energy is needed to melt the filament,
avoiding deformations in the prototype being produced.
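As a rough illustration of why the jet (nozzle) diameter matters, one can relate the volumetric filament feed rate to the maximum head speed for a given deposited road cross-section. All numeric values here are assumptions for illustration only:

```python
import math

def volumetric_flow_mm3_s(filament_diameter_mm: float, feed_rate_mm_s: float) -> float:
    """Volume of filament pushed into the melt zone per second."""
    cross_section = math.pi * (filament_diameter_mm / 2.0) ** 2
    return cross_section * feed_rate_mm_s

def max_head_speed_mm_s(flow_mm3_s: float, road_width_mm: float,
                        layer_height_mm: float) -> float:
    """Approximate the deposited filament as a rectangular road of width x height."""
    return flow_mm3_s / (road_width_mm * layer_height_mm)

flow = volumetric_flow_mm3_s(1.75, 2.0)       # 1.75 mm filament fed at 2 mm/s
speed = max_head_speed_mm_s(flow, 0.48, 0.2)  # 0.48 mm road, 0.2 mm layer
```

A larger jet deposits wider roads, so the spacing between adjacent filaments must grow accordingly; this is why the jet cannot be swapped mid-build without disturbing the deposition geometry.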
The equipment used in this process has a medium initial cost, but the raw materials used
in production have a high cost, so overall a medium-high cost is associated with this
production process. It offers medium precision with a medium variety of materials to
choose from; the production speed is slow, and the mechanical resistance is medium.
The production of prototypes needs support materials and presents an average surface
finish at the end of the production process; it is also possible to use colored materials.
Good scale prototypes can be obtained using this technique (Ahrens, C. H., et al. [63]).
Fig. II-69 – Scale models obtained using FDM
The additive production process by slicing (a.k.a. Laminated Object Manufacturing or
LOM) consists of the repeated, selective deposition and cutting of sheets of material
coated with adhesive on one side. The raw material used in this process comes on a reel
(Fig. II-70). After the deposition of a sheet, a heated roller presses across the surface,
activating the adhesive on the bottom and gluing it to the previous sheet. A CO2 laser
beam (25 or 50 W), directed by a group of mirrors (controlled by an X/Y axis positioning
system), is used to cut the geometric profile of the part in the current layer. This laser
also cuts the material that does not belong to the part into small rectangles, making it
easier to remove the remaining (unwanted) material at the end of the production process.
The platform moves down (according to a Z axis coordinate), a new material section
advances, and the process continues until the prototype is finished. The material that does
not belong to the part (around the part) acts as a natural support during the production
process (Ahrens, C. H., et al. [63]).
Fig. II-70 – Scheme of the LOM Process (laser, mirror, thermal roll, paper feeding roll, support material, construction platform and the part being built)
The equipment used in this process has a medium cost; the raw material cost and the
overall prototype cost are low (as is the final precision of the prototype). The
production speed is slow and the mechanical resistance of the prototype is medium. This
process does not need any extra support material, but the overall surface finish is
average or poor, and colored prototypes are not possible; see Fig. II-71
(Ahrens, C. H., et al. [63]).
Fig. II-71 – Scale model of the Oporto Music House (Portugal), produced using LOM
The Selective Laser Sintering (SLS) process is a rapid prototyping process based on a
layer-by-layer construction method that allows a huge variety of metallic, ceramic and
polymeric materials to be used. As seen in Fig. II-72, a thin layer of material is spread over
a platform, and a CO2 laser beam then scans this layer so that a new section of the
prototype can be built. Whereas in stereo-lithography the laser supplies energy for the
polymerization of the resin, in the SLS process the laser beam supplies energy to heat and
melt the powdered material, promoting its sintering.
After one layer is finished, the platform moves down to make room for a new
powder layer to be spread over the construction section, and a new sintering cycle takes
place (Ahrens, C. H., et al. [63]).
Fig. II-72 – Scheme of Selective Laser Sintering (SLS): laser, mirror, powder tanks, roller, construction platform and the part being built
An important aspect of dimensional control of the part is the regulation of the amount of
energy transferred to the powder as the beam passes, in order to avoid excessive fluidity
of the material or even its decomposition, in which case the contraction of the material
would be unpredictable. The chamber must be heated to a temperature just below the
fusion temperature, so that only a small amount of energy is needed for the particles to
sinter. In this way, thermal contraction inside the production chamber is also
prevented. In this process the use of supports is not necessary: the surrounding powder
itself is enough. The initial cost of acquiring the equipment is very high and the
raw material has a medium cost, so the prototype has a high cost in the end; the
precision is considered medium, and a high variety of raw materials is available (carbon
fibers, fine polyamide, polystyrene).
The production speed is medium and the mechanical resistance of the produced prototype
is high, with a good surface finish. This technique does not allow the use of colors, and
the prototype should be blasted with air jets carrying glass microspheres in order to
eliminate the remaining sintering powder (Ahrens, C. H., et al. [63]).
3D printing is a process that uses a construction head, composed of multiple
jets, to apply layers of polymeric material in three dimensions (Fig. II-73). The
prototypes are produced by this printing head, whose multiple jets allow a
layer-by-layer construction. If the part is bigger than the working space, the platform
repositions itself along the Y axis so that the process can continue.
Fig. II-73 - 3D Printing Process (multiple jet heads depositing polymer onto the part being built on the construction platform)
Many of the models produced using the 3D printing process are fragile and can be
easily damaged or deformed. In these cases, wax infiltration can make the produced
models stronger, and colored models can be obtained using ink. The initial cost of
acquiring the equipment is considered low and the raw materials have a medium cost, so
the produced prototypes have a low cost, but the precision is also low. A medium variety
of raw materials can be used and the production speed is high. In this process,
support materials are not needed; the surface finish is average, and the prototypes need
some post-processing with air jets and resin infiltration [63-65, 67, 70].
Fig. II-74 – Complete Scale Model obtained using the 3D printing process
There are also hybrid processes, combining two or more additive production methods so
that the advantages of each process can be exploited together.
Hybrid technologies allow new possibilities, such as the use of different materials and
shapes that could not be produced before. In production technologies, the term "hybrid" is
commonly used to designate processes/products that combine different kinds of
technologies.
The best compilation available about rapid prototyping and related topics can be
found in the annual report published by Wohlers Associates [71].
Fig. II-75 – Combining several technologies/processes
2.5.4 Personal Fabrication and Future Manufacturing
It is important to mention here the revolutionary shift the world is facing today in the
way manufacturing is seen, and in the way it will be seen in the very near future. Today's
economy presents challenges that create the need to produce new ideas, concepts,
prototypes and even finished goods in a greatly reduced amount of time. More and more,
the production at home of high quality scale models or products is a reality. Rapid
prototyping equipment is now widely available, and consumers want new and innovative
products all the time; this ultimately leads to the necessity of reduced setup times in
manufacturing and assembly lines. If the product to be produced is highly customizable
or unique, even the production series will be very small.
This opens the path to a new reality in production, where eventually anyone
can conceive and produce their own ideas and products at home at a relatively low
cost and sell them easily on the web. The report commissioned by the US Office of
Science and Technology Policy [72] outlines the emergence of personal
manufacturing technologies, describes their potential economic and social benefits, and
recommends programs that the government should consider to realize this potential.
Personal manufacturing machines, sometimes called "fabbers", are the small-sized, low-cost
descendants of factory-scale mass manufacturing machines. In the future, the
creation of machines that replicate themselves is expected to become a reality; the real
implications for humankind of such transformations are yet to be studied, analyzed and
seen. For now, one can simply buy these cheap and powerful rapid prototyping machines
to easily produce building scale models at very low cost, as this helps in the design and
study of new and more efficient buildings.
Fig. II-76 - A brush made in a 3D printer using two different materials, printed simultaneously as a
single functional object requiring no assembly (Objet Inc.)
2.6 Building Information Modeling (BIM) and Automated
Construction of Buildings
In today's world of architecture and building construction, new tools are being developed
in an effort to gather all the information about the design, project, development,
construction and maintenance of a particular new building. These new tools allow users
to fully track, control and visualize in real time all the operations in all phases of a
project (even through its use by customers, capturing the entire life-cycle), making it
possible to go back in time and see what happened at a specific stage of the project, or to
move forward in time and plan in advance. These tools are also essential to the vision of
autonomous/automated construction of buildings. Classic CAD software is used by
architects to create the geometrical shapes of a building design. In traditional workflows,
these shapes are then interpreted by the construction workers who build the real
structure.
2.6.1 Building Information Modeling (BIM)
Building Information Modeling (BIM) aims to create a virtual model of a building, adding
additional, important information to the geometry. This model incorporates all building
constituents, and the 3D model is used and worked on by the whole design team (architects,
engineers and specialist consultants). This information may also be handed over to the
client and final users of the building, serving as a database for facility management and
maintenance. The BIM model (e.g. using the "sgxml" format) does not only contain
geometrical information; it also contains properties and relations of building objects and
components: geometrical properties (area, volume, amount), physical properties (U-value,
weight), cost information and planning relations (neighbors, room functions). This is
expected to drastically change the traditional planning workflow and to improve
communication within the design team. Planning mistakes can be found early in the
process and incorporated more easily and quickly, and calculations of areas and costs
become more precise [73].
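A minimal sketch of such an object-level record might look as follows; the attribute names are illustrative and do not reproduce the actual "sgxml" schema:

```python
from dataclasses import dataclass, field

@dataclass
class WallComponent:
    area_m2: float                       # geometrical property
    u_value_w_m2k: float                 # physical property: heat transmission
    weight_kg: float                     # physical property
    cost_eur: float                      # cost information
    neighbours: list = field(default_factory=list)  # planning relation

    def transmission_loss_w(self, delta_t_k: float) -> float:
        """Steady-state heat flow through the wall: Q = U * A * dT."""
        return self.u_value_w_m2k * self.area_m2 * delta_t_k

wall = WallComponent(area_m2=12.0, u_value_w_m2k=0.25, weight_kg=2800.0,
                     cost_eur=1450.0, neighbours=["room_101", "corridor"])
print(wall.transmission_loss_w(20.0))   # 60.0 (W, across a 20 K difference)
```

Because geometry, physics, cost and planning relations live on the same object, quantity take-offs and load estimates can be derived from one model instead of several disconnected drawings.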
Keeping track of time, being able to visualize the 3D model of a building with all its
associated information, and being able to navigate backward and forward in time (the
so-called N-D CAD/BIM tools) are indispensable in such an information platform.
A series of four essential reports from McGraw-Hill Construction [74-77] provides a very
insightful view of the BIM topic as well as related topics (e.g. the real world of
construction, simulation tools, surveys).
Fig. II-77 – BIM Virtual Information (virtual simultaneous visualization of six different phases of an
ongoing building project, created using GraphiSoft ArchiCAD platform)
2.6.2 Automated Construction of Buildings
A new automated construction system is set to change the face of home building, as it
will soon step out of the development phase and become a functional house-building
robot. A prototype was developed by Behrokh Khoshnevis of the University of
Southern California [78]. A computer-controlled crane or gantry builds houses
efficiently without any manual labor. It uses a quick-setting concrete-like material and
builds up structures in a layer-by-layer fashion. This system is capable of erecting
walls of almost any shape and specification and can finish a full-fledged house in a matter
of a day without any breaks.
Fig. II-78 – Vision for an Automated System for Autonomous Construction of Buildings (Behrokh
Khoshnevis)
In addition to the savings on labor costs, the machine is believed to be eco-friendly in
contrast to the standard home construction method, which churns out up to seven tons of
waste and fumes. Such is the quality of this prototype that NASA is eyeing the machine
as an aid for building a lunar habitat. Its maker suggests that the robot's ability to rapidly
construct housing will help authorities rebuild shelters in areas struck by natural disasters.
Fig. II-79 – A Real Prototype for an Automated System that will allow the Creation of Buildings
Although automation has advanced in manufacturing, the implementation of automation in
construction has been slow. Conventional methods of manufacturing automation do not
lend themselves to the construction of large structures with internal features, which may
explain the slow rate of growth in construction automation. According to the cited
authors [79], Contour Crafting (CC) is a layered fabrication technology with great
potential in the automated construction of whole structures as well as subcomponents.
Using this process, a single house or a colony of houses, each with a possibly different
design, may be automatically constructed in a single run, with all the conduits for
electricity, plumbing and air-conditioning embedded in each house.
III. Simulation Tools in Architecture
3 Outline
Regarding the use of simulation tools for the creation of an automated optimization
framework, the first feature to look for is the feasibility/ease of programming
integration. A simple text import of simulation parameters and a mere export of results,
also in text format, was not the ideal development path for this research work, not only
because it would mean a decrease in performance, but also because it would be one more
bottleneck for the development of a multithreaded optimization framework. Therefore,
the quest to find simulation packages that could allow a complete integration between
Euclides/JavaScript, the evolutionary optimization framework and the chosen simulation
package was not easy. Several simulation tools/packages were evaluated in terms of
simulation results as well as ease of API integration [80]. There were also some
constraints related to licenses for software development purposes that had to be taken
into account.
3.1 EnergyPlus and DesignBuilder
EnergyPlus [81] is a modular and complete building energy simulation program that
engineers, architects and researchers use to model energy and water usage in buildings.
Modeling the performance of a building with EnergyPlus enables building professionals to
optimize the building design to use less energy and water. This software is also well
known for being tested extensively before each release.
With EnergyPlus it is possible to model heating, cooling, lighting, ventilation, other energy
flows and water use. It includes many innovative simulation capabilities, such as time
steps of less than one hour, modular systems and plant integrated with heat-balance-based
zone simulation, multi-zone air flow, thermal comfort, water usage, natural ventilation and
photovoltaic systems.
Fig. III-1 – EnergyPlus Simulation Zones
EnergyPlus [81] has its roots in both the BLAST (Building Loads Analysis and System
Thermodynamics) and DOE–2 programs. BLAST and DOE–2 were both developed and
released in the late 1970s and early 1980s as energy and load simulation tools. Their target
audience was the design engineer or architect who wishes to size appropriate HVAC
equipment, develop retrofit studies for life-cycle cost analyses, optimize energy
performance, or address other such problems.
Born out of concerns driven by the energy crisis of the early 1970s and the recognition that
building energy consumption is a major component of the American energy usage
statistics, the two programs attempted to solve the same problem from two slightly
different perspectives.
Both packages had their merits and shortcomings, their supporters, detractors and solid
user bases, internationally [81]. Like its parent programs, EnergyPlus is an energy analysis
and thermal load simulation program. Based on a user's description of a building from the
perspective of the building's physical make-up, associated mechanical systems, etc.,
EnergyPlus calculates the heating and cooling loads necessary to maintain thermal
control set points, the conditions throughout a secondary HVAC system and the coil loads,
and the energy consumption of primary plant equipment, as well as many other simulation
details that are necessary to verify that the simulation is performing as the real building
would. Many of the simulation characteristics have been inherited from the legacy
programs BLAST and DOE–2. A non-exhaustive list of features of the first release of
EnergyPlus, giving an idea of its rigor and applicability to various simulation situations,
can be found in "Contrasting the Capabilities of Building Energy Performance Simulation
Programs" by Crawley, D. B., et al. [81]. There are software interfaces, like DesignBuilder,
the first comprehensive user interface for the EnergyPlus dynamic thermal simulation
engine. It allows fast and easy geometry input and offers a set of tools for easy modeling
of buildings.
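The heat-balance idea at the core of such engines can be caricatured in a few lines: a single zone with one thermal capacitance, an envelope conductance and constant internal gains, stepped explicitly in time. This is a deliberately crude sketch with assumed parameter values, nothing like the actual EnergyPlus solver:

```python
def simulate_zone(t_inside_c: float, t_outside_c: float, hours: int,
                  dt_h: float = 1.0, ua_w_per_k: float = 150.0,
                  capacitance_j_per_k: float = 5.0e6,
                  internal_gains_w: float = 400.0) -> list:
    """Explicit Euler march of a one-node zone temperature."""
    temps = [t_inside_c]
    for _ in range(int(hours / dt_h)):
        # Net heat flow: internal gains minus envelope transmission losses.
        q_net_w = internal_gains_w - ua_w_per_k * (temps[-1] - t_outside_c)
        temps.append(temps[-1] + q_net_w * dt_h * 3600.0 / capacitance_j_per_k)
    return temps

# Unheated zone starting at 20 degC with 0 degC outside: the temperature decays
# toward the equilibrium t_outside + gains / UA (about 2.7 degC here).
history = simulate_zone(20.0, 0.0, hours=24)
```

Real engines solve a full surface-by-surface heat balance with radiation, moisture and HVAC coupling; this sketch only illustrates why a heat-balance formulation works naturally on sub-hourly time steps.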
Fig. III-2 – DesignBuilder and its Buildings Energy Efficiency Rating
3.2 Autodesk Ecotect Analysis
Autodesk Ecotect Analysis is a software package with a distinctive approach to conceptual
building design. It couples an intuitive 3D design interface with a comprehensive set of
performance analysis functions and interactive information displays. The latest version of
Autodesk Ecotect contains new refinements for exporting building models to
EnergyPlus (and also to Radiance for ambient light simulation). This means that it is
possible to work within an advanced modeling and visualization interface, making use of
a vast array of conceptual design tools, while still using EnergyPlus, considered one of the
best analysis and validation tools.
Fig. III-3 - Working in EnergyPlus-mode inside Ecotect, when defining operational schedules
Fundamentally, there are five main reasons to consider Ecotect as part of an analysis
workflow in building design:
1. Modeling and Visualization as a Conceptual Design Tool
Ecotect provides its own fast and intuitive modeling interface for generating even
the most complex building geometry. Most importantly, however, the model
remains editable. Tasks such as resizing or inclining walls, manipulating complex
curves, rearranging zones, moving apertures or even adding and deleting surfaces
are all straightforward;
Fig. III-4 - Internal daylight factors shown over a standard working plane
2. OpenGL
To assist the design process, the model can be visualized in OpenGL; overlaying
Sun-path diagrams, displaying shadow information and lighting grids, or simply
moving the model around in real time are also possible. With its unique
"sketchiness" parameters, analysis results can be presented directly within the
context of the building model, so that the client will understand that they are
looking at preliminary ideas and not the future finished product;
Fig. III-5 - Overlaying a Sun-path on the model view
3. One Central Repository For All Building Data
Each material in Ecotect can store a wide range of information including basic
thermal and surface properties, detailed layer descriptions, acoustic response and
even cost and environmental impact data (if available). Likewise, complex annual
operational schedules and hourly profiles can be generated and assigned for
controlling occupancy, appliances or internal conditions;
4. Internal Analysis Functions
Ecotect offers a wide range of internal analysis functions which can be used at any
time while modeling. They provide almost instantaneous feedback on parameters
such as sun penetration, potential solar gains, thermal performance, internal light
levels, reverberation times and even fabric costs while the user develops and
refines the design. More importantly, generative functions can be used in the
design, for instance to automatically shape shading devices given specific
performance parameters, or to interactively spray acoustic rays to accurately
position reflectors;
Fig. III-6 - Annual cumulative solar radiation over the external surfaces
5. Powerful Scripting Engine (LUA) and Easy Integration Capabilities
One of the most important features for the development of the current work was
the programming integration capability of the simulation tool. Ecotect is based on
LUA, one of the best scripting engines available. It also allows many different
ways of software development integration (sockets, LUA scripts, MS Windows
Dynamic Data Exchange – DDE, etc.).
Finally, Autodesk Ecotect Analysis also includes innovative building energy and carbon
analysis tools made available through the Green Building Studio web-based service [82].
This web service provides a user-friendly front end to powerful building energy analysis
software. All of the computationally intensive hourly simulations are carried out on remote
servers and the results are provided in a web browser.
3.2.1 Short Comparison between Autodesk Ecotect and EnergyPlus
Table III-1 compares Autodesk Ecotect Analysis and EnergyPlus with respect to the
CIBSE and ASHRAE calculation methods [83].
Table III-1 - Characteristics of Two Different Simulation Tools [83]

                         Autodesk Ecotect Analysis        EnergyPlus (ASHRAE's
Software                 (CIBSE Admittance method)        Heat Balance method)
Time step                Daily and hourly                 Hourly
Zoning                   Multi-zone                       Multi-zone
Heating regime           Intermittent or continuous       Intermittent or continuous
Heating set point temp.  Optional                         Optional
Cooling calculation      Simple dynamic                   Detailed dynamic
Internal gains           Daily + hourly values            Hourly
Outside conditions       Local meteorological data        Local meteorological data
                         can be implemented               can be implemented
3.3 Ansys: AirFlow
According to Ansys company, the Ansys simulation tools are currently used in all of the
following key “Green Technology” areas [84]: Pollution Reduction, Renewable Energy &
Fuels, Energy Efficiency, Indoor Air Quality and Green Building Design.
Fig. III-7 – Coloured contours of thermal comfort in a conference room, predicted for a particular
ventilation system design (Ansys, Inc. Proprietary)
3D Airflow is a highly functional, innovative software program for the analysis of airflow
in simulated 3D geometries. It enables the user to examine thermal conditions, perform
CFD simulations and generate physically founded results even in the early design phase
of buildings.
The development of 3D Airflow prioritized simple and user-friendly handling. It is
possible to construct complex 3D geometries in a short time and to relocate, reinsert or
resize both single parts and whole building sections at any time. Additionally, the central
3D input integrated in 3D Airflow enables the user to construct a simulation model which,
in the context of the integral analysis of buildings, can also be used for other simulations
(lighting, thermal, acoustic) [85].
Ansys offers not only AirFlow but also other very interesting simulation tools for the
architecture field, like CivilFEM or Fluent. This simulation package is particularly
interesting because it gathers many different simulation methodologies, such as CFX and
CFD. However, this package was not used in the current study because it was not possible
to get access to the necessary programming libraries and other API integration
material [86].
Fig. III-8 – Ansys CFD Modelling of Regional Flow Patterns near Cape Shopping Centre (Stephan
Schmitt & Thomas Kingsley; Qfinsoft, SA)
3.4 Sustainability Tools in Architecture – Comparison
Studies/Audits
Recently, an interesting study by Attia, S., et al. [80] was conducted in order to make
available to the community a more comprehensive examination of a wide range (the ten
most used) of scientifically validated Building Performance Simulation (BPS) tools
available internationally (i.e. Ecotect, HEED, Energy 10, eQUEST, DOE-2, Green
Building Studio, IES VE, DesignBuilder, EnergyPlus and the EnergyPlus-SketchUp
plugin called OpenStudio).
The study by Attia, S., et al. [80] also points out the gap between the wishes of users
and the actual functionalities of the different BPS tools.
Lastly, this study summarizes the key findings and underlines the major requirements
for future improvement and development of BPS tools, mainly from an architectural
perspective.
Also, an audit published by RICS [87] examined how RICS members were
engaging with the sustainability agenda and the tools, techniques and information that
they were using to inform their professional advice and work.
The research [87] was based on a major survey of 47,000 RICS members across three
main global regions (UK and Europe, the Americas, and the rest of the world). For the
first time, this work provided detailed information on why the sustainability topic is so
important.
IV. A Global Optimization Framework
4 Problematic of “Form Follows Energy” and the
Pursuit of Solutions
One of the main tasks of this master thesis was the exploration of studies (made by other
researchers in the architecture field) that could clarify and exemplify how the form of a
building determines this "Form Follows Energy" relationship.
How can one create buildings that take into account the specific weather of a region, the
number of solar hours and the orientation of the building? This seems to be a very
difficult question. Experts from numerous fields of research try to answer these questions
and develop methods and tools intended to make architects' and engineers' lives easier on
new projects that will ultimately improve the quality of living for all of us.
In the doctoral thesis of Christina Lemke [88], some difficulties were pointed out,
analyzed and discussed: can the existing common building forms be used to harness solar
radiation, or is there a need to develop new forms in order to increase the solar energy
extraction and, at the same time, contribute to a better adaptation as well as minimizing
thermal losses?
Fig. IV-1 – Different possible building forms, all with 1000 m³ of volume, allowing the
comparison of results obtained with different forms (PhD thesis of Christina Lemke [88])
A wide variety of building forms (sixty-four shapes) was also presented and analyzed
with respect to different climatic and radiation-geometric conditions for different regions,
to check whether they meet today's conditions for harnessing solar radiation and for the
thermal conditions of houses. The influence of a building's form and its orientation is also
discussed in order to check its fitness for a maximum solar potential. In sum, there is a
huge challenge in demonstrating the relation between a building's form and its energy
consumption and solar potential, and the co-relation between these factors and the final
location and orientation of a building [88].
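The effect isolated by such a constant-volume catalogue can be reproduced with elementary geometry: for a fixed 1000 m³, the envelope area, and with it both the transmission losses and the available solar collection surface, varies strongly with the proportions. The three box proportions below are illustrative choices, not shapes taken from [88]:

```python
def box_envelope_area(length_m: float, width_m: float, height_m: float) -> float:
    """Total external surface area of a rectangular box, roof and floor included."""
    return 2.0 * (length_m * width_m + length_m * height_m + width_m * height_m)

VOLUME = 1000.0  # m^3, fixed as in the constant-volume form studies
shapes = {
    "cube (10 x 10 x 10)":  (10.0, 10.0, 10.0),
    "slab (20 x 20 x 2.5)": (20.0, 20.0, 2.5),
    "tower (5 x 5 x 40)":   (5.0, 5.0, 40.0),
}
for name, (l, w, h) in shapes.items():
    assert abs(l * w * h - VOLUME) < 1e-9          # same enclosed volume
    print(f"{name}: envelope {box_envelope_area(l, w, h):.0f} m2")
# cube: 600 m2, slab: 1000 m2, tower: 850 m2
```

Among boxes of equal volume, the cube minimizes envelope area; which of the sixty-four forms is actually "best" then depends on how envelope losses trade off against solar gains at a given latitude and orientation, which is exactly the question the thesis examines.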
In another work, by AlAnzi, A., et al. [89], a simplified analysis method to estimate the
impact of building shape on the energy efficiency of office buildings in Kuwait is
presented. This method is based on results obtained from a comprehensive whole-building
energy simulation analysis. The simplified method is, according to the authors, suitable for
architects during the preliminary design phase to assess the impact of shape on the energy
efficiency of office buildings. Studies have shown that the building shape can have a
significant impact on the energy costs of heating and cooling; however, no general
guidelines on the impact of form on the energy efficiency of buildings were available to
architects and designers. The correlation equation described in this paper was presented
as "the solution" to be used by architects during the preliminary design phase to assess
the impact of shape on the energy efficiency of office buildings in Kuwait.
Another effort obtained numerical results to derive parametric equations as a function of
the thickness of insulation as well as the surface-to-volume ratio (A/V), taking into
account the effect of increasing height as well as increasing length and width. According
to Bansal, N., et al. [90], the two important parameters that affect the energy consumption
of a building are its insulation and its shape, defined by a surface-to-volume ratio (A/V).
Architects and building practitioners, however, find it difficult to estimate energy and
loads (cooling and heating) for the recommended U-values (the U-value, or coefficient of
heat transmission, is a measure of the rate of non-solar heat loss or gain through a material
or assembly; the lower the U-value, the greater a product's resistance to heat flow and the
better its insulating value). It is also found to be difficult (according to Bansal, N., et al.
[90]) to run simulation programs directly.
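The two parameters singled out by Bansal, N., et al. [90] combine into a one-line steady-state load estimate; the numbers below are illustrative (the insulation thickness is folded into the assumed U-value):

```python
def design_heat_load_w(u_value_w_m2k: float, envelope_m2: float,
                       delta_t_k: float) -> float:
    """Q = U * A * dT : steady-state transmission load through the whole envelope."""
    return u_value_w_m2k * envelope_m2 * delta_t_k

def a_over_v(envelope_m2: float, volume_m3: float) -> float:
    """Surface-to-volume ratio, the compactness indicator used in shape studies."""
    return envelope_m2 / volume_m3

# A 1000 m3 cube has a 600 m2 envelope:
print(a_over_v(600.0, 1000.0))                  # 0.6
print(design_heat_load_w(0.25, 600.0, 25.0))    # 3750.0 (W at U = 0.25, dT = 25 K)
```

At an equal insulation level, a less compact form (higher A/V) raises the transmission load proportionally; this coupling between shape and insulation is precisely what makes the two parameters hard to trade off by hand without simulation.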
There are also studies that try to relate the heating consumption of buildings to their
shape, always discussing some inherent limitations. In one particular work, a specific
parameter was chosen to characterize the shape of the buildings: the shape coefficient,
defined as the ratio between the external skin surface and the inner volume of the
building. Fourteen buildings were chosen according to the variety of their shapes and
their representativeness among current constructions, and this article sets out a study of
the link between building shape and energy consumption. However, according to
Depecker, P., et al. [91], several parameters are missing from their work, such as
orientation or climate, which would be needed to confirm these results.
Fig. IV-2 – Definition of an elementary volume, according to Depecker, P., et al. [91]
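As an illustration of this metric, the shape coefficient of a simple box-shaped building can be computed directly from its dimensions. The sketch below is only a minimal illustration (the dimensions are hypothetical, and whether the ground slab counts as external skin varies between studies):

```javascript
// Shape coefficient C_f = external envelope area / enclosed volume
// for a simple box-shaped building (hypothetical dimensions; the
// ground slab is assumed excluded from the external skin here).
function shapeCoefficient(width, length, height) {
  const roof = width * length;
  const walls = 2 * height * (width + length);
  const envelope = roof + walls;
  const volume = width * length * height;
  return envelope / volume; // [1/m]
}

// A compact building encloses the same volume (1000 m3) with less skin:
console.log(shapeCoefficient(10, 10, 10).toFixed(3)); // compact cube: 0.500
console.log(shapeCoefficient(40, 5, 5).toFixed(3));   // long, flat box: 0.650
```

For a fixed volume, the more compact form has the smaller shape coefficient, which is exactly the property the studies above relate to heating consumption.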
The purpose of several other articles is to present methods for solving the optimization
problem of the internal partitions of a building, its shape and its heat sources,
illustrated by the example of the multi-criteria optimization of large blocks of flats,
and to present rational methods for the multi-criteria optimization of the shape and
structure of energy-saving buildings, as well as the optimization of heat sources, taking
energy criteria into account [92] [93] [94].
Often, studies tend to provide mere guidelines to be applied in order to minimize energy
consumption in buildings, because this is a very difficult topic and interest in reducing
energy consumption in architecture and urban planning has increased considerably in recent
years. This kind of approach usually plans for the season with the severest weather, often
forgetting that temperatures in cities at certain latitudes can drop below thermal comfort
limits in winter, while temperatures in cities at other latitudes often rise above thermal
comfort limits in summer. Holistic approaches to energy-efficient building forms seem to
be needed; some offer a systematic comparison and an evaluation of the relationships
between urban built form and the energy efficiency of a few generic forms. However,
according to Okeil, A. [95], some limitations of these studies also have to be observed,
such as the performance at several latitudes, or the performance reported for some
optimized generic forms. In a real-life situation the form of the building will have to
respond to many other forces that might lead to a reduction in performance.
Fig. IV-3 – Flow field at a street intersection with a tall building, illustrating exchanges between the
streets and additional mixing processes due to the large building
Others provide a simplified analysis method to predict the impact of the shape of an
office building on its annual cooling and total energy use. This simplified analysis
method was developed from detailed simulation analyses covering several combinations of
building geometry, glazing type, glazing area and climate. The shape of a building has a
significant impact on both its construction costs and its energy costs, and these studies
investigated the impact of building shape on thermal performance for selected climates in
Europe. A simplified analysis tool is therefore proposed in some works to assess the
impact of building shape on the total annual energy use of office buildings. According to
Ourghi, R., et al. [96], the analysis indicates a strong correlation between the shape of
a commercial building and its energy consumption. These methods were applied to several
cities around the world and were found to be accurate for cooling-dominated climates.
Finally, five of the most commonly used single-span greenhouse shapes (Even-span,
Uneven-span, Vinery, Modified arch and Quonset types) have also been compared. The length,
width and height (at the center) were kept the same for all the selected shapes. A
mathematical model for computing the transmitted total solar radiation at each hour, for
each month and at any latitude was developed for the selected greenhouse geometries, for
both east-west and north-south orientations. The computed transmitted solar radiation was
then fed into a transient thermal model developed to compute the hourly inside air
temperature for each shape and orientation. Selecting the optimal shape and orientation of
a greenhouse can lower the heating and cooling loads of the installed systems, saving
considerable operating costs. Hence, in this study an attempt was made to select the most
suitable shape and orientation of a greenhouse for different climatic zones (latitudes),
on the basis of total solar radiation availability and its subsequent effect on the
greenhouse air temperature. From this study, important conclusions were drawn regarding
greenhouse shapes and their relation to maximum solar radiation extraction, using several
different “manual simulations” (Sethi, V. P. [97]).
Fig. IV-4 – View of greenhouse shapes in E-W orientation
For a better understanding of how decisions to improve the energy performance of office
buildings in warm climates are taken, and why energy tools are not popular among
architects, a survey was made by Pedrini, A., et al. [98] that shows the clear preference
of architects for intuition and assumptions over energy tools, guidelines, rational
thinking or hypothesis testing. That study presents some recommendations to make energy
tools more suitable for the early stages of the building construction process, such as
making the geometric modeling more compatible. Parametric analysis could be optimized if
the user defined the range and the interval of values that a specific variable can assume;
the alternatives would then be created and simulated automatically. Architects look for
simple answers, and the outputs should make the comparison of different solutions easy and
fast, “whenever possible through an automated tool”.
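The automated parametric analysis suggested above (the user defines the range and interval of each variable, and the design alternatives are generated for simulation) can be sketched in a few lines of JavaScript; the parameter names and ranges below are hypothetical:

```javascript
// Enumerate design alternatives from user-defined ranges and intervals
// (hypothetical parameters; each alternative would then be simulated).
function* alternatives(ranges) {
  const names = Object.keys(ranges);
  function* recurse(i, current) {
    if (i === names.length) { yield { ...current }; return; }
    const { min, max, step } = ranges[names[i]];
    for (let v = min; v <= max; v += step) {
      current[names[i]] = v;
      yield* recurse(i + 1, current);
    }
  }
  yield* recurse(0, {});
}

const designs = [...alternatives({
  width: { min: 15, max: 20, step: 5 },     // 15, 20
  roofAngle: { min: 38, max: 60, step: 11 } // 38, 49, 60
})];
console.log(designs.length); // 2 x 3 = 6 alternatives
```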
4.1 Answer to the Problematic: Optimal Forms - A
Global Optimization Framework
The studies conducted in the present work aimed at the creation and presentation of new
methods that integrate sustainable design methods with evolutionary computation and
environmental simulation tools.
The main objective of this part (the applied work and the development of a global
optimization framework) was to research and develop innovative, intelligent and automated
design methods that allow the creation of environmentally optimized architecture, through
the use of Euclides and JavaScript (for procedural modeling), evolutionary algorithms,
simulation tools and, finally, a newly developed global optimization framework.
This work started (as seen before) with essential research on the various topics related
to architecture, building optimization, simulation tools and common methods and
techniques, collecting information relevant to this study, as well as establishing
national and international contacts with architects and other researchers closely related
to this subject.
4.2 Identification of Essential Forms Used in the Real
World
After this initial research step, a deeper look was taken at studies specifically related
to the problematic of “form follows energy” and at the different basic building shapes,
parameters and conclusions presented in many published studies (as seen before); these
standard, common real-world building shapes were then carefully analyzed and studied.
Fig. IV-5 – 3D Models that Represent Real World Factories
It was decided to center the work on greenhouses, because they are the most basic building
forms suitable for testing the framework developed in this study, and they therefore
constitute a very straightforward case study to confirm early tests.
Three standard “real world” basic shapes (cylinder, classic with an angled roof, and cube)
were chosen as good candidates for procedural 3D modeling, as seen in Fig. IV-6.
Fig. IV-6 - Selected basic forms inspired by real-world building shapes, from left to
right: (a) Cube, (b) Classic and (c) Cylinder, created using Euclides and rendered using
Deep Exploration
4.2.1 Procedural Shape Generation
After analyzing the different 3D shapes in order to extract all the fundamental form
parameters, the essential parametric equations were obtained, because they provide the
mathematics needed for creating 3D models using computer graphics techniques (e.g.
OpenGL).
The K3DSurf and Maple software packages were used at this stage; in this way it was
possible to create and visually try different parameter combinations, obtaining the
corresponding simplified expression code in the end. The use of these packages was
essential for understanding the different shapes and for inferring the necessary
parameters.
Fig. IV-7 – Tests for creating different 3D shapes (and controlling their parameters)
using parametric equations
As seen in Fig. IV-8, essential parts of the intended 3D shape could actually be generated
using parametric equations (using K3DSurf and Maple).
Fig. IV-8 – Definition of parametric equations in Maple 14
In the end (please see Fig. IV-9) optimized code was obtained. Some extra code was added
to create the missing parts of the shape, and all the code was then refactored using
JavaScript and Euclides to create a complete 3D shape.
Fig. IV-9 - Optimized code generated by Maple 14
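The role of these parametric equations is easy to see in code: sampling x = r·cos(u), y = r·sin(u), z = v over the parameter domain yields the vertex grid of a cylinder's lateral surface. The following sketch illustrates the principle only; it is not the generated Maple or Euclides code:

```javascript
// Sample a cylinder's lateral surface from its parametric equations:
//   x = r cos(u), y = r sin(u), z = v,  u in [0, 2*pi), v in [0, h]
// (illustrative sketch; radius/height values are hypothetical)
function cylinderVertices(radius, height, segments, rings) {
  const verts = [];
  for (let j = 0; j <= rings; j++) {
    const v = (height * j) / rings;
    for (let i = 0; i < segments; i++) {
      const u = (2 * Math.PI * i) / segments;
      verts.push([radius * Math.cos(u), radius * Math.sin(u), v]);
    }
  }
  return verts;
}

const verts = cylinderVertices(5, 10, 16, 4);
console.log(verts.length); // 16 vertices per ring x 5 rings = 80
```

Faces would then be built by connecting neighboring samples; the same recipe covers the other shapes once their parametric equations are known.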
4.2.2 Code Writing Using Euclides and JavaScript
The code development process started by producing the necessary JavaScript code, using
Euclides (the Fraunhofer JavaScript procedural shape generator), to create the three
standard basic shapes (please see Fig. IV-6) procedurally, so that the desired geometric
parameters of those shapes could later be controlled dynamically by the evolutionary
algorithm.
Fig. IV-10 – Piece of JavaScript code to generate the 3D cylinder shape (code adapted from
the parametric equations and Maple 14)
This new JavaScript/Euclides code led to a new application that allows manual interaction
with each of the shape parameters (width, height, roof angle, orientation), as seen
previously in Fig. II-12 (Chapter II).
Fig. IV-11 – Resulting JavaScript/Euclides interface that allows the control of each shape parameter
4.3 Simulation Tools Integration
After achieving the creation of these basic procedural 3D shapes, an investigation was
conducted into general simulation packages that could be fully integrated into the
optimization process to be developed.
The research covered different simulation packages related to the building construction
business, such as EnergyPlus, DOE, Ansys, Autodesk Ecotect and the preview version of
Autodesk Vasari (a very promising and innovative project in this field that was also
tested).
Due to the specific needs and the commitment to a complete programming integration (rather
than just using external scripts or batch files) between the chosen simulation package and
the developed global optimization platform, and after several unsuccessful integration
tests and attempts, the obvious choice was Autodesk Ecotect and its powerful internal
script engine based on Lua (a standard scripting language).
Autodesk Ecotect was preferred mainly because:
- It is a simulation tool commonly used by architects all around the world;
- It was already used in past studies related to the topic of this study (allowing better
comparison);
- It was (as said previously) the simulation package most suitable for a complete
integration at this stage.
Initially, the procedural shapes produced using Euclides and JavaScript were exported to
the Wavefront format (.obj files), and several individual, manual thermal analysis studies
(as well as other types of analysis) were performed using Autodesk Ecotect, as a way to
understand the functionalities, problems and advantages of this powerful and widely
accepted simulation tool.
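Exporting such a procedural mesh to the Wavefront format is straightforward, since .obj is a plain-text list of vertices and faces. A minimal writer might look like the sketch below (an illustration of the file format, not the actual Euclides exporter):

```javascript
// Minimal Wavefront .obj writer: "v x y z" vertex lines followed by
// 1-based "f i j k" face lines (illustrative, not the Euclides exporter).
function toObj(vertices, faces) {
  const lines = [];
  for (const [x, y, z] of vertices) lines.push(`v ${x} ${y} ${z}`);
  for (const face of faces) lines.push("f " + face.map(i => i + 1).join(" "));
  return lines.join("\n") + "\n";
}

// A single triangle:
const obj = toObj([[0, 0, 0], [1, 0, 0], [0, 1, 0]], [[0, 1, 2]]);
console.log(obj);
// v 0 0 0
// v 1 0 0
// v 0 1 0
// f 1 2 3
```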
The Autodesk Ecotect thermal analysis parameters and conditions were then defined
manually, and several Lua scripts were created to automate these manual simulation
analyses and to incorporate all the parameters and conditions in script form. In this step
it was essential to search for detailed information on how to perform correct Autodesk
Ecotect simulations and on how to create the necessary scripting code [99-102].
The .NET programming language C# was then selected for the development of a new Autodesk
Ecotect simulation library. By implementing the previously developed Ecotect Lua scripts
internally, a flexible and comprehensive library was created, able to run several kinds of
analysis inside Autodesk Ecotect in an automated way [103]. This library can now be used
not only as part of the developed global optimization framework, but also in any other
project or third-party standalone (independent) application that may benefit from this
kind of automated integration with Autodesk Ecotect.
The next step was to create a web service [104, 105] (using NetBeans 6.9.1 and MS Visual
Studio 2010) to integrate Autodesk Ecotect and the .NET simulation library with the
JavaScript procedural shape generator (Euclides) and the developed Java global
optimization framework.
This web service also allows simulations to be run remotely through the web. In the
future, the web service can be installed on several web servers (together with Autodesk
Ecotect), allowing simultaneous runs and parallel simulations as a way to speed up the
optimization process (other simulation packages can also be integrated into this newly
developed framework).
4.3.1 Simulation in Ecotect and the Admittance
Method
The CIBSE admittance method dates back to the 1960s, when it was developed to enable
practicing engineers to estimate peak cooling loads using manual calculations. It is
basically a simple representation of a highly dynamic process: the building, as a physical
body with thermal mass and heat capacity, under the influence of variable outside
conditions and variable internal gains. The admittance method (also known as the “means
and swings” method) uses a steady-state approach for the “mean” values in combination with
a dynamic part that describes all deviations from the steady state, i.e. the “swings”. The
admittance method is based on the assumption that all thermal dynamics can be represented
by the response to a sine wave with a period of 24 hours [106].
The admittance method is broadly used and has been shown to be an extremely useful design
tool. It is not as physically precise as some of the more computationally intensive
techniques, such as the response factor or finite difference methods. However, for the
purposes of design decision-making the admittance method is considered the best choice, as
it provides near-immediate feedback, and the accuracy of the results can be progressively
improved as the building model develops (Hensen, J. and M. Radošević [107]). Using this
method there is no need to solve partial differential equations, and it is also possible
to use the method for manual calculations. By introducing several parameters, the
admittance method expresses the building dynamics in a simple way. The parameters depend
on the type of thermal input, the thermal properties and thickness of the construction,
and the surface finishes and furnishings within the space (Hensen, J. and M. Radošević
[107]).
In other words, for load or temperature calculations it is necessary to determine the mean
values of either temperature or load, and then the swing (mean to peak) of one or the
other. Both values are obtained from the heat balance equation. The method involves heat
balance equations both for the steady state and for the deviations from the mean. The
latter equation involves parameters such as the decrement and time delay of the thermal
input from outside towards inside due to heat accumulation within the construction
(Hensen, J. and M. Radošević [107]).
The overall calculation procedure is presented in Environmental Design, CIBSE Guide A,
published by the Chartered Institution of Building Services Engineers (CIBSE) [108].
The thermal admittance, Y [W/m²K], can be defined as the amount of energy leaving the
internal surface of the element into the room per unit degree of temperature swing, under
theoretical conditions where the internal environmental temperature undergoes periodic
oscillation and the external environmental temperature is constant [109].
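To make the definition concrete: under the method's core assumption of a 24-hour sinusoidal swing, the heat-flow swing through a surface is the temperature swing scaled by the admittance (ignoring the phase shift that the full method also models). The numbers in this toy illustration are hypothetical:

```javascript
// Toy illustration of the admittance idea: the swing (deviation from
// the daily mean) of heat flow through an internal surface equals the
// temperature swing times the admittance Y (phase shift ignored).
// All values below are hypothetical.
const Y = 3.0;        // admittance [W/m2K]
const area = 50;      // internal surface area [m2]
const meanTemp = 20;  // daily mean environmental temperature [C]
const swingAmp = 4;   // amplitude of the 24 h sinusoidal swing [K]

function tempAt(hour) { // 24 h sine wave around the mean
  return meanTemp + swingAmp * Math.sin((2 * Math.PI * hour) / 24);
}

function heatFlowSwing(hour) { // deviation from mean heat flow [W]
  return Y * area * (tempAt(hour) - meanTemp);
}

console.log(heatFlowSwing(6));  // peak of the swing: 3 * 50 * 4 = 600 W
console.log(heatFlowSwing(18)); // trough: -600 W
```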
4.3.2 Initial Manual Workflow Tests
The developed global optimization framework is responsible for the 3D shape generation and
its parameterization; the 3D geometry is passed to the simulation library, which in turn
passes the simulation parameters and the 3D geometry to the analysis tool (Autodesk
Ecotect).
Fig. IV-12 - 3D shape generation in Euclides (1000 m3 of volume), followed by thermal
analysis and presentation of the analyzed shape. Other types of analysis can be performed
as well.
Several manual simulations using Autodesk Ecotect were conducted to check whether the
models generated using Euclides and JavaScript were suitable for simulation, and to
discover all the Ecotect parameters necessary to complete the simulation process. For this
initial test, the models were manually exported to the .obj format and manually imported
into Autodesk Ecotect.
With all the parts working together, the simulation results are now obtained automatically
and the values are passed back to the optimization framework, which decides (via the
evolutionary algorithm) whether the optimization process is over or whether a new 3D shape
needs to be automatically generated with new values for the mutable parameters (i.e. new
values for the width, height, orientation of the building or the roof angle), keeping the
total volume of the final building constant in this case study (e.g. 1000 m3), as seen in
Fig. IV-12 and Fig. IV-13 (please see the work conducted by Christina Lemke [88] for more
specific information).
Fig. IV-13 – Another 3D shape generation in Euclides (maintaining 1000 m3), followed by
thermal analysis and presentation of the analyzed shape
4.4 Differential Evolution
After the integration between the web service and the global optimization framework,
attention was focused on the optimization process. To implement it, evolutionary
algorithms were chosen, more specifically Differential Evolution (DE), by Storn and Price
[26].
For the inclusion of the differential evolution algorithm, the Fraunhofer/CGV Austria
“FhAFoundation” library was used. This library contains several different algorithms and
math functions; specifically, the package “cgv.math” was used (with
“cgv.math.linalg.Matrix”, “cgv.math.linalg.Vector”, “cgv.math.linalg.DenseVector” and
“cgv.math.optimization.MultivariateFunction”). The package cgv.euclides was also used, for
a direct import of the three Euclides/JavaScript 3D shape implementations
(Classic_Shape_Opt.jar, Cylinder_Shape_Opt.jar and Cube_Shape_Opt.jar).
In the Fraunhofer implementation of Differential Evolution (DE), the algorithm was changed
to ensure that it never leaves the specified domain; that is, parameters are always valid.
Moreover, out-of-domain parameters are not simply clipped: a uniform distribution over the
domain is ensured.
It is also possible to adjust the following parameters of the DE algorithm (please refer
to the report “Good Parameters for Differential Evolution”, published by Pedersen [31],
for the selection of good values for these parameters):
- Recombination between two chromosomes (two different buildings, i.e. two different 3D
shapes with some different parameters, e.g. different orientation or width),
differentialEvolutionCrossOver, values used: 0.9/0.6;
- The mutation operator randomly flips or changes genes in a chromosome using a weighted
difference (different values for the several parameters of the building, i.e. the 3D
shape), differentialEvolutionWeighting, value used: 0.7.
And the respective stop criterion parameter values:
- Minimum improvement, differentialEvolutionXTolerance, value used: 1.0;
- Total number of individuals generated, differentialEvolutionLimit, value used: 1000
evaluations.
Essential simulation parameters also used in the optimization process:
- Volume of the shape (1000 m3), volume;
- Name of the remote machine running Autodesk Ecotect, ecotectServerName;
- Port number for the web service connection, ecotectServerPort;
- Ecotect occupancy and activity level for the analyzed zone (shape): ecotectOccupancy
(number of persons) and ecotectActivity (please see Table IV-1);
Table IV-1 – Activity Level in Autodesk Ecotect
Activity Level | Work                          | W (watts consumed)
0              | Sedentary (Sitting at a desk) | 70 W
1              | Walking (Moving around)       | 80 W
2              | Significant Activity          | 100 W
3              | Significant Fatigue           | 150 W
- The Ecotect weather file, ecotectWeather ("PRT_Coimbra.085490_IWEC.wea");
- The name of the (Euclides/JavaScript) file containing the 3D geometry of the building
(compiled into JAR format), shapeGenerator (e.g. Shape_Cylinder_Opt.jar);
- The default Autodesk Ecotect material (“BrickTimberFrame”). At this stage only the
default material is used, and it is automatically assigned to the entire building (other
materials can already be manually assigned to the entire building before beginning the
optimization process; future work can automatically handle other types of materials).
Fig. IV-14 – Example List for Materials that can be used inside Autodesk Ecotect
The 3D shape is read (e.g. Cylinder_Shape_Opt.jar) and the respective number of parameters
and their valid domains are extracted (e.g. the volume of the shape, the minimum and
maximum values for orientation, width and length, or any other parameters passed). These
parameters and values are stored inside the JAR file and are automatically loaded into the
optimization framework.
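The working principle of the DE variant described above (rand/1/bin recombination, with out-of-domain genes re-drawn uniformly from the domain instead of clipped) can be condensed into a short sketch. This is an illustrative re-implementation, not the FhAFoundation code:

```javascript
// Minimal Differential Evolution (rand/1/bin) minimizing f over box
// bounds [lo, hi]. Out-of-domain trial genes are re-sampled uniformly,
// as in the Fraunhofer variant described above. Illustrative only.
function differentialEvolution(f, lo, hi, { np = 20, F = 0.7, CR = 0.9, gens = 100 } = {}) {
  const dim = lo.length;
  const rnd = (a, b) => a + Math.random() * (b - a);
  let pop = Array.from({ length: np }, () => lo.map((l, d) => rnd(l, hi[d])));
  let fit = pop.map(f);
  for (let g = 0; g < gens; g++) {
    for (let i = 0; i < np; i++) {
      // pick three distinct partners a, b, c (all different from i)
      let a, b, c;
      do { a = Math.floor(Math.random() * np); } while (a === i);
      do { b = Math.floor(Math.random() * np); } while (b === i || b === a);
      do { c = Math.floor(Math.random() * np); } while (c === i || c === a || c === b);
      const jRand = Math.floor(Math.random() * dim);
      const trial = pop[i].map((x, d) => {
        if (d !== jRand && Math.random() >= CR) return x; // keep old gene
        let v = pop[a][d] + F * (pop[b][d] - pop[c][d]);  // weighted difference
        if (v < lo[d] || v > hi[d]) v = rnd(lo[d], hi[d]); // re-sample, not clip
        return v;
      });
      const ft = f(trial);
      if (ft <= fit[i]) { pop[i] = trial; fit[i] = ft; } // greedy selection
    }
  }
  const best = fit.indexOf(Math.min(...fit));
  return { x: pop[best], f: fit[best] };
}

// Usage: minimize a simple 2-D bowl (a stand-in for the admittance call):
const res = differentialEvolution(([x, y]) => x * x + y * y, [-5, -5], [5, 5]);
console.log(res.x, res.f);
```

In the framework, the objective function would submit the generated building to the simulation web service and return the admittance value to be minimized.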
4.5 The Developed Global Optimization Framework
A complete overview of the global optimization framework developed as part of this study
is shown in Fig. IV-15.
Fig. IV-15 - Overview of “Optimal Forms”, the developed and proposed Global Optimization
Framework
The global optimization framework focuses on controlling at least three specific
parameters of the fundamental 3D shapes created in this work: the width, the height and
the orientation of the building in a specific location. This location can be chosen from a
list of available locations, and the corresponding temperature files are automatically
loaded into Autodesk Ecotect using the developed simulation library. In the particular
case of this study, the weather values for the city of Coimbra, Portugal, were selected
(tests using the specific weather files of other cities, e.g. Graz, Austria, were executed
as well).
The length of the building is automatically calculated for the entire building, by
relating the width, the height and the volume, which was initially fixed at 1000 m3.
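For the box-like shapes this relation is simply L = V / (W · H); a one-line helper illustrates it (the classic shape would additionally have to account for the prism of volume under its angled roof):

```javascript
// Length of a box-shaped building from its fixed volume:
//   V = W * H * L  =>  L = V / (W * H)
// (illustrative; the classic shape also has roof volume to account for)
const lengthFor = (volume, width, height) => volume / (width * height);

console.log(lengthFor(1000, 20, 5)); // 10 m
```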
The 3D geometry of the shape to be analyzed in Autodesk Ecotect is passed to the web
service through an internal 3D data structure (instead of exporting it to an .obj file, as
was done initially in the manual process).
A new thermal analysis is then started in Autodesk Ecotect and, in return, the newly
calculated admittance value is returned by the thermal analysis web service call.
This admittance value is in turn passed back to the evolutionary algorithm (which tries to
minimize it), and the optimization process continues until a stopping criterion is met,
returning in the end the optimal values of width, height and orientation for a specific
city location and an overall building material.
Fig. IV-16 – The Global Optimization Framework running autonomously (simulation in Ecotect
followed by 3D shape generation in order to evolve a population of new buildings with different
parameters using the differential evolution algorithm)
4.6 Case Studies and Presentation of Results
The results presented here aim to demonstrate that an automated tool/platform for
procedural 3D shape generation and autonomous optimization of building forms can be built
by developing JavaScript code (using Euclides), linking it with an evolutionary algorithm
(in this study, the Differential Evolution (DE) algorithm) and, finally, combining all of
this with standard, accepted analysis tools in the architecture field.
The intention of this work, at this stage, is to validate that procedural languages and
evolutionary algorithms can indeed greatly help architects and other entities related to
building construction to test and analyze their creations, in terms of thermal and energy
efficiency, in the early stages of the design process. More parameters and form functions
can and should be integrated into this framework in order to create a suitable tool for
supporting building projects.
The results discussed here were already shown to several groups of architects (nationally
and internationally), who immediately demonstrated interest in helping to develop these
methods further and are willing to support their daily work with the results produced by
this newly developed optimization framework.
Also, this kind of tool (and framework) can, in the near future, achieve a more holistic
approach to the several problems and constraints related to green and sustainable
construction.
Using an automated optimization framework, in contrast to guidelines, tables of values,
rules of thumb, manual and independent simulations, intuition and personal experience, or
any other manual and tedious process, is indeed a great improvement in the design process
of buildings, especially new ones, which must take environmental and energy efficiency
factors into account.
4.6.1 Case Study 1 – Classic Shape Building
Optimization
In this first case study a proof of concept is presented that recreates the normal
decision process of an architect designing a new building (based on the “classic” form of
a home, with an angled roof) to be constructed in the city of Coimbra, Portugal. It is
assumed that the project is still in an early design phase: there is already a rough idea
of the final shape of the building, and there are also some important project constraints
on the building dimensions (width, height, roof angle...) that are fundamental to the
future construction of the building in its specific physical location.
In this specific case study the assumed constraints of the building were:
- Width must be between 15 m and 20 m;
- Height must be between 3 m and 6 m;
- Volume must be 1000 m3, in order to analyze the energy dependence between form and
volume; the shape coefficient is defined as the ratio between the external skin surface
and the inner volume of the building, according to Christina Lemke [88], Bansal, N., et
al. [90] and Depecker, P., et al. [91];
- Length is automatically calculated; it is a relation between width, height and volume;
- Roof angle must be between 38° and 60°;
- Orientation of the building can vary between 0° and 180°; for testing and demonstration
purposes of the optimization framework, tests were also conducted between 0° and 360°;
- Location: the city of Coimbra, Portugal (region-specific weather data was loaded into
the simulation tool used, i.e. Autodesk Ecotect).
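For reference, this case-study domain can be written down as a plain data structure of the kind the framework extracts from the shape's JAR file; the field names below are illustrative, not the actual JAR metadata:

```javascript
// Case-study 1 domain as an illustrative data structure (the real
// framework reads equivalent metadata from the shape's JAR file).
const classicShapeDomain = {
  volume: 1000,                      // m3, held constant
  width: { min: 15, max: 20 },       // m
  height: { min: 3, max: 6 },        // m
  roofAngle: { min: 38, max: 60 },   // degrees
  orientation: { min: 0, max: 180 }, // degrees (0 to 360 in some tests)
  location: "Coimbra, Portugal",     // selects the Ecotect weather file
};

// Length is not a free parameter; it follows from volume, width, height.
const within = (v, { min, max }) => v >= min && v <= max;
console.log(within(17.5, classicShapeDomain.width)); // true
```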
Several runs (more specifically, seven optimization tests) were conducted for this case
study using the “classic” shape as the assumed final building form (similar tests were
also conducted using the cube and cylinder shapes).
Table IV-2 - Several runs using the developed framework to optimize the design of a
“classic shape” building using the Differential Evolution (DE) algorithm. The table also
shows the different constraints and parameters used in the optimization process
Table IV-2 shows the project constraints (domain values) used in each optimization run
(the Results Order column), as well as the different settings used to configure the
evolutionary algorithm. As seen in the previous chapters, according to Storn and Price
[26] a crossover value of 0.9 is always a good initial value to check whether the
algorithm converges quickly on the given problem. In some optimization runs the project
constraints were deliberately modified, although kept within the real project constraints
(e.g. in optimization run number four the height was constrained between 3 m and 4 m), to
check for any significant changes in the overall optimization times and results. Table
IV-2 also shows consecutive runs using the same constraint/parameter values; these runs
were executed in order to analyze the consistency of the obtained results.
Table IV-3 - Results obtained in each of the optimization runs (according to Table IV-2)
The “time needed” column shows the total time needed to present a “minimized” solution for
the given problem; that is, the time shown is not the time consumed by the differential
evolution algorithm alone, but the total time consumed by the global optimization
framework (procedural generation of each different shape + differential evolution
algorithm + Autodesk Ecotect simulation). It should also be noted that one of the stop
criteria used was the maximum number of individuals to be evaluated (in this case 1000),
but the number of evaluations actually needed was between 35 and 137 (2 to 8 generations).
The stop criterion that triggered was therefore the one given by dx, equal to 1.0, i.e.
the minimum improvement margin over four consecutive optimization steps: after the
improvement was smaller than 1.0 four times in a row, a minimized, optimized solution for
the given problem was considered found. Analyzing optimization runs 5, 5_1 and 6, it is
also possible to see that if the architect of this case-study project had chosen to
constrain the roof angle to values between 38° and 45°, the results would have been
dramatically different: in these last runs the total admittance is smaller than in the
initial optimization runs. Of course, this depends on each project constraint; the purpose
of the framework is always to find the best possible solution for the supplied
constraints.
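The dx stop criterion just described (stop once the improvement of the best solution stays below 1.0 for four consecutive checks) can be expressed compactly; an illustrative sketch:

```javascript
// Stop once the improvement of the best admittance stays below a
// tolerance (dx = 1.0) for four consecutive checks, mirroring the
// stop criterion described above (illustrative sketch).
function makeStopCriterion(dx = 1.0, patience = 4) {
  let previousBest = Infinity;
  let stalled = 0;
  return function shouldStop(currentBest) {
    const improvement = previousBest - currentBest;
    stalled = improvement < dx ? stalled + 1 : 0;
    previousBest = currentBest;
    return stalled >= patience;
  };
}

const stop = makeStopCriterion();
const bests = [150, 140, 137.5, 137.1, 136.9, 136.85, 136.84];
console.log(bests.map(stop)); // true only at the last entry, after four
                              // consecutive sub-tolerance improvements
```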
4.6.2 Case Study 1 - Presentation of Results
The next images show the behavior of the differential evolution algorithm in the different
global optimization runs. All the results obtained in the different optimization runs are
listed in the previous table, Table IV-3.
In Fig. IV-17 it is possible to observe the results obtained using the differential
evolution algorithm on this specific problem, in only two generations of 17 individuals
each. The algorithm avoids stopping at local minima and tries to find the best minimum
possible for the specific constraints of the supplied problem.
Fig. IV-17 – The algorithm chooses different individuals (buildings) for analysis, not
stopping at the local minima found along the optimization process, which leaves room to
jump over those minima
[Chart: Classic Shape Run #1, Admittance (W/°K) from 135 to 140 over 0 to 34 evaluations]
In Fig. IV-18 one of the individuals selected for evaluation was completely different from
the others (i.e. different values for width, height, orientation and roof angle). This
could be because the evolutionary algorithm generated a completely different individual,
or because of some “error” in the simulation process; nevertheless, the algorithm and the
framework are able to overcome such problems, and the search for better individuals
continued. The evolutionary algorithm generated other individuals that were better adapted
(smaller admittance), and the final result was similar to run 1; it just took Autodesk
Ecotect more time to finish the simulation process for the individuals generated in this
particular optimization run.
Fig. IV-18 – In this run a completely different individual (building shape) was generated, but its
admittance value was very high and other, better individuals were found
For the purpose of debugging code errors, the Fraunhofer Evolutionary Algorithm library
uses a deterministic function for random number generation (instead of a "fully" random
function).
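The idea of a deterministic "random" source can be illustrated with a seeded pseudo-random generator: the same seed replays exactly the same sequence, so a faulty optimization run can be reproduced step by step. This is a generic Python sketch, not the library's actual mechanism.

```python
import random

def make_generator(seed):
    """A deterministically seeded generator: runs started with the same seed
    can be replayed exactly, which makes optimization bugs reproducible."""
    return random.Random(seed)

gen_a = make_generator(7)
gen_b = make_generator(7)
run_a = [gen_a.random() for _ in range(5)]
run_b = [gen_b.random() for _ in range(5)]
assert run_a == run_b  # identical seeds reproduce the identical "random" run
```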
[Plot: Classic Shape Run #2 – Admittance (W/°K) vs. evaluations]
Fig. IV-19 shows the results of an optimization run in which a smaller crossover value was
used, so less diversity of individuals was generated.
However, the final result (admittance) was identical to the previous runs.
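The link between the crossover value and diversity can be illustrated with DE's binomial crossover rule: CR is roughly the fraction of parameters a trial individual inherits from the mutant, so a lower CR produces trials that stay closer to their parents. A hypothetical, self-contained sketch (not the library's code):

```python
import random

def binomial_crossover(target, mutant, CR, rng):
    """Build a trial vector: each gene comes from the mutant with probability CR
    (one gene, j_rand, always does, so the trial never equals the target)."""
    j_rand = rng.randrange(len(target))
    return [m if (rng.random() < CR or d == j_rand) else t
            for d, (t, m) in enumerate(zip(target, mutant))]

rng = random.Random(1)
target = [0.0] * 1000
mutant = [1.0] * 1000
# Summing the 1.0 genes counts how many came from the mutant
high_cr = sum(binomial_crossover(target, mutant, 0.9, rng))
low_cr = sum(binomial_crossover(target, mutant, 0.6, rng))
assert high_cr > low_cr  # a lower CR mixes in fewer mutant genes: less diversity
```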
Fig. IV-19 – A new optimization run, this time using a different value for crossover (0.6)
[Plot: Classic Shape Run #3 – Admittance (W/°K) vs. evaluations]
In Fig. IV-20, tighter constraints were used (the domains of the different problem
parameters are more constrained), and the result was worse than in the previous runs.
We should always let the global optimization do its job: the results will be better than
trying to manually guess and fine-tune where the best solution might lie in an attempt to
achieve faster results. Doing that, the result will be similar to falling into a local minimum
of a much less constrained problem (of course, this will depend on how well the problem
is constrained, and on the real constraints of the intended project).
Fig. IV-20 – Other, "tighter" constraints were chosen, and the results were slightly worse than the
initial attempts (runs 1, 2 and 3), where a wider search domain was used
[Plot: Classic Shape Run #4 – Admittance (W/°K) vs. evaluations]
Next, in Fig. IV-21, it is possible to observe the individuals generated in each optimization
run and the corresponding results.
Fig. IV-21 – Optimization runs 1, 2, 3 and 4 plotted simultaneously
Next, the "real" case study problem (original constraints) was supplied to the global
optimization framework; the search domain was much wider.
In the first runs (1, 2, 3 and 4), only 35 individuals were sufficient to find a good result,
which means the framework had to launch and complete 35 different simulations inside
Autodesk Ecotect.
However, in this "new and less constrained" problem (the domains of the problem
variables are wider), the framework needed to analyze 137 different individuals (different
shapes, with different parameters) in order to find a better solution to our original
problem.
[Plot: Optimal Forms – Classic Shape Optimizations, runs #1 to #4 – Admittance (W/°K) vs. evaluations]
Although 137 different individuals (8 generations) may not seem a huge number of
individuals to analyze, these results also show that 137 individuals were sufficient to find
a solution, with no need to spend a huge amount of time on additional Autodesk Ecotect
simulations.
Requiring more simulations would also significantly increase the overall time the
framework needs to produce a solution.
Fig. IV-22 – By allowing the algorithm to generate buildings that could use a different orientation (less
constrained regarding the orientation of the building), more evaluations were needed, but a much
better result was obtained
[Plot: Classic Shape Run #5 – Admittance (W/°K) vs. evaluations]
Optimization run number 5 was immediately followed by a consecutive optimization run
(5_1) with exactly the same constraints, in order to evaluate the consistency of the results
provided by the global optimization framework over successive runs.
Fig. IV-23 – A new consecutive optimization run (using exactly the same values) in order to check if the
behavior was consistent from run to run
[Plot: Classic Shape Run #5_1 – Admittance (W/°K) vs. evaluations]
In optimization run number 6, an optimization was conducted using the same constraints,
except for the orientation of the building, which could now vary from 0° to 360°.
The results show that, because the building has no doors or windows yet (and because the
orientation domain is set between 0° and 360°), another, different optimal solution can
exist.
Of course, this result would be different at a more advanced stage of the design process,
where the building would already have doors and windows represented; at that point only
one good solution would be possible (as when the orientation is set between 0° and
180°).
Fig. IV-24 – By allowing the algorithm to search for solutions in an orientation domain between 0° and
360°, and because the building does not have doors or windows yet, the algorithm found a good solution
by orienting the building in a different direction (compared to runs 5 and 5_1)
[Plot: Classic Shape Run #6 – Admittance (W/°K) vs. evaluations]
Finally, in Fig. IV-25 the plots of optimization runs 5, 5_1 and 6 are gathered together,
showing once more the regularity of the results achieved with the current optimization
framework, as well as the convergence of the differential evolution algorithm.
Fig. IV-25 – Plotting all the individuals generated by the global optimization framework for runs 5,
5_1 and 6 makes it possible to check the consistency of the results obtained in these more complete
optimization runs
[Plot: Optimal Forms – Classic Shape Optimizations, runs #5, #5_1 and #6 – Admittance (W/°K) vs. evaluations]
After obtaining an optimization through Optimal Forms, the developed global
optimization framework, the architect can continue completing the project and perform
other types of analyses in Autodesk Ecotect (shading and shadows, solar access/solar
exposure, lighting analysis, acoustic response, and so on).
It should be noted that Optimal Forms also exports the optimized shape to a file on the
simulation server in the .obj (Wavefront) format; the user can import this geometry into
any CAD software or into another simulation package (there is also an option in Optimal
Forms to save each one of the evaluations to a separate .obj file).
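The Wavefront .obj format itself is plain text, which is why it imports so widely: vertices are `v x y z` lines and faces are `f` lines with 1-based vertex indices. A minimal writer, as a generic illustration (not the framework's actual export code; the box geometry and file name are arbitrary examples):

```python
def write_obj(path, vertices, faces):
    """Write a minimal Wavefront .obj file: 'v x y z' lines followed by
    'f' lines whose indices are 1-based references into the vertex list."""
    with open(path, "w") as f:
        for x, y, z in vertices:
            f.write(f"v {x} {y} {z}\n")
        for face in faces:
            f.write("f " + " ".join(str(i + 1) for i in face) + "\n")

# A unit box as an example shape: 8 vertices, 6 quad faces
verts = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0),
         (0, 0, 1), (1, 0, 1), (1, 1, 1), (0, 1, 1)]
faces = [(0, 1, 2, 3), (4, 5, 6, 7), (0, 1, 5, 4),
         (1, 2, 6, 5), (2, 3, 7, 6), (3, 0, 4, 7)]
write_obj("building.obj", verts, faces)
```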
Fig. IV-26 – Case Study 1 (Final Classic Building Shape) – Daily and annual sun paths + shadows and
daylight levels at 14:00 on the 8th of September (based on specific climate/weather files for
Coimbra, Portugal); Width = 20 m; Height = 6 m; Roof Angle = 60°; Volume = 1000 m³; Orientation =
122.10°
4.6.3 Case Study 2 – Cube Shape Building Optimization
In this second case study, a proof of concept is presented that recreates the normal
decision process of an architect when designing a new building (this time based on the
"cube shape form" of a home) to be constructed in the city of Coimbra, Portugal. It is
assumed that the design project is still in an early design phase. Once again, there is
already a rough idea of the final shape of the building, and there are also some important
project constraints on the building dimensions (width, height, etc.) that are fundamental
to the future construction of the projected building in its specific physical location.
In this specific case study, the assumed constraints of the future building were:
Width must be between 8 meters (m) and 10 meters (m);
Height must be between 5 m and 6 m;
Volume must be 1000 m³; in order to analyze the energy dependence between form
and volume, the shape coefficient is defined as the ratio between the external skin
surfaces and the inner volume of the building, following Christina Lemke [88],
Bansal, N., et al. [90] and Depecker, P., et al. [91];
Length is calculated automatically, as a relation between width, height and
volume;
Orientation of the building can vary between 0° and 180°;
Location: the city of Coimbra, Portugal (region-specific weather data was loaded into the
simulation tool used, i.e. Autodesk Ecotect).
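The two derived quantities above can be written down directly: the length follows from the fixed volume, and the shape coefficient is the envelope-area-to-volume ratio. The sketch below assumes a plain rectangular box; the actual model's skin surface (roof geometry, for instance) may differ.

```python
def derived_length(width, height, volume=1000.0):
    """Length is not a free parameter: it follows from volume = width * height * length."""
    return volume / (width * height)

def shape_coefficient(width, height, length):
    """Ratio of external skin surface to enclosed volume, for a rectangular box."""
    area = 2 * (width * length + width * height + height * length)
    volume = width * height * length
    return area / volume

w, h = 10.0, 6.0
length = derived_length(w, h)          # the free dimensions fix the third one
coeff = shape_coefficient(w, h, length)
```

For w = 10 m and h = 6 m, the length comes out to 1000 / 60 ≈ 16.67 m, so every candidate the optimizer generates automatically satisfies the 1000 m³ volume constraint.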
One optimization run is presented here for this specific second case study, using the "Cube
Shape Form" as the assumed final building form.
Table IV-4 – An optimization run using the developed framework to optimize the design of a "Cube
Shape Form" building with the Differential Evolution (DE) algorithm. The table also shows the
different constraints and parameters used in the optimization process
In Table IV-4, it is possible to analyze once again the project constraints (domain values)
used in each optimization run (i.e. the Results Order column), as well as the different
settings used to configure the evolutionary algorithm.
As already seen in the previous case study, and according to Storn and Price [26], a
crossover value of 0.9 is always a good initial value to check whether the algorithm can
converge quickly on the given problem.
Table IV-5 – Results obtained in each of the optimization runs (according to Table IV-4)
To solve this optimization problem, the global optimization framework and Autodesk
Ecotect worked for 10 hours and 35 minutes (800 evaluations, 47 generations) on an old
Dell Precision M90 laptop (Intel Core 2 Duo T7600, 2.33 GHz) with 3 GB of memory.
Autodesk Ecotect took much more time to perform each thermal analysis on the Cube
Shape Form.
Fig. IV-27 – At evaluation 289 the optimization framework had already achieved a very good result
(compared to the final result), but because the stop criterion used was 1000 evaluations or dx = 1.0,
the optimization continued until dx = 1.0 was reached four consecutive times
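A stop criterion of this kind (evaluation budget spent, or improvement below a tolerance dx over several consecutive generations) can be sketched as follows. This is a generic illustration; the parameter names and the exact bookkeeping are assumptions, not the framework's actual implementation.

```python
def should_stop(best_per_generation, evaluations_used,
                max_evaluations=1000, dx=1.0, patience=4):
    """Terminate when the evaluation budget is spent, or when the best value
    improved by less than dx over the last 'patience' consecutive generations."""
    if evaluations_used >= max_evaluations:
        return True
    if len(best_per_generation) <= patience:
        return False  # not enough history yet to judge stagnation
    tail = best_per_generation[-(patience + 1):]
    # Stop only if every one of the last 'patience' improvements is below dx
    return all(prev - curr < dx for prev, curr in zip(tail, tail[1:]))
```

For example, a history of best admittances ending in four improvements of less than 1.0 W/°K each would trigger the stop, while a single large drop anywhere in that tail resets the decision.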
[Plot: Optimal Forms – Cube Shape Optimization, Run #1 – Admittance (W/°K) vs. evaluations, up to 800 evaluations]
Fig. IV-28 – Case Study 2 (Final Cube Building Shape) – Daily and annual sun paths + shadows and
total radiation levels at 14:00 on the 8th of September (based on specific climate/weather files
for Coimbra, Portugal); Width = 10 m; Height = 6 m; Volume = 1000 m³; Orientation = 121.86°
After obtaining this optimization through Optimal Forms, the architect can once more
continue completing the project and perform other types of analyses in Autodesk Ecotect.
In Fig. IV-28, it is possible to observe the shadow produced by the building at 14:00 on
the 8th of September, as well as the total radiation that reaches all walls of the building
(information that could also be used for a future installation of solar panels).
V. Conclusions and Future Work
5 Summary
Architects and engineers can use digital design information to better analyze and
understand how their projects will perform before they are actually built. Generating and
evaluating multiple alternatives simultaneously and autonomously makes comparison
easier and faster. In the end, it allows better and more informed sustainable design
decisions, which is essential and contributes to solving society's green building problem.
The global optimization framework presented here cannot yet be considered the solution
for all optimization problems, as the evolutionary algorithm used cannot, in principle,
guarantee convergence or optimal solutions, and computation times can be long before an
acceptable solution is found and presented (depending on the problem and on its number
of parameters, convergence can be slow). However, in the tests and optimization runs
conducted, the results obtained were very good and promising: the algorithm always
converged, achieving interesting and useful results in a fairly short period of time. The
work developed can be seen as an important step towards the future development of an
automated, more complete tool that can help architects and engineers save time and money
at an early stage of a project, allowing the analysis of different, "not yet known" solutions
to a specific problem. Compared to the usual solutions available to architects and
engineers, where several manual tries and empirical knowledge are employed in the
attempt to find a good or better solution, the proposal presented in this work is a huge
improvement.
5.1 Final Conclusions
As far as is known, this is the first time that a framework gathering generative modeling,
simulation tools and evolutionary algorithms has been developed and presented.
The results obtained have already received very positive feedback from Fraunhofer, which
considered this work unique, highly valid and extremely promising.
It is not really so important which "energy" comparison methods or simulation tools are
used at this stage: we could use admittance, conductance or any other method, and we
could also use Ansys, EnergyPlus or any other simulation tool.
Nevertheless, there are other conclusions to draw from the study presented, such as that
evolutionary algorithms can indeed be used to create better tools to help architects and
engineers design new buildings. This would allow the creation of more sustainable and
environmentally friendly buildings, in a faster and more independent way.
The results obtained with this newly developed optimization framework are indeed very
promising, and the simple case studies presented here demonstrate the immediate
applicability of such a framework.
It also makes perfect sense to create an "intelligent" framework that allows the use of
different methods and simulation tools, and that can work in an autonomous way, with the
objective of tackling several different real problems, not only related to architecture but to
other fields as well (medicine, finance, engineering in general, and so on).
Of course, more tests and developments are needed, and if a different problem is presented,
one then has to decide on the best strategy or algorithm to use. The library provided by
Fraunhofer contains several optimization algorithms that can be used to attack different
problems.
More parameters or constraints can and must be added to create more "commercial and
user friendly" software that can incorporate other, more complete optimization problems.
5.2 Future Work
Possible future directions for research following on from this study can be undertaken in
six main areas:
Incorporation of more simulation variables (different materials, HVAC, etc.) and the
creation of different, more complex 3D shapes;
Simulation of the interaction between several buildings simultaneously (a street or
a complete city generated procedurally);
Implementation of other algorithms for solving different problems;
Integration with other simulation packages;
Integration of procedural methods and evolutionary algorithms inside CAD tools;
Use of the developed global optimization framework and its optimization methods
for the optimization of 3D geometry, applied to different fields of interest (e.g.
industrial, medical, automotive, etc.).
Other new algorithmic approaches, and more specifically some new hybrid algorithm
approaches, can also be implemented in order to find better and faster ways to solve
different problems. Currently, the developed global optimization framework is based on
one of the most commonly used simulation tools on the market in the architecture field,
i.e. Autodesk Ecotect. This tool also works as a doorway to simulation in other tools
related to energy efficiency, like Autodesk Green Building Studio, the new Vasari project,
EnergyPlus, Radiance and many other simulation tools. Still, other simulation packages
(e.g. Ansys or Abaqus) can also be integrated into this global optimization framework as a
way to tackle other specific problems (e.g. CFD, CFX, or the flow/motion of persons in
urban environments). This work is not "the solution" to all building construction problems,
but it represents a big and major step forward in using procedural shape generation and
evolutionary algorithms to design a better and a much more sustainable world.
REFERENCES
[1] Kim, S. and D. Weissmann, "Middleware-based Integration of Multiple CAD and PDM
Systems into Virtual Reality Environment", Computer-Aided Design & Applications, vol.
3, pp. 547-556, 2006.
[2] Ullrich, T., et al., "Semantic Fitting & Reconstruction", ACM Journal on Computing and
Cultural Heritage, p. 20, 2008.
[3] Ganster, B. and R. Klein, "An Integrated Framework for Procedural Modeling", 2007.
[4] Parish, Y. I. H. and P. Müller, "Procedural Modeling of Cities", 2001.
[5] Couturier, A. (2011, 01-06-2011). Suicidator City Engine for Blender - Create 3D Cities
with a single click. Available: http://arnaud.ile.nc/sce/
[6] Müller, P., et al., "Procedural Modeling of Buildings", 2006.
[7] User, C. (2007). GenerativeComponents from Bentley. Available: www.caduser.com
[8] (2011, 20-05-2011). Grasshopper: Generative modelling for Rhino. Available:
http://www.grasshopper3d.com/
[9] Berndt, R., et al., "Generative 3D Models: A Key to More Information within Less
Bandwidth at Higher Quality", ACM Siggraph, pp. 111-122, 2005.
[10] Hohmann, B., et al., Proceedings of the 3rd ISPRS International Workshop 3D-ARCH
2009: ISPRS, 2009.
[11] Fellner, D. W., et al., Vision, Modeling and Visualization 2007. Proceedings: UNI Trier,
2007.
[12] Strobl, M., et al., "Euclides – A JavaScript to PostScript Translator", in Proceedings of
the International Conference on Computational Logics, Algebras, Programming, Tools,
and Benchmarking (Computation Tools) ed, 2010, pp. 1-8.
[13] Schinko, C., et al., "Digital Heritage. Modeling Procedural Knowledge: A Generative
Modeler for Cultural Heritage", presented at the Third International Conference,
EuroMed 2010., Berlin, Heidelberg, New York, 2010.
[14] Autodesk, "Conceptual Design and Analysis in Autodesk Revit Architecture 2011",
Autodesk, 2010.
[15] Louvain, U. c. d., "State of the Art of Existing Early Design Simulation Tools for Net Zero
Energy Buildings: A Comparison of Ten Tools", Université catholique de Louvain,
Louvain, 2011.
[16] Autodesk, Project Nucleus: Autodesk, 2010.
[17] Nanneman, K., "Ins and Outs of Using Adaptivity in Autodesk Inventor", Autodesk,
Orlando, 2005.
[18] Ruten, D. (2010, 05-06-2011). Evolutionary Principles applied to Problem Solving.
Available: http://www.grasshopper3d.com/profiles/blogs/evolutionary-principles
[19] Fogel, L. J., "On the Organization of Intellect", Phd, University of California, Los Angeles,
1965.
[20] CIS, I. (2011). IEEE Computational Intelligence Society, Nature-Inspired Problem
Solving. Available: http://ieee-cis.org/awards/recipients/
[21] Dawkins, R., The Blind Watchmaker: Why the Evidence of Evolution Reveals a Universe
without Design: W. W. Norton, 1986.
[22] Grasshopper. (2011, 05-06-2011). Library, Algorithms and Architecture. Available:
http://www.grasshopper3d.com/page/library-algorithms-and
[23] Frazer, J., An Evolutionary Architecture: John Frazer and the Architectural Association,
1995.
[24] Tsui, E., Evolutionary Architecture: Nature as a Basis for Design: Wiley, 1999.
[25] Ruano, D. S. (2011, 05-06-2011). Evolutionary Architecture | Arquitectura Evolutiva.
Available: http://biodsign.wordpress.com/2009/09/26/evolutionary-architecture-
arquitectura-evolutiva/
[26] Storn, R. and K. Price, "Differential Evolution – A Simple and Efficient Heuristic for
Global Optimization over Continuous Spaces", Journal of Global Optimization, pp. 341-
359, 1997.
[27] Sun, J., et al., "DE/EDA - A new Evolutionary Algorithm for Global Optimization",
Information Sciences, pp. 249–262, 2005.
[28] Price, K. S., et al., Differential Evolution - A Practical Approach to Global Optimization
2005.
[29] Greenberg, E., "Master of Science Candidate, Emergent Technologies and Design
Program, The Architectural Association School of Architecture", presented at the ACADIA
2008: SILICON + SKIN, EVOLUTIONARY COMPUTATION, 2008.
[30] Kawakita, G., "Environmental Optimisation Methods in Sustainable Design Process: In
Combination with Evolution-Based Digital Technology", Master of Science Master of
Science in Energy Efficient and Sustainable Building, Department of Architecture, Oxford
Brookes University, Oxford, 2008.
[31] Pedersen, M., "Good Parameters for Differential Evolution", Hvass Laboratories, 2010.
[32] Price, K. S., et al., "Differential Evolution - A Practical Approach to Global Optimization",
2005.
[33] Corana, A., et al., "Minimizing Multimodal Functions of Continuous Variables with the
Simulated Annealing Algorithm", ACM Transactions on Mathematical Software, pp. 272-
280, 1987.
[34] Ko, C. L., et al., "Hybrid differential evolution Estimating Parameters for Generalized
Mass Action Models with Connectivity Information", BioMed Central, pp. 1-9, 2009.
[35] Multi-Touch Technologies - NUI Group, 2009.
[36] Müller-Tomfelde, C., Tabletops - Horizontal Interactive Displays vol. XXIV: Springer,
2010.
[37] Silva, N., "MTT4ALL – An Integral Project For The Development Of A Multitouch
Equipment", in CDRSP, I. P. d. L. (CDRSP), Ed., ed. Marinha Grande: CDRSP, 2010, p.
1.
[38] Jung, Y., et al., "Adapting X3D for Multi-touch Environments", in Web3D '08:
Proceedings of the 13th international symposium on 3D web technology, 2008.
[39] Wikipedia. (01-02-2011). Interactive Visualization - Virtual Reality. Available:
http://en.wikipedia.org/wiki/Interactive_visualization
[40] Azuma, R., "A Survey of Augmented Reality", presented at the Presence: Teleoperators
and Virtual Environments 6, 1997.
[41] metaio. (2011, 07-01-2011). metaio - Augmented Reality. Available:
http://www.metaio.com/
[42] IGD, F. (2011, 31-02-2011). Virtual museum guide. Available:
http://www.fraunhofer.de/en/press/research-news/2010-2011/02/virtual-museum-
guide.jsp
[43] Lancelle, M., et al., "Definitely Affordable Virtual Environment - DAVE", in IEEE, 2008, p.
1.
[44] Slinger, C., et al., "Computer-Generated Holography as a Generic Display Technology",
Cover Feature - IEEE Computer Society, pp. 46-53, 2005.
[45] Hoshi, T., et al., "Touchable Holography", presented at the SIGRAPH 2009, 2009.
[46] Wikipedia. (2011, 01-06-2011). Rendering (computer graphics). Available:
http://www.renderit.cc/services/photo-realistic-architectural-rendering.html
[47] iCreate3D. (2011, 04-06-2011). Interactive Architectural Visualization and 3D
Architectural Renderings using iViewer. Available: http://www.icreate3d.com/
[48] Steinicke, F., et al., "A generic virtual reality software system's architecture and
application", presented at the ICAT '05 Proceedings of the 2005 international conference
on Augmented tele-existence, New York, NY, USA, 2005.
[49] Behr, J. and D. Reiners, "Class Notes: Don’t be a WIMP", in SIGGRAPH 2008, 2008, p.
170.
[50] Gobbetti, E. and R. Scateni, "Virtual Reality: Past, Present, and Future", in for Advanced
Studies, Research and Development, ed, 1998.
[51] Silva, N., et al. Wireless Sensor Network Implementations - Recent Developments
(Revision Article) [Online]. Available: http://www.mendeley.com/research/wireless-
sensor-network-implementations/
[52] Lifton, J., et al., "Tricorder: A mobile sensor network browser", presented at the CHI
2007, 2007.
[53] Goldsmith, D., et al., "Augmented Reality Environmental Monitoring Using Wireless
Sensor Networks", in IV '08 Proceedings of the 2008 12th International Conference
Information Visualisation, 2008.
[54] Llewelyn, G. (2008). IBM Puts Data Centre in OpenSim.
[55] MacManus, R., "Cross Reality: When Sensors Meet Virtual Reality", Pervasive Computing
- IEE, vol. 8, p. 1, 2009.
[56] Landay, J. and J. A. Paradiso, "Cross-Reality Environments", Pervasive Computing -
IEEE, vol. 8, pp. 14-15, 2009.
[57] Coleman, B., "Using Sensor Inputs to Affect Virtual and Real Environments", Pervasive
Computing - IEEE, vol. 8, pp. 16-23, 2009.
[58] J. Lifton, J., et al., "Metaphor and Manifestation Cross-Reality with Ubiquitous
Sensor/Actuator Networks", Pervasive Computing - IEEE, vol. 8, pp. 24-33, 2009.
[59] Dindar, N., et al., "Event Processing Support for Cross-Reality Environments", Pervasive
Computing - IEEE, vol. 8, pp. 34-41, 2009.
[60] Settgast, V., et al., "Spatially Coherent Visualization of Image Detection Results using
Video Textures", presented at the 33rd Workshop of the Austrian Association for Pattern
Recognition, Austria, 2009.
[61] Senagala, M., "KINETIC, RESPONSIVE AND ADAPTIVE: A COMPLEX-ADAPTIVE
APPROACH TO SMART ARCHITECTURE", presented at the SIGRADI 2005, Lima, Peru,
2005.
[62] Elsacker, E. and Y. Bontinckx. (2011, 26-05-2011). Kinetic Pavilion. Available:
http://www.kineticpavilion.com/
[63] Ahrens, C. H., et al., "Prototipagem Rápida: Tecnologias e Aplicações.", 2007.
[64] Alves, N. M. and P. Bártolo, "Computer rapid design I: accuracy analysis", International
journal of product development, pp. 183-202, 2004a.
[65] Alves, N. M. and P. Bártolo, "Computer rapid design I: accuracy analysis", International
journal of product development, pp. 203-214, 2004b.
[66] Alves, N., et al., "Engenharia Inversa: Aquisição e inspecção de formas 3D por técnica
de contacto", 2006.
[67] Bidanda, B., et al., "Reverse Engineering: an evaluation of perspective non-contact
technologies and applications in manufacturing systems", 1991.
[68] IPL/CDRSP, "Relatório técnico-científico do Projecto MaquetaPlus (União Europeia -
QREN)", IPL/CDRSP, Marinha Grande, 2010.
[69] Vasconcelos, P. M., "Fabrico Rápido Indirecto de Ferramentas Compósitas a partir de
modelos de Prototipagem Rápida", 2004.
[70] Ferreira, R., et al., "Agile-CAD for reverse engineering", Virtual and rapid
manufacturing, 2008.
[71] Associates, W., "Wohlers Report 2010", Wohlers Associates, 2010.
[72] Lipson, H. and M. Kurman, "Factory @ Home: The Emerging Economy of Personal
Fabrication (Overview and Recommendations)", US Office of Science and Technology
Policy, 2010.
[73] Gmelin, S. and K. Agger, "Complex Geometry in Architecture based on Building
Information Modelling", Aarhus School of Architecture, 2010.
[74] Construction, M.-H., "The Business Value of BIM in Europe - Getting Building
Information Modeling to the Bottom Line in the United Kingdom, France and Germany",
2010.
[75] Construction, M.-H., "The Business Value of BIM in Europe - Getting Building
Information Modeling to the Bottom Line", 2010.
[76] Construction, M.-H., "Building Information Modeling (BIM) - Transforming Design and
Construction to Achieve Greater Industry Productivity", 2010.
[77] Construction, M.-H., "Green BIM - How Building Information Modeling is Contributing to
Green Design and Construction", 2010.
[78] ImpactLab. (2008, 02-06-2011). Automated Construction System To Change Face of
Home-Building. Available: http://www.impactlab.net/2008/09/10/automated-
construction-system-to-change-face-of-home-building/
[79] Khoshnevis, B., "Automated Construction by Contour Crafting (CC)", presented at the
8th International Conference on Rapid Prototyping, Tokyo, Japan, 2001.
[80] Attia, S., et al., "ARCHITECT FRIENDLY: A COMPARISON OF TEN DIFFERENT BUILDING
PERFORMANCE SIMULATION TOOLS", Eleventh International IBPSA Conference, pp.
204-211, 2009.
[81] Crawley, D. B., et al., "Contrasting the Capabilities of Building Energy Performance
Simulation Programs", 2005.
[82] Autodesk, "Getting Started with Green Building Studio 2011", Autodesk, 2011.
[83] Chinnayeluka, S. R., "PERFORMANCE ASSESSMENT OF INNOVATIVE FRAMING SYSTEMS
THROUGH BUILDING INFORMATION MODELING BASED ENERGY SIMULATION", Master
of Science in Industrial Engineering, Department of Construction Management and
Industrial Engineering, Faculty of the Louisiana State University and the Agricultural and
Mechanical College B.E, Osmania University, 2011.
[84] Ansys, I., "ANSYS Simulation Software Products are Effective Green Building Design
Tools", Ansys, Inc., 2009.
[85] ALware, "3D Airflow - Simulation of Room Air Flow", 2010.
[86] Associates, C., "ANSYS/CivilFEM Overview", CAE Associates, 2009.
[87] Dixon, T., et al., "A Green Profession?: An Audit of Sustainability Tools, Techniques and
Information for RICS Members", Oxford Brookes, Georgia State University, University of
Melbourne and King Sturge, 2007.
[88] Lemke, C. R., "ArchitekturForm & SolarEnergie", Master Doctor, Promotion Committee,
Technischen Universität Hamburg, Hamburg, 2009.
[89] AlAnzi, A., et al., "Impact of building shape on thermal performance of office buildings in
Kuwait", Energy Conversion and Management, vol. 50, pp. 822-828, 2009.
[90] Bansal, N. K. and A. Bhattacharya, "Parametric equations for energy and load
estimations for buildings in India", Applied Thermal Engineering, vol. 29, pp. 3710-
3715, 2009.
[91] Depecker, P., et al., "Design of buildings shape and energetic consumption", Building
and Environment, vol. 36, pp. 627-635, 2001.
[92] Jedrzejuk, H. and W. Marks, "Optimization of shape and functional structure of buildings
as well as heat source utilisation example", Building and Environment, vol. 37, pp.
1249-1253, 2002.
[93] Jedrzejuk, H. and W. Marks, "Optimization of shape and functional structure of buildings
as well as heat source utilization. Basic theory", Building and Environment, vol. 37, pp.
1379-1383, 2002.
[94] Jedrzejuk, H. and W. Marks, "Optimization of shape and functional structure of buildings
as well as heat source utilisation. Partial problems solution", Building and Environment,
vol. 37, pp. 1037-1043, 2002.
[95] Okeil, A., "A holistic approach to energy efficient building forms", Energy and Buildings,
vol. 42, pp. 1437-1444, 2010.
[96] Ourghi, R., et al., "A simplified analysis method to predict the impact of shape on annual
energy use for office buildings", Energy Conversion and Management, vol. 48, pp. 300-
305, 2007.
[97] Sethi, V. P., "On the selection of shape and orientation of a greenhouse: Thermal
modeling and experimental validation", Solar Energy, vol. 83, pp. 21-38, 2009.
[98] Aldomar Pedrini, S. S., "The Architect's Approach To The Project Of Energy Efficient
Office Buildings In Warm Climate And The Importance Of Design Methods", presented
at the Ninth International IBPSA Conference, Montréal, Canada, 2005.
[99] (2011, 01-01-2011). ARCHIVED ECOTECT RESOURCES - COMMUNITY WIKI. Available:
http://wiki.naturalfrequency.com/wiki/Concepts
[100] (2011, 01-01-2011). Ecotect LUA Resources. Available:
http://ecotectlua.wordpress.com/
[101] (2011, 01-01-2011). Performative Design. Available:
http://www.drajmarsh.com/topics/faq-topic/autodesk-ecotect
[102] (2011, 01-01-2011). Lua Scripting for Ecotect. Available:
http://matthewburke.com/courses/ecotect/
[103] (2011, 01-01-2011). The Proving Ground Wiki. Available:
http://theprovingground.wikidot.com/ecotect-dde
[104] Peiris, C. (2011, 02-01-2011). Creating a .NET Web Service. Available:
http://www.15seconds.com/issue/010430.htm
[105] (2011, 03-01-2011). Advanced Web Service Interoperability using NetBeans. Available:
http://netbeans.org/kb/docs/websvc/wsit.html
[106] IBPSA, "IBPSA News", The journal of the International Building Performance Simulation
Association, vol. 14, p. 69, 2004.
[107] Hensen, J. and M. Radošević, "Teaching building performance simulation - some quality
assurance issues and experiences", presented at the The 21st Conference on Passive
and Low Energy Architecture, Proc. PLEA, 2004.
[108] CIBSE, "Environmental design, CIBSE Guide A", Chartered Institution of Building
Services Engineers, 1999.
[109] Center, M.-T. C., "Dynamic Thermal Properties Calculator", MPA, 2010.