OPOSSUM
Introducing and Evaluating a Model-based Optimization Tool for Grasshopper
THOMAS WORTMANN
Singapore University of Technology and Design, Singapore
thomas_wortmann@mymail.sutd.edu.sg
Abstract. This paper presents Opossum, a new optimization plug-in for Grasshopper, a visual data-flow modelling software popular among architects. Opossum is the first publicly available, model-based optimization tool aimed at architectural design optimization and especially applicable to problems that involve time-intensive simulations of, for example, daylighting and building energy. The paper details Opossum's design and implementation and compares its performance to four single-objective solvers and one multi-objective solver. The test problem is time-intensive and simulation-based: optimizing a screened façade for daylight and glare. Opossum outperforms the other single-objective solvers and finds the most accurate approximation of the Pareto front.
Keywords. Design Tool; Architectural Design Optimization;
Model-based Optimization; Sustainable Design.
1. Architectural Design Optimization
This paper presents Opossum, a new optimization plug-in for Grasshopper, a visual data-flow modelling software popular among architects. Opossum is the first publicly available (from www.food4rhino.com) model-based optimization tool aimed at architectural design optimization (ADO). It is especially applicable to problems that involve time-intensive performance simulations of, for example, daylighting and building energy. Such simulations play an increasingly large role in architectural design processes and were, for example, employed by the designers of the Louvre Abu Dhabi (Imbert et al. 2013). Model-based (or surrogate-based) optimization methods find good results with small numbers of simulations (Holmström et al. 2008; Costa & Nannicini 2014; Wortmann & Nannicini 2016). This high speed of convergence is important for sustainable design problems such as daylighting and building energy, where a single simulation takes several minutes or hours to complete. In such cases, it is impractical to perform the thousands of simulations required by population-based metaheuristics such as genetic algorithms (GAs).
But model-based methods are rarely used in ADO (Evins 2013). Grasshopper and other architectural parametric design software, such as Dynamo (Asl et al. 2015) and DesignBuilder (Singh & Kensek 2014), come equipped only with metaheuristics. Opossum fills this gap by making model-based optimization available to a wider audience and accessible to non-experts.
1.1. GLOBAL BLACK-BOX OPTIMIZATION
Simulation-based optimization problems define the relationship between variables and performance objectives not with an explicit mathematical function, but by evaluating a parametric model with numerical simulations. This relationship often exhibits local optima and complex, non-linear dependencies between variables. Global black-box (or derivative-free) optimization methods do not require a mathematical formulation and therefore are particularly appropriate for simulation-based ADO problems. Such methods balance gaining a global overview of the design space with focusing the search on promising regions to find the best solution. There are three categories of black-box methods: (1) direct search, (2) model-based methods and (3) metaheuristics. Global direct search and model-based methods alternate between global and local search, while metaheuristics narrow an initially global search to an increasingly local one as the optimization progresses.
1.1.1. Direct Search
Direct search methods evaluate design candidates in a deterministic sequence. The Hooke-Jeeves and Nelder-Mead simplex algorithms (Nocedal & Wright 2006) are classic examples of local direct search algorithms. DIRECT (Jones et al. 1993) is a global direct search algorithm that recursively subdivides the design space into hyper-boxes. This paper tests the implementation of DIRECT in the free, open-source NLopt library (Johnson 2010).
1.1.2. Model-based Methods
Model-based methods employ surrogate models (explicit estimates of the implicit mathematical formulations of black-box problems) to guide the search for good solutions. Trust region methods (Nocedal & Wright 2006) employ local models, while more recent, global model-based methods model design spaces completely. Global methods construct models with a variety of statistical (e.g. polynomial regression and Kriging) and machine learning-related (e.g. neural networks and support vector machines) techniques (Koziel et al. 2011). Opossum, the optimization tool presented here, approximates the design space using radial basis functions (Gutmann 2001). Surrogate models accelerate optimization processes, since they are much faster to calculate than the underlying simulations. Approaches that completely replace time-intensive simulations with surrogate models (e.g. Yang et al. 2016) and then apply optimization methods are limited by the models' initial precision. Increasing the models' precision requires a larger sample size, which can negate the initial speed advantage. In contrast, model-based methods iteratively build and refine models during the optimization process.
One optimization step consists of (1) searching the model for a promising solution to evaluate, (2) simulating the found solution and (3) updating the model based on the simulation results. In this way, model-based methods continuously increase the model's accuracy. The ten runs of the model-based algorithm in the benchmark below each constructed and updated a surrogate model from 200 simulated solutions. In testing the accuracy of these models against all 10,200 solutions simulated during the benchmark, the maximum deviation was 53% of the true objective value, but the mean deviation was only 11% of the true value. Global model-based methods are particularly effective for optimizing problems with costly evaluations (for example, time-intensive simulations) and complex relationships between variables and objective (Holmström et al. 2008). This effectiveness, combined with opportunities for visualization and interaction afforded by the surrogate model, makes model-based optimization attractive for ADO (Wortmann et al. 2015). Opossum, the optimization tool presented here, provides an easy-to-use interface to RBFOpt (Costa & Nannicini 2014), a state-of-the-art model-based optimization library. In the 2015 GECCO Black-Box Optimization Competition, which considered 1,000 mathematical benchmark problems with two to sixty-four variables, RBFOpt ranked first among the open-source solvers (Loshchilov & Glasmacher 2015).
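The three-step loop described above can be illustrated compactly. The following is a minimal sketch, using SciPy's RBFInterpolator and Latin hypercube sampling as stand-ins for RBFOpt's surrogate and initial sampling; the simple exploit/explore alternation is a simplified assumption, not RBFOpt's actual search strategy.

```python
# Minimal sketch of a model-based optimization loop (not Opossum's actual
# implementation): fit a radial-basis-function surrogate, search it for a
# promising point, evaluate the expensive function there, refit, repeat.
import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.stats import qmc

def expensive_simulation(x):
    """Stand-in for a time-intensive daylight or energy simulation."""
    return np.sum((x - 0.3) ** 2) + 0.1 * np.sin(10 * x).sum()

dim, n_init, budget = 4, 10, 40
rng = np.random.default_rng(0)

# Initial quasi-random sample (Latin hypercube, as in RBFOpt's first phase).
X = qmc.LatinHypercube(d=dim, seed=0).random(n_init)
y = np.array([expensive_simulation(x) for x in X])

for it in range(budget - n_init):
    model = RBFInterpolator(X, y, kernel="thin_plate_spline")
    # Cheap global search on the surrogate: score many random candidates.
    cand = rng.random((5000, dim))
    scores = model(cand)
    # Alternate exploitation (surrogate minimum) with exploration (random).
    x_next = cand[np.argmin(scores)] if it % 4 else rng.random(dim)
    X = np.vstack([X, x_next])
    y = np.append(y, expensive_simulation(x_next))

print("best objective:", y.min())
```

Each iteration costs one simulation; the surrogate search itself is nearly free, which is the source of the speed advantage on expensive problems.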
1.1.3. Metaheuristics
Population-based metaheuristics (Talbi 2009) start with randomly generated populations of design candidates that they improve heuristically. Unlike direct search and model-based methods, metaheuristics do not rely on mathematical proofs of convergence but draw their inspiration from natural processes, such as genetic evolution or "swarm intelligence". Due to this lack of rigor and poor performance on benchmarks (Rios & Sahinidis 2013; Costa & Nannicini 2014), the mathematical optimization community regards metaheuristics as "methods of last resort" (Conn et al. 2009). Nevertheless, metaheuristics are the most popular category for ADO, and GAs the most popular algorithm (Evins 2013). This popularity is due to their relative ease of implementation, their wide availability, and a perception that metaheuristics are especially appropriate for complex problems with multiple optima (e.g. Attia et al. 2013; Evins 2013). This paper tests implementations of single- and multi-objective GAs and of particle swarm optimization (PSO) and simulated annealing (SA).
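For contrast with the model-based loop above, the following is a generic sketch of a population-based metaheuristic: a simple GA with truncation selection, uniform crossover and Gaussian mutation. It is not the algorithm implemented by Galapagos or any other solver tested here.

```python
# Generic sketch of a population-based metaheuristic (a simple genetic
# algorithm) minimizing f over the unit box; every generation costs
# pop_size function evaluations, i.e. simulations.
import numpy as np

def evolve(f, dim=4, pop_size=25, generations=8, seed=0):
    rng = np.random.default_rng(seed)
    pop = rng.random((pop_size, dim))
    for _ in range(generations):
        fitness = np.array([f(x) for x in pop])
        # Truncation selection: keep the better half as parents.
        parents = pop[np.argsort(fitness)][: pop_size // 2]
        # Uniform crossover of two random parents, plus Gaussian mutation.
        idx = rng.integers(0, len(parents), (pop_size, 2))
        mask = rng.random((pop_size, dim)) < 0.5
        children = np.where(mask, parents[idx[:, 0]], parents[idx[:, 1]])
        children += rng.normal(0, 0.05, children.shape)
        pop = np.clip(children, 0.0, 1.0)
    fitness = np.array([f(x) for x in pop])
    return pop[np.argmin(fitness)], fitness.min()
```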
1.2. PARETO-BASED OPTIMIZATION
Multi-objective optimization aims to find good values for more than one performance objective. The problem in this paper considers daylight and glare, but combines the two into a single objective by subtracting one from the other. In other words, it defines an a priori weighting between daylight and glare. Pareto-based multi-objective algorithms, which often are GAs, do not define such weightings, but instead try to satisfy all objectives as much as possible. Often, there is a tradeoff between objectives: e.g. allowing more daylight into a room can lead to more glare. Pareto-based optimization illuminates such tradeoffs by searching for non-dominated solutions, i.e., solutions where improving one of the objectives is only possible by worsening others, and graphing them on the Pareto front.
Compared to single-objective optimization, Pareto-based optimization is a less established field, with a smaller number of algorithms, proofs of convergence and experimental results. Nevertheless, it is popular in ADO (Evins 2013). Radford and Gero (1980) suggest an affinity between the numerous tradeoffs addressed by architectural design and the smaller number of objectives addressed by multi-objective optimization. But Pareto-based optimization is less efficient than its single-objective counterpart, since the former focuses not only on finding good solutions, but also on achieving a good spread of solutions along the Pareto front. In the benchmark of a building energy problem with two objectives by Hamdy et al. (2016), it took 1400-1800 function evaluations, i.e. simulations, for the Pareto fronts to stabilize. To explore this efficiency difference, this paper compares five single-objective algorithms with the multi-objective HypE (Bader and Zitzler 2008). It evaluates HypE's performance as a single-objective algorithm and compares the Pareto front found by HypE with the fronts found implicitly by the single-objective algorithms. Extensions of model-based methods to Pareto-based optimization exist: Knowles (2006) recalculates a single surrogate model with different weightings at every iteration, while Akhtar and Shoemaker (2016) employ one surrogate model for every performance objective.
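Non-dominated filtering is straightforward to implement. The sketch below assumes both objectives are formulated for minimization (e.g. 1 - UDI and DGP), which matches how the fronts in this paper can be extracted from logged results.

```python
# Sketch of extracting the non-dominated (Pareto) set from logged results,
# assuming both objectives are to be minimized.
import numpy as np

def pareto_front(points):
    """Return the subset of points not dominated by any other point."""
    pts = np.asarray(points)
    keep = []
    for i, p in enumerate(pts):
        # p is dominated if some q is <= p everywhere and < p somewhere.
        dominated = np.any(np.all(pts <= p, axis=1) & np.any(pts < p, axis=1))
        if not dominated:
            keep.append(i)
    return pts[keep]

# Example: (1 - UDI, average DGP) pairs for four solutions.
print(pareto_front([[0.2, 0.30], [0.25, 0.24], [0.3, 0.35], [0.2, 0.29]]))
```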
2. Opossum: a New Model-based Optimization Tool
RBFOpt is programmed in Python 2.7 and relies on libraries for numerical computations (NumPy and SciPy) and auxiliary optimization (Pyomo). But Grasshopper supports only IronPython, a Python variant integrated with Microsoft's .NET framework that does not support these libraries. Opossum is therefore written in C#, which Grasshopper supports. The C# program starts an external Python 2.7 process that runs RBFOpt, and Opossum and RBFOpt exchange data via a (hidden) command line window.
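The paper does not specify the message format of this exchange; the line-based ask/tell protocol below is invented purely for illustration, with a Python parent standing in for Opossum's C# component and `solver_worker.py` as a hypothetical wrapper around the solver process.

```python
# Illustrative sketch of the process-based coupling described above: a parent
# process launches a solver subprocess and exchanges variable vectors and
# objective values line by line over stdin/stdout. The message format is
# invented for illustration; Opossum's actual protocol is not documented here.
import subprocess

proc = subprocess.Popen(
    ["python", "solver_worker.py"],  # hypothetical script wrapping the solver
    stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True,
)

def evaluate(x):
    """Stand-in for the Grasshopper-side simulation of one design."""
    return sum((v - 0.5) ** 2 for v in x)

while True:
    line = proc.stdout.readline().strip()  # e.g. "ask 0.12 0.87 0.45"
    if line.startswith("done"):
        break
    x = [float(v) for v in line.split()[1:]]
    proc.stdin.write(f"tell {evaluate(x)}\n")  # return the objective value
    proc.stdin.flush()
```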
2.1. ALGORITHM PARAMETERS
RBFOpt has more than 40 parameters, some of which are interrelated and some of which dramatically change its behavior. One can choose between two model-based algorithms: Gutmann (2001) and MSRSM (Regis & Shoemaker 2007). For global search, Gutmann's algorithm evaluates the surrogate model's point of largest curvature, based on the assumption that this point yields a large improvement of the model's accuracy. MSRSM searches the model for points that balance improving the model's accuracy with the promise of better solutions, using either a genetic algorithm, random sampling or mathematical solvers. There are five choices of interpolating radial basis functions (linear, multiquadric, cubic, thin plate spline and automatic selection), which result in models with varying accuracy depending on the optimization problem.
2.2. OPOSSUM GUI
To make this complexity accessible to non-experts, Opossum's GUI consists of three tabs that afford increasing levels of control (figure 1): (1) The first tab lets users choose between minimization and maximization, select one of three pre-sets of parameters, and start and stop the optimization. The pre-sets (Fast, Extensive and Alternative) are based on intensive testing with mathematical test functions. "Fast" runs MSRSM with a genetic algorithm. "Extensive" is identical, but spends more time on searching the model. "Alternative" runs Gutmann's algorithm, which works well in certain cases. The first tab also displays an animated convergence graph to inform users about the progress of the optimization. (2) The second tab lets users define stopping conditions based on the number of iterations or the elapsed time, and conduct and log multiple optimization runs. (3) The third tab accepts command line parameters for RBFOpt. When desired, this "expert" window gives the user full control, with the parameters entered here overriding parameters set by the first two tabs. In Grasshopper, Opossum follows the look and behavior of existing optimization tools, including conventions regarding the colors of optimization components and their connections to variables and objective values (figure 2). Double-clicking on an optimization component opens a window with a GUI unique to each tool, which, for Opossum, contains the three tabs discussed above. Opossum thus presents a complex and innovative optimization library in a manner that is easy to use and familiar to users of Grasshopper.
Figure 1. The three tabs of Opossum’s GUI, from left to right.
Figure 2. Opossum in Grasshopper. The curves on the left link to the variables and the one on
the right to the objective.
3. Evaluation
To evaluate RBFOpt and Opossum, we compare their performance with that of five other solvers available in Grasshopper: the GA and SA included in Galapagos (Rutten 2010), the PSO of Silvereye (Cichocka et al. 2015), the DIRECT algorithm included in Goat, and HypE included in Octopus (Vierlinger 2013). We test the solvers on a time-intensive, simulation-based problem: optimizing a screened façade for daylight and glare. We simulate daylight and glare with DIVA-for-Rhino 4.0 (Jakubiec & Reinhart 2011).
3.1. EXAMPLE PROBLEM: OPTIMIZING DAYLIGHT AND GLARE
We consider a single room in Singapore (figure 3). The rectangular room has a south-facing, 10.8-meter-long and 3.6-meter-high façade, and is 7.2 meters deep. The room's floor is raised 20 meters above ground level. For the façade, we propose a porous screen with a triangular grid of 1,692 circular openings. To avoid controlling every opening with an individual variable, and to create a graduated, cloudy appearance, a grid of forty "attractor points" controls the openings, with weights in the range [0.0, 1.0]. To create a soft falloff, we calculate the radius of every opening as the average of the values of all attractor points, weighted by the inverse cubes of their distances to the opening and multiplied by the maximum radius of 65 millimeters. Openings with a radius below 10 millimeters are closed completely. This formulation results in a problem with forty continuous variables; a sketch of this parameterization follows figure 3.
Figure 3. Diagram of the room being optimized in terms of daylight and glare. The numbers from one to forty indicate the positions of the attractor points, the crosses the sensor grid for simulating UDI, and the cone the camera position and view for simulating DGP. The visualization on the right represents the best solution found, with 86% UDI and 24% DGP.
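The parameterization described above can be written compactly. The sketch below uses invented attractor and opening coordinates, since the paper does not give the exact grid layout; the weighting rule itself follows the description.

```python
# Sketch of the screen parameterization: each opening's radius is the
# inverse-cube-distance weighted average of the forty attractor weights,
# scaled by the 65 mm maximum radius and closed below 10 mm.
# Coordinates here are invented for illustration.
import numpy as np

rng = np.random.default_rng(2)
attractors = rng.random((40, 2)) * [10.8, 3.6]  # positions on the facade (m)
weights = rng.random(40)                        # the 40 variables in [0, 1]
openings = rng.random((1692, 2)) * [10.8, 3.6]  # centers of the openings (m)

def radii(openings, attractors, weights, r_max=0.065, r_min=0.010):
    d = np.linalg.norm(openings[:, None, :] - attractors[None, :, :], axis=2)
    w = 1.0 / np.maximum(d, 1e-9) ** 3           # inverse-cube falloff
    r = r_max * (w @ weights) / w.sum(axis=1)    # weighted average of values
    return np.where(r < r_min, 0.0, r)           # close openings below 10 mm

r = radii(openings, attractors, weights)
print(f"{(r > 0).sum()} open, mean radius {r[r > 0].mean() * 1000:.1f} mm")
```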
The objectives of the optimization are to (1) maximize Useful Daylight Illuminance (UDI) while (2) minimizing Daylight Glare Probability (DGP). UDI measures the annual percentage of time that a sensor point receives an amount of daylight that is sufficient for office work while avoiding glare and excessive heat gains (300-3000 lux). This problem calculates UDI as the average from a seven-by-five grid of sensor points. DGP measures glare as a percentage for a specific camera view and a specific point in time. This value indicates whether the glare is imperceptible (DGP < 35%), perceptible (35% ≤ DGP < 40%), disturbing (40% ≤ DGP < 45%) or intolerable (DGP ≥ 45%). We calculate this value for a single camera. The south-facing camera points directly at the screen and is in the center of the room at a height of 1.6 meters above the floor. An annual glare simulation calculates direct sunlight only for the daylight hours of five days (the 21st of June, August, September, October and December). In Singapore, these days add up to 59 daylight hours. For the remaining hours in the year, the simulation interpolates the direct sunlight contribution. Nevertheless, annual glare simulations can take hours even at low quality settings. This time-intensiveness makes such simulations impractical as an optimization objective, especially for the repeated runs necessary
for benchmarking. Instead, we approximate annual glare as the average of the 59 DGP values corresponding to the 59 daylight hours on which the more extensive annual glare simulations rely. Although less accurate than a full annual simulation, this approach yields a good qualitative assessment of the amount of glare one would experience in the room. If quality of daylight and avoidance of glare are equally important, subtracting the (approximated) average annual DGP g from the average annual UDI u yields a single maximization objective (both UDI and DGP are in the range [0, 1]). We turn this result into a minimization objective by subtracting it from 1.0:

min f(x) = 1.0 - u(x) + g(x)    (1)

On an Intel Xeon E5-1620 CPU with eight threads and 3.6 GHz, one evaluation of this objective, i.e. generating the parametric geometry and performing the daylighting and glare simulations, takes about 90 seconds.
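Equation (1) and the DGP comfort classes translate directly into code. The short sketch below checks the best solution reported in figure 3 (86% UDI, 24% DGP).

```python
# Sketch of the scalarized objective of equation (1) and the DGP comfort
# classes; both UDI (u) and average DGP (g) are fractions in [0, 1].
def objective(u, g):
    """Minimization objective: 1.0 - UDI + DGP (equal weighting)."""
    return 1.0 - u + g

def glare_class(dgp):
    if dgp < 0.35:
        return "imperceptible"
    if dgp < 0.40:
        return "perceptible"
    if dgp < 0.45:
        return "disturbing"
    return "intolerable"

# The best solution found (86% UDI, 24% DGP, see figure 3):
print(objective(0.86, 0.24), glare_class(0.24))  # 0.38 imperceptible
```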
3.2. BENCHMARKING METHODOLOGY
We compare the results of the six solvers (DIRECT, RBFOpt, GA, SA, PSO and HypE) from ten runs with 200 function evaluations each. To make the results independent of computing speed and implementation details, we compare the solvers in terms of the number of function evaluations, rather than in terms of running time. In practice, compared to the time required for function evaluations, running time differences between solvers often are negligible.
We run all solvers with default parameters, except for the GA and HypE, where we reduce the population sizes to 25 to achieve a larger number of generations. The choice of parameters can have a significant impact on the performance of optimization algorithms, especially for metaheuristics (Talbi 2009), and is problem-dependent. Nevertheless, we assume that the authors of the tested solvers have chosen sensible default parameters. Furthermore, in practice there usually is a limited evaluation budget, which is better spent on solvers that are efficient out of the box than on tuning algorithmic parameters.
We compare the solvers in terms of two criteria: (1) speed of convergence and (2) stability. Speed of convergence refers to how fast algorithms approach the optimum, measured as the improvement per function evaluation. Stability refers to the reliability of optimization algorithms and is a concern especially for stochastic algorithms such as metaheuristics. One measures stability by applying statistical measures such as the standard deviation to the results of repeatedly running an optimization algorithm on the same problem. We compare the Pareto-based solver with the single-objective ones by calculating single objective values for the solutions found by HypE with the weighted objective function (formula 1). Conversely, we compare the single-objective solvers with HypE by recording individual UDI and DGP values. Plotting the non-dominated solutions allows a visual comparison of the Pareto fronts found by the six solvers in terms of their quality and spread.
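Both criteria can be computed from logged objective values. The sketch below assumes a hypothetical `history` array of shape (runs, evaluations) filled with synthetic data; with real logs, it yields the convergence curve and box-plot spread of figure 4.

```python
# Sketch of the two comparison criteria: the best-so-far convergence curve
# averaged over runs (speed of convergence) and the spread of final values
# (stability). `history` is a hypothetical (runs x evaluations) log.
import numpy as np

rng = np.random.default_rng(3)
history = rng.random((10, 200)).cumsum(axis=1) / np.arange(1, 201)  # fake log

best_so_far = np.minimum.accumulate(history, axis=1)  # per-run convergence
mean_curve = best_so_far.mean(axis=0)                 # curve as in figure 4
finals = best_so_far[:, -1]                           # box-plot data
print(f"final best: mean {finals.mean():.3f}, std {finals.std():.3f}")
```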
3.3. RESULTS
The convergence graph on the left in figure 4 depicts the average, current best value found by the solvers relative to the number of evaluations. DIRECT is the worst-performing solver and shows little improvement, because its recursive subdivision proceeds too slowly in the forty dimensions corresponding to the variables. (DIRECT tends to perform better on lower-dimensional problems (Wortmann & Nannicini 2016).) SA performs relatively poorly, while the remaining metaheuristics, including the Pareto-based HypE, perform similarly and improve the objective by around 40%. Opossum's RBFOpt is the best-performing solver, with an improvement of 50%. Note RBFOpt's rapid progress after 40 evaluations: here the algorithm starts profiting from the surrogate model, while the earlier evaluations sample the objective function with a quasi-random Latin hypercube design. The box plot on the right in figure 4 indicates the range of objective values found by the solvers in ten runs. DIRECT is fully deterministic. Of the remaining algorithms, RBFOpt is the most stable, with the single-objective metaheuristics displaying wider, less stable ranges.
Figure 4. The convergence graph on the left displays the number of function evaluations on the x-axis and the average objective value on the y-axis. The box plot on the right indicates the range of objective values found by the six solvers in ten runs.
In figure 5, RBFOpt and HypE have found the closest approximations of the Pareto front. The diagonal fronts indicate a tradeoff between maximizing daylight and minimizing glare, although large improvements of daylight quality can be achieved by accepting small increases in glare. (Note that low average DGP values can contain isolated instances of disturbing or intolerable glare.) This improvement is especially noticeable for RBFOpt: it finds high-daylight solutions that also suffer less glare than the next-best daylight solutions found by other solvers. HypE suggests a steep tradeoff between daylight and glare, while RBFOpt indicates that the tradeoff is less dramatic.
Figure 5. Pareto fronts found during each solver's most representative run. "Best" is the combined front from all solvers and runs (the markers' colors indicate the solvers). UDI is indicated on the x-axis and average DGP on the y-axis.
4. Conclusion
We have presented a clear example of a time-intensive, simulation-based ADO problem that benefits from model-based optimization. Although algorithmic performance is problem-dependent, Opossum's RBFOpt is the best choice when the evaluation budget is small. The comparison with a Pareto-based algorithm indicates that, as a single-objective algorithm, HypE performs similarly to other metaheuristics. But the single-objective RBFOpt finds the closest approximation of the Pareto front, especially for high-daylight solutions. Designers should thus employ multi-objective optimization judiciously, and only when a large evaluation budget is available, to avoid an inaccurate approximation of the Pareto front. The future development of Opossum will aim to make it more visual and interactive, following the ideas outlined by Wortmann (2017). Another direction is to support Pareto-based optimization.
References
Akhtar, T. and Shoemaker, C.A.: 2016, Multi objective optimization of computationally expensive multi-modal functions with RBF surrogates and multi-rule selection, J Glob Optim, 64(1), 17-32.
Asl, M.R., Stoupine, A., Zarrinmehr, S. and Yan, W.: 2015, Optimo: A BIM-based Multi-Objective Optimization Tool, Proceedings of eCAADe 2015, Vienna, AUT, 673-682.
Attia, S., Hamdy, M., O'Brien, W. and Carlucci, S.: 2013, Assessing gaps and needs for integrating building performance optimization tools in net zero energy buildings design, Energy Build., 60, 110-124.
Bader, J. and Zitzler, E.: 2008, HypE: An algorithm for fast hypervolume-based many-objective optimization, TIK-Report 286, ETH Zurich.
Cichocka, J., Browne, W. and Rodriguez, E.: 2015, Evolutionary Optimization Processes as Design Tools, Proceedings of the 31st International PLEA Conference, Bologna, IT.
Conn, A., Scheinberg, K. and Vicente, L.: 2009, Introduction to Derivative-Free Optimization, Society for Industrial and Applied Mathematics, Philadelphia, PA.
Costa, A. and Nannicini, G.: 2014, RBFOpt: an open-source library for black-box optimization with costly function evaluations, Optimization Online 4538.
Evins, R.: 2013, A review of computational optimisation methods applied to sustainable building design, Renew. Sustainable Energy Rev., 22, 230-245.
Gutmann, H.M.: 2001, A Radial Basis Function Method for Global Optimization, J Glob Optim, 19(3), 201-227.
Hamdy, M., Nguyen, A.T. and Hensen, J.L.M.: 2016, A performance comparison of multi-objective optimization algorithms for solving nearly-zero-energy-building design problems, Energy Build., 121, 57-71.
Holmström, K., Quttineh, N.H. and Edvall, M.M.: 2008, An adaptive radial basis algorithm (ARBF) for expensive black-box mixed-integer constrained global optimization, Optim Eng, 9(4), 311-339.
Imbert, F., Frost, K.S., Fisher, A., Witt, A., Tourre, V. and Koren, B.: 2013, Concurrent Geometric, Structural and Environmental Design: Louvre Abu Dhabi, AAG 2012, 77-90.
Jakubiec, J.A. and Reinhart, C.F.: 2011, DIVA 2.0: Integrating daylight and thermal simulations using Rhinoceros 3D, Daysim and EnergyPlus, IBPSA 2011, Sydney, AUS, 2202-2209.
Johnson, S.G.: 2010, "The NLopt nonlinear-optimization package". Available from <http://ab-initio.mit.edu/nlopt>.
Jones, D.R., Perttunen, C.D. and Stuckman, B.E.: 1993, Lipschitzian optimization without the Lipschitz constant, J Optimiz Theory, 79(1), 157-181.
Knowles, J.: 2006, ParEGO: a hybrid algorithm with on-line landscape approximation for expensive multiobjective optimization problems, IEEE Trans. Evolut. Comput., 10(1), 50-66.
Koziel, S., Ciaurri, D.E. and Leifsson, L.: 2011, Surrogate-Based Methods, in S. Koziel and X.S. Yang (eds.), Computational Optimization, Methods and Algorithms, Springer, Heidelberg.
Loshchilov, I. and Glasmacher, T.: 2015, "Black-Box Optimization Competition". Available from <bbcomp.ini.rub.de> (accessed 14 February 2017).
Nocedal, J. and Wright, S.J.: 2006, Numerical Optimization, Springer, New York.
Radford, A.D. and Gero, J.S.: 1980, On Optimization in Computer Aided Architectural Design, Build Environ, 15, 73-80.
Regis, R.G. and Shoemaker, C.A.: 2007, A Stochastic Radial Basis Function Method for the Global Optimization of Expensive Functions, INFORMS J Comput, 19(4), 497-509.
Rios, L.M. and Sahinidis, N.V.: 2013, Derivative-free optimization: a review of algorithms and comparison of software implementations, J Glob Optim, 56(3), 1247-1293.
Rutten, D.: 2010, "Evolutionary Principles applied to Problem Solving". Available from <www.grasshopper3d.com/profiles/blogs/evolutionary-principles>.
Singh, S. and Kensek, K.: 2014, Early design analysis using optimization techniques in design/practice, ASHRAE Conference Proceedings, Atlanta, GA.
Talbi, E.: 2009, Metaheuristics: From Design to Implementation, John Wiley & Sons, Hoboken, NJ.
Vierlinger, R.: 2013, Multi Objective Design Interface, Master's Thesis, TU Wien.
Wortmann, T.: 2017, Surveying design spaces with performance maps, IJAC, 15(1).
Wortmann, T., Costa, A., Nannicini, G. and Schroepfer, T.: 2015, Advantages of Surrogate Models for Architectural Design Optimization, AIEDAM, 29(4), 471-481.
Wortmann, T. and Nannicini, G.: 2016, Black-box optimization for architectural design: An overview and quantitative comparison of metaheuristic, direct search, and model-based optimization methods, Proceedings of CAADRIA 2016, Melbourne, AU, 177-186.
Yang, D., Sun, Y., Stefano, D.d., Turrin, M. and Sariyildiz, S.: 2016, Impacts of problem scale and sampling strategy on surrogate model accuracy, 2016 IEEE CEC, 4199-4207.
Optimization techniques and methods for selecting better solutions (as defined by the metric chosen) are becoming more common in simulation software. Several methods are available for energy consumption optimization: parametric analysis, genetic algorithms, and examination of alternatives via the Pareto front. Optimization algorithms and design alternative methods, as offered in commonly used energy software programs, offer techniques for guiding designers towards "better" (less energy use) solutions. Processes for incorporating these existing tools into a designer's workflow need to be examined, critically evaluated, and improved. This paper discusses an integrated design approach using a case study incorporating benefits of simulation and optimization techniques at different phases of a buiding design project to make knowledge-based decisions efficiently.