Performance Driven Design Optimisation with Scientific Workflow System


Abstract

This paper proposes the use of a scientific workflow system to facilitate the implementation of a performance driven design process. A scientific workflow system would enable architects to customise and automate their own workflow for each project, allowing them to manipulate the data generated from a performance driven design process efficiently and to explore more design options. A simple example is presented to demonstrate the proposed design process. The scientific workflow system Kepler is used to oversee the design process; it is linked with the 3D modelling program SideFX Houdini3D and the lighting simulation program Radiance, and an Evolutionary Algorithm (EA) is used for the optimisation of the design.
Singapore-ETH Centre, Future Cities Laboratory
National University of Singapore
Institute of Technology in Architecture, Department of Architecture, ETH Zürich
Keywords: Performance driven design, Scientific workflow system,
Evolutionary algorithm, Parametric design, Design process
The integration of environmental simulation and Computer-Aided Design (CAD) programs into a common platform is essential for performance-oriented building design. This integrated environment facilitates a performance driven design process by providing feedback on the architect's design decisions. Furthermore, the automation of design performance assessment and the employment of optimisation algorithms enable the architect to generate a multitude of possible design variants and thus explore more design possibilities. Research shows that the majority of architects and engineers use simulation programs for validation and not for the exploration of alternative designs (Flager and Haymaker 2007). One of the reasons is that the data from most CAD programs are insufficient and often incompatible with simulation programs. This greatly impairs the flow of the design process, as architects then have to manage and error-check the various domain-specific data exchanges instead of spending time on developing the design.
There are essentially three levels of integration of CAD and simulation programs into the design process. At the first level, a standard CAD system is linked to a set of simulation programs so that one can easily evaluate design variants that are generated manually. There are various research efforts to achieve this smooth transition from the CAD system to each domain-specific simulation program (Citherlet, Clarke et al. 2001; Malkawi 2004). One example is the post-processing and mapping of a Building Information Modeling (BIM) 3D model to domain-specific simulation programs (Sanguinetti, Abdelmohsen et al. 2012), which tries to achieve a smooth transition between design and evaluation by processing the data from the BIM model and preparing it for the different simulation programs. Another example is the Design Analysis Integration (DAI) initiative (Augenbroe, Wilde et al. 2003; Augenbroe, Wilde et al. 2004). It uses a workbench to manage the process and at the same time makes use of existing file formats such as the Industry Foundation Classes (IFC) for the information exchange. DAI does not prescribe the workflow for the user but gives the user a certain amount of flexibility to customise it.
At the second level, a parametric CAD system such as Grasshopper or Houdini3D is linked to a set of simulation programs so that design variants can be generated and evaluated more easily, without manually remodelling each design from scratch. Parametric modelling refers to the process in which the architect defines the design with a set of parameters, for example a set of parameters that control the height, breadth and length of a rectangular block tower. This can speed up the process of design generation, as many different designs can be generated by varying the parameters. As a result, more "what if" scenarios and designs can be explored and evaluated (Shea, Aish et al. 2005).
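The rectangular block tower mentioned above can be sketched in a few lines. This is a minimal illustration of the idea, not the paper's actual Houdini3D setup; the function and parameter names are illustrative.

```python
# A rectangular block tower defined by three parameters. Varying the
# parameters yields different design variants without remodelling.

def tower_vertices(length, breadth, height):
    """Return the eight corner vertices of a rectangular block tower."""
    return [(x, y, z)
            for x in (0.0, length)
            for y in (0.0, breadth)
            for z in (0.0, height)]

# Sweep the parameters to generate a small family of design variants.
variants = [tower_vertices(l, b, h)
            for l in (20.0, 30.0)
            for b in (15.0, 25.0)
            for h in (40.0, 60.0)]
```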
Lastly, an optimisation algorithm is used to close the loop by linking the simulation results back to the parametric CAD system, so that one can more easily explore the performance trade-offs of a large number of design variants without the need to manually define the parameters for each design variant (Caldas 2006; Shea, Sedgwick et al. 2006; Flager and Haymaker 2007; Flager, Welle et al. 2009). The use of optimisation algorithms such as Genetic Algorithms (GA) and Ant Colony Optimisation is usually a more bottom-up approach, in which the architect sets up rules, constraints and boundary conditions for the generation of design alternatives. It is more exploratory in nature, in that the algorithm takes a more active role in the design process and might generate unexpected design alternatives.
This paper proposes the adaptation of a scientific workflow system as an environment for the architect to set up design loops for performance driven design optimisation. The use of workflow systems is common practice in the scientific community, as most experiments involve repetitive cycles of simulation, analysis and management of results. A scientific workflow system enables the automation of these cycles while allowing the scientist to concentrate on the research rather than on computation management. There is an array of available workflow systems for tasks of different purposes (Curcin and Ghanem 2008; Deelman, Gannon et al. 2009). This paper presents an initial approach to the adaptation of such a tool for architectural design purposes.
A workflow is an abstract description of the steps needed to execute a specific process and of the information flow between those steps. Each step is made up of a series of activities, such as processing data for the next step, executing a simulation with the provided data, or analysing the given data. The steps are executed in order from start to finish, and most of the time they are repeated in cycles in which the values of the information vary but not its nature. The workflow is usually constructed with a visual front-end or hand-coded.

The visual front-end of a workflow system is usually a graph illustrating the order of execution of the steps needed to complete a specific process. The connecting wires between the nodes represent the flow of data between the different steps. Users can usually alter the workflow by manipulating the graph, although the details vary between systems.
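The notion of a workflow as an ordered chain of steps, with the output of one step wired to the input of the next, can be sketched as follows. The step names and data are illustrative stand-ins, not the paper's actual workflow components.

```python
# Each step is a function; the return value of one step is the input of the
# next, mirroring the wires in a visual workflow graph.

def prepare_data(params):
    return {"geometry": params}               # stand-in for data preparation

def run_simulation(model):
    return {"score": sum(model["geometry"])}  # stand-in for a simulation run

def analyse(result):
    return result["score"]                    # stand-in for result analysis

def run_workflow(data, steps):
    for step in steps:                        # execute in order, start to finish
        data = step(data)
    return data

score = run_workflow([3, 4, 5], [prepare_data, run_simulation, analyse])
```

Repeating such a chain in a loop, with new parameter values each cycle, gives the cyclic execution described above.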
One key advantage of using a workflow system is that it serves as a form of documentation of one's design process. As alterations to the design are common during the architectural design process, a well documented design process facilitates making changes to the design. The workflow can be reused, improved or altered after each project. It also enables the sharing of workflows between collaborators. Some workflow systems allow the nesting of sub-workflows within a workflow. This means that, as long as the inputs and outputs of the different steps are well defined, nested workflows can be distributed for collaborative purposes. The result is a modular workflow system which is highly flexible and customisable.
Figure 1 Possible modular workflow with well-defined data exchange
The diagram in Figure 1 conceptually illustrates a modular workflow system with well defined data exchanged between the sub-workflows. In this case, each sub-workflow could be authored by a different professional. For example, the architect might author the CAD sub-workflow while the mechanical engineer authors the SIM1 sub-workflow. This is made possible by the use of a common exchange format such as the Industry Foundation Classes (IFC) or a data format agreed upon by the two professionals.
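An agreed-upon data format between two sub-workflow authors might look like the sketch below. The field names are hypothetical; in practice a standard such as IFC could play this role.

```python
import json

# The CAD sub-workflow emits a record in the agreed format; the SIM1
# sub-workflow consumes it without needing to know how it was produced.
cad_output = {"variant_id": 7,
              "vertices": [[0, 0, 0], [20, 0, 0], [20, 15, 0], [0, 15, 0]],
              "storeys": 12}

message = json.dumps(cad_output)   # serialised at the sub-workflow boundary
received = json.loads(message)     # deserialised by the consuming sub-workflow
```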
A workflow which uses an Evolutionary Algorithm (EA) for its feedback mechanism was set up. The workflow is made up of two main components: the feedback component and the evaluation component. The Kepler system (Pennington, Higgins et al. 2007) was used for managing the workflow of the design process. Kepler uses a Director and Actor modelling paradigm in which the Actors are the workflow components while the Director is in charge of the overall workflow orchestration. SideFX Houdini3D was the 3D modelling program used for parameterising and generating the design schema. For the evaluation tasks, the workflow links to the Radiance lighting software for simulating solar irradiation, and Python scripts were written for the calculation of the RETV value and for the implementation of the EA in the Kepler system.
Figure 2 Parameters and sub-workflow
Figure 2 shows the workflow; each node in the graph is a nested workflow. The parameters shown in Figure 2 are the necessary inputs for running the workflow. They include the information for running the EA and for generating the designs, as well as the locations of the files necessary for running the workflow. These parameters feed the sub-workflows with the information they need.
Figure 3 Inside the "Feedback" sub-workflow
At the start of the workflow is the "Init and Feedback" sub-workflow (Figure 3). This sub-workflow is in charge of the EA feedback mechanism. It generates new designs, either randomly in the first cycle or by reproduction in subsequent cycles. The sub-workflow is made up of an EA environment node written in Python. The node takes in the Genotype Meta settings and produces the parameters for generating the design variants. It takes in the Score Meta settings for manipulating the evaluation scores produced by the evaluation task; whether each score is to be minimised or maximised is specified in the "min_max_list" parameter. Lastly, it takes in the "live_file" and "dead_file" parameters for writing the results.
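The first-cycle behaviour of such an EA environment node can be sketched as below. The structure of the genotype meta settings (one range per design parameter) and the handling of "min_max_list" are assumptions based on the description above, not the authors' actual implementation.

```python
import random

def random_genotypes(genotype_meta, population_size, rng=random.Random(0)):
    """First cycle: generate random parameter sets.
    genotype_meta is assumed to be a list of (low, high) ranges,
    one per design parameter."""
    return [[rng.uniform(low, high) for (low, high) in genotype_meta]
            for _ in range(population_size)]

def orient_scores(scores, min_max_list):
    """Negate objectives marked 'max' so the EA can treat every
    objective uniformly as a minimisation."""
    return [[-s if direction == "max" else s
             for s, direction in zip(row, min_max_list)]
            for row in scores]

# Two design parameters, each in the range 10–50, population of 100.
population = random_genotypes([(10.0, 50.0), (10.0, 50.0)], 100)
```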
Figure 4 Inside the "Evaluation" sub-workflow
The design variants are sent to the "Evaluation Task" as an array of parameters (Figure 4). The sub-workflow generates a 3D model by accessing the 3D software Houdini3D, then extracts the geometry data and sends it to evaluation tasks such as the solar irradiation or RETV evaluation. This is done using the Python Houdini3D Application Programming Interface (API). A Python script was written to access Houdini3D, and the parameters are passed to the script as variables for generating a design variant. The geometry of the design variant is then extracted and processed for the relevant simulations. All of this is done through a library of Python code and APIs written by the authors.
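The data flow of the evaluation sub-workflow can be sketched as follows. Here `build_model` and `irradiation_score` are hypothetical stand-ins for the authors' Houdini3D API calls and the Radiance simulation; they are not real library functions.

```python
def build_model(params):
    """Stand-in for generating a design variant via the Houdini3D API."""
    x, y = params
    return {"footprint": (x, y), "faces": 6}   # placeholder geometry

def irradiation_score(geometry):
    """Stand-in for a Radiance solar irradiation simulation."""
    return float(geometry["faces"])            # placeholder score

def evaluate(params):
    geometry = build_model(params)             # parameters -> 3D model
    return [irradiation_score(geometry)]       # geometry -> one score per objective

scores = [evaluate(p) for p in [(20.0, 30.0), (25.0, 25.0)]]
```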
The parameters and scores of each design variant are sent to the "WriteToLiveFile" sub-workflow, in which the designs are written to a .csv file. The cycle repeats at the "Init and Feedback" sub-workflow, which reads the .csv file produced in the previous cycle and performs the EA feedback.
A simple case study of a mixed-use development in Singapore was done to demonstrate the workflow setup. It is parameterised as follows:
- The two-storey commercial mall is not altered during the design generation.
- The residential tower above the mall varies with the parameters X and Y, which determine the footprint of the residential tower. This in turn affects the height of the tower.
- The orientation and location of the residential blocks vary.
- If the height of the residential block exceeds 60 m, instead of one 60 m tower, two 30 m towers are generated.
Figure 5 Design schema and parameters
The environmental strategy is to maximise the amount of solar irradiation falling on the building envelope while minimising the solar heat gain. This corresponds to the conflicting objectives of generating as much electricity as possible from the installation of Building Integrated Photovoltaics (BIPV) while at the same time reducing the amount of solar heat gain. It is an attempt to tap possible energy sources on site while still reducing the cooling energy demand of the building.
The evaluation methods used are the Radiance ray-tracing lighting simulation for the measurement of solar irradiation and the Residential Envelope Transmittance Value (RETV) (Chua and Chou 2010) for assessing the solar heat gain of the envelope. RETV is a measure of the solar heat gain through the building envelope; it was developed by the Building and Construction Authority (BCA) of Singapore.

The design schema is optimised using an EA. At each cycle:
1. A population of 100 design variants is randomly generated by
assigning values to the four parameters described above.
2. Each design variant is evaluated and assigned a score for each of
the two evaluation methods.
3. A sub-population of 50 design variants is randomly selected from
the main population and Pareto-ranked according to their scores.
4. The 30 design variants at the bottom of the ranking are "killed"
and 20 new design variants are reproduced from the "stronger" design
variants. The reproduction is done through crossover, and mutation
occurs at a rate of 0.01.
5. The cycle is repeated from step 2 with the "fitter" design variants.
At the end of each cycle the design variants that are "alive" or
"dead" are written to two .csv files, live.csv and dead.csv
respectively. The architect is able to check the progress by
reading these files.
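The steps above can be sketched as a single EA cycle. The population sizes and mutation rate follow the text; the two-objective evaluation function, the parameter ranges and the dominance-count ranking are illustrative assumptions standing in for the Radiance and RETV simulations and the authors' actual Pareto ranking.

```python
import random

rng = random.Random(42)

def evaluate(variant):
    x, y = variant
    return (x + y, x * y)          # toy stand-ins for the two objectives

def dominates(a, b):
    """a dominates b if it is no worse in every objective and differs in
    at least one (both toy objectives are treated as minimisations)."""
    return all(ai <= bi for ai, bi in zip(a, b)) and a != b

def pareto_rank(scored):
    """Sort variants by how many others in the sub-population dominate
    them; non-dominated variants come first."""
    return sorted(scored, key=lambda s: sum(dominates(t[1], s[1]) for t in scored))

def crossover_mutate(parent_a, parent_b, rate=0.01):
    child = [rng.choice(pair) for pair in zip(parent_a, parent_b)]  # crossover
    return [rng.uniform(10.0, 50.0) if rng.random() < rate else g   # mutation
            for g in child]

# Step 1: a population of 100 random design variants (two parameters each).
population = [[rng.uniform(10.0, 50.0) for _ in range(2)] for _ in range(100)]
# Step 2: evaluate every variant.
scored = [(v, evaluate(v)) for v in population]
# Step 3: randomly select a sub-population of 50 and Pareto-rank it.
ranked = pareto_rank(rng.sample(scored, 50))
# Step 4: "kill" the bottom 30; reproduce 20 from the stronger variants.
survivors = [v for v, _ in ranked[:20]]
offspring = [crossover_mutate(rng.choice(survivors), rng.choice(survivors))
             for _ in range(20)]
next_population = survivors + offspring    # carried into the next cycle
```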
As a result, the population of designs gradually evolves after each cycle. The optimisation loop stops as specified by the user at the start, or at any time during the process. It is a performance driven exploration of the possible design variants.
The EA process ran for 160 generations and 4000 designs were produced. The graph in Figure 6 shows the design variants produced at each generation. One can observe that with each generation the design variants move towards the bottom right corner of the graph, forming the Pareto front of high solar irradiation and low solar heat gain, thus achieving the aim of increasing the generation of electricity with BIPV while keeping the cooling demand as low as possible.
Figure 6 Graph of EA results
Figure 7 Design 4209 (top) Design 4116 (bottom) on the pareto front
The case study illustrates a basic workflow set up with Kepler. This workflow or its sub-workflows can be packaged and shared as a template for future use, and reused or altered to fit a specific project. For example, another user might not be proficient in Houdini3D but only in Rhinoceros3D; the Houdini3D actor node could then be replaced with a Rhinoceros3D actor node. The same applies to the evaluation actor nodes. As more users contribute, the capability of the workflow system will expand to accommodate more digital design tools. It is also possible for the designer to first run a series of non-looped experiments in which the designer 'learns' what needs to be optimised, and then incorporate an optimisation algorithm to optimise the key aspects. With a well orchestrated workflow, it is possible for architects to "plug" a design into the workflow to be evaluated and optimised for its environmental performance.
Augenbroe, G., P. d. Wilde, et al. (2004). "An interoperability workbench for design analysis
integration." Energy and Buildings 36(8): 737-748.
Augenbroe, G., P. d. Wilde, et al. (2003). The Design Analysis Integration (DAI)
Initiative. Eighth International IBPSA Conference. Eindhoven, Netherlands: 79-86.
Caldas, L. (2006). GENE_ARCH: An Evolution-Based Generative Design System for Sustainable
Architecture Intelligent Computing in Engineering and Architecture. I. Smith, Springer
Berlin / Heidelberg. 4200: 109-118.
Chua, K. J. and S. K. Chou (2010). "An ETTV-based approach to improving the energy performance
of commercial buildings." Energy and Buildings 42(4): 491-499.
Citherlet, S., J. A. Clarke, et al. (2001). "Integration in building physics simulation." Energy and
Buildings 33(5): 451-461.
Deelman, E., D. Gannon, et al. (2009). "Workflows and e-Science: An overview of workflow
system features and capabilities." Future Generation Computer Systems 25(5): 528-540.
Flager, F. and J. Haymaker (2007). A comparison of multidisciplinary design, analysis and
optimisation processes in the building construction and aerospace industries. 24th W78
Conference Maribor 2007, Bringing ITC knowledge to work. Maribor: 625-630.
Flager, F., B. Welle, et al. (2009). "Multidisciplinary Process Integration and Design Optimisation of
a Classroom Building." Journal of Information Technology in Construction 14: 595-612.
Malkawi, A. M. (2004). "Developments in environmental performance simulation." Automation in
Construction 13: 437-445.
Pennington, D. D., D. Higgins, et al. (2007). Ecological Niche Modeling Using the Kepler Workflow
System Workflows for e-Science. I. J. Taylor, E. Deelman, D. B. Gannon and M. Shields,
Springer London: 91-108.
Sanguinetti, P., S. Abdelmohsen, et al. (2012). "General system architecture for BIM: An integrated
approach for design and analysis." Advanced Engineering Informatics 26(2): 317-333.
Shea, K., R. Aish, et al. (2005). "Towards integrated performance-driven generative design tools."
Automation in Construction 14(2): 253-264.
Shea, K., A. Sedgwick, et al. (2006). Multicriteria Optimization of Paneled Building Envelopes
Using Ant Colony Optimization. Intelligent Computing in Engineering and Architecture
13th EG-ICE Workshop 2006. I. F. C. Smith. Ascona, Switzerland, SpringerLink. 4200:
Curcin, V. and M. Ghanem (2008). Scientific workflow systems - can one size fit all? Cairo
International Biomedical Engineering Conference, 2008 (CIBEC 2008). Cairo.