Conference Paper

Designing With Data: Moving Beyond The Design Space Catalog

Authors: Nathan Brown and Caitlin Mueller
... The consequent lack of knowledge of the design space prevents designers from making informed decisions when trying to find a viable solution manually, even though this knowledge gain is one of the main motivators behind ADO in practice [4]. To enhance the effectiveness of ADO and reduce the aforementioned difficulties, Design Space Exploration (DSE) has proven effective [5], [6], [7]. DSE is based on three pillars: firstly, effectively sampling the design space. ...
... However, very few DSE software tools for architectural design exist, and the ones that do are limited in their capabilities for integrated sampling strategies, surrogate modeling, visualization, interactivity, or a combination of those [5], [6]. The work presented in this paper expands on [9], demonstrating a fast and integrated DSE workflow enabled by a new software architecture following a methodology outlined by [10]. ...
... While there is no shortage of optimization and simulation plug-ins for Grasshopper, few tools for DSE exist. [5] developed Design Space Exploration by Digital Structures, built on the idea of design catalogues, i.e., a sampled set of design variates that can be analyzed. While they analyze different sampling strategies and construct a surrogate model to extend this catalogue in their published research, the Grasshopper plug-in does not allow the user to do so. ...
Conference Paper
Full-text available
This paper presents a novel, more flexible, and faster software framework for design space exploration (DSE) in Rhino/Grasshopper that allows users to sample the design space, train and deploy machine-learning (ML) models, and visually and interactively explore the design space; the framework is evaluated with a case study. In light of the climate crisis, the integration of simulation-based optimization processes in the design of buildings is becoming ever more important. Architectural design optimization can effectively reduce a building's climate impact by, e.g., minimizing energy consumption. However, these optimizations rely on costly simulations and present the designers with only a limited selection of feasible design candidates, i.e., the result of the optimization algorithm. If the well-performing design variates explored in this way do not fulfill other criteria, such as aesthetics, the designers are likely to disregard the optimization results. This is where DSE tools come into play. By utilizing ML to estimate the performance values instead of simulating them, and by visualizing the results in intuitive ways, designers can be informed of a design variate's performance in real time.
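The workflow summarized in this abstract (sample the design space, train an ML surrogate on the simulated samples, then query the surrogate for real-time feedback) can be sketched compactly. The following is a minimal illustration rather than the authors' implementation: `simulate` is a hypothetical stand-in for a costly building-performance simulation, and the two design variables are invented for the example.

```python
# Minimal DSE sketch: sample, simulate, train a surrogate, then query the
# surrogate instead of re-running the simulation during exploration.
import numpy as np
from scipy.stats import qmc                      # Latin hypercube sampling
from sklearn.ensemble import RandomForestRegressor

def simulate(x):
    """Placeholder for a costly performance simulation (e.g. energy use)."""
    depth, ratio = x
    return 120 - 8 * depth + 15 * (ratio - 0.4) ** 2

# 1. Sample the design space (two hypothetical variables: shading depth, window ratio).
sampler = qmc.LatinHypercube(d=2, seed=0)
X = qmc.scale(sampler.random(n=200), [0.0, 0.2], [1.5, 0.9])

# 2. Run the "simulation" once per sample to build the catalog.
y = np.array([simulate(x) for x in X])

# 3. Train a surrogate that replaces the simulation during exploration.
surrogate = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

# 4. Real-time feedback: predict performance for a new slider setting.
print(surrogate.predict([[0.8, 0.45]]))
```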
... In the face of growing complexity in the building industry (Kolarevic, 2009) and its low efficiency (McKinsey, 2016), new data-driven design approaches have recently been introduced, replacing the traditional, linear, and sequential project workflow (Alvarez et al., 2019). As stated by Brown and Mueller (2017), despite the widespread belief that architecture depends on human intuition, reasoning and creativity, data can complement or enhance human activities, for example, by helping to make informed decisions and to solve complex problems (Wei, Yuan & Liu, 2020). This method is used in multi-objective design processes, where an architect is guided by simulations that describe several aspects of building performance (Brown & Mueller, 2017). ...
... As stated by Brown and Mueller (2017), despite the widespread belief that architecture depends on human intuition, reasoning and creativity, data can complement or enhance human activities, for example, by helping to make informed decisions and to solve complex problems (Wei, Yuan & Liu, 2020). This method is used in multi-objective design processes, where an architect is guided by simulations that describe several aspects of building performance (Brown & Mueller, 2017). This data can either be used as guidelines for subsequent design decisions, without a direct impact on the generation of the geometry (Deutsch, 2015), or, as shown in recent studies (Brown & Mueller, 2017; Bianconi & Filippucci, 2019), can be connected to form-finding algorithms. ...
... This method is used in multi-objective design processes, where an architect is guided by simulations that describe several aspects of building performance (Brown & Mueller, 2017). This data can either be used as guidelines for subsequent design decisions, without a direct impact on the generation of the geometry (Deutsch, 2015), or, as shown in recent studies (Brown & Mueller, 2017; Bianconi & Filippucci, 2019), can be connected to form-finding algorithms. In this case, data-driven design can effectively support the designer in problem-solving by comparing different generated design solutions. ...
Article
Full-text available
This paper describes an innovative methodology allowing production waste to be upcycled into legitimate construction material for spatial structures, with minimal change to the elements' shape. The system is based on interlocking joints between the boards. The plates are organized around nodes, creating a three-dimensional reciprocal system that guarantees the stability of the entire structure without any fasteners. We use an inverted, data-driven design process in which unique components define the form of the structure. The design-to-production workflow consists of measuring and labelling the elements, creating a data file, data-driven generation of the structure with a custom form-finding algorithm, structural optimization of the form, robotic processing of the scraps, and manual assembly. The proposed methodology was tested in public spaces as a temporary pavilion and three wood-clay composite sitting elements, thus practically demonstrating the feasibility of our approach.
... With the availability of new computational design tools with generative capabilities, designers can create hundreds and thousands of alternatives. With these tools, designers iteratively generate alternatives from base model(s) and select a set of alternatives for further processing (Brown and Mueller, 2017). The cycle repeats until satisficing solutions are found (Simon, 1973). ...
... This is a cognitively intense process that can easily cause a cognitive choice-overload problem (Iyengar and Lepper, 2000). We contend that we should support this process with tools equipped to present a gallery or catalogue of design options (Brown and Mueller, 2017) to review, understand, compare and evaluate alternatives. For this, an investigation is needed to better understand how designers can potentially review and make sense of such a large design space of multi-dimensional, high-performing, fully functional design alternatives. ...
... With the increasing amount of data being used (as inputs) and produced (as output) using parametric design, there is a visible trend toward data-driven design exploratory systems. Most of these systems focus on visualization and interaction of a design space to steer the iterative cycles of exploration and optimization (Bradner et al., 2014; Brown and Mueller, 2017; Turrin et al., 2011; Joyce, 2015; Harding and Shepherd, 2017; Mueller and Ochsendorf, 2015). They address the exploration of design alternatives using data-driven, performance-based optimization strategies. ...
... However, to influence the design, simulation must provide designers with actionable insights and feedback that improve overall design quality. Such design guidance can be achieved through techniques including optimization or design space exploration [5]. Multi-objective optimization in particular allows for considering multiple performance criteria in the design process, providing a comprehensive evaluation of different design options [6,7]. ...
... Multi-objective optimization in particular allows for considering multiple performance criteria in the design process, providing a comprehensive evaluation of different design options [6,7]. This can lead to better design decision-making and help ensure that the final design meets all stakeholder requirements [5,8]. ...
Article
Multi-objective optimization can enhance design quality through performance simulation. However, ease or efficiency of construction is also important, and optimization may lead to difficult-to-build designs. Early quantification of constructability would allow designers to balance performance and construction issues. While it is difficult to quantify all factors that influence constructability, robotic construction simulations offer rich datasets to compare potential outcomes. This paper integrates constructability knowledge into early-stage design and examines the impact on multi-objective optimization. It evaluates robotic material delivery systems in constructing a standalone classroom optimized for structural, daylighting, and energy goals. When considering robotic pick and place time, the optimized designs differ, offering the opportunity to change design directions based on construction knowledge. Broader implications also become observable, as incorporating robots with higher carrying capacities reduces the structural elements and embodied carbon of optimal designs. This paper thus demonstrates benefits of incorporating robotic constructability simulation into early design optimization.
... The combination gives rise to a design process where considerations that would traditionally take place at late design phases can now become part of the early formative phases. This warrants attention to research for systematically reviewing and building tools for interacting with and exploring design spaces considering design performance data beyond what design catalog systems currently offer (Brown and Mueller, 2017b). ...
... In this work, we build on the argument for design space simplification (Erhan et al., 2015;Brown and Mueller, 2017b) through the techniques of clustering and dimensionality reduction which expose and exploit the (dis)similarity between design alternatives. We further extend it by integrating the results of this simplification in a larger, interactive design analysis dashboard. ...
... These techniques allow the evaluation step in searching for the best performing options to be simplified. More precisely, in the case of multi-objective optimisation, they support guided explorations of the design space, providing sub-optimal options (Brown & Mueller, 2017; Turrin, Von Buelow, & Stouffs, 2011; Yang, Ren, Turrin, Sariyildiz, & Sun, 2018), from which the designer has to make a selection (Fig. 2b). Although very powerful in solving well-defined problems, optimisation techniques do not always offer the flexibility and the responsiveness necessary in early, ill-defined design stages. ...
... In combination with filtering functions, these algorithms offer the possibility to manage vast, multidimensional design options and eventually allow designers to negotiate quantitative and qualitative aspects according to personal preferences (Harding & Olsen, 2018; Fuhrimann, Moosavi, Ohlbrock, & D'Acunto, 2018; Saldana Ochoa, Ohlbrock, D'Acunto, & Moosavi, 2020). Following this approach, the designer is prevented from being overwhelmed (Brown & Mueller, 2017) by having to examine all the options individually and, at the same time, is not forced to focus exclusively on quantitative aspects. In line with the approach of Saldana Ochoa et al. (2020), the present research also implements a design process that includes generation, evaluation, clustering, and selection steps with the aim of considering both quantitative performance criteria and qualitative preferences of the designer while preserving the simplicity and flexibility needed in the early design stage. ...
Article
Full-text available
The design of building envelopes requires a negotiation between qualitative and quantitative aspects belonging to different disciplines, such as architecture, structural design, and building physics. In contrast to hierarchical linear approaches in which various design aspects are considered and conceived sequentially, holistic frameworks allow such aspects to be taken into consideration simultaneously. However, these multi-disciplinary approaches often lead to the formulation of complex high-dimensional design spaces of solutions that are generally not easy to handle manually. Computational optimisation techniques may offer a solution to this problem; however, they mainly focus on quantitative aspects, not always guaranteeing the flexibility and interactive responsiveness designers need in the early design stage. The use of intuitive geometry-based generative tools, in combination with machine learning algorithms, is a way to overcome the issues that arise when dealing with multi-dimensional design spaces without necessarily replacing the designer with the machine. The presented research follows a human-centred design framework in which the machine assists the human designer in generating, evaluating, and clustering large sets of design options. Through a case study, this paper suggests ways of making use of interactive tools that do not overlook the performance criteria or personal preferences of the designer while preserving the simplicity and flexibility needed in the early design stage.
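The generate-evaluate-cluster-select loop described in this abstract can be illustrated with a short sketch. The snippet below is a generic stand-in, assuming hypothetical feature vectors for the design options and a placeholder performance score; it simply clusters the options and surfaces one representative per cluster for the designer to review.

```python
# Sketch of a generate-evaluate-cluster-select loop: cluster a large set of
# design options and show one representative per cluster.
# Hypothetical data; features and performance values are placeholders.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
features = rng.uniform(size=(1000, 6))          # geometric/performance descriptors
performance = features @ rng.uniform(size=6)    # stand-in performance score (lower is better)

# Cluster the design options in normalized feature space.
X = StandardScaler().fit_transform(features)
kmeans = KMeans(n_clusters=8, n_init=10, random_state=1).fit(X)

# For each cluster, surface the best-performing member as its representative.
for c in range(kmeans.n_clusters):
    members = np.where(kmeans.labels_ == c)[0]
    best = members[np.argmin(performance[members])]
    print(f"cluster {c}: {len(members)} options, representative #{best}")
```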
... The present work tries to overcome these quantitative and qualitative limitations, allowing architects and engineers to go beyond their individual approachable design horizons and thereby supporting the idea of diversity-driven design (Brown and Mueller [2]). This should support the designer in an early phase, either to explore various solutions for an abstract study or to find design variations within a defined design brief. ...
... The theoretical base of this work builds on the combination of a form-finding methodology and state-of-the-art machine learning algorithms. Previous works such as Brown and Mueller [2], Mueller and Ochsendorf [3] or Shea and Cagan [4] have shown the great potential of data-driven structural design. ...
Preprint
Full-text available
The aim of this research is to introduce a novel structural design process that allows architects and engineers to extend their typical design space horizon, thereby promoting the idea of creativity in structural design. The theoretical base of this work builds on the combination of structural form-finding and state-of-the-art machine learning algorithms. In the first step of the process, Combinatorial Equilibrium Modelling (CEM) is used to generate a large variety of spatial networks in equilibrium for given input parameters. In the second step, these networks are clustered and represented in a form-map through the implementation of a Self-Organizing Map (SOM) algorithm. In the third step, the solution space is interpreted with the help of a Uniform Manifold Approximation and Projection (UMAP) algorithm. This allows important insights into the structure of the solution space to be gained. A specific case study is used to illustrate how the infinite equilibrium states of a given topology can be defined and represented by clusters. Furthermore, three classes, related to the non-linear interaction between the input parameters and the form space, are verified, and a statement about the entire manifold of the solution space of the case study is made. To conclude, this work presents an innovative approach to how the manifold of a solution space can be grasped with a minimum amount of data and how to operate within the manifold in order to increase the diversity of solutions.
... The present work tries to overcome these quantitative and qualitative limitations, allowing architects and engineers to go beyond their individual approachable design horizons and thereby supporting the idea of diversity-driven design (Brown and Mueller [2]). This should support the designer in an early phase, either to explore various solutions for an abstract study or to find design variations within a defined design brief. ...
... The theoretical base of this work builds on the combination of a form-finding methodology and state-of-the-art machine learning algorithms. Previous works such as Brown and Mueller [2], Mueller and Ochsendorf [3] or Shea and Cagan [4] have shown the great potential of data-driven structural design. ...
Conference Paper
Full-text available
The aim of this research is to introduce a novel structural design process that allows architects and engineers to extend their typical design space horizon, thereby promoting the idea of creativity in structural design. The theoretical base of this work builds on the combination of structural form-finding and state-of-the-art machine learning algorithms. In the first step of the process, Combinatorial Equilibrium Modelling (CEM) is used to generate a large variety of spatial networks in equilibrium for given input parameters. In the second step, these networks are clustered and represented in a form-map through the implementation of a Self-Organizing Map (SOM) algorithm. In the third step, the solution space is interpreted with the help of a Uniform Manifold Approximation and Projection (UMAP) algorithm. This allows important insights into the structure of the solution space to be gained. A specific case study is used to illustrate how the infinite equilibrium states of a given topology can be defined and represented by clusters. Furthermore, three classes, related to the non-linear interaction between the input parameters and the form space, are verified, and a statement about the entire manifold of the solution space of the case study is made. To conclude, this work presents an innovative approach to how the manifold of a solution space can be grasped with a minimum amount of data and how to operate within the manifold in order to increase the diversity of solutions.
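The clustering step described above (mapping many equilibrium forms onto a 2D form-map with a SOM) can be illustrated with a compact, dependency-free sketch. The implementation below is a generic self-organizing map written for illustration, with random vectors standing in for descriptors of CEM-generated networks; it is not the authors' code or data.

```python
# Minimal self-organizing map (SOM) sketch for building a "form map".
# Purely illustrative: the input vectors stand in for descriptors of
# form-found networks, not the actual CEM outputs.
import numpy as np

def train_som(data, grid=(10, 10), iters=2000, lr0=0.5, sigma0=3.0, seed=0):
    rng = np.random.default_rng(seed)
    h, w = grid
    weights = rng.uniform(size=(h, w, data.shape[1]))
    gy, gx = np.mgrid[0:h, 0:w]
    for t in range(iters):
        x = data[rng.integers(len(data))]
        # Best-matching unit (closest node in weight space).
        d = np.linalg.norm(weights - x, axis=2)
        by, bx = np.unravel_index(np.argmin(d), d.shape)
        # Decaying learning rate and neighbourhood radius.
        lr = lr0 * np.exp(-t / iters)
        sigma = sigma0 * np.exp(-t / iters)
        dist2 = (gy - by) ** 2 + (gx - bx) ** 2
        influence = np.exp(-dist2 / (2 * sigma ** 2))[..., None]
        weights += lr * influence * (x - weights)
    return weights

# Map each form descriptor to its grid cell to obtain the clustered form map.
forms = np.random.default_rng(2).normal(size=(500, 8))
som = train_som(forms)
cells = [np.unravel_index(np.argmin(np.linalg.norm(som - f, axis=2)), som.shape[:2])
         for f in forms]
print(cells[:5])
```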
... If the design options can be encoded parametrically, designers can potentially improve the outcome by considering many more possibilities. This is commonly done using a brute-force method, which involves creating many designs and then filtering through the data to identify a few acceptable design solutions (Brown & Mueller, 2017). If the problem can be formulated well, structural optimization can provide a more efficient design approach. ...
Conference Paper
Full-text available
Numerical analyses can aid design exploration, but there are several computational approaches available to consider design options. These range from "brute-force" search to optimization. However, the implementation of optimization can be challenging for the complex, time-intensive analyses required to assess seismic performance. In response to this challenge, this study tests several optimization strategies for the direct displacement-based design of a lateral force-resisting system (LFRS) using mass timber panels with U-shaped flexural plates (UFPs) and post-tensioned high-strength steel rods. The study compares two approaches: (1) a brute-force sampling of designs and data filtering to determine acceptable solutions; and (2) various automated optimization algorithms. The differential evolution algorithm was found to be the most efficient and robust approach, saving 90% of computational cost compared to brute-force sampling while producing comparable solutions. However, not every optimization formulation returned the best range of design options; some required reformulation or hyperparameter tuning to ensure effectiveness.
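The comparison reported above (brute-force sampling versus differential evolution) can be mimicked in a few lines with SciPy. The objective below is a toy placeholder rather than the displacement-based seismic design model used in the study; variable names and bounds are invented for the example.

```python
# Sketch comparing brute-force sampling against differential evolution.
# The objective is a cheap placeholder, not the study's LFRS design model.
import numpy as np
from scipy.optimize import differential_evolution

def objective(x):
    """Placeholder cost (e.g. material use plus a penalty for a drift limit)."""
    panel_t, pt_force = x
    drift = 2.5 / (panel_t * pt_force)           # toy drift proxy
    penalty = max(0.0, drift - 1.0) * 100.0
    return panel_t * 10 + pt_force * 0.5 + penalty

bounds = [(0.1, 0.5), (10.0, 200.0)]

# Brute force: evaluate a dense random sample and filter for the best design.
rng = np.random.default_rng(0)
samples = rng.uniform([b[0] for b in bounds], [b[1] for b in bounds], size=(5000, 2))
best_brute = min(samples, key=objective)

# Differential evolution: typically needs far fewer evaluations.
result = differential_evolution(objective, bounds, seed=0, maxiter=100)
print(objective(best_brute), result.fun, result.x)
```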
... Beyond technological advancements, the successful implementation of AI in the AEC sector requires robust interdisciplinary collaboration. It is not just about integrating AI tools but ensuring that AI experts, environmental scientists, urban planners, and architects collaboratively harness these tools to create sustainable and efficient designs [41]. This collaborative approach ensures that the technological solutions align with the real-world complexities of urban environments and architectural challenges. ...
Article
Full-text available
Cities and buildings represent the core of human life, the nexus of economic activity, culture, and growth. Although cities cover less than 10% of the global land area, they are notorious for their substantial energy consumption and consequential carbon dioxide (CO2) emissions. These emissions significantly contribute to reducing the carbon budget available to mitigate the adverse impacts of climate change. In this context, the designers’ role is crucial to the technical and social response to climate change, and providing a new generation of tools and instruments is paramount to guide their decisions towards sustainable buildings and cities. In this regard, data-informed digital tools are a viable solution. These tools efficiently utilise available resources to estimate the energy consumption in buildings, thereby facilitating the formulation of effective urban policies and design optimisation. Furthermore, these data-driven digital tools enhance the application of algorithms across the building industry, empowering designers to make informed decisions, particularly in the early stages of design. This paper presents a comprehensive literature review on artificial intelligence-based tools that support performance-driven design. An exhaustive keyword-driven exploration across diverse bibliographic databases yielded a consolidated dataset used for automated analysis for discerning the prevalent themes, correlations, and structural nuances within the body of literature. The primary findings indicate an increasing emphasis on master plans and neighbourhood-scale simulations. However, it is observed that there is a lack of a streamlined framework integrating these data-driven tools into the design process.
... Thus, a data-driven design [25] is developed in which one engages with "complex systems" not by looking for simplifications or optimal solutions but by creating a combinatorial explosion [26], contrasting this evolutionary engineering, marked by multiscale analysis [26][27][28], with the "iterative and incremental" standard. Optimization strategies [29,30] yield structurally and environmentally optimized form-finding solutions [31] that enable the designer to visualize and evaluate thousands of design options and variants [32] through the materialization of the issue of architectural complexity [33], from the perspective of mass customization [34][35][36][37][38][39][40][41][42][43]. ...
Chapter
The study reports the outcomes of a research activity focused on digitization techniques and, in particular, on the value of computational design, with the aim of implementing innovative product and process solutions resulting from an integrated design approach. The digitization paths focused on representative strategies for digital optimization of the architectural form of wooden houses as a function of context, based on research on generative modelling and evolutionary algorithms for multi-objective optimization applied to the architecture of wooden houses. With such an approach, centered on artificial intelligence or at least on augmented computational intelligence, it was possible to achieve a process of mass customization of meta-planning solutions for wooden architectures, based on the morphological and energetic selection of the best configurations, identified according to the context. These results were made accessible through a web-based configurator that provides the designer with initial configurations from which to start the actual project. The studies are directed toward the definition of a prototype of the "breathing house," characterized by its moisture-responsive wooden panels, with the identification of innovative solutions capable of reacting passively to changes in humidity according to the "natural intelligence" of the material, whose morphological transformation, empirically studied and digitally transcribed to identify performance solutions, generates well-being for living.
... Despite their limitations, parametric models can explore a range of options within an initial design concept, highlighting performance trends or tradeoffs when the concept is still flexible. Many design approaches have been proposed at this stage, including design catalogues (Brown and Mueller 2017), interactive and automated exploration (Ferrara et al. 2014), and machine learning for visualization (Wortmann 2017;Danhaive and Mueller 2021). However, generating a custom parametric model is time-consuming, and there can be considerable uncertainty in the design variables, simulation settings, and metrics, even if the problem is simplified. ...
... The root idea of TDAs, communicating information across digital and human boundaries while leveraging the computation of the former and the heuristic judgment of the latter, drives advances in fields as far afield as interactive architectural design [44] and, closer to home, human-robot interaction [45]-[47]. Thus, TDAs serve as organic documentation between human and AI decision-makers. ...
Article
Full-text available
Motivated by a changing acoustic environment in the Arctic Beaufort Sea, in this article, we present a tactical decision aid framework for a human decision-maker collaborating with an autonomous underwater vehicle (AUV) to integrate the vertical sound-speed profile for underwater navigation and communication. In a predeployment phase, using modeled and real oceanographic data, we load basis function representations of the sound-speed perturbations onto one or more AUVs on deck, where a handful of weights can estimate a sound-speed profile. During deployment, these weights are updated on an AUV through a digital acoustic message to improve navigation and reciprocal communication throughout the duration of an under-ice mission. Field work applying this framework in the Beaufort Sea is presented, highlighting key decisions regarding predeployment oceanographic data assimilation, CTD cast collection, and in situ weight choice. Selected examples evaluate the framework’s ability to adapt to a depth-limited CTD cast and the appearance of an anomalous microlens feature in the profile. We show that the framework effectively balances the need to adapt in a changing acoustic environment in real time while maintaining operator trust in an AUV’s embedded intelligence.
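The core representation described in this abstract, a sound-speed profile expressed as a mean profile plus a handful of weighted basis functions, can be illustrated with synthetic data. The sketch below invents the basis functions and measurements; in the actual framework the basis would come from modeled and real oceanographic data, and only the few weights would be transmitted acoustically.

```python
# Illustrative sketch of a basis-function profile representation: mean profile
# plus a few weighted basis functions, so only the weights need updating.
# Synthetic data, not the paper's oceanographic dataset or basis functions.
import numpy as np

depth = np.linspace(0, 300, 61)                      # m
mean_profile = 1435 + 0.016 * depth                  # stand-in mean sound speed (m/s)

# Stand-in basis functions (in practice, e.g. EOFs of historical profiles).
basis = np.stack([np.exp(-depth / 50),
                  np.sin(np.pi * depth / 300),
                  depth / 300]).T                    # shape (61, 3)

# "Measured" profile from a CTD cast (synthetic here).
true_w = np.array([4.0, -1.5, 2.0])
measured = mean_profile + basis @ true_w + np.random.default_rng(0).normal(0, 0.1, depth.size)

# Estimate the handful of weights by least squares; these are what gets updated.
w, *_ = np.linalg.lstsq(basis, measured - mean_profile, rcond=None)
print(np.round(w, 2))                                # close to [4.0, -1.5, 2.0]
```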
... Architects and designers now use computational design methods to leverage available data streams and generate novel forms and spatial opportunities [BM17, GR17]. Physicalizations with this representational idiom aim to provide an atmospheric experience for users while reflecting a message from their target data. ...
Article
Full-text available
Physical representations of data offer physical and spatial ways of looking at, navigating, and interacting with data. While digital fabrication has facilitated the creation of objects with data‐driven geometry, rendering data as a physically fabricated object is still a daunting leap for many physicalization designers. Rendering in the scope of this research refers to the back‐and‐forth process from digital design to digital fabrication and its specific challenges. We developed a corpus of example data physicalizations from research literature and physicalization practice. This survey then unpacks the “rendering” phase of the extended InfoVis pipeline in greater detail through these examples, with the aim of identifying ways that researchers, artists, and industry practitioners “render” physicalizations using digital design and fabrication tools.
... Parametric techniques have been especially useful for performance-driven or performanceinformed design approaches, in which different options generated by a parametric system are simulated with respect to quantitative objectives that inform design choices [89,88]. To systematically explore these options, designers might use design space exploration techniques such as sampling to create a catalog, automated optimization, or interactive optimization [18,48,13,14,11]. Broad design space exploration techniques provide an overall view of performance trends across various possibilities, while optimization converges on the best outcome for a specifically defined problem. ...
Article
This paper describes a parametric framework for early-stage tall timber design, which enables a comparison across a range of geometries with respect to embodied carbon of the main structural elements for two mass timber structural systems: post-beam-panel and post-and-platform. This research finds that the post-beam-panel system on average has a higher embodied carbon that increases faster as the building height increases. In both structural systems, the floors are a major contributor to the embodied carbon. The framework can inform higher resolution decisions, help designers detect broad trends and relationships in the tall timber design space, and allow designers to adapt geometries and immediately assess the impact on the embodied carbon. For each geometry, the embodied carbon of both structural systems can be compared to determine which system is more appropriate. Within each structural system, the timber elements with the largest material volume can be identified, and alternative materials can be considered. As a result, the framework permits designers to make simultaneous decisions on the structural system and the geometry from the perspective of both aesthetic and embodied carbon performance. By providing guidance and feedback on early-stage design, this paper aims to increase the recognition and application of timber as a structural material for large-scale construction.
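The kind of comparison this framework supports (summing the embodied carbon of the main structural elements for two mass timber systems and identifying the largest contributor) reduces to simple arithmetic. The sketch below uses invented element volumes and an assumed density and carbon factor, not the paper's quantities.

```python
# Sketch of an embodied carbon comparison across two stand-in structural
# systems. Quantities and carbon factors are illustrative placeholders.
TIMBER_DENSITY = 480        # kg/m3 (assumed)
TIMBER_FACTOR = 0.44        # kgCO2e per kg (assumed cradle-to-gate factor)

def embodied_carbon(element_volumes_m3):
    """Sum embodied carbon over timber element volumes, in kgCO2e."""
    return sum(v * TIMBER_DENSITY * TIMBER_FACTOR for v in element_volumes_m3)

systems = {
    "post-beam-panel":   {"posts": 14.0, "beams": 32.0, "floor panels": 210.0},
    "post-and-platform": {"posts": 15.0, "floor panels": 235.0},
}
for name, elements in systems.items():
    total = embodied_carbon(elements.values())
    biggest = max(elements, key=elements.get)
    print(f"{name}: {total/1000:.1f} tCO2e, largest contributor: {biggest}")
```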
... In some scenarios this could mean displaying hundreds or thousands of options with only slight variations. This can potentially overwhelm the decision maker with choice fatigue, especially when considering numerous dimensions and objectives [26]. ...
Article
Full-text available
Decision-makers across many professions are often required to make multi-objective decisions over increasingly larger volumes of data with several competing criteria. Data visualization is a powerful tool for exploring these complex solution spaces, but there is little research on its ability to support multi-objective decisions. In this paper, we explore the effects of visualization design and data volume on decision quality in multi-objective scenarios with complex trade-offs. We look at the impact of four common multidimensional chart types (scatter plot matrices, parallel coordinates, heat maps, radar charts), the number of options and dimensions, the ratio of the number of dimensions considered to the number of dimensions shown, and participant demographics on decision time and accuracy when selecting the optimal option. As objectively evaluating the quality of multi-objective decisions and the trade-offs involved is challenging, we employ rank- and score-based accuracy metrics. Our findings show that accuracy is comparable across all four visualizations, but that it improves when users are shown fewer options and consider fewer dimensions in their decision. Similarly, considering fewer dimensions imparts a speed advantage, with heat maps being the fastest among the four chart types. Participants who use charts frequently were observed to perform significantly faster, suggesting that users can potentially be trained to effectively use visualizations in their decision-making.
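Parallel coordinates, one of the four chart types compared in this study, can be produced directly from a table of design options with pandas and matplotlib. The example below uses synthetic options and an arbitrary grouping rule purely for illustration.

```python
# Sketch of a parallel coordinates chart for a synthetic set of
# multi-objective design options.
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from pandas.plotting import parallel_coordinates

rng = np.random.default_rng(3)
df = pd.DataFrame(rng.uniform(size=(40, 4)),
                  columns=["cost", "energy", "daylight", "structure"])
# Flag a handful of options so they stand out against the rest (arbitrary rule).
df["group"] = np.where(df.mean(axis=1) < 0.4, "candidate", "other")

parallel_coordinates(df, "group", color=("tab:red", "lightgray"))
plt.ylabel("normalized objective value")
plt.show()
```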
... These physicalizations are mostly architectural spaces or artistic installations, designed with data, for the purpose of conveying a message. Architects and designers now use computational design methods to leverage available data streams and generate novel forms and spatial opportunities [BM17,GR17]. Physicalizations with this representational idiom aim to provide an atmospheric experience for users while reflecting a message from their target data. ...
Preprint
Full-text available
Physical representations of data offer physical and spatial ways of looking at, navigating, and interacting with data. While digital fabrication has facilitated the creation of objects with data-driven geometry, rendering data as a physically fabricated object is still a daunting leap for many physicalization designers. Rendering in the scope of this research refers to the back-and-forth process from digital design to digital fabrication and its specific challenges. We developed a corpus of example data physicalizations from research literature and physicalization practice. This survey then unpacks the "rendering" phase of the extended InfoVis pipeline in greater detail through these examples, with the aim of identifying ways that researchers, artists, and industry practitioners "render" physicalizations using digital design and fabrication tools.
... Result of a research agreement with the Italian start-up Abitare+ involved in timber construction, the research proposes an integrated design and production process for CLT constructions, which is based on mass-customization experiences (Benros and Duarte 2009; Bergin and Steinfeld 2012; Duarte 2005), multi-objective optimization strategies (Aish and Woodbury 2005; Kolarevic and Malkawi 2005), and data-driven design (Brown and Mueller 2017). In this process, nature-inspired form-finding strategies (Bergmann and Hildebrand 2015) are used to select optimal solutions from a structural and environmental point of view, allowing the designer to visualize and evaluate thousands of design options and variations (Self and Vercruysse 2017) of the same product (Bianconi et al. 2017f) through the materialization of architecture's complexity question (Scheurer 2010). ...
Book
This book explores various digital representation strategies that could change the future of wooden architectures by blending tradition and innovation. Composed of 61 chapters, written by 153 authors hailing from 5 continents, 24 countries and 69 research centers, it addresses advanced digital modeling, with a particular focus on solutions involving generative models and dynamic value, inherent to the relation between knowing how to draw and how to build. Thanks to the potential of computing, areas like parametric design and digital manufacturing are opening exciting new avenues for the future of construction. The book’s chapters are divided into five sections that connect digital wood design to integrated approaches and generative design; to model synthesis and morphological comprehension; to lessons learned from nature and material explorations; to constructive wisdom and implementation-related challenges; and to parametric transfigurations and morphological optimizations.
... In addition, there are two limitations in the current research. First, the application of ML techniques in parametric design remains contingent on effective communication of the design intent and evaluation metrics to the machine; nevertheless, human preferences are complex and require manual attention [41]. Second, the applicability of trained algorithms in novel situations needs to be improved. ...
Article
The building envelope is a key factor affecting the energy efficiency of buildings. Often, building envelope design requires both explicit knowledge and tacit knowledge. In practice, the tacit knowledge embedded in existing envelope design cases is significantly underused. To enhance the application of such knowledge, this paper proposes a case-based reasoning (CBR) model for the decision support of building envelope design during the preliminary design stage. A case library of 100 certificated green public and commercial buildings is used to develop the model. An experiment on a test case is performed to verify the methodology and usability of the model. Furthermore, records for 25 certificated green public buildings are used to validate the effectiveness of the model. The result shows that the accuracy rate of the CBR model is 84% when considering heating and cooling demands of envelopes. In conclusion, the proposed CBR model is promising for improving the efficiency of envelope design for public and commercial buildings while reducing the reliance on expert participation.
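The retrieval step at the heart of a CBR model like this one amounts to finding the most similar precedent cases for a new design brief. The sketch below assumes a tiny, hypothetical case library with invented envelope attributes; the real model's case representation, similarity weights, and adaptation step are not reproduced here.

```python
# Sketch of the retrieval step in a case-based reasoning (CBR) workflow:
# find the most similar past envelope designs to a new design brief.
# The case library and attributes are hypothetical placeholders.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.neighbors import NearestNeighbors

# Case library: [window-wall ratio, wall U-value, roof U-value, shading depth]
cases = np.array([
    [0.35, 0.45, 0.30, 0.6],
    [0.50, 0.60, 0.35, 0.3],
    [0.40, 0.35, 0.25, 0.9],
    [0.65, 0.80, 0.40, 0.0],
])

scaler = StandardScaler().fit(cases)
index = NearestNeighbors(n_neighbors=2).fit(scaler.transform(cases))

# New brief: retrieve the two most similar precedent cases for reuse/adaptation.
query = np.array([[0.45, 0.50, 0.30, 0.5]])
dist, idx = index.kneighbors(scaler.transform(query))
print(idx[0], dist[0])
```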
... However, the physical rationalisation approach is the most common in research today (Luo et al. 2018; Shi et al. 2018). The technical complexity of new physical simulation models forces designers to rely on tools whose physical principles governing the simulation are hidden in favour of a simplified interface (Gaudillière 2020; Mueller and Brown 2017). Such software with a simplified interface is accessible without any in-depth technical knowledge, be it in the field of physics or in the field of computation. ...
Conference Paper
Full-text available
Kangaroo Physics, a physical simulation engine, is amongst the most used form-finding tools, with nearly 500 000 downloads. Mostly resorted to by users with moderate computation skills, it provides a simplified interface for an advanced simulation tool. It is a Particle Spring System relying on the Dynamic Relaxation method and offering a wide design space. Thanks to the visual scripting interface provided by Grasshopper, the user has access to a fixed set of physical "goals" and unitless variables, without having to work with more complex aspects of the Kangaroo physical model. This setup induces a disconnection between the user and the physical model with its variables. The goal of this research is to introduce, within the Grasshopper environment, a tensile parameter, the Young's modulus, into the Kangaroo model. Thus, while preserving the design freedom of the plug-in, a better understanding of the physical behaviour modelled in Kangaroo is offered to neophytes, as well as better control of material properties.
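The underlying idea, giving a particle-spring element a stiffness derived from a material's Young's modulus, follows from the standard axial-bar relation k = EA/L0. The sketch below illustrates that relation generically; it is not Kangaroo's internal implementation.

```python
# Generic axial-bar relation for a particle-spring element: k = E*A / L0.
# Illustrative only; not Kangaroo's internal code.
def spring_stiffness(youngs_modulus_pa, cross_section_m2, rest_length_m):
    """Axial stiffness of a bar-like spring element, in N/m."""
    return youngs_modulus_pa * cross_section_m2 / rest_length_m

def spring_force(stiffness_n_per_m, current_length_m, rest_length_m):
    """Restoring force for the current elongation (positive = tension), in N."""
    return stiffness_n_per_m * (current_length_m - rest_length_m)

# Example: a 2 m steel cable segment with a 100 mm2 cross section.
k = spring_stiffness(210e9, 100e-6, 2.0)      # 1.05e7 N/m
print(k, spring_force(k, 2.003, 2.0))         # about 31.5 kN at 3 mm elongation
```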
... Ideally, these tools should be flexible and easily integrated into existing design approaches. Technologically savvy designers already generate design space catalogs [12,13], conduct architectural optimization [14][15][16][17][18][19][20][21], integrate technical and architectural design goals [22,23], and even implement surrogate modeling and other techniques in their workflows [24][25][26][27][28][29]. ...
Article
Designers in architecture and engineering are increasingly employing parametric models linked to performance simulations to assist in early building design decisions. This context presents a clear opportunity to integrate advanced functionality for engaging with quantitative design objectives directly into computational design environments. This paper presents a toolbox for data-driven design, which draws from data science and optimization methods to enable customized workflows for early design space exploration. It then applies these approaches to a multi-objective conceptual design problem involving structural and energy performance for a long span roof with complex geometry and considerable design freedom. The case study moves from initial brainstorming through design refinement while demonstrating the advantages of flexible workflows for managing design data. Through investigation of a realistic early design prompt, this paper reveals strengths, limitations, potential pitfalls, and future opportunities for data-driven parametric design.
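One building block such a toolbox typically needs is a filter that reduces a sampled design catalog to its non-dominated options. The sketch below implements a generic Pareto filter for two minimization objectives with invented sample data; it is an illustration of the concept, not the toolbox's code.

```python
# Generic Pareto filter: keep only non-dominated options from a sampled
# design catalog, assuming all objectives are minimized. Illustrative data.
import numpy as np

def pareto_mask(objectives):
    """Boolean mask of non-dominated rows (all objectives minimized)."""
    n = len(objectives)
    mask = np.ones(n, dtype=bool)
    for i in range(n):
        if not mask[i]:
            continue
        # i is dominated if some other row is <= in all objectives and < in one.
        dominated_by = np.all(objectives <= objectives[i], axis=1) & \
                       np.any(objectives < objectives[i], axis=1)
        if dominated_by.any():
            mask[i] = False
    return mask

samples = np.random.default_rng(4).uniform(size=(500, 2))   # e.g. [weight, energy]
front = samples[pareto_mask(samples)]
print(f"{len(front)} Pareto-optimal options out of {len(samples)}")
```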
... However, that promise remains contingent on communicating the design intent and evaluation metrics to the machine. While certain design attributes, such as space accessibility, daylighting, and visual distractions, can be measured quantitatively, human preferences are still difficult to measure [16]. Limitations of current research also include a lack of the large amounts of data necessary to train the algorithms, the applicability of the trained algorithm to novel situations, and the black-box nature of the results. ...
Article
Fueled by big data, powerful and affordable computing resources, and advanced algorithms, machine learning has been explored and applied to buildings research for the past decades and has demonstrated its potential to enhance building performance. This study systematically surveyed how machine learning has been applied at different stages of building life cycle. By conducting a literature search on the Web of Knowledge platform, we found 9579 papers in this field and selected 153 papers for an in-depth review. The number of published papers is increasing year by year, with a focus on building design, operation, and control. However, no study was found using machine learning in building commissioning. There are successful pilot studies on fault detection and diagnosis of HVAC equipment and systems, load prediction, energy baseline estimate, load shape clustering, occupancy prediction, and learning occupant behaviors and energy use patterns. None of the existing studies were adopted broadly by the building industry, due to common challenges including (1) lack of large scale labeled data to train and validate the model, (2) lack of model transferability, which limits a model trained with one data-rich building to be used in another building with limited data, (3) lack of strong justification of costs and benefits of deploying machine learning, and (4) the performance might not be reliable and robust for the stated goals, as the method might work for some buildings but could not be generalized to others. Findings from the study can inform future machine learning research to improve occupant comfort, energy efficiency, demand flexibility, and resilience of buildings, as well as to inspire young researchers in the field to explore multidisciplinary approaches that integrate building science, computing science, data science, and social science.
... However, the combination of data visualization and catalog presentation has become an effective way to enhance the decision-making process in architectural design [35]. Furthermore, design space catalogs, which present a collection of different options for selection by a human designer, have become commonplace in architecture [36]. Significant examples are the catalog built by Heinz Isler for his concrete shells [37] or the design space created by Greg Lynn for his Embryological house [38]. ...
Article
The research proposes a model for mass-customized housing in the emerging context of Industry 4.0, promoted by the European Union as a means of technological and industrial innovation. With the aim of developing a cross-laminated timber (CLT) model for the Architecture, Engineering, and Construction (AEC) industry, the study investigates the possibility of using generative models and evolutionary principles to inform the customization process in the early stage of design. By trying to bring the latest innovations in computer science and information technology to customers who typically are not proficient with algorithmic design and computation, the research builds an intuitive interface that allows customers to explore different design solutions. Related to the scale of a single-family house, this model is intended to be used as a decision support system for the design of residential and emergency homes in central Italy.
... Result of a research agreement with the Italian start-up Abitare+ involved in timber construction, the research proposes an integrated design and production process for CLT constructions, which is based on mass-customization experiences (Benros and Duarte 2009; Bergin and Steinfeld 2012; Duarte 2005), multi-objective optimization strategies (Aish and Woodbury 2005; Kolarevic and Malkawi 2005), and data-driven design (Brown and Mueller 2017). In this process, nature-inspired form-finding strategies (Bergmann and Hildebrand 2015) are used to select optimal solutions from a structural and environmental point of view, allowing the designer to visualize and evaluate thousands of design options and variations (Self and Vercruysse 2017) of the same product (Bianconi et al. 2017f) through the materialization of architecture's complexity question (Scheurer 2010). ...
Chapter
The contemporary development and digital culture in architecture, from the idea to the realization, lead to a rewriting of the coordinates of the deep relation between model and pre-figuration, especially in the field of timber structures. Artificial intelligence has opened new potentialities that rewrite project paths through computational design, with the model set as the place of simulation and experimentation, in order to locate solutions for the ever higher demands made of architecture. Wood's natural intelligence inspires the principles of artificial intelligence and is projected as the new frontier of the research, in its possibility of defining optimized solutions, also as a function of multiple objectives and parameters. Wooden architecture design, correlated with a history of tradition established on descriptive geometry, today finds multiple fields of application for research. In this sense, representation supports knowledge and innovation, able to continue and express its operative aspect full of culture and, at the same time, its sense of tecné, which etymologically means art and technique. The present chapter shows different ways to apply the contemporary principles of descriptive geometry in digital wood design research, in a multidisciplinary and contaminated learning environment. In all the illustrated cases, generative design has a central role, in an integration addressed to the need for optimization of architectural form, using genetic algorithms in order to analyze and understand the relationship between form, geometry, and construction.
... If this technique is applied directly to architectural conceptual design, architects may choose to explore the design space immediately after the surrogate model is built, either through optimization or using sliders to select different variable combinations and gain real-time feedback about the performance response of each possibility [20,21]. However, the initial variable selection and problem setup may be inadequate for an assortment of reasons. While considering the setup process itself to be iterative, there are additional analyses that can provide further information and value to the designer at this early stage. ...
Article
Many architectural designers recognize the potential of parametric models as a worthwhile approach to performance-driven design. A variety of performance simulations are now possible within computational design environments, and the framework of design space exploration allows users to generate and navigate various possibilities while considering both qualitative and quantitative feedback. At the same time, it can be difficult to formulate a parametric design space in a way that leads to compelling solutions and does not limit flexibility. This article proposes and tests the extension of machine learning and data analysis techniques to early problem setup in order to interrogate, modify, relate, transform, and automatically generate design variables for architectural investigations. Through analysis of two case studies involving structure and daylight, this article demonstrates initial workflows for determining variable importance, finding overall control sliders that relate directly to performance and automatically generating meaningful variables for specific typologies.
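The variable-importance idea described in this abstract can be approximated by fitting a model to sampled (design variables, performance) pairs and ranking the inputs. The sketch below uses an invented dataset and a random-forest importance measure as one possible proxy; the article's own workflows and metrics may differ.

```python
# Sketch of variable-importance analysis: fit a model to sampled
# (design variables -> performance) data and rank the variables.
# Data and variable names are hypothetical placeholders.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(5)
X = pd.DataFrame(rng.uniform(size=(500, 4)),
                 columns=["roof_curvature", "column_spacing", "glazing_ratio", "orientation"])
# Stand-in performance: mostly driven by two of the four variables.
y = 3 * X["column_spacing"] + 1.5 * X["glazing_ratio"] ** 2 + rng.normal(0, 0.05, 500)

model = RandomForestRegressor(n_estimators=300, random_state=5).fit(X, y)
importance = pd.Series(model.feature_importances_, index=X.columns).sort_values(ascending=False)
print(importance)   # column_spacing and glazing_ratio should dominate
```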
... A building can be visualised as data and additional information like daylighting results can be imported and visualised as false colour maps to study design performance. Compared to traditional approaches, BPS is more readily incorporated into modern workflows, increasing awareness and appreciation towards quantitative building data (Brown and Mueller, 2017). The success of BPO as a design tool that can explore large design spaces, hinges upon data communication (Vierlinger, 2013). ...
Conference Paper
Full-text available
This paper describes a visualisation framework for organising data outputs generated from a complex multi-objective optimisation design space. The three-part visualisation framework is structured to identify solution clusters along the Pareto front before providing designers with design space navigation control through genotype value displays. These provide smaller focus design spaces, which are then supported by spatial displays of a series of solution phenotypes and their respective performance simulations. This visualisation tool aims to provide a diverse range of quantitatively high-performance design solutions, determined through building performance simulations, that can support subsequent higher-order qualitative design evaluation such as design aesthetics.
Article
Designers in the built-environment disciplines increasingly solve problems using generative design methods, which promise novel and performant solutions to design problems but produce large design spaces that are challenging to explore. Design Space Exploration (DSE) interfaces have been used to understand, refine and narrow design spaces. Still, a critical analysis of current DSE interfaces shows a gap between their features and how designers explore and make decisions. We conducted a design study with domain experts to develop a DSE interface (DesignSense) that tightly integrates and adds to several features found separately in current DSE systems. We present a formative focus group evaluation, which suggested more areas for improvement and highlighted the need to distinguish designers from scientists as two user groups of DSE systems with varying needs, amongst other findings.
Article
Full-text available
The design of buildings has become a complex and multidisciplinary problem involving multiple conflicting objectives as architects and designers address competing technical, economic, environmental, and societal concerns. This has been driving research in Architecture, Engineering, and Construction (AEC) toward rigorous multidisciplinary decision-making frameworks that generate and evaluate numerous design alternatives using multi-objective optimization in concert with simulation and analysis models of varying fidelity and computational expense. While such frameworks are well known and widely employed in the aerospace and systems engineering domains, efforts by design professionals and researchers in the AEC field are scattered at best. In this paper, we provide a detailed review of recent developments in optimization frameworks in the AEC field and subsequently highlight how such developments are largely compartmentalized into separate domains such as structural, energy, daylighting, and other performance factors. We further discuss the technical challenges involved in concurrent coupled multidisciplinary design optimization (MDO) in the AEC field such as interoperability issues between Building Information Modeling (BIM) environments, analysis/simulation tools, and optimization frameworks. We conclude by outlining research needed for more unified modeling and simulation-based optimization frameworks to aid in complex and multidisciplinary building design processes. We also highlight the need for the identification and development of multi-fidelity simulation tools for use across multiple design phases. As such, this paper contributes a novel roadmap to leverage aerospace and systems engineering research in MDO into the field of AEC, along with a call for researchers in the MDO community to seek collaborations in AEC field.
Conference Paper
Full-text available
Many recent contributions in computational structural design have argued that design quality can be improved when performance feedback and guidance are part of the conceptual design process. However, the effect of multi-objective feedback and guidance tools has not been studied extensively. This paper presents the results of an educational study that tests the direct relationship between conceptual design tools and the simulated performance of resulting designs. In the study, students were tasked with designing a restaurant canopy roof using a series of increasingly performance-driven computational design tools. Although there was no consensus on preferred workflows or aesthetic preferences, the average designs chosen using real time feedback or directed optimization performed significantly better in terms of deflection and emissions than those chosen through free exploration. Overall, this research establishes a link between design tools and performance outcomes, while strengthening the argument for further integration of performance feedback into early stage design processes.
Conference Paper
Full-text available
In architectural and structural design, current modeling and analysis tools are extremely powerful and allow one to generate and analyze virtually any structural shape. However, most of them do not allow designers to integrate structural performance as an objective during conceptual design. As structural performance is highly linked to architectural geometry, there is a need for computational strategies allowing for performance-oriented structural design in architecture. In order to address these issues, the research presented in this paper combines interactive evolutionary optimization and parametric modeling to develop a new computational strategy for creative and high-performance conceptual structural design. Parametric modeling allows for quick exploration of complex geometries and can be combined with analysis and optimization algorithms for performance-driven design. However, this methodology often limits the designer's authorship, since it is based on the use of black-box optimizers. On the other hand, interactive evolutionary optimization empowers the user by acknowledging his or her input as fundamental and includes it in the evolutionary optimization process. This approach aims at improving the structural performance of a concept without limiting the creative freedom of designers. Taking advantage of the two frameworks, this research implements an interactive evolutionary structural optimization framework in the widely used parametric modeling environment constituted by Rhinoceros and Grasshopper (Robert McNeel & Associates [15], [16]). The implemented design tool capitalizes on Grasshopper's versatility for geometry generation but supplements the visual programming interface with a flexible GUI portal, increasing the designer's creative freedom through enhanced interactivity. The tool can accommodate a wide range of structural typologies and geometrical forms in an integrated environment. The paper includes a description of the tool and demonstrates its applications and benefits through several conceptual design case studies.
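The key mechanism described here, letting designer input enter the evolutionary loop alongside simulated performance, can be caricatured with a tiny genetic algorithm whose fitness blends the two. Everything below is a toy stand-in (the performance proxy, the scripted "designer rating", and the GA operators), not the implemented tool.

```python
# Toy interactive-evolution sketch: selection is driven by a blend of a
# simulated performance proxy and a (here scripted) designer rating.
import random

def performance(design):                 # lower is better (e.g. deflection proxy)
    return sum((g - 0.6) ** 2 for g in design)

def designer_rating(design):             # stand-in for interactive user input, 0..1
    return 1.0 - abs(design[0] - design[1])   # e.g. a preference for symmetry

def fitness(design, w=0.5):
    # Blend normalized performance with the designer's qualitative score.
    return w * (1.0 / (1.0 + performance(design))) + (1 - w) * designer_rating(design)

random.seed(0)
population = [[random.random() for _ in range(4)] for _ in range(30)]
for generation in range(20):
    population.sort(key=fitness, reverse=True)
    parents = population[:10]
    children = []
    for _ in range(20):
        a, b = random.sample(parents, 2)
        # Blend crossover with small Gaussian mutation.
        children.append([(x + y) / 2 + random.gauss(0, 0.05) for x, y in zip(a, b)])
    population = parents + children
print(max(population, key=fitness))
```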
Conference Paper
Full-text available
This paper presents Opossum, a new optimization plug-in for Grasshopper, a visual data-flow modelling software popular among architects. Opossum is the first publicly available, model-based optimization tool aimed at architectural design optimization and is especially applicable to problems that involve time-intensive simulations of, for example, daylighting and building energy. The paper details Opossum's design and implementation and compares its performance to four single-objective solvers and one multi-objective solver. The test problem is time-intensive and simulation-based: optimizing a screened façade for daylight and glare. Opossum outperforms the other single-objective solvers and finds the most accurate approximation of the Pareto front.
Article
Full-text available
Parametric modelling software often maintains an explicit history of design development in the form of a graph. However, as the graph increases in complexity it quickly becomes inflexible and unsuitable for exploring a wide design space. By contrast, implicit low-level rule systems can offer wide design exploration due to their lack of structure, but often act as black boxes to human observers with only initial conditions and final designs cognisable. In response to these two extremes, the authors propose a new approach called Meta-Parametric Design, combining graph-based parametric modelling with genetic programming. The advantages of this approach are demonstrated using two real case-study projects that widen design exploration whilst maintaining the benefits of a graph representation.
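As a rough illustration of combining a graph-like parametric definition with genetic programming, the sketch below represents a tiny parametric expression as a tree and mutates both its parameters and its topology. The function set, terminals, depth limits, and selection step are illustrative assumptions, not the authors' implementation.

```python
import random

# A tiny parametric "definition" is an expression tree mapping a driver t to a geometric value.
FUNCS = {"add": lambda a, b: a + b, "mul": lambda a, b: a * b}
TERMS = ["t", 0.5, 1.0, 2.0]

def random_tree(depth=2):
    if depth <= 0 or random.random() < 0.3:
        return random.choice(TERMS)
    return (random.choice(list(FUNCS)), random_tree(depth - 1), random_tree(depth - 1))

def evaluate(node, t):
    if node == "t":
        return t
    if isinstance(node, (int, float)):
        return node
    f, a, b = node
    return FUNCS[f](evaluate(a, t), evaluate(b, t))

def mutate(node, depth=2):
    if random.random() < 0.2:          # topological mutation: replace the whole subtree
        return random_tree(depth)
    if isinstance(node, tuple):        # otherwise recurse into the children
        f, a, b = node
        return (f, mutate(a, depth - 1), mutate(b, depth - 1))
    return node                        # terminals (parameters) left unchanged here

population = [random_tree() for _ in range(6)]
parents = population[:2]               # stand-in for a designer's artificial selection
children = [mutate(random.choice(parents)) for _ in range(6)]
print(children[0], "->", evaluate(children[0], t=0.5))
```

Because mutation can replace whole subtrees, the search is not confined to the parameters of a fixed graph topology.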
Conference Paper
Full-text available
Evolutionary methods with artificial selection have been shown to be a useful computer-human technique for exploring wide design spaces with unknown goals. This paper investigates a similar approach in the evolution of visual programs currently used in popular parametric modelling software. Although associative models provide a useful cognitive artifact for the designer to interact with, they are often bound by their topological structure, leaving the designer to adjust (or optimise) variables to explore the design space. By allowing the topological structure of the graph to be evolved as well as the parameters, artificial selection can be employed to explore a wider design space better suited to the early design stage.
Conference Paper
Full-text available
This paper proposes a novel approach for comparing different structural geometries based on data visualisation and illustrates this with bowstring bridges as a case study. The aim of the approach is to provide a quick, overall insight into different design proposals at the beginning of the design process. The approach uses a dashboard composed of multiple charts. This dashboard enables the (structural) designer to have all information on the structural behaviour of a series of design proposals at hand and to make informed design decisions based on this compact overview.
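A dashboard of this kind can be prototyped with a few juxtaposed charts. The sketch below uses matplotlib with entirely hypothetical bridge results and units, placing per-proposal bar charts next to a trade-off scatter plot so several proposals can be compared at a glance.

```python
import matplotlib.pyplot as plt
import numpy as np

# Hypothetical results for a handful of bowstring-bridge design proposals (values and units assumed).
names = ["A", "B", "C", "D"]
mass = np.array([120, 95, 140, 110])        # structural mass [t]
deflection = np.array([45, 60, 38, 52])     # midspan deflection [mm]

fig, axes = plt.subplots(1, 3, figsize=(12, 3))
axes[0].bar(names, mass)
axes[0].set_title("Structural mass [t]")
axes[1].bar(names, deflection)
axes[1].set_title("Midspan deflection [mm]")
axes[2].scatter(mass, deflection)
for n, x, y in zip(names, mass, deflection):
    axes[2].annotate(n, (x, y))             # label each proposal in the trade-off view
axes[2].set_xlabel("mass [t]")
axes[2].set_ylabel("deflection [mm]")
axes[2].set_title("Trade-off view")
plt.tight_layout()
plt.show()
```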
Article
Full-text available
This paper explores the use of data-driven approximation algorithms, often called surrogate modeling, in the early-stage design of structures. The use of surrogate models to rapidly evaluate design performance can lead to a more in-depth exploration of a design space and reduce computational time of optimization algorithms. While this approach has been widely developed and used in related disciplines such as aerospace engineering, there are few examples of its application in civil engineering. This paper focuses on the general use of surrogate modeling in the design of civil structures and examines six model types that span a wide range of characteristics. Original contributions include novel metrics and visualization techniques for understanding model error and a new robustness framework that accounts for variability in model comparison. These concepts are applied to a multi-objective case study of an airport terminal design that considers both structural material volume and operational energy consumption.
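The sketch below shows, in a generic way, how several surrogate model types can be compared on the same design data using cross-validated error. The synthetic response function, the chosen model types, and the R^2 metric are illustrative assumptions rather than the six models and novel metrics examined in the paper.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
X = rng.random((200, 4))                                                   # design variables
y = X[:, 0] ** 2 + np.sin(5 * X[:, 1]) + 0.1 * rng.standard_normal(200)   # stand-in structural response

models = {
    "linear": LinearRegression(),
    "random forest": RandomForestRegressor(n_estimators=100, random_state=0),
    "gaussian process": GaussianProcessRegressor(),
    "neural network": MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5, scoring="r2")              # 5-fold cross-validated error
    print(f"{name:>17s}: mean R^2 = {scores.mean():.3f} +/- {scores.std():.3f}")
```

Repeating the comparison across folds is a simple way to expose the variability in model error that motivates the paper's robustness framework.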
Article
Full-text available
Climate change, resource depletion, and worldwide urbanization feed the demand for more energy and resource-efficient buildings. Increasingly, architectural designers and consultants analyze building designs with easy-to-use simulation tools. To identify design alternatives with good performance, designers often turn to optimization methods. Randomized, metaheuristic methods such as genetic algorithms are popular in the architectural design field. However, are metaheuristics the best approach for architectural design problems that often are complex and ill defined? Metaheuristics may find solutions for well-defined problems, but they do not contribute to a better understanding of a complex design problem. This paper proposes surrogate-based optimization as a method that promotes understanding of the design problem. The surrogate method interpolates a mathematical model from data that relate design parameters to performance criteria. Designers can interact with this model to explore the approximate impact of changing design variables. We apply the radial basis function method, a specific type of surrogate model, to two architectural daylight optimization problems. These case studies, along with results from computational experiments, serve to discuss several advantages of surrogate models. First, surrogate models not only propose good solutions but also allow designers to address issues outside of the formulation of the optimization problem. Instead of accepting a solution presented by the optimization process, designers can improve their understanding of the design problem by interacting with the model. Second, a related advantage is that designers can quickly construct surrogate models from existing simulation results and other knowledge they might possess about the design problem. Designers can thus explore the impact of different evaluation criteria by constructing several models from the same set of data. They also can create models from approximate data and later refine them with more precise simulations. Third, surrogate-based methods typically find global optima orders of magnitude faster than genetic algorithms, especially when the evaluation of design variants requires time-intensive simulations.
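As a minimal example of the radial basis function approach described here, the sketch below builds an RBF surrogate from a handful of pre-computed (here invented) simulation results and then queries it across a grid of unsimulated design variants. The design variables, metric values, and grid resolution are assumptions for illustration.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Assume a few daylight simulations have already been run at these design points
# (two variables, e.g. louver depth and spacing, both normalized to [0, 1]); values are hypothetical.
X_sim = np.array([[0.1, 0.2], [0.3, 0.8], [0.5, 0.5], [0.7, 0.1], [0.9, 0.9], [0.2, 0.6]])
daylight = np.array([0.42, 0.55, 0.61, 0.48, 0.35, 0.58])

surrogate = RBFInterpolator(X_sim, daylight)      # radial basis function model of the existing data

# Designers can now query the approximate performance of unsimulated variants instantly.
grid = np.stack(np.meshgrid(np.linspace(0, 1, 21), np.linspace(0, 1, 21)), axis=-1).reshape(-1, 2)
approx = surrogate(grid)
best = grid[np.argmax(approx)]
print("best approximate daylight metric", approx.max(), "at design", best)
```

Because the model is built from existing results, it can be refitted with different data or criteria without rerunning any simulation, which is the interactive quality the paper emphasizes.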
Conference Paper
Full-text available
This paper outlines recent efforts by the author to integrate modern web-based data visualisation techniques into decision-making for structural design projects. It traces the development of data visualisation, detailing powerful new techniques that exist in modern web browsers and are already used by many other data-rich professions. The underlying technology is explained in comparison with current engineering visualisation equivalents, and the benefits and potential applications of this technology to design engineering are discussed. Finally, using real design problems, example applications of these methods to structural engineering decision-making support are described, showing how they support large-scale, option-based decision making with structural engineering data.
Article
Full-text available
This paper addresses the need to consider both quantitative performance goals and qualitative requirements in conceptual design. A new computational approach for design space exploration is proposed that extends existing interactive evolutionary algorithms for increased inclusion of designer preferences, overcoming the weaknesses of traditional optimization that have limited its use in practice. This approach allows designers to set the evolutionary parameters of mutation rate and generation size, in addition to parent selection, in order to steer design space exploration. This paper demonstrates the potential of this approach through a numerical parametric study, a software implementation, and a series of case studies.
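A bare-bones version of such a designer-steered loop might look like the Python sketch below, where parent selection, mutation rate, and generation size are typed in by the designer each generation. The design representation and mutation operator are assumptions for illustration, not the paper's implementation.

```python
import random

# Minimal sketch of a designer-steered evolutionary loop (assumed: 4 design variables in [0, 1]).
def mutate(design, rate):
    return [min(1.0, max(0.0, v + random.gauss(0, rate))) for v in design]

population = [[random.random() for _ in range(4)] for _ in range(8)]

for generation in range(3):
    for i, design in enumerate(population):
        print(generation, i, [round(v, 2) for v in design])
    # The designer, not the algorithm, chooses the parents and the exploration settings.
    chosen = [int(s) for s in input("parent indices (e.g. 0 3): ").split()]
    rate = float(input("mutation rate (e.g. 0.1): "))
    size = int(input("generation size (e.g. 8): "))
    parents = [population[i] for i in chosen]
    population = [mutate(random.choice(parents), rate) for _ in range(size)]
```

Raising the mutation rate widens exploration, while a larger generation size gives the designer more candidates to judge per cycle.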
Article
Full-text available
One of the open problems in Semantic Web research is which tools should be provided to users to explore linked data. This is even more urgent now that massive amounts of linked data are being released by governments worldwide. The development of single dedicated visualization applications is increasing, but the problem of exploring unknown linked data to gain a good understanding of what it contains is still open. An effective generic solution must take into account the user's point of view, their tasks and interaction, as well as the system's capabilities and the technical constraints the technology imposes. This paper is a first step in understanding the implications of both user and system by evaluating our dashboard-based approach. Though we observe high user acceptance of the dashboard approach, our paper also highlights technical challenges, arising from the complexity of current infrastructure, that need to be addressed when visualising linked data. In light of the findings, guidelines for the development of linked data visualization (and manipulation) tools are provided.
Conference Paper
Full-text available
We have developed a data visualization interface that facilitates a design by shopping paradigm, allowing a decision-maker to form a preference by viewing a rich set of good designs and use this preference to choose an optimal design. Design automation has allowed us to implement this paradigm, since a large number of designs can be synthesized in a short period of time. The interface allows users to visualize complex design spaces by using multi-dimensional visualization techniques that include customizable glyph plots, parallel coordinates, linked views, brushing, and histograms. As is common with data mining tools, the user can specify upper and lower bounds on the design space variables, assign variables to glyph axes and parallel coordinate plots, and dynamically brush variables. Additionally, preference shading for visualizing a user’s preference structure and algorithms for visualizing the Pareto frontier have been incorporated into the interface to help shape a decision-maker’s preference. Use of the interface is demonstrated using a satellite design example by highlighting different preference structures and resulting Pareto frontiers. The capabilities of the design by shopping interface were driven by real industrial customer needs, and the interface was demonstrated at a spacecraft design conducted by a team at Lockheed Martin, consisting of Mars spacecraft design experts.
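The sketch below gives a minimal flavour of such an interface in Python: randomly sampled designs are split into Pareto-optimal and dominated sets for two (assumed) minimization objectives and shown on a parallel-coordinates plot. The variables, data, and objectives are invented for illustration; the real interface adds glyph plots, brushing, preference shading, and linked views.

```python
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from pandas.plotting import parallel_coordinates

rng = np.random.default_rng(2)
df = pd.DataFrame(rng.random((60, 4)), columns=["mass", "cost", "power", "coverage"])

# Tag each design as Pareto-optimal or dominated, minimizing mass and cost.
def non_dominated(row, data):
    better_or_equal = (data["mass"] <= row["mass"]) & (data["cost"] <= row["cost"])
    strictly_better = (data["mass"] < row["mass"]) | (data["cost"] < row["cost"])
    return not (better_or_equal & strictly_better).any()

df["set"] = ["Pareto" if non_dominated(r, df) else "dominated" for _, r in df.iterrows()]

# One axis per design variable/objective; each polyline is one candidate design.
parallel_coordinates(df, class_column="set")
plt.title("Design-by-shopping view of sampled designs")
plt.show()
```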
Conference Paper
Full-text available
During the past decade the construction industry has been witnessing a constant shift in the way it operates. Advances in technology have made possible the adoption of a more direct, performance-driven design approach based on multi-objective, and sometimes contradictory, criteria of environmental, structural, economic and aesthetic impact. As a consequence, the various teams of consultants involved in the process no longer inform it consecutively, forcing various changes at different stages of the design. Instead, building projects increasingly comprise numerous design issues that can be delegated to small groupings of architects, engineers, and consultants to be resolved simultaneously, in parallel. In light of this new status quo, the significance of new customized simulation tools and interfaces, capable of providing near real-time feedback and driven by multiple input criteria, looms as a potential game changer for the industry. This paper outlines the advances implemented by the authors to support these new integrated workflows.
Article
Full-text available
Spatial structures often embody generative systems. Both analog (physical modeling) and computational methods have been used to explore the range of design possibilities. Whereas many of the favored physical modeling techniques, such as soap films or catenary nets, inherently generate forms based on certain performative properties, many of the parametric form-generating computational methods derive form based solely on geometry, detached from physical performance. ParaGen has been developed as a tool to explore parametric geometry based on aspects of performance. Within the cyclic structure of a genetic algorithm, it incorporates parametric geometry generation, simulation for performance evaluation, and the ability to sort and compare a wide range of solutions based on single or multiple objectives. The results can be visually compared by teams of designers across a graphic web interface which includes the potential for human interaction in parent selection and breeding of further designs. The result is a tool which allows the exploration of the generative design space based on performance as well as visual criteria.
Chapter
Full-text available
This paper looks at a new way of incorporating unsupervised neural networks in the design of an architectural system. The approach involves considering the whole lifecycle of a building and its coupling with its environment. It is argued that techniques such as dimensionality reduction are well suited to architectural design, where complex problems are commonplace. An example project is explored, that of a reconfigurable exhibition space where multiple ephemeral exhibitions are housed at any given time. A modified growing neural gas algorithm is employed to recognize similarities between dynamic spatial arrangements whose nature is not known a priori. By utilising the machine in combination with user feedback, a coupling between the building system and the users of the space is achieved throughout the whole system lifecycle.
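As a simplified stand-in for the unsupervised learning described here (the paper uses a modified growing neural gas, not PCA), the sketch below reduces hypothetical encodings of exhibition-space configurations to two dimensions so that similar spatial arrangements end up close together. The encoding and the data are assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical snapshots of a reconfigurable exhibition space: each row records which of
# 30 modular zones are occupied (1) or open (0) at a given moment.
rng = np.random.default_rng(5)
snapshots = (rng.random((200, 30)) > 0.6).astype(float)

# Unsupervised dimensionality reduction places similar arrangements near each other,
# where they can be inspected or coupled with user feedback.
embedding = PCA(n_components=2).fit_transform(snapshots)
print(embedding.shape, embedding[:3])
```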
Chapter
Full-text available
The research project presented in this paper deals with the development of a creative evolutionary design methodology for layout problems in architecture and urban planning. To date, many optimisation techniques for layout problems have already been developed. The first attempts to automate layout were undertaken back in the early 1960s. Since then, these ideas have been taken forward in various manifestations, for example shape grammars, CBS, cellular automata and evolutionary approaches. These projects, however, are mostly restricted to very specific fields or neglect the creative, designerly component. Since pure optimisation methods are of little practical use for design purposes, there have been no useful attempts to derive a universally applicable method for computer-aided layout design. For this, we need to be aware that designing is a process that occurs at different levels and degrees of abstraction. The solution space is explored in the realm between intuition and rationality in a variety of ways. Good solutions can only arise through an intensive and fluid dialogue between the designer and the generating system. The goal of our project is to develop an adaptive design system for layout problems. To this end we examine different approaches to achieving the best possible general applicability of such a system and discuss criteria that are crucial for the development of such systems.
Chapter
Full-text available
The purpose of BARCODE HOUSING SYSTEM, a research project developed over the last four years, has been to create an Internet-based system which facilitates the interaction of the different actors involved in the design, construction and use of affordable housing built with industrialized methods. One of the components of the system is an environment which enables different users (architects, clients, developers) to retrieve the housing units generated by a rule-based engine and stored in a repository. Currently, the repository contains over 10,000 housing units. In order to access this information, we have developed clustering techniques based on self-organizing maps and k-means methods.
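A reduced sketch of the k-means part of such a repository browser is shown below: hypothetical feature vectors stand in for the generated housing units, and each cluster is summarized by the unit closest to its centroid. The feature choices and counts are assumptions, and the self-organizing-map component is not reproduced here.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical feature vectors describing generated housing units
# (e.g. floor area, number of rooms, facade length, daylight score), one row per unit.
rng = np.random.default_rng(3)
features = rng.random((10_000, 4))

kmeans = KMeans(n_clusters=12, n_init=10, random_state=0).fit(features)

# Each cluster can be represented in the browser by the unit closest to its centroid.
labels = kmeans.labels_
for c in range(12):
    members = np.flatnonzero(labels == c)
    d = np.linalg.norm(features[members] - kmeans.cluster_centers_[c], axis=1)
    print(f"cluster {c:2d}: {len(members):5d} units, representative unit index {members[np.argmin(d)]}")
```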
Conference Paper
Full-text available
Parametric design systems model a design as a constrained collection of schemata. Designers work in such systems at two levels: definition of schemata and constraints; and search within a schema collection for meaningful instances. Propagation-based systems yield efficient algorithms that are complete within their domain, require explicit specification of a directed acyclic constraint graph and allow relatively simple debugging strategies based on antecedents and consequents. The requirement to order constraints appears to be useful in expressing specific designer intentions and in disambiguating interaction. A key feature of such systems in practice appears to be a need for multiple views onto the constraint model and simultaneous interaction across views. We describe one multiple-view structure, its development and refinement through a large group of architecture practitioners and its realization in the system Generative Components.
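The propagation behaviour described here can be illustrated with a few lines of Python: a toy acyclic constraint graph is ordered topologically and each schema value is recomputed from its antecedents. The node names and formulas are invented and bear no relation to Generative Components' actual data model.

```python
from graphlib import TopologicalSorter

# A toy directed acyclic constraint graph: each node is computed from its antecedents.
rules = {
    "span":    (lambda v: 30.0, []),
    "rise":    (lambda v: 0.15 * v["span"], ["span"]),
    "arc_len": (lambda v: 1.1 * v["span"] + 2 * v["rise"], ["span", "rise"]),
}

def propagate(rules):
    # Topological order guarantees antecedents are evaluated before their consequents.
    order = TopologicalSorter({name: deps for name, (_, deps) in rules.items()}).static_order()
    values = {}
    for name in order:
        func, _ = rules[name]
        values[name] = func(values)
    return values

print(propagate(rules))
```

Changing a driving value such as "span" and re-running the propagation updates every dependent quantity, which is the debugging-friendly behaviour the abstract attributes to propagation-based systems.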
Article
There is a growing expectation for high performance design in architecture which negotiates between the requirements of the client and the physical constraints of a building site. Clients for building projects often challenge architects to maximize view quality since it can significantly increase real estate value. To pursue this challenge, architects typically move through several design revision cycles to identify a set of design options which satisfy these view quality expectations in coordination with other goals of the project. However, reviewing a large quantity of design options within the practical time constraints is challenging due to the limitations of existing tools for view performance evaluation. These challenges include flexibility in the definition of view quality and the ability to handle the expensive computation involved in assessing both the view quality and the exploration of a large number of possible design options. To address these challenges, we propose a catalogue-based framework that enables the interactive exploration of conceptual building design options based on adjustable view preferences. We achieve this by integrating a flexible mechanism to combine different view measures with an indexing scheme for view computation that achieves high performance and precision. Furthermore, the combined view measures are then used to model the building design space as a high dimensional scalar function. The topological features of this function are then used as candidate building designs. Finally, we propose an interactive design catalogue for the exploration of potential building designs based on the given view preferences. We demonstrate the effectiveness of our approach through two use case scenarios to assess view potential and explore conceptual building designs on sites with high development likelihood in Manhattan, New York City.
Article
How might it be possible to create computational systems that are sufficiently intuitive to make human experience of space a design driver? Guest-Editor Christian Derix and Prarthana Jagannath describe a series of research projects that were undertaken at the Centre for Evolutionary Computing in Architecture (CECA) at the University of East London between 1999 and 2009, which put aside a structuralist, performance-led approach in favour of new learning models based on artificial neural networks (ANNs) that have the capacity to respond to human activity.
Article
Building information modelling (BIM) is a powerful tool for clients and architects alike, particularly when clients have ongoing complex programmatic requirements. Chuck Eastman describes how with his team* at the AEC Integration Laboratory at the College of Architecture at the Georgia Institute of Technology he was commissioned by the US federal government's General Service Administration (GSA) to automate the design guidelines for all US courthouses in such a way that preliminary designs by architects could be assessed and checked against specific criteria. Copyright © 2009 John Wiley & Sons, Ltd.
Article
The design of technical systems such as automobiles and spacecraft has traditionally focused exclusively on performance maximization. Many organizations now realize that such an approach must be balanced against competing objectives of cost, risk, and other criteria. If one is willing to give up some amount of performance relative to the best achievable performance level, one introduces slack into system design. This slack can be invested in creating better outcomes overall. One way to achieve this is to balance the requirements among contributing subsystems such that the number of active constraints is minimized, while still achieving the desired system performance. This paper introduces a methodology called “isoperformance” as a means of identifying and evaluating a performance-invariant set of design solutions, which are efficient in terms of other criteria such as cost, risk, and lifecycle properties. Isoperformance is an inverse design method that starts from a desired vector of performance requirements and works backwards to identify acceptable solutions in the design space. To achieve this, gradient-based contour following is implemented as a multivariable search algorithm that manipulates the null space of the Jacobian matrix. Use of the method is illustrated with two examples from spacecraft design and human performance in sports. © 2006 Wiley Periodicals, Inc. Syst Eng 9: 45–61, 2006
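The core numerical idea, stepping in the null space of the Jacobian so that performance stays constant while other criteria can vary, can be sketched in a few lines. The toy performance function, step size, and Newton correction below are assumptions for illustration and are much simpler than the method in the paper.

```python
import numpy as np
from scipy.linalg import null_space

# Toy scalar performance of two design variables; the goal is to trace designs that all
# achieve the same performance value (an iso-performance contour).
def performance(x):
    return np.array([x[0] ** 2 + 2.0 * x[1] ** 2])

def jacobian(x):
    return np.array([[2.0 * x[0], 4.0 * x[1]]])

x = np.array([1.0, 1.0])
target = performance(x)
prev_step = None
for _ in range(200):
    step = null_space(jacobian(x))[:, 0]      # tangent direction: performance unchanged to first order
    if prev_step is not None and step @ prev_step < 0:
        step = -step                          # keep walking the same way along the contour
    prev_step = step
    x = x + 0.02 * step
    J = jacobian(x)                           # small Newton correction back onto the contour
    x = x - J.T @ np.linalg.solve(J @ J.T, performance(x) - target)

print("performance drift after 200 steps:", abs(performance(x)[0] - target[0]))
```

Every point on the traced path meets the same performance requirement, so secondary criteria such as cost or risk can then be used to choose among them.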
Article
This paper reviews the basic approximation concepts used in structural optimization. It also discusses some of the most recent developments in that area since the introduction of approximation concepts in the mid-seventies. The paper distinguishes between local, medium-range and global approximations; it covers function approximations and problem approximations. It shows that, although the lack of comparative data established on reference test cases prevents an accurate assessment, there have been significant improvements. The largest number of developments has been in the areas of local function approximations and the use of intermediate variables and response quantities. It also appears that some new methodologies are emerging that could greatly benefit from the introduction of new computer architectures.
Article
During the last few years there has been an extraordinary development of computer-aided tools intended to present or communicate the results of architectural projects. But there has not been comparable progress in the development of tools intended to help designers generate architectural forms in an easy and interactive way. Even worse, architects who use the powerful means provided by computers as a direct tool to create architectural forms are still an exception. Architecture continues to be produced by traditional means, using the computer as little more than a drafting tool. The main reasons that may explain this situation can be identified rather easily, although there will be significant differences of opinion. In my opinion, it is a mistake to try to advance too rapidly by, for instance, proposing integrated design methods using expert systems and artificial intelligence while no adequate tools to generate and modify simple 3D models are available. The modeling tools we have at present are unsatisfactory. Their principal limitation is the lack of appropriate instruments to interactively modify the model once it has been created. This is a fundamental aspect of any design activity, where the designer is constantly going back and forth, re-elaborating again and again some particular aspect of the model, or its general layout, or even returning to a previous solution that had been temporarily abandoned. This paper presents a general summary of the current situation and of recent developments that may be incorporated into architectural design tools in the near future, together with some critical remarks about their relevance to architecture.
Article
A major challenge to the successful full-scale development of modern aerospace systems is to address competing objectives such as improved performance, reduced costs, and enhanced safety. Accurate, high-fidelity models are typically time consuming and computationally expensive. Furthermore, informed decisions should be made with an understanding of the impact (global sensitivity) of the design variables on the different objectives. In this context, the so-called surrogate-based approach for analysis and optimization can play a very valuable role. The surrogates are constructed using data drawn from high-fidelity models, and provide fast approximations of the objectives and constraints at new design points, thereby making sensitivity and optimization studies feasible. This paper provides a comprehensive discussion of the fundamental issues that arise in surrogate-based analysis and optimization (SBAO), highlighting concepts, methods, techniques, as well as practical implications. The issues addressed include the selection of the loss function and regularization criteria for constructing the surrogates, design of experiments, surrogate selection and construction, sensitivity analysis, convergence, and optimization. The multi-objective optimal design of a liquid rocket injector is presented to highlight the state of the art and to help guide future efforts.
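A compressed end-to-end illustration of the surrogate-based workflow (design of experiments, surrogate construction, prediction at a new design point) is sketched below with a Latin hypercube sample and a quadratic response surface. The toy analysis function, bounds, and sample size are assumptions, and the acquisition, sensitivity, and convergence aspects the paper discusses are omitted.

```python
import numpy as np
from scipy.stats import qmc
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

# Design of experiments: Latin hypercube sample of 3 design variables within assumed bounds.
sampler = qmc.LatinHypercube(d=3, seed=0)
X = qmc.scale(sampler.random(n=60), l_bounds=[0.0, 0.0, 0.0], u_bounds=[1.0, 2.0, 5.0])

def high_fidelity(x):                   # stand-in for an expensive analysis code
    return x[0] ** 2 + 0.5 * x[1] * x[2] + np.sin(x[2])

y = np.array([high_fidelity(x) for x in X])

# Quadratic response-surface surrogate fit to the sampled data.
surrogate = make_pipeline(PolynomialFeatures(degree=2), LinearRegression()).fit(X, y)

x_new = np.array([[0.5, 1.0, 2.5]])
print("surrogate:", surrogate.predict(x_new)[0], "high fidelity:", high_fidelity(x_new[0]))
```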
Article
Building optimization involving multiple objectives is generally an extremely time-consuming process. The GAINN approach presented in this study first uses a simulation-based Artificial Neural Network (ANN) to characterize building behaviour, and then combines this ANN with a multiobjective Genetic Algorithm (NSGA-II) for optimization. The methodology has been used in the current study for the optimization of thermal comfort and energy consumption in a residential house. Results of ANN training and validation are first discussed. Two optimizations were then conducted taking variables from HVAC system settings, thermostat programming, and passive solar design. By integrating ANN into optimization the total simulation time was considerably reduced compared to classical optimization methodology. Results of the optimizations showed significant reduction in terms of energy consumption as well as improvement in thermal comfort. Finally, thanks to the multiobjective approach, dozens of potential designs were revealed, with a wide range of trade-offs between thermal comfort and energy consumption.
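The sketch below mimics the spirit of this pipeline in Python: a small neural network is trained on (synthetic) simulation data and then used to screen many cheap candidates, from which a Pareto front over two objectives is extracted. The objective functions and data are invented, and a simple non-dominated filter stands in for the NSGA-II algorithm used in the study.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(4)

# Stand-ins for building simulations: energy use and thermal discomfort as functions of
# three design/operation variables. These functions and variable meanings are assumptions.
def energy(x):     return 2.0 * x[:, 0] + x[:, 1] ** 2 + 0.5 * x[:, 2]
def discomfort(x): return (1 - x[:, 0]) ** 2 + 0.3 * x[:, 2]

X = rng.random((300, 3))
Y = np.column_stack([energy(X), discomfort(X)])

# ANN surrogate characterizing building behaviour for both objectives at once.
ann = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=3000, random_state=0).fit(X, Y)

# In place of NSGA-II, filter a large cheap candidate set down to its Pareto front using the ANN.
cand = rng.random((5000, 3))
pred = ann.predict(cand)
front = [i for i in range(len(cand))
         if not np.any(np.all(pred <= pred[i], axis=1) & np.any(pred < pred[i], axis=1))]
print(len(front), "non-dominated designs, e.g.", cand[front[0]], pred[front[0]])
```

Because every candidate is evaluated on the surrogate rather than by simulation, the trade-off set is obtained in seconds, which is the time saving the GAINN approach exploits.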
Article
In this paper we discuss the benefits derived by combining parametric modeling and genetic algorithms to achieve a performance-oriented process in design, with specific focus on architectural design. The key role played by geometry in architecture is discussed in relation to performance-oriented design, in which evaluations based on engineering criteria are integrated into the conceptual phase of the design. The performance attained by a specific geometric solution is considered along with its complexity in an interdisciplinary process. A specific case study using large roofs is presented as an example. Enabling the designer to automatically generate a large range of alternative design solutions is a great advantage offered by parametric modeling in supporting geometric design explorations. However, this in turn presents the difficulty of how to evaluate the resulting myriad of generated alternatives. ParaGen is presented as a tool to support the exploration of the parametric design alternatives. ParaGen combines parametric modeling, performance simulation software and genetic algorithms, together with a database to store and retrieve the solutions for subsequent exploration. The design exploration is enhanced by means of the interaction of the designer with the process. This serves two objectives. Firstly, it addresses the genetic-algorithm-based creation of design solutions, while still focusing on a given fitness function. Secondly, it facilitates knowledge extraction from the generated solutions. A description of the tool and its possible uses by designers is provided. Applications of this tool are illustrated for both education and research, with specific reference to two examples in the field of modular long-span roofs. The first case study has been developed as part of a teaching exercise in which ParaGen is used to explore the morphology of a dome based on structural performance. The second case study is derived from a research project which deals with solar energy transmission, and concerns the solar heat gain and daylight transmittance of a long-span roof.
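One distinctive ingredient here, keeping every generated and analysed variant in a database for later exploration, can be sketched as below with SQLite standing in for ParaGen's solution store. The schema, the stand-in evaluation functions, and the selection rule are illustrative assumptions.

```python
import random
import sqlite3

# Archive every evaluated design variant so it can be filtered and revisited later.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE solutions (gen INTEGER, x1 REAL, x2 REAL, mass REAL, daylight REAL)")

def evaluate(x1, x2):                      # stand-in for structural and daylight simulations
    return 10 + 5 * x1 + 2 * x2 ** 2, 0.3 + 0.5 * x2 - 0.1 * x1

population = [(random.random(), random.random()) for _ in range(20)]
for gen in range(5):
    rows = [(gen, x1, x2, *evaluate(x1, x2)) for x1, x2 in population]
    con.executemany("INSERT INTO solutions VALUES (?, ?, ?, ?, ?)", rows)
    ranked = sorted(rows, key=lambda r: r[3])          # select parents on structural mass
    parents = ranked[: len(ranked) // 2]
    population = [(min(1, max(0, p[1] + random.gauss(0, 0.1))),
                   min(1, max(0, p[2] + random.gauss(0, 0.1))))
                  for p in random.choices(parents, k=20)]

# Later, the designer can query the archive, e.g. for light designs with good daylight.
for row in con.execute("SELECT gen, mass, daylight FROM solutions WHERE mass < 13 ORDER BY daylight DESC LIMIT 3"):
    print(row)
```

Keeping the full history rather than only the final generation is what allows the kind of post-hoc, visual exploration the abstract describes.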
Design by Shopping: A New Paradigm?
  • Richard Balling
Balling, Richard. 1999. "Design by Shopping: A New Paradigm?" In Proceedings of the Third World Congress of Structural and Multidisciplinary Optimization, 295-297. Buffalo, NY: WCSMO.
Untangling Parametric Schemata: Enhancing Collaboration through Modular Programming
  • Daniel Davis
  • Jane Burry
  • Mark Burry
Davis, Daniel, Jane Burry, and Mark Burry. 2011. "Untangling Parametric Schemata: Enhancing Collaboration through Modular Programming." In Proceedings of the 14th International Conference on Computer Aided Architectural Design, 55-68. Liege: CAAD.
Don't Use the Force, Luke-Use the Targeting Computer
  • James Somers
Somers, James. 2017. "Don't Use the Force, Luke-Use the Targeting Computer." The Atlantic, April 12.
Brown is a PhD candidate in Building Technology at the Massachusetts Institute of Technology, where he earned an SMBT degree in 2016. His research seeks to understand how structural considerations interact with other architectural performance criteria in conceptual design
  • C Nathan
Nathan C. Brown is a PhD candidate in Building Technology at the Massachusetts Institute of Technology, where he earned an SMBT degree in 2016. His research seeks to understand how structural considerations interact with other architectural performance criteria in conceptual design. Nathan earned a BSE in Civil and Environmental Engineering at