SIGraDi 2023. Accelerated Landscapes | Centro Universitario Regional Este (CURE)
Facultad de Arquitectura, Diseño y Urbanismo | Universidad de la República.
HoloBrick: a contextual design and analysis
workflow for parametric masonry façade utilizing
augmented reality
Yang Song1, Asterios Agkathidis2, Richard Koeck1
1 University of Liverpool, Liverpool, UK
yang.song@liverpool.ac.uk; r.koeck@liverpool.ac.uk
2 University of Liverpool, Liverpool, UK
a3lab@liverpool.ac.uk
Abstract. This paper presents experimental research on developing a contextual design and analysis workflow that utilizes augmented reality (AR) technology to enrich the current design process with an immersive experience. A limitation of the current design process is that, lacking an on-site spatial preview of the architectural draft, designers sometimes find it difficult to fully evaluate their design proposals, which often leads to unreasonable design outcomes and incomplete related analysis. In response to these problems, the HoloBrick research aims to create a unique augmented parametric algorithm for masonry façade design in order to develop and validate the proposed contextual AR-assisted design and analysis workflow, which consists of two phases: a) AR contextual design, and b) AR contextual analysis and modification. The research findings highlight the advantages of using AR in the design and analysis process, such as providing an immersive preview and interactive input methods, analyzing design outcomes within their context, and supporting draft modifications with better understanding, none of which are offered in the current design process.
Keywords: Augmented Reality (AR), Contextual design, Contextual analysis, Masonry
façade, Immersive workflow
1 Introduction
Architectural design, the quintessential 3D-4D design field, has throughout its history been limited by 2D or cumbersome 3D representation, such as sketching on a flat surface, modelling in design software, or building physical scale models (Barczik, 2018). Even though computer-aided architectural design and modelling software is widely used to produce digital 3D models, their preview is still limited to a 2D screen, which lacks an intuitive means of on-site visualization and modification (Song et al., 2022). Additionally, conventional screen-based visualization methods for design and analysis restrict how well the user can understand the space on a computer, as the design, analysis, and modification are done away from the building site; hence, there may be disparities between the design and the final fabrication (Nguyen and Haeusler, 2014).
The last decade has witnessed an explosion of new technologies, and their impact has dramatically changed architectural design and analysis methods (Huang et al., 2018). For instance, AR technology, which overlays holograms on the physical world and connects humans and digital data through real-time interaction, is at the forefront of immersive methods for enhancing collaboration between designers, digital outcomes, and physical space (Sampaio and Henriques, 2008). AR technology has recently been applied in architectural construction to augment 2D construction documents into three-dimensional geometries, exploiting its unique visual characteristic of combining real and virtual objects on the aligned physical site (Chu et al., 2020). Beyond its application on construction sites, AR technology, with its visual and interactive capabilities, could potentially augment the conventional architectural design and analysis process (Song et al., 2021). Although architectural modelling methods have changed fundamentally over their history, the AR tools currently integrated with the corresponding design methods are mainly limited to enhanced visualization of design drafts. Owing to the still-immature state of immersive technology, as Coppens et al. conclude, very few projects attempt to solve the challenge of modelling and analyzing within an immersive environment, which requires new design and analysis input and preview methods to replace the traditional pen or mouse-keyboard-and-screen combination (Coppens et al., 2018).
Based on these gaps in existing design methods and the possibilities offered by advanced immersive technology, architects have begun to explore immersive design and analysis to overcome 2D-based design restrictions. A design and analysis method situated within its context can help designers preview, test, and modify design drafts at real scale on-site and augment the current design experience. Such an immersive design and analysis method breaks through the conventional design process, stimulates designers' creativity through contextual holographic preview and interaction, and allows the design draft to be modified and optimized for better results according to on-site building analysis of the design model within its contextual environment.
In response to these limitations in the current architectural design and analysis process, this paper aims to develop a contextual design workflow that utilizes AR technology to enrich the contemporary design and analysis experience with immersive technology. For the HoloBrick experiment, we combine AR technology with a parametric design algorithm and building analysis scripts to augment conventional masonry façade design methods and to develop and validate the proposed contextual AR design and analysis workflow.
2 Methodology
The HoloBrick research project proposes an augmented contextual design and analysis workflow that utilizes AR technology with visual scripting techniques, and tests its effectiveness through two experiments involving several masonry façade designs within context. The workflow consists of the following phases: a) AR contextual design, in which users design on-site among the surrounding context through AR, and the interactive design inputs communicate with the parametric algorithms via either screen-based inputs from mobile AR devices (e.g., iOS or Android smartphones and tablets) or gesture-based inputs from head-mounted devices (HMD) (e.g., Microsoft HoloLens 1); and b) AR contextual analysis, in which users preview the design drafts as AR holograms on-site together with the corresponding daylighting and energy analysis information in order to modify and optimize the masonry façade drafts. The task of this research is to use immersive AR technology to realize on-site holographic design, preview, analysis, and modification of parametric masonry façades, helping designers fully understand and revise design drafts in an immersive and comprehensive way before physical construction, and making the design more optimized and reasonable. Together, the two experimental phases verify the feasibility and limitations of the proposed workflow from multiple dimensions.
The employed contextual design and analysis workflow (Fig. 1) is driven by an instant connection between parametric design software (Rhinoceros 7 / Grasshopper), AR immersion (Fologram / Fologram App), and energy analysis plugins (Heliotrope / LadyBug). Fologram is a third-party API developed by architects for architects that can extract human gestures, screen taps, device location, and marker information. The Fologram App provides the on-site preview of holographic design drafts and related information, as well as a user interface (UI) that acts as a bridge for interacting with and modifying the associated parameters of the masonry façade design scripts in Grasshopper through AR. The Heliotrope and LadyBug plugins mainly provide visualized analysis, including daylight and radiation analysis. Fologram, Heliotrope, and LadyBug all operate within Grasshopper, Rhinoceros' integrated graphical algorithm editor and a ubiquitous tool in architectural design, and can therefore easily be integrated into the proposed immersive, contextual design and analysis workflow.
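To illustrate how these pieces fit together, the sketch below outlines one update cycle of such a pipeline in plain Python: interactive values from the AR UI drive the parametric model, an analysis pass runs on the regenerated geometry, and the result is streamed back to the AR preview. It is a conceptual sketch only; the function bodies are simplified stubs and do not reproduce the Fologram, Heliotrope, or LadyBug APIs.

from dataclasses import dataclass

@dataclass
class FacadeParameters:
    brick_length: float = 0.23       # metres
    bricks_per_layer: int = 20
    layers: int = 30
    gap: float = 0.01                # metres
    rotation_per_layer: float = 5.0  # degrees

def read_ar_inputs(ui_state):
    # Map slider/tap values arriving from the AR UI onto model parameters.
    return FacadeParameters(**ui_state)

def rebuild_facade(p):
    # Stand-in for the Grasshopper definition that places every brick.
    return [(i, j) for j in range(p.layers) for i in range(p.bricks_per_layer)]

def run_analysis(bricks):
    # Stand-in for the Heliotrope / LadyBug daylighting or radiation pass.
    return [0.0 for _ in bricks]     # one analysis value per brick

def push_to_ar_preview(bricks, values):
    # Stand-in for streaming the holographic geometry and analysis colours
    # back to the mobile or HMD device over the shared Wi-Fi network.
    print("streaming", len(bricks), "bricks with analysis values to the AR device")

# One update cycle, triggered whenever the user changes a value in the AR UI.
ui_state = {"brick_length": 0.23, "bricks_per_layer": 24, "layers": 32,
            "gap": 0.012, "rotation_per_layer": 8.0}
parameters = read_ar_inputs(ui_state)
bricks = rebuild_facade(parameters)
analysis = run_analysis(bricks)
push_to_ar_preview(bricks, analysis)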
The AR immersion devices for the above experiments are a mobile terminal (iPhone 14 Pro Max) and an HMD terminal (Microsoft HoloLens 1), which are used for interactive inputs and for previewing holographic design drafts and analysis information in the contextual design and analysis workflow. These devices are connected to a Wi-Fi router on the same local network, so that data can be transferred between the different stages and the design software and plugins can be live-streamed to the devices to visualize outputs and return responses.
Figure 1. Flowchart of the HoloBrick research, including Phase A (AR contextual design) and Phase B (AR contextual analysis) (colored in blue), the related plugins for each step (in red), and the outcomes of each phase (in green). Source: Yang Song, 2023.
3 Experiments and Findings
3.1 AR Contextual Design
Phase A proposes an AR contextual design method. Compared with the conventional design method, this augmented method enables users to achieve real-time, on-site contextual design and draft visualization modification through UI-based interactions on AR devices. The AR contextual design employs gesture recognition, screen-based interaction, path tracking, and marker tracking to transform intuitive human movements, hand gestures, and screen interactions into interactive design inputs for the design algorithms through AR. The design algorithms are pre-set in the masonry façade design library, representing different parametric structures with related AR interactive design inputs. Users can also customize the design algorithms and AR interventions according to their needs and expose their interactive inputs to the open AR UI platform through Grasshopper.
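As an illustration of how one entry in such a library could declare its adjustable inputs so that the AR UI can expose them as sliders, the following Python sketch uses hypothetical parameter names and placeholder ranges; it is not the schema actually used by Fologram or by the published algorithms.

FACADE_LIBRARY = {
    "rotating_bond": {
        "brick_length_m":     {"min": 0.19,  "max": 0.30, "default": 0.23, "step": 0.01},
        "bricks_per_layer":   {"min": 10,    "max": 60,   "default": 20,   "step": 1},
        "layers":             {"min": 5,     "max": 80,   "default": 30,   "step": 1},
        "gap_m":              {"min": 0.005, "max": 0.03, "default": 0.01, "step": 0.005},
        "rotation_per_layer": {"min": 0.0,   "max": 45.0, "default": 5.0,  "step": 1.0},
    },
}

def ui_controls(algorithm):
    # Flatten one library entry into (name, min, max, default) tuples that an
    # AR UI could turn into sliders or steppers.
    entry = FACADE_LIBRARY[algorithm]
    return [(name, spec["min"], spec["max"], spec["default"]) for name, spec in entry.items()]

for control in ui_controls("rotating_bond"):
    print(control)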
For example, we used several brick façade design algorithms to validate the AR contextual design method. First, the façade is referenced to a QR code generated by Fologram, which can be measured and placed on-site as the reference point of the façade, aligning the contextual surroundings with the digital design environment. After the marker is scanned through the AR device (an iPhone 14 Pro Max or Microsoft HoloLens 1 in this experiment), the design coordinate system is transferred from the physical site into Grasshopper. Second, the interactive inputs and adjustable values are extracted from Grasshopper and displayed in the AR UI according to the chosen façade design algorithm. These values include brick size, the number of bricks per layer, brick gaps, proportions, brick rotation per layer, brick patterns, the height and width of the masonry façade, and other interactive parametric inputs. Designers can add or modify parameters for their customized algorithms and interventions. Third, based on the selected façade design algorithm, users can design their AR masonry façade by adjusting the related parameters through the AR UI, previewing and modifying the structural outcomes on-site as holograms overlaid on the contextual surroundings (Fig. 2). Last, the outcomes are recorded in Grasshopper. Multiple designers can scan the same QR code to access the façade design process and outcomes, and remote users can scan the surrounding environment and save it as digital meshes referenced to the same QR code, so that they can share and modify the design in the same contextual environment, which provides a contextual remote design strategy.
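A minimal geometry sketch of the kind of parametric rule these AR UI values could drive is given below, assuming a simple running-bond wall whose bricks rotate progressively per layer; the dimensions and the rotation rule are illustrative, not the published algorithm.

import math

brick_length, brick_height = 0.23, 0.075  # metres
gap = 0.01
bricks_per_layer, layers = 20, 30
rotation_step = 5.0                       # degrees added per layer

placements = []                           # (x, y, z, rotation_deg) per brick
for layer in range(layers):
    z = layer * (brick_height + gap)
    offset = (brick_length + gap) / 2 if layer % 2 else 0.0  # running bond
    angle = rotation_step * layer                            # twist per layer
    for i in range(bricks_per_layer):
        x = offset + i * (brick_length + gap)
        # a small depth displacement derived from the layer rotation gives
        # the facade its relief pattern
        y = 0.5 * brick_length * math.sin(math.radians(angle))
        placements.append((x, y, z, angle))

print(len(placements), "bricks placed; first brick:", placements[0])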
Figure 2. The user accesses the AR contextual design by scanning the QR code and activating the AR UI. Through the AR UI, the user changes the interactive inputs to design the façade on-site within its contextual environment for better spatial understanding. Source: Yang Song, 2023.
In summary, the AR contextual design fulfills the pre-determined assumptions. We successfully designed various masonry façades with different algorithms through the AR UI (Fig. 3). Through the parameter adjustment of the screen-based and gesture-based UI and the real-time, on-site preview of the holographic design draft, users can better experience the sense of space and scale, which provides an appropriate and intuitive design experience within the contextual surroundings. Ubiquitous AR devices such as smartphones, tablets, and HMDs can be used to activate the AR UI for contextual design, which makes the method easy to popularize and access. However, since this is a façade design, the corresponding daylighting and radiation analyses are also an important part of the design. Integrating such analysis results into the existing AR contextual design workflow is therefore expected to give designers more comprehensive design feedback and contextual inspiration, and thus more reasonable design results.
Figure 3. Various masonry façades designed with different algorithms through the AR UI, previewed on-site as holograms within their contextual environments. Source: Yang Song, 2023.
3.2 AR Contextual Analysis
Phase B proposes an AR contextual analysis method for the design
outcomes from Phase A. Compared with the conventional design analysis
method, this augmented method enables users to preview the daylighting and
radiation analysis in real-time on-site as AR holograms with the contextual
environment from AR devices. The AR contextual analysis employed all the AR
features in Phase A into the interactive design algorithms for better
modifications through AR. The analysis scripts have already been pre-set for
the masonry façade design drafts, including the daylighting analysis with the
Heliotrope plugin and the radiation analysis with the LadyBug plugin with
related AR interactive design modification inputs. Users can also customize the
other analysis scripts, such as thermal comfort, glare simulation, building
energy analysis, etc., and AR design modification interventions according to
their needs. The analysis outcomes have to be extracted in the open platform,
AR UI, for users to preview and interact with through Grasshopper.
For the daylighting analysis in this experiment, we use several brick façade designs from Phase A to validate the AR contextual analysis method. First, the user is required to set the relevant location, date, and time information for an accurate analysis. We use Liverpool city center as the location in this experiment; to load the related analysis database, the user inputs 53.4 as "Latitude" and -2.9 as "Longitude" through the AR UI. After that, the user can set a specific date and time, or a range, through the AR UI. Second, the façade design is referenced to the QR code from Phase A; the user scans the QR code to locate the design draft on-site within the contextual environment. The user can also indicate the north direction by screen-based or gesture-based demonstration through the AR UI, or simply set it in the analysis scripts in Grasshopper. Third, after this basic setup, the user can preview the daylighting shadows on-site as holograms through the AR device. Last, according to the on-site holographic daylighting analysis results, the designer can change the arrangement or shape of the masonry façade bricks at any time, using the AR contextual design method from Phase A, to optimize and modify the design for better results (Fig. 4).
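For readers who want a sense of the underlying calculation, the sketch below derives an approximate sun direction from the latitude, day, and hour using the standard declination and hour-angle formulas; Heliotrope and LadyBug use more precise solar models, and the equation of time and longitude correction are omitted here for brevity.

import math

def sun_direction(latitude_deg, day_of_year, solar_hour):
    # Approximate solar altitude and azimuth (degrees), simplified model.
    lat = math.radians(latitude_deg)
    # Solar declination (Cooper's approximation)
    decl = math.radians(23.45 * math.sin(math.radians(360.0 / 365.0 * (284 + day_of_year))))
    # Hour angle: 15 degrees per hour away from solar noon
    hour_angle = math.radians(15.0 * (solar_hour - 12.0))

    sin_alt = (math.sin(lat) * math.sin(decl)
               + math.cos(lat) * math.cos(decl) * math.cos(hour_angle))
    altitude = math.asin(sin_alt)

    cos_az = (math.sin(decl) - sin_alt * math.sin(lat)) / (math.cos(altitude) * math.cos(lat))
    azimuth = math.acos(max(-1.0, min(1.0, cos_az)))  # measured clockwise from north
    if hour_angle > 0:                                # afternoon: sun west of south
        azimuth = 2.0 * math.pi - azimuth
    return math.degrees(altitude), math.degrees(azimuth)

# Example: 21 June (day 172), 15:00 solar time, Liverpool (latitude 53.4 N)
alt, az = sun_direction(53.4, 172, 15.0)
print("sun altitude %.1f deg, azimuth %.1f deg" % (alt, az))
# Each holographic brick's shadow is then cast along the vector opposite to
# this sun direction onto the facade and the ground plane.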
Figure 4. Screenshots of the AR contextual daylighting analysis from the AR UI. Users can preview the daylighting analysis result and modify the design draft in real time on-site, optimizing the design according to the contextual analysis feedback. Source: Yang Song, 2023.
For the radiation analysis in this experiment, we use the same brick façade designs, with the corresponding location, date, and time settings, to validate the AR contextual analysis method. In addition, the user is required to import the related EnergyPlus weather (EPW) file into the analysis script (a Liverpool EPW file is imported for this experiment). After this basic setup, the user can preview the radiation analysis results on-site as holograms through the AR device. The results are shown as a color transition of the bricks from blue to orange, representing heat radiation from low to high. Last, according to the on-site holographic radiation analysis results, the designer can modify the arrangement or shape of the masonry façade bricks at any time, using the AR contextual design method from Phase A, to reduce the heat radiation and obtain better design results, that is, to convert as many orange holographic bricks as possible to blue (Fig. 5).
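A minimal sketch of this color mapping is shown below, assuming the per-brick radiation values from the analysis pass are normalised to a known range and interpolated linearly from blue (low) to orange (high); LadyBug ships its own legend and gradient components, so this is only illustrative.

def radiation_to_color(value, low=0.0, high=1.0):
    # Linear blue-to-orange ramp; value is one brick's radiation result.
    t = max(0.0, min(1.0, (value - low) / (high - low)))
    blue = (0, 80, 255)     # low radiation
    orange = (255, 140, 0)  # high radiation
    return tuple(round(b + t * (o - b)) for b, o in zip(blue, orange))

# Example: color a row of bricks by normalised radiation results
radiation_per_brick = [0.05, 0.30, 0.55, 0.80, 0.95]
colors = [radiation_to_color(v) for v in radiation_per_brick]
print(colors)  # shifts from blue towards orange as radiation rises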
Figure 5. Screenshots of the AR contextual radiation analysis from the AR UI. Users can preview the radiation analysis result and modify the design draft in real time on-site, optimizing the design according to the contextual analysis feedback. Source: Yang Song, 2023.
In summary, the AR contextual analysis indeed achieved on-site preview of daylighting and radiation analysis. We successfully analyzed various masonry façades from Phase A and modified the design drafts through the AR UI according to the analysis results, producing an optimized masonry façade design proposal (Fig. 6). Through the real-time on-site analysis, users can experience daylight and shadow and visualize façade radiation, which helps designers derive more optimized solutions from the contextual analysis. This immersive, analysis-driven design modification is missing from traditional architectural design processes. However, there are limitations and room for further improvement. The AR contextual analysis currently carries out only a limited set of analysis functions; in the future, more contextual analyses should be developed. Due to the extensive calculation and the volume of interactive information transmitted in this prototype, there are delays between the holographic analysis preview and the interactive modification of the design draft through the AR UI during the real-time design and analysis process. Optimizing the parametric scripts involved, or developing customized analysis scripts and plugins, would therefore improve the user experience and system fluency. Moreover, the sensors of AR devices are affected by the surrounding light conditions: if there is too much or too little UV light in the natural environment, for instance, the device sometimes has difficulty locking the holographic model to its contextual place, and the holograms drift because of interference from the surrounding environment. Sometimes it is necessary to restart the device and re-scan the QR code to correct the location. In further research, extra sensors, such as a Kinect, are needed to scan the on-site design context more precisely and reduce the instability of current AR devices.
Figure 6. The user successfully designed a masonry façade proposal using the AR contextual design and analysis methods of the HoloBrick research. This AR-assisted, analysis-driven design modification feedback provides users with an unprecedented immersive experience. Source: Yang Song, 2023.
4 Conclusion
The HoloBrick research developed and verified a contextual design and analysis workflow for parametric masonry façades utilizing AR, successfully applying customized façade design algorithms with building analysis scripts in AR and exploring a real-time, interactive, contextual design, analysis, and modification method for the early architectural design stages within an immersive on-site environment. From close engagement with the AR-assisted contextual design and analysis process and its outcomes, it can be concluded that the proposed immersive workflow fulfills our pre-determined assumptions and offers a new way to modify design drafts and preview contextual building analysis on-site through AR in real time. The employment of AR technology, with its interactive inputs, on-site spatial registration, 1:1 holographic preview, and real-time data interaction, provided the illusion of actual spatial objects and related information. It aids real-time evaluation and instant modification of design proposals, enables users to improve their cognition and understanding of space, triggers reflection on and remodelling of the architectural design process, and cultivates design creativity and outcome variety in the early stage. Additionally, traditional architectural proposal analysis is based on screens or drawings; the AR contextual analysis in this experiment is displayed to users as holograms on-site, providing an immersive and intuitive experience and helping users design better based on the analysis results. Architects are also able to preview their digital designs with related analysis outcomes beyond the sketch or computer screen, as well as interact and communicate with the related on-site physical environment. These on-site design and analysis functions break the conventional 2D-based design method, providing designers with a 3D-4D immersive perception in AR for more practical design.
The AR contextual design and analysis workflow gives architects more freedom, and its remote collaboration and multi-designer outcome-sharing functions break through the constraints of conventional 2D-based design media. In future research, besides the façade, more architectural element designs and more functional building analyses will be developed and tested. Moreover, we will promote this AR contextual design and analysis workflow, with convenient UI-based or intuitive gesture-based manipulation, to architects, gather their feedback and opinions after use, and improve the method so that the whole process becomes smooth and reasonable for early-stage architectural design.
References
Barczik, G. (2018). From Body Movement to Sculpture to Space: employing immersive technologies to design with the whole body. In eCAADe 2018, Vol. 2, pp. 781-788.
Chu, C.H., Liao, C.J., & Lin, S.C. (2020). Comparing Augmented Reality-Assisted Assembly Functions: a case study on Dougong structure. Applied Sciences, 10, 3383.
Coppens, A., Mens, T., & Gallas, M.A. (2018). Parametric Modelling within Immersive Environments: building a bridge between existing tools and virtual reality headsets. In eCAADe 2018, pp. 711-716. Lodz.
Huang, X.R., White, M., & Burry, M. (2018). Design Globally, Immerse Locally: a synthetic design approach by integrating agent-based modeling with virtual reality. In CAADRIA 2018, pp. 473-482. Hong Kong.
Nguyen, D.D., & Haeusler, M.H. (2014). Exploring Immersive Digital Environments, and developing alternative design tools for urban interaction designers. In CAADRIA 2014, pp. 87-96.
Sampaio, A.Z., & Henriques, P.G. (2008). Simulation of construction processes within virtual environments. In the Third International Conference on Computer Graphics Theory and Applications, pp. 326-33. Funchal, Madeira.
Song, Y., Agkathidis, A., & Koeck, R. (2022). Augmented Bricks: an on-site AR immersive design to fabrication framework for masonry structures. In CDRF 2022, Hybrid Intelligence, Tongji University, Shanghai, China, pp. 385-395.
Song, Y., Koeck, R., & Luo, S. (2021). Review and Analysis of Augmented Reality (AR) Literature for Digital Fabrication in Architecture. Automation in Construction, 128, 103762.