Extended Reality Collaboration: Virtual and mixed reality system for collaborative design and holographic-assisted on-site fabrication

Daniela Mitterberger1 [0000-0001-6183-6926], Evgenia-Makrina Angelaki1, Foteini Salveridou1 [0000-0001-5314-3038], Romana Rust1 [0000-0003-3722-8132], Lauren Vasey1, Fabio Gramazio1, Matthias Kohler1

1 ETH Zürich, Gramazio Kohler Research, Switzerland
Abstract. Most augmented and virtual reality applications in architecture, engineering, and construction focus on structured and predictable manual activities and routine cases of information exchange, such as quality assurance or design review systems. However, collaborative design activities and tasks such as complex negotiation, task specification, and interaction are not yet sufficiently explored. This paper presents a mixed-reality immersive collaboration system that enables bidirectional communication and data exchange between on-site and off-site users who mutually access a digital twin. Extended Reality Collaboration (ERC) allows building-site information to inform design decisions and new design iterations to be instantly visualized and evaluated on-site. Additionally, the system allows the developed design model to be fabricated with holographic instructions. In this paper, we present the concept and workflow of the developed system, as well as its deployment and evaluation through an experimental case study. The outlook discusses how such systems could be transferred to current design and building tasks and how such a system could reduce delays, avoid misunderstandings, and eventually increase building quality by closing the gap between the digital model and the built architecture.
Keywords: mixed reality, virtual reality, interactive design, collaborative virtual environments, remote collaboration, immersive virtual environment.
1 Introduction
In recent years, there have been remarkable advances in mixed-reality technologies for architecture, engineering, and construction (AEC). Most augmented and virtual reality applications in AEC focus on structured manual activities (Goepel and Crolla 2020; Jahn et al. 2019; Fazel and Izadi 2018), more routine cases of information exchange such as quality assurance (Dietze, Jung, and Grimm 2021; Büttner et al. 2017), or design review (Liu et al. 2020; Zaker and Coloma 2018). Collaborative design activities or interdependent collaborative tasks, such as complex negotiation, task specification, and interaction, are, however, not yet sufficiently explored (Marques et al. 2021; Wang and Tsai 2011; Benford et al. 2001; McGrath and Prinz 2001).
Collaborative activities in the field of AEC can involve a plethora of stakeholders with heterogeneous backgrounds and expertise. Furthermore, these stakeholders can include remote collaborators, ranging from on-site to off-site users. Especially for communication between remote users, knowledge transfer is critical for successful collaboration, particularly for task decomposition, handover processes, and design revisions. Improved decision-making processes can enhance workflow efficiency and collaboration in the creative process, as they support the inclusion of expert knowledge.
Current computer-supported cooperative work (CSCW) systems focus on enhancing collaboration in AEC by providing users with diverse shared information. This information includes, for instance, access to a shared digital context through common data structures and environments utilizing building information modeling (BIM) software such as Autodesk Revit or ArcGIS. Other systems provide access to shared administrative tasks through project management platforms, e.g., Microsoft planning software or Autodesk Navisworks.
Current CSCW systems are suitable for distinct and asynchronous tasks that do not require extensive communication and collaboration between users. Their structure allows users to complete individual tasks and inform other users about their progress. Nevertheless, due to their task-specific structure, these platforms are relatively rigid and do not provide an environment that fosters immersive communication and discussion between users. This lack of a communication environment can cause user frustration and inhibit creativity. Interwoven, negotiated task activities in particular require a more comprehensive range of communication between different stakeholders.
The research presented in this paper, "Extended Reality Collaboration" (ERC), aims to complement the functionalities of existing CSCW systems and groupware tools in AEC by providing workflows for collaboration and communication tasks that are not yet well supported. This paper proposes a mixed-reality immersive collaboration system that enables bidirectional communication and data flow between on-site and off-site users, enabling them to operate together on a digital twin in a collaborative virtual environment (CVE). The workflow and functionalities of ERC have been applied and validated in an architectural-scale prototype, a sticky-note installation.
2 Background
Our work builds upon two general fields of research: collaborative virtual environments and augmented fabrication.
2.1 Collaborative virtual environments (CVE):
Churchill et al. (Churchill, Snowdon, and Munro 2001) define CVEs as distributed virtual systems that enable users to collaborate within a digital environment and with each other. Asymmetric CVEs (Grandi, Debarba, and Maciel 2019; Piumsomboon et al. 2017) support users with different input and visualization hardware, adapting to their various capabilities. DollhouseVR (Ibayashi et al. 2015) facilitates asymmetric collaboration between co-located users, one virtually inside the dollhouse using a head-mounted display (HMD) and the other using an interactive tabletop. Another asymmetric CVE for co-located users is ShareVR (Gugenheimer et al. 2017), which uses floor projection and mobile displays with positional tracking to visualize a shared virtual world for non-HMD users. A system developed for geographically separated users is presented by Oda et al. (Oda et al. 2015), in which a remote expert assists a local user. The results showed that a local user understood task instructions faster when the remote user wore a VR HMD and demonstrated the task in virtual space, compared to written annotations. Commercial software such as The Wild and IrisVR provides CVEs for multiuser object manipulation but does not link them with fabrication parameters and instructions and, therefore, misses out on streamlining the design and fabrication phases.
2.2 Augmented fabrication:
Augmented fabrication in AEC focuses primarily on guiding a craftsperson in a manual fabrication process (Nee et al. 2012). This guidance can be provided through audio instructions, projection mapping, or screen-based mixed reality (MR). Fologram uses MR headsets to display virtual holographic 3D models in space and assist unskilled construction workers in complex fabrication tasks (Jahn et al. 2019). An example of a screen-based augmented-reality (AR) system is Augmented Bricklaying (Mitterberger et al. 2020). This system extends purely holographic AR with a context-aware AR system, providing humans with machine precision by tracking objects in space. IRoP (Mitterberger et al. 2022) is a system that allows users to instruct robots via programming by demonstration and to preview generated designs on-site via projection-based AR. While the growing body of AR fabrication research shows the enormous potential of augmented fabrication, all the discussed systems are designed solely for in-situ use. None of the above examples link a local user with a remote user.
3 Methods
Our ERC system aims to combine design and fabrication functionalities in a collaborative virtual environment and enhance communication between two geographically separated stakeholders. Consequently, the system not only converges on-site and off-site activities but also integrates the processes of design development and physical fabrication into one virtual shared environment.
3.1 User scenario
ERC involves at least two stakeholders with different expertise in different locations; one user is on-site, and the other is off-site (see Fig. 1). The on-site user, the "MR-User," is equipped with an MR headset, whereas the off-site user, the "VR-User," utilizes a virtual-reality (VR) headset. The MR-User represents an expert construction worker, craftsperson, or construction site manager. The role of the MR-User is to provide site-specific data and insider knowledge, and to carry out the manual fabrication. The VR-User represents a stakeholder such as an architect or planner who navigates a digital twin of the construction site. The role of the VR-User is to request and receive on-site information and feedback on the design and to adjust the design accordingly. Furthermore, the VR-User provides different design options and supervises fabrication. Both users meet in virtual space and collaborate synchronously.
Fig. 1. User scenario showing the on-site and off-site settings with two distinct stakeholders
3.2 System Walkthrough
ERC is designed around two distinct phases: (1) a collaborative design phase and (2) an augmented fabrication phase. In phase one, the VR-User (architect) and the MR-User (expert) evaluate the design options collaboratively. The MR-User localizes and creates the digital twin (see Fig. 2a), and then both users can meet in virtual space (see Fig. 2b). The MR-User sees the design options as holograms on-site, while the VR-User sees them in the digital twin of the construction site. The collaborative design phase has two distinct features: 3D sketching and annotating (see Fig. 2c), and collaborative design on-the-fly (see Fig. 2d). Phase two allows the users to plan and fabricate the design and has two features: holographic fabrication (see Fig. 2e) and fabrication supervision (see Fig. 2f).
To illustrate a typical interaction, we consider the user scenario described in section 3.1. The users follow a linear sequence of interactive design and fabrication sessions.
Fig. 2. System walkthrough.
A - Creation of a digital twin:
The creation of a shared digital twin model, featuring both the construction site and the as-built model, can be done in two ways, resulting in meshes with different resolutions. The first option is asynchronous: creating a high-resolution point cloud using a Lidar scanner. The second option is synchronous: using the spatial awareness system of the MR headset. This option can be accessed in ERC via the 'scanning' feature, allowing the users to receive a current as-built mesh of the construction site with customized levels of detail. This feature consists of several interactive modes to further access and edit the generated spatial data. The MR-User can select and send meshes to the VR-User. Based on these meshes, the VR-User can adjust and update the design options.
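To illustrate the customized levels of detail, the following is a minimal sketch that decimates a scanned mesh to a coarse target density before transmission. It assumes the Open3D library; the file names and the helper function are illustrative and not part of the ERC implementation. The target density of 20 triangles per cubic meter corresponds to the resolution reported in section 5.1.

```python
# Hypothetical sketch: decimate a spatial-awareness mesh to a target
# density before sending it to the VR-User (assumes Open3D).
import open3d as o3d

def decimate_to_density(mesh, triangles_per_m3=20.0):
    """Reduce mesh resolution so the transmitted as-built model stays light."""
    volume = mesh.get_axis_aligned_bounding_box().volume()
    target = max(1, int(volume * triangles_per_m3))
    return mesh.simplify_quadric_decimation(target_number_of_triangles=target)

mesh = o3d.io.read_triangle_mesh("site_scan.ply")   # Lidar or HoloLens scan
lowres = decimate_to_density(mesh)
o3d.io.write_triangle_mesh("site_scan_lowres.ply", lowres)
```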
B - Localization and meeting in virtual space:
Both users need to be localized in physical and virtual space in order to correctly send correlated spatial, geometric, and temporal data. Therefore, the local coordinate systems of the MR and VR spaces need to be aligned using a relative transformation. The transformation requires the current position of each user relative to an origin frame. To establish this origin frame, the MR-User scans a reference QR code in physical space and then transmits the frame data to the VR-User. To share a mutual sense of presence, both users appear as avatars. The avatar positions are updated in real-time, allowing the users to communicate via hand movements and body motion trajectories.
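The alignment itself can be expressed compactly. The following is a minimal sketch assuming poses are exchanged as 4x4 homogeneous transformation matrices; the variable names are illustrative, not the ERC data structures.

```python
# Minimal sketch of the relative transformation used for localization.
import numpy as np

def pose_relative_to_origin(T_world_qr, T_world_user):
    """Express a user's pose in the shared QR-code origin frame."""
    return np.linalg.inv(T_world_qr) @ T_world_user

# Each device streams its pose relative to the shared origin; the other
# application multiplies it with its own copy of the origin frame to
# place the avatar correctly in its local coordinate system.
T_world_qr = np.eye(4)     # pose of the scanned QR code (placeholder)
T_world_user = np.eye(4)   # current headset pose (placeholder)
T_origin_user = pose_relative_to_origin(T_world_qr, T_world_user)
```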
C - 3D sketching and annotating:
After localization and setting up a digital twin of the construction site, both stakeholders use a sketching and annotating feature to draw in 3D, highlight specific target areas, or annotate existing designs (see Fig. 3). In this phase, both users can discuss potential design problems in relation to the construction site's current as-built state.
D - Collaborative design on-the-fly:
This feature allows users to preview and adjust a parametric design model on-the-fly in MR and VR and directly preview it as a hologram in-situ (see Fig. 4). The VR-User loads the parametric model using Rhino.Inside1 and adjusts the parameters of the digital model according to the feedback of the MR-User. The VR-User has access to the properties of the parametric model and can adjust these parameters in near real-time. Both users can sketch directly on the design options using the "3D sketch and annotate" feature.
Fig. 3. 3D sketch and annotation feature. On the left is a first-person view of the MR-User
watching the VR-User sketch. On the right side is a third-person view of the MR and VR-User
drawing collaboratively within the VR space.
Fig. 4. Collaborative design on-the-fly feature. On the left is a first-person view of a hologram
of the design on the installation site. On the right side is a third-person camera view of the MR
and VR-User discussing the design in VR.
E - Holographic fabrication:
After deciding on a final design, the users switch from the interactive design phase to the fabrication mode (see Fig. 5). The fabrication mode can also include multiple additional users, as the system can be deployed on various augmented reality devices. The MR-User receives fabrication-specific information such as the holographic 3D model, the estimated fabrication time, and the number of elements deployed. Furthermore, the MR-User can switch between fabrication sessions. These sessions are visualized in different colors representing the estimated daily working hours (see Fig. 6-2).
1 Rhino.Inside® is an open-source project which allows Rhino and Grasshopper to run inside other 64-bit Windows applications.
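To illustrate how such fabrication sessions could be derived, the sketch below chunks an ordered build sequence into sessions of a target working time. The numbers are example values taken from the case study in section 4 (4000 sticky notes in 26 hours, i.e., roughly 23 seconds per note, and sessions of up to 90 minutes); the function itself is an assumption of this sketch, not the ERC implementation.

```python
# Hedged sketch: group an ordered list of elements into fabrication
# sessions; each session is later rendered in its own color.
def split_into_sessions(elements, seconds_per_element=23.4,
                        session_length_s=90 * 60):
    """Return lists of elements, one list per fabrication session."""
    sessions, current, elapsed = [], [], 0.0
    for element in elements:
        if current and elapsed + seconds_per_element > session_length_s:
            sessions.append(current)       # session full: start a new one
            current, elapsed = [], 0.0
        current.append(element)
        elapsed += seconds_per_element
    if current:
        sessions.append(current)
    return sessions

sessions = split_into_sessions(range(4000))   # ~18 sessions of 90 minutes
```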
F - Fabrication supervision:
This mode allows the MR-User to enter information about completed tasks, current
fabrication sessions, and problematic areas. Furthermore, the VR-User can virtually
join the fabrication session to supervise the process (see Fig. 6).
Fig. 5. A menu informing the MR-User about fabrication parameters and the holographic 3D
model supporting fabrication.
Fig. 6. Fabrication supervision feature. On the left is a first-person view of the MR-User looking at the VR-User. On the right side is a third-person camera view of the MR and VR-User discussing the fabrication in VR. The differently colored elements show the different fabrication sessions.
3.3 System Architecture
As displayed in Figure 7, the system architecture consists of three main parts: (1) an on-site MR setup with a scanning system, (2) an online server, and (3) an off-site VR setup. The on-site MR setup consists of a laser scanning device (Leica RTC 360) providing high-resolution on-site scans, an MR headset (Microsoft HoloLens 2), a laptop, and a WIFI router. The off-site hardware consists of a VR headset (Oculus Quest 2), a laptop, and a WIFI router.
The software setup is structured as follows. Two autonomous Unity3D applications were developed, one for MR and one for VR. The MR application uses the Mixed Reality Toolkit (MRTK) and the OpenXR library to enable spatial awareness scanning and QR-code detection. The VR application is developed using the OpenXR library. Furthermore, Rhinoceros3D, Grasshopper, and Python are used to create algorithmic designs. Rhino.Inside enables compatibility and bidirectional communication between external Unity processes and Grasshopper. The online communication is based on the Robot Operating System (ROS) (Quigley et al. 2009). The rosbridge package is used to access the publish-and-subscribe architecture of ROS, and ROS# connects the Unity3D applications to it.
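As a sketch of this communication layer, the snippet below connects to a rosbridge server from Python using the roslibpy library and exchanges a sketch stroke over one topic. The host address, topic name, and payload layout are assumptions for this example and not the exact ERC configuration.

```python
# Hedged sketch of publish-and-subscribe through rosbridge with roslibpy.
import json
import roslibpy

ros = roslibpy.Ros(host='192.168.0.10', port=9090)  # rosbridge server
ros.run()

sketch_topic = roslibpy.Topic(ros, '/erc/sketch', 'std_msgs/String')

def on_sketch(message):
    # Deserialize a polyline drawn by the other user; the rendering side
    # of the application would turn it into a 3D stroke.
    stroke = json.loads(message['data'])
    print('Received stroke with', len(stroke['points']), 'points')

sketch_topic.subscribe(on_sketch)

# Publish a stroke: a list of 3D points in the shared origin frame.
stroke = {'points': [[0.0, 0.0, 0.0], [0.5, 0.1, 1.2]]}
sketch_topic.publish(roslibpy.Message({'data': json.dumps(stroke)}))
```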
Fig. 7. System architecture
4 Case Study
To validate the feasibility of the proposed method and demonstrate the potential for a concrete fabrication system such as façade panels, we focused on one full-scale experimental implementation. For a user-friendly experience, the user interface (UI) design was based on each user's different roles and work packages (see Fig. 8). We used sticky notes as placeholders to showcase the various and complex types of information that can be exchanged between two geographically separated users. This information includes the position (P), rotation (a), size (f1), geometry (folding type) (f2), and color of each unit (see Fig. 9).
The total fabrication time was 26 hours, whereas the interactive design took around one hour. The final design was fabricated using two MR headsets, and a total of 4000 sticky notes were placed. The final design was split into distinct fabrication sessions of 60-90 minutes. We used an attractor-based approach for the computational design, which influenced the design depending on its location in space and its distance from physical boundaries (see Fig. 10). Specifically, the attractor's location (CP) changed the position, rotation, color, size, and folding type of the sticky notes. In our case study, each sticky note's location was projected onto the spatial mesh data (M) scanned by the MR-User. This projection resulted in a precise position for each sticky note on the as-built data of the installation site. During the design phase, the VR-User moved the attractor as an interactive 3D prism in virtual space to control the number of projections. The VR-User could adjust the design parameters collaboratively with the MR-User, while the MR-User saw the different results as holograms in-situ. Furthermore, the MR-User could interact with the design via sketching to adjust the outline of the design. After agreeing on a final design, the MR-User fabricated the full-scale experimental implementation (see Fig. 11) while the VR-User supervised and informed the process.
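One possible reading of this attractor logic in code: the normalized distance of each sticky note to the attractor drives the unit's parameters. The influence radius and the remapping ranges below are illustrative assumptions; the actual remapping of the case study is summarized in Fig. 10.

```python
# Illustrative attractor-based remapping of sticky-note parameters.
import numpy as np

def remap(t, lo, hi):
    """Linearly map t in [0, 1] to the range [lo, hi]."""
    return lo + t * (hi - lo)

def note_parameters(note_pos, attractor_pos, influence_radius=3.0):
    d = np.linalg.norm(np.asarray(note_pos) - np.asarray(attractor_pos))
    t = float(np.clip(d / influence_radius, 0.0, 1.0))  # 0 near, 1 far
    return {
        'rotation_deg': remap(t, 0.0, 90.0),   # rotation (a)
        'size': remap(t, 0.5, 1.0),            # size (f1)
        'folding_type': 0 if t < 0.5 else 1,   # geometry (f2)
        'color_index': min(int(t * 4), 3),     # index into a 4-color palette
    }

params = note_parameters([1.0, 0.5, 2.0], [0.0, 0.0, 2.0])
```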
Fig. 8. Left: The MR-User interacts via hand tracking (A), gesture tracking (B), menu buttons (C), and voice commands (D). The MR-User sees an info window superimposed over their view (E). Right: The VR-User navigates the space and interacts via controllers (A) using controller buttons (B) and virtual menu buttons (C). The VR-User sees an info screen (D) and moves within a digital twin of the construction site (E)
Fig. 9. Various and complex parameters that can be exchanged with the system
Fig. 10. Computational attractor-based design logic and remapping values to determine sticky note position, rotation, folding type, size, and color
5 Results
Our ERC system allowed intuitive, real-time design interaction for users in different physical locations. The users had access to a full-scale impression of the architectural model, augmented and contextualized by site-specific information. Both users collaboratively designed and fabricated a complex, full-scale architectural installation (see Fig. 12). Furthermore, personalized communication was achieved by creating avatars for all users. Implementing the ERC system and the case study provided us with insights into the hardware and software limitations.
Fig. 11. Photograph of the fabrication of the final physical installation.
Fig. 12. Photograph of the final physical installation.
5.1 System limitations
We experienced hardware limitations regarding environmental scanning and localization, as well as jitter of the digital model (see Table 1). The main software limitations were delays and transmission speed, especially between the Grasshopper environment and the Unity interface, which worsened with increasing mesh count and depended on internet connection speed. To avoid delays between the MR and VR-User, we used a mesh resolution of 20 triangles per cubic meter. Furthermore, the system still has a limited set of drawing tools in the “3D sketch and annotation” feature. Extending the drawing tools would give users a broader range of communication options. In noisy environments, it was difficult to get the other user's attention. Therefore, it would be essential to implement an "attention feature". Additionally, the current system lacks a “documentation feature” that would allow users to upload videos, pictures, or voice memos to the digital model with an associated location. Such a note collection could help on-site workers keep track of construction site notes and allow easier communication with off-site users. These notes could also be read asynchronously, allowing users to log into the system at different moments.
Table 1. Relation between QR-code placement and visibility and the drift of the digital model. The QR-code dimensions were 12.5 cm x 12.5 cm.

QR-code distance         Drift of the digital model
< 0.45 m and in view     0.1 - 0.3 cm
~ 4 m and in view        1.5 - 2.3 cm
not in view              2 - 3 cm
6 Conclusion and Outlook
This research investigates the potential of collaborative design activities and how they could lead to better knowledge and information flows between on-site and off-site stakeholders during design and fabrication processes. The functionalities of the system were evaluated via a full-scale case study, aiming to define collaboration protocols and improve interaction and communication. Even though there are still limitations, this research shows the potential of such a system to improve supervision and collaboration between on-site and off-site stakeholders, such as architects and construction supervisors, and to support a paperless construction site. The key findings of this research are novel collaborative MR and VR interfaces, 3D workspace scenes with sufficient context-awareness, and a fabrication protocol that includes remote monitoring and planning. As an outlook, such a system could be applied to detecting deviations between the as-built and the digital model in order to decrease project costs and building time. Such a system could be applied to real building scenarios, i.e., on-site construction meetings, custom interior designs, renovations, and complex building elements. ERC allows dispersed personnel to have more direct contact, thereby reducing problems of isolation and miscommunication. Furthermore, such a system could accelerate workflows and support a teleoperated construction site.
7 Acknowledgments
We want to thank Gonzalo Casas (ETH Zurich) for supporting the research on online
communication. Furthermore, we would like to express our thanks to the Design++
initiative for giving us access to the Immersive Design Lab (IDL), equipment, and
support throughout the project.
8 Author’s contribution
DM wrote the manuscript, conceived the work, and supervised the thesis. EA and FS developed the system as part of their MAS master thesis. EA contributed research for the manuscript. RR wrote the manuscript, conceived the work, and supervised the thesis. LV was part of the supervision of the thesis. FG and MK contributed to the conception of the work. All authors reviewed the manuscript.
9 References
1. Benford, Steve, Chris Greenhalgh, Tom Rodden, and James Pycock. 2001. 'Collaborative
Virtual Environments'. Communications of the ACM 44 (7): 79–85.
https://doi.org/10/dhcdcq.
2. Büttner, Sebastian, Henrik Mucha, Markus Funk, Thomas Kosch, Mario Aehnelt, Sebastian Robert, and Carsten Röcker. 2017. 'The Design Space of Augmented and Virtual Reality Applications for Assistive Environments in Manufacturing: A Visual Approach'. In Proceedings of the 10th International Conference on PErvasive Technologies Related to Assistive Environments, 433–40. Island of Rhodes, Greece: ACM. https://doi.org/10/gprbdq.
3. Churchill, Elizabeth F., David N. Snowdon, and Alan J. Munro, eds. 2001. Collaborative Virtual Environments: Digital Places and Spaces for Interaction. Computer Supported Cooperative Work. London; New York: Springer.
4. Dietze, Andreas, Yvonne Jung, and Paul Grimm. 2021. 'Supporting Web-Based Collaboration for Construction Site Monitoring'. In The 26th International Conference on 3D Web Technology, 1–8. Pisa, Italy: ACM. https://doi.org/10/gprbfh.
5. Fazel, Alireza, and Abbasali Izadi. 2018. 'An Interactive Augmented Reality Tool for Constructing Free-Form Modular Surfaces'. Automation in Construction 85 (January): 135–45. https://doi.org/10/gczk7c.
6. Goepel, Garvin, and Kristof Crolla. 2020. 'Augmented Reality-Based Collaboration - ARgan, a Bamboo Art Installation Case Study'. In D. Holzer, W. Nakapan, A. Globa, I. Koh (Eds.), RE: Anthropocene, Design in the Age of Humans - Proceedings of the 25th CAADRIA Conference - Volume 2, Chulalongkorn University, Bangkok, Thailand, 5-6 August 2020, pp. 313-322. CAADRIA. http://papers.cumincad.org/cgi-bin/works/Show&_id=caadria2010_003/paper/caadria2020_426.
7. Grandi, Jeronimo Gustavo, Henrique Galvan Debarba, and Anderson Maciel. 2019. 'Characterizing Asymmetric Collaborative Interactions in Virtual and Augmented Realities'. In 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 127–35. Osaka, Japan: IEEE. https://doi.org/10/gj27xd.
8. Gugenheimer, Jan, Evgeny Stemasov, Julian Frommel, and Enrico Rukzio. 2017. 'ShareVR: Enabling Co-Located Experiences for Virtual Reality between HMD and Non-HMD Users'. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, 4021–33. CHI '17. New York, NY, USA: Association for Computing Machinery. https://doi.org/10.1145/3025453.3025683.
9. Ibayashi, Hikaru, Yuta Sugiura, Daisuke Sakamoto, Natsuki Miyata, Mitsunori Tada, Takashi Okuma, Takeshi Kurata, Masaaki Mochimaru, and Takeo Igarashi. 2015. 'Dollhouse VR: A Multi-View, Multi-User Collaborative Design Workspace with VR Technology'. In SIGGRAPH Asia 2015 Emerging Technologies, 1–2. SA '15. New York, NY, USA: Association for Computing Machinery. https://doi.org/10.1145/2818466.2818480.
10. Jahn, Gwyllim, Cameron Newnham, Nick van den Berg, Melissa Iraheta, and Jackson
Wells. 2019. 'Holographic Construction'. In Design Modelling Symposium Berlin, 314–24.
Springer.
11. Liu, Yifan, Fadi Castronovo, John Messner, and Robert Leicht. 2020. 'Evaluating the Impact of Virtual Reality on Design Review Meetings'. Journal of Computing in Civil Engineering 34 (1): 04019045. https://doi.org/10/gprbfj.
12. Marques, Bernardo, Samuel Silva, Paulo Dias, and Beatriz Sousa-Santos. 2021. 'An Ontology for Evaluation of Remote Collaboration Using Augmented Reality'. https://doi.org/10.18420/ECSCW2021_P04.
13. McGrath, Andrew, and Wolfgang Prinz. 2001. 'All That Is Solid Melts Into Software'. In
Collaborative Virtual Environments, edited by Elizabeth F. Churchill, David N. Snowdon,
and Alan J. Munro, 99–114. Computer Supported Cooperative Work. London: Springer
London. https://doi.org/10.1007/978-1-4471-0685-2_6.
14. Mitterberger, Daniela, Kathrin Dörfler, Timothy Sandy, Foteini Salveridou, Marco Hutter,
Fabio Gramazio, and Matthias Kohler. 2020. 'Augmented Bricklaying: Human–Machine
Interaction for in Situ Assembly of Complex Brickwork Using Object-Aware Augmented
Reality'. Construction Robotics 4 (3–4): 151–61. https://doi.org/10/gn6g4x.
15. Mitterberger, Daniela, Selen Ercan Jenny, Lauren Vasey, Ena Lloret-Fritschi, Petrus Aejmelaeus-Lindström, Fabio Gramazio, and Matthias Kohler. 2022. 'Interactive Robotic Plastering: Augmented Interactive Design and Fabrication for On-Site Robotic Plastering'. In Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems. New Orleans: ACM. https://programs.sigchi.org/chi/2022/program/content/68974.
16. Nee, A. Y. C., S. K. Ong, G. Chryssolouris, and D. Mourtzis. 2012. 'Augmented Reality
Applications in Design and Manufacturing'. CIRP Annals 61 (2): 657–79.
https://doi.org/10.1016/j.cirp.2012.05.010.
17. Oda, Ohan, Carmine Elvezio, Mengu Sukan, Steven Feiner, and Barbara Tversky. 2015. 'Virtual Replicas for Remote Assistance in Virtual and Augmented Reality'. In Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology, 405–15. UIST '15. New York, NY, USA: Association for Computing Machinery. https://doi.org/10.1145/2807442.2807497.
18. Piumsomboon, Thammathip, Youngho Lee, Gun Lee, and Mark Billinghurst. 2017. 'CoVAR: A Collaborative Virtual and Augmented Reality System for Remote Collaboration'. In SIGGRAPH Asia 2017 Emerging Technologies, 1–2. Bangkok, Thailand: ACM. https://doi.org/10/gf2bqk.
19. Quigley, Morgan, Brian Gerkey, Ken Conley, Josh Faust, Tully Foote, Jeremy Leibs, Eric Berger, Rob Wheeler, and Andrew Ng. 2009. 'ROS: An Open-Source Robot Operating System'. In ICRA Workshop on Open Source Software.
20. Wang, Xiangyu, and Jerry Jen-Hung Tsai, eds. 2011. Collaborative Design in Virtual Environments. Intelligent Systems, Control and Automation: Science and Engineering 48. Dordrecht: Springer.
21. Zaker, Reza, and Eloi Coloma. 2018. 'Virtual Reality-Integrated Workflow in BIM-Enabled Projects Collaboration and Design Review: A Case Study'. Visualization in Engineering 6 (1): 4. https://doi.org/10/gprbfk.