Pop Up Factory: Collaborative Design in Mixed Reality
Interactive live installation for the makeCity festival, 2018 Berlin
Giovanni Betti1, Saqib Aziz2, Gili Ron3
1,2,3HENN GmbH
1,2{Giovanni.Betti|Saqib.Aziz}@henn.com 3giliron.space@gmail.com
This paper examines a novel, integrated and collaborative approach to design and fabrication, enabled through Mixed Reality. In a bespoke fabrication process, the design is controlled and altered by users in holographic space, through a custom, multimodal interface. User input is live-streamed and channelled to a 3D modelling environment, on-demand robotic fabrication and AR-guided assembly. The holographic interface is aimed at promoting human-machine collaboration. A bespoke pipeline translates hand gestures and audio into CAD and numeric fabrication. This enables non-professional participants to engage with a range of novel technologies. The feasibility of Mixed Reality for architectural workflows was tested through an interactive installation at the makeCity Berlin 2018 festival. Participants experimented with on-demand design, fabrication and AR-guided assembly. This article discusses the technical measures taken as well as the potential of holographic interfaces for collaborative design and on-site fabrication.
Keywords: Holographic Interface, Augmented Reality, Multimodal Interface,
Collaborative Design, Robotic Fabrication, On-Site Fabrication
INTRODUCTION
In this paper we argue that AR can improve human-machine collaboration in architecture-making. Holographic, multimodal interfaces simplify access to cutting-edge technology for all users. Human operators can edit digital information for CAD/CAM using intuitive interfaces. Leveraging human capabilities to communicate via speech and hand gestures allows non-professionals easy access to 3D modelling and robotic fabrication. We examined the potential of AR representation and multimodal interfaces for including non-professionals in architectural design and making, for human-machine collaboration and for multi-participant design. This approach was tested through a fully integrated cycle of holographic 3D modelling, on-demand robotic machining and AR-guided assembly that resulted in a collaborative installation for the makeCity Festival 2018 in Berlin.
BACKGROUND
The 4th industrial revolution marks the fusion of novel automation with advanced means of communication and increased connectivity (Schwab, 2016).
Simulation - VIRTUAL AND AUGMENTED REALITY 2 - Volume 3 - eCAADe 37 / SIGraDi 23 |115
Cyber-physical production, the joint work of human-human, human-machine and machine-machine, is the foundation of increased productivity on which Industry 4.0 rests (Schuh et al., 2014). In the architectural field this is seen in the adaptation of tools originating from various industries: fabrication with automotive-industry robotic arms, and visualization techniques originating in the gaming industry, such as AR and VR goggles. Synthesizing the two brings about precise, on-demand CAD/CAM fabrication on-site.
Studies in robotic fabrication mark it as a prominent tool of the future work site. Automation brings about increased process control, speed and precision of execution, and increased load-bearing capacities (Bonwetsch et al., 2012). It enables continuous work in-situ as well as an increase in workers' safety (Keating, 2017). On-site automated fabrication using robotic arms has been studied for its potential in combined fabrication with several agents (Parascho et al., 2017). The latest design research and application in robotic-human fabrication at building scale is seen in the collaborative project DFAB House, by NCCR Digital Fabrication and ETH Zurich (ETH Zurich, NCCR Digital Fabrication, 2018).
Augmented Reality is advantageous in showing contextual information regarding design and fabrication. It allows hands-free access to information, serving as a guide for fabrication (Zhou et al., 2011), increasing product quality and reducing product costs (Brettel et al., 2014). In architecture, on-site visualization during the 3D modelling design phase promotes context-aware design, adaptation to site conditions (Zlatanova, 2001) and increased design performance (Popovski, 2014). Visualization is a key factor in collaborative design and public participation (Allen, 2011). Previous experiments using AR interfaces for fabrication have been tested for professional and non-professional use (Weichel, 2014), executed at small scale (Do, 2010) and used to teach the skill of repeating a predefined design (Billinghurst, 2015). AR and VR have long been identified for their simulative advantages, training human agents in working with digital fabrication tools (Lin, 2002). The results of improper operation can be simulated without incurring the associated costs in terms of human injury and equipment repair.
Multimodal interfaces make use of natural human communication capabilities: speech, gestures, touch. They are advantageous in translating long-learnt motor skills into machine-related skills in a short time (Arshad et al., 2017). Multimodal interfaces have been in use in the gaming industry, tracking motion and sound. Inputs are tracked using sensors (Nintendo Wii; Microsoft Kinect) or translated through remote controllers (HTC Vive; Samsung Gear VR; Oculus Rift). The translation of human motor skills into machine code is seen in digital art-making (Hacmun, 2018) and in medical practice, allowing remote support from experts (Bodner, 2005). Human-body "outputs" translated into machine code have been used as artistic generators in art installations: heart-beats, neuronal activity and sound. The latter, an exploration in sound-driven pattern-making, serves as the basis for the installation discussed here (Betti, 2018).
A fully operational AR-based multimodal system such as the one prototyped here has yet to be developed and tested at scale, but it seems to belong to a rapidly approaching future.
The use of a holographic, multimodal interface is tested for its possible impact on in-situ design, fabrication and assembly. The workflow demonstrated here is aimed at including non-professionals in a collaborative design and fabrication process. For this purpose, machine inputs are customized to fit the everyday user. In sum, the Pop-Up Factory system is composed of hardware, software and human agents.
METHODS
The Pop-Up Factory installation explores digital design, fabrication and assembly on-site using multiple technologies, controlled via AR. The hardware components are a Microsoft HoloLens AR headset, an ABB 1200 robotic arm augmented with a custom-made hot-wire cutting end-effector, and a microphone. The software components are Rhinoceros for 3D modelling, Grasshopper and Python for programming, and various Grasshopper plug-ins, among them Fologram and HAL Robotics. We discuss these in the following sections.

Figure 1: The physical components of the Pop-Up Factory installation.

Figure 2: The physical and holographic components of the Pop-Up Factory installation.
1) Holographic Interface: A bespoke AR user interface was designed for the design and fabrication enterprise. The installation consists of a workflow in three stages: design of the brick tower (CAD), fabrication of bricks (CAM) and manual assembly. The AR interface is used to control software and automated hardware, allows navigation between the different CAD/CAM stages and serves as a guide in the assembly sequence. The interface exhibits six applications, 'Action Buttons', each displaying an identifying icon:

1. Eye / Hand icon: shifts between two visualization modes: the built portion of the tower (hand) and the design of the overall tower (eye).
2. Control Points: controls the paths that determine the tower's growth patterns and overall form.
3. Audio Recording icon: starts and pauses an audio recording, used to texture the brick's exterior face.
4. Ruler icon: defines the portion of the design still available for modification ('above the line') or already decided ('below the line').
5. Send Fabrication Data: streams the wire-cutting tool-path to the robot.
6. Show Assembly Instructions

Hand gestures, Air Tap and Drag, are registered by the headset's sensors, activating the interface. The Augmented Reality goggles display the proposed design on top of the structure.
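As an illustration of the dispatch logic behind the six 'Action Buttons', the sketch below routes an Air Tap event to its workflow stage. This is a hypothetical reconstruction, not the installation's actual code: the real system bridged HoloLens events into Grasshopper via Fologram, and all identifiers here are assumptions.

```python
# Hypothetical sketch of the 'Action Button' dispatch described above.
# Button identifiers and handler wiring are assumptions for illustration.

ACTIONS = {
    "eye_hand": "toggle visualization mode (built portion vs overall design)",
    "control_points": "edit the paths driving the tower's growth pattern",
    "audio_record": "start/pause recording used to texture the brick face",
    "ruler": "set which portion of the design is still editable",
    "send_fabrication": "stream the wire-cutting tool-path to the robot",
    "show_assembly": "display AR assembly instructions",
}

def on_air_tap(button_id):
    """Dispatch an Air Tap registered on one of the six action buttons."""
    if button_id not in ACTIONS:
        raise ValueError("unknown action button: %s" % button_id)
    return ACTIONS[button_id]

print(on_air_tap("send_fabrication"))
```

In the installation the return value would trigger the corresponding Grasshopper routine rather than a string lookup.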
2) Architecture: the AR interface is used in the making of an architectural-scale brick installation. The design emerges from the assembly of individual bricks. Its gradual growth is subject to user input, with the following limitations:

1. The tower's dimensions respond to site conditions.
2. The base-to-canopy ratio is limited to ensure structural stability.
3. The tower's aggregation is limited by human reach, stopping at 1.9 m.
4. The components' dimensions fit a human palm (approx. 15 x 10 x 5 cm).
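These limits can be checked programmatically before a design is accepted for fabrication. The sketch below is a minimal illustration: only the 1.9 m reach limit and the approximate brick dimensions come from the text, while the ratio threshold and all names are assumptions.

```python
# Illustrative constraint check for a proposed tower design.
# MAX_HEIGHT_M and BRICK_DIMS_CM come from the text; the ratio limit
# and all function/variable names are assumptions.

MAX_HEIGHT_M = 1.9           # aggregation stops at human reach
BRICK_DIMS_CM = (15, 10, 5)  # approximate palm-sized component

def validate_design(height_m, base_width_m, canopy_width_m,
                    max_base_to_canopy_ratio=3.0):
    """Return a list of violated constraints (empty list = valid design)."""
    violations = []
    if height_m > MAX_HEIGHT_M:
        violations.append("tower exceeds human reach of %.1f m" % MAX_HEIGHT_M)
    if canopy_width_m > 0 and base_width_m / canopy_width_m > max_base_to_canopy_ratio:
        violations.append("base-to-canopy ratio endangers stability")
    return violations

print(validate_design(1.5, 0.6, 0.4))  # [] (no violations)
```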
3) 3D modelling: digital design typically requires previous experience with 3D modelling environments. To include non-professional users in a collaborative design effort, a new, intuitive AR-based interface was developed. The new design pipeline makes use of sound inputs and hand gestures to generate and modify the tower's design. Holographic-interface 3D modelling takes place at two scales and is performed as follows: the design of the tower is attained with hand gestures, shifting control points; the design of the tower's façade pattern is performed through the processing of audio signals. Sound samples are divided into channels: tone, pitch and volume. Each façade element receives a unique pattern, specific to each user (Betti et al., 2018). A bespoke 3D-modelling pipeline discretizes the global form into bricks. The bricks differ in surface texturing and in the geometry of the ___ contact surfaces (responsive to the tower's overall geometry) to generate the tower's curvature. User inputs are registered in the CAD/CAM environment and updated in real time as new data arrives.
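The audio-to-pattern step described above can be sketched as a mapping from simple audio features to per-brick surface relief. This is a hedged illustration, not the authors' Grasshopper/Python pipeline: the feature extraction (RMS volume, zero-crossing rate as a pitch proxy) and the displacement mapping are assumptions.

```python
# Hedged sketch: mapping audio features (volume, pitch proxy) onto a
# row of facade displacement values for one brick. All names and the
# exact mapping are assumptions for illustration.
import math

def audio_to_texture(samples, sample_rate, n_points=16):
    """Map an audio clip to surface displacement values (arbitrary units)."""
    # Volume channel: RMS amplitude of the clip.
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    # Pitch proxy: zero-crossing rate, in crossings per second.
    crossings = sum(1 for a, b in zip(samples, samples[1:]) if a * b < 0)
    zcr = crossings * sample_rate / len(samples)
    # Volume scales the relief depth; the pitch proxy sets its frequency.
    freq = max(zcr / 100.0, 1.0)
    return [rms * math.sin(freq * 2 * math.pi * i / n_points)
            for i in range(n_points)]

# One second of a 440 Hz sine at 8 kHz, standing in for a voice sample.
wave = [math.sin(2 * math.pi * 440 * t / 8000) for t in range(8000)]
relief = audio_to_texture(wave, 8000)
print(len(relief))  # 16 displacement values, one per facade point
```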
4) Robotic Fabrication: on-demand fabrication, using a robotic arm placed in-situ, is controlled through the interface. A bespoke pipeline translates 3D modelling information into robotic machine code. The tool-path was designed to execute each brick's joinery and façade details with minimal leftover material. Each tool-path generates five bricks, cut from black foam.
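The minimal-waste idea behind the tool-path can be illustrated by the simplest case: slicing one foam block into a series of five bricks with straight hot-wire cuts and no trim cuts. The stock length and data layout below are assumptions; the actual pipeline generated robot code with the HAL Robotics plug-in inside Grasshopper.

```python
# Minimal-waste slicing sketch: five bricks from one foam block need
# only four internal wire cuts. Dimensions are assumed for illustration.

BLOCK_LENGTH_CM = 75   # assumed stock length: five 15 cm bricks
BRICKS_PER_BLOCK = 5

def wire_cut_positions(block_length=BLOCK_LENGTH_CM, n=BRICKS_PER_BLOCK):
    """Return the x-positions (cm) of the straight cuts separating bricks."""
    brick_length = block_length / n
    # n bricks need n - 1 internal cuts; no trim cuts means no waste.
    return [brick_length * i for i in range(1, n)]

print(wire_cut_positions())  # [15.0, 30.0, 45.0, 60.0]
```

The real tool-path additionally shapes each brick's joinery and façade relief; this sketch only captures the partitioning logic.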
5) Assembly: users assemble the bricks manually. Assembly instructions are shown in AR and update as the structure grows. Once a series of five bricks is cut, animated arrows indicate their orientation, location and assembly sequence. Shifting between visualization modes allows users to evaluate the ongoing production process and to inspect both the individual bricks and the overall structure.
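The sequencing behind the updating instructions can be sketched as pairing each cut brick with a target location and an order index for the AR overlay. Data shapes and names below are assumptions, not the installation's actual representation.

```python
# Illustrative sketch of AR-guided assembly sequencing: bricks already
# placed are skipped, and the next steps are emitted in order for the
# animated overlay. Brick IDs and target coordinates are assumed.

def assembly_steps(bricks, placed_count):
    """Yield (step_number, brick_id, target) for bricks not yet placed."""
    for i, (brick_id, target) in enumerate(bricks):
        if i >= placed_count:          # skip bricks already in the structure
            yield (i + 1, brick_id, target)

# One series of five bricks with assumed grid targets.
series = [("B01", (0, 0)), ("B02", (1, 0)), ("B03", (0, 1)),
          ("B04", (1, 1)), ("B05", (0, 2))]
next_step = next(assembly_steps(series, placed_count=2))
print(next_step)  # (3, 'B03', (0, 1))
```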
Figure 3: Icons of the holographic interface and their activation.
RESULTS
The installation lasted two days and was visited by the general public and colleagues, ranging in age and design experience. The following workflow was exercised: the installation space accommodated three production-related areas, designated for 3D design, numeric fabrication and assembly. A graphic tutorial exhibited on-site introduced visitors to the different holographic design modes and their related 'Action Buttons'. Participants were first introduced to AR space and the operating gestures: wearing the AR goggles and practicing the Air Tap and Drag commands and interface navigation. Taking part in the collaborative design, participants explored digital design and fabrication using the 'Action Buttons'. Each user in turn added to the global design of the structure, fabricating and assembling a series of five bricks.
Crowd participation was somewhat limited: while people were keen to try on the AR goggles and experience the interface and installation in AR, fewer took part in the design itself. We attribute this mainly to the limited time available for each visitor to interact with the installation.
Human-machine collaboration. Most interface 'Action Buttons' were reported as easy to use. Hand gestures shifting the 'Control Points' were sometimes not registered. This could be due to their location at the very top of the holographic space, where they were missed by the sensors, and to reading errors caused by sun glare coming through the windows of the installation space.
The AR interface was extremely successful in enabling a productive interaction with the robotic arm, allowing continuous fabrication with no physical interaction between participants and robot.

Figure 4: The various interaction elements of the holographic interface.

Figure 5: Two of the innumerable possible configurations of the installation.

Figure 6: Overview of the installation through the holographic interface.
The hands-free and mobile interface eased the free-form assembly of multiple discrete elements, which would otherwise be extremely arduous. Visual feedback enabled design in context and accurate fabrication. The sequenced design and fabrication process realized for non-professionals was clear and easy to follow.
Human-human collaboration. The designed work sequence had a large effect on the nature of the collaborative design practiced. Though the installation relied on multiple design inputs, its outcome was bound by design definitions made beforehand. These prerequisites were aimed at fast fabrication, minimal waste and keeping to the installation timeline. Brick dimensions, for example, were predefined, allowing individual input only on the 'façade' and joinery details. Another example of user input being bound to the installation's predefined logic is that the tower could be designed collaboratively only through aggregation.
Another interesting aspect of the participatory discourse was the sequential nature of participation. Most participants wanted to contribute to the design only once. This outcome differs from what is usually seen in participatory design, where participants are encouraged to debate and alter the design until the final result is agreed by the majority of the group.
Our initial hypothesis was thus only partially verified. We still believe that AR can play a role in promoting collaborative design.
Figure 7: Human-machine collaboration.

Figure 8: AR-guided assembly.
DISCUSSION
1. This paper outlines a novel, integrated and collaborative approach to design and fabrication, enabled through Augmented Reality, robotic fabrication and a multimodal interface. In this installation, the custom notational input usually required for 3D modelling was replaced, made accessible to intuitive input. We contribute to the emerging field of production research by providing simplified access to innovative design and fabrication tools. Hand gestures and speech were used for multimodal interaction with CAD/CAM in a large-scale, multi-participant effort. Non-professionals easily followed visual cues in AR and utilized their existing communication skills to control 3D modelling and robotic fabrication with little training time. In doing so, the focus of architecture-making shifts from forming the design itself to forming a platform for others to explore design. The public, now at the focus of architecture-making, participates in a decision-making creative effort and in manufacturing using cutting-edge technology. For the architectural field this marks a potential shift towards public participation in practice and discourse. Through active participation, the public gains a better understanding of the impact of the design and has the freedom to alter it according to their wishes.
2. Recent innovations in the optics industry accelerate simulative abilities, enabling 3D representation in-situ. Augmented Reality gives access to plentiful, varied information. It is advantageous for experiencing a proposed design in the context of site or end-user, for teaching skills to non-professionals and for quality control. In this installation, the AR interface was used for 3D design on-site, providing context to the structure; for remote and simple control over automated fabrication tools; for instructions enabling precise execution in manual fabrication; and as a means of assessing the assembled structure. We believe AR as a generative and critical tool has the potential to bridge gaps between plan and execution, bringing about adapted design.
3. The Digital Turn (Carpo, 2013) and parametric practice embodied a change in workflow. The use of AR leads a change in design representation, moving from 2D drawings to 3D elements. New ways of communicating architecture might have great consequences for the built environment: in the form of new geometry used and of new practitioners in the field interacting with advanced technology.
REFERENCES
Allen, M, Regenbrecht, H and Abbott, M 2011 ’Smart-
Phone Augmented Reality for Public Participation in
Urban Planning’, OZCHI ’11, pp. 11-20
Arshad, SZ 2017 ’Human-in-the-Loop Machine Learning
with Intelligent Multimodal Interfaces’, Proceedings
of the ICML2017 Workshop: Human in the Loop Ma-
chine Learning, Sydney, Australia
Betti, G 2018 ’Communication Landscapes’, Robotic Fab-
rication in Architecture, Art and Design, pp. 74-84
Billinghurst, M 2015, 'A Survey of Augmented Reality', Foundations and Trends in Human-Computer Interaction, 8, pp. 74-272
Bodner, J 2005, ’The Da Vinci Robotic System for General
Surgical Applications: a Critical Interim Appraisal’,
Swiss Medical Weekly, 135, pp. 674-679
Bonwetsch, T, Gramazio, F and Kohler, M 2012, R-O-B: Towards a Bespoke Building Process, John Wiley and Sons
Carpo, M 2013, ’The Digital Turn in Architecture 1992 -
2012’, AD, 1, pp. 8-15
Brettel, M, Friederichsen, N, Keller, M and Rosenberg, M 2014, 'How Virtualization, Decentralization and Network Building Change the Manufacturing Landscape', World Academy of Science, Engineering and Technology, 8, pp. 37-44
Parascho, S, Gandia, A, Mirjan, A, Gramazio, F and Kohler, M 2017 'Cooperative Fabrication of Spatial Metal Structures', Fabricate 2017: Rethinking Design and Construction, pp. 24-30
Hacmun, I, Regev, D and Solomon, R 2018, 'The Principles of Art Therapy in Virtual Reality', Frontiers in Psychology, 9, pp. 1-8
Weichel, C, Lau, M, Kim, D, Villar, N and Gellersen, H 2014 'MixFab: a Mixed-Reality Environment for Personal Fabrication', Proceedings of CHI '14
Do, TV and Lee, JW 2010, '3DARModeler: a 3D Modeling System in Augmented Reality Environment', World Academy of Science, Engineering and Technology, 39, pp. 538-547
Keating, SJ, Leland, JC, Cai, L and Oxman, N 2017, 'Toward Site-Specific and Self-Sufficient Robotic Fabrication on Architectural Scales', Science Robotics, 26, pp. 1-15
Popovski, F 2014, 'Generating 3D Model in Virtual Reality and Analyzing its Performance', International Journal of Computer Science & Information Technology (IJCSIT), 6, pp. 123-128
Schuh, G, Potente, T, Wesch-Potente, C, Weber, AR and Prote, JP 2014 'Collaboration Mechanisms to Increase Productivity in the Context of Industrie 4.0', Procedia CIRP 19, Robust Manufacturing Conference (RoMaC 2014), pp. 51-56
Schwab, K 2016, The Fourth Industrial Revolution, Crown
Publishing Group
Lin, F, Ye, L, Duffy, VG and Su, CJ 2002, 'Developing Virtual Environments for Industrial Training', Information Sciences, 1, pp. 153-170
Zhou, J 2011 'Applying Spatial Augmented Reality to Facilitate In-Situ Support', Proceedings of VRCAI 2011, Hong Kong, China, pp. 195-200
Zlatanova, S 2001 ’3D Modelling for Augmented Reality’,
Dynamic and Multi-dimensional GIS, Bangkok, Thai-
land
[1] https://dfabhouse.ch/