The Life Project
Camille Baker
Brunel University
London, UK
camille@swampgirl67.net
Fiona French
London Metropolitan
University
London, UK
f.french@londonmet.ac.uk
Evan Raskob
Openlab Workshops
London, UK
evan@FLKR.com
Nick Rothwell
Cassiel.com
London, UK
nick@cassiel.com
with Andrew Crowe, Giorgio Demarco, Steven Fortune, Gustavo Guerrero, Lori Ho, Simon Katan, Chris Lowell, Manuel
Mazzotti, David McLellan, Francesca Perona, Darren Perry, Elvia Vasconcelos and support from SPACE Studios
The Life Project explores issues of psychological projection into technology by diving into the
convoluted relationship between practical purpose and emotional attachment, through both the
creative act of designing and making robot entities with artificial emotions, and the social act of
engaging with them. This process explores the concept of body representation through a multi-
identity in virtual and physical blended space. To a lesser extent, it also suggests a future world of
collaboration between physical and virtual forms, enabled by new forms of representation in
blended worlds.
Robots, Avatars, Open Source, Twitter, QR Code, Emergence, Virtual Lifeform, Craft, Design, Workshop.
1. INTRODUCTION
The Life Project was originally conceived by
Openlab Workshops as a collaborative workshop
series for a diverse group of artists, designers,
makers and musicians, developed as part of the
Permacultures exhibitions at SPACE Studios
(SPACE 2012). The aim of the workshops was to
explore the boundary between the virtual and the
real by examining our complex, mutually dependent
relationship with technology. This aim was to be
achieved by designing and building an “ecosystem”
of small digital Creatures (or robots) that would
mutually interact and influence each other, and also
interact with human participants who could choose
to feed them and/or alter their environmental
parameters in meaningful ways.
Inspiration was taken from a variety of sources
including generative systems such as Conway’s
Game of Life as an investigation of emergent
behaviour, ecological and environmental concerns,
digital pet toys such as Tamagotchi, video games
and AI, as well as current research into modelling
emotional intelligence systems.
2. VIRTUAL RELATIONSHIPS
Most technologists understand that robots are
purely mechanical devices, incapable of friendship,
inner thoughts, or human emotion. This practical
knowledge directly conflicts with our primal human
instinct to anthropomorphise and empathise with
animate beings, assigning feelings to them that
they may or may not have, but that we certainly
possess (Nass et al, 1997). For example, silencing
a creature’s thoughts and emotions is a decision
we do not take lightly: studies show that manually
powering off a robot while it is actively moving
around and making sounds causes emotional
conflict in almost all people, regardless of whether
or not they understand that the mechanism is purely a robot (Turkle, 2011, loc 815-816). In a
fundamental way, we imbue our technological avatars with our own consciousness.
The designing and building workshops forming the
core of the project operated according to an
“ecological perspective” of art, which Suzi Gablik
(1991) explained as:
connect[ing] art to its integrative role in the
larger whole and the web of relationships in
which art exists, emphasizing community and
environment, and giving a deeper account of
what art is doing, reformulating its meaning and
purpose beyond the gallery system, in order to
redress the lack of concern, within the aesthetic
model, for issues of context and social
responsibility.
Forming a relationship with something or someone
is a process that occurs over time. This basic
observation drives the focus of the workshops on
“process, not product.” The end result of the
workshops is less reliant on actual, completely realised, physical "Creatures" than on the formative process of social discourse and learning centred around planning and making them. Through
making the Creatures and their world we gradually
unpick the complex, recursive relationship we have
between the things we make and ourselves, or to
paraphrase McLuhan in Understanding Media
(1964), how we shape our tools and our tools in
turn shape us.
3. PROJECT DEVELOPMENT
3.1 Overview of the First Iteration
The first iteration of the project started in April 2011
and culminated in a public exhibition in October
2011, at SPACE Studios in Hackney, London (Fig.
1), where the team installed working digital
components and displayed important concepts
developed over the first phase of the project.
Figure 1: Exhibition in SPACE Studios
Over these seven months, a group of around
twenty volunteers came together to respond to the
original brief, under the guidance of facilitator Evan
Raskob of Openlab Workshops. At first they met weekly to discuss and debate the project's design and outcomes; later they took part in building sessions where they both learned and taught digital and craft skills in a collaborative space at SPACE. Team members
also met virtually during the week, and blogged
about the meetings (Life Project blog 2012) so that
members of the public, as well as team members,
could engage with the process. The exhibition in
October showed the results of their sessions: a
series of digital "Creatures" situated in a
sculptural, responsive “Environment,” along with
diagrams explaining the Creatures’ AI and the plan
for the finished installation.
Figure 2: Creatures exchange feelings
Inherent to the Creatures was an emotional AI
system, based on a simplistic model of human
emotions (Lee et al, 2008, p.104-113), so they
could affect one another (Fig.2) and respond in an
appropriate way to human participants.
The Environment was designed to influence the
Creatures’ emotions and act as a conduit between
the outside world (Twitter and other Internet
sources) and each Creature’s self-contained world
in the gallery space.
Figure 3: The Happy Cave
Both the Creatures and the Environment were
constructed from a system of microprocessors with
a variety of I/O including sensors, lighting, sound,
and actuators. Creatures communicated invisibly
with one another and with their constructed
Environment using wireless radio and infrared
pulses.
Areas of the Environment could influence the Creatures' emotional states in appropriate ways,
such as the “Anti-Social Forest” constantly
broadcasting an “anti-social” emotion to the
Creatures using infrared communication, and a
“Happy Cave” (Fig.3) similarly broadcasting
“happiness.”
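As a rough illustration, the following Arduino-style fragment sketches how such a zone beacon might work; the pin number, emotion code and bit encoding are all assumptions rather than the project's actual protocol (Pixelpusher on Github, 2012):

// Hypothetical sketch of an Environment zone beacon: an IR LED on pin 3
// repeatedly broadcasts a one-byte emotion code.
const int IR_LED_PIN = 3;
const byte EMOTION_ANTISOCIAL = 0x02;  // assumed code for "anti-social"

void setup() {
  pinMode(IR_LED_PIN, OUTPUT);
}

// Send one byte as a crude series of 38 kHz bursts: long burst = 1, short = 0.
void sendEmotion(byte emotion) {
  for (int bit = 7; bit >= 0; bit--) {
    int burstLen = (emotion & (1 << bit)) ? 1200 : 600;  // carrier time in us
    for (long t = 0; t < burstLen; t += 26) {            // ~38 kHz carrier
      digitalWrite(IR_LED_PIN, HIGH);
      delayMicroseconds(13);
      digitalWrite(IR_LED_PIN, LOW);
      delayMicroseconds(13);
    }
    delayMicroseconds(600);  // gap between bits
  }
}

void loop() {
  sendEmotion(EMOTION_ANTISOCIAL);  // the Anti-Social Forest repeats its message
  delay(1000);
}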
As the Creatures’ emotional AI and response
systems were developed, the team supplied them
with their own individual Twitter accounts for
sharing their emotions and responding to
humans. Additionally, people could feed the
Creatures by scanning QR codes attached to them,
and alter their moods by repositioning them at
specific places in the Environment.
3.2 Technological Considerations
The technology platform for the project was driven
by the core philosophy of the facilitating
organisation, Openlab Workshops (OLW). OLW
develops and runs educational workshops across
the UK, but mainly in London. Their focus is on
teaching people how to use Free/Libre Open
Source Software (FLOSS) technology effectively in
their own creative practice. This involves
explaining copyright and legal issues surrounding
free and non-free software, developing practical
software and hardware skills, constructing projects
using collaborative methods, and examining how
technology directly affects the creative practice.
OLW was founded on principles of openness, transparency, collaboration, and sharing, and requires all workshops to use as much FLOSS technology as possible to achieve their educational aims. For this project, the team required an easy-to-use, Open Source hardware platform that was inexpensive enough to fit the project's small budget, and
yet versatile enough to allow the team to
experiment with a wide variety of sensors,
actuators, and other feedback devices.
The team chose the Arduino platform (Arduino
2012), and the Arduino-derived microcontroller
platform called the JeeNode (JeeLabs 2012)
because they met most of these
requirements. Arduinos are microcontroller
experimentation platforms with Open Source
hardware designs, running Open Source software,
and programmed using an Open Source integrated development environment (IDE). Physically, they
consist of a programmable microprocessor
mounted on a prototyping board with standard
connections.
The main benefits of the Arduino platform are its
simplified, cross-platform programming system and
its large and supportive community. Many learning
resources are available online, and numerous manufacturers take advantage of the open standards to provide a wide variety of functional accessories such as motor controllers and wireless
communication devices. The main issues with the
Arduino were its relatively large size and cost,
compared with the size and cost of standalone
microcontrollers and basic components. Each Arduino, if used, would have consumed over half of the target budget of £40 per Creature.
Instead, the team chose the JeeNode. The
JeeNode is a cheaper, low-power Arduino-
compatible board with a smaller footprint and short-
range wireless radio communication technology
built into it. JeeNodes have a variety of
inexpensive, useful add-on boards that provide out-
of-the-box functionality for controlling multiple
LEDs, motors, and other I/O devices. The main
benefits of the JeeNode were its lower cost, smaller
footprint, and additional cost savings from having a
built-in wireless board instead of needing to
purchase separate units. The downside is that
programming the JeeNodes is more complex due
to their software’s use of more advanced C++
language features and more efficient but complex
communication protocols (often using bit
operations). This means more time would be spent
on rudimentary software development, debugging,
and learning new software skills, rather than on
active prototyping and systems design.
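To give a flavour of this, here is a minimal, hypothetical example of a bit-packed emotion broadcast, assuming JeeLib's RF12 driver (rf12_initialize, rf12_recvDone, rf12_canSend, rf12_sendStart); the field layout, node ids and emotion codes are illustrative only, not the project's actual protocol:

#include <JeeLib.h>

const byte NODE_ID = 2;    // this Creature's id on the wireless network
const byte GROUP   = 100;  // shared network group (assumed)

// Pack a 4-bit creature id and a 4-bit emotion code into a single byte.
byte packMessage(byte creatureId, byte emotion) {
  return (creatureId << 4) | (emotion & 0x0F);
}

void setup() {
  rf12_initialize(NODE_ID, RF12_868MHZ, GROUP);
}

void loop() {
  rf12_recvDone();  // keep the RF12 driver's state machine running
  byte msg = packMessage(NODE_ID, 0x03);  // 0x03 = HAPPY (assumed code)
  if (rf12_canSend()) {
    rf12_sendStart(0, &msg, sizeof msg);  // broadcast to the group
  }
  delay(2000);
}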
For rapidly prototyping design ideas in software,
the creative coding platform Processing
(Processing 2012) was used. Processing is an
Open Source, Java-based platform for quickly
sketching out ideas in code. The team used
Processing to test out ideas in artificial intelligence,
and to interact with Twitter.
All source code and design documents are licensed
as Free Software or Open Source and available
online (Pixelpusher on Github, 2012).
3.3 Group Organisation
At the start of the project, the team was quite large
and so work began in smaller teams geared
towards specific project development roles: a
Design Team, an Embedded Systems Team, a
Materials Team, and a Web Team. The Design
Team was tasked with developing concepts for
Creatures: how they might look, where they would
exist, what they might do. The Embedded Systems
Team would develop practical digital ideas for
building the Creatures: sensors, locomotive
systems, communication systems, lighting. The
Materials Team would investigate practical building
materials for the Creatures and their environment,
such as fabrics, latex, and sculptural
materials. The Web Team was responsible for the
blog, group forum, Twitter, and other modes of
web-based public and inter-team communications.
After some activity, the team found that the Embedded Systems Team and Design Team were sufficient to cover all design issues, and decided not to maintain a separate Materials Team.
3.4 Documentation of the Design Process
The Design Team looked carefully at the
relationship between the Creatures and humans,
and between the Creatures and their Environment.
According to team notes, they started with the simple concept of "robots that talk to each other" and moved on to more complex interactions between the robots. They explored behaviours from
animal life: feeding, sleeping, procreating, being
lonely, loving each other, even boredom. The
proposition was that without care, Creatures die;
left to themselves, they cooperate with one
another; overstimulated, they grow apathetic
and bored, even to the point of dying.
Using such a small number of states to represent
the complexity of this robotic life was inspired in part by the simple cellular automata of Conway's "Game of Life", by Alberto de Campo's digital creatures project Varia Zoosystematica Profundorum (de Campo et al, 2010), and by chaos theory and generative forms in nature such as sea shells (Meinhardt 1998).
Representing reality in a convincing form is
something computers fundamentally do with only
two states, the binary 1 and 0. It is the structures
and processes encoded in simple state values
which create meaning: in this case, the way in
which each Creature's state changes over time
and in relation to the states of other Creatures and
the Environment.
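Conway's Game of Life, cited above, makes this point concisely: each cell stores a single bit, and all of the system's apparent behaviour comes from the transition rule rather than from the states themselves. A standard update step, shown here purely for illustration (ordinary C++, not project code):

#include <string.h>

const int W = 16, H = 16;

// Advance the grid one generation, wrapping at the edges.
void step(bool grid[H][W]) {
  bool next[H][W];
  for (int y = 0; y < H; y++) {
    for (int x = 0; x < W; x++) {
      int n = 0;  // count live neighbours
      for (int dy = -1; dy <= 1; dy++)
        for (int dx = -1; dx <= 1; dx++)
          if (dx != 0 || dy != 0)
            n += grid[(y + dy + H) % H][(x + dx + W) % W];
      // The entire rule: live cells survive with 2 or 3 neighbours,
      // dead cells come alive with exactly 3.
      next[y][x] = grid[y][x] ? (n == 2 || n == 3) : (n == 3);
    }
  }
  memcpy(grid, next, sizeof next);
}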
A theme arose of 'anthropology versus husbandry': should the team take an ethnographic approach, observing the Creatures from an objective distance without disturbing them, or a hands-on approach, using deliberate actions to guide their evolution?
A question arose as to whether the team should
create Creatures requiring human assistance
(“domestic” Creatures), or “wild beasts” existing
independently of outside intervention. This choice was framed as "Captivity" versus "The Wild"; Nature versus Nurture; Ecology versus Evolution.
“Please don’t feed the machine” wrote Elvia in her
notes.
The team expressed strong moral and ethical
feelings around this topic. Was human intervention in nature inherently damaging? Members
brought up the terms “purity” and “contamination” to
qualify this relationship. The discussion about
creating artificial "life" quickly became a proxy for discussing the fraught relationship between humans and the life around us. Are we humans
simply another part of nature, or something
intrinsically different? Another discussion, inspired
by Internet “memes,” centred around the theme of
“language as virus,” where our language or
interaction could infect the Creatures’ programming
and spread amongst them. Might they contaminate
us back in some way?
The team decided that the “cellular” Creatures
should have their own behaviour when people were
not around, as well as their own language for
communicating with one another. There was some
debate as to whether this language should be
observable, and therefore, potentially
understandable by humans, or invisible, or just
plain unintelligible. “Robots should communicate
with one another foremost, and then how the
audience interacts with them can be considered the
emergent behaviour of the system” wrote Gustavo.
It was suggested that, even if they were incomprehensible in physical form, the Creatures would use Twitter and possibly other Internet media to broadcast thoughts and desires whenever they felt it necessary. In the end, the team
decided to use light and sound so that people
observing the Creatures could figure out what the
robots were communicating. Additionally, Twitter
feeds would give a wider human audience a
voyeuristic look into the fishbowl-like lives of the
Creatures.
The need to make a physical installation eventually focused the discussion on a concrete list of functional requirements for building the Creatures:
(i.) They transmit emotional states to one
another using an “emotional map”
(ii.) They transmit “physical states” to one
another, such as DEAD and DYING (these
states were eventually combined with the
emotional states)
(iii.) They live on a raised surface (a table)
(iv.) They contain speakers inside for
communication and feedback
(v.) They are transparent, with embedded
LEDs, so we can see what goes on inside
(vi.) They have “eyes” (to transmit and receive)
(vii.) There is an Ethernet connection on one
Creature to communicate with the outside
world
(viii.) The Creatures feed off “Twitter energy” of
followers
(ix.) The installation space should allow for
remote viewing, using an overhead camera
and microphone
(x.) Human interactions – picking them up, moving them, interrupting an IR beam – change their emotional states
3.5 The Environment
The Creatures, being physical beings at the core,
had to exist somewhere. The place could not be
an arbitrary anywhere, but had to be a particular
place, with a clear concept behind it. This
Environment also had the responsibility of bridging
the gap between the physical world of the
Creatures and the virtual world of the Internet.
The Environment was not a passive medium that
the Creatures existed on top of, nor was it intended
as simply a transparent translator/broadcaster for
the Creatures. It was to be an agitator with multiple
personalities, acting in the same way as a
landscape does when it provokes emotions in
humans. Places are never neutral; arguably, they
influence us just as much if not more than we
influence them. "We shape our buildings; thereafter they shape us," said Winston Churchill (1943).
The team devised a list of key emotional features of
the Environment. These features would broadcast
emotions to the Creatures and affect their
emotional states:
(i.) Antisocial Forest (at Francesca’s
insistence): a place for Creatures to go and
be alone; a forest of solace
(ii.) Surprise Rocks: triggering unexpected
changes in the Creatures’ emotional states
through “surprise”
(iii.) Orgasmatron: a place that makes the
Creatures "horny" – an homage to Wilhelm Reich's Orgone Energy Accumulator
(Guardian 2012), as parodied by Woody
Allen in the movie Sleeper (1973)
(iv.) Happy Cave: a primordial place of
happiness
(v.) Social Plain: a large, flat area where many
Creatures could be positioned by
participants and observed communicating
with each other
Other features were necessary, such as a place to
put the electronics and wiring. Thus, the “Happy
Cave” was born as a cavern in the central plateau,
with an Antisocial Forest growing on top of it. The
electronics (microprocessor, wireless transmitter,
wiring) were designed to sit in the middle of the
structure housing these two features, easily accessible via a snap-off top under the forest.
At the same time, the Environment listened to
broadcasts from the Creatures – hunger, emotional
interactions between them – and relayed them to
the outside world via Twitter. It also worked in
reverse, taking communications via Twitter and
relaying them back to individual Creatures. These
two processes were referred to as “The
Conversation.”
3.6 The Conversation
The Conversation Flowchart (Fig. 4) described the
“conversation” between the outside world (people,
Twitter, the Web) and the Creatures. People in the
installation space could scan the unique QR code
(ThinkMakePlay 2011) attached to every Creature,
which triggered the Environment to feed them.
Figure 4: Conversation flowchart
The left side of the chart above explains this
process, which involved decoding the unique QR
code into a Web address (URL) that, when
accessed in a web browser such as Firefox, ran a
PHP script. The script stored the cumulative
number of scans of the QR code in a database,
then ‘tweeted’ this number to the Creature’s
individual Twitter account. Finally, it returned a link
to this Creature's Twitter account through the web
browser for the participant to follow and access the
Creature’s tweets. This complex QR code tracking
process was meant to be invisible to the
participant, whose intention was simply to feed the
Creature.
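The script itself was written in PHP; the C++ fragment below merely restates its logic under assumed names, with feedCounts standing in for the project's database and postTweet for the Twitter API call:

#include <iostream>
#include <map>
#include <string>

std::map<std::string, int> feedCounts;  // stands in for the web database

// Placeholder for the real Twitter API call.
void postTweet(const std::string &account, const std::string &message) {
  std::cout << account << " tweets: " << message << std::endl;
}

// Called when a Creature's QR code URL is accessed; returns the link
// handed back to the participant's browser.
std::string handleQrScan(const std::string &creatureId) {
  int count = ++feedCounts[creatureId];  // cumulative number of scans
  postTweet(creatureId,
            "I have been fed " + std::to_string(count) + " times");
  return "https://twitter.com/" + creatureId;
}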
Figure 5: Creature with QR code beside digital innards
This was only half of the feeding process. The right
side of the chart explains how the Environment
retrieved the sent information from the web
database, and then handled it by “feeding” all the
Creatures whose QR codes were scanned. This
involved wirelessly broadcasting “feeding”
messages targeting specific Creatures by their
unique, internal id number.
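A sketch of the Environment's half of this exchange, again assuming JeeLib's RF12 driver and an illustrative two-byte message layout (a command code followed by the target Creature's internal id):

#include <JeeLib.h>

const byte CMD_FEED = 0x01;  // assumed command code

struct FeedMessage {
  byte command;     // what to do (CMD_FEED)
  byte creatureId;  // which Creature should act on it
};

void broadcastFeed(byte creatureId) {
  FeedMessage msg = { CMD_FEED, creatureId };
  if (rf12_canSend()) {
    rf12_sendStart(0, &msg, sizeof msg);  // broadcast; Creatures filter by id
  }
}

void setup() {
  rf12_initialize(1, RF12_868MHZ, 100);  // node 1 = the Environment (assumed)
}

void loop() {
  rf12_recvDone();   // keep the driver serviced
  broadcastFeed(3);  // in the real system, ids come from the web database
  delay(5000);
}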
The Environment also received tattle-tale updates,
broadcast from the Creatures wirelessly, telling it
about interactions between the Creatures: how one
influenced the emotional state of another, such as
making it feel “happy” or “sad.” This information
was fed back to Twitter and the outside world in the
form of public tweets, so that observers could
follow these interactions.
3.7 Emotional Intelligence
These "simple" little Creatures were designed to be more interesting in aggregate than as individuals.
The Creature state flowchart (Fig. 6) described
what went on inside the Creatures’ “brains.”
Figure 6: Creature state flowchart
Internally, the Creatures’ emotions were constantly
in flux, evolving over time based on a large table of
rules (their AI). Left on their own, they would
change emotions in unpredictable ways, based on
these rules, which were in turn based on the
neuropsychology of how humans change emotion
(Lee et al, 2008, p.104-113). Externally, the
Creatures broadcast their emotions invisibly, for
other Creatures to pick up. This is like looking at
someone else and reading their emotions –
glancing across the room and seeing an angry
man, and becoming confused or fearful. Again, the
Creatures have a large table of rules (Pixelpusher
on Github, 2012) for responding to others’
emotions.
While the Creatures are “feeling” a particular
emotion, they emit appropriate sounds via a piezo
speaker and cycle through colourful animations
using dual embedded RGB LEDs (Fig. 5).
The Creatures update their current emotional state,
and respond to external emotional broadcasts, via
a look-up table in their Arduino code (Pixelpusher
on Github, 2012) for the external reactions and
another for the internal ones (Fig. 7). These tables
are essentially lists of probabilities for determining
which state a Creature manifests when either time
increases (for the internal case) or another
Creature confronts it with an emotion (for the
external case).
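A much-reduced sketch of this table-driven mechanism, with three invented emotions rather than the project's full set (the real tables live in the project repository: Pixelpusher on Github, 2012):

enum Emotion { HAPPY, SAD, ANTISOCIAL, NUM_EMOTIONS };

// internalChange[current][next]: probability (out of 100) of drifting from
// one emotion to another as time passes. Values here are invented.
const byte internalChange[NUM_EMOTIONS][NUM_EMOTIONS] = {
  { 70, 20, 10 },  // HAPPY mostly stays happy
  { 25, 60, 15 },  // SAD sometimes cheers up
  { 10, 30, 60 },  // ANTISOCIAL tends to stay withdrawn
};

// Draw the next emotion from the row of probabilities for the current one.
Emotion nextEmotion(Emotion current,
                    const byte table[NUM_EMOTIONS][NUM_EMOTIONS]) {
  int draw = random(100);  // Arduino random(): 0..99
  int cumulative = 0;
  for (int next = 0; next < NUM_EMOTIONS; next++) {
    cumulative += table[current][next];
    if (draw < cumulative) return (Emotion)next;
  }
  return current;  // unreachable if each row sums to 100
}

Emotion mood = HAPPY;

void setup() { randomSeed(analogRead(0)); }

void loop() {
  mood = nextEmotion(mood, internalChange);  // internal drift over time
  delay(1000);
}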
Figure 7: Internal emotion changes
3.8 Manifestation
The team spent some time considering how best to
represent the range of emotional states of the
Creatures using a limited number of actuators –
multi-coloured LEDs, sounds and tweets. The
design of the audio and the colour animations
came from the team’s own artistic ideas, inspired
by Dave Griffiths’ research into colour-emotion
waveforms (Griffiths 2012).
Examples:
(i.) state: Happy > colour = yellow/green/red >
animation = pulsating very brightly.
(ii.) state: Sad > colour = Blue > animation =
very slow dimming, then brightening a little,
then dim again.
The sounds were procedurally synthesised and played through miniature speakers on each Creature. Tonal variation was inspired by familiar, culture-specific human non-vocal noises associated with particular feelings.
Examples:
(i.) emulating a “wolf-whistle” to express
“horniness”,
(ii.) a “chirp” to show “happiness”,
(iii.) a low drone for antisocial feelings.
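An illustrative Arduino-style fragment of this manifestation layer; the pins, colours and waveforms are assumptions rather than the project's actual designs (Griffiths 2012; Pixelpusher on Github, 2012):

const int R_PIN = 9, G_PIN = 10, B_PIN = 11;  // RGB LED on PWM pins (assumed)
const int PIEZO_PIN = 8;                      // piezo speaker (assumed)

enum Emotion { HAPPY, SAD, ANTISOCIAL };
Emotion mood = HAPPY;

void setColour(byte r, byte g, byte b) {
  analogWrite(R_PIN, r);
  analogWrite(G_PIN, g);
  analogWrite(B_PIN, b);
}

// Map the current emotion onto a colour animation and a sound.
void manifest(Emotion e, int t) {
  switch (e) {
    case HAPPY: {  // bright pulsing yellow, with short "chirps"
      byte level = 128 + 127 * sin(t * 0.2);
      setColour(level, level, 0);
      tone(PIEZO_PIN, 2000 + 50 * (t % 10), 30);
      break;
    }
    case SAD:  // blue, slowly dimming then brightening a little
      setColour(0, 0, 100 + 50 * sin(t * 0.02));
      break;
    case ANTISOCIAL:  // dim red, with a low drone
      setColour(40, 0, 0);
      tone(PIEZO_PIN, 110, 100);
      break;
  }
}

void setup() { pinMode(PIEZO_PIN, OUTPUT); }

void loop() {
  static int t = 0;
  manifest(mood, t++);
  delay(50);
}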
Each Creature had its own Twitter account, driven by an application (@LifeButNot) developed using the Twitter API. The Creatures were provisionally named "01thing" to "10thing" (after Thing 1 and Thing 2 in Dr. Seuss's "The Cat in the Hat") and all followed each other, creating some noise when all were switched on and programmed to tweet their feelings. In the last iteration, the messages were minimal, e.g. "Thanks for feeding me at 14.41.52 - 31/10/2011" and "03thing made me DISGUSTED at 14.40.48 - 13/10/3022."
An early plan to design a one-size-fits-all clone casing for the Creatures (Fig. 8) failed because the
team’s creativity meant that ideas were wide-
ranging, from the clean simple lines of identical little
boxes to bizarre organic tentacled jellyfish. This
led to a series of exploratory workshops in August
and September 2011, where the design team
joined with members of the public to take part in an
art project/communal crafting exercise where
participants began to personalise the Creatures by
creating individual cases using recycled materials.
Figure 8: Creature sketch from design workshop
4. CONCLUSIONS
The Life Project has successfully met its original
aim of exploring embodiment and identity through
the collaborative process of creating an
“ecosystem” of little machines that live, grow,
communicate and die with one another, all in the
presence of humanity.
The major challenges can be summarised as
follows:
(i.) Creative Collaboration: Facilitating a large, diverse group of creative people to collaborate effectively is a daunting task. The Life Project provided a rare opportunity for designers, crafters, artists, programmers and engineers to develop ideas together on a shared brief, problem-solving and negotiating milestones from concept to finished artefact.
(ii.) Developing an Emotional Intelligence: The
Life Project explored how software and
hardware could be used to represent and
communicate changing emotional
states. Experiments used software
probability tables, animated lights, sounds,
and tweets - a wide range of outputs, each
with their own complexity.
(iii.) Communication and Interaction: The Life Project investigated modes of communication between software and hardware agents and people: infra-red LEDs (light emitting diodes) as transmitters/receivers between Creatures and Environment; Twitter to respond to people via social networks; QR codes to enable creature husbandry by the public. Each mode brought its own requirements to interconnect with the rest, adding another layer of complexity (and chaos).
(iv.) Look and Feel: The Life Project provided an opportunity for community involvement in the later stages of the project, which required some of the complexity to be made more readily understandable and accessible for simple engagement and interactivity.
(v.) FLOSS Integration: The FLOSS community and its technology were essential to this project. Without the Arduino community and the companion JeeNode platform, the team would have had to purchase expensive proprietary systems or spend much more time developing core technology. The team reciprocated by distributing all code and diagrams and by blogging about the development process.
The team intends to maintain the project as a communal art installation, organising future workshops and inviting members of the public to contribute their own creative designs to, and interact with, a slice of digital ecology. The aim is to provide future teams with the opportunity to study the interaction and the effectiveness of the concepts and intended user interaction, in order to draw conclusions about our complex and interdependent relationship with technology and the "natural" world.
5. REFERENCES
SPACE Studios. http://spacestudios.org.uk
(retrieved 02 April 2012)
Life Project blog.
http://lifeproject.spacestudios.org.uk
(retrieved 02 April 2012)
Nass, C. et al. (1997) Computers Are Social Actors: A Review of Current Research. In Friedman, B. (ed.) Human Values and the Design of Computer Technology. CSLI Publications, Stanford, CA.
Turkle, Sherry (2011) Alone Together: Why We
Expect More From Technology and Less From
Each Other, Kindle Edition. Basic Books.
Gablik, Suzi (1991) The Reenchantment of Art. Thames and Hudson, New York.
McLuhan, Marshall (1964) Understanding Media.
1st Ed. McGraw Hill, NY.
Lee, T-W., Dolan, R.J. and Critchley, H.D. (2008) Controlling Emotional Expression: Behavioral and Neural Correlates of Nonimitative Emotional Responses. Cerebral Cortex, 18:104-113.
Openlab Workshops. http://openlabworkshops.org
(retrieved 02 April 2012)
Arduino. http://arduino.cc
(retrieved 02 April 2012)
JeeLabs. http://jeelabs.com
(retrieved 02 April 2012)
Processing. http://processing.org
(retrieved 02 April 2012)
Pixelpusher on Github.
https://github.com/pixelpusher/EmotionalCreature
(retrieved 02 April 2012)
de Campo, A., Hannes, H. and Wieser, R. (2010) Varia Zoosystematica Profundorum. XIII Generative Art Conference (GA2010), Politecnico di Milano University, Italy.
Meinhardt, Hans (1998) The Algorithmic Beauty of Sea Shells (The Virtual Laboratory). 2nd enlarged edition. Springer.
The Guardian. Wilhelm Reich: the man who invented free love.
http://www.guardian.co.uk/books/2011/jul/08/wilhelm-reich-free-love-orgasmatron
(retrieved 02 April 2012)
ThinkMakePlay.
http://www.thinkmakeplay.co.uk/life/
(retrieved 02 April 2012)
Griffiths, Dave. Colourful emotions. dave's blog of art and programming.
http://www.pawfal.org/dave/blog/2011/08/colourful-emotions/
(retrieved 02 April 2012)