Performing with machines and machines that perform
Paul Granjon
Media Arts and Performance
School of Art and Design
UWIC
Howard Gardens
Cardiff CF24 0SP
pgranjon@uwic.ac.uk
Abstract
The author is a performance and visual artist whose concern lies in the co-evolution of human and
machine, a subject on which he comments with self-made machines. The paper provides an insight
into his artistic practice and defines a personal position on notions encountered in the work, such as
delegation to machines, master-slave technological systems and autonomous performing robots.
Technical aspects and underlying concepts of a selection of machines he made for performances
or installations are described. State-of-the-art humanoid robotics research and examples of robotic
art are examined with a critical distance combined with a genuine interest in the development of
contemporary technology.
Biography
Paul Granjon is a visual artist living in Cardiff who has exhibited self-made machines since 1996.
He teaches media art and performance at UWIC, Cardiff. He was one of the artists representing
Wales at the Venice Biennale 2005.
Keywords
Robotics, performance, HCI, animal, evolution, art
Introduction
My interest lies in the co-evolution of human and machine and my practice is based on practical
experimentation with analog and digital technology. This article outlines several machines I have
developed within a performance art context, each of which has extended this understanding:
Furman is a human-size kicking robot that fells the performer in front of an
audience; RobotHead, a wearable device, overrides the user's routine functions; the Robotic Ears
and Tail act as a reminder of the animal present in every human; and the Sexed Robots are
autonomous robots that perform without supervision.
After ten years of making machines in a fine-arts context, I can condense the meaning of my work
in the following way: I am promoting an engaged attitude towards technological progress, claiming
humanity through being a learner-maker instead of a user-consumer. In my opinion, acquiring
knowledge of contemporary technological tools, adapted to personal abilities and interests, is a
valuable way of appropriating and demystifying some aspects of an environment sometimes
described as a ‘suicidal techno-fetishist society’ (Kontejner Collective 2005: 17), a world where the
human with its fragile wetware and irrational software is likened to a virus, getting in the way of
optimal technological development. In her book The Cybernetic Empire, Canadian writer Céline
Lafontaine accurately describes the culture I am standing against. She shows how the
cybernetic model developed in the 1950s by Norbert Wiener and his many adopters has spread and
contaminated philosophy and the humanities. She argues that reducing the human brain and society
to informational models has a crumbling effect on the values of democracy and humanism, largely
due to the loss of a subjective dimension: 'True mutant, the cybernetic subject must constantly
adjust to the human-mechanical system in the midst of which he evolves. Traversed from end to
end by these realities of his environment, he gradually turns into a "man with no
interiority"' (Lafontaine 2004: 58).
RobotHead
In 1999 I developed Z Lab Presents, a performance with robots. A BBC Microcomputer was at the
heart of an audio-visual-mechanical system where a human operator demonstrated two robots:
Toutou the singing dog and RobotHead, a wearable radio-controlled device. While Toutou is a
funny and appealing creature with big round eyes and furry ears, RobotHead (Granjon 2000)
(figure 1) provides a complex field of reflection on the relation of human and machine. A wearable
device built around a welding mask and remote-controlled from a computer, RobotHead was
presented as a concept robot: a not fully completed construction that illustrates an idea, provides a
platform for the discussion of ideas and opens insights into future developments.

Figure 1: RobotHead
Photo by Tim Lee

The concept was that of a
delegation robot, a semi-intelligent wearable robotic mask that can be programmed to take over its
user for the execution of boring, mundane, intimidating or dangerous tasks. Once the mask is
fitted, the user is blinded. Guided by a series of pre-learned beeps, his/her interface with the
outside world catered for by the robot, the user is free to think about subjects more pleasant or
engaging than the task at hand. I demonstrated the robot's functions on stage, starting with the robot
on a stand flashing its eyes and saying its name in a deep loud voice. I then put RobotHead on my
head while the robot pronounced several key sentences related to generic situations the user
might encounter (shopping, office meeting, flirting, swearing). The video display at the back of the
stage showed a synchronised animated diagram of the robot and the words pronounced. For the
robot's motion function, a crude map of a virtual environment was projected. I started walking and
turning, following the cues provided by loud beeps of various frequencies. A dot moving in the
virtual environment echoed my movements on the stage. I then demonstrated a practical application
of the combined conversation and motion abilities of RobotHead with a simulated bank robbery
scenario. The show ended with a cover version of Kraftwerk's song We Are the Robots, performed by
RobotHead and Toutou, with beep music generated on the BBC Microcomputer.
Issues of control and empowerment, or disempowerment, inherent to the use of increasingly
intelligent machines, are touched upon in almost all of my work since 1995. This issue is echoed
by the notion of the master-slave relation encountered in many electronic systems requiring
synchronisation of several units, a typical example being the widespread Musical Instrument Digital
Interface (MIDI) standard. A master device generates a synchronisation signal and triggers one or
more slave machines. In the case of the performance Z Lab Presents described above, the human
element is technically the slave to the BBC Micro, taking cues from the program to start specific
actions. On the other hand, the whole miniature circus is largely brought to life by the human
performer, whose speech, support and willing contribution, combined with the audio-visual scenic
display, amplify the meagre abilities of the robots.
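To make the master-slave mechanics concrete, the sketch below shows, in C, how a MIDI master clocks its slaves. The real-time status bytes (0xFA Start, 0xF8 Timing Clock, 0xFC Stop) come from the MIDI 1.0 specification; midi_send() and wait_microseconds() are hypothetical stand-ins for whatever serial output and timing routines a given platform provides.

```c
/* Minimal sketch of MIDI master-slave synchronisation.
   0xFA (Start), 0xF8 (Timing Clock, 24 per quarter note) and
   0xFC (Stop) are real-time status bytes from the MIDI 1.0 spec.
   midi_send() and wait_microseconds() are assumed, platform-specific
   routines (MIDI runs over a 31250-baud serial line). */

#include <stdint.h>

void midi_send(uint8_t byte);          /* assumed serial output */
void wait_microseconds(uint32_t us);   /* assumed delay routine */

void play_bars(int bars, uint32_t bpm)
{
    /* 24 clock ticks per quarter note; tick interval in microseconds */
    uint32_t tick_us = 60000000UL / (bpm * 24);

    midi_send(0xFA);                       /* Start: slaves reset to beat one */
    for (int t = 0; t < bars * 4 * 24; t++) {
        midi_send(0xF8);                   /* Timing Clock: slaves advance */
        wait_microseconds(tick_us);
    }
    midi_send(0xFC);                       /* Stop */
}
```

Twenty-four clock bytes per quarter note is the rate the specification mandates, which is why slaved sequencers and drum machines stay in step at any tempo the master chooses.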
The notion of delegation to machines, highlighted in a slapstick fashion by RobotHead, is a
recurrent theme in my work. Presenting performances and installations based on a thorough if
humorous investigation of contemporary technological progress, I intend to generate and convey a
critical distance from the field of research. Marshall McLuhan wrote that 'any invention or
technology is an extension or self-amputation of our physical bodies' (McLuhan 1994: 45). In that
light I question the relevance of surrounding ourselves with an endless and increasingly complex
array of tools and prostheses. Simultaneously I recognise the irrepressibility of the human urge
to discover, and, as an inventor, I enjoy the possibilities offered by technology. This contradictory
position is the main motor of my practice.
Furman
One morning in the summer of 2002 I woke up from a strange dream: a large humanoid creature covered
in fur from head to knees with no visible arms and inhumanly big spiky hair on its legs was karate
kicking in the air. I scribbled a drawing of the creature in a notebook and gradually came to the idea
of trying to build the thing, which by then I had named Furman (figure 2). Not unlike members of
the surrealist movement who sought inspiration from their dreams, I was curious to witness a
figment of my subconscious being transferred to the physical world. After several months of
development and construction I demonstrated Furman (Granjon 2003) in a live performance. The
six-foot-high, pneumatically powered robot was programmed to deliver a karate side-kick. Fitted
with a helmet and a kickboxing practice vest, I received the kick in the chest and fell onto a gym
mattress. I, creator of the machine, was kicked and felled by my own creation in a live enactment
of the Frankenstein complex, the ‘gut fear that any artificial man they created would turn upon its
creator’ mentioned and denounced by Isaac Asimov in several of his novels and short stories
(Asimov 2000: 63).
Figure 2: Furman
Photo by Jennie Savage
There are other examples of violent machines in recent art production. The best known belong to
the electro-mechanical menagerie of the San Francisco-based group Survival Research
Laboratories (Survival Research Laboratories 2007). Unlike Furman, most of Survival Research
Laboratories’ remote-controlled machines are potentially lethal. Under the supervision of founder
Mark Pauline, machines made of parts found in military and civilian scrap-yards generate hyper-loud
mayhem and destroy each other in spectacular shows with titles such as A Complete Mastery
of Sinister Forces Employed With Callous Disregard To Produce Catastrophic Changes In The
Natural Order of Events (2007). Survival Research Laboratories aim to shock, exposing the
audience at close range to the sheer destructive power of artificial constructs. Conjuring striking
images of mechanical chaos, they highlight with a rare efficiency a nightmare option where mighty,
fearsome machines rule a world that has become unfit for human beings. SRL succeeded so well
that it is now almost impossible for them to find a venue in the health-and-safety-conscious 21st
century.
Furman is the only humanoid, full-size robot I have built so far. Many robotics laboratories around
the world develop various humanoid robots, one of the most famous being Asimo, the well-known
Honda machine. In 2006, after more than a decade of massively expensive development, Asimo
was the victim of an embarrassing incident. During a demonstration taking place in Japan, Asimo,
introduced by a female presenter, was to climb a flight of stairs specially constructed for
the presentation. On a video of the event (Anon 2006), one can see the robot talking in Japanese,
and walking towards the stairs. On the third step it slips and falls down to the floor, where it keeps
talking but makes no attempt to get up. Technicians run in, the presenter has a nervous laugh and
a screen is quickly installed between the fallen robot and the audience while the projected Honda
logo (‘The power of dreams’) is turned off and the house lights switched on. The footage provides a
humorous antidote to over-optimistic technological visions of the near future, and a reminder that
even multi-billion-yen (or pound, dollar, euro…) projects aiming to reproduce basic human
functions can yield imperfect results.
It is interesting to compare the mighty, corporate Asimo project with the grassroots Robo-one
competition (Robo-one 2007), which originated in the Akihabara area of Tokyo at the beginning of
the 21st century. Hobbyists started to organise fights between home-made humanoid robots,
approximately forty-centimetre-tall bipedal machines powered by servo-motors. Interest in the
competition and its diminutive fighters grew quickly, generating a rapid advance in the hardware
and software techniques developed by competitors. Robo-one’s most advanced machines are able
to walk fast and climb stairs. Most of the humanoids' decision-making is remote-controlled by the
user, while gait and balance are driven autonomously by an onboard computer. Fully
autonomous Robo-one machines are on their way, the fruit of an organic effort by passionate
individuals operating outside of a corporate system, with widely available tools and components.
The Robo-one community produces open-source code and information, with many detailed books
available in Japanese bookshops.
Interfacing and control techniques
The original Furman was the last of my machines to be powered by an antique 8-bit BBC
Microcomputer. In the late seventies the British Broadcasting Corporation launched a programme
aiming to bring computer literacy to the population of the UK and funded the development of a
powerful, affordable and flexible computer that featured many functions, including a user port, a
programmable input-output interface connector to which electronic devices can be attached and
simply controlled (Anon 2007). The Cambridge-based firm Acorn won the contract with a machine
called BBC Microcomputer model A, followed in 1982 by the model B. I became aware of the BBC
Microcomputer and its history shortly after moving to the UK in 1995. While at the time my medium
of choice was single-screen video, with a developing interest in interactive media, I was looking for
a way to tap into the flexibility and power of computers to develop work that would be active in the
physical world. After finding a discarded Acorn BBC Microcomputer model B in a skip I started to
follow tutorials and conduct simple experiments described in the widely available literature detailing
how to use the machine for interfacing and control. I quickly realised that the thirteen-year-old
computer had lots to offer to the exploratory artist with no background in electronics or computing.
Learning how to exploit the user port and other peripherals of the machine was beneficial beyond
the merely practical level. In order to control an external device from the computer, bits have to be
set in one of the machine's memory-mapped input-output registers. For two of the installations controlled
by BBC Micros, I had to use assembler, a low-level language where the operation of the processor
is controlled step by step by three-letter commands, often followed by binary addresses or values.
Although it is common knowledge that computers operate by shifting large amounts of 1s and 0s,
the experience of writing simple code to address a specific area of the machine's memory,
specifying where the 1s and 0s go, and seeing results materialise in the form of blinking lights or
spinning motors provided a concrete understanding of digital technology.
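For illustration, the sketch below renders that user-port exercise in C-style memory-mapped register code. The addresses are those documented for the model B's 6522 User VIA; on the machine itself the equivalent was a handful of assembler instructions (LDA #&FF, STA &FE62, and so on) or BBC BASIC's ? indirection operator (?&FE62=255).

```c
/* Conceptual C rendering of the BBC Micro user-port exercise
   described above. The addresses are those of the 6522 User VIA
   as documented for the model B; on the real machine this was
   written in 6502 assembler or BBC BASIC rather than C. */

#include <stdint.h>

#define USER_VIA_ORB   (*(volatile uint8_t *)0xFE60)  /* port B data register      */
#define USER_VIA_DDRB  (*(volatile uint8_t *)0xFE62)  /* port B data direction reg */

void blink_user_port(void)
{
    USER_VIA_DDRB = 0xFF;          /* all eight port B lines as outputs */
    for (;;) {
        USER_VIA_ORB = 0x01;       /* bit 0 high: light blinks on, motor spins */
        /* ... delay ... */
        USER_VIA_ORB = 0x00;       /* bit 0 low: off */
        /* ... delay ... */
    }
}
```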
In 1997 my first complex robot controlled by a BBC Microcomputer, the Fluffy Tamagotchi (Granjon
1998), emitted its first demanding noises and produced a dollop of blue poo. Inspired by the
Tamagotchi, also known as the 'virtual pet', globally marketed since 1997, the Fluffy Tamagotchi
brought back a physical, messy, noisy and cuddly presence to the sanitised toy version of a
domestic animal. The robot was built for a short video, where I demonstrate its various functions.
After developing the Fluffy Tamagotchi, I felt confident enough in my controlling and interfacing
skills to develop machines for live performance. Unlike the artefacts developed for video, where
another take is always possible and editing allows malfunctions to be rubbed out, a robot made for
a public presentation must perform every time, with a reliability that survives transport.
After Furman I started learning how to use microcontrollers, a more contemporary technology for
interface and control. Microcontrollers are simplified computers that fit on a single electronic chip.
The program is written on a standard computer, then uploaded to the microcontroller. The
uploading can take place while the microcontroller is already fitted in its final circuit on the
machine. Once the upload is completed, the circuit can be unplugged from the computer. The
program will run as soon as the microcontroller is powered. Recent microcontrollers have a
footprint of just a few square millimetres, which makes them ideal for embedding in all sorts of
larger objects such as phones, cars, street furniture, clothing, kitchen appliances, music systems. It
is likely that if the present trend progresses without interruption, a very large proportion of common
objects will be fitted with some kind of computing power in the near future. The shift to
microcontroller technology allowed me to create autonomous robots, and to tap into a relatively
cheap, adaptable and contemporary way of engineering machines for performances and exhibitions.
The Robotic Ears described below are my first operational performance machine to be powered by
a microcontroller. Their program runs on a Microchip PIC 16F628, located in the control box of the
ears.
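As a hedged illustration of the kind of firmware such a chip might run (the actual Robotic Ears program is not published), here is a minimal C sketch of an automatic random mode for two ear motors. TRISB and PORTB are the PIC's real direction and data registers, declared here as plain variables so the sketch is self-contained; the pin assignments, delay routine and pseudo-random generator are illustrative assumptions.

```c
/* Illustrative sketch only: not the actual Robotic Ears firmware. */

#include <stdint.h>

volatile uint8_t TRISB, PORTB;   /* stand-ins for the PIC's special function registers */
void delay_ms(uint16_t ms);      /* assumed busy-wait delay routine */

#define LEFT_EAR_PIN   0x01      /* RB0: left ear motor (assumed pin)  */
#define RIGHT_EAR_PIN  0x02      /* RB1: right ear motor (assumed pin) */

static uint8_t lfsr = 0xA5;      /* tiny pseudo-random generator state */
static uint8_t next_random(void)
{
    /* 8-bit Galois linear feedback shift register, taps 0xB8 */
    lfsr = (lfsr >> 1) ^ (-(lfsr & 1) & 0xB8);
    return lfsr;
}

void ears_random_mode(void)
{
    TRISB = 0x00;                                     /* port B pins as outputs */
    for (;;) {
        uint8_t r = next_random();
        PORTB = r & (LEFT_EAR_PIN | RIGHT_EAR_PIN);   /* twitch zero, one or two ears */
        delay_ms(200 + (r & 0x3F) * 10);              /* irregular, animal-like pauses */
    }
}
```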
Robotic Ears and Tail
In the performance Z Lab Transported (2003-2005), I presented a set of Robotic Ears and Tail
(Granjon 2003), wearable machines that provide their owner with the attributes of a mammal. The
ears are covered in artificial fur while the tail is represented by a blue plastic tube. Ears and tail are
fitted with a control panel that allows both manual and automatic operation. During the
performance I explain the inception of the product, showing original technical drawings and videos
of real animals, before putting on the ears and the tail. I demonstrate their basic functions, starting
with the Robotic Ears (figure 3): the right and left ears can be controlled independently, either
manually or in an automatic random mode, while the tail, inspired by a dog's tail, can wag at two
different speeds and be lifted up or lowered down. Still wearing the kit, I sing a song titled Animal,
accompanied by a crude video animation. The song concludes with the following verse: 'Pretend to be
an animal/Might seem a little special/But if you think about it/It is quite normal/Because there is/In
each of us an animal’. The simple lyrics highlight the robotic prostheses as a crude attempt to
regain touch with a primal level of cognition buried deep by centuries of civilisation and
technological progress. While the Robotic Ears refer to those of a bear or teddy bear, the
hairlessness of the Robotic Tail is an acknowledgment of the slick plastic appearance of most 21st
century electronic animal toys, where the animality and the tactile, sensuous contact provided by
fur are replaced by a sterile surface as unlikely to inspire a cuddly relationship as to provide a
hospitable environment for fleas.

Figure 3: Robotic Ears
Photo by Chris Webb
Sexed Robots
Developed in 2005, the Sexed Robots (Granjon 2005) (figure 4 and figure 5) are a pair of
autonomous performing machines, respectively male and female. Both individuals are functional-
looking aluminium platforms on plastic wheels. They are identical except for their gender-specific
genital module, machined in industrial white nylon. Presented like animals in a zoo, in an enclosure
separated from the visitors, they are programmed to move autonomously, avoiding obstacles and
beeping. Their program randomly activates, at regular intervals, one of four states: moving,
singing, sleeping and in heat. When in heat, the robots attempt to locate a partner and mate, until
at least one of them drops out of the in-heat state and resumes non-sexual activities.
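A minimal sketch of that four-state behaviour loop, assuming hypothetical helper routines for motion, sound and mating; the actual Sexed Robots firmware is not published, so this only mirrors the description, including the in-heat state persisting until a random switch-off.

```c
/* Sketch of the four-state behaviour described above; all helper
   functions are assumed stand-ins, not the robots' real code. */

#include <stdlib.h>

typedef enum { MOVING, SINGING, SLEEPING, IN_HEAT } state_t;

void roam_avoiding_obstacles(void);  /* assumed motion routine  */
void beep_song(void);                /* assumed sound routine   */
void seek_partner_and_mate(void);    /* assumed in-heat routine */
void sleep_idle(void);               /* assumed resting routine */

void behaviour_loop(void)
{
    for (;;) {
        state_t state = (state_t)(rand() % 4);    /* pick the next state at random */
        switch (state) {
        case MOVING:   roam_avoiding_obstacles(); break;
        case SINGING:  beep_song();               break;
        case SLEEPING: sleep_idle();              break;
        case IN_HEAT:
            /* stay in heat until a random switch-off, as described */
            while (rand() % 8 != 0)
                seek_partner_and_mate();
            break;
        }
    }
}
```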
The familiar zoo-exhibit set-up and the stripped-down mechanical activities that take place invite
the observers to project their own narratives of seduction, rejection, sexual prowess, relationship
and love onto the robots. The machines' simple design and behaviour lend themselves to
characterisation and some degree of anthropomorphism, thus engaging the viewer more intensely.
In a humorous and clumsy fashion the Sexed Robots operate at the border between the mammal
and machine worlds, being programmed like computers to perform tasks that belong to the highly
developed organic forms of the planet. It is my hope that the displacement of a fundamental
mammal activity into cybernetic performers might lead the viewer to reflect on the place artificial
designs occupy in their environment, life and perception.
Figure 4: Male Sexed Robot
Photo by Paul Granjon
Known sexed robots are mostly aimed at operating on human beings and generally belong to the
category of the sex toy. The only other example of sexed robots designed to have intercourse with
other robots that I know of in the field of robotic art was made in 1988 by Norman White and Laura
Kikauka for a piece called Them Fucking Robots (Smith 1988). White built the male and Kikauka
the female, 'without consulting each other on the particulars, apart from the dimensions of the
engaging organs'. While details of the robots' operation are obscure, we know that the male had a
built-in strobing orgasm function and that the female components included a boiling kettle and a
squirting oil pump. I recently met a direct witness of the robots' first encounter, which took place in
an artist's studio in Toronto. He told me that when brought together, Them Fucking Robots could not
have intercourse because the male organ was too big for the female’s.
Figure 5: Female Sexed Robot
Photo by Paul Granjon
Autonomous agents
The Sexed Robots demonstrate semi-autonomy (they need assistance only for battery
replacement) and programmed behaviours that give them the appearance of simplistic life forms.
More impressive are current developments in the field of evolutionary robotics and artificial
intelligence. In the Laboratory of Intelligent Systems in Lausanne, Professor Dario Floreano has
developed several projects where neural network-controlled robots are evolved using genetic
algorithms. Starting with a random set of instructions (genetic material), several robots are let loose
in an arena where a specific task awaits them. The genetic material of the most successful
individual is selected, and a different, randomly mutated copy of it is loaded into each member
of the colony (Nolfi and Floreano 2000: 19). After several hundred generations, successful
behaviours emerge, with no directive human programming. A clear example of evolution is
presented in the video Evolution of Collective Foraging (EPFL's Laboratory of Intelligent Systems
2005), where robots of generation 0 stumble against walls and shake, while at generation 146 they
demonstrate effective group behaviour.
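The scheme can be summarised in a short sketch: keep the fittest genome, then load a differently mutated copy into each robot. All names and sizes below are illustrative assumptions, not EPFL's code; evaluate() stands in for a trial run in the arena.

```c
/* Minimal sketch of the evolutionary scheme described above:
   truncation selection to one best individual, then per-robot
   random mutation. Sizes and rates are illustrative only. */

#include <stdlib.h>
#include <string.h>

#define POP     10     /* robots in the colony        */
#define GENES   64     /* bytes of "genetic material" */

static unsigned char genome[POP][GENES];

double evaluate(const unsigned char *g);   /* assumed fitness: an arena trial */

void next_generation(void)
{
    /* find the most successful individual of this generation */
    int best = 0;
    double best_fit = evaluate(genome[0]);
    for (int i = 1; i < POP; i++) {
        double f = evaluate(genome[i]);
        if (f > best_fit) { best_fit = f; best = i; }
    }

    /* load a differently mutated copy of it into every other robot */
    for (int i = 0; i < POP; i++) {
        if (i == best) continue;
        memcpy(genome[i], genome[best], GENES);
        for (int g = 0; g < GENES; g++)
            if (rand() % 100 < 2)              /* ~2% per-byte mutation rate */
                genome[i][g] = (unsigned char)rand();
    }
}
```

Run for several hundred generations, this select-and-mutate loop is what lets successful behaviours emerge with no directive human programming.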
Although most applications of neural networks in the field of art are limited to screen-based
installations, Nicolas Anatol Baginsky's Three Sirens (1992-2005) are a band of robotic musical
instruments, where 'artificial neural networks control every aspect of the robot's
activities' (Baginsky 2005). The robots' neural networks – randomly initialised – learn
self-organising, unsupervised melody improvisation, eventually producing a reasonably convincing
kind of rock music.
Operating as an isolated idiosyncratic scientist, ‘self-taught tinkerer’ (Grand 2003: 3) Steve Grand
has been working for several years on the development of Lucy, an autonomous robot referred to
by its maker as a daughter. Lucy is a child-size, legless humanoid robot with a monkey face.
Grand’s ambitious quest is to build at home a conscious robot that will lead to better understanding
of the human brain. His approach is a middle way between two diametrically opposed theories of
artificial intelligence: the top-down approach, which attempts to simulate a fully functional human
brain, and the bottom-up approach, which mimics the behaviour and body of insects. In Grand's
words, Lucy is ‘a biologically inspired and plausible approach to the replication of higher level,
mental phenomena. Her intelligence is not "programmed in", [it] is an emergent consequence of
the interactions between thousands of simulated neurons, and, moreover, it is learned, rather than
given to her a priori. She won't ever be very smart, but it might not be too far from the truth to say
that she will eventually have a mind of her own, albeit a very, very stupid one’ (Grand 2003). Grand
developed the commercially successful computer game Creatures in the late 1990s. The game
applied principles of artificial intelligence and artificial life to the simulation of a colony of bipedal
animals living in a virtual forest. With Lucy, Steve Grand tackles a different level of complexity, the
transition to hardware coming with messy cables, friction, heat and demand for vast processing
power. Steve Grand's project, not fitting easily into the fields of academic science, commercial
research or art, is a unique quest for an embodied form of conscious artificial life. The project is
currently on hold for lack of funds.
In recent years an increasing number of autonomous or semi-autonomous physical machines
have made their way into art gallery spaces. To name just a few, Andy Gracie's Fish, Plant, Rack
(2004), Sabrina Raaf's Translation: Grower2 (2005) and Ken Rinaldo's Autotelematic Spider Bots (2006) are
exhibited regularly to a curious audience. In the field of live performance, such machines are
extremely rare. Early robotic shows by the Ullanta Performance Robotics group, such as Fifi and
Josie, A Tale of Two Lesbiots: A Story of Autonomy, Love, Paranoia, and Agency (1995), where the
entire cast is made up of autonomous mobile robots, are an exception. My only attempt so far at integrating an
autonomous machine in performance was in The Heart and the Chip (2006), where I ride on a
modified Female Sexed Robot. Wearing a Robotic Perception Kit I attempt to locate the Male
Sexed Robot in heat and drive the female to it. Other projects involve robotic sidekicks contributing
in a semi-controlled fashion to the delivery of performance lectures, and plans to develop a robot
for a piece inspired by Joseph Beuys' I Like America and America Likes Me (1974), in which he
shared a gallery space with a live coyote for one week. The coyote would be replaced by a
potentially dangerous mobile robot, capable of detecting human presence and of evolving a
responsive, learning behaviour.
Conclusion
Both the performance machines which I activate in front of a live audience and the autonomous
performing machines provide a field of experimentation and reflection with a direct, embodied
impact. In the case of the performance machines, the robots' performance and the concept they
illustrate are largely conveyed by the human performer. The performing machines, inspired by
animals, provide a more contemplative, observational platform. In both cases the respective
statuses of the machine and of the human or animal are exposed and commented upon in an
original manner.
On a technical front, electronic components are increasingly of the surface mount technology
(SMT) type, replacing the easily handled and soldered older chips. Steve Grand describes his
problem with SMT technology: ‘surface mount chips can be very densely packed […] They are
easy to place using robots, and to solder using expensive temperature controlled ovens. But I’m
not a robot…’ (Grand 2003: 69). Presently, many components are still offered in both formats, but
newer components are only available in SMT format, and older ones are being phased out rapidly.
Eventually, there will be no alternative but to use robots to make robots. In a similar way, discarded
electronic hardware offers less and less to the tinkerer, all parts being so small and integrated that
the recycling of components is difficult or impossible.
In October 2005 I saw on television a Philip K. Dick android (Hanson 2005) having a conversation
with a member of the public, who was asking it why Bicentennial Man was its (Dick's) favourite
film. After an uncomfortably long pause, the android blinked and repeated the question, before starting to
buzz and stutter in an alarmed fashion words that sounded like ‘Bugs, they are all around us, they
are all around us!'. Obviously the programmers had opted for the android to jump into a Dickian
schizophrenic response loop when caught off guard. Placed in parallel with the technical and
psychological complexity of Philip K. Dick's simulacra, claws (Dick 1990: 31) and replicants, this
example illustrates how far robots still have to go before they match the expectations of a
science-fiction-fed public. I regularly face disappointed faces when I explain that no, Furman can't walk…
References
Kontejner Collective (2005), 'Vampires, Butterflies, Golden Girls, Semi-Living Entities and Other
Miraculous Encounters', Touch Me, Kontejner, Zagreb.
Lafontaine, C. (2004), L'empire cybernétique, Seuil, Paris.
Granjon, P. (2000), http://www.zprod.org/PG/machines/robotHead.htm, last checked October 27th
2007.
McLuhan, M. (1994), Understanding Media, MIT Press. First published 1964.
Granjon, P. (2003), http://www.zprod.org/PG/machines/furman.htm, last checked October 27th
2007.
Asimov, I. (2000), 'That Thou Art Mindful of Him', The Bicentennial Man, Millennium. First published
1974.
Survival Research Laboratories (2007), http://www.srl.org, last checked October 27th 2007.
Anon (2006), http://www.pinktentacle.com/2006/12/asimo-help-me-ive-fallen-and-i-cant-get-up/,
last checked October 27th 2007.
Robo-one (2007), http://www.robo-one.com/, last checked October 27th 2007.
Anon (2007), http://en.wikipedia.org/wiki/BBC_Micro, last checked October 27th 2007.
Granjon, P. (2003), http://www.zprod.org/PG/machines/fluffyTamagotchi.htm, last checked October
27th 2007.
Granjon, P. (2003), http://www.zprod.org/PG/performances/zLabTransMovie.htm, last checked
October 27th 2007.
Granjon, P. (2005), http://www.zprod.org/PG/machines/sexedRobots.htm, last checked October
27th 2007.
Smith, N. (2007), http://www.normill.ca/artpage.html, last checked October 27th 2007.
Nolfi, S. and Floreano, D. (2000), Evolutionary Robotics, MIT Press.
Ecole Polytechnique Fédérale de Lausanne's Laboratory of Intelligent Systems (2005),
http://lis.epfl.ch/research/projects/EvoAnts/videos/EvoShort-lowres.mpg, last checked October 27th
2007.
Baginsky, N. A. (1992-2005), http://www.the-three-sirens.info/binfo.html, last checked October 27th
2007.
Grand, S. (2003), Growing up with Lucy, Weidenfeld and Nicolson, London.
Grand, S. (2004), http://www.cyberlife-research.com/people/steve/, last checked 28th January 2008.
Hanson Robotics (2005), http://hansonrobotics.com/humankind.htm#video, last checked February
11th 2008.
Dick, P. K. (1990), 'Second Variety', Second Variety, Grafton Books. First published 1953.