MARIE: Monochord-Aerophone Robotic Instrument
Ensemble
Troy Rogers
Expressive Machines
Musical Instruments
Duluth, MN, USA
troy@expressivemachines.com
Steven Kemper
Mason Gross School of the Arts
Rutgers, The State University of New
Jersey
New Brunswick, NJ, USA
steven.kemper@rutgers.edu
Scott Barton
Department of Humanities and Arts
Worcester Polytechnic Institute
Worcester, MA 01609
sdbarton@wpi.edu
ABSTRACT
The Modular Electro-Acoustic Robotic Instrument System
(MEARIS) represents a new type of hybrid electroacoustic-
electromechanical instrument model. Monochord-Aerophone
Robotic Instrument Ensemble (MARIE), the first realization of a
MEARIS, is a set of interconnected monochord and cylindrical
aerophone robotic musical instruments created by Expressive
Machines Musical Instruments (EMMI). MARIE comprises one or
more matched pairs of Automatic Monochord Instruments (AMI)
and Cylindrical Aerophone Robotic Instruments (CARI). Each AMI
and CARI is a self-contained, independently operable robotic
instrument with an acoustic element, a control system that enables
automated manipulation of this element, and an audio system that
includes input and output transducers coupled to the acoustic
element. Each AMI-CARI pair can also operate as an interconnected
hybrid instrument, allowing for effects that have heretofore been the
domain of physical modeling technologies, such as a "plucked air
column" or "blown string." Since its creation, MARIE has toured
widely, performed with dozens of human instrumentalists, and has
been utilized by nine composers in the realization of more than
twenty new musical works.
Author Keywords
musical robots, robotic musical instruments, plucked string
instruments, aerophones, hybrid instruments
ACM Classification
H.5.5 [Information Interfaces and Presentation] Sound and
Music Computing, H.5.1 [Information Interfaces and
Presentation] Multimedia Information Systems.
1. INTRODUCTION
The Modular Electro-Acoustic Robotic Instrument System
(MEARIS) represents a new type of hybrid electroacoustic-
electromechanical instrument model in which individual
robotic musical instruments can function as tunable acoustic
filters in an interconnected multi-module signal chain.
Monochord-Aerophone Robotic Instrument Ensemble (MARIE),
the first realization of a MEARIS, comprises one or more
matched pairs of Automatic Monochord Instruments (AMI) and
Cylindrical Aerophone Robotic Instruments (CARI). In designing
MARIE, we employed the MEARIS paradigm to create an
ensemble of versatile robotic musical instruments with maximal
registral, timbral, and expressive ranges that are portable,
reliable, and user-friendly for touring musicians.
MARIE was commissioned in 2010 by bassoonist Dana
Jessen and saxophonist Michael Straus of the Electro Acoustic
Reed (EAR) Duo, and designed and built by Expressive
Machines Musical Instruments (EMMI)¹ for a set of tours
through the US and Europe. The first prototype of the
instrument was created in early 2011 [5], and has since been
field tested and refined through many performances with the
EAR Duo (Figure 1) and numerous other composers and
performers.
Figure 1. EAR Duo performs with MARIE at the Logos
Foundation in Ghent, Belgium.
2. RELATED WORK
The contemporary field of musical robotics spans a wide range of
research and creative activities [6]. Within this diverse field, we can
distinguish between 1) emulative machines that help researchers
better understand and/or replicate human performers, and 2) inventive
machines developed by composer-builders seeking new vehicles for
musical expression. EMMI’s work is largely focused on this second
category.
Numerous existing robotic instruments influenced MARIE’s
design, including those created by Trimpin, Roland Olbeter, Eric
Singer, and most significantly, Godfried-Willem Raes. In addition
to musical robotics, MARIE is influenced by the parallel field
of active control of acoustic musical instruments, as described
by Berdahl, Niemeyer, and Smith at CCRMA [1].
¹ www.expressivemachines.com
AMI was conceived as an updated version of EMMI's first robotic
string instrument, PAM (Poly-tangent Automatic multi-Monochord),
which itself was influenced by LEMUR's GuitarBot [14], Trimpin's
Jackbox [6], and Raes' <Hurdy> [11]. AMI’s updated features also
draw upon ideas from Roland Olbeter's Fast Blue Air [10] and Raes'
<Aeio> [7]. James McVay's MechBass [8], as well as Raes'
<SynchroChord> and <Zi> [11], are notable robotic string
instruments created since our initial prototype in early 2011.
CARI builds upon a number of robotic aerophones that have
previously been developed for both research and creative purposes.
Instruments such as the WF-4RV flutist and WAS-1 saxophonist
robots of Waseda University's Takanishi Lab [15, 16]; Roger
Dannenberg's robotic bagpipe player McBlare [3]; and the robotic
clarinet created by NICTA and UNSW's Music Acoustics
Laboratory [13] have been inspired by human performance models
and thus exemplify the emulative category described above. Though
these instruments helped shape our approach, CARI is primarily
influenced by the numerous inventive monophonic aerophones
developed by Raes including <AutoSax>, <Korn>, and <Ob>
[7]. Since the creation of the first CARI prototype in 2011,
Raes has developed several other related aerophone
instruments, including his automated clarinet <Klar> [11]. His
electroacoustic organ <Hybr> [12] shares acoustic features with
CARI as both are cylindrical air columns set into resonance by
an audio-rate driver.
3. CONCEPTUAL ORIGINS OF MARIE:
THE MEARIS PARADIGM
3.1 Inspiration
The MEARIS concept was inspired by Raes' automated
monophonic aerophone instruments, which operate as
automatically tunable acoustic filters. While the vast majority
of prior robotic instruments focus upon note actuation (i.e., act
as impulse generators), the electromechanical control systems
of Raes’ aerophones alter the acoustic resonances, which shape
the source sounds over time. With the MEARIS paradigm, we
seek to expand upon this work in order to further explore the
expressive possibilities enabled by connecting robotic impulse
and filtering “modules” in a variety of ways, as one would do
with a modular synthesizer.
3.2 Elements of a MEARIS
Figure 2 displays the basic elements of a MEARIS module. At
the center of the module is an acoustic element: a resonant body
(vibrating string, air column, membrane, etc.) that is activated,
modified, and sensed by the control and audio systems. The
control system communicates via the MIDI protocol and drives
automated mechanisms that excite or alter the acoustic element.
The audio system uses input transducers to force the acoustic
element into resonance, and output transducers to capture the
vibrations of the resonant body. The audio system can also
include automated mixing circuitry and onboard analog or
digital effects. Signals can be routed from one MEARIS
module to another to create rich hybrid instrumental timbres.
To make the sound producing/altering gestures as visually
salient as possible, instruments’ actions are amplified through
cameras and projection, as well as through onboard lighting
displays that illuminate form and action.
Figure 2. Functional diagram of a MEARIS, with acoustic
element (center), control system (green), audio system
(blue), and visual elements (purple).
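To make the modular routing idea concrete, the following Python sketch models a MEARIS module as an object that pairs an acoustic element with MIDI-driven control actions and routable audio input/output. The class and method names (MearisModule, actuate, route_to) are illustrative assumptions, not the actual EMMI firmware or console API.

```python
# Minimal sketch of the MEARIS module concept. Class and method names are
# illustrative assumptions, not the actual EMMI firmware or console API.
from dataclasses import dataclass, field
from typing import List


@dataclass
class MearisModule:
    """Pairs an acoustic element with control and audio systems."""
    name: str                    # e.g. "AMI-1" or "CARI-1"
    acoustic_element: str        # resonant body: string, air column, membrane
    audio_outputs: List["MearisModule"] = field(default_factory=list)

    def actuate(self, midi_note: int, velocity: int) -> None:
        """Control system: excite or alter the acoustic element via MIDI."""
        print(f"{self.name}: actuate MIDI note {midi_note} (vel {velocity})")

    def route_to(self, other: "MearisModule") -> None:
        """Audio system: feed this module's output transducer signal into
        the other module's input transducer, so the other module's acoustic
        element acts as a tunable filter on it."""
        self.audio_outputs.append(other)


# Hybrid routing: AMI's plucked-string signal drives CARI's air column
# ("plucked air column"); the reverse routing yields a "blown string".
ami = MearisModule("AMI-1", "vibrating string")
cari = MearisModule("CARI-1", "cylindrical air column")
ami.route_to(cari)
ami.actuate(midi_note=45, velocity=100)
```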
4. MARIE DESIGN
MARIE represents the first realization of a MEARIS. Each of
MARIE’s modules (each AMI and each CARI) contains a
resonant acoustic element that can function as a filter and a
control system with automated electromechanical actuators that
excite, tune, and dampen this acoustic element. Each module
also features an audio system with input and output transducers
and automated input and output matrices.
4.1 Instruments
4.1.1 AMI
AMI is a robotic monochord instrument with automated
mechanisms to articulate the string and alter its vibrating
length; an electromagnetic bowing mechanism (input
transducer); a pickup (output transducer); automated analog
mixing and effects circuitry; and a programmable LED display.
AMI’s acoustic element is an electric guitar string that is
manually tunable with a standard guitar tuning machine.
Produced sound is transduced by a flexible contact microphone
and sent to either an on-board or an external amplifier/speaker.
The frame is divided vertically into two equal halves, each of
which can house a single instance of AMI. Figure 3 diagrams
AMI’s acoustic, control, audio, and visual elements.
Figure 3. Functional diagram of AMI.
4.1.2 CARI
CARI is a cylindrical aerophone modeled on the
clarinet. Rather than retrofitting an existing acoustic instrument,
we re-imagined the instrument itself. Because an automatic
instrument does not need to accommodate the hands of human
performers, the encumbrances of traditional keying
mechanisms can be avoided. As a result, CARI's 19 toneholes
are arranged linearly. Each tonehole is independently operable
via a solenoid-driven keying valve. Sound is produced by an
audio signal routed to the compression driver, which is directly
coupled to the cylindrical air column. Figure 4 diagrams
CARI’s acoustic, control, audio, and visual elements.
Figure 4. Functional diagram of CARI.
4.2 Control Systems
AMI's acoustic control system includes mechanisms to pick and
damp the string. AMI's 17 solenoid-driven tangents are
positioned at fixed equal-tempered half-step intervals, giving a
range of pitches from E2 to A3. The tangents can be used to
articulate notes without picking (hammer-ons). Varying the
duty cycles of tangent trills and tremolos above 20 Hz
produces timbral shifts. The tangents operate in conjunction
with a moving bridge to allow both discrete and continuous
control of string length.
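As a worked example of the fixed tangent layout, the sketch below maps a target equal-tempered pitch to a tangent index, assuming the open string sounds E2 and the seventeenth tangent yields A3. The MIDI numbering and zero-based offset are assumptions for illustration, not the instrument's actual control mapping.

```python
# Illustrative mapping from pitch to AMI tangent index, assuming the open
# string sounds E2 and the 17 tangents sit at successive semitones up to A3.
OPEN_STRING_MIDI = 40   # E2
NUM_TANGENTS = 17       # highest tangent sounds A3 (MIDI 57)


def tangent_for_midi_note(note: int) -> int:
    """Return 0 for the open string or the tangent number (1-17) for a
    MIDI note; raise ValueError outside the E2-A3 range."""
    offset = note - OPEN_STRING_MIDI
    if not 0 <= offset <= NUM_TANGENTS:
        raise ValueError(f"MIDI note {note} is outside the E2-A3 range")
    return offset


assert tangent_for_midi_note(40) == 0    # E2: open string
assert tangent_for_midi_note(57) == 17   # A3: highest tangent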
CARI is outfitted with 19 solenoids that change the length of
the air column by opening and closing toneholes. These
actuators can achieve trills and tremolos up to 55 Hz. In
addition, thousands of “fingering” configurations are possible,
many of which would be inaccessible to a human performer on
a standard clarinet.
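One way to picture the size of CARI's fingering space is to treat each configuration as a 19-bit state, one bit per tonehole solenoid, as in the sketch below. The encoding is purely illustrative and is not MARIE's MIDI mapping.

```python
# A CARI "fingering" as a 19-bit mask, one bit per tonehole solenoid
# (1 = open). The encoding is illustrative only, not MARIE's MIDI mapping.
from typing import Iterable

NUM_TONEHOLES = 19


def fingering_mask(open_holes: Iterable[int]) -> int:
    """Pack a collection of open tonehole indices (0-18) into a bit mask."""
    mask = 0
    for hole in open_holes:
        if not 0 <= hole < NUM_TONEHOLES:
            raise ValueError(f"tonehole index {hole} out of range")
        mask |= 1 << hole
    return mask


# Every open/closed combination is mechanically reachable, giving
# 2**19 = 524,288 distinct tonehole configurations.
print(f"{2 ** NUM_TONEHOLES} possible configurations")
print(bin(fingering_mask([0, 1, 2, 18])))
```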
4.3 Audio Systems
4.3.1 Input and Output Transducers
AMI and CARI’s acoustic elements can be utilized as
automatically tunable acoustic filters when driven by onboard
audio-rate actuators. AMI’s string can be excited via a custom-
built electromagnetic “bowing” mechanism (E-driver) [11, 2,
4]. CARI’s cylindrical air column can be excited by a
compression driver to which it is coupled. In typical usage
scenarios, an input audio signal will be tuned to match resonant
frequencies of the acoustic element, which are manipulated by
AMI and CARI’s pitch control mechanisms. By tuning an input
signal to harmonics of a fundamental frequency (CARI's odd
harmonics; both even and odd string harmonics on AMI), the
instruments’ ranges can be extended well above the
fundamental frequencies, giving each instrument a range of
more than five octaves.
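The range extension follows from the resonance series of each acoustic element: a string supports all integer harmonics of its fundamental, while CARI's clarinet-like cylindrical bore emphasizes odd harmonics. The helper below computes the frequencies an input signal could be tuned to under that simplified acoustic model; it is a back-of-the-envelope sketch with example fundamentals, not a measurement of the actual instruments.

```python
# Simplified resonance models (a sketch, not a measurement of the actual
# instruments): a string supports all integer harmonics of its fundamental,
# while a clarinet-like cylindrical bore emphasizes odd harmonics only.
from typing import List


def drivable_frequencies(fundamental_hz: float, element: str,
                         count: int = 8) -> List[float]:
    """First `count` resonant frequencies an input signal can be tuned to."""
    if element == "string":          # AMI: even and odd harmonics
        harmonics = range(1, count + 1)
    elif element == "air_column":    # CARI: odd harmonics
        harmonics = range(1, 2 * count, 2)
    else:
        raise ValueError(f"unknown element: {element}")
    return [n * fundamental_hz for n in harmonics]


print(drivable_frequencies(82.41, "string"))       # open E2 string on AMI
print(drivable_frequencies(146.83, "air_column"))  # an example CARI fundamental
```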
4.3.2 Inter-instrument Connections
Interconnections between AMI and CARI (Figure 5) allow
audio to be routed between the two instruments to create
instrument hybridizations that have previously been accessible
only in the virtual realm of physical modeling. For example, the
plucked string sound from AMI can be used to drive CARI’s air
column, creating a “plucked air column.” Conversely, sound
from CARI’s air column can be sent to AMI’s E-Driver to
create a “blown string.”
Figure 5. The interconnected audio systems of AMI and
CARI that together constitute MARIE.
4.4 Software Control of MARIE
MARIE can be controlled by any software or hardware capable
of generating audio signals and MIDI messages. However, in
order to access the more sophisticated features of the
instruments, we have developed a control panel based in
Cycling '74's Max environment. The Max MARIE Console
centralizes control over the instruments' acoustic, audio, and
visual systems and allows for automation of note generation
and shaping, signal mixing and routing, and lighting functions.
The panel manages the timing of various messages and
simplifies complex control operations, such as generating the
combination of MIDI messages and audio signal modulations
necessary to produce a note on CARI.
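Although the console itself is built in Max, the same coordination can be sketched in Python with the mido library: a MIDI message selects the fingering while the driver level is brought up and an audio signal tuned to the resulting resonance excites the air column. The channel, note, and control-change assignments below are hypothetical, not MARIE's documented mapping.

```python
# Hypothetical sketch (using the mido library) of coordinating a note on
# CARI: a MIDI note-on selects the fingering while a control change scales
# the driver level. Channel, note, and CC numbers are assumptions, not
# MARIE's documented mapping; the Max MARIE Console handles this in practice.
import time

import mido

out = mido.open_output()   # default MIDI output port


def play_cari_note(note: int, duration: float, level: int = 100) -> None:
    """Open the fingering for `note`, bring up the driver, then release."""
    out.send(mido.Message('note_on', channel=0, note=note, velocity=64))
    out.send(mido.Message('control_change', channel=0, control=7, value=level))
    time.sleep(duration)   # the tuned audio signal drives the air column here
    out.send(mido.Message('control_change', channel=0, control=7, value=0))
    out.send(mido.Message('note_off', channel=0, note=note))


play_cari_note(62, duration=1.0)
```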
5. MUSICAL EXPLORATIONS OF MARIE
EMMI is dedicated not only to the design and construction of
novel robotic instruments, but also to composing music that
takes full advantage of these instruments’ capabilities. The
authors, as well as several other composers, have created new
pieces for MARIE that explore the specific features of this
instrument.²
5.1 EMMI’s Compositions for MARIE
In addition to hyper-virtuosic speed and rhythmic complexity,
as displayed in From Here to There (Barton), Push for Position
(Barton), and Microbursts (Kemper), MARIE is capable of
dynamic and timbral control, intra- and inter-instrument
feedback, and the decoupling of sound source and resonator.
These new possibilities have been explored in In Illo Tempore
(Kemper), MARIE Explorations (EMMI) and Phantom
Variations (Rogers). Rogers’ Improvisation X series unifies all
of the performance concepts described here as an interactive
framework for real-time free improvisation with human
performers.
5.2 SMC 2012 Curated Concert
One indicator of an instrument’s successful design is the ability
for other musicians to be creative with it. EMMI achieved this
milestone in 2012, hosting a curated concert of new pieces for
MARIE and Transportable Automatic Percussion Instrument
(TAPI) for the 2012 Sound and Music Computing conference
in Copenhagen [17]. Composers from the U.S., Canada, and the
U.K. were invited to write new pieces for MARIE. The
resulting works utilized a variety of software systems and
included pieces for acoustic instruments and MARIE (Nebula
Squeeze—Lane), interactive systems (Untitled—Trail,
Détente—Miller), and algorithmic systems (Coming
Together: EMMI—Eigenfeldt, Blues for Nancarrow—Collins).
6. FUTURE DIRECTIONS
Given MARIE's immense parameter space, along with its status
as an actively touring instrument, some of the original design
concepts have yet to be fully implemented and explored,
including the moving bridge, digitally controlled on-board
effects circuitry, and on-board video. We continue to optimize
and improve upon AMI's pickup and picking mechanisms, and
may incorporate additional features such as automated string
tuning in future iterations [9].
7. ACKNOWLEDGMENTS
MARIE was generously funded by the backers of a Kickstarter
campaign. Godfried-Willem Raes provided endless ideas and
guidance during and following Rogers’ residency at the Logos
Foundation in Ghent, Belgium, which was made possible by the
Logos Foundation and a Fulbright Research Fellowship. EMMI and
the EAR Duo's Northeast US tour was funded in part by Meet the Composer
grants, and residencies at Brandeis, STEIM, and De Lindenberg were
essential in refining the instruments and music.
² www.expressivemachines.com/MARIE-Compositions
8. REFERENCES
[1] E. J. Berdahl, G. Niemeyer, and J. O. Smith III,
Feedback Control of Acoustic Musical Instruments.
CCRMA Report no. 120, Stanford University, CA, 2008.
[2] P. Bloland, The Electromagnetically-Prepared Piano and its
Compositional Implications. In Proceedings of the 2007
International Computer Music Conference. Copenhagen,
Denmark, 2007, 125-128.
[3] R. Dannenberg et al., McBlare: a robotic bagpipe player.
In Proceedings of the 2005 conference on New Interfaces
for Musical Expression. National University of Singapore,
2005.
[4] M. A. Fabio, The Chandelier: An Exploration in Robotic
Musical Instrument Design. M.S. Thesis, M.I.T., Cambridge,
MA, 2007.
[5] H. Hart, Robotic Ensemble MARIE Will Jam With Humans (If
the Money’s Right). http://www.wired.com/2010/12/marie-
robot-music-ensemble/, Accessed January 19, 2015.
[6] A. Kapur, A history of robotic musical instruments. In
Proceedings of the 2005 International Computer Music
Conference. Barcelona, Spain, 2005.
[7] L. Maes, G.-W. Raes, and T. Rogers, The Man and Machine
robot orchestra at Logos. Computer Music Journal 35(4),
MIT Press, Cambridge, MA, 2011, 28–48.
[8] J. McVay, D. A. Carnegie, J. W. Murphy, and A. Kapur,
MechBass: A systems overview of a new four-stringed robotic
bass guitar. In Proceedings of the 2012 Electronics New
Zealand Conference, Dunedin, New Zealand. 2012.
[9] J. Murphy, P. Mathews, A. Kapur, and D. A. Carnegie, Robot:
Tune Yourself! Automatic Tuning in Musical Robotics. In
Proceedings of the 2014 International Conference on New
Interfaces for Musical Expression, London, United Kingdom,
2014, 565-568.
[10] R. Olbeter, Fast Blue Air. Roland Olbeter - Set Designer and
Rob Art. http://www.olbeter.com/fast_blue.html, Accessed
November 19, 2014.
[11] G.-W. Raes, Expression Control in Automated Musical
Instruments. http://logosfoundation.org/g_texts/expression-
control.html, Accessed April 14, 2015.
[12] G.-W. Raes, <Hybr>,
http://logosfoundation.org/instrum_gwr/hybr.html, Accessed
April 14, 2015.
[13] Robotic Clarinet Wins Orchestra Competition. University of
New South Wales Newsroom, http://goo.gl/GFkcPk, Accessed
January 19, 2015.
[14] E. Singer, K. Larke, and D. Bianciardi, LEMUR
GuitarBot: MIDI Robotic String Instrument. In
Proceedings of the 2003 Conference on New Interfaces for
Musical Expression (NIME-03), Montreal, Canada, 2003,
188-191.
[15] J. Solis, K. Chida, K. Suefuji, and A. Takanishi, The
development of the anthropomorphic flutist robot at Waseda
University. International Journal of Humanoid Robotics (IJHR) 3,
2006, 127-151.
[16] J. Solis et al., Mechanism Design and Air Pressure Control
System Improvements of the Waseda Saxophonist Robot. In
2010 IEEE International Conference on Robotics and
Automation (ICRA), 2010, 42–47.
[17] Concert 4: Music Robots.
http://smc2012.smcnetwork.org/program-2/program/, Accessed
January 25, 2015.
Video of MARIE: https://youtu.be/KOIUvFIPfts