The Instrument and its Double: a New Perspective on
Instrument Making
Victor Gama
MA by project theorization essay
January 2012
Overview
This essay defines a process whereby the concept, design, development and
construction of musical instruments are part of a method of music composition. In this
method, a musical instrument is first and foremost a formless container of meanings that
constitute the narrative of a particular composition. It is therefore a guiding interface or
mediation between the composer and the symbolic material he has collected to arrive at
a particular musical lexicon. Moving away from the common perspective on writing for
specific instruments as the main scoring method in music composition, this process
redirects the focus from instruments with a fixed design to instruments in which design is
a variable. Whereas in the former method an instrument is chosen for its specific timbre,
texture, volume, etc., in this method the instrument represents an additional set of
variables to be used in the development of the composition(s). The method challenges the fixation
on making music for "conventional" instruments and on making instruments for
"conventional" music. It proposes a new approach to making music that includes the
process of developing the tools (the instruments) for that aim, treating that process as an
open field in which the composer can develop his own cultural and aesthetic modes of
communication through music.
Furthermore, by allowing the development of the instrument to become part of the
process of composition, the need to bring in technical methods, skills, processes,
techniques, tools and materials, in other words, technology, arises. Technology can also
be viewed as an activity that forms or changes culture1. The choice of technologies, and the
key role they play in this project, is inextricably linked to how large a part technology plays
in our lives today and how much it is changing our cultures. The rise of the digital era has
brought an array of technical methods, fabrication processes and skills that can be acquired
quickly, with learning curves shorter than ever and a wealth of learning tools on the web,
such as tutorials, PDF manuals, talks and videos. We can no longer avoid embracing the
tools of our time and, most importantly, taking into our hands the opportunity to shape that
cultural change. Each new technical advance seems to spawn a series of pieces that exploit
the new effect. Technology pushes the music, which pulls the technology along with it.2
1 Borgmann, Albert (2006). "Technology as a Cultural Force: For Alena and Griffin". The Canadian
Journal of Sociology 31 (3): 351–360.
From Physical to Digital
The major impact of emerging digital technologies on music from the late 1970s up to
today has been the digitalization of musical instruments, i.e. instruments whose sound is
generated either by sound synthesis of various types or from libraries of sampled sounds,
controlled via MIDI (Musical Instrument Digital Interface) or by sequencer software. For
the first time in history, instruments that once had the exclusive rights to their sound
"lost" that exclusivity to formless virtual instruments. Virtually all types of musical
instrument have been sampled and digitalized, and new ones have been invented in the
digital domain. An increasing number of composers use sound libraries not just for
mock-up pieces that are later recorded by an orchestra but for scoring big-budget
Hollywood productions, music for documentaries, TV series, advertising and computer
games. A huge variety of plug-in virtual instruments is available from numerous
manufacturers, and a selection of top-name libraries, such as the Vienna Symphonic
Library, the East West Orchestral Library and Sonic Implants' packages, covers all orchestral
instruments, solo or in groups, ready for use with notation software such as Sibelius,
Finale or Notion. These instrument libraries contain extensive articulations of the various
instruments, enabling the composer to create sonic results of the highest caliber and
authenticity. Virtual instruments can be used in three ways: as a plug-in in a host mixing
application; as a plug-in in a sequencer (e.g. Cubase, Logic, Digital Performer, Pro Tools)
or notation program (e.g. Sibelius, Finale, Notion); or as a simple stand-alone application.
An entire orchestra can be loaded
on a single laptop computer.
2 Roads, Curtis (ed.). The Music Machine: Selected Readings from Computer Music Journal. MIT Press, 1989, p. xi
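As a concrete illustration of how a composer might drive such sampled virtual instruments programmatically, the short Python sketch below writes a standard MIDI file that any sampler or notation host can play back. It assumes the third-party mido library; the note numbers and the output file name are purely illustrative and not part of any workflow described in this essay.

    # Minimal sketch: generate a MIDI file to drive a sampled virtual instrument.
    # Assumes the third-party "mido" library; notes and file name are illustrative.
    from mido import Message, MidiFile, MidiTrack

    mid = MidiFile(ticks_per_beat=480)
    track = MidiTrack()
    mid.tracks.append(track)

    # A short C minor arpeggio: each note_on is paired with a note_off,
    # with delta times given in ticks (480 ticks = one quarter note here).
    for note in (48, 51, 55, 60):
        track.append(Message('note_on', note=note, velocity=80, time=0))
        track.append(Message('note_off', note=note, velocity=0, time=480))

    mid.save('sketch_for_sampler.mid')  # load the file into any sampler or notation host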
Referring to electronic music and the "glitch" movement, Kim Cascone, composer,
researcher and assistant music editor for director David Lynch on Twin Peaks and Wild
At Heart, states that computers have become one of the primary tools for creating music,
and certainly the main ones for creating sound design. In this new music, the tools
themselves have become the instruments, and the resulting sound is born of their use in
ways unintended by their designers3.
New Tools
When Ableton Live was launched in 2001 it changed the global electronic music scene
once again. Ableton Live is a loop-based software music sequencer and DAW (Digital
Audio Workstation) that was designed to be an instrument for live performances as well
as a tool for composing and arranging. It turned DJs from turntablists into composers in
their own right and gave rise to a new generation of composers who are using the
program. Using hardware controllers and their laptops, they perform their music live
and interact in real time with musicians and whole orchestras. The program is built
around sound libraries collected by the composer; the sampled sounds are looped for
variable durations and triggered from USB/MIDI pad controllers.
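A rough sense of this loop-triggering workflow can be given in code. The Python sketch below listens to a USB/MIDI pad controller and fires a sampled loop for each pad hit; it assumes the third-party mido and simpleaudio libraries, and the pad numbers and file names are hypothetical. It sketches the general idea only, not how Ableton Live itself works.

    # Sketch of pad-triggered loop playback, assuming "mido" and "simpleaudio".
    # Pad note numbers and loop file names are hypothetical placeholders.
    import mido
    import simpleaudio as sa

    PAD_TO_LOOP = {
        36: 'kick_loop.wav',
        38: 'texture_loop.wav',
        42: 'voice_loop.wav',
    }

    with mido.open_input() as port:              # default MIDI input port
        for msg in port:                         # blocks, yielding incoming messages
            if msg.type == 'note_on' and msg.velocity > 0 and msg.note in PAD_TO_LOOP:
                loop = sa.WaveObject.from_wave_file(PAD_TO_LOOP[msg.note])
                loop.play()                      # fire-and-forget playback of the loop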
Chicago Symphony Orchestra composer-in-residence Mason Bates is a good example
of how young composers are operating today. A DJ and techno artist with a PhD in
music composition from the University of California, DJ Masonic has written works for
orchestra and electronics using such programs and devices4 while playing live with the
orchestra. Frequently performed by orchestras large and small, his symphonic music
has been the first to receive widespread acceptance for its expanded palette of
electronic sounds, and it is championed by leading conductors such as Michael Tilson
Thomas, Leonard Slatkin, and John Adams5.
But working in the digital domain has moved on from using sampled sounds of real
instruments or sounds from the real world to creating all sorts of mechanisms of sound
generation and manipulation to produce and compose new music. Curtis Roads coined
the term microsound for all variants of granular and atomic methods of sound synthesis,
3 Cascone, Kim. "The Aesthetics of Failure: 'Post-digital' Tendencies in Contemporary Computer
Music". Computer Music Journal 24:4 (2000).
4 Bates, Mason. "Mothership" played by the London Symphony Orchestra on-line at
http://tinyurl.com/27lc29g
5 From Mason Bates's website, on-line at http://www.masonbates.com/classical/
and for tools capable of operating at this microscopic level. Electronic musicians now use
Max/MSP to develop unique sound-making tools, unusual instruments, and custom
performance systems that go deeper into the manipulation of samples acquired from the
physical domain. Max/MSP is a program that builds sounds, visuals and interactive media
from small parts. These parts, called 'objects', are visual boxes that contain tiny programs
with specific functions. Some of these objects make noises, some create video effects,
while others simply do calculations or make decisions. In Max, objects are added to a
visual canvas and connected together with patch cords to form patches. By combining
objects, one creates interactive and unique software without ever writing a single line of
code. For composers, Max is more like a Lego set that allows them to create their own
instruments or sound devices within the digital domain and then perform them live,
almost always from behind their laptops. Digital
technology, for all its virtues as a precise tool for analysis, articulation of data,
communication and control, is propelling society towards a detachment from physicality6.
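Max patches are built graphically rather than in text, but the granular, "microsound" approach mentioned above can be sketched in a few lines of code. The Python example below, assuming only NumPy and the standard library, scatters hundreds of short enveloped sine grains across a few seconds of audio and writes the result to a WAV file; grain sizes, densities and frequencies are arbitrary choices made purely for illustration.

    # Text-based analogue of the granular ("microsound") approach: short
    # enveloped grains scattered in time, written to a WAV file.
    import wave
    import numpy as np

    SR = 44100
    rng = np.random.default_rng(0)
    out = np.zeros(SR * 4)                        # four seconds of output

    for _ in range(800):                          # 800 grains
        dur = rng.integers(int(0.005 * SR), int(0.05 * SR))      # 5-50 ms grains
        t = np.arange(dur) / SR
        freq = rng.uniform(200, 2000)
        grain = np.sin(2 * np.pi * freq * t) * np.hanning(dur)   # enveloped sinusoid
        start = rng.integers(0, len(out) - dur)
        out[start:start + dur] += 0.1 * grain     # scatter the grain in time

    out /= np.max(np.abs(out))                    # normalize
    with wave.open('grains.wav', 'w') as f:
        f.setnchannels(1); f.setsampwidth(2); f.setframerate(SR)
        f.writeframes((out * 32767).astype(np.int16).tobytes())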
Using computers, as opposed to the analogue electronics and tape that were common
before the 1980s, composers began to integrate the digital domain into their methods of
composition, in a cyclical, interactive movement between the physical domain and the
digital domain.
A pioneering example of this back-and-forth movement between the physical domain
and the digital domain is provided by Pierre Boulez, who began to explore the use of
electronic sound transformation in real time at IRCAM with his piece Répons, which
premiered in 1981. Répons was composed for six soloists and chamber orchestra, which
responded to the resonances and spatialized sounds created by the ensemble and
processed in real time by a computer. Real instruments were sampled, processed in
real time inside a computer's microprocessors and re-emitted into the air with hardly any
latency or delay, allowing the musicians playing a note to interact with its
digital-domain counterpart7.
6 Mott, Iain; Sosnin, Jim. Sound Mapping: an assertion of place. Proceedings of Interface '97.
Conservatorium of Music, University of Tasmania
7 Répons, Lucerne, Switzerland, 5 September 2009, Pierre Boulez conducting. On-line video at
http://www.youtube.com/watch?v=DsEGijWx3YI
From Digital to Physical
This essay considers that a de-materialization of the musical instrument has been made
possible by the emergence and development of digital-age technologies. This has allowed
the instrument to be performed and used by the composer or musician from within a
domain other than the physical one, in which certain parameters that were once
constant have become variable. The argument in this essay is that digital technologies,
which have made possible the de-materialization or digitalization of musical instruments,
can also be used to re-materialize a new type of musical instrument. Created in the
digital domain and resulting in a physical object, this process generates parameters
associated with form, design, materials and construction that can be considered
additional variables. What is proposed is that these new variable parameters be
integrated into the composer's writing process on an equal footing with the variable
parameters of the physical domain, such as duration, dynamics, pitch,
attack, etc.
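One way to picture this proposal, purely as an illustration and not as a description of any existing tool, is a data structure in which design parameters sit next to the familiar score parameters, so that a composition can vary either kind. The Python sketch below uses only the standard library; all field names and values are hypothetical.

    # Minimal sketch, assuming nothing beyond the standard library: design
    # parameters of the instrument alongside conventional score parameters.
    from dataclasses import dataclass

    @dataclass
    class InstrumentEvent:
        # conventional score parameters
        pitch_hz: float
        duration_s: float
        dynamic: str                # e.g. "pp", "mf", "ff"
        # design parameters, normally fixed, here treated as variables
        soundboard_material: str    # e.g. "okoume plywood"
        string_length_mm: float
        body_shape_file: str        # path to the 3D model used for this event

    event = InstrumentEvent(220.0, 2.5, "mf", "okoume plywood", 640.0, "acrux_v2.3dm")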
The new musical instrument is the result of a collection of meanings, a symbolic system
created by the composer, or, as Elzbieta Kazmierczak puts it, a semiotic interface, crafted
in the digital domain. In her paper "Design as Meaning: From Making Things to the
Design of Thinking", she proposes that all designs be regarded as diagrams of mental
maps of individual and collective cultures8. The instrument is now a personal matter, and
the composer controls more parameters than ever before. What emerges from the
inversion of the digitalization process is an instrument that now has two existences: one
in the digital world, where it was created, as a sound library, a virtual instrument, a 3D
model and other digital parameters; and its double in the physical world, as an instrument
that can be played and used in live performance. Both are the result of a back and forth
movement, this time from the digital world to the physical world. This new musical
instrument is free from the finite, fixed design paradigm as its originating parameters can
be altered. The virtual instrument can be so intimately coupled with the physical
instrument that they must now be understood together, as two aspects of the same
instrument, a virtual/physical instrument.
8 Kazmierczak, Elzbieta (2003). "Design as Meaning: From Making Things to the Design of
Thinking". Design Issues, Vol. 19, No. 2, Spring 2003, pp. 45-59.
How is re-materializing accomplished?
One of the most outstanding and ground-breaking technologies today is 3D modelling.
Commonly known as Computer Aided Design or CAD, it is an important industrial technique
extensively used in many applications, including automotive, shipbuilding, and aerospace
industries, industrial and architectural design, prosthetics, and many more. It permeates all
industries as most of the fabrication methods are now numerically controlled (NC). CNC
(Computer Numerical Control) refers to the automation of machine tools that are operated
by a software program running on a computer, as opposed to controlled manually via hand
wheels or levers. In other words, these are machines that can make components straight from
files created using 3D modelling or CAD programs. Any conceivable part or
component can be machined or fabricated using a process that starts in the digital domain
and ends in the physical domain, usually referred to as CAM (Computer Aided
Manufacturing). This is done by designing the object in 3D using a modelling package such
as Rhino, SolidWorks, AutoCAD or 3ds Max, among others, testing the design through
analysis of various types, exporting the file in a standard compatible format to the machine
and having it made in the material of choice. Rhino3D has been the 3D modelling program
used throughout this MA. Rhino can create, edit, analyze, document, render, animate, and
translate NURBS curves, surfaces, and solids with no limits on complexity, degree, or size.
Rhino also supports polygon meshes and point clouds. Starting with a sketch, drawing,
physical model, scan data, or only an idea, Rhino provides the tools to accurately model and
document any design ready for rendering, animation, drafting, engineering, analysis, and
manufacturing or construction9.
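To make the end of this digital-to-physical chain concrete, the toy Python sketch below turns a simple 2D outline into a numerically controlled toolpath. In practice the geometry would be exported from Rhino and post-processed by CAM software; the G-code here is a generic dialect, and the circular soundhole, dimensions and file name are purely illustrative.

    # Toy illustration of driving a CNC machine from a digital design: write a
    # generic G-code toolpath that cuts a circular soundhole.
    import math

    def circle_gcode(cx, cy, radius, depth, feed=300, segments=72):
        lines = ["G21 ; units in mm",
                 "G90 ; absolute coordinates",
                 "G0 Z5.0 ; lift the tool",
                 f"G0 X{cx + radius:.3f} Y{cy:.3f}",
                 f"G1 Z{-depth:.3f} F{feed} ; plunge"]
        for i in range(1, segments + 1):          # straight segments approximate the circle
            a = 2 * math.pi * i / segments
            lines.append(f"G1 X{cx + radius * math.cos(a):.3f} "
                         f"Y{cy + radius * math.sin(a):.3f} F{feed}")
        lines.append("G0 Z5.0 ; retract")
        return "\n".join(lines)

    with open("soundhole.nc", "w") as f:
        f.write(circle_gcode(cx=0.0, cy=0.0, radius=40.0, depth=4.0))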
In the last decade or so, another new technology has become available and affordable, in
which an object is literally printed into existence. Called 3D printing or rapid prototyping, it
uses a number of different techniques, such as SLS (Selective Laser Sintering). SLS is an
additive manufacturing technique that uses a high power laser to fuse small particles of
plastic, metal, ceramic or glass powders into an object that has a desired 3-dimensional
shape. The laser selectively fuses powdered material by scanning cross-sections generated
from a 3-D digital description of the part (a CAD file) on the surface of a powder bed.
9 From Rhino3D website, on-line at http://www.rhino3d.com/
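The "scanning cross-sections" step at the heart of SLS and other additive processes can also be sketched in code. The Python example below, assuming only NumPy, intersects a triangle mesh with a horizontal plane to recover the line segments of one layer; a real machine would of course use the manufacturer's slicer on the exported CAD file, and the tetrahedron here stands in for an actual instrument component.

    # Minimal sketch of slicing a triangle mesh into the cross-sections that an
    # additive machine would scan layer by layer. Pure NumPy illustration only.
    import numpy as np

    def slice_mesh(triangles, z):
        """triangles: (n, 3, 3) array of vertex coordinates; returns the segments at height z."""
        segments = []
        for tri in triangles:
            pts = []
            for i in range(3):
                a, b = tri[i], tri[(i + 1) % 3]
                if (a[2] - z) * (b[2] - z) < 0:        # this edge crosses the plane
                    t = (z - a[2]) / (b[2] - a[2])
                    pts.append(a + t * (b - a))        # interpolated crossing point
            if len(pts) == 2:
                segments.append((pts[0][:2], pts[1][:2]))
        return segments

    # Example: one layer through a unit tetrahedron.
    tet = np.array([[[0, 0, 0], [1, 0, 0], [0, 1, 0]],
                    [[0, 0, 0], [1, 0, 0], [0, 0, 1]],
                    [[0, 0, 0], [0, 1, 0], [0, 0, 1]],
                    [[1, 0, 0], [0, 1, 0], [0, 0, 1]]], dtype=float)
    layer = slice_mesh(tet, z=0.25)
    print(len(layer), "segments in this layer")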
Despite having all of these technologies available to design new musical instruments and
manufacture their components in any shape and material, there is one further area that
needs to be part of any successful instrument maker's digital workshop: testing the
components of the new instrument for strength and, most importantly, for optimum vibrational
characteristics within the digital domain. This is accomplished by taking the 3D model of each
part of the instrument being tested and feeding it into a test environment that performs
modal analysis through a Finite Element Analysis program.
Refining in the digital domain
Finite Element Analysis (FEA) is used to analyse and predict how an instrument will behave
acoustically. The use of FEA software comes in at the stage of determining the vibrational
behaviour of a soundboard or determining at which natural frequency a certain component
resonates. This will in turn allow the instrument builder to refine the various components and
look for the ideal materials, configurations and designs that create the best possible
frequency response of the instrument. If the instrument needs a better frequency response
in the bass range for instance, this can be determined using a software program such as
LISA. LISA is a user-friendly low-cost finite element analysis program with an integrated
modeller, multi-threaded solver and graphical post-processor. It has a broad range of
functionalities to test the design integrity of custom machines/devices, structures, heat
exchangers, etc. It finds the natural frequencies of a structure and its vibrating mode
shapes. It supports membranes, surfaces, solids and space frames. Resonances in rooms
and other closed cavities can also be found, which is useful to ensure that vibrating elements
such as strings do not set up strong standing waves inside resonating boxes. LISA has been
thoroughly tested and used in the course of this MA and has proven to be an accessible, easy
to use and accurate testing program. It lets the user specify the main characteristics of the
materials of the 3D model it is working on, such as Young's modulus, Poisson's ratio and
density. Once the constraints applied to the component are defined, its solver generates a
graphic and an animation illustrating the mode shapes, along with the resulting frequencies of
vibration.
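The computation LISA performs on a full 3D model can be illustrated at miniature scale. For a discretized structure with stiffness matrix K and mass matrix M, modal analysis solves the generalized eigenvalue problem K x = ω² M x. The Python sketch below, assuming NumPy and SciPy, does this for a small chain of point masses and springs standing in for a finite element model; the stiffness and mass values are arbitrary and purely illustrative.

    # Miniature illustration of modal analysis: natural frequencies of a lumped
    # spring-mass chain from the generalized eigenvalue problem K x = w^2 M x.
    import numpy as np
    from scipy.linalg import eigh

    n = 6                       # degrees of freedom
    k = 1.0e5                   # spring stiffness, N/m
    m = 0.01                    # lumped mass, kg

    K = np.zeros((n, n))
    for i in range(n):
        K[i, i] = 2 * k
        if i > 0:
            K[i, i - 1] = K[i - 1, i] = -k
    M = m * np.eye(n)

    w2, modes = eigh(K, M)                  # eigenvalues are w^2 (rad^2/s^2)
    freqs_hz = np.sqrt(w2) / (2 * np.pi)
    print("natural frequencies (Hz):", np.round(freqs_hz, 1))
    print("first mode shape:", np.round(modes[:, 0], 3))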
This resulting data can be used to generate new sound material. Determining the vibrational
characteristics of a vibrating element yields the fundamental frequency and the harmonic
frequencies at which it vibrates. The result is basically a synthesiser working closely with the shape
and material characteristics of the instrument. The digital workshop is now complete and
ready to produce the virtual instrument, this time not by sampling or digitalizing but by
generating it from the resulting data, while its double in the physical domain is produced by
"reverse digitalization" or materialization using computer aided manufacturing.
But there are even more opportunities that arise from the use of an extensive digital
workshop supplied with such programs as Ableton Live or Max/MSP in combination with 3D
modelling and finite element analysis. As Kim Cascone puts it "today’s digital technology
enables artists to explore new territories for content by capturing and examining the area
beyond the boundary of "normal" functions and uses of software". In "Acrux Variations", a
new live set composed by myself and David Gunn, the acrux is performed live, interacting with
its own acoustic sounds augmented and transformed digitally using custom-built audio
software designed by Gunn, who also creates an immersive visual show live on stage10.
The digital workshop proposed in order to accomplish the practical implementation of this
method is thus made of:
• a 3D modelling environment,
• a testing digital workbench based on finite element analysis,
• manufacturing technologies such as laser cutting, CNC for wood and metal and SLS to
manufacture small parts,
• a digital audio workstation such as Ableton Live or Cubase,
• music notation software such as Sibelius and Notion,
• Max/MSP and various effects processors.
Future developments
The interaction between the digital and physical domains has the potential to yield interesting
developments in the creation of interfaces that can expand the capabilities of new musical
instruments developed through this method. Instruments can expand their natural range by
feeding from their virtual instrument counterpart or expand their tuning system to include
more intervals. These capabilities could be triggered by specific movements of an arm,
10 From Incidental's website, on-line at http://www.theincidental.com/projects/acrux/
elbow, or eyes, for instance, while playing the instrument, much the same way a harpist
presses the pedals to mechanically access all of the accidentals in the chromatic scale.
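The harp-pedal analogy can be made concrete with a small sketch. The Python example below, using only the standard library, maps a single 0-127 controller or sensor value onto layered tuning sets that the virtual counterpart would add to the instrument's open strings; the sensor source, string pitches and interval ratios are all hypothetical.

    # Sketch of pedal-like switching: a control value selects which extra
    # intervals the virtual counterpart adds to the physical open strings.
    OPEN_STRINGS_HZ = [98.0, 123.5, 147.0, 196.0, 247.0, 294.0]

    TUNING_LAYERS = {
        0: [1.0],                      # natural: the strings as built
        1: [1.0, 16 / 15],             # add a semitone above each string
        2: [1.0, 16 / 15, 7 / 6],      # add a septimal minor third as well
    }

    def available_pitches(sensor_value):
        """Map a 0-127 controller/sensor value onto one of the tuning layers."""
        layer = min(sensor_value // 43, 2)          # 0-42 -> 0, 43-85 -> 1, 86-127 -> 2
        return [s * r for s in OPEN_STRINGS_HZ for r in TUNING_LAYERS[layer]]

    print(len(available_pitches(10)), len(available_pitches(100)))   # 6 vs 18 pitches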