
Maintaining and Constraining Performer Touch in the Design of Digital Musical Instruments


Maintaining and Constraining Performer Touch in the
Design of Digital Musical Instruments
Robert H Jack
Centre for Digital Music
Queen Mary University of London

Tony Stockman
Centre for Digital Music
Queen Mary University of London

Andrew McPherson
Centre for Digital Music
Queen Mary University of London

Abstract
Expression in musical practice is inextricably tied to the touch
of the performer. In digital musical instruments (DMIs) the
relationship of touch to sound is indirect: the nuances and
fine detail of performer control can be flattened and limited
during the translation of physical gesture to physical sound.
The locus of this research is in the contact made between
performer and DMI: focusing on this area can grant insight
on fundamental issues of human computer interaction, partic-
ularly regarding intimate and expressive control of tangible
interfaces. In this paper I present my research on this topic
so far, which includes empirical studies that focus on spe-
cific parameters of performance where touch plays an integral
role. The first study investigates how dynamic vibrations in
an instrument’s body can guide the hand of a performer and
assist with intonation. The second study looks at asynchrony
between action and sound and the influence this latency has
on the perceived quality of an instrument.
ACM Classification Keywords
H.5.5 [Information Interfaces and Presentation]: Sound and
Music Computing–Methodologies and techniques
Author Keywords
Haptic interfaces; Digital musical instruments; Design
framework; Tangible interaction; Multi-sensory; Touch;
Physicality; Embodied cognition; Design toolkit
Introduction
This PhD research investigates the design of digital musical instruments and the role of the sense of touch in the design
process. DMIs have been defined as musical instruments that
consist of a gestural sensing layer that is then used to drive
musical parameters of sound synthesis [17]. Unlike acoustic
instruments where action and sound are tightly coupled via
the physical mechanism of the instrument, DMIs lack a tight
coupling between sound and tactility.

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from
TEI ’17, March 20-23, 2017, Yokohama, Japan
© 2017 ACM. ISBN 978-1-4503-4676-4/17/03...$15.00

Haptic experience, the umbrella term for perceptions pertaining to touch, is becoming increasingly important to the design of all kinds of human-computer interfaces: haptic attributes and characteristics are becoming central to our interaction with digital objects [20].
The finer detail of how haptic experience unfolds has, however, been underexplored in comparison to research conducted on vision and audition. This research exists within a growing body of work that addresses these matters [9, 6, 15], and contributes to this discourse by shifting the focus to the design of musical instruments, a field of design where touch and ‘feel’ are of the highest importance. This research proposes that these considerations can and should be knowingly and explicitly taken into account by designers of DMIs.
Alongside audition, touch is the primary mode through which
we engage with musical instruments. In fact learning a musical
instrument can be understood as one of the most developed
haptic cultural practices, where years of practical and theoret-
ical training reinforce sensorimotor pathways allowing us to
perform complex music. When playing an acoustic instrument
we get a great deal of rich sensory information from the part
of our body which makes contact with the instrument (hands,
fingertips, lips, shoulders). This has been shown to contribute
to temporal control [7], quality judgements [4] and expres-
sive control [15]. For most DMIs, although touch-mediated
interaction is still the primary means of control, there is no
comparably rich physical experience from the instrument. By
focusing on the contact that a performer makes with an in-
strument we gain access to an area of performer-instrument
interaction that is not accessible by concentrating on hearing
alone. The focus shifts to instrument physicality, that is to the
physical characteristics of sound and to the action and labour
of performance. This research project investigates how we can
re-frame the sense of touch in the design process of a DMI.
As part of this project new musical instruments have been
designed specifically to investigate different aspects of touch
in DMI performance.
Context and motivation
The emotional capacity of touch is central to the field of
tangible computing, which calls for a rediscovery of the
rich physical aesthetics of manual interaction with comput-
ers [20]. Two aspects of the idea of tangibility distinguish
touch from the other senses and are of particular relevance to
DMIs: ‘immediacy’ and ‘manipulation’ [2]. Immediacy refers
to the fact that the sense of touch relies on direct physical con-
tact, and gives us an almost intimate experience of an object
that the other senses can’t provide. Manipulation refers to
the bi-directional nature of touching: as both an input and an
output act, touching is our primary means of effecting change
on objects in our environment. As tangible user interfaces
DMIs are important examples due to the high degree of skill
required in their operation and to the expressive, culturally
meaningful music they can produce.
Each of the studies presented in this paper focuses on the implicit cross-modality of controlling a musical instrument: the congruencies and redundancies that exist between touch and hearing [8]. Perceptual attendance, that is, the relative weighting of sensory information provided to different sensory channels, is crucial to the design of DMIs. Touch itself has a much lower bandwidth than vision or audition [6]; however, the overall amount of information available to a sensory channel does not necessarily relate to its value or importance: the power of the haptic channel to transmit emotional (and at times vitally relevant) information can certainly outweigh its bandwidth limitations [8]. Another way in which touch stands out from
the other sensory channels is in its explicit reliance on move-
ment: touch is never a passive sense and "movement is as
indispensable to touch as light is to vision" p.8 [14].
From the perspective of embodied music cognition [15] mu-
sic, too, is based on action: body movement is given prime
importance in the formation of musical meaning. In the case
of a DMI the relationship of action to sound does not have to
be direct, and in fact the beauty of electronic music is perhaps
in the shift of responsibility for sound making from the human
to the machine [3]. Within the vast spectrum of approaches to
DMI design, this research studies instruments that display a
tight coupling of action and sound [13] due to the connection
of intimate control and expressive performance [21]. A related concept is that of ergoticity, introduced by Cadoz [1], which posits that an essential property of instrumental interaction is
the preservation of energy through both digital and physical
components of a system: all signals produced by the system
should correspond to the amount and shape of energy fed into
the system so as to provide a more natural form of interaction.
This is usually not the case with DMIs and it has been identi-
fied by many working within the field of computer music that
the expressive possibilities of traditional musical instruments
(such as the piano, the electric guitar or the cello) have not yet
been matched by DMIs [2, 15, 17].
Haptic engagement with a DMI is guided by both static factors
(material, weight, arrangement of keys, strings or frets) and
dynamic factors (how it responds both physically and sonically
to energy put in by the performer) [17]. Leman describes
haptic feedback as "a multi-modal prerequisite for musical
expressiveness" p. 163 [15] as it gives the performer a more
reliable sense of how gesture translates to sound at the moment
of excitation. Fostering or maintaining this kind of ergotic or
familiar interaction while utilising the great sonic potential that
digital technologies offer is a tough challenge for designers.
This research seeks to clarify the parameters of design that
can help maintain the presence of a performer’s touch in their
interaction with a DMI, parameters that influence the feel and
hence the perceived quality of a DMI, and parameters that
are fundamentally to do with how touch is catered for and
understood in the design process. To return to perceptual
attendance, if we know the important factors of touch that
performers attend to, the kind of stimuli or characteristics of
stimuli that naturally pop out from the background, then we
know what to simulate in high fidelity, and what we only need to give a gist of in low fidelity.

Figure 1. The top surface and front panel of the embedded instrument with vibrotactile feedback built for study 1 [12]. (Labelled in the figure: a capacitive touch sensor resting on felt cradles, an actuator attached to the bottom of the sensor, a loudspeaker, and an instrument body made from laser-cut plywood.)
Research objectives and questions
The primary research objective of this project is to develop
a design framework for DMIs that focuses on the sense of
touch. This is achieved by looking at the parameters of design
that bind touch and sound together in a DMI. The questions
that this research seeks to answer can be divided into three complementary domains: touching and tactile orientation; audio-haptic immediacy and intimate control; excitation and note onset. Each question has both a theoretical and a technological element.
The first domain touching and tactile orientation aims to an-
swer the following question: how does the physical structure
and dynamic tactile behaviour of an instrument influence a
performer’s control and understanding of instrument layout?
Rojas [18] proposes that learning an instrument can be understood as the development of a musical topography, one that maps out how the performer-instrument relationship relates to aesthetic, ecological, and technological threads contained within
the instrument’s structure. This area of research looks at how
such musical topographies are influenced by an instrument’s
physical structure.
The second domain considers the temporal relationship of
audio and haptic feedback from an instrument, and aims to
answer the following question: how can the temporal
relationship between audio and haptic feedback in a DMI
be designed to improve perceived controllability, perceived
expressivity or perceived quality of the instrument? This area
addresses issues of perceptual attendance, studying how small
changes in the perceived mechanism of a DMI can influence
performance and subjective opinions about instruments.
The third domain considers how sound is excited from an
instrument and aims to answer the following question: how
much of an instrument’s perceived quality lies in the character
of the onset of each note? This area addresses the manner in
which action translates to sound in the moment of excitation,
and the specific parameters of design that influence the fluency
of this translation.
Figure 2. The ceramic tile percussion instrument built for study 2 [11].
This cross-disciplinary research builds on recent work in
human-computer interaction (particularly physical and tan-
gible computing), multi-modal perception studies, sound de-
sign and DMI research, and aims to provide both technical
implementation guidelines for design and theoretical contri-
butions on the role of touch when interacting with a musical
instrument. The methodology involves a series of reduced musical instruments that are used as a means of testing specific theoretical territory of musical interaction. Each of these prototypes is tested with musicians of varying levels of expertise and is designed to gather empirical information about their performance while provoking reflection on the dynamics of interaction with the device. The evaluation of
these instruments happens through both a quantitative analysis
of specific performance parameters, and through qualitative interviews with the musicians (employing, amongst other evaluation techniques, recognised frameworks for the evaluation of musical instruments [19]), paired with my own self-reflective
assessment as the instrument designer.
This paper provides an overview of the first two years of this
PhD project. Over this time, alongside developing a strong the-
oretical base, I have conducted two user studies each focused
on a specific parameter of haptic interaction with DMIs.
Creating a theoretical base
The first stage of research involved the development of a pre-
liminary design framework for the touch-led design of DMIs.
Initial stages of this development were informed by working
on a project with profoundly deaf people exploring their ex-
perience of music through tactile vibration [10]. Alongside
this, I reviewed current research into haptics, tangible user in-
terfaces, physical computing and DMI design. I identified the
research questions that would focus the rest of this research:
navigation, intimacy and excitation.
Study 1: intonation and tactile feedback
This first study investigated how a DMI can respond dynami-
cally to performer control through vibrations in the body of the
instrument, and how this feedback can provide the performer
with additional performance information. The aim was to cre-
ate vibro-tactile feedback conditions that could guide the hand
of the performer whilst remaining harmonically linked to the
sound output of the instrument. An embedded DMI with three
vibro-tactile feedback conditions was created (see Figure 1),
designed to assist intonation on an instrument with continuous
pitch control. The first two conditions acted like virtual frets
on the pitch continuum, vibrating or ceasing to vibrate when
in tune or approaching an in tune note; the third, inspired by
accounts of double bassists using haptic beating in the body of
their instrument to adjust their tuning in ensemble playing [5],
used beat frequency differences between the note played and
the target note to guide the performer’s hand. The instrument
was tested with ten musicians who played the instrument while
it was hidden from sight, encouraging concentration on touch
and hearing alone.
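The mapping from pitch error to vibration in the three conditions can be sketched as follows. This is a minimal illustration, not the instrument's actual implementation from [12]: the function names, the equal-tempered target scale around A4 = 440 Hz, and the 15-cent "in tune" window are all assumptions made for the example.

```python
import math

A4 = 440.0  # assumed reference pitch for the target scale

def nearest_target(freq_hz):
    """Frequency of the nearest equal-tempered note to the played pitch."""
    semitones = round(12 * math.log2(freq_hz / A4))
    return A4 * 2 ** (semitones / 12)

def cents_error(freq_hz):
    """Signed deviation from the nearest target note, in cents."""
    return 1200 * math.log2(freq_hz / nearest_target(freq_hz))

def fret_feedback(freq_hz, window_cents=15.0, notify_in_tune=True):
    """'Virtual fret' conditions: vibrate when the note is in tune, or,
    with the opposite feedback polarity, when it is out of tune."""
    in_tune = abs(cents_error(freq_hz)) < window_cents
    return in_tune if notify_in_tune else not in_tune

def beating_rate(freq_hz):
    """Beat-frequency condition: modulate the vibration at the difference
    between played and target frequency, so the beating slows to zero as
    the hand approaches the in-tune position."""
    return abs(freq_hz - nearest_target(freq_hz))
```

The polarity flag reflects the preference finding reported below: the same error signal can notify either correct or incorrect action, and performers differ in which they favour.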
I found increased tuning accuracy with each of the conditions,
but noted that certain vibration patterns imposed temporal
constraints that disrupted performance [12]. Although pro-
viding the most accuracy, the more temporally complex beat
frequency condition was harder to integrate into performance;
this was due to the time it took to unfold, whereas the simpler
‘fretting’ conditions made for easier integration. Musicians
also reported clear preferences for a certain polarity of feedback: whether the feedback notified them of a correct or incorrect action. I also witnessed a series of emergent gestures that
used the interplay of audio and haptic feedback in unexpected
ways. Where previous studies used force feedback to push the
performer to the right pitch, this interface required an active
correction by the performer. It is interesting that both methods are successful in improving accuracy, which suggests that there is great potential for integrating audio-related vibrotactile feedback for guidance tasks in interfaces, although the temporal demands of performance limit the complexity of the feedback.
Study 2: immediacy, latency and intimacy
In the second study the focus shifted to the temporal behaviour
of a DMI. A study was designed to test the impact of audio-
haptic asynchrony (latency) on the perceived quality of a DMI
(in terms of dynamic control, temporal control, quality, naturalness). The work built on a previously conducted study [16]
which found that common techniques employed in DMI de-
sign exhibit action-sound latency above a threshold of 10ms
set by Wessel in 2002 [21]. For the study I built a novel percus-
sion instrument constructed of eight ceramic tiles with piezo
sensors on each (see Figure 2), capable of sub-millisecond
latency and negligible jitter (variation in latency).
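Latency and jitter as used here can be quantified from paired timestamps of the physical strike and the resulting sound (e.g. from a contact microphone and an audio recording). The sketch below is not the paper's measurement code; it simply illustrates one plausible definition, taking jitter as the standard deviation of the per-event latencies.

```python
import statistics

def latency_stats(action_times_ms, sound_times_ms):
    """Mean action-to-sound latency and jitter for a sequence of events.

    action_times_ms: timestamps of physical strikes (ms)
    sound_times_ms:  timestamps of the corresponding sound onsets (ms)
    Returns (mean latency, jitter), with jitter taken here as the
    sample standard deviation of the per-event latencies.
    """
    latencies = [s - a for a, s in zip(action_times_ms, sound_times_ms)]
    return statistics.mean(latencies), statistics.stdev(latencies)
```

Under this definition, an instrument with "negligible jitter" is one whose latency distribution is tightly clustered around its mean, however small or large that mean is.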
Round one: general musicians
The first iteration of the study was conducted with general musicians who freely improvised on the instrument, evaluating different latency conditions (0 ms, 10 ms, 20 ms, and 10 ms ± 3 ms jitter) in terms of instrument quality. They also performed rhythmic tasks on the instrument, which were then used to evaluate the impact of latency on their rhythmic accuracy. I found that
even if the level of latency is below the degree of accuracy
that can be achieved by the performer on an instrument, it can
still impact on how the quality of that instrument is judged.
None of the participants were able to perform with a degree of accuracy that was better than the jitter condition (± 3 ms). The 10 ms with jitter and 20 ms latency conditions showed significantly lower ratings of quality compared to the zero or 10 ms latency conditions. There were also multi-sensory by-products of some of the latency conditions, where participants put more force into their strikes as the latency increased [11].
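The rhythmic accuracy analysis above rests on a synchronisation-error measure between performed onsets and a metronome. One plausible formulation is sketched below; it is an illustration under assumed names and units, not the study's actual analysis code.

```python
def mean_sync_error_ms(onsets_ms, tempo_bpm, first_beat_ms=0.0):
    """Mean signed asynchrony between performed onsets and the nearest
    metronome beat; negative values indicate playing ahead of the beat."""
    period = 60000.0 / tempo_bpm  # beat period in milliseconds
    errors = []
    for t in onsets_ms:
        n = round((t - first_beat_ms) / period)  # index of nearest beat
        errors.append(t - (first_beat_ms + n * period))
    return sum(errors) / len(errors)
```

Comparing this error across latency conditions is what shows that a condition can degrade quality ratings without measurably degrading timing: the perceptual effect appears before the performance effect.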
Round two: professional percussionists
At the time of writing I have just completed a second repetition
of this study with ten professional orchestral percussionists.
This study was rerun with a different sample to evaluate the
impact of training on sensitivity to latency. A preliminary
analysis of the data shows that percussionists were more sen-
sitive to the latency conditions when presented side by side.
The temporal accuracy with which they performed was signif-
icantly negatively impacted by the latency conditions when
performing fast passages. This study illuminates how asyn-
chrony between action and sound can have an effect on the
perceived quality of an instrument. Subtle changes in the level of latency appear to affect the feel of the instrument for the performer: latency can temporally constrain a performer’s control gestures and can create differences in how the performer judges the effort and perceived weight of strike needed to trigger a note.
A focus on note onset and attack
From the studies conducted so far as part of this research
it seems that a crucial aspect of DMI design is how a note
comes into being; how it can be excited from the instrument.
This is often seen as a trivial problem, with latency being the main barrier, but in fact it is a complex issue that conditions many aspects of an instrument’s quality. The next study in
this research project will investigate the extent to which the
physical quality of an instrument, its feel, resides in its onset
behaviour in combination with its physical form factor. This
study will involve the creation of a series of prototype tangible
instruments but this time in a more collaborative process with
professional musicians. A goal of this study is to assess the
impact on instrument quality of physical form factor paired
with the dynamic timbral characteristics of an instrument’s
sound. I’m keen to discuss the evaluation technique I shall apply at the Graduate Student Consortium. A final step in this
research may be to conduct a survey asking both acoustic
and electronic musicians to describe the feel of their favourite
instruments. I then intend to build all of these findings into a
wider touch-led design framework for DMIs.
Considering musical interaction with a digital system from
the perspective of touch illustrates some of the relatively sim-
ple changes to design that can improve the tangible quality
of these instruments. It puts the focus on the nuance of con-
trolling a DMI and thus also the complex choices behind the
production of musical meaning, allowing us to understand
what makes instruments stand apart from interfaces. This re-
search also aims to show how, as tangible devices that are built
to foster creativity and expression, DMIs serve as good places
for considering our interaction with computers in general.
Acknowledgements
This work was supported by EPSRC under the grant EP/G03723X/1 (Doctoral Training Centre in Media and Arts Technology).
References
1. C. Cadoz. Supra-instrumental interactions and gestures. Journal of New Music Research, 38(3):215–230, 2009.
2. C. Cadoz et al. Tangibility, presence, materiality, reality
in artistic creation with digital technology. In ICMC/SMC,
pages 754–761, 2014.
3. S. Emmerson. Living electronic music. Ashgate
Publishing, Ltd., 2013.
4. C. Fritz and J. Poitevineau. Influence of vibrotactile
feedback on some perceptual features of violins. The
Journal of the Acoustical Society of America,
136(2):910–921, 2014.
5. R. Fulford, J. Ginsborg, and J. Goldbart. Learning not to
listen: the experiences of musicians with hearing
impairments. Music Education Research, 13(4), 2011.
6. A. Gallace and C. Spence. In touch with the future: The
sense of touch from cognitive neuroscience to virtual
reality. OUP Oxford, 2014.
7. W. Goebl and C. Palmer. Tactile feedback and timing
accuracy in piano performance. Experimental Brain
Research, 186(3):471–479, 2008.
8. C. Ho and C. Spence. Affective multisensory driver
interface design. International Journal of Vehicle Noise
and Vibration, 9(1-2):61–74, 2013.
9. M. S. Horn. The role of cultural forms in tangible
interaction design. In Proc. TEI’13, pages 117–124, 2013.
10. R. H. Jack, A. McPherson, and T. Stockman. Designing
tactile musical devices with and for deaf users: a case
study. In Proc. ICMEM Sheffield, 2015.
11. R. H. Jack, T. Stockman, and A. McPherson. Effect of
latency on performer interaction and subjective quality
assessment of a digital musical instrument. In Proc.
Audio Mostly, 2016.
12. R. H. Jack, T. Stockman, and A. McPherson. Navigation
of pitch space on a digital musical instrument with
dynamic tactile feedback. In Proc. TEI’16, 2016.
13. A. R. Jensenius. Action-sound: Developing methods and tools to study music-related body movement. 2007.
14. L. E. Krueger. Tactual perception in historical perspective: David Katz’s world of touch. Tactual Perception: A Sourcebook, pages 1–55, 1982.
15. M. Leman. Embodied music cognition and mediation technology. MIT Press, 2008.
16. A. McPherson, R. H. Jack, and G. Moro. Action-sound
latency: Are our tools fast enough? In Proc. NIME, 2016.
17. E. R. Miranda and M. M. Wanderley. New digital musical instruments: control and interaction beyond the keyboard, volume 21. A-R Editions, Inc., 2006.
18. P. Rojas. To become one with the instrument: The
unfolding of a musical topography. Culture &
Psychology, 21(2):207–230, 2015.
19. C. Saitis et al. Perceptual evaluation of violins: A
quantitative analysis of preference judgments by
experienced players. The Journal of the Acoustical
Society of America, 132(6):4002–4012, 2012.
20. O. Shaer and R. J. Jacob. A specification paradigm for the design and implementation of tangible user interfaces. ACM Transactions on Computer-Human Interaction (TOCHI), 16(4):20, 2009.
21. D. Wessel and M. Wright. Problems and prospects for
intimate musical control of computers. Computer Music
Journal, 26(3):11–22, 2002.
Graduate Student Consortium
TEI 2017, March 20–23, 2017, Yokohama, Japan
... The controllers were clumsy, and it is likely that ML could have better affordances with gestural control but less immersion with a loss of haptic feedback. "Expression in musical practice is inextricably tied to the touch of the performer" [24] To synthesise touch response, the appropriate GIVME would be to apply haptic feedback-based solutions like the Hapring. [25]. ...
... Andersen and Ward highlight how this experimentation relates to current state-of-the-art developments in Tangible, Embedded and Embodied Interfaces (TEI), and in some cases surpasses it. Shared research between NIME and TEI is increasingly happening, and the research presented in this thesis is representative of such a cross-over with elements of this thesis being presented at TEI 2016 [155], TEI 2017 [156], NIME 2016 [239] and NIME 2018 [159], and part of the work this thesis aims to do is to bridge discourses in both fields. ...
Full-text available
The sense of touch plays a fundamental role in musical performance: alongside hearing, it is the primary sensory modality used when interacting with musical instruments. Learning to play a musical instrument is one of the most developed haptic cultural practices, and within acoustic musical practice at large, the importance of touch and its close relationship to virtuosity and expression is well recognised. With digital musical instruments (DMIs) – instruments involving a combination of sensors and a digital sound engine – touch-mediated interaction remains the foremost means of control, but the interfaces of such instruments do not yet engage with the full spectrum of sensorimotor capabilities of a performer. This poses compelling questions for digital instrument design: how does the nuance and richness of physical interaction with an instrument manifest itself in the digital domain? Which design parameters are most important for haptic experience, and how do these parameters affect musical performance? Built around three practical studies which utilise DMIs as technology probes, this thesis addresses these questions from the point of view of design, of empirical musicology, and of tangible computing. In the first study musicians played a DMI with continuous pitch control and vibrotactile feedback in order to understand how dynamic tactile feedback can be implemented and how it influences musician experience and performance. The results suggest that certain vibrotactile feedback conditions can increase musicians’ tuning accuracy, but also disrupt temporal performance. The second study examines the influence of asynchronies between audio and haptic feedback. Two groups of musicians, amateurs and professional percussionists, were tasked with performing on a percussive DMI with variable action-sound latency. Differences between the two groups in terms of temporal accuracy and quality judgements illustrate the complex effects of asynchronous multimodal feedback. 
In the third study guitar-derivative DMIs with variable levels of control richness were observed with non-musicians and guitarists. The results from this study help clarify the relationship between tangible design factors, sensorimotor expertise and instrument behaviour. This thesis introduces a descriptive model of performer-instrument interaction, the projection model, which unites the design investigations from each study and provides a series of reflections and suggestions on the role of touch in DMI design.
Conference Paper
Full-text available
The importance of low and consistent latency in interactive music systems is well-established. So how do commonly-used tools for creating digital musical instruments and other tangible interfaces perform in terms of latency from user action to sound output? This paper examines several common configurations where a microcontroller (e.g. Arduino) or wireless device communicates with computer-based sound generator (e.g. Max/MSP, Pd). We find that, perhaps surprisingly, almost none of the tested configurations meet generally-accepted guidelines for latency and jitter. To address this limitation, the paper presents a new embedded platform, Bela, which is capable of complex audio and sensor processing at submillisecond latency.
Conference Paper
Full-text available
We present a study investigating the impact of dynamic tactile feedback on performer navigation of a continuous pitch space on a digital musical instrument. Ten musicians performed a series of blind pitch selection and melodic tasks on a self- contained digital musical instrument with audio-frequency tactile feedback that was generated in response to their inter- action. Results from the study show that tactile feedback can positively impact a performer’s ability to play in tune when the instrument is hidden from sight, however with a temporal impact on performance. Furthermore, several playing techniques were observed that emerged from the performer’s engagement with the tactile feedback conditions. We discuss the implications of our findings in the context of tangible interface design and non-visual interface navigation. We also discuss how our implementation suggests guidelines for future instruments and interfaces incorporating dynamic tactile feedback and present a novel tactile feedback technique that uses tactile ‘beating’.
Conference Paper
Full-text available
When designing digital musical instruments the importance of low and consistent action-to-sound latency is widely accepted. This paper investigates the effects of latency (0- 20ms) on instrument quality evaluation and performer inter- action. We present findings from an experiment conducted with musicians who performed on an percussive digital musical instrument with variable amounts of latency. Three latency conditions were tested against a zero latency condition, 10ms, 20ms and 10ms ± 3ms jitter. The zero latency condition was significantly rated more positively than the 10ms with jitter and 20ms latency conditions in six quality measures, emphasising the importance of not only low, but stable latency in digital musical instruments. There was no significant difference in rating between the zero latency condition and 10ms condition. A quantitative analysis of timing accuracy in a metronome task under latency conditions showed no significant difference in mean synchronisation error. This suggests that the 20ms and 10ms with jitter latency conditions degrade subjective impressions of an instrument, but without significantly affecting the timing performance of our participants. These findings are discussed in terms of control intimacy and instrument transparency.
Conference Paper
Full-text available
The democratization of Computer Arts and Computer Music has, due to dematerialization(virtualization) con-sequence of digital technologies, considerably widenedthe boundaries of creativity. As we are now entering a second phase that has been labeled “post-digital”, we are calledto reconcile this openness with notions such as embodiment, presence, enaction and tangibility.These notions are in our view inherently linked to creativity. Here we outline some approaches to this problem under development within the “European Art-Science-Technology Network” (EASTN1). Several areas of artis-tic creation are represented (Music, Animation, Multi-sensory Arts, Architecture, Fine Arts, Graphic communi-cation, etc.). A main objective of this network is to estab-lish common grounds through collaborative reflection and work on the above notions, using the concept of tangibilityas a focal point. In this paper we describe several different approaches to the tangibility, in relation to concepts such as reality, materiality, objectivity, pres-ence, concreteness, etc. and their antonyms. Our objec-tive is to open a debate on tangibility, in the belief that it has a strong unifying potential but is also at the same time presents challenging and difficult to define. Here we present some initial thoughts on this topic in a first effort to bring together the approaches that arise from the dif-ferent practices and projects developed within the partner institutions involved in the EASTN network.
The present article advances the notion of musical topography to describe the engagement between a practitioner and the musical instrument, emphasizing its developmental character. From the point of view of semiotic anthropology, it is suggested that the development of such a practical engagement is guided by expressivity, and that the instrument appears not only as an extension of the body, but participates in the generation of a unitary field, where bodily motion, the instrument, and the tonal space are intertwined. The development of lived musical practice draws its force from a situated tradition that consists of normative, structural, and stylistic elements, and of a constellation of genres and values shaped and reshaped by generations of practitioners. Finally, it is emphasized that the notion of musical topography brings back to musical praxis its long neglected imaginative dimension.
Drawing on recent ideas that explore new environments and the changing situations of composition and performance, Simon Emmerson provides a significant contribution to the study of contemporary music, bridging history, aesthetics and the ideas behind evolving performance practices. Whether created in a studio or performed on stage, how does electronic music reflect what is live and living? What is it to perform 'live' in the age of the laptop? Many performer-composers draw upon a 'library' of materials, some created beforehand in a studio, some coded 'on the fly', others 'plundered' from the widest possible range of sources. But others refuse to abandon traditionally 'created and structured' electroacoustic work. Lying behind this maelstrom of activity is the perennial relationship to 'theory', that is, ideas, principles and practices that somehow lie behind composers' and performers' actions. Some composers claim they just 'respond' to sound and compose 'with their ears', while others use models and analogies of previously 'non-musical' processes. It is evident that in such new musical practices the human body has a new relationship to the sound. There is a historical dimension to this, for since the earliest electroacoustic experiments in 1948 the body has been celebrated or sublimated in a strange 'dance' of forces in which it has never quite gone away but rarely been overtly present. The relationship of the performing body to the spaces around it has also undergone a revolution as the source of sound production has shifted to the loudspeaker. Emmerson considers these issues in the framework of our increasingly 'acousmatic' world in which we cannot see the source of the sounds we hear.
Digital media handles music as encoded physical energy, but humans consider music in terms of beliefs, intentions, interpretations, experiences, evaluations, and significations. In this book, drawing on work in computer science, psychology, brain science, and musicology, Marc Leman proposes an embodied cognition approach to music research that will help bridge this gap. Assuming that the body plays a central role in all musical activities, and basing his approach on a hypothesis about the relationship between musical experience (mind) and sound energy (matter), Leman proposes that the human body is a biologically designed mediator that transfers physical energy to a mental level--engaging experiences, values, and intentions--and, reversing the process, transfers mental representation into material form. He suggests that this idea of the body as mediator offers a promising framework for thinking about music mediation technology. Leman argues that, under certain conditions, the natural mediator (the body) can be extended with artificial technology-based mediators. He explores the necessary conditions and analyzes ways in which they can be studied. Leman outlines his theory of embodied music cognition, introducing a model that describes the relationship between a human subject and its environment, analyzing the coupling of action and perception, and exploring different degrees of the body's engagement with music. He then examines possible applications in two core areas: interaction with music instruments and music search and retrieval in a database or digital library. The embodied music cognition approach, Leman argues, can help us develop tools that integrate artistic expression and contemporary technology.
This paper investigated how auditory and vibrotactile feedback information is integrated within the context of violin quality evaluation. Fifteen violinists evaluated three violins on four criteria: "Rich Sound," "Loud and Powerful," "Alive and Responsive," and "Pleasure" during a perceptual experiment. Violinists first evaluated the violins one at a time under three experimental conditions: (1) playing the violin, (2) listening to it (played by a professional player) in an active way by fingering the score on an isolated neck, and (3) the same as (2) with vibrotactile feedback provided at the isolated neck. Violinists were then asked to evaluate the violins through pairwise comparisons under condition (3): each violin was paired with itself while the level of vibration of the isolated neck was either the original one or divided by two. The first part of the experiment demonstrated that Loud and Powerful judgments were affected by the presence of vibrations, given that violins were rated louder in condition (3) than in (2). In the second part, violins were rated more positively with the original vibration level at the isolated neck than with half the level, for all criteria but Alive and Responsive. Consistent with sensory interaction, the magnitude of the enhancement remained relatively constant across violins.
The last decade has seen a surge of interest in the development of affective driver interfaces designed to enhance driver safety and convey useful information to the driver. These technological advances have brought about changes in the ambient driving environment and clearly have the potential to enhance the driving experience in the years to come. This review provides an overview of existing research approaches to the study of these innovations. Other possible psychophysiological approaches to affective driver interface design, such as via mood induction procedures, are also discussed. Finally, we highlight the likely impact of the latest findings on the topic of multisensory integration research for the future design of affective driver interfaces. In particular, we look at how multisensory driver interfaces may evolve in the future, and assess the likely impact of multisensory warning signals that have been designed specifically to trigger the brain's defensive circuits.
I suggest an approach to tangible interaction design that builds on social and cultural foundations. Specifically, I propose that designers can evoke cultural forms as a means to tap into users' existing cognitive, physical, and emotional resources. The emphasis is less on improving the usability of an interface and more on improving the overall experience around an interactive artifact by cueing productive patterns of social activity. My use of the term cultural form is derived from the work of Geoffrey Saxe and his form-function shift framework. This framework describes a process through which individuals appropriate cultural forms and restructure them to serve new functions in light of shifting goals and expectations. I describe Saxe's framework and then illustrate the use of cultural forms in design with three examples.