A User Experience Review of Music Interaction
Evaluations
Dom Brown
University of the West of
England
Bristol, UK
dom.brown@uwe.ac.uk
Chris Nash
University of the West of
England
Bristol, UK
chris.nash@uwe.ac.uk
Tom Mitchell
University of the West of
England
Bristol, UK
tom.mitchell@uwe.ac.uk
ABSTRACT
The need for thorough evaluations is an emerging area of
interest and importance in music interaction research. As a large degree of DMI evaluation is concerned with exploring the subjective experience (ergonomics, action-sound mappings and control intimacy), User Experience (UX) methods
are increasingly being utilised to analyse an individual’s ex-
perience of new musical instruments, from which we can
extract meaningful, robust findings and subsequently gen-
eralised and useful recommendations. However, many music
interaction evaluations remain informal. In this paper, we
provide a meta-review of 132 papers from the 2014 – 2016
proceedings of the NIME, SMC and ICMC conferences to
collate the aspects of UX research that are already present
in music interaction literature, and to highlight methods
from UX’s widening field of research that have not yet been
explored. Our findings show that usability and aesthetics
are the primary focus of evaluations in music interaction re-
search, and other important components of the user expe-
rience such as enchantment, motivation and frustration are
frequently, if not always, overlooked. We argue that these
factors are prime areas for future research in the field and
their consideration in design and evaluation could lead to a
better understanding of NIMEs and other computer music
technology.
Author Keywords
Evaluation, Methods, Meta-Analysis, User Experience
(UX), Human Computer Interaction (HCI), User Studies.
ACM Classification
A.1 [Introductory and Survey]; H.5.5 [Information Interfaces and Presentation] Sound and Music Computing—Methodologies and Techniques; H.5.2 [Information Interfaces and Presentation] User Interfaces—Evaluation/Methodology.
1. INTRODUCTION
Evaluation has become the subject of important discussion
and consideration in the NIME and wider computer mu-
sic community, and has been previously described as the
“holy grail of NIME research” [20]. Perhaps due to NIME’s
historic connection to SIGCHI, the field has often looked
to the Human-Computer Interaction (HCI) community for inspiration in evaluation methods and frameworks [22, 36]. While early NIME evaluation methods focused on task-based usability [36], there has been a shift towards a more subjective and experiential focus [16, 22, 34], inspired by User Experience (UX), a movement within HCI that focuses on the user's subjective experience of an interaction with technology [23, 28].

Figure 1: Qualities evaluated in NIME papers from the performer's perspective [1].
Despite this, many music interaction evaluations remain informal and do not adhere to any particular method, while the qualities evaluated have become diverse (Figure 1) [1]. However, NIME evaluations often correlate with themes of UX, with researchers studying phenomena like engagement [35], emotion [11] and interest [3] in a computer music context. Given the desire for generalisable results, the NIME community would benefit from evaluations that follow established methods.
Barbosa et al. [1] make a valuable contribution in their
analysis of NIME papers, and provide an overview that
highlights the diverse nature of evaluations used in NIME
research.
In contrast, we will analyse recent interaction evalua-
tions published at the NIME, SMC and ICMC conferences
from the perspective of UX, using an adaptation of the
QUOROM method used by Bargas-Avila and Hornbæk [2].
By doing so, we intend to provide a fresh perspective on
music interaction evaluations through the lens of a sepa-
rate, but very much related, discipline, examining recent
trends and identifying areas for future consideration in the
design and evaluation of NIMEs and other music interaction
technology.
2. BACKGROUND
Various evaluation methods have been put forward to eval-
uate digital musical instruments and other computer music
applications. The most notable of these is that of Wanderley and Orio [36], who borrow from HCI and suggest using musical tasks to quantitatively evaluate the usability of musical controllers.

Figure 2: The QUOROM procedure for this study.

While usability is an important aspect of
HCI research and a useful metric in some musical domains,
users of a musical technology are often more interested in
its capacity for expression, or whether it is engaging, en-
joyable, or rewarding over whether it is easy to use. As
such, there has been a move within the NIME community
away from usability as an evaluation target and towards
more subjective, experiential-based methods [34, 16, 22] in-
spired by UX, with the need for suitable evaluation metrics
becoming an important area of discussion [12].
3. OBJECTIVES
This paper seeks to provide an alternative perspective on
the music interaction evaluations taking place in the com-
puter music community by analysing recent literature from
NIME, SMC and ICMC involving empirical user-focused
evaluations. Our review will focus on:
1. The stakeholders considered in the evaluations.
2. The dimensions of UX that are evaluated.
3. What participant tasks are used.
4. How data is collected.
Because we study these papers through a UX perspective, es-
tablished criteria in the field such as virtuosity and trans-
parency [14] are not explicitly considered, as they do not fit
within the definitions of the UX dimensions.
4. METHODOLOGY
Our method is drawn from an adaptation of the QUOROM
method [29] used by Bargas-Avila and Hornbæk [2]. We
have filtered our corpus as follows:
1. Identify sources.
Source Selection: Conference proceedings of New In-
terfaces for Musical Expression (NIME), Sound and
Music Computing (SMC), and the International Com-
puter Music Conference (ICMC) for 2014 – 2016 (N
= 862).
2. Find appropriate publications.
Screening criteria: Papers that mention an empirical user study in the title or abstract, using the keywords Evaluat[e,ion,ed,ing], User, Study (N = 147); a keyword filter along these lines is sketched after this list.
3. Publications retrieved for detailed evaluation.
Screening Criteria: Papers whose evaluation focuses on the user's experience (for example, papers that use musical Turing tests are omitted) (N = 132).
4. Final Corpus.
The final corpus for our meta-analysis consisted of 132
papers.
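As a rough illustration of the keyword screening in step 2 above, the sketch below applies a title/abstract filter of that kind to a list of paper records. The `papers` list and its fields are hypothetical stand-ins for records parsed from the proceedings, and the regular expression is only an approximation of the stated criteria; the exact matching rules used in this review are not reproduced here.

```python
import re

# Hypothetical paper records; in practice these would be parsed from the
# NIME, SMC and ICMC proceedings (N = 862 candidate papers).
papers = [
    {"title": "Evaluating a Gestural Controller", "abstract": "We report a user study ..."},
    {"title": "A New Synthesis Engine", "abstract": "We describe the DSP architecture ..."},
]

# Patterns approximating the screening keywords:
# Evaluat[e,ion,ed,ing], User, Study (matched case-insensitively).
KEYWORDS = re.compile(r"\b(evaluat(e|ion|ed|ing)|user|study)\b", re.IGNORECASE)

def mentions_user_study(paper):
    """Return True if the title or abstract contains any screening keyword."""
    text = paper["title"] + " " + paper["abstract"]
    return bool(KEYWORDS.search(text))

screened = [p for p in papers if mentions_user_study(p)]
print(f"{len(screened)} of {len(papers)} papers pass the keyword screen")
```

Papers passing such a screen would still be checked manually against step 3 (an evaluation focused on the user's experience) before entering the final corpus.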
For the UX dimensions, we chose to use similar dimen-
sions to those found by Bargas-Avila and Hornbæk [2] in
their meta-analysis to be prominent aspects evaluated in
the UX field. We also decided to note when papers focus
on usability, allowing us to compare its use against the di-
mensions of UX. We have provided the definitions used in
categorising the corpus.
1. Usability Evaluations cover concepts such as ease of
use, effectiveness, ergonomics, efficiency and learn-
ability [30].
2. Generic UX Evaluations take a holistic approach
and seek to explore the participants’ experiences as
a whole, without focusing on any specific dimensions.
3. Aesthetics Evaluations focus on the aesthetic, artis-
tic properties of the experience [24], such as appeal,
taste, style, and expression [9].
4. Emotion Evaluations measure the emotional re-
sponse and feelings of participants.
5. Enchantment Evaluations focus on the affective at-
tachment of people to technology [25].
6. Engagement Evaluations study flow [8], intrinsic in-
terest and curiosity [7].
7. Enjoyment Evaluations focus on the hedonic quali-
ties of interaction [6].
8. Motivation Evaluations focus on what drives a par-
ticipant’s decisions and behaviour [13].
9. Frustration Evaluations focus on the participant’s
dislikes and hindrances during an interaction [27].
In a similar manner to Barbosa et al. [1], we have identi-
fied the stakeholders in each evaluation, using the following
categories:
1. Performers Participants with agency, actively affect-
ing their experience of real-time auditory interaction.
2. Audiences Participants without agency in the eval-
uation, passively involved in the experience.
3. Designers Participants with agency in evaluations
that involve creating or designing hardware or soft-
ware.
4. Composers Participants with agency in evaluations
that involve composing or creating artistic material,
but not performing.
We feel that it is important to give these definitions as
some of the evaluations do not follow a traditional perfor-
mance framework, for example in [17], where each partici-
pant is asked to play an auditory game. As the participant
is actively engaging in a task, they have been categorised as
a performer.
5. RESULTS
Because our analysis took place before the ICMC 2016 proceedings were available, and because of the small number of relevant ICMC 2015 papers (N = 6) and the joint ICMC/SMC conference of 2014, we decided to group the ICMC and SMC papers together in our analysis. The breakdown of papers used in the analysis is as follows (the counts are tallied in the sketch after the list):
NIME 2014: 28
NIME 2015: 19
NIME 2016: 15
ICMC/SMC 2014: 32
ICMC 2015: 6
SMC 2015: 16
ICMC 2016: N/A
SMC 2016: 16
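As a simple check on this breakdown and the grouping described above, the per-venue counts can be tallied directly; the figures below simply restate the list and confirm the final corpus size.

```python
# Papers per proceedings, as listed above (ICMC 2016 was unavailable).
counts = {
    "NIME 2014": 28, "NIME 2015": 19, "NIME 2016": 15,
    "ICMC/SMC 2014": 32, "ICMC 2015": 6, "SMC 2015": 16, "SMC 2016": 16,
}

nime_total = sum(v for k, v in counts.items() if k.startswith("NIME"))
icmc_smc_total = sum(v for k, v in counts.items() if not k.startswith("NIME"))

print(nime_total)                    # 62 NIME papers
print(icmc_smc_total)                # 70 ICMC/SMC papers, grouped in the analysis
print(nime_total + icmc_smc_total)   # 132 papers in the final corpus
```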
Our analysis was non-exclusive, with some evaluations covering more than one UX dimension, data collection method, stakeholder or participant task. If a publication included more than one evaluation, they were recorded as separate results (a sketch of such a coding record follows the two lists below). After our analysis, we identified the following categories for the participant tasks:
1. Specific Task Participants are asked to perform a
pre-determined exercise, such as listening to an auditory stimulus or performing certain tasks with an instrument.
2. Open Exploration Participants are free to do as
they please during an interaction.
3. Guided Exploration Participants have some free-
dom, but are guided by certain constraints.
4. Watch Performance Participants watch a perfor-
mance given by a musician, in either a concert or lab-
oratory setting (e.g. watching a video).
5. Prepare and/or Give Performance Participants
are asked to prepare a piece and give a performance
as part of the evaluation.
6. Workshop Participants’ interactions take place in a
workshop setting.
7. In The World Use Participants use the technology
in their own personal environments.
8. Other Any other task that does not fit in the above
categories.
As well as data collection methods:
1. Questionnaires Specific questions used to gather re-
sponses.
2. Likert Scales Questionnaires that use the Likert format.
3. Comparisons Participants are asked to compare stimuli and give ratings, perform pair-wise comparisons, and the like.
4. Interviews Either structured or unstructured.
5. Field Notes Observations are taken by researchers
during the evaluation.
6. Audio/Video Recording Recordings of the experiment are used in the analysis.
7. Interaction Log The user’s interaction with an in-
terface is logged.
8. Open/Informal Comments Unstructured feedback
is provided.
9. Created Materials Things made by participants
during the evaluation are analysed, e.g. [26].
10. Physiological Measurements Methods such as
EEG, ECG and the like are used to record a participant's physiological responses.
11. Other Any other method that does not fit in the
above categories.
12. NS The data collection method is not specified.
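To make the non-exclusive coding concrete, the sketch below shows one possible way of representing a coded evaluation and of tallying percentages over codes rather than over papers. The class, its field names and the example record are hypothetical illustrations of the scheme described above, not the actual coding sheet used in the review.

```python
from collections import Counter
from dataclasses import dataclass, field

@dataclass
class EvaluationCoding:
    """One evaluation from one paper; every field is a non-exclusive set of codes."""
    venue: str
    stakeholders: set = field(default_factory=set)      # e.g. {"Performer", "Audience"}
    dimensions: set = field(default_factory=set)        # e.g. {"Usability", "Engagement"}
    tasks: set = field(default_factory=set)             # e.g. {"Open Exploration"}
    data_collection: set = field(default_factory=set)   # e.g. {"Interview", "Interaction Log"}

# A single hypothetical coded evaluation; a paper with two evaluations
# would contribute two separate records, as described above.
codings = [
    EvaluationCoding(
        venue="NIME 2015",
        stakeholders={"Performer"},
        dimensions={"Usability", "Engagement"},
        tasks={"Open Exploration"},
        data_collection={"Interview", "Interaction Log"},
    ),
]

# In this sketch, because coding is non-exclusive, percentages are taken over
# the total number of codes in a category, not over the number of papers.
stakeholder_counts = Counter(s for c in codings for s in c.stakeholders)
total = sum(stakeholder_counts.values())
for label, n in stakeholder_counts.most_common():
    print(f"{label}: {100 * n / total:.1f}%")
```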
5.1 Stakeholders
The most popular stakeholder used in evaluations was the
performer (50.7%), followed by the audience (39.3%). De-
signers' (3.3%) and composers' (6.7%) perspectives were
rarely evaluated. While it has been suggested that perform-
ers are the most important stakeholders in digital music [4],
our results suggest that the perspectives of designers and
composers could be better represented during evaluations,
as these perspectives may reveal aspects of musical interac-
tions that have previously been overlooked.
Our stakeholder results are quite different from those of [1], who report Performers: 52, Designers: 28 and Audience: 20. We believe this is because of their inclusion of technical evaluations as evaluations from the designer's perspective. Since our focus is on evaluations with participants, our designer count is low, as a designer's subjective experience is not usually solicited during technical evaluations.

Figure 3: Stakeholders
5.2 UX Dimensions
Our results indicate that although UX concepts are be-
ing applied in computer music research, usability remains
a popular metric in NIME papers (21.7%), while within
ICMC and SMC, the largest proportion was not applica-
ble to dimensions of UX, for example [15], in which an au-
dience’s perception of vibro-tactile feedback is measured. A
high number of not-applicable papers is to be expected, and
is most likely due to the fact that empirical evaluations in
computer music research do not always share the same tar-
gets as UX research, and so a large number of papers will
not fit within our scope.
Of the dimensions of UX, aesthetics is the most commonly
used (19.4%), followed by generic UX (13.7%) and engage-
ment (10.9%). This reflects the literature of the field, which
highlights the importance of expression [10], style [18, 19]
and engagement [37] in computer music research. Generic
UX papers often included evaluations with less formal structures, such as [21], in which a group of children evaluate a museum experience through open exploration and a group interview; such studies reflect the ideas of Stowell et al. [34] and their proposed qualitative method.
Emotion and enjoyment were evaluated in relatively equal
measure (9.1%), but emotion evaluations in ICMC/SMC
occurred only from the audience’s perspective.
Interestingly, three dimensions (motivation, enchantment and frustration) were evaluated either rarely or not at all. This suggests that these are areas of UX that are currently overlooked in music interaction, and they represent an opportunity for new directions in research. For example, studying how musicians become affectively attached to an instrument may help us understand the long-term uptake of NIMEs, while studying motivation may allow us to explore the appeal of NIMEs over traditional instruments.
Although frustration is often linked to measurements of
user error used in usability studies, in UX, frustration repre-
sents a qualitative exploration of negative aspects of a user’s
experience, for example in [5], and its study could help the
computer music community identify areas for improvement
in the design of NIMEs and interaction technology.
5.2.1 Performers
From the performer’s perspective, usability was found to
be the most prominent dimension (29.4%), followed by
generic UX (18.8%), while aesthetics, engagement and enjoyment share a similar proportion (10.5%). While NIME and ICMC/SMC have different quantities of performer evaluations, they have a similar spread of evaluation dimensions, with usability being the most popular.

Figure 4: UX Dimensions
Usability remains prominent most probably because of its
close relation to ideas of learnability and playability, which
are important concepts in NIME and computer music research.
5.2.2 Audience
Aesthetics was the most prominent dimension from the au-
dience’s perspective, in both ICMC/SMC and NIME. Inter-
estingly, emotion was commonly studied within SMC and
ICMC, while it was rare within NIME evaluations. Con-
versely, NIME often focused on engagement and enjoyment
while ICMC/SMC evaluations rarely did so.
5.3 Participant Tasks
Overwhelmingly, the most popular participant tasks were
specific tasks (53.1%), which make up the majority of
ICMC/SMC evaluations. Meanwhile, NIME evaluations
use specific tasks and open exploration in equal measure.
The other tasks were used much less frequently.
Interestingly, NIME evaluations include watching perfor-
mances much more than ICMC/SMC. This could be due to
NIME’s focus on instruments, which suit audience evalua-
tion through performance.
When filtered by UX dimension, it is interesting to observe that while specific tasks are the most popular task for most dimensions, open exploration is the most popular for generic UX. This reflects the dimension's less focused approach, in that via open exploration, any aspect of the interaction may be explored by participants. Similarly, "in the world" use appears mostly in generic UX evaluations, as this technique also encourages an open response from participants.

Figure 5: Participant Tasks
Meanwhile, emotion is studied nearly exclusively using
specific tasks, with evaluations often asking audience par-
ticipants to report on their emotions after listening to musi-
cal stimuli. The dimensions of aesthetics, engagement and
enjoyment are each studied using a wide range of tasks,
but most prominently specific tasks, open exploration and
watching performances.
5.4 Data Collection
The most popular method of collecting data was by ques-
tionnaire (24.6%), and our results reflect those of [1]. Due to
their prominence, questionnaires formatted as Likert scales
were included in their own category (12.0%). Question-
naires most likely remain a popular technique as they allow evaluations to focus on specific aspects and to quantitatively analyse otherwise qualitative elements of an interaction.
Interaction logs are used mainly to measure usability.
This reflects the evaluation technique of Wanderley and
Orio [36], as well as Kiefer et al. [22], which use inter-
action logs to provide quantitative data for usability mea-
surements.
Interviews and field notes were mostly used to measure generic UX, while questionnaires were rarely used for this dimension. This also
reflects the open nature of the dimension, as interviews and
field notes do not limit a participant’s response.
Comparisons, such as pair-wise comparisons and prefer-
ence ranking, are most commonly used to measure aesthetic
qualities.
While we found that emotional responses are elicited us-
ing specific tasks, they are collected using a wide variety of
methods, including specific emotion measurement tools like
the Self-Assessment Manikin (SAM).
Figure 6: Data Collection
6. DISCUSSION
Our results indicate that there is a strong correlation be-
tween UX and the evaluation criteria used in computer mu-
sic research. However, usability remains the most promi-
nent idea from HCI used in the field, despite efforts to move
the field towards UX theories and principles.
We have found three common dimensions in UX research:
motivation, enchantment and frustration; that are evalu-
ated rarely or not at all in computer music interactions.
These areas could help to address key questions regarding
digital musical instruments, and help us to better under-
stand the nature of the instruments and technologies we
create. For example, looking at enchantment and the way
in which musicians become emotionally attached to DMIs
may help us to understand how short-term experimenters
become long-term practitioners; understanding what moti-
vates and influences musicians to choose DMIs could en-
able us to design in ways that encourage new players; and
studying frustration in DMIs could help us to design more
enjoyable and engaging music interaction experiences.
While these dimensions are inherently very different from
each other, they share a strongly qualitative nature. Previous examples of their use in the HCI literature employ descriptive case studies [32] and highlight the need for "rich personal
accounts” [33]. This more qualitative perspective is also
shared with much research in the NIME community, and
highlights the growing trend in both UX and music inter-
action towards deeper explorations of a user’s subjective
experience, as well as the potential ease with which these di-
mensions could be adopted into music interaction research.
As well as our UX dimension findings, we have found
that specific tasks are the most popular participant task
used in evaluations, and data is most commonly collected
through questionnaires. While these are tried and tested
methods, this indicates that there is room within computer
music evaluations for the use of alternative methods, which
may help us to evaluate our technologies more thoroughly.
For example, studying how musicians use instruments in
their own personal environments (“in the wild”) allows us
to better examine their creative process, as it is difficult to
capture this in laboratory environments [16].
Similarly, the tasks of watching and preparing for a per-
formance reflect real world use cases for musical technology,
and we can learn much from studying the dynamics behind
these processes. As every evaluation needs to be tailored
to the specific goals and needs of the research in question
[31], a full discussion of how our findings should affect fu-
ture evaluations is beyond the scope of this paper, and is
an area for future exploration.
Our analysis may have benefited from delineating be-
tween individual and group stakeholders, which would have
provided a deeper insight into the user experience of multi-
user interactions, such as collaborative installations. Also,
breaking specific tasks into subcategories (for example into
listening exercises and performance tasks) would have al-
lowed for more detailed analysis of participant tasks.
By reviewing which areas of UX are commonly evalu-
ated in music interaction research and which are overlooked,
alongside the participant tasks and data collection methods
used, we have provided a new perspective on the interaction
evaluations taking place, and revealed alternative qualities
to be considered in future NIME research.
7. CONCLUSION
In this paper we have found that usability and aesthetics
are commonly used in evaluations of interaction in the com-
puter music field, while three areas of UX (motivation, enchantment and frustration) are often overlooked in current
interaction evaluations, and represent potential avenues for
future research. As well as this, we have found that ques-
tionnaires are the most popular method of data collection,
and specific tasks are the most common participant tasks.
Future work will include the analysis of earlier years of
NIME, SMC and ICMC to reveal how evaluations have
evolved over time, as well as exploring how these findings
may be applied to future evaluation methods.
8. REFERENCES
[1] J. Barbosa, J. Malloch, M. M. Wanderley, and
S. Huot. What does “evaluation” mean for the nime
community? In Proc. of NIME 2015, pages 156–161,
Baton Rouge, LA, USA, June 2015.
[2] J. A. Bargas-Avila and K. Hornbæk. Old wine in new
bottles or novel challenges: a critical analysis of
empirical studies of user experience. In Proc. of the
SIGCHI Conference on Human Factors in Computing
Systems 2011, pages 2689–2698. ACM, 2011.
[3] S. A. Bin, N. Bryan-Kinns, and A. P. McPherson.
Skip the pre-concert demo: How technical familiarity
and musical style affect audience response. In Proc. of
NIME 2016, pages 200–205, Brisbane, Australia, July
2016.
[4] D. Birnbaum, R. Fiebrink, J. Malloch, and M. M.
Wanderley. Towards a dimension space for musical
devices. In Proc. of NIME 2005, pages 192–195,
Vancouver, Canada, May 2005.
[5] M. Blythe, J. Reid, P. Wright, and E. Geelhoed.
Interdisciplinary criticism: analysing the experience of
riot! a location-sensitive digital narrative. Behaviour
& Information Technology, 25(2):127–139, 2006.
[6] M. A. Blythe, K. Overbeeke, A. F. Monk, and P. C.
Wright. Funology: from usability to enjoyment.
Springer Science & Business Media, 3rd edition, 2004.
[7] P. Chapman, S. Selvarajah, and J. Webster.
Engagement in multimedia training systems. In Proc.
of the 32nd Annual Hawaii International Conference
on Systems Sciences, pages 9–17, Jan 1999.
[8] M. Csikszentmihalyi. Creativity: Flow and the
Psychology of Discovery and Invention. New York:
Harper Collins, 1996.
[9] A. C. Danto. The transfiguration of the commonplace:
a philosophy of art. Harvard University Press, 1981.
[10] C. Dobrian and D. Koppelman. The 'E' in NIME:
Musical expression with new computer interfaces. In
Proc. of NIME 2006, Paris, France, June 2006.
[11] J. Eaton, D. Williams, and E. R. Miranda. Affective
jukebox: A confirmatory study of EEG emotional
correlates in response to musical stimuli. In Proc. of
ICMC/SMC 2014, Athens, Greece, September 2014.
[12] D. El-Shimy and J. R. Cooperstock. User-driven
techniques for the design and evaluation of new
musical interfaces. Computer Music Journal,
40(2):35–46, 2016.
[13] P. Evans. Motivation. Psychology Press, New York,
NY, USA, 2015 edition, 1975.
[14] S. Fels, A. Gadd, and A. Mulder. Mapping
transparency through metaphor: Towards more
expressive musical instruments. Organised Sound,
7(2):109–126, 2002.
[15] F. Fontana, I. Camponogara, P. Cesari, M. Vallicella,
and M. Ruzzenente. An exploration on whole-body
and foot-based vibrotactile sensitivity to melodic
consonance. In Proc. of SMC 2016, Hamburg,
Germany, September 2016.
[16] S. Gelineck and S. Serafin. Longitudinal evaluation of
the integration of digital musical instruments into
existing compositional work processes. Journal of
New Music Research, 41(3):259–276, 2012.
[17] F. Grani, R. Paisa, J. S. Banas, I. Vogiatzoglou, and
S. Serafin. Design and evaluation of a gesture driven
wavefield synthesis auditory game. In Proc. of NIME
2016, Brisbane, Australia, 2016.
[18] M. Gurevich, P. Stapleton, and A. Marquez-Borbon.
Style and constraint in electronic musical instruments.
In Proc. of NIME 2010, Sydney, Australia, June 2010.
[19] S. Jordà. Digital instruments and players: Part
II – Diversity, freedom and control. In Proc. ICMC 2004,
pages 706–710, Miami, FL, USA, November 2004.
[20] S. Jordà. Digital Lutherie: Crafting musical computers
for new musics’ performance and improvisation. PhD
thesis, Universitat Pompeu Fabra, 2005.
[21] M. H. Jørgensen, A. S. Knudsen, T. M. Wilmot,
K. D. Lund, S. Serafin, and H. Purwins. A mobile
music museum experience for children. In Proc. of
NIME 2015, pages 36–37, Baton Rouge, LA, USA,
June 2015.
[22] C. Kiefer, N. Collins, and G. Fitzpatrick. HCI
methodology for evaluating musical controllers: A
case study. In Proc. of NIME 2008, pages 87–90,
Genova, Italy, June 2008.
[23] E. L.-C. Law, V. Roto, M. Hassenzahl, A. P.
Vermeeren, and J. Kort. Understanding, scoping and
defining user experience: A survey approach. In Proc.
of the SIGCHI Conference on Human Factors in
Computing Systems, CHI ’09, pages 719–728, New
York, NY, USA, 2009. ACM.
[24] M. Luhtala, I. Niemeläinen, J. Plomp, M. Turunen,
and J. Tuomisto. Studying aesthetics in a musical
interface design process through ‘aesthetic experience
prism’. In Proc. of NIME 2012, Ann Arbor, Michigan,
2012. University of Michigan.
[25] J. McCarthy, P. Wright, J. Wallace, and A. Dearden.
The experience of enchantment in human–computer
interaction. Personal and Ubiquitous Computing,
10(6):369–378, 2006.
[26] A. McPherson and V. Zappi. Exposing the scaffolding
of digital instruments with hardware-software
feedback loops. In Proc. of NIME 2015, pages
162–167, Baton Rouge, LA, USA, June 2015.
[27] V. Mendoza and D. G. Novick. Usability over time. In
Proc. of the 23rd Annual International Conference on
Design of Communication: Documenting & Designing
for Pervasive Information, pages 151–158, Coventry,
UK, September 2005. ACM.
[28] M. Minge. Dynamics of user experience. In Proc. of
the Workshop on Research Goals and Strategies for
Studying User Experience and Emotion, NordiCHI,
2008.
[29] D. Moher, D. J. Cook, S. Eastwood, I. Olkin,
D. Rennie, D. F. Stroup, and the QUOROM Group. Improving
the quality of reports of meta-analyses of randomised
controlled trials: The QUOROM statement. The Lancet,
354(9193):1896–1900, 1999.
[30] J. Nielsen. Usability Engineering. Academic Press,
Inc., Cambridge, MA, USA, 1993.
[31] S. O’Modhrain. A framework for the evaluation of
digital musical instruments. Computer Music Journal,
35(1):28–42, 2011.
[32] P. R. Ross, C. J. Overbeeke, S. A. G. Wensveen, and
C. M. Hummels. A designerly critique on
enchantment. Personal and Ubiquitous Computing,
12(5):359–371, 2008.
[33] P. Sengers, K. Boehner, M. Mateas, and G. Gay. The
disenchantment of affect. Personal and Ubiquitous
Computing, 12(5):347–358, 2008.
[34] D. Stowell, M. D. Plumbley, and N. Bryan-Kinns.
Discourse analysis evaluation method for expressive
musical interfaces. In Proc. of NIME 2008, pages
81–86, Genova, Italy, June 2008.
[35] K. Tahiroglu, J. C. Vasquez, and J. Kildal.
Non-intrusive counter-actions: Maintaining
progressively engaging interactions for music
performance. In Proc. of NIME 2016, pages 444–449,
Brisbane, Australia, July 2016.
[36] M. M. Wanderley and N. Orio. Evaluation of input
devices for musical expression: Borrowing tools from
HCI. Computer Music Journal, 26(3):62–76, 2002.
[37] D. Wessel and M. Wright. Problems and prospects for
intimate musical control of computers. Computer
Music Journal, 26(3):11–22, 2002.
... Research in HCI and NIME shows a trend towards experienced-focused over taskfocused evaluations [3][12] [13]. Experience-focused evaluations provide a more usercentric perspective on interaction with technology [14]. ...
... Evaluations were classified based on several characteristics. In addition to using classifications from [2] and [3] to produce comparable results, we identified evaluation types based on researchers' higher level evaluation goals. ...
... To be included in the pool, a paper was required to (1) present and (2) [2] and [3]. Keywords were selected based on the authors' initial review of the papers as well as visualizations presented in [2] and [3]. ...
... Within DMI interaction, there exist several perspectives with vested interest in design and performance. These stakeholders have been identified as designer, performer, composer, and audience (Brown et al., 2017;O'modhrain, 2011). While stakeholder requirements may not overlap, and may even compete, the multidisciplinary nature of DMI interaction means that musicians may be able to fluidly shift their concerns between these roles, simultaneously exemplifying multiple perspectives (Barbosa et al., 2015;Magnusson, 2009Magnusson, , 2010. ...
... Reviews of DMI evaluation (Barbosa et al., 2015;Brown et al., 2017;Reimer & Wanderley, 2021;Stowell et al., 2009) note several limitations of existing strategies, including idiosyncrasy, lack of formal structure, insufficient inclusion of detail necessary for replication, and absence of evaluations conducted over time (longitudinal studies). Longitudinal studies are of particular importance given the centrality of time in musical activity. ...
Chapter
This chapter focuses on different aspects of sound interaction ranging from how we hear, how we create, and the technologies that we use to interact with sound in different contexts. Three different contexts of uses are used to illustrate a wide range of users and areas of application. This includes interactive soundscape planning of public spaces, interacting with sound installation, and music interaction with digital music instruments. Each section highlights the needs of the users and discusses user experience evaluation, shining a light on the wide range of disciplines that sound interaction can have an impact on. This concludes with a discussion of current challenges and tips to support future innovation with sound.
... Cada vez más investigadores son conscientes de la necesidad de encontrar, definir o desarrollar nuevas formas de evaluar las interacciones y las interfaces tecnológicas musicales (O'Modhrain, 2011;Vasilakos, 2016;Yang & Lerch, 2020), algunas de ellas incluyendo la experiencia protagónica de usuario (Brown et al., 2017). También es un hecho la existencia de investigaciones que integran además aspectos conjuntos de la educación, la salud y el bienestar de los usuarios dentro de los procesos de diseño de las nuevas tecnologías musicales (Casal, 2000;Cerebello et al., 2019;Francis, 2018;Holland & Fiebrink, 2019;Knight, 2013;Krout et al., 1993;Mcgowan et al., 2021;Streeter, et al., 2012;Van Besouw et al., 2016). ...
... Por otra parte, los últimos incrementos en las publicaciones dentro del periodo de crisis que va del 2008 al 2018 (Lansac Colom, 2019;Ramus, 2019), desembocan en una caída drástica que refleja la actual crisis mundial. A partir de este hecho, y frente a una normalización del uso y la apropiación de las nuevas tecnologías cognitivas que está afectando a la ciencia, la educación, la salud y el bienestar, se puede afirmar que realizar evaluaciones exhaustivas es actualmente un área emergente de interés dentro de la investigación de la interacción musical tecnológica, especialmente si dichas evaluaciones exploran la experiencia subjetiva de cada usuario, con el fin de entender mejor una idea de usabilidad y ergonomía más amplia y flexible que integre otros componentes importantes de la experiencia de usuario, como el encanto, la motivación o la frustración, aspectos cognitivos que, junto con otros, suelen ser erróneamente soslayados en la mayor parte de las evaluaciones e investigaciones (Brown et al., 2017). ...
Article
Full-text available
Este artículo revisa el estado del arte de las investigaciones que han evaluado hasta el momento las interacciones entre los humanos y las tecnologías musicales. El objetivo es estudiar cuántas de las investigaciones existentes en este campo han evaluado la singular interacción proactiva, creativa y expresiva de los usuarios tecnológicos, y cuántas integran además aspectos relacionados con la educación, la salud y el bienestar, con el fin de potenciar un futuro enriquecimiento del trabajo colaborativo entre profesionales especializados dentro de los diseños de tecnología creativa musical, y los equipos educativos en los que se presencian este tipo de interacciones tecnológicas. La cuarta revolución industrial hipernormaliza progresivamente la creatividad, manteniendo una falsa y excesiva normalización social de su lógica funcional. Su paradigma tecnosimbólico afecta a ciencia, educación, uso y apropiación de la tecnología. Como consecuencia, la rigidez de la estandarización y automatización normaliza los procesos de aprendizaje creativos, generando disonancias cognitivas que afectan a la salud y el bienestar de las personas, evidenciando la necesidad de encontrar diseños tecnológicos flexiblemente humanizados. Este estudio pone en valor la práctica musical como una de las áreas creativas que más está revolucionando el campo de las interacciones tecnológicas cognitivas. A través de una metodología mixta, basada en la recopilación, el análisis y la interpretación de las fuentes más pertinentes, recabadas en tres de las principales bases de datos científicas, se obtiene un porcentaje escaso de investigaciones relacionadas con el objeto central de búsqueda ya mencionado en el objetivo descrito. Las conclusiones evidencian que la comunidad científica no se ha enfocado en la búsqueda de paradigmas inclusivos que integren una dimensión de la evaluación respetuosa con las capacidades creativas del usuario tecnológico, a partir de un concepto abierto y holístico de la educación, la salud y el bienestar. Ello señala la oportunidad para implementar evaluaciones que cuenten con la participación de musicoterapeutas y terapeutas artístico-creativos dentro de los procesos de diseño de la tecnología creativa y expresiva, con el fin de facilitar una búsqueda crítica y proactiva de nuevos paradigmas tecnológicos más respetuosos, flexibles y resilientes con la educación, la salud y el bienestar, incluyendo marcos capaces de aportar alternativas viables al problema que plantea la hipernormalización tecnológica actual.
... Some of these criteria included engagement, playability, expres-siveness, effectiveness, control, precision, latency and intuitiveness. In 2017, Brown, Nash and Mitchell [22] also presented a literature review regarding three audio-based conferences to collect user-centred evaluation methods, with the goal of fnding trends within music interaction research and future directions. Gathering the evaluation stakeholders (performer, audience, designer and composer), the UX dimensions, the tasks that the participants had to undertake and the respective methods, the authors found usability and aesthetics as the primary focus of evaluation. ...
... Regarding the participant as the main agent of action, a "part of a dynamic system of information exchange" [2], the multitude of roles each can embody may be further deepened. Answering what these roles could be as functional entities with variable freedom exploring the contextual soundscape, we defned three kinds of embodied personas inspired by previous theoretical works [19,21,22]: 1. the spectator role, as a member of an audience exposed to an auditory artefact and actively listening to it, but with no direct action to control or change its course; 2. the performer, where sound becomes the conductor of the experience through the user's actions and exploration of the space of possibilities; and 3. the composer, a participant who becomes a member of a creation process, a creator of the auditory space itself, either in its design, composition and interaction layout. ...
Conference Paper
Full-text available
The act of sound perception and its subjective dimensions, from physical to psychoacoustics, from semantic to affective, carry an inherent challenge for the conception and evaluation of every audio-based artefact. Starting from a previous framework of evaluation approaches, we seek to deconstruct the confguring elements of these processes, searching for theoretical foundations informing Sound Design and possible applications for Auditory Displays. This work is a frst step into identifying a body of knowledge on the listener’s experience, how the act of listening takes place and how the sequence of listening actions can evolve as forms of dialogue, creating dialogical spaces for making sense of auditory information. With this work, practitioners can gain new insights into how existing techniques for creating auditory artefacts can be confgured and transformed into new, alternative approaches.
... This review also linked the four stages of interaction (electrical, symbolic, textual, and graphical) to specific music technologies to demonstrate the appropriateness in a music interaction context. However, a study undertaken in 2017 reviewed conference papers published in the previous three years within the music technology community to assess which aspects of interaction design were considered [10]. They assessed all the papers using the user experience (UX) dimensions of usability, generic UX, aesthetics, emotion, enchantment, engagement, enjoyment, motivation, and frustration, to establish if there were areas for further consideration. ...
... Also, when stages 4 & 5 are designed they will be completed in compliance with standards of evaluation and testing in the sector [227]. As already highlighted in Section 2.1, in this sector when systems are considered from a design perspective the use of quantitative analysis and evaluation is more widespread [10]. As a result, authors of this study identified there is evidence that not all the work considers the full user experience. ...
Thesis
This research investigates the use of graphical interpolation to control the mapping of synthesis parameters for sound design, and the impact that the visual model can have on the interpolator’s performance and usability. Typically, these systems present the user with a graphical pane where synthesizer presets, each representing a set of synthesis parameter values and therefore an existing sound, can be positioned at user-selected locations. Subsequently, moving an interpolation cursor within the pane will then create novel sounds by calculating new parameter values, based on the cursor position and an interpolation model. These systems therefore supply users with two sensory modalities, sonic output and the visual feedback from the interface. A number of graphical interpolator systems have been developed over the years, with a variety of user-interface designs, but few have been subject to formal user evaluation making it difficult to compare systems and establish effective design criteria to improve future designs. This thesis presents a novel framework designed to support the development and evaluation of graphical interpolated parameter mapping. Using this framework, comparative back-to-back testing was undertaken that studied both user interactions with, and the perceived usability of, graphical interpolation systems, comparing alternative visualizations in order to establish how the visual feedback provided by the interface aids the locating of desired sounds within the space. A pilot investigation compared different levels of visual information, the results of which indicated that the nature of visualisation did impact on user interactions. A second study then reimplemented and compared a number of extant designs, where it became apparent that the existing interpolator visuals generally relate to the interpolation model and not the sonic output. The experiments also provide new information about user interactions with interpolation systems and evidence that graphical interpolators are highly usable in general. In light of the experimental results, a new visualization paradigm for graphical interpolation systems is proposed, known as Star Interpolation, specifically created for sound design applications. This aims to bring the visualisation closer to the sonic behaviour of the interpolator by providing visual cues that relate to the parameter space. It is also shown that hybrid visualizations can be generated that combine the benefits of the new visualization with the existing interpolation models. The results from the exploration of these visualizations are encouraging and they appear to be advantageous when using the interpolators for sound design tasks.
... Upon completion of the tests, the user underwent a semi-structured interview to elucidate these topics. The interview encompassed nine evaluation aspects, as follows [20], [21]: ...
Conference Paper
Full-text available
The remix technique has been widely used in musical practice, mainly due to the figure of Disc Jockeys (DJs), which combines several pre-existing sounds to produce completely new content. However, this creation method also appears in other forms of artistic expressions, such as architecture, photography, fashion design, video games, etc. Recent technological advances favor the emergence of gadgets that help expand this practice, such as Smart Musical Instruments (SMI), devices equipped with sensors, actuators, embedded intelligence, and wireless connectivity to allow new forms of musical expression. In view of this scenario and the versatility of remix, this paper presents a prototype of an SMI, called RemixDrum, conceived not only for creating sounds through technological means but also to allow the mixing of multimedia content, proposing a new context of use for this type of interface and a new way to make art. Finally, the prototype was evaluated quantitatively, measuring average latency, jitter and throughput, qualitatively, investigating the user experience and their interaction with the equipment, and comparatively, through an analysis between our prototype and related works, in order to analyze perceptible delays, expressiveness and other artistic factors.
... Cada vez más investigadores son conscientes de la necesidad de encontrar nuevas formas de evaluar las interacciones y las interfaces tecnológicas musicales (O'Modhrain, 2011;Vasilakos, 2016;Yang & Lerch, 2020). Algunas de estas formas incluyen la experiencia creativa del usuario tecnológico (Brown et al., 2017). Otras investigaciones integran además aspectos conjuntos de la educación, la salud y el bienestar de los usuarios dentro de los procesos de diseño de las nuevas tecnologías musicales (Cerebello Poblete & Farías Serey, 2019;Francis, 2018;Krout et al., 1993;Mcgowan et al., 2021;Streeter et al., 2012;Van Besouw et al., 2016). ...
Article
ARTSEDUCA, en su n. 33 vuelve a mostrar importantes investigaciones y experiencias que refuerzan los objetivos y las metas que la Agenda 2030 de la Organización de Nacio- nes Unidas, necesita conseguir. Cada vez más, se pone en valor la necesidad de contar con el ARTE más allá de la educación, en la propia sociedad, no obstante, es fundamental que en las principales etapas educativas, la educación artística llegue de la mano de los especialistas, y se prolongue a las personas más mayores, respondiendo al significado del aprendizaje a lo largo de la vida.
... Lerch, 2020), formas que empoderen la experiencia de usuario(Brown, Nash & Mitchell, 2017), integrando aspectos conjuntos de la educación, la salud y el bienestar de los usuarios dentro de los procesos de diseño de las nuevas tecnologías musicales(Cerebello Poblete & Farías Serey, 2019;Francis, 2018;Mcgowan, Mcgregor & Leplatre, 2021;Streeter, Davies, Reiss, Hunt, Caley & Roberts, 2012;Van Besouw, Oliver, Grasmeder, Hodkinson & Solheim, 2016). Es esta potencialidad de la música para integrar de una amplia gama de capacidades cognitivas, creativas, expresivas, motoras y emocionales(Greenberg & Paivio, 2000;Vuilleumier & Trost, 2015), y su contribución a revolucionar áreas relacionadas con el aprendizaje, la inteligencia, la memoria, la emotividad, la percepción o la psicomotricidad, la que está despertando el interés por desarrollar marcos alternativos de evaluación. ...
Conference Paper
Full-text available
Consonance is a distinctive attribute of musical sounds, for which a psychophysical explanation has been found leading to the critical band perceptual model. Recently this model has been hypothesized to play a role also during tactile perception. In this paper the sensitivity to vi-brotactile consonance was subjectively tested in musicians and non-musicians. Before the test, both such groups listened to twelve melodic intervals played with a bass guitar. After being acoustically isolated, participants were exposed to the same intervals in the form of either a whole-body or foot-based vibrotactile stimulus. On each trial they had to identify whether an interval was ascending , descending or unison. Musicians were additionally asked to label every interval using standard musical nomenclature. The intervals identification as well as their labeling was above chance, but became progressively more uncertain for decreasing consonance and when the stimuli were presented underfoot. Musicians' labeling of the stimuli was incorrect when dissonant vibrotactile intervals were presented underfoot. Compared to existing literature on auditory, tactile and multisensory perception, our results reinforce the idea that vibrotactile musical consonance plays a perceptual role in both musicians and non-musicians. Might this role be the result of a process occurring at central and/or peripheral level, involving or not activation of the auditory cortex, concurrent reception from selective somatosensory channels, correlation with residual auditory information reaching the basilar membrane through bone conduction, is a question our preliminary exploration leaves open to further research work.
Conference Paper
Full-text available
This paper explores the roles of technical and musical familiarity in shaping audience response to digital musical instrument (DMI) performances. In an audience study conducted during an evening concert, we examined two primary questions. First, whether a deeper understanding of how a DMI works increases an audience's enjoyment and interest in the performance; and, second, given the same DMI and same performer, whether playing in a conventional (vernacular) versus an experimental musical style affects an audience's response. We held a concert in which two DMI creator-performers each played two pieces in differing styles. Before the concert, each half the 64-person audience was given a technical explanation of one of the instruments. Results showed that receiving an explanation increased the reported understanding of that instrument, but had no effect on either the reported level of interest or enjoyment. On the other hand, performances in experimental versus conventional style on the same instrument received widely divergent audience responses. We discuss implications of these findings for DMI design.
Conference Paper
Full-text available
In this paper we present the new development of a semi-autonomous response module for the NOISA system. NOISA is an interactive music system that predicts performer's engagement levels, learns from the performer, decides what to do and does it at the right moment. As an improvement for the above, we implemented real-time adaptive features that respond to a detailed monitoring of the performer's engagement and to overall sonic space, while evaluating the impact of its actions. Through these new features, the response module produces meaningful and non-intrusive counter actions , attempting to deepen and maintain the performer's engagement in musical interaction. In a formative study we compared our designed response module against a random control system of events, in which the former performed consistently better than the latter.
Article
Full-text available
This paper introduces 'The Aesthetic Experience Prism', a framework for studying how components of aesthetic experience materialize in the model's of interaction of novel musical interfaces as well as how the role of aesthetics could be made more explicit in the processes of designing interaction for musical technologies. The Aesthetic Experience Prism makes use of Arthur Danto's framework of aesthetic experience that consists of three conceptual entities: (1) metaphor; (2) expression; and (3) style. In this paper we present key questions driving the research, theoretical background, artistic research approach and user research activities. In the DIYSE project a proof-of-concept music creation system prototype was developed in a collaborative design setting. The prototype provides means to the performer to create music with minimum effort while allowing for versatile interaction. We argue that by using an artistic research approach specifically targeting designing for aesthetic experience we were able to transform the knowledge from early design ideas to resulting technology products in which model's of interaction metaphors, expression and style are in an apparent role.
Article
Full-text available
This paper explores a longitudinal approach to the qualitative evaluation of a set of digital musical instruments, which were developed with a focus on creativity and exploration. The instruments were lent to three electronic musicians/composers for a duration of four weeks. Free exploration periods and a compositional task evaluated using semi-structured show and tell sessions revealed that the context of use had a major influence on how the instruments were experienced by the test persons. Central issues revolved around the learning/adoption process, and around the importance of playability, explorability, and connectivity.
Article
The merits of user-driven design have long been acknowledged in the field of human-computer interaction (HCI): Closely involving target users throughout the lifecyle of a project can vastly improve their experiences with the final system. Thus, it comes as no surprise that a growing number of music technology researchers are beginning to incorporate user-driven techniques into their work, particularly as a means of evaluating their designs from the perspectives of their intended users. Many, however, have faced the limitations that arise from applying the task-based, quantitative techniques typically encountered in classical HCI research to the evaluation of nonutilitarian applications. The nature of musical performance requires that designers reevaluate their definitions of user "goals," "tasks," and "needs." Furthermore, within the context of performance, the importance of creativity and enjoyment naturally supersedes that of efficiency, yet these concepts are more difficult to evaluate or quantify accurately. To address these challenges, this article contributes a set of key principles for the user-driven design and evaluation of novel interactive musical systems, along with a survey of evaluation techniques offered by new directions in HCI, ludology, interactive arts, and social-science research. Our goal is to help lay the foundation for designers of new musical interfaces to begin developing and customizing their own methodologies for measuring, in a concrete and systematic fashion, those critical aspects of the user experience that are often considered too nebulous for assessment.