Chapter

Musical Instruments for Novices: Comparing NIME, HCI and Crowdfunding Approaches

Authors:
  • Andrew McPherson
  • Fabio Morreale
  • Jacob Harrison

Abstract

Designing musical instruments to make performance accessible to novice musicians is a goal which long predates digital technology. However, in just the past six years, dozens of instrument designs have been introduced in various academic venues and in commercial crowdfunding campaigns. In this paper, we draw comparisons in design, evaluation and marketing across four domains: crowdfunding campaigns on Kickstarter and Indiegogo; the New Interfaces for Musical Expression (NIME) conference; conferences in human-computer interaction (HCI); and researchers creating accessible instruments for children and adults with disabilities. We observe striking differences in approach between commercial and academic projects, with less pronounced differences between each of the academic communities. The paper concludes with general reflections on the identity and purpose of instruments for novice musicians, with suggestions for future exploration.


... Despite the publication of studies dedicated to reviewing the state of the art of ADMIs [8,19,9,25], there is still a lack of contributions towards a systematic analysis of the most important dimensions of their design. As an example, Frid [8] categorizes the instruments reviewed in her work in terms of control interface type (tangibles, wearables, gaze-based, etc.), for the purpose of compiling statistics on the literature. ...
... These three broad categories are often employed in the literature of accessible HCI [31, ...]. Previous reviews of ADMIs [8,25,20] suggest that target user groups can be classified into these three categories. Moreover, the multidimensional character of disability also means that physical, sensory, and cognitive impairments are often intertwined. ...
... However, here we use this term in a wider sense: the degree of simplification of an ADMI along this axis refers to all the aspects of the instrument design aimed at aiding the user in completing musical tasks. These may include enlarging elements of the visual interface, but also temporal quantization of musical events to compensate for rhythmic difficulties, simplified gestures to play chords or arpeggios, etc. Related concepts have been investigated in the context of DMIs for novices and non-musicians (beginning with the "low entry fee with no ceiling on virtuosity" claim by Wessel and Wright [36]), and are discussed by McPherson et al. [25]. Correspondingly, the Birnbaum space includes the dimension "Required expertise". ...
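The temporal quantization mentioned in the snippet above can be sketched in a few lines. This is an illustrative example, not code from any surveyed instrument; the quarter-beat default grid is an assumption.

```python
def quantize_onsets(onsets, grid=0.25):
    """Snap note onset times (in beats) to the nearest grid step,
    compensating for imprecise rhythmic input from a novice player."""
    return [round(t / grid) * grid for t in onsets]
```

For example, `quantize_onsets([0.1, 0.52, 1.9], grid=0.5)` snaps loosely timed onsets to `[0.0, 0.5, 2.0]`; a coarser grid simplifies the rhythm further, trading expressivity for reliability.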
Conference Paper
Full-text available
Research on Accessible Digital Musical Instruments (ADMIs) has received growing attention over the past decades, carving out an increasingly large space in the literature. Despite the recent publication of state-of-the-art review works, there are still few systematic studies of ADMI design analysis. In this paper we propose a formal tool to explore the main design aspects of ADMIs based on Dimension Space Analysis, a well-established methodology in the NIME literature that generates an effective visual representation of the design space. We therefore propose a set of relevant dimensions, based both on categories proposed in recent works in the research context and on original contributions. We then demonstrate its applicability by selecting a set of relevant case studies and analyzing a sample set of ADMIs found in the literature.
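The dimension-space visualization described in this abstract places each design dimension on its own radial axis and joins the per-dimension scores into a polygon. A minimal sketch of the coordinate computation, assuming scores are already normalized to the range 0..1 (the scores in the usage note are hypothetical):

```python
import math

def dimension_space_points(scores):
    """Map normalized dimension scores (0..1) to points on evenly spaced
    radial axes; joining the points in order yields the polygon that
    visualizes one instrument's position in the design space."""
    n = len(scores)
    return [(s * math.cos(2 * math.pi * i / n),
             s * math.sin(2 * math.pi * i / n))
            for i, s in enumerate(scores)]
```

With four dimensions scored `[1.0, 0.5, 0.25, 0.75]`, the axes fall at 0°, 90°, 180° and 270°, and the point radius on each axis equals that dimension's score.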
... In this paper, we explain why this lack of critical work should be a matter of concern. With a few notable exceptions [32], our community has overlooked political issues connected to new instruments. This might be due to reasons such as an outward-looking perspective running counter to the practical, geeky, crafty interests of most members rather than any intentional negligence or self-indulgence. ...
... McPherson et al. express scepticism for this "anyone can play" sentiment: "Music is not one homogeneous entity but rather an umbrella term encompassing a huge variety of genres, styles and techniques. Few people would learn a traditional instrument to generically create music of any arbitrary style; most people are motivated to participate in particular genres, often ones they also listen to" [32]. ...
... The idea of an easily-accessible musical instrument is not new and, in fact, far predates digital technology [32]. Inventors have been promising music for the masses for centuries: the hurdy-gurdy, autoharp, harmonica, and Suzuki Omnichord all promised to unlock the musician in anyone. ...
Conference Paper
Full-text available
So far, NIME research has been mostly inward-looking, dedicated to divulging and studying our own work and having limited engagement with trends outside our community. Though musical instruments as cultural artefacts are inherently political, we have so far not sufficiently engaged with confronting these themes in our own research. In this paper we argue that we should consider how our work is also political, and begin to develop a clear political agenda that includes social, ethical, and cultural considerations through which to consider not only our own musical instruments, but also those not created by us. Failing to do so would result in an unintentional but tacit acceptance and support of such ideologies. We explore one item to be included in this political agenda: the recent trend in music technology of "democratising music", which carries implicit political ideologies grounded in techno-solutionism. We conclude with a number of recommendations for stimulating community-wide discussion on these themes in the hope that this leads to the development of an outward-facing perspective that fully engages with political topics.
... Such case studies express a necessity for inclusive participation to establish grounds for progressive social integration, framed as the collective right to be in control of one's surroundings through co-creation. Acting upon the current situation, we enquire into a sensory intervention that aids awareness of one's physical presence and self-affirmed boundaries, without the need to declare any action as right or wrong. The urgent nature of the pandemic demanded disruption to common interactional norms, namely the concern of keeping distance and avoiding touch [24,54]. ...
... outcomes, and made apparent that the moment sound stopped playing, users would immediately lose interest and move on with the rest of their day. Lacking a firm recommendation here, these outcomes continue to bear on the feasibility of engaging unskilled users with novel interactive music systems [144], only to be exaggerated in a pervasive setting where prolonged participation is non-obligatory. In its wearable form, the original orchestration was purposed specifically to capture group dynamics by way of geometric and temporal relationships. ...
Preprint
Full-text available
Within the field of movement sensing and sound interaction research, multi-user systems have gradually gained interest as a means to facilitate an expressive non-verbal dialogue. When tied with studies grounded in psychology and choreographic theory, we consider the qualities of interaction that foster an elevated sense of social connectedness, non-contingent to occupying one’s personal space. In reflection of the newly adopted social distancing concept, we orchestrate a technological intervention, starting with interpersonal distance and sound at the core of interaction. Materialised as a set of sensory face-masks, a novel wearable system was developed and tested in the context of a live public performance from which we obtain the user’s individual perspectives and correlate this with patterns identified in the recorded data. We identify and discuss traits of the user’s behaviour that were accredited to the system’s influence and construct four fundamental design considerations for physically distanced sound interaction. The study concludes with essential technical reflections, accompanied by an adaptation for a pervasive sensory intervention that is finally deployed in an open public space.
... museums, art exhibitions, public parks, etc.) where players of all skill levels can participate in the activity. This often requires being accessible to novice musicians, which in turn guides system design [2,19]. Additionally, CMEs have a range of methods for including collaboration in musical experiences. ...
... Polymetros does this by allowing players to contribute to a single piece of music, inputting only one instrument and sequence at a time. In addition, like many CMEs [1,2,6,19], Polymetros also uses a sequencer to limit the musical range, which makes playing music less physically demanding but also limits the open-endedness of the system. ...
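The range-limiting role of a sequencer described above can be illustrated with a short sketch: confining step input to a fixed pentatonic scale guarantees consonant output whatever the player selects. The scale choice and MIDI root here are assumptions for illustration, not details from Polymetros.

```python
PENTATONIC = [0, 2, 4, 7, 9]  # semitone offsets of a major pentatonic scale

def steps_to_midi(step_row, root=60):
    """Convert one sequencer row (scale-degree index per step, None = rest)
    into MIDI note numbers confined to the pentatonic scale."""
    return [None if s is None else root + PENTATONIC[s % len(PENTATONIC)]
            for s in step_row]
```

Any index the player enters wraps around within the scale, so every non-rest step maps to a "safe" pitch; this is exactly the trade-off noted above between physical ease and open-endedness.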
... Accessibility is a key consideration among NIME practitioners, often described as having a "low entry fee" [44], to make NIMEs usable by individuals with limited or no musical background [31,15,14]. Our design of dB reflects this philosophy, aiming to be accessible to a wide audience, regardless of their skill or affinity. ...
Conference Paper
Full-text available
dB is a web-based interface that serves as a "drummer bot" for exploring interactive groove-making experiences with an AI percussion system. This system, leveraging Variational Autoencoders (VAEs), transforms simple rhythmic inputs into complex drum patterns with microtiming and dynamics. Designed for accessibility and playfulness, dB is easily operated via a computer keyboard, making it suitable for a wide range of users. This paper outlines dB's foundational concepts, data collection, and a comprehensive overview of the system and interface architecture. We then present a preliminary user study that investigated specific aspects of user engagement, including states of joy and boredom, as well as perceptions of effort and control. The study's results underscore musical background, expertise, and generational differences as significant influences on user experiences. Notably, test conditions characterized by greater randomness and rhythmic variation were consistently perceived as more engaging, and emerging trends were observed in user responses diverging over time.
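The transformation dB's abstract describes, from plain rhythmic input to a pattern with microtiming and dynamics, can be sketched in miniature. The random jitter below is a crude, explicitly assumed stand-in for what dB's trained VAE decoder actually produces; nothing here reproduces the real system.

```python
import random

def add_feel(onsets, timing_jitter=0.02, seed=0):
    """Attach small timing offsets (in beats) and MIDI velocities to plain
    onsets. Seeded random jitter stands in for the learned microtiming and
    dynamics a VAE decoder would generate from the same input."""
    rng = random.Random(seed)
    return [(t + rng.uniform(-timing_jitter, timing_jitter),
             rng.randint(60, 110))
            for t in onsets]
```

The fixed seed keeps the output reproducible; a generative model would instead sample the offsets and velocities from a latent space conditioned on the input rhythm.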
... Designing new DMIs has become very successful and popular among researchers, musicians, and developers; however, they are determined by the designers (Bowen, 2013; Cohé & Hachet, 2012; Wobbrock et al., 2009) and are often exploratory and in a constant state of development (Morreale et al., 2018). This is probably due to various motivations and purposes when designing DMIs, such as developing new sounds, improving audience experiences, as well as making DMIs accessible for novices to make musical performances (Emerson & Egermann, 2020; McPherson et al., 2019). ...
Article
Full-text available
In recent years, computer technologies have been impactful in the design and development of Digital Musical Instruments (DMIs). As music interaction became prominent in the Human-Computer Interaction (HCI) field, emphasis on user requirement upon the design of musical interfaces has also grown since the last decade. Although designing new DMIs is becoming very popular, it is often determined by the designers and often not reflective of users’ needs. In this study, we explored user requirements for the design of a virtual musical instrument of the Malay bonang, an instrument found in the Malay gamelan ensemble. The requirements were elicited from a group of gamelan experts to establish the bonang playing techniques to be mapped to the virtual instrument which we called Air Bonang. Findings revealed that in designing the Air Bonang that is natural and expressive, the fundamental playing techniques of the bonang should be integrated into the system using mid-air interaction. In addition, exploratory techniques might also be integrated into the Air Bonang to leverage musical expression. The outcome of the study proposes design criteria that encompass three aspects of a natural Air Bonang, namely, embodiment, expressiveness, and feedback.
... The distinction between novices and expert performers is crucial. While interaction with music in some form is almost universal, DMIs may be designed for a number of target groups, including non-musicians, amateurs, experts, and individuals with disabilities [17]. The difference in musical skill level across these groups is significant, and the nature of interaction is likely to change over time as users interact, practice, compose, and perform with a DMI. ...
... Researchers have dedicated considerable energy and resources toward the design of instruments for novices. However, participatory design methods have not been so prominent (see, e.g., an overview of recent work in McPherson, Morreale, & Harrison, 2019). In a study by Mazzone, Iivari, Tikkanen, Read, and Beale (2010), three different design activities were carried out for the design of a musical device for children. ...
Article
Full-text available
A class of master of science students and a group of preschool children codesigned new digital musical instruments based on workshop interviews involving vocal sketching, a method for imitating and portraying sounds. The aim of the study was to explore how the students and children would approach vocal sketching as one of several design methods. The children described musical instruments to the students using vocal sketching and other modalities (verbal, drawing, gestures). The resulting instruments built by the students were showcased at the Swedish Museum of Performing Arts in Stockholm. Although all the children tried vocal sketching during preparatory tasks, few employed the method during the workshop. However, the instruments seemed to meet the children’s expectations. Consequently, even though the vocal sketching method alone provided few design directives in the given context, we suggest that vocal sketching, under favorable circumstances, can be an engaging component that complements other modalities in codesign involving children.
... The term "accessible DMIs" (ADMIs) refers to instruments designed for persons with disabilities. A distinction can be drawn between "performance-focused" and "therapeutic" instruments [7], where the former include ADMIs designed to enable masterful performances by musicians with disabilities, while the latter include instruments designed to elicit therapeutic or wellbeing aspects of music making, even for non-musicians. Some recent works have provided broad surveys of ADMIs, including both research projects and commercial products. ...
Article
Full-text available
Exponential increases of available computational resources, miniaturization, and sensors are enabling the development of digital musical instruments that use non-conventional interaction paradigms and interfaces. This scenario opens up new opportunities and challenges in the creation of accessible instruments to include persons with disabilities in music practice. This work focuses in particular on instruments dedicated to people who cannot use their limbs, for whom the only means for musical expression are the voice and a small number of traditional instruments. First, a modular and adaptable conceptual framework is discussed for the design of accessible digital musical instruments targeted at performers with motor impairments. Physical interaction channels available from the neck upwards (head, mouth, eyes, brain) are analyzed in terms of potential and limitations for musical interaction. Second, a systematic survey of previously developed instruments is presented: each is analyzed in terms of design choices, physical interaction channels and related sensors, mapping strategies, performer interface and feedback. As a result of this survey, several open research directions are discussed, including the use of unconventional interaction channels, musical control mappings, multisensory feedback, design, evaluation, and adaptation.
... Since the advent of electronic sound production, a strain of idealistic discourse has posited that the new technology could create any sound imaginable (Théberge 1997). Such claims have been made of early recording, analogue synthesis and digital synthesis, and they feature prominently in the marketing of novel digital musical instruments (McPherson, Morreale and Harrison 2019). Indeed, utopian predictions accompany the introduction of many new technologies. ...
Article
Full-text available
It is widely accepted that acoustic and digital musical instruments shape the cognitive processes of the performer on both embodied and conceptual levels, ultimately influencing the structure and aesthetics of the resulting performance. In this article we examine the ways in which computer music languages might similarly influence the aesthetic decisions of the digital music practitioner, even when those languages are designed for generality and theoretically capable of implementing any sound-producing process. We examine the basis for querying the non-neutrality of tools with a particular focus on the concept of idiomaticity: patterns of instruments or languages which are particularly easy or natural to execute in comparison to others. We then present correspondence with the developers of several major music programming languages and a survey of digital musical instrument creators examining the relationship between idiomatic patterns of the language and the characteristics of the resulting instruments and pieces. In an open-ended creative domain, asserting causal relationships is difficult and potentially inappropriate, but we find a complex interplay between language, instrument, piece and performance that suggests that the creator of the music programming language should be considered one party to a creative conversation that occurs each time a new instrument is designed.
... In both the academic and commercial spheres of Music Technology there is a longstanding interest in Digital Musical Instrument (DMI) design and innovation [20]. Particularly within the sound and music computing, digital arts and human-computer interaction research communities, there is a substantial body of studies that explore New Interfaces for Musical Expression (NIMEs). ...
Conference Paper
Full-text available
A substantial number of Digital Musical Instruments (DMIs) are built upon existing musical instruments by digitally and physically intervening in their design and functionality to augment their sonic and expressive capabilities. These are commonly known as Augmented Musical Instruments (AMIs). In this paper we survey different degrees of invasiveness and transformation within augmentations made to musical instruments across research and commercial settings. We also observe a common design rationale among various AMI projects, where augmentations are intended to support the performer's interaction and expression with the instrument. Consequently, we put forward a series of minimally-invasive supportive guitar-based AMI designs that emerge from observational studies with a community of practicing musicians preparing to perform, which reveal different types of physical encumbrance that arise from the introduction of additional resources beyond their instrument. We then reflect on such designs and discuss how both academic and commercially-developed DMI technologies may be employed to facilitate the design of supportive AMIs.
... The authors debate how specific features of DMIs could benefit music therapy sessions and propose future lines of research concerned with designing multimodal and empowerment-based technologies. Finally, work presented in [55] focused on accessible instruments for disability, considering both commercial products and instruments presented at NIME (International Conference on New Interfaces for Musical Expression) and related research. Results suggested that the commercial instruments were mainly MIDI controllers that, whatever their physical configuration, managed musical events on a note-by-note or sequence-level basis. ...
Article
Full-text available
Current advancements in music technology enable the creation of customized Digital Musical Instruments (DMIs). This paper presents a systematic review of Accessible Digital Musical Instruments (ADMIs) in inclusive music practice. The history of research concerned with facilitating inclusion in music-making is outlined, and the current state of developments and trends in the field is discussed. Although the use of music technology in music therapy contexts has attracted more attention in recent years, the topic has been relatively unexplored in the Computer Music literature. This review investigates a total of 113 publications focusing on ADMIs. Based on the 83 instruments in this dataset, ten control interface types were identified: tangible controllers, touchless controllers, Brain-Controlled Music Interfaces (BCMIs), adapted instruments, wearable controllers or prosthetic devices, mouth-operated controllers, audio controllers, gaze controllers, touchscreen controllers and mouse-controlled interfaces. The majority of the ADMIs were tangible or physical controllers. Although the haptic modality could potentially play an important role in musical interaction for many user groups, relatively few of the ADMIs (15.6%) incorporated vibrotactile feedback. Aspects judged to be important for successful ADMI design were instrument adaptability and customization, user participation, iterative prototyping, and interdisciplinary development teams.
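The interface-type statistics in this review amount to a categorical tally over the instrument dataset. A minimal sketch of that bookkeeping, using a handful of hypothetical tags rather than the review's actual 83-instrument dataset:

```python
from collections import Counter

# Hypothetical interface-type tags, standing in for the review's dataset
tags = ["tangible", "tangible", "touchless", "gaze", "tangible", "wearable"]

counts = Counter(tags)  # instruments per control interface type
shares = {k: round(100 * v / len(tags), 1) for k, v in counts.items()}
```

With real data, `counts.most_common()` would reproduce the review's ranking of interface types, and `shares` the percentage figures.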
... Within the NIME community 'instrument-like' controllers, or DMIs that resemble traditional instruments, have been a persistently popular area of exploration [29]. Similar trends can also be seen in the commercial world where many novel instruments and controllers resemble traditional instruments [22]. It is often suggested that a reason for this focused energy is that it allows the re-use of playing techniques from traditional instruments and hence offers a route to faster uptake. ...
Cover Page
Full-text available
The design of traditional musical instruments is a process of incremental refinement over many centuries of innovation. As a result, the shape and form of instruments are well established and recognised across cultures. Conversely, digital musical instruments (DMIs), being unconstrained by requirements of efficient acoustic sound production and ergonomics, can take on forms which are more abstract in their relation to the mechanism of control and sound production. In this paper we consider the case of designing DMIs that resemble traditional instruments, and pose questions around the social and technical acceptability of certain design choices relating to physical form and input modality (sensing strategy and the input gestures that it affords). We designed four guitar-derivative DMIs to be suitable for performing strummed harmonic accompaniments to a folk tune. Each instrument possesses a combination of one of two global forms (guitar-like body and a smaller tabletop enclosure) and one of two control mechanisms (physical strings and touch sensors). We conducted a study where both non-musicians and guitarists played two versions of the instruments and completed musical tasks with each instrument. This study highlights the complex interaction between global form and input modality when designing for existing musical cultures and varying levels of expertise.
... This article proposes that these values, rather than being peculiar to individual musicians' practices, are often common among several other artists, contributing to determining some of the greatest identifying factors of NIME performances. However, the extent to which these values are idiosyncratic traits of NIME performances, as opposed to more general DMIs, is left for future work (a comparison between instruments presented at NIME versus instruments presented at other HCI conferences and on crowdfunding platforms is discussed in [31]). ...
Conference Paper
Full-text available
The term 'NIME' (New Interfaces for Musical Expression) has come to signify both technical and cultural characteristics. Not all new musical instruments are NIMEs, and not all NIMEs are defined as such for the sole ephemeral condition of being new. So, what are the typical characteristics of NIMEs and what are their roles in performers' practice? Is there a typical NIME repertoire? This paper aims to address these questions with a bottom-up approach. We reflect on the answers of 78 NIME performers to an online questionnaire discussing their performance experience with NIMEs. The results of our investigation explore the role of NIMEs in the performers' practice and identify the values that are common among performers. We find that most NIMEs are viewed as exploratory tools created by and for performers, and that they are constantly in development and almost never in a finished state. The findings of our survey also reflect upon virtuosity with NIMEs, whose peculiar performance practice results in learning trajectories that often do not lead to the development of virtuosity as it is commonly understood in traditional performance.
Thesis
Full-text available
This cumulative dissertation is dedicated to the question of the educational significance of hybrid materialities, which, as digital-material things and environments based on informatization and auto-operative designs, increasingly prefigure everyday practices and help to structure self and world relations. This is examined with good reason using the example of music technologies. The submitted essays present research results from the context of the joint research project Musical Interface Designs: Augmented Creativity and Connectivity (BMBF 2017-2022). They are embedded and contextualized in terms of research logic by a detailed framework text. In the broadest sense, the work ties in with the discourse on the educational and pedagogical significance of things, whereby, in view of the hybrid, digital-material things and environments examined, some perspective updates with general pedagogical relevance are introduced. The work focuses in particular on the question of how the interactive hybrid designs of ›music-making things‹ contribute to subjectivation processes. This is followed by questions about the associated challenges and opportunities for the education sector. The four essays address different aspects of the tense relationship between the designs and affordances of interactive things, the associated subjectivation processes, their novel socio-technical character and their institutionalization in educational policy. The framework text explains the research logic and the fundamental perspectives of the work with reference to discourses in the field and thematically related disciplines. The cultural-historical contextualization of hybrid music-making things and their interactive designs makes them visible as part of a broader development that has also influenced educational theory itself and can be subsumed under the buzzword of a polyvalent social ›cybernetization‹.
As part of this development, which also includes digitalization and the subsequent discourse about the post-digital, new understandings of education and the self are emerging that are no longer necessarily aimed at processes of (self-)reflection but are rather post-humanist in their core configuration. From a genealogical perspective, two strands emerge within the ›cybernetic dispositive‹ – a representational and a corporeal-performative one – which represent two ways of interpreting this dispositive, relating to it and shaping it. The two strands not only open up different individual and collective development possibilities, but also span different normative horizons. Since the processes of subjectivation in the engagement with the hybrid materialities of interactive music-making things are to a large extent mediated physically, a perspective on processes of embodied interactivity is developed with which these can be examined empirically and precisely as socio-technical boundary processes against the background of the cybernetization described above. To designate these forms of subjectivation, in which people and (computational) environments coordinate as closely as possible with each other for reasons of cooperation, the term confluent subjectivation seems to be adequate, whereby the ecological perspective resulting from the environmentality of ubiquitous media technologies is not limited to individuals, but also refers to supra-individual and trans-subjective levels. A related result of the work is the development of a method for educational-theoretical analyses for hybrid media technologies. This is based on ideas from structural media education, which are supplemented with media and design theoretical perspectives and developed into a method that allows more complex analyses than conventional artifact analyses, but is also very well suited for use in concrete empirical and didactic contexts. 
This is based on a model of subjectivation in which different levels of relation between humans and artifacts are contrasted and forms of reductionism are avoided as far as possible. Moreover, a typology of the attitudes with which the participants encounter the borrowed music technologies is developed. In conjunction with the educational-theoretical analyses, it is possible to take a closer look at the fit between users and media designs and make it useful for educational purposes. In terms of attitudes, very different patterns of expectation, approach, appropriation and acceptance emerge, in which previous musical and/or technical knowledge plays a role, but personal interests and concerns, private environments, individual visions of the future, etc. prove to be at least as important in relation to the results of the engagement with the music-making things. From a media didactic point of view, the combination of attitude types and structural analyses can be operationalized well in order to develop tailor-made educational offers with specific pedagogical objectives for different target groups. Further results of the work are, on the one hand, the concept of media sensuality regimes and an associated body politics developed with regard to Jacques Rancière. This also corresponds with specific types of historicity of education and its mediation as well as with the logics of educational institutions. This is exemplified by a short empirical example from the project, in which the development and adaptation or rejection of gesture repertoires that emerge in the context of new media-music-making practices is addressed. On the other hand, some educational policy recommendations can be derived from the research results. First, the promotion of an integrative (rather than a ›fragmented‹) cultural education and the development of corresponding reflection skills in the field is advocated.
Second, it is suggested to introduce the sonic, as an epistemically relevant approach to the world, as a separate area in music education.
Chapter
Full-text available
This chapter discusses ways to study sonic design from the perspective of musical performances with Digital Musical Instruments (DMIs). We first review the specificities of DMIs in terms of their unique affordances and limitations and comment on instrument availability, longevity, and stability issues, which impact the use of DMIs in musical practice. We then focus on the Karlax, a commercial device used in several musical performances for over a decade. We present an analysis of excerpts from three performances of D. Andrew Stewart’s piece Ritual for solo Karlax, discussing the variability of performers’ gestures and the musical choices made. We conclude by suggesting practice exercises to develop performance techniques with the Karlax and discussing musical composition and performance issues with DMIs.
Article
Full-text available
This article introduces a series of workshop activities carried out with expert musicians to imagine new musical instruments through design fiction. At the workshop, participants crafted nonfunctional prototypes of instruments they would want to use in their own performance practice. Through analysis of the workshop activities, a set of design specifications was developed that can be applied to the design of new digital musical instruments intended for use in real-world artistic practice. In addition to generating tangible elements for instrument design, the theories and models utilized, drawn from human-computer interaction and human-centered design, are offered as a possible model for merging the generation of creative ideas with functional design outputs in a variety of applications within and beyond music and the arts.
Article
Full-text available
Recent years have witnessed the appearance of many new digital musical instruments (DMIs) and other interfaces for musical expression (NIME). This paper highlights a well-established music-educational background theory that we believe may help DMI developers and users better understand DMIs in the context of music cognition and education. From an epistemological perspective, we present the paradigm of enactive music cognition related to improvisation in the context of the skills and needs of 21st century music learners. We hope this can lead to a deeper insertion of DMIs into music education, as well as to new DMIs being ideated, prototyped and developed with these concepts and theories in mind. We specifically address the theory generally known as the 4E model of cognition (embodied, embedded, extended and enactive) within DMIs. The concept of autopoiesis is also described. Finally, we present some concrete cases of DMIs and NIMEs, and we describe how the experience of musical improvisation with them may be seen through the prism of such theories. Keywords: Autopoiesis, 4E Model of Cognition, DMIs, Improvisation, Music Education.
Chapter
The internet has allowed for the flourishing of several shared projects. Developers from different parts of the world can work asynchronously on the same project, formulating ideas, and implementing complex systems. For remote collaboration in software development projects, sharing information and code becomes more straightforward, given the textual and abstract nature of the source. However, for projects involving hardware and physical artifact development, it is necessary to have a more detailed level of information, since many other characteristics of the project need to be covered in the documentation. In the case of digital musical instruments, which are physical artifacts aimed at musical interaction, we have several aspects such as mechanical structure, electronics, programming, mapping, and sound design. Currently, there are few initiatives in the literature that indicate possible ways to guide designers and developers in the process of documenting the replication of their projects. Given the importance of advancing the area of new interfaces for musical expression, the diffusion of ideas among innovators, and the popularization of innovative musical instruments, this paper discusses the challenges of sharing DMIs. Our main goal is to propose a checklist to help designers and developers to share their projects to increase the potential for replicability. As future steps, we hope to move towards a certificate that guarantees how replicable a given project is considering makers other than its developers. Besides, with better documentation and a more organized sharing process, we expect to encourage designers and developers to reproduce existing DMIs in different parts of the world so we could understand the adoption of these devices in different contexts and cultures. This paper presents a step towards making the projects more accessible to make the DMI community more connected.
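The proposed checklist lends itself to a simple coverage metric. As a hedged sketch (the aspect list comes from the abstract; the scoring function and its name are hypothetical, not the paper's method):

```python
# Documentation aspects for DMIs listed in the abstract; the
# scoring below is a hypothetical illustration, not the paper's.
ASPECTS = ["mechanical structure", "electronics", "programming",
           "mapping", "sound design"]

def replicability_score(documented_aspects):
    """Fraction of the listed aspects covered by a project's docs."""
    covered = set(a.lower() for a in documented_aspects) & set(ASPECTS)
    return len(covered) / len(ASPECTS)
```

A project documenting only its mapping and programming would score 0.4 under this toy metric.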
Conference Paper
Innovation in new musical interfaces is largely driven by ground-up endeavors that introduce a level of redundancy. Inspired by the successes of the iPhone and other industry innovations driven by iteration, consolidation, and scalability, we present a new interface for musical expression and discuss key elements of its implementation and integration into an existing and established laptop ensemble. In 2019, the Linux Laptop Orchestra of Virginia Tech (L2Ork) introduced the L2Orkmote, a custom reverse-engineered variant of the Wii Remote and Nunchuk controller that reorganizes sensors and buttons using an additively manufactured housing. The goal was to equip each orchestra member with two of the newly designed L2Orkmotes, which resulted in the production of 40 L2Orkmotes. This large-scale production mandated software improvements, including the development of a robust API that can support such a large number of concurrently connected Bluetooth devices. Considering that new interfaces for musical expression (NIMEs) are rarely designed to scale, we report on the design. Additionally, we share the large-scale real-world deployment concurrently utilizing 28 L2Orkmotes and the supporting usability evaluation, and discuss the impact of scaling NIME production on its design.
Chapter
This is the introductory chapter of a book dedicated to new research in, and emerging new understandings of, music and human-computer interaction—known for short as music interaction. Music interaction research plays a key role in innovative approaches to diverse musical activities, including performance, composition, education, analysis, production and collaborative music making. Music interaction is pivotal in new research directions in a range of activities, including audience participation, interaction between music and dancers, tools for algorithmic music, music video games, audio games, turntablism and live coding. More generally, music provides a powerful source of challenges and new ideas for human-computer interaction (HCI). This introductory chapter reviews the relationship between music and human-computer interaction and outlines research themes and issues that emerge from the collected work of researchers and practitioners in this book.
Poster
Full-text available
Most instruments traditionally used to teach music in early education, like xylophones or flutes, encumber children with the additional difficulty of an unfamiliar and unnatural interface. The most simple expressive interaction, that even the smallest children use in order to make music, is pounding at surfaces. Through the design of an instrument with a simple interface, like a drum, but which produces a melodic sound, children can be provided with an easy and intuitive means to produce consonance. This should be further complemented with information from analysis and interpretation of childlike gestures and dance moves, reflecting their natural understanding of musical structure and motion. Based on these assumptions we propose a modular and reactive system for dynamic composition with accessible interfaces, divided into distinct plugins usable in a standard digital audio workstation. This poster shows our concept and how it can facilitate access to collaborative music making for small children. A first prototypical implementation has been designed and developed during the ongoing research project Drum-Dance-Music-Machine (DDMM), a cooperation with the local social welfare association AWO Hagen and the chair of musical education at the University of Applied Sciences Bielefeld. The objective of this project is to develop a system in order to support a low-threshold access to making music in the context of musical education for children. It is divided into three lines of research:
• The development of an instrument with the interface of a drum, which, however, produces notes and sounds.
Conference Paper
Full-text available
We present a system that allows users to experience singing without singing using gesture-based interaction techniques. We designed a set of body-related interaction and multi-modal feedback techniques and developed a singing voice synthesizer system that is controlled by the user's mouth shapes and arm gestures. Based on the adaption of a number of digital media-related techniques such as face and body tracking, 3D rendering, singing voice synthesis and physical computing, we developed a media installation that allows users to perform an aria without real singing and provide the look and feel from a 20th century performance of an opera singer. We evaluated this system preliminarily with users. Keywords Gesture based musical interfaces, 3D character performance, singing voice synthesis, interactive media installation.
Conference Paper
Full-text available
Music technology can provide unique opportunities to allow access to music making for those with complex needs in special educational needs (SEN) settings. Whilst there is a growing trend of research in this area, technology has been shown to face a variety of issues leading to underuse in this context. This paper reviews issues raised in literature and in practice for the use of music technology in SEN settings. The paper then reviews existing principles and frameworks for designing digital musical instruments (DMIs). The reviews of literature and current frameworks are then used to inform a set of design considerations for instruments for users with complex needs and in SEN settings. 18 design considerations are presented with connections to literature and practice. An implementation example including future work is presented, and finally a conclusion is offered.
Conference Paper
Full-text available
Every new edition of NIME brings dozens of new DMIs and the feeling that only a few of them will eventually break through. Previous work tried to address this issue with a deductive approach by formulating design frameworks; we addressed it with an inductive approach by elaborating on successes and failures of previous DMIs. We contacted 97 DMI makers that presented a new instrument at five successive editions of NIME (2010-2014); 70 answered. They were asked to indicate the original motivation for designing the DMI and to present information about its uptake. Results confirmed that most of the instruments have difficulties establishing themselves. Also, they were asked to reflect on the specific factors that facilitated and those that hindered instrument longevity. By grounding these reflections in existing research on NIME and HCI, we propose a series of design considerations for future DMIs.
Conference Paper
Full-text available
Learning to play the transverse flute is not an easy task, at least not for everyone. Since the flute does not have a reed to resonate, the player must provide a steady, focused stream of air that will cause the flute to resonate and thereby produce sound. In order to achieve this, the player has to be aware of the embouchure position to generate an adequate air jet. For a beginner, this can be a difficult task due to the lack of visual cues or indicators of the air jet and lip position. This paper attempts to address this problem by presenting an augmented flute that makes the parameters of the embouchure visible and measurable. The augmented flute shows information about the area covered by the lower lip, estimates the lip hole shape based on noise analysis, and shows the air jet direction. Additionally, the augmented flute provides directional and continuous feedback in real time, based on data acquired from experienced flutists. In a small experiment with five novices, most participants could produce a sound with only minimal instructions.
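The directional feedback described could be implemented as a comparison against reference embouchure data. In the sketch below, every reference value, tolerance and function name is invented for illustration; the augmented flute's actual parameters are not given in the abstract:

```python
def embouchure_feedback(coverage, jet_angle_deg,
                        ref_coverage=0.30, ref_angle_deg=-10.0,
                        coverage_tol=0.05, angle_tol_deg=3.0):
    """Compare measured embouchure parameters against (hypothetical)
    reference values from experienced flutists and return directional
    hints. coverage is the fraction of the embouchure hole covered by
    the lower lip; jet_angle_deg is the estimated air-jet direction."""
    hints = []
    if coverage < ref_coverage - coverage_tol:
        hints.append("cover more of the embouchure hole")
    elif coverage > ref_coverage + coverage_tol:
        hints.append("cover less of the embouchure hole")
    if jet_angle_deg < ref_angle_deg - angle_tol_deg:
        hints.append("aim the air jet higher")
    elif jet_angle_deg > ref_angle_deg + angle_tol_deg:
        hints.append("aim the air jet lower")
    return hints or ["embouchure within reference range"]
```

Continuous real-time feedback would simply re-evaluate this comparison on every sensor frame.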
Conference Paper
Full-text available
This article presents observations and strategies for designing game-like elements for expressive mobile musical interactions. The designs of several popular commercial mobile music instruments are discussed and compared, along with the different ways they integrate musical information and game-like elements. In particular, issues of designing goals, rules, and interactions are balanced with articulating expressiveness. These experiences aim to invite and engage users with game design while maintaining and encouraging open-ended musical expression and exploration. A set of observations is derived, leading to a broader design motivation and philosophy.
Chapter
Full-text available
This article, which builds on several conference papers, describes what we call ‘Musicking Tangibles’, a novel approach towards understanding and design of interactive music technology for people with special needs. The health values of music are well documented, but so far little research on interactive music technology has been developed for music therapy and health improvement in everyday situations. In our opinion, the music technology that has been used exploits little of the potential that current computer technology has to offer these fields because it is designed and used within a narrow perspective on technology and its potential. With our long experience from design and development of interactive music technology, especially from the interdisciplinary research project RHYME (rhyme.no), we present and argue for a broader understanding of music technology for empowerment and health improvement, building on a multidisciplinary approach with perspectives from tangible interaction design and inspiration from resource oriented music therapy and empowerment thinking. We hereby suggest the notion, Musicking Tangibles, inspired by Christopher Small’s (1998) term ‘musicking’, as a label for our understanding. Based on our experiences and user observations from the RHYME project we argue that the Musicking Tangibles have unique empowering qualities with health potentials.
Conference Paper
Full-text available
The Music Room is an interactive installation that allows visitors to compose classical music by moving throughout a space. The distance between visitors and their average speed map the emotionality of the music: in particular, distance influences the pleasantness of the music, while speed influences its intensity. This paper focuses on the evaluation of visitors' experience with The Music Room by examining log data, video footage, interviews, and questionnaires, as collected in two public exhibitions of the installation. We examined this data to identify the factors that fostered engagement and to understand how players appropriated the original design idea. Reconsidering our design assumptions against behavioural data, we noticed a number of unexpected behaviours, which led us to some considerations on the design and evaluation of interactive art.
Conference Paper
Full-text available
We have applied interactive machine learning (IML) to the creation and customisation of gesturally controlled musical interfaces in six workshops with people with learning and physical disabilities. Our observations and discussions with participants demonstrate the utility of IML as a tool for participatory design of accessible interfaces. This work has also led to a better understanding of challenges in end-user training of learning models, of how people develop personalised interaction strategies with different types of pre-trained interfaces, and of how properties of control spaces and input devices influence people's customisation strategies and engagement with instruments. This work has also uncovered similarities between the musical goals and practices of disabled people and those of expert musicians.
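The interactive machine learning loop described above, where a user records example gesture/sound-parameter pairs and a model interpolates for new gestures, can be sketched minimally. This is a generic inverse-distance-weighting illustration of the technique, not the workshops' actual software:

```python
import math

class GestureMapper:
    """Minimal IML sketch (hypothetical): the user records
    (gesture, sound-parameter) example pairs, and new gestures are
    mapped by inverse-distance-weighted interpolation over them."""

    def __init__(self):
        self.examples = []  # list of (gesture_vector, param_vector)

    def add_example(self, gesture, params):
        self.examples.append((list(gesture), list(params)))

    def predict(self, gesture):
        if not self.examples:
            return []
        weights, total = [], 0.0
        for g, p in self.examples:
            d = math.dist(gesture, g)
            if d == 0.0:
                return p  # exact match: return its parameters directly
            w = 1.0 / d
            weights.append(w)
            total += w
        # Weighted average of the recorded parameter vectors.
        n = len(self.examples[0][1])
        out = [0.0] * n
        for w, (_, p) in zip(weights, self.examples):
            for i in range(n):
                out[i] += w * p[i] / total
        return out
```

Retraining after each new example is what makes the workflow interactive: participants audition the mapping, add or remove examples, and immediately hear the result.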
Conference Paper
Full-text available
The multi-touch music table is a novel tabletop tangible interface for expressive musical performance. The user touches the picture projected on the table's glass surface to perform music, and can click, drag or use various multi-touch finger gestures to perform expressively. The picture's color, luminosity and size, together with finger gesture and pressure, determine the music output. The table detects up to 10 finger touches with their touch pressure. We use a glass, a wood stand, a mini projector, a web camera and a computer to construct this music table; hence the table is highly customizable. The table generates music via a re-interpretation of the artistic components of pictures: a cross-modal inspiration of music from visual art on a tangible interface.
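A cross-modal mapping from picture properties and touch to sound could look roughly like the following sketch. The specific hue-to-pitch-class and luminosity-to-octave rules are assumptions for illustration, not the table's documented mapping:

```python
import colorsys

def touch_to_note(r, g, b, pressure):
    """Hypothetical cross-modal mapping: the hue of the touched pixel
    picks a pitch class, its luminosity picks the octave, and finger
    pressure sets the MIDI velocity. RGB and pressure are in [0, 1]."""
    h, l, s = colorsys.rgb_to_hls(r, g, b)
    pitch_class = int(h * 12) % 12      # hue -> one of 12 pitch classes
    octave = 3 + int(l * 4)             # darker -> lower octave
    midi_note = 12 * octave + pitch_class
    velocity = max(1, min(127, int(pressure * 127)))
    return midi_note, velocity
```

Touching a bright red region with medium pressure would, under these assumed rules, sound middle C at moderate velocity.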
Conference Paper
Full-text available
We present a ludic interactive music performance that allows live recorded sounds to be re-rendered through the users' movements. The interaction design made the control similar to a shaker where the motion energy drives the energy of the played music piece. The instrument has been designed for musicians as well as non-musicians and allows for multiple players. In the MubuFunkScatShare performance, one performer plays acoustical instruments into the system, subsequently rendering them by shaking a smartphone. He invites participation by volunteers from the audience, resulting in a fun musical piece that includes layers of funk guitar, scat singing, guitar solo, and beatboxing.
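The shaker metaphor, in which motion energy drives the energy of the played material, can be sketched from accelerometer input. The gravity-subtraction and clipping choices here are assumptions, not the performance system's published design:

```python
import math

def shaker_gain(accel_samples, sensitivity=1.0):
    """Hypothetical shaker-style mapping: the energy of recent
    accelerometer samples (deviation from the 1 g resting magnitude)
    drives the playback gain of the recorded material. Samples are
    (x, y, z) tuples in units of g; output gain is clipped to [0, 1]."""
    if not accel_samples:
        return 0.0
    energy = sum((math.sqrt(x * x + y * y + z * z) - 1.0) ** 2
                 for x, y, z in accel_samples) / len(accel_samples)
    return min(1.0, sensitivity * energy)
```

At rest the phone reads about 1 g and the music stays silent; vigorous shaking saturates the gain.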
Conference Paper
Full-text available
NoiseBear is a malleable multiparametric interface, currently being developed in a series of participatory design workshops with disabled children. It follows a soft toy design, using conductive textiles for pressure sensing and circuitry. The system is a highly sensitive deformable controller; it can be used flexibly in a range of scenarios for continuous or discrete control, allowing interaction to be designed at a range of complexity levels. The controller is wireless, and can be used to extend the interactive possibilities of mobile computing devices. Multiple controllers may also be networked together in collaborative scenarios.
Conference Paper
Full-text available
This paper reports on the design and audience evaluation of a collaborative interactive music system titled Polymetros. Designed for broad audiences, Polymetros aims to enable users without musical skills to experience collaborative music-making. First, we describe our design approach with reference to related research. A particular interest was to investigate how to provide novices with individual musical control within a collaborative context. We then present an audience evaluation that was conducted during an exhibition at a major art museum in the UK attended by large numbers of the general public across the age range. The results lead us to evaluate our design approach and reflect on implications for facilitating collaborative music-making for broad audiences. Furthermore, the findings provide interesting indications of how the context of a public exhibition setting affects audience interaction with such an interactive multi-player experience.
Article
Full-text available
We present moosikMasheens, a novel system for musical expression by young people who have physical impairments or complex needs, playing music in a mixed-ability group context. moosikMasheens consists of three electro-mechanical musical instruments that can be controlled via simple tablet computer-based interfaces. An adapted glockenspiel, guitar and a set of electro-mechanical drumsticks have the potential to provide feedback through many perceptual channels including sonic, visual, vibro-tactile and kinaesthetic. Through the use of both a priori theory and an extended participatory requirements analysis, the system has been designed for use by both teachers/workshop leaders and students, as this has previously been found to be the most common form of group musical interaction. The technical implementation of the system is outlined with an initial evaluation.
Conference Paper
Full-text available
This paper presents The Music Room, an interactive installation where couples compose original music. The music is generated by Robin, an automatic composition system, according to the relative distance between the users and the speed of their movements. Proximity maps the pleasantness of the music, while speed maps its intensity. The Music Room was exhibited during the EU Researchers' Night in Trento, where it met with strong interest from visitors.
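The two mappings stated in the abstract (proximity to pleasantness, speed to intensity) can be sketched as a normalisation step. The ranges and the function name below are hypothetical; Robin's internals are not described here:

```python
def music_room_mapping(distance_m, speed_ms,
                       max_distance=10.0, max_speed=2.0):
    """Map two tracked quantities to two musical control values.

    Hypothetical sketch: smaller distance between the couple ->
    more pleasant music; faster movement -> more intense music.
    Both outputs are normalised to [0, 1]."""
    # Clamp inputs to the (assumed) tracked range.
    distance_m = min(max(distance_m, 0.0), max_distance)
    speed_ms = min(max(speed_ms, 0.0), max_speed)

    pleasantness = 1.0 - distance_m / max_distance  # close -> pleasant
    intensity = speed_ms / max_speed                # fast -> intense
    return pleasantness, intensity
```

A couple standing close together and moving quickly would drive both control values toward their maxima.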
Conference Paper
Full-text available
We report on the Music Ball Project, a long-term, exploratory project focused on creating novel instruments/controllers with a spherical shape as the common denominator. Besides a simple and attractive geometrical shape, balls afford many different types of use, including play. This has made our music balls popular among widely different groups of people, from toddlers to seniors, including those that would not otherwise engage with a musical instrument. The paper summarises our experience of designing, constructing and using a number of music balls of various sizes and with different types of sound-producing elements.
Article
Full-text available
Networked music environments (NMEs) allow experimental artists to explore the implications of interconnecting their computers for musical purposes. Despite an evident progress in recent years of networked music research, very little attention has been paid to a very common potential kind of user: novices in music, that is, users with little or no previous music knowledge. Indeed, the same way that principles of Rich Internet Applications like YouTube and Flickr have turned the passive user into an active producer of content, we are investigating the issues to be considered by networked music environments in order to allow effective support of musical creation and experimentation by novices. CODES—a Web-based environment designed to support cooperative ways of music creation by novices—puts these principles into practice. The goal of this paper is to present, discuss and illustrate two main principles: (1) music creation by novices should be prototypical; and (2) music creation by novices should be cooperative. These principles have emerged during CODES design and development and we think they are a good starting point for further investigation of a novice-oriented perspective of NME dimensions.
Article
Full-text available
This paper explores the differences in the design and performance of acoustic and new digital musical instruments, arguing that with the latter there is an increased encapsulation of musical theory. The point of departure is the phenomenology of musical instruments, which leads to the exploration of designed artefacts as extensions of human cognition – as scaffolding onto which we delegate parts of our cognitive processes. The paper succinctly emphasises the pronounced epistemic dimension of digital instruments when compared to acoustic instruments. Through the analysis of material epistemologies it is possible to describe the digital instrument as an epistemic tool: a designed tool with such a high degree of symbolic pertinence that it becomes a system of knowledge and thinking in its own terms. In conclusion, the paper rounds up the phenomenological and epistemological arguments, and points at issues in the design of digital musical instruments that are germane due to their strong aesthetic implications for musical culture.
Article
Full-text available
There is a small but useful body of research concerning the evaluation of musical interfaces with HCI techniques. In this paper, we present a case study in implementing these techniques; we describe a usability experiment which evaluated the Nintendo Wiimote as a musical controller, and reflect on the effectiveness of our choice of HCI methodologies in this context. The study offered some valuable results, but our picture of the Wiimote was incomplete as we lacked data concerning the participants' instantaneous musical experience. Recent trends in HCI are leading researchers to tackle this problem of evaluating user experience; we review some of their work and suggest that with some adaptation it could provide useful new tools and methodologies for computer musicians.
Article
Full-text available
Background in digital musical instruments. A digital musical instrument (DMI) comprises a control surface that controls the parameters of a digital synthesis algorithm in real time. In the Digital Orchestra Project, a three-year research/creation project, the synthesis engine was hosted on a general-purpose computer, while the gestural control surfaces were new hardware devices created for the project. The mapping between gestural data and synthesis parameters was carried out through the use of custom-written software: The Mapper. Background in music performance. From a performance perspective, a successful DMI should allow the performer to feel that he or she has accurate control of the musical result of their performance. This sensation results from a number of different factors, including the responsiveness of the instrument (low, consistent latency), haptic feedback, the mapping strategies used, and the reproducibility of musical ideas, among others. Aims. The aim of the project was to develop and use a number of new DMIs with musical potential comparable to that of existing acoustic musical instruments. An important goal was to foster long-term interdisciplinary collaborations between instrument designers, composers and performers. We also wanted to address the issue of reproducibility in the performance of digital musical instruments by developing appropriate notation methods. Main contribution. The Digital Orchestra resulted in the development of several new DMIs, from laboratory prototypes to fully-fledged concert instruments. Composers created new works for small ensembles that included these instruments. A musical notation based on dynamic visual elements displayed on a computer screen was developed. The project notably included three years of intensive training on these instruments by performers who had previously already achieved a high level of expertise on acoustic musical instruments. Implications.
The McGill Digital Orchestra presents a number of paradigms for the design, creation and performance of digital musical instruments in the context of a long-term interdisciplinary, collaborative environment. Based on our experience, we propose that one effective measure for the evaluation of a digital musical instrument is its ability to reproduce a performance of a particular piece, either by the same performer or by different performers. This involves the ability to realize a piece based on a notated score, whether on paper or using software-based visual feedback in a graphical environment. We suggest that this may aid in ensuring the viability and longevity of a novel digital musical instrument. The results of this approach to DMI design include instruments that have been used in high-profile professional performances and that are still being actively used by several performers world-wide.
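The gestural-data-to-synthesis-parameter mapping layer that software like The Mapper provides can be sketched in miniature. This is a generic illustration of such a routing layer, with made-up signal names, not the real tool's API:

```python
class Mapper:
    """Minimal sketch of a DMI mapping layer: named gestural inputs
    are routed to named synthesis parameters through optional
    scaling functions. (Hypothetical; not the project's software.)"""

    def __init__(self):
        self.routes = []  # (source, destination, transform)

    def connect(self, source, destination, transform=lambda v: v):
        self.routes.append((source, destination, transform))

    def process(self, gesture_frame):
        """Turn one frame of gestural data into synthesis parameters."""
        params = {}
        for source, destination, transform in self.routes:
            if source in gesture_frame:
                params[destination] = transform(gesture_frame[source])
        return params
```

Keeping the mapping in a separate, editable layer like this is what lets designers, composers and performers renegotiate an instrument's behaviour without rebuilding its hardware or its synthesis engine.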
Article
Full-text available
Music instruments are used to play and to produce music, transforming the actions of one or more performers into sound. This article explores some instrument design issues in three distinct parts. The first section attempts to define what music instruments are, how traditional instruments function and what they can do, and what future instruments could be, trying to figure out how we could better exploit their unlimited potential. The second section proposes a quick review of the different know-how, and of the technical and conceptual frameworks and areas in which new instrument designers and researchers are currently working. It is not, in that sense, a survey of new instruments and controllers but more a survey of thoughts and knowledge about them. The third and last section studies the dynamic relation that builds between the player and the instrument, introducing concepts such as efficiency, apprenticeship and learning curve. It explores some generic properties of music instruments, such as the diversity, variability or reproducibility of their musical output and the linearity or non-linearity of their behavior, and tries to figure out how these aspects can bias the relation between the instrument and its player, and how they may relate to more commonly studied and (ab)used concepts such as expressivity or virtuosity. The aim of this paper is the foundation of a theoretical framework in which the possibilities and the diversity of music instruments as well as the possibilities and the expressive freedom of human music performers could all be evaluated; a framework that could help to figure out what the essential needs for different types of musicians - from the absolute novice to the professional or the virtuoso - may be.
Conference Paper
This paper presents the results of user interaction with two explorative music environments (sound systems A and B) that were inspired by the Banda Linda music tradition in two different ways. The sound systems adapted to how a team of two players improvised and made a melody together in an interleaved fashion: systems A and B used a fuzzy logic algorithm and pattern recognition to respond with modifications of a background rhythm. In an experiment with a pen tablet interface as the music instrument, users aged 10-13 were to tap tones and continue each other's melody. The sound systems rewarded users sonically if they managed to add tones to their mutual melody in a rapid turn-taking manner with rhythmical patterns. Videos of experiment sessions show that user teams contributed to a melody in ways that resemble conversation. Interaction data show that each sound system made player teams play in different ways, but players in general had a hard time adjusting to a non-Western music tradition. The paper concludes with a comparison and evaluation of the two sound systems. Finally, it proposes a new approach to the design of collaborative and shared music environments that is based on "listening applications".
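The rapid turn-taking reward could be expressed as a predicate over tap events. The alternation and timing criteria below are assumptions for illustration, not the systems' actual fuzzy-logic rules:

```python
def turn_taking_reward(events, max_gap=1.0):
    """Hypothetical sketch of the sonic-reward rule: a list of
    (timestamp, player_id) tap events earns the reward only if the
    players strictly alternate and each response arrives within
    max_gap seconds of the previous tap."""
    if len(events) < 2:
        return False
    for (t_prev, p_prev), (t_cur, p_cur) in zip(events, events[1:]):
        if p_cur == p_prev:           # same player twice: no turn-taking
            return False
        if t_cur - t_prev > max_gap:  # response arrived too slowly
            return False
    return True
```

A real implementation would additionally score the rhythmical fit of each tap, but the alternation-plus-timing core is the part the abstract makes explicit.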
Conference Paper
This paper introduces Simpletones, an interactive sound system that enables a sense of musical collaboration for non-musicians. Participants can easily create simple sound compositions in real time by collaboratively operating physical artifacts as sound controllers. The physical configuration of the artifacts requires coordinated actions between participants to control sound (thus requiring, and emphasizing, collaboration). Simpletones encourages playful human-to-human interaction by introducing a simple interface and a set of basic rules [1]. This enables novices to focus on the collaborative aspects of making music as a group (such as synchronization and taking collective decisions through non-verbal communication) to ultimately engage a state of group flow [2]. This project is relevant to a contemporary discourse on musical expression because it allows novices to experience the social aspects of group music making, something that is usually reserved only for trained performers [3].
Conference Paper
Most instruments traditionally used to teach music in early education, like xylophones or flutes, encumber children with the additional difficulty of an unfamiliar and unnatural interface. The simplest expressive interaction that even the smallest children use to make music is pounding on surfaces. Through the design of an instrument with a simple interface, like a drum, but which produces a melodic sound, children can be provided with an easy and intuitive means to produce consonance. This should be further complemented with information from analysis and interpretation of childlike gestures and dance moves, reflecting their natural understanding of musical structure and motion. Based on these assumptions, we propose a modular and reactive system for dynamic composition with accessible interfaces, divided into distinct plugins usable in a standard digital audio workstation. This paper describes our concept and how it can facilitate access to collaborative music making for small children. A first prototypical implementation was designed and developed during the ongoing research project Drum-Dance-Music-Machine (DDMM), a cooperation with the local social welfare association AWO Hagen and the chair of musical education at the University of Applied Sciences Bielefeld.
Chapter
Rebecca Fiebrink is a Senior Lecturer at Goldsmiths, University of London, where she designs new ways for humans to interact with computers in creative practice. As a computer scientist and musician, much of her work focuses on applications of machine learning to music, addressing research questions such as: ‘How can machine learning algorithms help people to create new musical instruments and interactions?’ and ‘How does machine learning change the type of musical systems that can be created, the creative relationships between people and technology, and the set of people who can create new technologies?’ Much of Fiebrink’s work is also driven by a belief in the importance of inclusion, participation, and accessibility. She frequently uses participatory design processes, and she is currently involved in creating new accessible technologies with people with disabilities, designing inclusive machine learning curricula and tools, and applying participatory design methodologies in the digital humanities. Fiebrink is the developer of the Wekinator: open-source software for real-time interactive machine learning, whose current version has been downloaded over 10,000 times. She is the creator of a MOOC titled “Machine Learning for Artists and Musicians.” She was previously an Assistant Professor at Princeton University, where she co-directed the Princeton Laptop Orchestra. She has worked with companies including Microsoft Research, Sun Microsystems Research Labs, Imagine Research, and Smule. She has performed with a variety of musical ensembles playing flute, keyboard, and laptop. She holds a Ph.D. in Computer Science from Princeton University.
Chapter
This chapter explores different perspectives on the role of musical tools in musical interactions, with a particular focus on entanglements of agency. These perspectives can run the full gamut from musicians claiming to be “played by” their instruments and essentially at the mercy of the inner workings of the instruments, to musicians feeling as though the instrument is transparent, and that their inner impulses are communicated as sounds with no resistance from the instrument. Viewpoints are presented from contemporary musical practices and from instrument designers and makers, and are connected with wider theoretical accounts of agency in technology. These discussions are then brought back to the context of the design and development of digital musical instruments, and to human-computer interaction more broadly, reflecting on the relationships between designers and their technologies, and on how the design and development process can be viewed as nested inside its own chain of technological and social influences.
Conference Paper
While instrumental ensemble playing can benefit children's music education and collaboration skill development, it requires extensive training on music and instruments, which many school children lack. To help children with limited music training experience instrumental ensemble playing, we created EnseWing, an interactive system that offers such an experience. In this paper, we report the design of the EnseWing experience and a two-month field study. Our results show that EnseWing preserves the music and ensemble skills from traditional instrumental ensemble and provides more collaboration opportunities for children.
Conference Paper
MIDI Motion Gloves is an interactive wearable that is used to manipulate, organize, and construct audio patterns for the purpose of composing music. Acting as an interface between the user and a Digital Audio Workstation (DAW), the gloves give a user complete control over customization of sounds, effects, tempo, recording, looping, and other musical elements. MIDI Motion Gloves utilize organic human movement like finger bending, hand rotation, and finger tapping to trigger audio clips or add effects and filters, ultimately allowing the user to efficiently compose and visualize complex musical patterns in a three-dimensional space.
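The paper does not specify its gesture-to-MIDI mapping; as a hedged sketch of the general idea, a normalized finger-bend reading from such a glove could be turned into MIDI note messages with a simple threshold crossing. The function name, threshold value, and 0-1 normalization are illustrative assumptions; only the MIDI note-on/note-off status bytes are standard.

```python
NOTE_ON, NOTE_OFF = 0x90, 0x80  # standard MIDI status bytes

def finger_to_midi(bend, prev_bend, note, threshold=0.6, channel=0):
    """Map a normalized finger-bend reading (0.0 straight .. 1.0 fully
    bent) to a MIDI message on a threshold crossing, else None."""
    if prev_bend < threshold <= bend:    # finger just bent: trigger note
        return (NOTE_ON | channel, note, 100)
    if prev_bend >= threshold > bend:    # finger released: stop note
        return (NOTE_OFF | channel, note, 0)
    return None                          # no crossing: no message
```

Comparing consecutive sensor readings, rather than the raw value, is what makes the trigger fire once per gesture instead of continuously while the finger stays bent.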
Conference Paper
In this paper a new approach to music-making for people with disabilities is discussed. Until recently, the technology to enable people with disabilities to make music has been relatively limited, consisting primarily of mechanical approaches. With new developments in computing, including the Microsoft Kinect, touchless sensors are providing a new way for people with disabilities to interface with instruments in novel ways. Few papers have made empirical measurements of adaptive musical instruments, including measurements of latency. This paper fills this gap by detailing an adaptive musical interface using the Microsoft Kinect. The overall latency, including the response time of the musician, is then measured, and methods to decrease this latency are proposed.
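The abstract does not detail the measurement procedure; assuming one can log gesture timestamps and the corresponding audio onset times, a minimal latency summary could look like the sketch below. The aligned-list pairing and function name are assumptions of this sketch, not the paper's method.

```python
def summarize_latency(gesture_ts, audio_ts):
    """Given aligned lists of gesture times and the corresponding audio
    onset times (seconds), report per-event latency summarized in ms."""
    latencies = [(a - g) * 1000.0 for g, a in zip(gesture_ts, audio_ts)]
    return {"mean_ms": sum(latencies) / len(latencies),
            "max_ms": max(latencies)}
```

Reporting the worst case alongside the mean matters for instruments, since occasional large delays are more disruptive to performers than a slightly higher but steady average.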
Article
The Renaissance genre of organological treatises inventoried the forms and functions of musical instruments. This article proposes an update and expansion of the organological tradition, examining the discourses and practices surrounding both musical and scientific instruments. Drawing on examples from many periods and genres, we aim to capture instruments' diverse ways of life. To that end we propose and describe a comparative "ethics of instruments": an analysis of instruments' material configurations, social and institutional locations, degrees of freedom, and teleologies. This perspective makes it possible to trace the intersecting and at times divergent histories of science and music: their shared material practices, aesthetic commitments, and attitudes toward technology, as well as their impact on understandings of human agency and the order of nature.
Article
We present the Collective Sound Checks, an exploration of user scenarios based on mobile web applications featuring motion-controlled sound that enable groups of people to engage in spontaneous collaborative sound and music performances. These new forms of musical expression strongly shift the focus of design from human-computer interactions towards the emergence of computer mediated interactions between players based on sonic and social affordances of ubiquitous technologies. At this early stage, our work focuses on experimenting with different user scenarios while observing the relationships between different interactions and affordances.
Article
EmotiSphere is an interactive sensor-based musical instrument that generates music based on a user's current emotional state. Interactions with EmotiSphere draw upon everyday interactions with physical spherical objects, as well as on familiar interactions with music players. EmotiSphere offers a novel way to understand the relationship between emotion and music, and is aimed at people who want to create music and express themselves but do not necessarily possess skills in music composition. We describe the conceptualization and context of EmotiSphere, as well as its technical implementation.
Article
SimpleTones is an interactive sound system that enables non-musicians to engage in collaborative acts of music making. The aim of SimpleTones is to make collaborative musical experiences more approachable and accessible for a wide range of users of different levels of musical expertise. This allows them to actively participate in the social aspects of collective musical improvisation, something usually confined to trained performers. Players can participate with ease and in real time by operating physical sound controllers in tandem. By using play as a catalyst and setting novices free from the requirement of previous musical experience, participants are able to focus on the collaborative aspects of performance, such as synchronizing movements, discovering the system's functionality together and making collective decisions.
Conference Paper
We investigate the effects of adding structure to musical interactions for novices. A simple instrument allows control of three musical parameters: pitch, timbre, and note density. Two users can play at once, and their actions are visible on a public display. We asked pairs of users to perform duets under two interaction conditions: unstructured, where users are free to play what they like, and structured, where users are directed to different areas of the musical parameter space by time-varying constraints indicated on the display. A control group played two duets without structure, while an experimental group played one duet with structure and a second without. By crowd-sourcing the ranking of recorded duets, we find that structure leads to musically better results. A post-experiment survey showed that the experimental group had a better experience during the second, unstructured duet than during the structured one.
Conference Paper
This paper presents new touch-screen collaborative interaction models for people with dementia. The authors argue that dementia technology has yet to focus on group musical interactions. The project aims to contribute to dementia care while addressing a significant gap in current literature. Research includes observations and two system trials exploring contrasting musical scenarios: the performance of abstract electronic music and the distributed performance of J. S. Bach's Goldberg Variations. Findings presented in this paper suggest that people with dementia are able to successfully perform and engage in collaborative music performance activities with little or no scaffolded instruction.
Conference Paper
Creativity is a crucial skill in today's knowledge economy, and creativity support tools are a valuable and important area of human-computer interaction research. Passive affective priming has been shown to be an effective means for enhancing creativity. Based on music cognition research, we propose music improvisation as an active and participatory cognitive prime for boosting creative ability. To make music improvisation accessible to all individuals regardless of music expertise, we present a novel instrument with an adaptive interface that facilitates music creation. We demonstrate that improvising music with our adaptive instrument provides an immediate boost to creative ability, even for people without musical training. We quantify the efficacy of interface adaptation techniques for enabling creative expression.
Conference Paper
This paper describes the development of a prototype of a sonic toy for preschool children. The device, which is a modified version of a football ratchet, is based on the spinning gesture and allows children to experience four different types of auditory feedback. These algorithms let a child play with musical rhythm, generate continuous sound feedback and control the pitch of a piece of music. An evaluation test of the device was performed with fourteen children in a kindergarten. Results and observations showed that the children preferred the algorithms based on the exploration of musical rhythm and on pitch shifting.
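The pitch-control algorithm is not specified in the abstract; one plausible sketch maps the ratchet's spin rate to a pitch shift. The base rate, semitone range, and the logarithmic mapping below are all assumptions for illustration, not taken from the paper.

```python
import math

def spin_to_pitch(spin_hz, base_hz=0.5, semitone_range=12):
    """Map ratchet spin rate (revolutions/s) to a playback pitch shift
    in semitones: spinning at base_hz plays at original pitch, faster
    spinning raises it, slower lowers it, clamped to +/- one octave."""
    ratio = max(spin_hz, 0.01) / base_hz          # avoid log of zero
    semitones = 12 * math.log2(ratio)             # doubling = +1 octave
    return max(-semitone_range, min(semitone_range, semitones))
```

A logarithmic mapping keeps equal changes in spinning effort sounding like equal musical intervals, which tends to feel more natural than a linear frequency mapping.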
Conference Paper
Many performers of novel musical instruments find it difficult to engage audiences beyond those in the field. Previous research points to a failure to balance complexity with usability, and a loss of transparency due to the detachment of the controller and sound generator. The issue is often exacerbated by an audience’s lack of prior exposure to the instrument and its workings. However, we argue that there is a conflict underlying many novel musical instruments in that they are intended to be both a tool for creative expression and a creative work of art in themselves, resulting in incompatible requirements. By considering the instrument, the composition and the performance together as a whole with careful consideration of the rate of learning demanded of the audience, we propose that a lack of transparency can become an asset rather than a hindrance. Our approach calls for not only controller and sound generator to be designed in sympathy with each other, but composition, performance and physical form too. Identifying three design principles, we illustrate this approach with the Serendiptichord, a wearable instrument for dancers created by the authors.
Article
We explore the context and design of collaborative musical experiences for novices. We first argue that musical expression with multi-person instruments is a form of communication between the players. We illustrate that design for musical collaboration facilitates exploration of sound space with low entry-level skill. In contrast to the western post-Renaissance focus on musical expression through virtuosity, collaborative musical experiences enable the media of sound and music to enhance the communication opportunities and intimacy between players. The main factor common to most of the interfaces discussed herein is that musical control is highly restricted, which makes it possible for novices to easily learn and participate in the collective experience. This happens at the expense of providing an upward path to virtuosity with the interface. Balancing this tradeoff is a key concern for designers. We look closely at many contemporary collaborative interface designs, with each exploring a different way to achieve successful musical experiences for novice players.
Article
Input devices for musical expression were evaluated by drawing parallels to existing research in the field of human-computer interaction (HCI). The application of this knowledge to the development of interfaces for musical expression was discussed. A set of musical tasks was discussed to allow the evaluation of existing input devices. The evaluation methodology was found useful for designers, composers and performers.
Article
This paper reviews a number of projects that explore building electronic musical things, interfaces and objects designed to be used and enjoyed by anybody but in particular those who do not see themselves as naturally musical. On reflecting on the strengths of these projects, interesting directions for similar work in the future are considered.
Article
This article addresses Guitar Hero and Rock Band gameplay as a developing form of collaborative, participatory rock music performance. Drawing on ethnomusicology, performance studies, popular music studies, gender and sexuality studies, and interdisciplinary digital media scholarship, I investigate the games' models of rock heroism, media debates about their impact, and players' ideas about genuine musicality, rock authenticity, and gendered performance conventions. Grounded in ethnographic research—including interviews, a Web-based qualitative survey, and media reception analysis—this article enhances our understanding of performance at the intersection of the “virtual” and the “real,” while also documenting the changing nature of amateur musicianship in an increasingly technologically mediated world.
Article
We describe the prevailing model of musical expression, which assumes a binary formulation of "the text" and "the act," along with its implied roles of composer and performer. We argue that this model not only excludes some contemporary aesthetic values but also limits the communicative ability of new music interfaces. As an alternative, an ecology of musical creation accounts for both a diversity of aesthetic goals and the complex interrelation of human and non-human agents. An ecological perspective on several approaches to musical creation with interactive technologies reveals an expanded, more inclusive view of artistic interaction that facilitates novel, compelling ways to use technology for music. This paper is fundamentally a call to consider the role of aesthetic values in the analysis of artistic processes and technologies.