Design Considerations for Instruments for Users with
Complex Needs in SEN Settings
Asha Ward
Centre for Digital Entertainment
Bournemouth University
Fern Barrow, Poole, England
ablatherwick@bournemouth.ac.uk
Luke Woodbury
Three Ways School
180 Frome Road
Bath, England
luke.woodbury@icloud.com
Tom Davis
Bournemouth University
Fern Barrow, Poole, England
tdavis@bournemouth.ac.uk
ABSTRACT
Music technology can provide unique opportunities to allow access
to music making for those with complex needs in special educational
needs (SEN) settings. Whilst there is a growing trend of research in
this area, technology has been shown to face a variety of issues
leading to underuse in this context. This paper reviews issues raised
in literature and in practice for the use of music technology in SEN
settings. The paper then reviews existing principles and frameworks
for designing digital musical instruments (DMIs). The reviews of
literature and current frameworks are then used to inform a set of
design considerations for instruments for users with complex needs,
and in SEN settings. 18 design considerations are presented with
connections to literature and practice. An implementation example including future work is presented, and a conclusion is then offered.
Author Keywords
assistive technologies, adapted technology, music technology, novel
interfaces, special education, action research, modular systems
ACM Classification
• Human-centered computing~HCI theory, concepts and models •
Human-centered computing~Interface design prototyping • Applied
computing~Sound and music computing • Hardware~Sound-based
input / output • Hardware~Haptic devices
1. INTRODUCTION
Music technology can provide unique opportunities to access music
making for those with complex needs. The term complex needs can
refer to a spectrum of cognitive, physical, or sensory impairments and/or
disabilities, or emotional and behavioral difficulties. There is a
growing trend of research [1] and creation of tools ([2], [3]) showing that these technologies can be invaluable in facilitating interaction with sound for those who face barriers when accessing traditional musical instruments. Literature has shown there is not a strong uptake of music technology in the special educational needs (SEN) setting ([4], [5]) and several issues have been flagged as contributing to this lack of use. These include the financial cost of music technology, the space needed to store and set it up, and a fear, dislike or indifference to technology [6] leading to "a potential lack of confidence with putting technology into practice" [7]. Musical output can also be seen as uninspiring, artificial and lacking expression [8], or as impersonal and lacking sophistication [7]. Technology can be seen as a barrier when coupled with a lack of formal training and exposure [5] and the perceived need for insider knowledge when in use [7].
However, when technology is used successfully it can provide
unparalleled access to musical expression for those with complex
needs [1]. Creating technology for use within this setting can be
difficult and this paper attempts to assist designers with issues
specific to users with complex needs and the special educational
needs (SEN) setting. 15 years ago, Perry Cook created his principles
for designing computer music controllers [9] and these principles
have provided a guiding light to many DMI designers since. The
following paper was inspired by Cook’s 2001 paper and introduces
some novel principles of its own for the designing of instruments for
users with complex needs and in SEN settings. The paper begins by
laying down the context from which these principles have been
derived, moves into the background from which the principles are
formed, reviews literature pertinent to the design of digital musical
instruments and then outlines 18 design principles, with links to
literature and data gathered. An implementation example is then offered, followed by a conclusion featuring future considerations.
2. CONTEXT
Many children and young people (CYP) within SEN schools might
need additional help with “thinking and understanding, physical or
sensory difficulties, emotional and behavioral difficulties, difficulties
with speech and language, and also how to relate and behave with
other people” [10]. Individuals with severe complex needs can
experience "minimal movement, disordered movement, altered states of consciousness, and may have no verbal communication" [1].
Engaging in musical expression can be beneficial to those with
complex needs, providing an opportunity for expression to enhance
wellbeing [11], and opportunities for communication helping to
establish a sense of identity [12]. However, access to music making can be difficult: physically, in being able to interact with the tools provided; cognitively, in being able to understand and use traditional instruments or equipment designed for the typical user; and in the difficulties of playing together in an ensemble. Whilst there
are many instruments aimed at typically able users, there are few
specialist musical instruments available to assist users with complex
needs. Larsen et al [13] offer a recent review of such custom
technologies, the most notable being the Soundbeam developed in
the late 80s [2] and more recently the Skoog [3]. Music technology that is developed is often underused, a compounding factor being that music-based sessions are frequently facilitated by non-musical and/or non-technical practitioners. Hahna et al [4] surveyed music therapists about their feelings towards technology and found that distress, fear, lack of confidence, lack of interest, dislike, and the belief that music technology is not appropriate in clinical music therapy work, or not appropriate for particular clientele, were among the reasons given against technology usage. Some participants in the same survey stated that they thought music technology was intrusive in sessions. Farrimond et al [6] also back these findings in their report. Technology is often seen as hard to set up, hard to use, too expensive, hard to store, or not well tailored to the CYP using it; thus equipment is left to gather dust on the shelf or is only used by the designated "techie" person.
3. BACKGROUND
Elements of the design considerations presented here stem from
previous undergraduate research work where the first author
undertook a 9-month placement as a creative technologist alongside author Luke Woodbury. Woodbury has been the embedded interactive designer at Three Ways School in Bath for 7 years, a leading specialist school that is progressive about its use of technology. This paper also presents a continuation of this research as part of an engineering doctorate (EngD), with the overriding topic being music technology for users with complex needs. The research is in its 2nd year and is also a culmination of several years' experience within the field, accounts from several sources, and numerous exchanges between the authors and a variety of other professionals within the SEN field. This research has originated from
frustrations with current hardware and software systems used to
allow CYP access to music within an SEN setting and as such is
following an action research (AR) methodology. AR allows the
research to be participatory, with those using the technology being at
the centre of the research as co-researcher stakeholders. The aim of
AR is to try and create new solutions that can be left within the
context they are developed, and with co-researcher stakeholders
taking ownership and autonomously carrying forward the created
technology.
4. EXISTING FRAMEWORKS FOR DESIGN
Several existing frameworks and considerations for design have
informed this paper. Hunt et al’s [14] considerations for the
improvement of technological solutions suggest three areas needing
improvement to create enticing devices for users and those
facilitating their use: "audiovisual instrument design" - creating instruments for users with minimal movement with the same variety and feedback as acoustic instruments; "technical infrastructure refinement" - localising the sound via external amplification and having the ability to make new sound worlds comparable to conventional instruments by customising to the individual's needs; and "clinical practice integration" - having an open-ended flexible
toolkit that is inspiring but not frightening to users.
Jordà [15] provides guidance in creating instruments with longevity
that can draw users in, by looking at issues of balance (complexity vs
simplicity), playability, learning curve and instrument efficiency, and
how they come together to allow for a meaningful experience.
Figure 1 "Approximate learning curve for the kazoo, kalimba, piano, and violin over a period of 10 years" [15].
The seven heuristics as proposed by Wallis et al [16] describe the qualities of musical instruments that inspire long-term engagement: incrementality (learning curve to encourage flow), complexity (ceiling of expertise), immediacy (how accessible the instrument is), ownership (personal configurability to achieve one's own style), operational freedom (affordances offered by the interface to allow for interactive complexity), demonstrability (the ability to perform and share with others) and cooperation (the opportunity to play as an ensemble). Each
heuristic can be used to inform the design process of creating
new instruments within the SEN setting.
There are also elements presented here that connect with
Morreale et al’s [17] MINUET framework for musical interface
design grounded in the experience of the player. The
framework is guided by looking at how people, activities,
contexts and technologies combine, and uses a two-stage design
process consisting of goals and specifications to help designers
“position, shape, and evaluate their system” (p467).
Figure 2 - MINUET framework [17]
5. DESIGN CONSIDERATIONS
Presented below are 18 design considerations for instruments for
users with complex needs in SEN settings. The considerations were developed from the literature reviewed, practice-based work by the authors (directly and via reports of similar work), and continuing doctoral research by the first author. Much like Perry Cook's design principles [9], some are human/artistic and some are technical, or in different terms, some relate to the instrument, some to the user and others relate to the context of use. They begin with a focus on the design of the instrument itself, move out into the design of the system and then into designing for the context of use.
1. Consider each layer of the system – DMIs are commonly described as modular, three-part systems. Moog [18] identified "the sound generator, the interface between the musician and the sound generator, and the tactile and visual reality of the instrument that makes a musician feel good when using it" (p214); Pressing [19] the control interface, the processor, and the output; and Hunt et al [20] the interface, abstract, and synthesis mapping layers. Thinking of the
separate elements creates a modular system where each element can
be enhanced, replaced [6], adapted, modified, or automated
depending on the need of the musician. This enables a tailoring to an
individual’s specific needs and capabilities, both in terms of how they
can interact with the system (sensor inputs, gestural capability, or
other ways the individual can provide energy to the system), and
what the system provides back (feedback mechanism and also
content of that mechanism). Making interactions meaningful through the mapping between the player's control of the instrument and the sound produced is one of the most dominant issues in the creation of new musical interfaces [21], and each layer of a system allows for meaning to be added and for the system to provide support where needed in a flexible way.
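To illustrate this layered separation, a minimal sketch in Python is given below; the class names, note numbers and sensor behaviour are purely illustrative assumptions and do not describe any existing system.

```python
# A minimal sketch of the three-layer DMI structure described above.
# Class and method names are illustrative, not part of any existing system.

class SwitchInterface:
    """Interface layer: turns raw sensor energy into a normalised control value."""
    def read(self):
        # e.g. poll an accessibility switch or other sensor; returns 0.0-1.0
        return 1.0

class StepMapping:
    """Mapping layer: gives the control value musical meaning."""
    def __init__(self, notes):
        self.notes = notes
        self.index = 0

    def to_note(self, value):
        if value > 0.5:                      # switch pressed
            note = self.notes[self.index]
            self.index = (self.index + 1) % len(self.notes)
            return note
        return None

class PrintSynth:
    """Output layer: stands in for a synthesiser, sampler or MIDI port."""
    def play(self, note):
        if note is not None:
            print(f"play note {note}")

# Because the layers are separate, any one of them can be replaced or adapted
# (a different sensor, a richer mapping, a different sound engine or feedback
# mechanism) without touching the other two.
interface, mapping, synth = SwitchInterface(), StepMapping([60, 62, 64, 67]), PrintSynth()
synth.play(mapping.to_note(interface.read()))
```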
2. Decoupling the action and sound production – In DMIs
the excitation-sonification relationship is broken. This can lead to
opportunities but can also create problems. The dislocation of
excitation and sonification is exciting [22], in that any small
movement can be used to produce large sonic changes, but can also cause problems with cause and effect, as the dislocation of action and reaction can be an abstract concept for some users. Feedback is
often provided separately from where the excitation occurred and, if
not delivered in a way that can be accessed by the user, can render
gestures meaningless. According to stakeholders at Three Ways School, to mitigate this, feedback should be placed close to where the sound is created, either embedded in the instrument or via an amplifier, for example attached to the musician's seat to provide vibration [personal communication].
3. Expression vs Constraint – How much expression is offered
can affect how engaging the instrument is depending on the user.
“The one-for-one (mapping) scheme may be inspired by a wish on
the part of the instrument designer to make the instrument ‘easy to
play’, but it is a debatable point whether this simplicity is in fact a
desirable thing, or whether this results in an instrument lacking in
expressive capability" [23 p1023]. Mappings which are not one-to-one are more engaging for users [24]; however, "good musical instruments must strike the right balance between challenge, frustration and boredom" [15 p174]. Rich experiences tend not to
come from devices that are too simple, however devices that are too
complex can “alienate the user before their richness can be extracted”
[25] so there needs to be a balance between both elements that suit
the musician playing. Instruments such as the Skoog (a tactile
‘squishy’ controller based on physical modelling) offer virtuosic
control for musicians with high functioning cognitive ability and low
motor skill however may not be suited to an individual with low
cognitive ability and low motor skill. In a SEN setting expression vs
constraint are better expressed as scalability and configurability, used
to provide a system that suits the individual’s needs and is
empowering vs overpowering. Scalability and configurability can be
provided at the interface level by using flexible modular input
mechanisms, by dynamic interfaces that can be configured to the
user’s abilities to create potentially complex and expressive
musical gesture from simple inputs, and/or at the content level by
being able to map these inputs to meaningful content. There is an important balance to strike here, as teachers Kirsty Hafford and Ben Edwards say: 'opening up expression means it takes longer to get outcomes, and in an environment driven by outcomes things can get done for people, which can lead to an unsatisfactory learning experience' [personal communication]. Instruments should be able to
scale in content to suit the user’s ability and allow for improvement
over time. Making things configurable and scalable to the
individuals using them is paramount in this context as there is no
typical user.
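As a rough illustration of scalability at the mapping level, the sketch below contrasts a constrained one-to-one mapping with a more expressive one-to-many mapping of the same simple input; the parameter names and curves are assumptions for illustration only.

```python
# A sketch of scalable mapping: the same simple input can be mapped
# one-to-one (constrained) or one-to-many (more expressive), chosen to
# suit the individual. Parameter names and curves are illustrative.

def one_to_one(pressure):
    """Constrained: pressure only controls loudness."""
    return {"volume": pressure}

def one_to_many(pressure):
    """More expressive: one gesture shapes several sound parameters."""
    return {
        "volume": pressure,
        "brightness": pressure ** 2,                 # filter opens more sharply
        "vibrato_depth": max(0.0, pressure - 0.7),   # only appears at high effort
    }

# The facilitator (or, where possible, the musician) selects the mapping
# that suits the individual's cognitive and motor abilities.
mapping = one_to_many
print(mapping(0.8))
```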
4. Continuum of control – Johnston et al [26] identify three modes of interaction characterising the musician's approach to virtual instruments. Each offers a different level of control over the system:
Instrumental: where the musician prioritises detailed control,
ornamental: where the musician surrenders detailed control to allow
for the software to transform the sound, and conversational: a two-
way conversation between the musician and the virtual instrument
that shapes the musical direction the musician takes. In the SEN setting there needs to be more of a continuum of control. This continuum of computer control vs human control of the system can be used to scaffold the capabilities of the individual and provide support when needed whilst allowing maximum control of the instrument. For example, consider playing a melody: a switch (a very common assistive technology tool) could be used to scroll through the melody note by note, or a movement in and out of an ultrasonic beam, such as those featured on the Soundbeam [2], could provide the same potential, but the musician has to successfully select the right zone of the beam to break; both of these musicians are being supported to different degrees to achieve the same outcome. Systems can support those with different levels of need to play together.
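The melody example above might be sketched as follows; the melody, the zone spacing of the beam, and the function names are illustrative assumptions rather than a description of any particular product.

```python
# Two points on the continuum of control, both producing the same melody
# with different degrees of support.

MELODY = [60, 62, 64, 65, 67]   # MIDI note numbers

def switch_player():
    """High support: each switch press steps to the next correct note."""
    i = 0
    def press():
        nonlocal i
        note = MELODY[i % len(MELODY)]
        i += 1
        return note
    return press

def beam_player(distance_cm):
    """Lower support: the player must reach the right zone of an ultrasonic
    beam themselves; here each 20 cm band selects one note of the melody."""
    zone = min(int(distance_cm // 20), len(MELODY) - 1)
    return MELODY[zone]

press = switch_player()
print([press() for _ in range(5)])   # scaffolded: notes always arrive in order
print(beam_player(45))               # unscaffolded: depends on the gesture made
```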
5. Natural interaction (when I move you move) – This
principle relates to matching the gesture of excitation to the
sonification in a way that makes sense to the player: "a direct relationship is established between the physical gesture, the nature of the stimuli and the perceived outcome. The resulting awareness is multifaceted and has been at the core of musical performance for centuries" [22]. "The gesture used has to have an intuitive result from the sound; e.g. you can hit a snare drum in a multitude of ways and produce a variety of sounds and dynamics. The sound should genuinely express the nature of the movement in a 'symbiotic' relationship" [27], i.e. if you push harder the sound is louder: what a player might naturally expect from an interaction of that kind, where form and function link with the design of the instrument. Instruments
that mimic a natural interaction to traditional instruments (for
example using valve style buttons for recreating a trumpet valve) can
offer an experience close to the traditional instrument, giving a sense
of familiarity to the user as to what is expected from the interaction.
Another important add-on is the ability to stop all the sound. Hewitt
[27] suggests that “being able to make no sound without having to
withdraw from the motion-sensing field – like stopping a bow on a
cello string without lifting it up" is of high importance. Gesture-to-sonification mapping should be tailored to the individual; working with their range of movement or capability allows an interaction that is natural to that individual.
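A minimal sketch of the 'push harder, the sound is louder' relationship described above, assuming a force-sensing resistor as the input and MIDI velocity as the output; the value ranges and response curve are assumptions and would be re-tuned to an individual's range of movement.

```python
# Map a raw force-sensing resistor (FSR) reading to MIDI velocity so the
# sonic result matches what the gesture naturally implies: push harder, louder.

def fsr_to_velocity(fsr_value, fsr_max=1023, curve=0.6):
    """Map a raw FSR reading to MIDI velocity 1-127 with a gentle curve;
    fsr_max and curve can be re-tuned per individual."""
    norm = min(max(fsr_value / fsr_max, 0.0), 1.0)
    return max(1, int(127 * norm ** curve))

print(fsr_to_velocity(200), fsr_to_velocity(900))   # soft press vs hard press
```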
6. Form should inspire interaction – Acoustic instruments
are naturally pleasing to look at and feel. They are enjoyable artefacts
with history to them and are formed from natural materials. Tactile
materials with a shape, texture, feel, smell and feedback can draw
users in and stimulate all the senses. Instruments designed with new
materiality and form provide new opportunities to inspire interaction
and allow configuration of the instrument to suit the individual’s
preference and need, both in terms of look and feel. Some CYP may
be averse to touching certain textures and others may have favourite
colours and textures that can be used to encourage engagement. One
of the criticisms of the Skoog was that it was very child-like in
appearance, something that has been rectified with the Skoog 2.0 [3].
7. Robust/Durable/Stable – “Construction can never be solid
enough, especially when it is to be used by children” [28]. Designs
should be as robust as possible to ensure they have the durability to
cope with the context they will be used in. There is also a need for the
instruments and any accompanying software to be as stable as
possible. If there are malfunctions, then this can be discouraging for
the users and those around them and may lead to technologies being
abandoned.
8. Respect the feedback loop – Interaction between the person
and the instrument typically takes place through the aural and visual
feedback loop with the performer making decisions in real-time on
that basis [19]. For users with complex needs these channels of
feedback may be impaired, therefore feedback should be provided in
a way that make sense to the user allowing access and resonance with
the instrument. Within stakeholder meetings tactile/haptic and
vibration feedback were identified as important to reinforce cause
and effect. Light and visuals were also found to provide structure and
stimulate responses.
As well as the feedback from playing the instrument, there should be adequate feedback for navigating the instrument's configuration. To allow for navigation, feedback should be visual, audible, and/or tactile, allowing for scalability to physical, cognitive, and sensory ability [6].
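One possible shape for such multimodal feedback is sketched below, with a per-user profile choosing which channels are active; the channel functions are placeholders rather than real device drivers.

```python
# Dispatch one musical event to several feedback channels so that cause and
# effect is reinforced through whichever senses the user can access.

def audio(note):  print(f"sound: note {note}")
def light(note):  print(f"LED colour mapped from note {note}")
def haptic(note): print(f"vibration pulse for note {note}")

# e.g. a user with a hearing impairment might rely mainly on light and vibration
user_feedback_channels = [audio, light, haptic]

def on_note(note):
    for channel in user_feedback_channels:
        channel(note)

on_note(64)
```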
9. Make it meaningful to those involved – This means
creating technology that allows for the user to add their own
content/samples and give input for how the instrument works in a
customisable way, thus having some ownership over the instrument
design, and not only making it work based on individual needs in
terms of their cognitive/sensory/motor skills but also making it carry
meaning for them in terms of content. One of the criticisms of some previous DMIs specifically aimed at the SEN market is that their sound palettes are impersonal and lacking in sophistication [7]. Mike Whitlock suggests this can be negated by leaving the sound palette open, enabling users to add their own sounds that carry meaning for the individuals using them [personal communication].
10. If you can add a microphone, do it – Use of voice is very important in an SEN context. It can provide an avenue for exploring self-experience, communication and relational possibilities [29]. A microphone can provide access for those who cannot interact with a system in any other way. Stakeholders from the Three Ways School say that voice and voice manipulation are a good avenue for engagement for some CYP who would otherwise be unable to physically interact with a system; a microphone also allows sampled sounds from the environment to be input into the system [personal communication].
11. Think of sound quality – Make the sound quality high. The overriding use of the MIDI protocol and the General MIDI sound set in the past has left a lot to be desired, both in the types of sounds offered and in their inherent lack of expressive potential. The "lack of subtlety has meant that timbres can wear thin" [20]. Hewitt
[27] suggests that ideally there should be “an option to be polyphonic
– played with multiple movements simultaneously”. Whitlock also
suggests this is useful for building up rich sonic soundscapes by
layering triggered sounds [personal communication]. The quality of
onboard sound and the quality and option for outboard sound is
important as sound may be amplified through a PA system or via
amps or monitors or headphones. The ability to adjust sound levels to
suit the user is important as some CYP may be very sensitive to
sound and others may have hearing impairments. Localising the
sound by placing amps or monitors close to the player is common
practice within the school setting to reinforce cause and effect.
12. Facilitate choice / offer consistency – Instruments are, in the main, set up by the musician playing them; in the school context this is not the case. Rather, there is a tendency for those facilitating the musician or the session to choose the setup of the technology, both in terms of how gestures are captured and the musical output of the system. When decisions are made for people, this leads to two problems: relinquished choice of both interaction style and output received, and the potential for moving of the goalposts, or in other words, "programmability is a curse" [9]. Within the context of musicians with
complex needs there can be a tendency of involuntarily relinquished
choice meaning that things are often chosen for people instead of
with them. Enabling users to select for themselves, if they can, the
level and type of control they have should be paramount. Hunt et al
speak of the dangers of configurable instruments in that the
“goalposts are constantly being moved” [14 p364]. They say
traditional instruments do not change character from one session to the next, and musicians undergo a process of learning to configure their instrument. Changing goalposts can mean that some users never have the chance to get to grips with their instrument; this can be particularly damaging if their needs mean that predictability is a strong motivator. There could also be the danger of learned helplessness, with users not feeling like they have control over the system or feeling like it is their fault that the instrument is responding differently. Hunt et al suggest perhaps setting up an instrument with the same configuration for each particular situation [14]. This can be made more difficult if the particular situation changes often, as can be the case in the school setting with different locations and staff being used to facilitate sessions on a pragmatic basis. A built-in system to recall configurations, as sketched below, would help with this.
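One simple form such recall could take is sketched below as a JSON preset store keyed by situation; the file format and field names are assumptions for illustration, not a description of any existing product.

```python
# Store and recall named instrument configurations so that a known setup
# can be restored in any room, by any member of staff.

import json

def load_all(path):
    try:
        with open(path) as f:
            return json.load(f)
    except FileNotFoundError:
        return {}

def save_preset(path, name, config):
    presets = load_all(path)
    presets[name] = config
    with open(path, "w") as f:
        json.dump(presets, f, indent=2)

def recall_preset(path, name):
    return load_all(path).get(name)

save_preset("presets.json", "music-room-user1",
            {"input": "switch", "mapping": "step-melody", "volume": 0.6})
print(recall_preset("presets.json", "music-room-user1"))
```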
13. Participatory design – Teacher Kirsty Hafford says that creating with the user provides a more authentic picture; Woodbury adds that this is important to establish where the design should go and
highlights issues that may not be obvious to the digital musical
instrument designer [personal communication]. Only the users and
those who work closely with them will best know their needs in
terms of interacting with an instrument. Working in a participatory
way can allow for rapidly working out kinks and problems with any
designs. In our experience a designer cannot possibly guess at
how a user with an alternative thought process will respond to a
particular design which may have taken hours of work, so
participatory design also means a reduction in wasted time.
14. Small, cheap and easy to use – Barry Farrimond
describes the first instrument he designed for users with complex
needs and how it was only revealed to be big, expensive, and hard to
use upon its maiden voyage [30]. Typically, in a school there are limited space and budgets, both in terms of the time staff can spend training to use the technology and the money available to buy it [4]. Things that are off-the-shelf/affordable, easy to programme with minimal set-up, and compact are paramount [27, 6]. Expense and the need for insider knowledge lead
to tools being abandoned [7]. Plug-and-play is the ideal in terms of
allowing the system to work within the context as ease of use is
currently a barrier to technology usage. Gallin and Sirguy [31] give 6
points that impacted on the design process of their plug-and-play
system that can be useful to consider; “1) the technical side must be
transparent to the user; 2) the design is focused on the way the
interface will be used; 3) the accessible parameters are only “visible”
setting parameters; 4) it imposes a wide compatibility with existing
OS, softwares, MIDI devices and other hardware interfaces; 5) it
requires different levels of use: ready-to-use; internal parameter
access via the editor; and Max programming; 6) it requires
compatibility with other communication protocols (p437)”. These
points cover several important areas that allow these systems to work
in context and with other systems already in place whilst not
overwhelming those facilitating the use of the system. Once the
system is up and running technology can be adapted to the situation
by adapting equipment as needed in practice, and allowing
equipment to work alongside other equipment. The more familiarity
that can be provided as part of the design the better as then users
won’t be so fearful of using the technology, for example allow
switches that are already used in the school to be plugged into the
designed modular instrument. This allows for components to be
added as the user's familiarity with the system grows. To enhance ease of use, unnecessary complexity such as jargon, convolution, big manuals, and hard configuration should be avoided [6], and any terminology or language used should be familiar to the user. Designs should be easy to use physically (for example, jack sockets and connectors can be hard to pull apart), making sure the system is suitable for the amount of strength the user is capable of. Instruments
should also be able to be mounted, with standard mounting fixtures
and arms, to enable easy positioning.
15. Wires are not awesome – Instruments that are wireless enable easier sharing and cut down on health and safety issues; they also mean that there can be a distance between the computer at the centre of the sound processing and where the action is. The music therapy space is best kept clear of electrical leads, and this is especially important with users who are unable to reach a computer or whose equipment prevents them from easily accessing wired devices. Some equipment vital to some CYP, or their wheelchairs, does not easily travel over wires, and for others having the computer and its screen nearby can provide distractions. However, there is a danger of adding complexity to the system and opportunities for technical failure by making things wireless.
16. Think of the whole context – Designing a DMI in itself is
a challenge but when this design process is placed in a school setting
it can be even more challenging. There is a need to find out how best
to communicate with those involved and how to disseminate what
you are creating. The school environment may restrict what can be
done, with the time of day and year affecting the ability to access the
users. Very often instruments designed are not accessible or
configured by the target users directly but by those facilitating access.
There may be several practitioners involved in the use of the
technology; from music leaders to music therapists, to teachers and
teaching assistants all with various goals. Sessions could focus on
“education in music, education through music, music therapy, or
music as a leisure activity” [6 p11] with goals to play as an ensemble,
to feel a sense of intimacy with an instrument [21], to provide a
therapeutic or educational experience, or playing for fun. There is a need for user-friendly systems that take into account the requirements and attitudes of facilitators in order to be inviting to use [7]. Often DMIs developed for the school setting are taken away when the research finishes; "leaving something behind is preferable as is keeping the tech neutral with no brand, open source and widely available" [32]. If DMIs are taken away there is no opportunity to practise with the instrument, removing the chance to progress.
17. Providing educational context for use – One of the
larger problems, certainly for the uptake of technology within the
area of music therapy, is incorporating technology into practice [33,
5] and having confidence in doing so [7]. Cevasco and Hong [34] suggest providing examples of how to incorporate technology into practice, with better training on how to enhance music making with technology, to make it less daunting. Linking with requirements of the curriculum, learning outcomes, or other curriculum subject areas can also be useful in showing the spectrum of ways the technology can be put into practice.
Frameworks such as the Sounds of Intent [35] have made progress in
this area. This can create a context for use especially if linked into
teaching schemes. If teachers and facilitators cannot see how the
technology can be put into practice they may leave it on the shelf.
18. Tech, and do you even need it? – Technology should be unique to an individual's needs [32] and not just be used as "technology for technologies sake" [11 p151]. Bott identifies that a
key issue to consider when determining musical possibilities for
individual musicians is to try and distinguish between: a) Access
Needs, and b) Learning Needs [6]. For physical barriers, the
emphasis of provision should aim to maximize individual physical
abilities and for cognitive barriers, an emphasis on tools that adapt to
the individual’s cognitive level should be paramount. The technology
should primarily meet the creative preference of the musician [6].
Stakeholders from the Three Ways School say technology can also
be combined with acoustic instruments by using this interplay to
encourage motivation, interaction, and engagement [personal
communication].
6. IMPLEMENTATION EXAMPLE
Following is an example of the design considerations in use as
part of the EngD research leading to this paper. The research
has used participatory design (13) to lead to the development of
the modular accessible musical instrument (MAMI) system,
which is in its prototype stage. The system will be described
with numbered links to the design considerations which helped
inform its design. Consisting of both hardware and software
components, MAMI is aimed at providing a flexible way to
create accessible instruments (1). The software component
allows for connection of bespoke and existing technologies via
various communication protocols (serial, MIDI, OSC or Human
Interface) and for them to be mapped to musical parameters (9),
or connected through to existing software (11). So far, two bespoke instruments have also been developed to connect to the MAMI software: filterBox – a wireless handheld wooden box featuring an LDR, two buttons, and an FSR; and squishyDrum – a wireless circular wooden drum-style enclosure with an elasticated fabric 'skin' which can be played both by deforming the surface and by percussively hitting the enclosure (6). The development of these instruments is aimed at creating robust (7) wireless instruments (15) that inspire natural interaction (5), are affordable to make (14), and are easy to use for both the user and facilitators (16). Future work will look to: enable localised feedback (2, 8); create graphical user interfaces that allow ease of customisation in terms of sounds produced (12), expressivity offered (3), and level of user control (4); and potentially add on-board recording (10). The system will be tested within an educational context (17).
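As a rough sketch of this kind of routing, and not the MAMI implementation itself, the example below receives a normalised sensor value over OSC (using the python-osc package) and maps it to a note in a scale; the OSC address, port and scaling are assumptions for illustration.

```python
# Receive a sensor value over OSC and map it to a musical parameter.
# Requires the python-osc package; the address "/filterBox/fsr" is hypothetical.

from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

SCALE = [60, 62, 64, 65, 67, 69, 71, 72]      # one octave of C major (MIDI)

def on_sensor(address, value):
    # value is assumed to arrive normalised 0.0-1.0 from a bespoke controller
    note = SCALE[min(int(value * len(SCALE)), len(SCALE) - 1)]
    print(f"{address} -> note {note}")          # stand-in for a MIDI/synth call

dispatcher = Dispatcher()
dispatcher.map("/filterBox/fsr", on_sensor)

server = BlockingOSCUDPServer(("127.0.0.1", 9000), dispatcher)
server.serve_forever()
```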
7. CONCLUSION
This paper has offered 18 design considerations for instruments for users with complex needs in SEN settings, with the aim of aiding DMI designers in the creation of new instruments and systems. The considerations focus not only on the instruments
being designed, but also the system as a whole and the system
in relation to the context of use. An implementation example is
also given. When employed in practice these principles should
be adapted to their contexts of use to allow for technology to be
accepted and used, and to facilitate better access to music
making not just for those for whom technology provides an unparalleled access point to music, but also for those around them who facilitate the technology in use.
8. ACKNOWLEDGMENTS
Funding for the CDE was granted by the EPSRC (code: EP/L016540/1).
Thanks go to Three Ways School, Victoria Education Centre, and Dr
Ann Bevan, Bournemouth University, for providing supervisory
support of the research.
9. REFERENCES
[1] Magee, W. (ed.). Music Technology in Therapeutic and Health Settings. London: Jessica Kingsley Publishers, 2014.
[2] Soundbeam. Soundbeam. http://www.soundbeam.co.uk/ (Retrieved 16th March 2016), 2016.
[3] Skoogmusic Ltd. Meet the Skoog. http://www.skoogmusic.com/skoog (Retrieved 10th October 2015), 2016.
[4] Hahna, N. D., Hadley, S., Miller, V. H., and Bonaventura, M. Music technology usage in music therapy: A survey of practice. Arts in Psychotherapy, 39 (5), pp. 456–464, 2012.
[5] Magee, W. L. Electronic technologies in clinical music therapy: A survey of practice and attitudes. Technology and Disability, 18, pp. 139–146, 2006.
[6] Farrimond, B., Gillard, D., Bott, D., and Lonie, D. Engagement with Technology in Special Educational & Disabled Music Settings. Youth Music, (December), 1–40, 2011.
[7] Streeter, E. Reactions and Responses from the Music Therapy Community to the Growth of Computers and Technology - Some Preliminary Thoughts. https://voices.no/index.php/voices/article/view/467/376 (Retrieved 10th March 2015), 2007.
[8] Misje, R. Music Technology in Music Therapy. Masters thesis, 2013.
[9] Cook, P. R. Principles for designing computer music controllers. Proceedings of the International Conference on New Interfaces for Musical Expression, 1–4, 2001.
[10] Scope. Special Educational Needs. Scope About disability, Scope. Available from: http://www.scope.org.uk/support/families/education-sen?gclid=CjwKEAiAm8nCBRD7xLj-2aWFyz8SJAAQNalaVETn-Osg3KB2HTThZ__cQZ7Y4jqXCcWBP-pLpJkOmBoCptbw_wcB [Accessed 4 January 2017], 2017.
[11] Magee, W. L. Music Technology for Health and Well-Being: The Bridge Between the Arts and Science. Music and Medicine, 3, 131–133, 2011.
[12] MacDonald, R. A. R. and Miell, D. Music for individuals with special needs: a catalyst for developments in identity, communication and musical ability. [online]. Available from: http://www.oup.com/uk/catalogue/?ci=9780198509325, 2002.
[13] Larsen, J. V., Overholt, D., and Moeslund, T. B. The Prospects of Musical Instruments For People with Physical Disabilities. In: Proceedings of the International Conference on New Interfaces for Musical Expression, 327–331, 2016.
[14] Hunt, A., Kirk, R., Abbotson, M., and Abbotson, R. Music therapy and electronic technology. Conference Proceedings of the EUROMICRO, 2, 362–367, 2000.
[15] Jordà, S. Digital Lutherie: Crafting musical computers for new musics' performance and improvisation. Departament de Tecnologia [online], 26 (3), 531. Available from: http://dialnet.unirioja.es/servlet/tesis?codigo=19509, 2005.
[16] Wallis, I., Ingalls, T., Campana, E., and Vuong, C. Amateur Musicians, Long-Term Engagement, and HCI. In: Holland, S., Wilkie, K., Mulholland, P., and Seago, A., eds. Music and Human-Computer Interaction. London: Springer, 49–66, 2013.
[17] Morreale, F., De Angeli, A., and O'Modhrain, S. Musical Interface Design: An Experience-oriented Framework. Proceedings of the International Conference on New Interfaces for Musical Expression [online], 467–472. Available from: http://www.nime.org/proceedings/2014/nime2014_437.pdf, 2014.
[18] Moog, R. The musician: alive and well in the world of electronics. In: The Biology of Music Making: Proceedings of the 1984 Denver Conference. St Louis: MMB Music, Inc., 214–220, 1988.
[19] Pressing, J. Cybernetic Issues in Interactive Performance Systems. Computer Music Journal, 14 (1), 12–25, 1990.
[20] Hunt, A., Kirk, R., and Neighbour, M. Multiple media interfaces for therapy. IEEE Multimedia, 11, pp. 50–58, 2004.
[21] Fels, S. Designing for intimacy: Creating new interfaces for musical expression. Proceedings of the IEEE, 92 (4), pp. 672–685, 2004.
[22] Paine, G. Towards Unified Design Guidelines for New Interfaces for Musical Expression. Organised Sound, 14 (2), 142, 2009.
[23] Kirk, R., Hunt, A., Hildred, M., Neighbour, M., and North, F. Electronic Musical Instruments - a role in Music Therapy? In: Fachner, J. and Aldridge, D. (eds), Music Therapy in the 21st Century: Contemporary Force for Change. MusicTherapyWorld.Net, Oxford, 2002.
[24] Hunt, A. and Kirk, R. Mapping strategies for musical performance. Trends in Gestural Control of Music [online], 231–258. Available from: http://www.music.mcgill.ca/~mwanderley/MUMT-615/Papers/Class10/P.HunKir.pdf, 2000.
[25] Jordà, S. Digital Instruments and Players: Part I – Efficiency and Apprenticeship. In: Nagashima, Y., Ito, Y., and Furuta, Y., eds. Proceedings of the International Conference on New Interfaces for Musical Expression [online]. Hamamatsu, Japan, 59–63. Available from: http://www.nime.org/proceedings/2004/nime2004_059.pdf, 2004.
[26] Johnston, A., Candy, L., and Edmonds, E. Designing and evaluating virtual musical instruments: facilitating conversational user interaction. Design Studies, 29 (6), 556–571, 2008.
[27] Hewitt, G. Hackday Accessible Music. drakemusic.org, Drake Music. Available from: http://www.drakemusic.org/blog/gawain-hewitt/hackday-accessible-music [Accessed 22 December 2014], 2014.
[28] Jensenius, A. R. and Voldsund, A. The Music Ball Project: Concept, Design, Development, Performance. Proceedings of the International Conference on New Interfaces for Musical Expression, 300–303, 2012.
[29] Andersson, A.-P. and Cappelen, B. Designing Empowering Vocal and Tangible Interaction. Proceedings of the International Conference on New Interfaces for Musical Expression [online], 406–412. Available from: http://nime2013.kaist.ac.kr/, 2013.
[30] Farrimond, B. The Clarion - an accessible musical instrument 8 years in the making. Youth Music Network, Youth Music. Available from: http://network.youthmusic.org.uk/posts/clarion-accessible-musical-instrument-8-years-making [Accessed 5th January 2017], 2016.
[31] Gallin, E. and Sirguy, M. Eobody3: A ready-to-use pre-mapped & multi-protocol sensor interface. Proceedings of the International Conference on New Interfaces for Musical Expression [online], (June), 437–440. Available from: http://www.nime2011.org/proceedings/papers/M06-Gallin.pdf, 2011.
[32] Nagler, J. C. Music Therapy Methods With Hand-Held Music Devices in Contemporary Clinical Practice: A Commentary. Music and Medicine, 3, pp. 196–199, 2011.
[33] Crowe, B. J. and Rio, R. Implications of technology in music therapy practice and research for music therapy education: a review of literature. Journal of Music Therapy, 41 (4), pp. 282–320, 2004.
[34] Cevasco, A. and Hong, A. Utilizing Technology in Clinical Practice: A Comparison of Board-Certified Music Therapists and Music Therapy Students. Music Therapy Perspectives, 29 (1), pp. 65–73, 2011.
[35] Welch, G. F., Ockelford, A., Carter, F.-C., Zimmermann, S.-A., and Himonides, E. 'Sounds of Intent': Mapping Musical Development in Children and Young People with Complex Needs. Psychology of Music, 37 (3), 348–370, 2009.
10. Appendix 1
Stakeholders:
Three Ways school: Interactive designer Luke Woodbury, teachers
Kirsty Hafford and Ben Edwards, music therapist Adrian Snell.
Victoria Education Centre: Music technologist Mike Whitlock.
Website for ongoing research: http://www.dotlib.org/