
EAVI EMG board

Authors:
Balandino Di Donato, Atau Tanaka and Michael Zbyszyński
Embodied Audiovisual Interaction Group
Goldsmiths, University of London
SE14 6NW, London, UK
[b.didonato, a.tanaka, m.zbyszynski]@gold.ac.uk
Martin Klang
Rebel Technology
London, UK
mars@pingdynasty.com
1. DEMO
Electromyography (EMG) has been widely adopted by the community to build new interfaces for musical expression [10, 4]. Muscular activity is inherently noisy, making EMG signals potentially difficult to map to audio parameters and to work with when designing interactions with audiovisual systems. For decades, musicians and technologists have explored different solutions – from costly medical devices to do-it-yourself (DIY) packages – to find reliable hardware that captures the best possible EMG signal and thus facilitates the music and instrument making process.
In 2014 Thalmic Labs released the Myo, a wireless 8-channel EMG armband with a built-in inertial measurement unit (IMU) designed specifically for multimodal human-computer interactions. This device, together with custom software developed by the community [3, 2, 6, 7], has allowed the NIME community to easily take advantage of EMG technology in a variety of applications involving interactive audiovisual control [8, 5, 1, 9]. Unfortunately, the Myo armband was discontinued in October 2018 and is no longer available on the market. Thus, the community is once again facing the problem of generating its own EMG solutions. For this reason, we decided to build the EAVI EMG board (Figure 1).
Figure 1: Board prototype worn on right forearm.
The EAVI EMG board features six EMG channels and a 3-axis accelerometer. Dry electrodes are attached to a Plux SnapBit Trio (https://store.plux.info/breakout-boards/220-snapbit-trio.html) with a Plux EMG sensor (https://store.plux.info/bitalino-sensors/8-electromyography-emg-sensor.html) seated on top of it (Figure 2) and housed in a custom case. Each sensor is connected to the board via micro-USB. In contrast to the Myo armband, which
could be worn on the forearm only, our solution enables the
positioning of the electrodes on any part of the body through
our custom electrode case (Figure 3) which supports the use of
bands of any length and material. The accelerometer is on the
main circuit board.
Figure 2: Plux’s Snap Bit Trio with soldered EMG
module.
Figure 3: Snap Bit Custom Case.
The EAVI EMG board captures gestural data at a sample rate of 16 kHz with a resolution of 20 bits, and streams it to a computer via USB and Bluetooth Low Energy (BLE).
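As an illustration of how one might access this stream on the host side, the minimal Python sketch below reads raw frames from the board over USB serial. The board's framing protocol is not documented in this demo, so the port name, baud rate, and the 3-byte packing of each 20-bit sample are assumptions for illustration only.

```python
# Hypothetical sketch: reading raw bytes from the EAVI EMG board over USB
# serial. The port name, baud rate, and sample packing below are
# assumptions; the actual wire protocol is not documented here.
import serial  # pyserial

PORT = "/dev/ttyACM0"   # assumption: board enumerates as a CDC serial device
BAUD = 115200           # assumption: placeholder baud rate

def read_frames(n_frames: int):
    """Yield frames of 6 EMG channels, assuming each 20-bit sample
    is packed left-justified into 3 big-endian bytes."""
    frame_size = 6 * 3  # 6 channels x 3 bytes per sample
    with serial.Serial(PORT, BAUD, timeout=1.0) as ser:
        for _ in range(n_frames):
            raw = ser.read(frame_size)
            if len(raw) < frame_size:
                break  # timeout: no complete frame available
            frame = []
            for ch in range(6):
                chunk = raw[ch * 3:(ch + 1) * 3]
                value = int.from_bytes(chunk, "big") >> 4  # keep top 20 bits
                frame.append(value)
            yield frame

for frame in read_frames(10):
    print(frame)
```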
As can be observed in Figure 4, at the current stage the board suffers from a low signal-to-noise ratio. However, the board is under active development, and we aim to present the latest improvements in this work to the NIME community.
Figure 4: EMG signal.
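Because raw EMG is noisy, a common first processing step before mapping it to audio parameters is to rectify the signal and compute a moving RMS envelope. The sketch below is a generic illustration of this technique, not the board's own processing chain; the 16 kHz rate comes from the text above, while the 50 ms window length is an assumption.

```python
# Generic EMG conditioning sketch: remove DC offset, square, and take a
# moving RMS envelope. Window length is an assumed 50 ms.
import numpy as np

FS = 16_000                   # sample rate from the text (Hz)
WINDOW = int(0.05 * FS)       # assumed 50 ms analysis window

def rms_envelope(emg: np.ndarray) -> np.ndarray:
    """Moving RMS of a 1-D EMG signal."""
    centred = emg - np.mean(emg)           # remove DC offset
    squared = centred ** 2
    kernel = np.ones(WINDOW) / WINDOW      # moving-average kernel
    return np.sqrt(np.convolve(squared, kernel, mode="same"))

# Example on synthetic data: one second of noise with a "muscle burst".
t = np.arange(FS) / FS
burst = (t > 0.4) & (t < 0.6)
emg = np.random.randn(FS) * (0.1 + 0.9 * burst)
env = rms_envelope(emg)
print(env.max())  # the envelope peaks during the burst
```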
We will demonstrate the potential of the EAVI EMG board for NIME applications by using it to control parameters of a bespoke software synthesizer. We will invite the public to try the board, and we welcome discussions of potential applications in music, dance, the building of new biomusical instruments, and related themes.
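As a sketch of the kind of mapping the demo involves, the snippet below sends a smoothed EMG envelope value to a synthesizer parameter over OSC using the python-osc library. The OSC address, port, and cutoff range are illustrative placeholders, not the actual setup of our bespoke synthesizer.

```python
# Hedged sketch: mapping a normalised EMG envelope value to a synth
# filter cutoff over OSC. Address, port, and ranges are placeholders.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 9000)  # assumed synth OSC port

def envelope_to_cutoff(env_value: float, env_max: float = 1.0) -> float:
    """Scale a clipped, normalised envelope value to a cutoff in Hz."""
    norm = max(0.0, min(env_value / env_max, 1.0))
    return 80.0 + norm * (8000.0 - 80.0)  # linear map to 80 Hz - 8 kHz

# Called once per analysis frame with the latest envelope value:
client.send_message("/synth/cutoff", envelope_to_cutoff(0.42))
```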
This technology represents an alternative solution for researchers and musicians interested in implementing EMG technology in their work. Because this board supports flexible placement of electrodes on muscles other than the forearm, it opens up a new range of possibilities for performers (e.g. dancers) when compared to previous EMG implementations.
Source code and more technical details about the board will be released through an open repository, thus facilitating the customisation and implementation of the board in different contexts by the community.
2. TECHNICAL AND SPACE REQUIREMENTS
Table
Multi-plug (3 plugs min)
Computer monitor (optional)
3. BIOGRAPHIES
3.1 Balandino Di Donato
Balandino is a Research Assistant at Goldsmiths, University of London, where he is currently involved in the ERC-PoC-funded BioMusic project. BioMusic focuses on the creation of biomusical instruments using EMG-based technologies. His undergraduate work focused on the realisation of the Tangible User Interface (TUI) ‘Metis’ and on contemporary music composition. He worked at Centro Ricerche Musicali di Roma (CRM) as an artistic and research assistant, and as a sound engineer for national and international musical productions. In 2013, he was involved in the development of Integra Live at the Royal Birmingham Conservatoire, where he later completed his PhD with a thesis on the embodied control of audiovisual feedback during musical performance using IMU and EMG technology.
3.2 Atau Tanaka
Atau Tanaka conducts research in embodied musical interaction. This work takes place at the intersection of Human-Computer Interaction and gestural computer music performance. He studies our encounters with sound, be they in music or in the everyday, as a form of phenomenological experience. This includes the use of physiological sensing technologies, notably muscle tension in the electromyogram signal, and machine learning analysis of this complex, organic data. He is Professor of Media Computing in the Embodied Audiovisual Interaction unit (EAVI) at Goldsmiths.
3.3 Michael Zbyszyński
Michael Zbyszyński is a lecturer at Goldsmiths, University of London, where he is co-leader of the Electronic Music, Computing and Technology program. His research involves the application of interactive machine learning to musical instrument design and performance. As a musician, his work spans from brass bands to symphony orchestras, including composition and improvisation with woodwinds and electronics. He has been a software developer at Avid, SoundHound, Cycling ’74, and Keith McMillen Instruments, and was Assistant Director of Pedagogy at UC Berkeley’s Center for New Music and Audio Technologies (CNMAT). He holds a PhD from UC Berkeley and studied at the Academy of Music in Kraków on a Fulbright Grant. His work has been included in Make Magazine, the Rhizome Artbase, and on the ARTSHIP recording label.
3.4 Martin Klang
Martin Klang is a software developer, electronics designer, start-up entrepreneur and improvising musician. Having studied at the University of Gothenburg, Chalmers University of Technology and Université Paris-Sorbonne, he spent 10 years as a software engineer and systems architect before setting up his own consultancy. His work for large and small clients has specialised in developing state-of-the-art compilers and designing bespoke programming languages. Martin has always been committed to Free and Open Source software, and is a founding member of the London Music Hackspace. He now runs Rebel Technology, a London-based manufacturer of innovative and unique music electronics.
4. ACKNOWLEDGEMENT
We acknowledge our funding body: H2020-EU.1.1. - EXCELLENT SCIENCE - European Research Council (ERC) - ERC-2017-Proof of Concept (PoC) - Project name: BioMusic - Project ID: 789825. We acknowledge the work of Geert Roks, a student at HKU University of the Arts Utrecht, on the electrode casing prototype.
5. REFERENCES
[1] C. Benson, B. Manaris, S. Stoudenmier, and T. Ward. SoundMorpheus: a myoelectric-sensor based interface for sound spatialization and shaping. In Proceedings of the International Conference on New Interfaces for Musical Expression, pages 332–337, Brisbane, Australia, 2016. Queensland Conservatorium Griffith University.
[2] B. Caramiaux. Myo-maxpd. Available from:
https://github.com/bcaramiaux/Myo-maxpd, 2016.
Accessed: 13 February 2015.
[3] B. Di Donato, J. Bullock, and A. Tanaka. Myo Mapper: a Myo armband to OSC mapper. In L. Dahl, D. Bowman, and T. Martin, editors, Proceedings of the International Conference on New Interfaces for Musical Expression, pages 138–143, Blacksburg, Virginia, USA, June 2018. Virginia Tech.
[4] M. Donnarumma, B. Caramiaux, and A. Tanaka. Muscular interactions: combining EMG and MMG sensing for musical practice. In Proceedings of the International Conference on New Interfaces for Musical Expression, pages 128–131, Daejeon, Republic of Korea, May 2013. Graduate School of Culture Technology, KAIST.
[5] A. R. Jensenius, V. G. Sanchez, A. Zelechowska, and K. A. V. Bjerkestrand. Exploring the Myo controller for sonic microinteraction. In Proceedings of the International Conference on New Interfaces for Musical Expression, pages 442–445, Copenhagen, Denmark, 2017. Aalborg University Copenhagen.
[6] F. Jules. myo-for-max. Available from:
https://github.com/JulesFrancoise/myo-for-max,
2016. Accessed: 1 February 2017.
[7] S. Kamkar. MyoOSC. Available from:
https://github.com/samyk/myo-osc, 2015. Accessed: 13
February 2015.
[8] C. P. Martin, A. R. Jensenius, and J. Torresen. Composing an ensemble standstill work for Myo and Bela. In L. Dahl, D. Bowman, and T. Martin, editors, Proceedings of the International Conference on New Interfaces for Musical Expression, pages 196–197, Blacksburg, Virginia, USA, June 2018. Virginia Tech.
[9] K. Nymoen, M. R. Haugen, and A. R. Jensenius. MuMYO – evaluating and exploring the Myo armband for musical interaction. In E. Berdahl and J. Allison, editors, Proceedings of the International Conference on New Interfaces for Musical Expression, pages 215–218, Baton Rouge, Louisiana, USA, May 2015. Louisiana State University.
[10] A. Tanaka and R. B. Knapp. Multimodal Interaction in
Music Using the Electromyogram and Relative Position
Sensing. In Proceedings of the 2002 Conference on New
Interfaces for Musical Expression, NIME ’02, pages 1–6,
Dublin, Ireland, 2002.