Poster

Myo Mapper: a Myo armband to OSC mapper

Abstract

A 'quick and easy' solution for exploring the potential of the Thalmic Labs Myo for realising new interfaces for musical expression.
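As the title suggests, Myo Mapper forwards the armband's sensor data as OSC messages, so any OSC-aware environment can receive it. Below is a minimal receiver sketch in Python using the python-osc library; the port (8000) and the address patterns (/myo/emg, /myo/orientation) are assumptions for illustration, since the actual address space depends on how Myo Mapper is configured.

    # Minimal OSC receiver sketch (python-osc); the port and address
    # patterns are assumed, not Myo Mapper's documented defaults.
    from pythonosc.dispatcher import Dispatcher
    from pythonosc.osc_server import BlockingOSCUDPServer

    def on_emg(address, *args):
        # One value per electrode (8 channels on the Myo).
        print(address, args)

    def on_orientation(address, *args):
        # Euler angles, e.g. (roll, pitch, yaw).
        print(address, args)

    dispatcher = Dispatcher()
    dispatcher.map("/myo/emg", on_emg)                  # assumed address
    dispatcher.map("/myo/orientation", on_orientation)  # assumed address

    server = BlockingOSCUDPServer(("127.0.0.1", 8000), dispatcher)
    server.serve_forever()  # blocks; stop with Ctrl+C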
... Biometrics were acquired in Max 8 and the MAV (as discussed in Section 2.2) was then calculated for each EMG reading of the eight Myo electrodes (16 in total over both Myos worn on the left and right arms). We also used MyoMapper to capture Euler orientation data [11]. In total, we captured the following data fields during acquisition for both Myos, stored in respective Myo CSV files: index number, time (ms), EMGs 1-8 without MAV calculation, EMGs 1-8 with MAV calculation, Euler orientation (Roll, Pitch, Yaw), recorded audio signal information (detected MIDI note, audio frequency (Hz), frequency confidence, audio file duration (ms)), recorded video information (frame number), global time information (hours, minutes, seconds), and musical information (bar number, beat number, set metronome value of exercise, and expected MIDI notes of exercise), comprising 33 column fields for each CSV file. ...
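The MAV (mean absolute value) feature mentioned in this excerpt is a standard EMG amplitude estimate: the average of the rectified signal over a short window, MAV = (1/N) * sum(|x_i|). A minimal sliding-window sketch for eight-channel EMG is given below; the window length is an assumption, as the excerpt does not state the one used.

    import numpy as np

    def mav(emg, window=50):
        # Sliding-window mean absolute value per EMG channel.
        # emg: array of shape (n_samples, n_channels); window is an
        # assumed length, since the excerpt does not specify it.
        rectified = np.abs(emg)
        kernel = np.ones(window) / window
        # Moving average of the rectified signal, per channel
        # (centred window, same output length as the input).
        return np.apply_along_axis(
            lambda ch: np.convolve(ch, kernel, mode="same"), 0, rectified
        )

    # Example: 1000 samples of simulated 8-channel EMG.
    features = mav(np.random.randn(1000, 8))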
Preprint
Full-text available
Within the last few years, wearable sensor technologies have allowed us to access novel biometrics that make it possible to connect musical gesture to computing systems. Doing so allows us to study how we perform musically and to understand the process at the data level. However, biometric information is complex and cannot be directly mapped to digital systems. In this work, we study how guitar performance techniques can be captured and analysed towards developing an AI that can provide real-time feedback to guitar students. We do this by performing musical exercises on the guitar whilst acquiring and processing biometric (plus audiovisual) information during performance. Our results show notable differences in the biometrics when a guitar scale is played in two different ways (legato and staccato), an outcome that motivates our intention to build an AI guitar tutor.
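Given CSV files with the column layout described in the excerpt above, a first-pass comparison of the two playing styles could be as simple as comparing the mean MAV per electrode. The sketch below assumes hypothetical file names and column labels (emg_mav_1 ... emg_mav_8), since the paper's actual headers are not given here.

    import pandas as pd

    # File names and column labels are hypothetical placeholders.
    mav_cols = [f"emg_mav_{i}" for i in range(1, 9)]
    legato = pd.read_csv("scale_legato.csv")
    staccato = pd.read_csv("scale_staccato.csv")

    # Mean MAV per electrode for each playing style, and the gap
    # between them; notable differences here would echo the
    # legato/staccato result reported above.
    print(legato[mav_cols].mean())
    print(staccato[mav_cols].mean())
    print(staccato[mav_cols].mean() - legato[mav_cols].mean())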
Article
Full-text available
Gesture-based musical performance with on-body sensing represents a particular case of wearable connection. Gloves and hand-sensing interfaces connected to real-time digital sound production and transformation processes enable empty-handed, expressive musical performance styles. In this article, the origins and developments of this practice, as well as a specific use case, are investigated. By taking the technical, cognitive, and cultural dimensions of this media performance as a foundation, a reflection on the value, limitations, and opportunities of computational approaches to movement translation and analysis is carried out. The insights uncover how the multilayered, complex artistic situations produced by these performances are rich in intersections and represent potent amplifiers for investigating corporeal presence and affective engagement. This allows us to identify problems and opportunities of existing research approaches, as well as core issues to be solved in the domains of movement, music, interaction technology, and performance research.
Conference Paper
Full-text available
Transference is a hybrid computational system for improvised violin performance. With hardware sensors and digital signal processing (DSP), the system shapes live acoustic input and computer-generated sound. An electromyographic (EMG) sensor unobtrusively monitors movements of the left hand, while a custom glove controller tracks bowing gestures of the right arm. Through continuous musical gesture, the performer is able to actuate and perturb streams of computationally transmuted audio. No additional layers of windowing or semantically-inflected processes of machine learning mediate this process. Because the system remains at the level of signal processing, the lack of windowed and/or statistical mediation creates a sense of fine-grained tactility and physical transduction for the performer. The strategies employed are sufficiently generalizable to apply to situations beyond those imagined and implemented here within the scope of augmented violin performance.
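To illustrate what a purely signal-level (non-ML) EMG mapping can look like, the sketch below rectifies the EMG and smooths it with a one-pole low-pass filter, producing a continuous envelope that can drive a DSP parameter directly. This is a generic technique offered as an illustration, not the Transference system itself; the smoothing coefficient is an assumption.

    import numpy as np

    def emg_envelope(emg, alpha=0.05):
        # Rectify and smooth with a one-pole low-pass filter:
        # y[n] = y[n-1] + alpha * (|x[n]| - y[n-1]).
        # Pure sample-by-sample DSP: no windowing, no statistics.
        # alpha is an assumed smoothing coefficient in (0, 1].
        env = np.empty(len(emg))
        level = 0.0
        for i, x in enumerate(emg):
            level += alpha * (abs(x) - level)
            env[i] = level
        return env

    # The envelope can scale computer-generated audio directly,
    # e.g. as a continuous gain control.
    gain = emg_envelope(np.random.randn(2000))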
Conference Paper
Full-text available
This paper describes a technique for multimodal, multichannel control of electronic musical devices using two control methodologies: the electromyogram (EMG) and relative position sensing. Requirements for applying multimodal interaction theory in the musical domain are discussed. We introduce the concept of bidirectional complementarity to characterize the relationship between the component sensing technologies. Each control can be used independently, but together they are mutually complementary; this reveals a fundamental difference from orthogonal systems. The creation of a concert piece based on this system is given as an example.
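As a loose illustration of the two-modality control idea (not the authors' implementation), the sketch below maps EMG amplitude and relative position to separate synthesis parameters, each usable on its own, plus a cross-term that is only meaningful when both act together; all parameter names and ranges are assumptions.

    def control_mapping(emg_amp, position, base_freq=220.0):
        # Toy multimodal mapping, illustrative only.
        # emg_amp:  normalised EMG amplitude in [0, 1]
        # position: normalised relative position in [0, 1]
        gain = emg_amp                       # EMG alone: loudness
        freq = base_freq * (1.0 + position)  # position alone: pitch
        vibrato = emg_amp * position         # complementary cross-term
        return gain, freq, vibrato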