Mixed Reality MIDI Keyboard
Demonstration
John Desnoyers-Stewart
Faculty of Media, Art, and Performance,
University of Regina
desnoyej@uregina.ca
David Gerhard
Department of Computer Science,
University of Regina
Megan Smith
Faculty of Media, Art, and Performance,
University of Regina
ABSTRACT
The Mixed Reality MIDI Keyboard is a prototype designed to augment virtual reality experiences through the inclusion of a physical interface which aligns the user's senses with the virtual environment. It also serves as a platform for experimenting with the uses of virtual reality in music interaction and art installations. The central challenge is synchronizing the real and virtual environments convincingly enough that the user feels more connected to the experience. To accomplish this, a system of devices comprising an HTC Vive, a Leap Motion hand tracker, and a MIDI keyboard is used to produce a convincing mixed reality instrument that aligns with the user's visual, tactile, and proprioceptive senses. The system is being developed both as a mixed reality musical instrument for use with common digital audio workstations, and as an installation piece which allows users to explore the nature of perception that this virtual reality system itself takes advantage of.
CCS CONCEPTS
• Human-centered computing → Mixed / augmented reality; Keyboards; • Applied computing → Fine arts; Sound and music computing.
AM ’17, August 23–26, 2017, London, United Kingdom
©2017 Copyright held by the owner/author(s).
This is the author's version of the work. It is posted here for your personal use. Not for redistribution. The definitive Version of Record was published in Proceedings of AM '17, August 23–26, 2017, https://doi.org/10.1145/3123514.3123560.
KEYWORDS
Virtual Reality; Mixed Reality; MIDI; Interface; HTC Vive; Unity; Leap Motion; Arduino
ACM Reference Format:
John Desnoyers-Stewart, David Gerhard, and Megan Smith. 2017. Mixed Reality MIDI Keyboard Demonstration.
In Proceedings of AM ’17, London, United Kingdom, August 23–26, 2017, 5 pages.
https://doi.org/10.1145/3123514.3123560
INTRODUCTION
The Mixed Reality MIDI Keyboard is a technical demo, musical instrument, installation artwork, and
public forum for the development of mixed reality interfaces. It is being developed primarily as a novel
musical interface for use with Digital Audio Workstations (DAWs). It will also serve as the focal point
of an interactive installation which experiments with Mixed Reality (MR) interfaces and explores the
potential of Virtual Reality (VR) as an artistic medium. Hansen suggests that human perception is an
embodied experience, one which cannot be reduced to any one sense, but exists through the complex
interaction of all of the senses [3]. As identified by Hoffman, the keyboard has been developed to align the user's senses with their virtual experience by including real objects which connect the real and the virtual [4]. Sra and Schmandt also state that including touch produces a closer connection to the virtual environment and enhances presence [5, 6].
Figure 1: Interaction Model of the Mixed Reality MIDI Keyboard System.
DESCRIPTION
The Mixed Reality MIDI Keyboard interface consists of an integrated virtual reality system with
numerous inputs and outputs (see Fig. 1). It uses a custom-designed MIDI keyboard, an HTC Vive VR system, a Leap Motion hand tracker, and a VR-capable computer. All components communicate via
USB with the Unity game engine which integrates the data and handles the simulation. To generate
audio, the MIDI signal is first sent to a DAW such as Bitwig Studio or Ableton Live, and is then sent
to the Unity engine using loopMIDI and MidiJack [2, 7]. This allows the real keyboard interface
to control the DAW like any other MIDI controller, but restricts the virtual interfaces to the virtual
environment. Future versions will either reverse this flow, or integrate the program into a Virtual
Studio Technology (VST) plug-in to enable virtual interfaces to control the DAW.
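As a minimal sketch of the Unity-side half of this routing, the following C# component polls MidiJack for the notes arriving over the loopMIDI port and dips the corresponding virtual keys. The key-transform array, note range, and press angle are illustrative assumptions rather than values taken from the actual implementation.

    using UnityEngine;
    using MidiJack;

    // Minimal sketch: dip each virtual key when the corresponding MIDI note
    // arrives from the DAW through the loopMIDI virtual port.
    // keyTransforms, lowestNote, and pressAngle are illustrative assumptions.
    public class VirtualKeybed : MonoBehaviour
    {
        public Transform[] keyTransforms;  // one transform per key; index 0 = lowest key
        public int lowestNote = 36;        // MIDI note number of the first key (assumed)
        public float pressAngle = 5f;      // rotation in degrees while a key is held

        void Update()
        {
            for (int i = 0; i < keyTransforms.Length; i++)
            {
                // MidiMaster.GetKey returns the note's current velocity (0..1),
                // or 0 when the key has been released.
                float velocity = MidiMaster.GetKey(lowestNote + i);
                float angle = velocity > 0f ? pressAngle : 0f;
                keyTransforms[i].localRotation = Quaternion.Euler(angle, 0f, 0f);
            }
        }
    }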
The project centers around the custom-built, open-source MIDI keyboard (https://github.com/jdesnoyers/Mixed-Reality-MIDI-Keyboard) which provides the
physical interface to connect the user to the virtual experience. The keyboard, shown in Fig. 2, is
made of a wood and acrylic frame and features keys coated with black paint to make them less
visible to the Leap Motion’s infrared cameras, enhancing contrast and improving accuracy (see Fig. 3).
In addition to the 61-key keybed, the keyboard includes various inputs such as potentiometers, IR distance sensors, buttons, and capacitive sensors along the top surface where synthesizer controls
Figure 2: Mixed Reality MIDI Keyboard Prototype.
would typically be found; however, much of the keyboard surface remains empty, leaving room for
numerous virtual interfaces.
Figure 3: Screenshot showing infrared
data from the Leap Motion controller.
The primary focus of this research is the synchronization of the real and virtual environments. To
accomplish this, the system uses a combination of MIDI data, location tracking, and hand tracking. The
MIDI data allows for the virtual instrument’s key-presses and controls to match the real instrument.
For example, a slide potentiometer’s position is synchronized using the MIDI data sent via the
corresponding control change command. Location tracking is accomplished using the Leap Motion controller together with four infrared LEDs strategically placed along the surface of the keyboard; a script available on the Leap Motion forums captures the location of the LEDs to position the keyboard in the virtual scene [1]. Alternatively, tracking can be switched to an HTC Vive Tracker for better spatial alignment, but at the cost of reduced accuracy in the hands' alignment due to the discrepancy between the two
tracking systems. The use of hand tracking to locate and display the user’s hands in virtual space
allows for visual feedback which aligns the experience with the user’s proprioceptive senses.
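A corresponding sketch of the control-change synchronization described above: a virtual fader cap mirrors the physical slide potentiometer by polling MidiJack for the last received CC value. The CC number and fader travel here are assumed for illustration only.

    using UnityEngine;
    using MidiJack;

    // Sketch of control-change synchronization: a virtual fader cap follows
    // the physical slide potentiometer by polling the last received CC value.
    // ccNumber and travel are assumed for illustration.
    public class VirtualFader : MonoBehaviour
    {
        public int ccNumber = 1;      // control change number mapped to the slider (assumed)
        public float travel = 0.06f;  // fader travel in metres (assumed)

        void Update()
        {
            // MidiMaster.GetKnob returns the last CC value normalized to 0..1.
            float value = MidiMaster.GetKnob(ccNumber, 0f);
            Vector3 p = transform.localPosition;
            p.z = value * travel;     // slide the cap along its local track
            transform.localPosition = p;
        }
    }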
Figure 4: Screenshot of the keyboard as seen in the VR HMD.
In virtual space the keyboard is displayed with alternate textures and augmented with a number of
virtual interfaces from virtual potentiometers and capacitive sensors to novel 3-dimensional interfaces
(see Fig. 4). The output of these controls will be used to control settings in a DAW via MIDI or a VST plug-in in the complete instrumentalized version. In the installation version they will control a simulation of various layers of the physical perception of light, allowing the viewer to move through various levels of "abstraction", from a quantum particle view at one end up through each level of perception.
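As one hypothetical example of such a 3-dimensional virtual interface, the component below derives a dial value from the position of a tracked fingertip supplied by the hand-tracking rig; every name and threshold in it is an assumption for illustration, and in the instrumentalized version the resulting value would be forwarded to the DAW.

    using UnityEngine;

    // Hypothetical 3-D dial: its value follows the angle of a tracked fingertip
    // around the dial's local up axis. The fingertip transform is assumed to be
    // assigned from the hand-tracking rig; all names and thresholds here are
    // illustrative assumptions.
    public class VirtualDial : MonoBehaviour
    {
        public Transform fingertip;       // driven by the hand-tracking rig (assumed)
        public float grabRadius = 0.04f;  // response radius in metres (assumed)

        public float Value { get; private set; }  // normalized 0..1 output

        void Update()
        {
            if (fingertip == null) return;

            // Work in the dial's local space so the mapping is pose-independent.
            Vector3 local = transform.InverseTransformPoint(fingertip.position);
            if (local.magnitude > grabRadius) return;  // fingertip is not on the dial

            // Map the fingertip's angle around the up axis (-135..135 degrees) to 0..1.
            float angle = Mathf.Atan2(local.x, local.z) * Mathf.Rad2Deg;
            Value = Mathf.InverseLerp(-135f, 135f, Mathf.Clamp(angle, -135f, 135f));
        }
    }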
AUDIENCE
This demonstration is a preliminary iteration of a new musical interface which will enable creative expression in a virtual environment. In this form, the primary audience is musicians, composers, artists, and creators. As part of a public art installation, the work will also serve as an opportunity to expose the general public to this emergent technology while allowing them to influence its future development.
To promote further technical development, participants are encouraged to complete a survey as they leave the space, and may opt to be interviewed in order to provide additional data about their experience. This will allow the work to be distilled into useful information which can inform the development of further VR experiences.
APPLICATION
This work directly applies to music interaction design through its novel use of a conventional, physical
musical interface to connect users with the rapidly developing world of Virtual Reality. This device
and the methods used in its development can be directly applied to the creation of new mixed reality
musical instruments, as well as creative interfaces that immerse VR users by linking them more closely to their real environments. This project also serves as an example of ways in which visual arts,
technology, and music can be combined to develop new methods of creative expression.
CONCLUSION
This demonstration of a Mixed Reality MIDI Keyboard is a display of new technology and an artistic
exploration into the capacity of this new medium. It enables improved immersion and encourages
interaction and participation. Furthermore, by exposing this developing technology to the public and providing a venue for discussion, it will allow for a collaborative shaping of this emerging field.
ACKNOWLEDGMENTS
Much of the research involved in this project has been made possible by the ongoing RCMP VR and
Simulation Research led by Dr. Megan Smith.
REFERENCES
[1] 2015. Retroreflective Marker Tracking Script for Unity. https://community.leapmotion.com/t/retroreflective-marker-tracking-script-for-unity/1596. (2015).
[2] T. Erichsen. 2015. loopMIDI. https://www.tobias-erichsen.de/software/loopmidi.html. (2015).
[3] Mark B. N. Hansen. 2004. New Philosophy for New Media. The MIT Press.
[4] H. G. Hoffman. 1998. Physically touching virtual objects using tactile augmentation enhances the realism of virtual environments. In Proceedings. IEEE 1998 Virtual Reality Annual International Symposium (Cat. No.98CB36180). 59–63.
[5] M. Slater, A. Steed, and M. Usoh. 2013. Being there together: Experiments on presence in virtual environments (1990s). Technical Report, Department of Computer Science, University College London, UK.
[6] M. Sra and C. Schmandt. 2016. Bringing real objects, spaces, actions, and interactions into social VR. In 2016 IEEE Third VR International Workshop on Collaborative Virtual Environments (3DCVE). 16–17.
[7] K. Takahashi. 2015. MidiJack on GitHub. https://github.com/keijiro/MidiJack. (2015).