Mixed Reality MIDI Keyboard Demonstration
John Desnoyers-Stewart
Faculty of Media, Art, and Performance,
University of Regina
desnoyej@uregina.ca
David Gerhard
Department of Computer Science,
University of Regina
Megan Smith
Faculty of Media, Art, and Performance,
University of Regina
ABSTRACT
The Mixed Reality MIDI Keyboard is a prototype designed to augment virtual reality experiences through the inclusion of a physical interface which aligns the user's senses with the virtual environment. It also serves as a platform for experimenting with the uses of virtual reality in music interaction and art installations. The main problem is that of synchronizing the real and virtual environments in a convincing way that makes the user feel more connected to the experience. To accomplish this, a system of devices including an HTC Vive, a Leap Motion hand tracker, and a MIDI keyboard is used to produce a convincing mixed reality instrument that aligns with the user's visual, tactile, and proprioceptive senses. The system is being developed both as a mixed reality musical instrument for use with common digital audio workstations and as an installation piece which allows users to explore the nature of perception that this virtual reality system itself takes advantage of.
CCS CONCEPTS
• Human-centered computing → Mixed / augmented reality; Keyboards; • Applied computing → Fine arts; Sound and music computing;
AM ’17, August 23–26, 2017, London, United Kingdom
©2017 Copyright held by the owner/author(s).
This is the author's version of the work. It is posted here for your personal use. Not for redistribution. The definitive Version of Record was published in Proceedings of AM '17, August 23–26, 2017, https://doi.org/10.1145/3123514.3123560.
KEYWORDS
Virtual Reality; Mixed Reality; MIDI; Interface; HTC Vive; Unity; Leap Motion; Arduino
ACM Reference Format:
John Desnoyers-Stewart, David Gerhard, and Megan Smith. 2017. Mixed Reality MIDI Keyboard Demonstration.
In Proceedings of AM ’17, London, United Kingdom, August 23–26, 2017, 5 pages.
hps://doi.org/10.1145/3123514.3123560
INTRODUCTION
The Mixed Reality MIDI Keyboard is a technical demo, musical instrument, installation artwork, and
public forum for the development of mixed reality interfaces. It is being developed primarily as a novel
musical interface for use with Digital Audio Workstations (DAWs). It will also serve as the focal point
of an interactive installation which experiments with Mixed Reality (MR) interfaces and explores the
potential of Virtual Reality (VR) as an artistic medium. Hansen suggests that human perception is an embodied experience, one which cannot be reduced to any one sense, but exists through the complex interaction of all of the senses [3]. As identified by Hoffman, the keyboard has been developed to align the user's senses with their virtual experience by including real objects which connect the real and virtual [4]. Sra and Schmandt also state that including touch produces a closer connection to the virtual environment and enhances presence [5, 6].
Figure 1: Interaction Model of the Mixed Reality MIDI Keyboard System. [Diagram: the User exchanges head position and visuals with the HTC Vive Headset (USB/HDMI), physical input and haptic feedback with the MIDI Keyboard (USB/MIDI), and hand position with the Leap Motion Hand Tracker (USB); an HTC Vive Tracker reports the keyboard position (USB/BT); all devices connect to the Unity Game Engine, which communicates with the Digital Audio Workstation.]
DESCRIPTION
The Mixed Reality MIDI Keyboard interface consists of an integrated virtual reality system with
numerous inputs and outputs (see Fig. 1). It uses a custom-designed MIDI keyboard, an HTC Vive VR system, a Leap Motion hand tracker, and a VR-capable computer. All components communicate via
USB with the Unity game engine which integrates the data and handles the simulation. To generate
audio, the MIDI signal is first sent to a DAW such as Bitwig Studio or Ableton Live, and is then sent
to the Unity engine using loopMIDI and MidiJack [2, 7]. This allows the real keyboard interface
to control the DAW like any other MIDI controller, but restricts the virtual interfaces to the virtual
environment. Future versions will either reverse this flow, or integrate the program into a Virtual
Studio Technology (VST) plug-in to enable virtual interfaces to control the DAW.
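To illustrate this routing, a minimal sketch in Unity C# follows, assuming MidiJack's polling API; the note number, travel depth, and component structure are illustrative and not the project's actual implementation.

using UnityEngine;
using MidiJack;

// Minimal sketch (not the project's actual code): a virtual key that
// follows its physical counterpart using the MIDI notes routed from
// the DAW into Unity through loopMIDI and MidiJack.
public class VirtualKey : MonoBehaviour
{
    public int noteNumber = 60;        // hypothetical: MIDI note mapped to this key
    public float pressDepth = 0.005f;  // hypothetical: key travel in metres

    Vector3 restPosition;

    void Start()
    {
        restPosition = transform.localPosition;
    }

    void Update()
    {
        // MidiMaster.GetKey returns the note's current velocity (0 when
        // released), so the virtual key depth tracks the physical key.
        float velocity = MidiMaster.GetKey(noteNumber);
        transform.localPosition =
            restPosition + Vector3.down * (pressDepth * Mathf.Clamp01(velocity));
    }
}

Attaching one such component per key would be enough for the virtual keybed to mirror the physical one, since MidiJack tracks the state of all incoming notes.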
The project centers around the custom-built, open-source MIDI keyboard (https://github.com/jdesnoyers/Mixed-Reality-MIDI-Keyboard) which provides the physical interface to connect the user to the virtual experience. The keyboard, shown in Fig. 2, is
made of a wood and acrylic frame and features keys coated with black paint to make them less
visible to the Leap Motion’s infrared cameras, enhancing contrast and improving accuracy (see Fig. 3).
In addition to the 61-key keybed, the keyboard includes various inputs such as potentiometers, IR distance sensors, buttons, and capacitive sensors along the top surface where synthesizer controls
Figure 2: Mixed Reality MIDI Keyboard Prototype.
would typically be found; however, much of the keyboard surface remains empty, leaving room for
numerous virtual interfaces.
Figure 3: Screenshot showing infrared
data from the Leap Motion controller.
The primary focus of this research is the synchronization of the real and virtual environments. To
accomplish this, the system uses a combination of MIDI data, location tracking, and hand tracking. The
MIDI data allows for the virtual instrument’s key-presses and controls to match the real instrument.
For example, a slide potentiometer’s position is synchronized using the MIDI data sent via the
corresponding control change command. Location tracking is accomplished using the Leap Motion controller along with four infrared LEDs strategically placed on the keyboard's surface; a script available on the Leap Motion forums captures the locations of the LEDs to locate the keyboard [1]. Alternatively, tracking can be switched to an HTC Vive Tracker for better spatial alignment, but at the cost of reduced accuracy of the hands' alignment due to the discrepancy between the two tracking systems. The use of hand tracking to locate and display the user's hands in virtual space
allows for visual feedback which aligns the experience with the user’s proprioceptive senses.
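The control change synchronization described above can be sketched in the same way. The following hypothetical component reads a control change value through MidiJack and drives a virtual slider cap; the CC number and travel distance are illustrative assumptions.

using UnityEngine;
using MidiJack;

// Minimal sketch (illustrative, not the project's actual code): a virtual
// slider that mirrors a physical slide potentiometer by reading the
// control change messages the keyboard sends for that control.
public class VirtualSlider : MonoBehaviour
{
    public int ccNumber = 1;      // hypothetical: CC number assigned to the slider
    public float travel = 0.06f;  // hypothetical: slider travel in metres

    Vector3 basePosition;

    void Start()
    {
        basePosition = transform.localPosition;
    }

    void Update()
    {
        // GetKnob returns the most recent value for the controller,
        // normalized to 0..1, so the virtual cap matches the real one.
        float value = MidiMaster.GetKnob(ccNumber, 0f);
        transform.localPosition = basePosition + Vector3.forward * (travel * value);
    }
}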
Figure 4: Screenshot of the keyboard as seen in the VR HMD.
In virtual space the keyboard is displayed with alternate textures and augmented with a number of
virtual interfaces from virtual potentiometers and capacitive sensors to novel 3-dimensional interfaces
(see Fig. 4). The output of these controls will be used to control settings in a DAW via MIDI or a VST plug-in in the complete instrument version. In the installation version they will control a simulation of various layers of the physical perception of light, allowing the viewer to move through successive levels of "abstraction", from a quantum particle view at one end through each subsequent level of perception.
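As one way such a 3-dimensional interface might work, the sketch below maps the height of a tracked fingertip inside a control volume to a normalized value; the fingertip transform is assumed to be supplied by the hand-tracking rig, and every name and parameter here is hypothetical.

using UnityEngine;

// Minimal sketch (hypothetical): a 3-dimensional control that maps the
// height of a tracked fingertip within a control volume to a 0..1 value,
// which could then drive a DAW setting or the installation's simulation.
public class VolumeControl3D : MonoBehaviour
{
    public Transform fingertip;   // assumed: provided by the hand-tracking rig
    public float height = 0.15f;  // hypothetical: active height in local units

    // Latest normalized value: 0 at the base of the volume, 1 at the top.
    public float Value { get; private set; }

    void Update()
    {
        // Work in the control's local space (unit scale assumed) so the
        // volume can sit anywhere on the keyboard's empty top surface.
        Vector3 local = transform.InverseTransformPoint(fingertip.position);

        // Respond only while the fingertip is inside the unit footprint.
        if (Mathf.Abs(local.x) < 0.5f && Mathf.Abs(local.z) < 0.5f)
            Value = Mathf.Clamp01(local.y / height);
    }
}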
AUDIENCE
This demonstration is a preliminary iteration of a new musical interface which will enable creative
expression in a virtual environment. In this form, the primary audience is musicians, composers, artists, and creators. As part of a public art installation it will also serve as an opportunity to expose the general public to this emergent technology while allowing them to influence its future development. To promote further technical development, participants are encouraged to complete a survey as they leave the space, and may opt to be interviewed further in order to gather data about their experience. This will allow the work to be distilled into useful information which can be used in the development of further VR experiences.
APPLICATION
This work directly applies to music interaction design through its novel use of a conventional, physical
musical interface to connect users with the rapidly developing world of Virtual Reality. This device
and the methods used in its development can be directly applied to the creation of new mixed reality
musical instruments, as well as creative interfaces that immerse VR users by linking them more closely to their real environments. This project also serves as an example of ways in which visual arts,
technology, and music can be combined to develop new methods of creative expression.
CONCLUSION
This demonstration of a Mixed Reality MIDI Keyboard is a display of new technology and an artistic
exploration into the capacity of this new medium. It enables improved immersion and encourages
interaction and participation. Furthermore, by exposing this developing technology to the public and providing a venue for discussion, it will allow for a collaborative shaping of this emerging field.
ACKNOWLEDGMENTS
Much of the research involved in this project has been made possible by the ongoing RCMP VR and
Simulation Research led by Dr. Megan Smith.
REFERENCES
[1] 2015. Retroreflective Marker Tracking Script for Unity. https://community.leapmotion.com/t/retroreflective-marker-tracking-script-for-unity/1596. (2015).
[2] T. Erichsen. 2015. loopMIDI. https://www.tobias-erichsen.de/software/loopmidi.html. (2015).
[3] Mark B. N. Hansen. 2004. New Philosophy for New Media. The MIT Press.
[4] H. G. Hoffman. 1998. Physically touching virtual objects using tactile augmentation enhances the realism of virtual environments. In Proceedings of the IEEE 1998 Virtual Reality Annual International Symposium (Cat. No.98CB36180). 59–63.
[5] M. Slater, A. Steed, and M. Usoh. 2013. Being there together: Experiments on presence in virtual environments (1990s). Technical Report, Department of Computer Science, University College London, UK.
[6] M. Sra and C. Schmandt. 2016. Bringing real objects, spaces, actions, and interactions into social VR. In 2016 IEEE Third VR International Workshop on Collaborative Virtual Environments (3DCVE). 16–17.
[7] K. Takahashi. 2015. MidiJack on GitHub. https://github.com/keijiro/MidiJack. (2015).