James Dooley
Birmingham City University | BCU · Royal Birmingham Conservatoire

Doctor of Philosophy

About

12
Publications
2,083
Reads
23
Citations

Publications

Publications (12)
Article
The goal of our research is to provide harpists with the tools to control and transform the sounds of their instrument in a natural and musical way. We consider the development of music with live electronics, with particular reference to the harp repertoire, and include interviews with six harpists who use technology in their professional performa...
Data
Poster presentation for the Reach project at the NIME 2019 conference in Porto Alegre, Brazil.
Conference Paper
This paper presents Reach, a keyboard-based gesture recognition system for live piano sound modulation. Reach is a system built using the Leap Motion Orion SDK, Pure Data and a custom C++ OSC mapper. It provides control over the sound modulation of an acoustic piano using the pianist’s ancillary gestures. The system was developed using an iterativ...
Conference Paper
Full-text available
Taking inspiration from research into deliberately constrained musical technologies and the emergence of neurodiverse, child-led musical groups such as the Artism Ensemble, the interplay between design constraints, inclusivity and appropriation is explored. A small-scale review covers systems from two prominent UK-based companies, and two iteration...
Conference Paper
Full-text available
Article
This article explores and proposes new ways of performing in a technology-mediated environment. We present a case study that examines feedback loop relationships between a dancer and a pianist. Rather than using data from sensor technologies to directly control and affect musical parameters, we captured data from a dancer’s arm movements and mapped...
Conference Paper
Full-text available
We present MyoSpat, an interactive system that enables performers to control sound and light projections through hand-gestures. MyoSpat is designed and developed using the Myo armband as an input device and Pure Data (Pd) as an audiovisual engine. The system is built upon human-computer interaction (HCI) principles; specifically, tangible computing...
Conference Paper
Full-text available
MyoSpat is an interactive audiovisual system that aims to augment musical performances by empowering musicians and allowing them to directly manipulate sound and light through hand gestures. We present the second iteration of the system that draws from the research findings to emerge from an evaluation of the first system [1]. MyoSpat 2 is designed...
Article
Full-text available
Teaching live electronic music techniques to instrumental performers presents some interesting challenges. Whilst most higher music education institutions provide opportunities for composers to explore computer-based techniques for live audio processing, it is rare for performers to receive any formal training in live electronic music as part of th...

Network

Cited By

Projects

Projects (2)
Archived project
PhD exploring how ancillary gestures can be successfully used by pianists and keyboard players to intuitively manipulate audio processing parameters at a first encounter.
Project
An interactive live performance for harp and sound spatialisation, which explores human-computer interaction in live musical performance.