January 2011
Recent developments have led to the availability of consumer devices capable of recognising certain human movements and gestures. This paper is a study of novel gesture-based audio interfaces. The authors present two prototypes for interacting with audio/visual experiences. The first allows a user to 'conduct' a recording of an orchestral performance, controlling the tempo and dynamics. The paper describes the audio and visual capture of the orchestra and the design and construction of the audio-visual playback system. An analysis of this prototype, based on testing and feedback from a number of users, is also provided. The second prototype uses the same gesture-tracking algorithm to control a three-dimensional audio panner. This audio panner is tested and feedback from a number of professional engineers is analysed.
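As a rough illustration of the kind of mapping such a conducting interface might use, the sketch below converts detected beat gestures into a tempo ratio and gesture size into a dynamic level. It is a minimal sketch under stated assumptions: the beat-detection input, reference tempo, reference gesture size, and all function names are hypothetical and do not reflect the authors' actual implementation.

```python
import numpy as np

# Hypothetical sketch: mapping conducting gestures to playback tempo and dynamics.
# Beat times are assumed to come from a gesture tracker that detects the downward
# "ictus" of each conducting beat; gesture extent is assumed to control loudness.

REFERENCE_TEMPO_BPM = 120.0   # tempo at which the orchestra was recorded (assumed)
REFERENCE_EXTENT_M = 0.40     # gesture size treated as mezzo-forte (assumed)

def tempo_ratio(beat_times, window=4):
    """Ratio of the conducted tempo to the recorded tempo,
    averaged over the last few beats to smooth jitter."""
    if len(beat_times) < 2:
        return 1.0
    intervals = np.diff(beat_times[-(window + 1):])
    conducted_bpm = 60.0 / np.mean(intervals)
    return conducted_bpm / REFERENCE_TEMPO_BPM

def dynamic_gain(gesture_extent_m):
    """Map gesture size to a playback gain; larger gestures play louder."""
    ratio = np.clip(gesture_extent_m / REFERENCE_EXTENT_M, 0.25, 2.0)
    return float(ratio)  # linear gain applied to the audio output

# Example: a conductor beating slightly faster and larger than the reference.
beats = [0.0, 0.45, 0.91, 1.36, 1.82]   # seconds at which beat gestures were detected
print(tempo_ratio(beats))               # ~1.10 -> play back roughly 10% faster
print(dynamic_gain(0.55))               # ~1.38 -> louder than the reference dynamic
```

In practice the tempo ratio would drive a time-stretching playback engine and the gain would scale the audio output; both mappings shown here are placeholders for whatever smoothing and calibration the real system performs.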