Recent developments in Augmented Reality (AR) technology are opening up new modes of representation of, and interaction with, virtual objects; at the same time, increases in the processing power of portable devices are enabling the wide diffusion of applications that until recently were usable only in very specific settings (such as motion-capture labs). This study describes an AR environment created for musical performance: LINEAR (Live-generated Interface and Notation Environment in Augmented Reality), in which the author explores some of the possibilities offered by the current state of AR technology applied to music. In LINEAR, a dedicated performer using an AR iPhone app can create virtual objects (rendered in real time and superimposed on the real environment) according to the movement of the device; these objects serve both as virtual interfaces for electronics (sending OSC messages to Max/MSP on a computer) and as forms of live-generated graphic notation. LINEAR allows, with some limitations, the representation of gestural movements with exact 3-D placement in space: it becomes possible to produce an analog notation of gestures, rather than a symbolic one. For the iPhone performer, the act of notation corresponds to the notated act. The resulting representations can also be approached as animated graphic notation by other performers (the iPhone screen is mirrored to a projector). The multiple perspectives on the notation and the possibilities of interaction with virtual bodies allow a high level of flexibility, while introducing almost unprecedented resources and opening up a very rich scenario.
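The abstract mentions that the virtual objects send OSC messages to Max/MSP. As a minimal illustration of what such a message looks like on the wire, the sketch below hand-encodes an OSC 1.0 message with the Python standard library only; the address pattern `/linear/pos` and the three float arguments (an object's x, y, z position) are hypothetical, since the paper's actual address scheme is not given here.

```python
import struct

def osc_pad(data: bytes) -> bytes:
    """Null-terminate and pad a byte string to a multiple of 4 bytes (OSC 1.0 rule)."""
    data += b"\x00"
    return data + b"\x00" * ((4 - len(data) % 4) % 4)

def osc_message(address: str, *args: float) -> bytes:
    """Encode an OSC message whose arguments are all float32 values.

    Layout per the OSC 1.0 spec: padded address string, padded type-tag
    string (',' followed by one 'f' per float), then big-endian float32s.
    """
    packet = osc_pad(address.encode("ascii"))
    packet += osc_pad(("," + "f" * len(args)).encode("ascii"))
    for value in args:
        packet += struct.pack(">f", value)  # big-endian IEEE-754 float32
    return packet

# Hypothetical example: a virtual object reporting its position.
pkt = osc_message("/linear/pos", 0.5, 1.0, -0.25)
# Such a packet would be sent over UDP to Max/MSP (e.g. a [udpreceive] object).
```

In a live setting the resulting bytes would be sent as a UDP datagram (e.g. via `socket.sendto`) to the port on which Max/MSP listens.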
Figures uploaded by Giovanni Santini.