Graham Wakefield
York University · Department of Computational Arts

PhD Media Arts & Technology

About

62 Publications
17,412 Reads
413 Citations
Introduction
Graham Wakefield is an Associate Professor in the Department of Computational Arts at York University. Graham leads research in the Alice Lab for Computational Worldmaking, focused on computational arts, virtual and mixed reality interactivity, and software engineering.

Publications (62)
Book
Generating Sound & Organizing Time is about the astonishing things you can do—and the insights you can find—when you work at the atomic sample-by-sample structure of digital audio. Whether you are a musician, sound designer, composer, or an experimentalist interested in creating music and tools to generate and modulate audio, our aim is to reveal...
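As a rough illustration of what sample-by-sample work means in practice (a sketch only, not code from the book itself), a sine oscillator can be computed one sample at a time:

```typescript
// Minimal sketch: computing audio one sample at a time.
// A phase accumulator ramps 0..1 at the given frequency; a sine is shaped from it.
const SAMPLE_RATE = 44100;

function* sineOsc(freq: number): Generator<number> {
  let phase = 0;
  while (true) {
    yield Math.sin(2 * Math.PI * phase);
    phase += freq / SAMPLE_RATE;   // per-sample phase increment
    if (phase >= 1) phase -= 1;    // wrap to keep precision
  }
}

// Render one second of a 440 Hz tone into a buffer.
const osc = sineOsc(440);
const buffer = new Float32Array(SAMPLE_RATE);
for (let i = 0; i < buffer.length; i++) {
  buffer[i] = osc.next().value as number;
}
```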
Conference Paper
Full-text available
gestural music compositions through painted animations in VR
Article
Full-text available
Despite decades of virtual reality (VR) research, current creative workflows remain far from VR founder Jaron Lanier’s musically inspired dream of collaboratively ‘improvising reality’ from within. Drawing inspiration from modular synthesis as a distinctive musically immersed culture and practice, this article presents a new environment for visual...
Conference Paper
“Infranet” is a generative artwork interweaving data visualization and sonification, artificial intelligence, and evolutionary algorithms in a population of artificial life creatures, thriving upon geospatial data of the infrastructure of a city as its sustenance and canvas. Each exhibit of Infranet utilizes public data available on the host city;...
Conference Paper
Full-text available
This article describes a site-specific interactive mixed reality installation artwork involving a network of over a hundred motor-actuated bells, projections upon a 4x6m bed of salt, and a dual motion tracked virtual reality perspective inhabited by artificial life and integrating real-time volume capture. This work responds to very specific histor...
Chapter
Artificial Nature is a research-creation collaboration co-founded by Haru Hyunkyung Ji and Graham Wakefield in 2007. It has led to a decade of immersive installations in which the invitation is to become part of an alien ecosystem rich in feedback networks. Here we present four recent works in this series between 2017 and 2018.
Conference Paper
Full-text available
Inhabitat is a mixed-reality artwork in which participants become part of an imaginary ecology through three simultaneous perspectives of scale and agency; three distinct ways to see with other eyes. This imaginary world was exhibited at a children's science museum for five months, using an interactive projection-augmented sculpture, a large screen...
Article
Inhabitat is a mixed-reality artwork in which participants become part of an imaginary ecology through three simultaneous perspectives of scale and agency; three distinct ways to see with other eyes. This imaginary world was exhibited at a children’s science museum for five months, using an interactive projection-augmented sculpture, a large screen...
Chapter
Our research examines the use and potential of native web technologies for musical expression. We introduce two JavaScript libraries towards this end: Gibberish.js, a heavily optimized audio DSP library, and Interface.js, a GUI toolkit that works with mouse, touch and motion events. Together these libraries provide a complete system for defining mu...
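A minimal sketch of the kind of browser-native synthesis and gestural input these libraries build on, using only the standard Web Audio API rather than the Gibberish.js or Interface.js APIs themselves:

```typescript
// Sketch using only the standard Web Audio API, to show the kind of
// browser-native synthesis and pointer interaction such libraries build upon.
const ctx = new AudioContext();

function playNote(freq: number, duration = 0.3): void {
  const osc = ctx.createOscillator();
  const gain = ctx.createGain();
  osc.frequency.value = freq;
  gain.gain.setValueAtTime(0.2, ctx.currentTime);
  gain.gain.exponentialRampToValueAtTime(0.001, ctx.currentTime + duration);
  osc.connect(gain).connect(ctx.destination);
  osc.start();
  osc.stop(ctx.currentTime + duration);
}

// Map pointer position to pitch, in the spirit of a touch/mouse GUI toolkit.
window.addEventListener("pointerdown", (e: PointerEvent) => {
  if (ctx.state === "suspended") ctx.resume(); // audio must start on a user gesture
  const ratio = e.clientX / window.innerWidth; // 0..1 across the screen
  playNote(220 + ratio * 660);                 // 220..880 Hz
});
```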
Chapter
Full-text available
In live coding performance, performers create time-based works by programming them while these same works are being executed. The high cognitive load of this practice, along with differing ideas about how it should be addressed, results in a plurality of practices and a number of tensions at play. In this chapter we use a lens of five recurrent ten...
Conference Paper
Full-text available
The 3D modeling methods and approach presented in this paper attempt to bring the richness and spontaneity of human kinesthetic interaction in the physical world to the process of shaping digital form, by exploring playfully creative interaction techniques that augment gestural movement. The principal contribution of our research is a novel dynamic...
Conference Paper
Full-text available
We describe two new versions of the gibberwocky live-coding system. One integrates with Max/MSP while the second targets MIDI output and runs entirely in the browser. We discuss commonalities and differences between the three environments, and how they fit into the live-coding landscape. We also describe lessons learned while performing with the...
Conference Paper
Full-text available
The growth of the live-coding community has been coupled with a rich development of experimentation in new domain-specific languages, sometimes idiosyncratic to the interests of their performers. Nevertheless, programming language design may seem foreboding to many, steeped in computer science that is distant from the expertise of music performance....
Chapter
Full-text available
Since 2007 the authors have been pursuing a line of research-creation that utilizes installations of highly-immersive mixed reality and interactive generative art to investigate new relationships with a future that is increasingly immersed in computation, but which draws more inspiration from the complex sense of open-ended continuation found in...
Conference Paper
Full-text available
We describe a new live-coding system, named Gibberwocky, which integrates the Ableton Live digital audio workstation with a browser-based textual coding environment derived from the Gibber project. The live-coding interface emphasizes sequencing of events and parameter changes as opposed to audio synthesis, but also affords rapid construction of aud...
Article
Full-text available
Audio feedback is defined as a positive feedback of acoustic signals where an audio input and output form a loop, and may be utilized artistically. This article presents new context-based controls over audio feedback, leading to the generation of desired sonic behaviors by enriching the influence of existing acoustic information such as room respon...
Article
Since 2007, Graham Wakefield and Haru Ji have looked to nature for inspiration as they have created a series of "artificial natures," or interactive visualizations of biologically inspired complex systems that can evoke nature-like aesthetic experiences within mixed-reality art installations. This article describes how they have applied visualizati...
Conference Paper
Full-text available
This paper presents the interactive enhancement of audio feedback through context-based control, leading to the generation of desired sonic behaviors by augmenting the effects of physical space in the feedback sound. Our prototype maps approximations of room reverberation to tempo-scale characteristics of the audio feedback. These characteris...
Conference Paper
Full-text available
We document techniques and insights gained through the creation of interactive visualizations of biologically-inspired complex systems that have been exhibited as mixed-reality art installations since 2007. A binding theme is the importance of endogenous accounts: that all perceivable forms have dynamic ontological capacities within the world;...
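A hypothetical, much-simplified sketch of the general agent-population pattern such installations rely on (not the authors' system): each agent moves, spends energy, dies, or reproduces on its own endogenous terms.

```typescript
// Hypothetical sketch of a bio-inspired agent population update; it only
// illustrates the general pattern, not the system described in the paper.
interface Agent { x: number; y: number; energy: number; }

function step(agents: Agent[]): Agent[] {
  const next: Agent[] = [];
  for (const a of agents) {
    // Endogenous dynamics: each agent moves and spends energy on its own terms.
    a.x += (Math.random() - 0.5) * 0.1;
    a.y += (Math.random() - 0.5) * 0.1;
    a.energy -= 0.01;
    if (a.energy <= 0) continue;          // death removes it from the world
    if (a.energy > 1.5) {                 // surplus energy triggers reproduction
      next.push({ x: a.x, y: a.y, energy: a.energy / 2 });
      a.energy /= 2;
    }
    next.push(a);
  }
  return next;
}
```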
Conference Paper
‘Poetry of Separation’ is a media artwork that utilizes an algorithmic generative editing system, selecting shots in real-time to be rendered over four screens arranged in layers. Editing in cinema reconstructs images by montage, deriving meaning from the juxtaposition of multiple shots. Although multiscreen projections have been used to present se...
Conference Paper
Full-text available
Aiming for high-level intentional control of audio feedback, through microphones, loudspeakers and digital signal processing, we present a system adapting toward chosen sonic features. Users control the system by selecting and changing feature objectives in real-time. The system has a second-order structure in which the internal signal processing al...
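A hypothetical sketch of the general idea of feature-directed feedback control (not the authors' signal processing): the loop gain is nudged so that a measured loudness feature tracks a user-chosen target.

```typescript
// Hypothetical sketch of feature-directed feedback control: adjust the loop
// gain so the measured loudness (RMS) of the feedback tracks a target value.
function rms(block: Float32Array): number {
  let sum = 0;
  for (const s of block) sum += s * s;
  return Math.sqrt(sum / block.length);
}

let loopGain = 0.5;
const targetRms = 0.2;   // the chosen sonic feature objective (assumed)
const rate = 0.05;       // adaptation rate (assumed)

function processBlock(input: Float32Array): Float32Array {
  const out = input.map(s => s * loopGain);      // signal fed back to the speaker
  loopGain += rate * (targetRms - rms(out));     // steer loudness toward the target
  loopGain = Math.min(2, Math.max(0, loopGain)); // clamp to keep the loop tame
  return out;
}
```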
Article
Native Web technologies provide great potential for musical expression. We introduce two JavaScript libraries towards this end: Gibberish.js, providing heavily optimized audio DSP, and Interface.js, a GUI toolkit that works with mouse, touch, and motion events. Together they provide a complete system for defining musical instruments that can be use...
Conference Paper
'Lost Fragments of Night' is a poetic documentary film that utilizes an algorithmic generative editing system to preselect shots to be rendered over four screens arranged in layers. The artwork's subject is the chaotic and paradoxical sensation found by night in the city of Seoul. The authors' themes of disconnected and paradoxical images in urban p...
Conference Paper
Full-text available
In this paper, we present a new kind of wearable augmented reality (AR) 3D sculpting system called AiRSculpt in which users could directly translate their fluid finger movements in air into expressive sculptural forms and use hand gestures to navigate the interface. In AiRSculpt, as opposed to VR-based systems, users could quickly create and manipu...
Conference Paper
The instant messenger has developed as an important communication media platform. However, because of the nature of instant communication, instant messenger services place many limitations on communicating with nuance. We believe that the easy nature of digital communications tends to weaken serious aspects of personal communication such as patienc...
Conference Paper
Full-text available
We discuss live coding audio-visual worlds for large-scale virtual reality environments. We describe Alive, an instrument allowing multiple users to develop sonic and visual behaviors of agents in a virtual world, through a browser-based collaborative code interface, accessible while being immersed through spatialized audio and stereoscopic dis...
Article
This paper describes our research in full-surround, multimodal, multi-user, immersive instrument design in a large VR instrument. The three-story instrument, designed for large-scale, multimodal representation of complex and potentially high-dimensional information, specifically focuses on multi-user participation by facilitating interdisciplinary...
Article
The AlloSphere provides multiuser spatial interaction through a curved surround screen and surround sound. Two projects illustrate how researchers employed the AlloSphere to investigate the combined use of personal-device displays and the shared display. Another two projects combined multiuser interaction with multiagent systems. These projects poi...
Conference Paper
Full-text available
We discuss an evolving series of interactive artworks with respect to theoretical perspectives regarding the concept of presence.
Conference Paper
Our research examines the use and potential of native web technologies for musical expression. We introduce two JavaScript libraries towards this end: Gibberish.js, a heavily optimized audio DSP library, and Interface.js, a GUI toolkit that works with mouse, touch and motion events. Together these libraries provide a complete system for definin...
Chapter
Full-text available
Time of Doubles is an immersive, interactive art installation, and an instantiation of contemporary art research on the creation of possible worlds. It invites visitors to experience mirror existences of themselves taking upon new roles as sources of energy and kinetic disturbance within a perpetually changing virtual ecosystem. This world displays...
Thesis
In the interactive computer arts, any advance that significantly amplifies or extends the limits and capacities of software can enable genuinely novel aesthetic experiences. Within compute-intensive media arts, flexibility is often sacrificed for needs of efficiency, through the total separation of machine code optimization and run-time execution....
Conference Paper
Full-text available
Spatial music can create immersive experiences of alternate realities. To what degree can this be expanded to the composition of immersive, navigable, audio-visual worlds? We present an integrated collection of extensions to the Max/MSP/Jitter environment to assist the tightly integrated construction of such worlds, designed for use within audio-...
Conference Paper
Full-text available
We present the Device Server, a framework and application driving interaction in the AlloSphere virtual reality environment. The motivation and development of the Device Server stem from the practical concerns of managing multi-user interactivity with a variety of physical devices for disparate performance and virtual reality environments housed...
Conference Paper
Full-text available
We describe LuaAV, a runtime library and application which extends the Lua programming language to support computational composition of temporal, sound, visual, spatial and other elements. In this paper we document how we have attempted to maintain several core principles of Lua itself - extensibility, meta-mechanisms, efficiency, portability - wh...
Conference Paper
How does artificial-life art adapt to its environment? What is the significance of a computational ecosystem proposed as contemporary art? These are some of the ideas examined in this bio-inspired immersive art installation.
Article
Full-text available
This paper describes the creation of the Allobrain project, an interactive, stereographic, 3D audio, immersive virtual world constructed from fMRI brain data and installed in the Allosphere, one of the largest virtual reality spaces in existence. This paper portrays the role the Allobrain project played as an artwork driving the technological infra...
Conference Paper
Full-text available
We discuss the potential of just-in-time compilation for computer music software to evade compromises of flexibility and efficiency due to the discrepancies between the respective natures of composition and computation and also to augment exploratory and generative capacity. We present a range of examples and approaches using LLVM compiler infras...
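As a loose analogy only (the paper works with LLVM compiler infrastructure, not JavaScript), runtime code generation lets a per-sample expression written during a session be turned into an executable function on the fly:

```typescript
// Sketch of the general idea of runtime code generation for audio: turn a
// user-written per-sample expression into a callable function at run time,
// so new DSP code can be swapped in while the program keeps running.
type SampleFn = (phase: number) => number;

function compileExpression(expr: string): SampleFn {
  // e.g. expr = "Math.sin(2 * Math.PI * phase) * 0.5"
  return new Function("phase", `return (${expr});`) as SampleFn;
}

const fn = compileExpression("Math.sin(2 * Math.PI * phase) * 0.5");
const out = new Float32Array(512);
for (let i = 0; i < out.length; i++) {
  out[i] = fn((i * 440) / 44100 % 1);   // 440 Hz at 44.1 kHz
}
```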
Conference Paper
Full-text available
Artificial Nature is a trans-disciplinary multimodal interactive art installation and a research subject investigating the application of bio-inspired system theories towards engaging aesthetic world-making. Our motivation is to develop a deeper understanding of emergence and creativity as a form of art, study and play, by taking inspiration from n...
Conference Paper
Full-text available
Artificial Nature is a trans-disciplinary research project drawing upon bio-inspired system theories in the production of engaging immersive worlds as art installations. Embodied world making and immersion are identified as key components in an exploration of creative ecosystems toward art-as-it-could-be. A detailed account of the design of a succe...
Conference Paper
Full-text available
In this paper we put forward a notion of computational composition: an exploratory approach to creativity uniquely available by means of computation. The implications for an expanded overlap of the intellectual and computational are discussed and developed into a design strategy. Finally we describe progress on our implementation.
Article
In the current No Child Left Behind era, K-12 teachers and principals are expected to have a sophisticated understanding of standardized test results, use them to improve instruction, and communicate them to others. The goal of our project, funded by the National Science Foundation, was to develop and evaluate three Web-based instructional modules...
Conference Paper
Full-text available
This document describes the AlloBrain, the debut content created for presentation in the AlloSphere at the University of California, Santa Barbara, and the Cosm toolkit for the prototyping of interactive immersive environments using higher-order Ambisonics and stereographic projections. The Cosm toolkit was developed in order to support the prototy...
Chapter
Full-text available
We describe extensions to the Lua programming language constituting a novel platform to support practice and investigation in computational audiovisual composition. Significantly, these extensions enable the tight real-time integration of computation, time, sound and space, and follow a modus operandi of development going back to immanent propertie...
Conference Paper
Full-text available
The UCSB Allosphere is a 3-story-high spherical instrument in which virtual environments and performances can be experienced in full immersion. The space is now being equipped with high-resolution active stereo projectors, a 3D sound system with several hundred speakers, and with tracking and interaction mechanisms. The Allosphere is at the same t...
Conference Paper
In this paper, we present new opportunities to overcome some of the inherent limitations of a visual data-flow environment such as Max/MSP/Jitter, by using domain specific (audio and graphical) extensions of the Lua programming language as libraries (externals). Lua is flexible, extensible and efficient, making it an ideal choice for designing a pr...
Thesis
Full-text available
The rich new terrains offered by computer music invite the exploration of new techniques to compose within them. The computational nature of the medium has suggested algorithmic approaches to composition in the form of generative musical structure at the note level and above, and audio signal processing at the level of individual samples. In the re...
Conference Paper
In this paper, a new interface for programming multimedia compositions in Max/MSP/Jitter using the Lua scripting language is presented. Lua is extensible and efficient making it an ideal choice for designing a programmatic interface for multimedia compositions. First, we discuss the distinctions of graphical and textual interfaces for composition a...
Conference Paper
Full-text available
This paper describes a package of extensions (externals) for Cycling '74's Max/MSP software to facilitate the exploration of Ambisonic techniques of up to third order. Areas of exploration well suited to the Max/MSP environment and techniques for composition within the Ambisonic domain using the presented externals are described.
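For orientation, classic first-order B-format encoding of a mono signal at a given azimuth and elevation looks like the sketch below (first order only; the externals described here extend to third order):

```typescript
// Sketch of classic first-order B-format encoding (W, X, Y, Z) for a mono
// sample at a given azimuth and elevation, in radians.
function encodeBFormat(sample: number, azimuth: number, elevation: number) {
  const cosE = Math.cos(elevation);
  return {
    W: sample * Math.SQRT1_2,              // omnidirectional component, -3 dB weighting
    X: sample * Math.cos(azimuth) * cosE,  // front-back figure-of-eight
    Y: sample * Math.sin(azimuth) * cosE,  // left-right figure-of-eight
    Z: sample * Math.sin(elevation),       // up-down figure-of-eight
  };
}

// Example: encode a sample panned 90 degrees to the left at ear level.
const encoded = encodeBFormat(0.8, Math.PI / 2, 0);
```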
