Alexis Zerroug

The University of Tokyo, Tōkyō, Japan

Publications (9)

  • ABSTRACT: "scoreLight" and "scoreBots" are two experimental platforms for performative sound design and manipulation. Both are essentially synesthetic interfaces - synesthetic musical instruments - capable of translating free-hand drawings into a sonic language of beats and pitches, all in real time. While scoreLight uses a modified "smart" laser scanner to track the figure's relevant features (in particular its contours), scoreBots rely on one or more tiny line-follower robots to do the same. We present here some of our latest experiments in an informal way.
    05/2012;
  • ABSTRACT: The Light Arrays project explores the extension of the body through an array of visible light beams, projecting onto the environment a dynamic representation of the body, its movement and posture. Interestingly, these light cues are visible both to the user wearing the device and to others. The result is an experiential bridge between what we see and what we feel or know about the dynamic, moving body. The Light Arrays afford augmented proprioception, generated through the artificial visual feedback system; enhanced body interaction, prompted by the interactively augmented body image (in time and space); as well as a clear visual representation of interpersonal and inter-structural / architectural space.
    01/2012;
  • Source
    ABSTRACT: We study here a wearable device capable of translating bio-sensed data into cartoon-like graphics projected onto the physical surroundings. We hypothesize that such an 'expressive laser aura' (LA) may serve for biofeedback purposes; more interestingly, as the display extends past the wearer's intimate/personal space, it can complement non-verbal communication by giving others an instant cue about that person's real inner state (or unformulated needs), at social or even at public range. In this preliminary work, we demonstrate a (non-wearable) LA that changes its shape as a function of the level of anxiety of an office worker. In the future we plan to make the device wearable, as well as give it the ability to represent more complex states such as emotions.
    01/2011;
  • Source
    ABSTRACT: scoreLight is a playful musical instrument capable of generating sound from the lines of drawings as well as from the edges of nearby three-dimensional objects (including everyday objects, sculptures and architectural details, but also the performer's hands or even the moving silhouettes of dancers). There is no camera nor projector: a laser spot explores shapes as a pick-up head would search for sound over the surface of a vinyl record - with the significant difference that the groove is generated by the contours of the drawing itself.
    Keywords: H.5.2 [User Interfaces] interaction styles / H.5.5 [Sound and Music Computing] methodologies and techniques / J.5 [Arts and Humanities] performing arts
    06/2010;
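    The contour-as-groove metaphor can be sketched in code. The following is our own illustrative Python, not the authors' implementation: a virtual spot walks along a closed polyline and the local tangent angle of each stroke segment is mapped to a pitch, so the drawing's shape becomes a sequence of frequencies.

    ```python
    import math

    # Hypothetical scoreLight-style mapping (illustrative only): a "laser
    # spot" walks along a closed contour, and the tangent angle of each
    # segment of the stroke is turned into a pitch.

    def tangent_angles(contour):
        """Angle (radians) of each segment of a closed polyline."""
        angles = []
        n = len(contour)
        for i in range(n):
            (x0, y0), (x1, y1) = contour[i], contour[(i + 1) % n]
            angles.append(math.atan2(y1 - y0, x1 - x0))
        return angles

    def angle_to_pitch(angle, base_hz=220.0, span_octaves=2.0):
        """Map an angle in [-pi, pi] onto a continuous pitch range."""
        t = (angle + math.pi) / (2.0 * math.pi)   # normalize to [0, 1]
        return base_hz * 2.0 ** (t * span_octaves)

    # A square "drawing": four corners traced counter-clockwise.
    square = [(0, 0), (1, 0), (1, 1), (0, 1)]
    melody = [angle_to_pitch(a) for a in tangent_angles(square)]
    ```

    The linear angle-to-frequency ramp is an assumption for the sketch; the real instrument also derives rhythm from how the spot traverses the groove.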
  • Source
    ABSTRACT: The 'Smart Laser Projector' (SLP) is a modified laser-based projector capable of displaying while simultaneously using the laser beam (at the same or different wavelength or polarization) as a LIDAR probe gathering information about the projection surface (its borders, 3D shape, relative position and orientation, as well as fine texture and spectral reflectance). This information can then be used to correct perspective warp, perform per-pixel contrast compensation, or even reroute the scanning/projecting path altogether (for tracking, feature discovery or barcode reading for instance). We demonstrate here raster-scan and vector graphics applications on two different prototypes. The first relies on a pair of galvanomirrors, and is used for demonstrating simultaneous tracking and display on the palm of the hand, depth-discriminating active contours (for spatially augmented reality surveying), and interactive games. The other relies on a single 2-axis MEMS mirror working in resonant mode, and is used to demonstrate edge enhancement of printed material and 'artificial fluorescence' - all with perfect projection-to-real-world registration by construction.
    International Conference on Computer Graphics and Interactive Techniques, SIGGRAPH 2010, Los Angeles, California, USA, July 26-30, 2010, Emerging Technologies Proceedings; 01/2010
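    One of the corrections mentioned, perspective-warp compensation, reduces to pre-mapping scan coordinates through a planar homography once the surface pose has been recovered by the probe. A minimal Python sketch of that mapping step (our own assumed formulation, not the SLP's actual code):

    ```python
    # Pre-warp a projector coordinate through a 3x3 homography so the
    # drawing lands undistorted on a tilted surface. H is row-major.

    def apply_homography(H, x, y):
        """Map (x, y) through homography H via homogeneous coordinates."""
        u = H[0][0] * x + H[0][1] * y + H[0][2]
        v = H[1][0] * x + H[1][1] * y + H[1][2]
        w = H[2][0] * x + H[2][1] * y + H[2][2]
        return (u / w, v / w)

    # Identity homography leaves the scan path untouched.
    I = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
    corrected = apply_homography(I, 0.5, 0.25)
    ```

    In practice the homography would be estimated from the surface's borders and 3D pose as sensed by the laser probe; here it is left as an input.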
  • Source
    ABSTRACT: We introduce here our latest 'smart laser projector' prototype, i.e., a modified laser-based projector capable of augmenting all kinds of surfaces while simultaneously using the laser as a scanning beam for gathering information about that surface's shape, texture, reflectivity, relative motion, etc. 'Augmenting' surfaces (including tables, desktops, walls and floors, but also human skin, paintings, market products on a shelf, etc.) means here alphanumeric or iconic annotation, highlighting features invisible to the naked eye (for instance veins under the skin, small scratches or oily spots on surfaces), cueing (using flashes of light for marking secure perimeters or indicating dangerous obstacles), and line and contour enhancement for practical or aesthetic purposes. This device fits in many ways the definition of a 'smart projector' as described in [1], but presents a number of advantages with respect to the classical projector/camera configuration. [Figure 1: Combined green and red lasers trapped on a drawing.] We demonstrate the prototype through a couple of interactive applications. The first was previously demonstrated on an earlier (bulkier) system [2]. The application is playful in nature and consists of generating a laser spot that seems alive as it keeps running on contours of drawings or bouncing on flat figures as well as over the edges of three-dimensional objects (Fig. 1). The second application is a simulation of the refraction of light beams on a 2D surface, obtained by computing Snell's law at the interfaces between gray-level regions of a flat drawing, the gray level corresponding to a virtual refraction index (Fig. 5).
    01/2010;
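    The refraction application rests on Snell's law, n1 sin(θ1) = n2 sin(θ2), evaluated wherever the beam crosses a gray-level interface. A hedged Python sketch (function names and the linear gray-to-index mapping are our own assumptions):

    ```python
    import math

    # 2D refraction at the interface between two gray-level regions, where
    # gray level stands in for a virtual refraction index.

    def refract_2d(direction, normal, n1, n2):
        """Refract a unit direction vector at an interface whose unit
        normal points back into medium 1, via Snell's law
        n1*sin(theta1) = n2*sin(theta2). Returns None on total internal
        reflection."""
        dx, dy = direction
        nx, ny = normal
        cos_i = -(dx * nx + dy * ny)          # cosine of incidence angle
        ratio = n1 / n2
        k = 1.0 - ratio * ratio * (1.0 - cos_i * cos_i)
        if k < 0.0:                            # total internal reflection
            return None
        coef = ratio * cos_i - math.sqrt(k)
        return (ratio * dx + coef * nx, ratio * dy + coef * ny)

    def gray_to_index(gray, n_min=1.0, n_max=1.5):
        """Map a gray level in [0, 1] to a virtual refraction index."""
        return n_min + gray * (n_max - n_min)

    # At normal incidence the beam passes straight through, whatever the
    # indices on either side.
    straight = refract_2d((0.0, 1.0), (0.0, -1.0),
                          gray_to_index(0.0), gray_to_index(1.0))
    ```

    Total internal reflection (the `None` branch) is where a simulated beam would bounce inside a dark region rather than cross its boundary.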
  • Source
    ABSTRACT: We present here a first prototype of the Virtual Haptic Radar (VHR), a wearable device helping actors become aware of the presence of invisible virtual objects in their path when moving about a virtual studio (such as a "bluescreen" filming stage [Figure 1]). The VHR is a natural extension of the Haptic Radar (HR) and its principle [Cassinelli et al. 2006] into the realm of virtual reality: while each module of the HR had a small vibrator and a rangefinder to measure the distance to real obstacles, the VHR module lacks the rangefinder but instead accommodates a (cheap) ultrasound-based indoor positioning system that gives it the ability to know exactly where it is situated relative to an external frame of reference.
    01/2009;
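    The substitution described (a positioning system in place of the rangefinder) still ends in the same haptic mapping: the distance from the module to the nearest virtual obstacle drives the vibration strength. A small illustrative sketch in Python, with the names and the linear ramp being our own assumptions:

    ```python
    import math

    # Illustrative VHR-style mapping: a module that knows its own position
    # in the studio frame vibrates more strongly as a virtual obstacle
    # gets closer, standing in for the Haptic Radar's rangefinder.

    def vibration_level(module_pos, obstacles, max_range=1.0):
        """Return vibration intensity in [0, 1]: 0 when every virtual
        obstacle is beyond max_range, 1 on contact with the nearest."""
        if not obstacles:
            return 0.0
        nearest = min(math.dist(module_pos, o) for o in obstacles)
        return max(0.0, 1.0 - nearest / max_range)

    # A module on the actor's shoulder; one invisible virtual wall point
    # sits 0.25 m away, well inside the 1 m "radar" range.
    level = vibration_level((0.0, 0.0, 1.5), [(0.25, 0.0, 1.5)],
                            max_range=1.0)
    ```

    The original Haptic Radar modules computed this directly from a rangefinder reading; here the distance comes from the module's own tracked position against a list of known virtual obstacles.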
  • Source
    ABSTRACT: We present the development vision of a range of interactive body-worn lighting systems for performance, play, rehabilitation and dis- or altered-ability support. The systems combine experimental and off-the-shelf technologies to arrive at outcomes that require and inspire extended physical and expressive engagement, and afford a range of different learning opportunities. We discuss the context and background, and our aims and approach, mixing art, design and engineering methodologies. We then outline a number of scenarios of use and their relevance to ArtAbilitation. Our aim is to open up a dialogue with the ArtAbilitation community at this early stage, to generate collaborative interest and inform development.
  • Source
    ABSTRACT: With this paper we wish to promote a discussion about the different forms that an immersive VR system can take. This will be done by reflecting on a controversial concept: that of a totally immersive (understood as multimodal) but partial (understood as limited to a part of the body) virtual reality interface. The proposed concept of total/partial immersiveness may be seen as a new orthogonal dimension in the taxonomic classification of systems in the 'virtuality continuum' introduced in [2]. An interesting aspect of the proposed configuration is the possibility for it to be wearable. We will briefly describe the motivation for this new taxonomic dimension from a theoretical point of view, as well as explain the practical reasons that led us to this concept. This will be done by discussing earlier work from one of the authors that illustrates the possibilities of a totally immersive VR system but also pinpoints a number of inescapable limitations.