Frédéric Bevilacqua

Institut de Recherche et Coordination Acoustique/Musique | IRCAM · Sound Music Movement Interaction

Ph.D.

About

228 Publications · 45,073 Reads · 7,224 Citations
Introduction
Frédéric Bevilacqua is the head of the Sound Music Movement Interaction team at IRCAM in Paris (part of the joint research lab Science & Technology for Music and Sound – IRCAM – CNRS – Sorbonne Université). His research concerns the development of gesture-based interactive systems, in particular the modeling and design of interaction between movement and sound. Applications range from the performing arts (music, dance), human-computer interaction, and gaming to rehabilitation and wellbeing.

Publications (228)
Article
Full-text available
The effects of music on bodily movement and feelings, such as when people are dancing or engaged in physical activity, are well-documented: people may move in response to sound cues, or feel powerful and less tired. How do sounds and bodily movements relate to create such effects? Here we deconstruct the problem and investigate how different auditory fea...
Article
With the increasing interest in movement sonification and expressive gesture-based interaction, it is important to understand which factors contribute to movement learning and how. We explore the effects of movement sonification and users’ musical background on motor variability in complex gesture learning. We contribute an empirical study in which...
Article
In this article, we discuss some of our research with Local Area Networks (LAN) in the context of sound installations or musical performances. Our systems, built on top of Web technologies, enable novel possibilities of collective and collaborative interaction, in particular by simplifying public access to the artwork by presenting the work through...
Chapter
Does being part of an orchestra from an early age have an impact on cognitive and emotional capacities? Researchers from the University of Geneva (Unige), Switzerland, the University of Genoa (Unige-IT), and the Institute of Research and Coordination Acoustic/Music (IRCAM) in Paris investigated this question in the context of the Demos orchestras o...
Preprint
Full-text available
The effects of music on bodily movement and feelings, such as when people are dancing or engaged in physical activity, are well-documented: people may move in response to sound cues, or feel powerful and less tired. How do sounds and bodily movements relate to create such effects? Here we deconstruct the problem and investigate how different auditory f...
Conference Paper
Negative body perceptions are a major predictor of physical inactivity, a serious health concern. Sensory feedback can be used to alter such body perceptions; movement sonification, in particular, has been suggested to affect body perception and levels of physical activity (PA) in inactive people. We investigated how metaphorical sounds impact body p...
Article
Software tools for generating digital sound often present users with high-dimensional, parametric interfaces that may not facilitate exploration of diverse sound designs. In this article, we propose to investigate artificial agents using deep reinforcement learning to explore parameter spaces in partnership with users for sound design. We describe...
Article
Full-text available
Machine learning approaches have seen a considerable number of applications in human movement modeling but remain limited for motor learning. Motor learning requires that motor variability be taken into account and poses new challenges because the algorithms need to be able to differentiate between new movements and variation in known ones. In this...
Article
Full-text available
Most studies on the regulation of speed and trajectory during ellipse drawing have used visual feedback. We used online auditory feedback (sonification) to induce implicit movement changes independently from vision. The sound was produced by filtering a pink noise with a band-pass filter proportional to movement speed. The first experiment was perf...
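The sonification scheme this abstract describes (pink noise shaped by a band-pass filter that follows movement speed) can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the frequency range, the linear speed-to-frequency mapping, and the Gaussian filter shape are all assumptions.

```python
import numpy as np

def pink_noise(n, rng):
    """Approximate 1/f ("pink") noise by shaping white noise in the frequency domain."""
    spectrum = np.fft.rfft(rng.standard_normal(n))
    freqs = np.fft.rfftfreq(n)
    freqs[0] = freqs[1]               # avoid division by zero at DC
    spectrum /= np.sqrt(freqs)        # 1/f power spectrum -> 1/sqrt(f) amplitude
    signal = np.fft.irfft(spectrum, n)
    return signal / np.max(np.abs(signal))

def bandpass_center(speed, f_lo=200.0, f_hi=4000.0, v_max=1.0):
    """Map movement speed linearly onto a band-pass centre frequency in Hz
    (assumed mapping; the paper only states the filter is proportional to speed)."""
    s = np.clip(speed / v_max, 0.0, 1.0)
    return f_lo + s * (f_hi - f_lo)

def sonify(speeds, sr=16000, frame=512, seed=0):
    """Render one audio frame per speed sample: pink noise through a
    speed-dependent, Gaussian-shaped band-pass filter."""
    rng = np.random.default_rng(seed)
    freqs = np.fft.rfftfreq(frame, 1.0 / sr)
    out = []
    for v in speeds:
        fc = bandpass_center(v)
        # band-pass around fc, bandwidth proportional to fc
        mask = np.exp(-0.5 * ((freqs - fc) / (0.25 * fc)) ** 2)
        spec = np.fft.rfft(pink_noise(frame, rng)) * mask
        out.append(np.fft.irfft(spec, frame))
    return np.concatenate(out)
```

Faster hand movement thus shifts the audible band upward, which is one plausible way such online auditory feedback could convey speed without any visual channel.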
Preprint
The use of machine learning to model motor learning mechanisms is still limited, although it could help in designing novel interactive systems for movement learning or rehabilitation. This approach requires accounting for the motor variability induced by motor learning mechanisms, which poses specific challenges concerning fast adaptability of the co...
Conference Paper
Full-text available
This PhD project is multidisciplinary, at the intersection of Education, Design, and Human-Computer Interaction. In collaboration with early childhood stakeholders, the PhD aims to develop new approaches for early childhood education focusing on the role of the body in the learning process. We leverage digital tangible and interactive technolog...
Conference Paper
We report on a system that enables the synchronization of electroacoustic sounds with conductor gestures. This system was specifically used in the context of the music composition 'IDEA', for instrumental ensemble and live electronics. After a series of experiments and testing, the system was used in two public performances with two different condu...
Preprint
Software tools for generating digital sound often present users with high-dimensional, parametric interfaces that may not facilitate exploration of diverse sound designs. In this paper, we propose to investigate artificial agents using deep reinforcement learning to explore parameter spaces in partnership with users for sound design. We describe a...
Conference Paper
Full-text available
This PhD project is multidisciplinary, at the intersection of Early Childhood Education, Design, and Human-Computer Interaction. The PhD aims to develop new approaches for early childhood education focusing on the role of the body in the learning process. We leverage digital tangible and interactive technologies to develop specific scenarios tha...
Article
Full-text available
In this paper, we present a complete framework, both technical and conceptual, aimed at developing and analysing Networked Music Systems. After a short description of our technical framework called soundworks, a JavaScript library especially designed for collective music interaction using the web browsers of mobile phones, we introduce a new conceptual...
Article
Full-text available
Adults readily make associations between stimuli perceived consecutively through different sense modalities, such as shapes and sounds. Researchers have only recently begun to investigate such correspondences in infants but only a handful of studies have focused on infants less than a year old. Are infants able to make cross-sensory correspondences...
Article
Embodied sound design is a process of sound creation that involves the designer’s vocal apparatus and gestures. The possibilities of vocal sketching were investigated by means of an art installation. An artist–designer interpreted several vocal self-portraits and rendered the corresponding synthetic sketches by using physics-based and concatenative...
Article
Introduction/Background: Recent studies have shown that auditory feedback, sound, and music can improve upper-limb motor recovery after stroke or traumatic brain injury. However, the specific influence of different sound features and musical parameters has never been explored in this context. This study designed and tested different patterns of movement-...
Conference Paper
Full-text available
People, through their bodily actions, engage in sensorimotor loops that connect them to the world and to their own bodies. People's brains integrate the incoming sensory information to form mental representations of their body appearance and capabilities. Technology provides exceptional opportunities to tweak sensorimotor loops and provide people w...
Conference Paper
Full-text available
We describe an interactive system that allows for sonifying arm movements. The aim is to support stroke patients going through rehabilitation by providing them with augmented auditory feedback that reacts to their movements. The system is based on IMU sensors (Inertial Measurements Unit) attached to each arm. The movement data are streamed in real-...
Article
Full-text available
Technologies for sensing movement are expanding toward everyday use in virtual reality, gaming, and artistic practices. In this context, there is a need for methodologies to help designers and users create meaningful movement experiences. This article discusses a user-centered approach for the design of interactive auditory feedback using interacti...
Article
Full-text available
Motor skill acquisition inherently depends on the way one practices the motor task. The amount of motor task variability during practice has been shown to foster transfer of the learned skill to other similar motor tasks. In addition, variability in a learning schedule, in which a task and its variations are interweaved during practice, has been sh...
Conference Paper
Full-text available
In the last decade, interdisciplinary research projects focused on the documentation and study of choreography and dance including Transmedia Knowledge Base (TKB) and Motion Bank used annotation as a major part of their approach to both analyses of time-based media recordings and presentation of research results. Researchers from both projects con...
Article
Full-text available
Communicating an auditory experience with words is a difficult task and, in consequence, people often rely on imitative non-verbal vocalizations and gestures. This work explored the combination of such vocalizations and gestures to communicate auditory sensations and representations elicited by non-vocal everyday sounds. Whereas our previous studie...
Data
Analysis grid, Annotations, and Co-occurrence matrices. (PDF)
Chapter
Full-text available
Our body can be seen as an anchor that tightly connects us with the surrounding physical world. Concepts of body-centred interaction have been successfully applied in virtual and augmented reality, however, mainly in the visual domain. Still, we interact with the environment through our body, and these interactions almost always produce sounds. The...
Conference Paper
Full-text available
Expert musicians' performances embed a timing variability pattern that can be used to recognize individual performance. However, it is not clear if such a property of performance variability is a consequence of learning or an intrinsic characteristic of human performance. In addition, little evidence exists about the role of timing and motion in re...
Conference Paper
Full-text available
Human movement has historically been approached as a functional component of interaction within human computer interaction. Yet movement is not only functional, it is also highly expressive. In our research, we explore how movement expertise as articulated in Laban Movement Analysis (LMA) can contribute to the design of computational models of move...
Article
Full-text available
As eye movements are mostly automatic and overtly generated to attain visual goals, individuals have a poor metacognitive knowledge of their own eye movements. We present an exploratory study on the effects of real-time continuous auditory feedback generated by eye movements. We considered both a tracking task and a production task where smooth pur...
Chapter
We present in this paper a complete gestural interface built to support music pedagogy. The development of this prototype concerned both hardware and software components: a small wireless sensor interface including accelerometers and gyroscopes, and an analysis system enabling gesture following and recognition. A first set of experiments was conduc...
Article
Full-text available
The use of continuous auditory feedback for motor control and learning is still understudied and deserves more attention regarding fundamental mechanisms and applications. This paper presents the results of three experiments studying the contribution of task-, error-, and user-related sonification to visuo-manual tracking and assessing its benefits...
Conference Paper
Designers are used to producing a variety of physical and digital representations at different stages of the design process. These intermediary objects (IOs) support the externalization of ideas and mediation with the different stakeholders. In the same manner, sound designers deliver several intermediate sounds to their clients, through itera...
Article
This article presents a set of interactive audio applications that have been realised between 2009 and 2012. The applications have been created to foster the engagement of users in playing sound and music—alone or collectively. After introducing the underlying concepts and technical framework, the article briefly describes each scenario and discuss...
Article
Full-text available
This article reports on an interdisciplinary research project on movement sonification for sensori-motor learning. First, we describe different research fields which have contributed to movement sonification, from music technology including gesture-controlled sound synthesis, sonic interaction design, to research on sensori-motor learning with audi...
Conference Paper
Full-text available
This paper presents a unified computer-vision framework for finger gesture early recognition and interaction that can be applied to sequences of either RGB or depth images without any supervised skeleton extraction. Either RGB or time-of-flight cameras can be used to capture finger motions. The hand detection is based on a skin color model...
Conference Paper
Full-text available
We discuss the notion of movement coarticulation, which has been studied in several fields such as motor control, music performance and animation. In gesture recognition, movement coarticulation is generally viewed as a transition between "gestures" that can be problematic. We propose here to account for movement coarticulation as an informative el...
Conference Paper
Full-text available
We introduce SoundGuides, a user-adaptable tool for auditory feedback on movement. The system is based on an interactive machine learning approach, where both gestures and sounds are first conjointly designed and conjointly learned by the system. The system can then automatically adapt the auditory feedback to any new user, taking into account the p...
Conference Paper
Full-text available
We present GaussBox, a design support tool for prototyping movement interaction using machine learning. In particular, we propose novel, interactive visualizations that expose the behavior and internal values of machine learning models rather than their sole results. Such visualizations have both pedagogical and creative potentials to guide users i...
Conference Paper
Full-text available
This paper discusses novel visualizations that expose the behavior and internal values of machine learning models rather than their sole results. Interactive visualizations have the potential to shift the perception of machine learning models from black-box processes to transparent artifacts that can be experienced and crafted. We discuss how they...
Conference Paper
Full-text available
Machine learning is one of the most important and successful techniques in contemporary computer science. It involves the statistical inference of models (such as classifiers) from data. It is often conceived in a very impersonal way, with algorithms working autonomously on passively collected data. However, this viewpoint hides considerable human...
Article
The notion of “movement qualities” is central in contemporary dance; it describes the manner in which a movement is executed. Movement qualities convey information revealing movement expressiveness; their use has strong potential for movement-based interaction with applications in arts, entertainment, education, or rehabilitation. The purpose of ou...
Article
Full-text available
Communicating about sounds is a difficult task without a technical language, and naïve speakers often rely on different kinds of non-linguistic vocalizations and body gestures (Lemaitre et al. 2014). Previous work has independently studied how effectively people describe sounds with gestures or vocalizations (Caramiaux, 2014, Lemaitre and Rocchesso...
Conference Paper
Full-text available
Movement sequences are essential to dance and expressive movement practice; yet, they remain underexplored in movement and computing research, where the focus on short gestures prevails. We propose a method for movement sequence analysis based on motion trajectory synthesis with Hidden Markov Models. The method uses Hidden Markov Regression for joi...
Article
Full-text available
In a Human-Computer Interaction context, we aim to elaborate an adaptive and generic interaction model in two different use cases: Embodied Conversational Agents and Creative Musical Agents for musical improvisation. To reach this goal, we'll try to use the concepts of adaptation and synchronization to enhance the interactive abilities of our agent...
Conference Paper
Full-text available
This prospective study concerning the perception of audio virtual surfaces (AVSs) was inspired by two different research fields: sensory substitution, and haptic and touch perception. We define Audio Virtual Surfaces as regions of space that trigger sounds when the user touches or moves into them. First, we describe an example of interactive setup...
Conference Paper
Full-text available
In this study, we investigated the ability of blindfolded adults to discriminate between concave and convex auditory virtual surfaces. We used a Leap Motion™ device to measure the movements of the hand and fingers. Participants were asked to explore the space above the device with the palm of one hand and an auditory feedback was produced only wh...
Conference Paper
Full-text available
This prospective study concerning the perception of audio virtual surfaces (AVSs) was inspired by two different research fields: sensory substitution and haptic and touch perception. We define an Audio Virtual Surface as a region of space that triggers sounds when the user touches it or moves into it. First, we describe an example of interactive se...
Article
Full-text available
At SIGGRAPH 2014, the Emerging Technologies venue presented installations stemming from several fields, including displays, input devices, collaborative environments, robotics, haptics, and simulators. Of the 26 displayed installations at the conference (Vancouver, Canada, Aug. 10--14, 2014), we have selected the following four that highlight today...
Article
Full-text available
The audio-feedback resulting from object interaction provides information about the material of the surface and about one's own motor behavior. With the current developments in interactive sonification, it is now possible to digitally change this audio-feedback, and thus, the use of interactive sonification becomes a compelling approach to shape ta...
Article
We present the Collective Sound Checks, an exploration of user scenarios based on mobile web applications featuring motion-controlled sound that enable groups of people to engage in spontaneous collaborative sound and music performances. These new forms of musical expression strongly shift the focus of design from human-computer interactions toward...