Jérôme Nika
Institut de Recherche et Coordination Acoustique/Musique | IRCAM

PhD, MSc, MEng

About

22 Publications · 9,385 Reads · 132 Citations
Introduction
Research in human-machine musical interaction and the design of generative musical agents. More information: jeromenika.com
Additional affiliations
October 2012 - June 2016
Institut de Recherche et Coordination Acoustique/Musique
Position
  • PhD Student

Publications (22)
Thesis
Full-text available
This thesis focuses on the introduction of authoring and controls in human-computer music improvisation through the use of temporal scenarios to guide or compose interactive performances, and addresses the dialectic between planning and reactivity in interactive music systems dedicated to improvisation. An interactive system dedicated to music impr...
Article
Full-text available
We present the formal model and implementation of a computer-aided composition system allowing for the "composition of musical processes". Rather than generating static data, this framework considers musical objects as dynamic structures likely to be updated and modified at any time. After formalizing a number of basic concepts, this paper describe...
Article
Full-text available
This article focuses on the introduction of control, authoring, and composition in human-computer music improvisation through the description of a guided music generation model and a reactive architecture, both implemented in the software ImproteK. This interactive music system is used with expert improvisers in work sessions and performances of id...
Conference Paper
Full-text available
The collaborative research and development project DYCI2, Creative Dynamics of Improvised Interaction, focuses on conceiving, adapting, and bringing into play efficient models of artificial listening, learning, interaction, and generation of musical contents. It aims at developing creative and autonomous digital musical agents able to take part in...
Conference Paper
Full-text available
Recent research on Automatic Chord Extraction (ACE) has focused on the improvement of models based on machine learning. However, most models still fail to take into account the prior knowledge underlying the labeling alphabets (chord labels). Furthermore, recent works have shown that ACE performances have reached a glass ceiling. Therefore, this pro...
Article
Full-text available
This paper focuses on learning the hierarchical structure of a temporal scenario (for instance, a chord progression) to perform automatic improvisation consistently upon several time scales. We first present how to represent a hierarchical structure with a phrase structure grammar. Such a grammar enables us to analyse a scenario upon several levels...
Preprint
Full-text available
Recent research on Automatic Chord Extraction (ACE) has focused on the improvement of models based on machine learning. However, most models still fail to take into account the prior knowledge underlying the labeling alphabets (chord labels). Furthermore, recent works have shown that ACE performances are converging towards a glass ceiling. There...
Preprint
This paper studies the prediction of chord progressions for jazz music by relying on machine learning models. The motivation of our study comes from the recent success of neural networks for performing automatic music composition. Although high accuracies are obtained in single-step prediction scenarios, most models fail to generate accurate multi-...
Chapter
Billie Holiday, Edith Piaf, and Elisabeth Schwarzkopf are three great musical ladies, born in 1915. Could we make them sing together? This was the goal of a performance created for a music festival in L’Aquila (Italy) in 2015. This raises musical questions: what kind of sound could link Billie Holiday to Schwarzkopf, Schwarzkopf to Piaf? What kind...
Conference Paper
Full-text available
This paper presents a method taking into account the form of a tune upon several levels of organisation to guide music generation processes to match this structure. We first show how a phrase structure grammar can represent a hierarchical analysis of chord progressions and be used to create multi-level progressions. We then explain how to exploit t...
Conference Paper
Full-text available
This article presents a synthesis of work carried out on guiding human-machine musical improvisation. This work focuses more specifically on interactive performance in a context that is "idiomatic" or composed at the scale of the temporal structure. This research proposes the introduction of temporal "scenarios" to guide or compose...
Article
Full-text available
This article reports on a study conducted in 2011-2013 with the jazz musician Bernard Lubat for the development of the improvisation software ImproteK. This software captures recorded phrases from the musician's playing in order to create new ones in an idiomatic framework (jazz standards) where improvisation is based on a regular pulse...
Conference Paper
Full-text available
This paper describes a reactive architecture handling the hybrid temporality of guided human-computer music improvisation. It aims at combining reactivity and anticipation in the music generation processes steered by a "scenario". The machine improvisation takes advantage of the temporal structure of this scenario to generate short-term anticipat...
Article
Full-text available
ABSTRACT. This article proposes a model for musical improvisation guided by a formalized structure. It exploits a priori knowledge of the improvisation context to introduce anticipation into the generation process. "Improvising" here means articulating an annotated musical memory with a "scenario" guiding the improvisation...
Conference Paper
Full-text available
Improvisation intrinsically carries a dialectic between spontaneity/reactivity and long-term planning/organization. This paper transposes this dialectic to interactive human-computer improvisation where the computer has to interleave various generative processes. They require different levels of prior knowledge, and follow a coarser improvisation p...
Article
Full-text available
Setting the body in motion through dance, the presence of a regular pulse that gives rhythm to these movements, and the practice of counteracting this pulse with musical events placed beside it, known as "contrametricity", are features common to most traditional African musics. They attest at once to a...
Conference Paper
Full-text available
We introduce ImproteK, a system integrating a rhythmic framework and an underlying harmonic structure in a context of musical improvisation. In the lineage of the improvisation software OMax, it is built on the factor oracle structure to take advantage of the particularly relevant and rich characteristics of this automaton in a musical environmen...
Conference Paper
Full-text available
The ImproteK system presented in this article integrates a rhythmic framework and an underlying harmonic structure in a context of musical improvisation. In the lineage of the improvisation software OMax, it relies on the factor oracle structure to take advantage of the particularly rich and relevant properties of this automaton in...


Projects

Project (1)
DYCI2 is a collaborative research and development project funded by the French National Research Agency (ANR). The project, Creative Dynamics of Improvised Interaction, focuses on conceiving, adapting, and bringing into play efficient models of artificial listening, learning, interaction, and generation of musical contents. It aims at developing creative and autonomous digital musical agents able to take part in various human projects in an interactive and artistically credible way, and, to this end, at contributing to the perceptive and communicational skills of embedded artificial intelligence. The areas concerned are live performance, production, pedagogy, and active listening.
More information: http://repmus.ircam.fr/dyci2/home
Videos, music performances, sound examples, interactive listening tests: http://repmus.ircam.fr/dyci2/ressources