Conference Paper

Syncopalooza: Manipulating the Syncopation in Rhythmic Performances

Authors:
  • Music Technology Group, Pompeu Fabra University, Barcelona


... Our system assists DJs and music producers to explore the creative potential of syncopation. This work is based on a previous formalized algorithm, and corresponding application Syncopalooza [12]. Syncopalooza manipulates the syncopation in symbolic data, i.e. ...
... In the analysis stage we use onset detection to identify the start times of musical events in the audio loop and hence to extract the rhythmic structure in the context of a known time signature and tempo. Next, the transformation step applies the symbolic approach to syncopation manipulation of Sioros [12] to determine how the rhythmic structure of the audio loop can be modified. Finally, reconstruction consists of implementing this transformation in the audio domain. ...
... Recently, Sioros et al. proposed an application for the manipulation of syncopation in music performances, named Syncopalooza. The core of Syncopalooza uses an algorithm that can remove or generate syncopation in rhythmic patterns by displacing the start times of musical events (onsets) with respect to a metrical template [12]. After the metrical template is created according to the meter and tempo, the onsets are aligned to the grid by snapping their time positions to the closest pulse boundary present in the metrical grid. ...
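The alignment step this excerpt describes, snapping detected onsets to the closest pulse boundary of a metrical grid built from a known tempo and meter, can be sketched in a few lines. This is an illustrative reconstruction, not code from Syncopalooza; the function names and the choice of a 16th-note grid are assumptions.

```python
def metrical_grid(bpm, beats_per_bar, subdivisions, n_bars):
    """Pulse-boundary times (in seconds) for a known tempo and meter."""
    pulse = 60.0 / bpm / subdivisions            # duration of one grid pulse
    n_pulses = beats_per_bar * subdivisions * n_bars
    return [i * pulse for i in range(n_pulses + 1)]

def snap_onsets(onsets, grid):
    """Align onsets to the grid by snapping each one to the closest
    pulse boundary, as the excerpt describes."""
    return [min(grid, key=lambda t: abs(t - onset)) for onset in onsets]

# One bar of 4/4 at 120 BPM on a 16th-note grid: a pulse every 0.125 s.
grid = metrical_grid(bpm=120, beats_per_bar=4, subdivisions=4, n_bars=1)
snapped = snap_onsets([0.03, 0.51, 1.19], grid)  # -> [0.0, 0.5, 1.25]
```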
Conference Paper
Full-text available
In this work we present a system that estimates and manipulates rhythmic structures from audio loops in real-time to perform syncopation transformations. The core of our system is a technique for the manipulation of syncopation in symbolic representations of rhythm. In order to apply this technique to audio signals we must first segment the audio loop into musical events using onset detection. Then, we use the symbolic syncopation transformation method to determine how to modify the rhythmic structure in order to change the syncopation. Finally we present two alternative methods to reconstruct the audio loop, one based on time scaling and the other on resampling. Our system, Loopalooza, is implemented as a freely available MaxForLive device to allow musicians and DJs to manipulate syncopation in audio loops in real-time.
... Recently, George Sioros and his colleagues have developed a series of algorithms and software for the real-time manipulation of syncopation in a given pattern [8]. While these rhythmic transformations are based on a deterministic model of musical meter [9] rhythmic transformations may also be approached as a stochastic process [10]. ...
Conference Paper
Full-text available
This paper presents an algorithm, and the software that implements it, for the gradual transformation of musical rhythms through graphical means, as well as the artistic installation Waiting for Response where it was first used. The transformation is based on the manipulation of the timeline of the input rhythm, which is treated as a geometric form in constant transformation. The aim of the algorithm is to explore rhythmic relations in an evolutionary manner by generating transformations based on graphical and geometric concepts, independently of the musical character of the initial rhythmic pattern. It provides, relates and generates a genealogy of rhythms based on an initial rhythm, to which they may be perceptually unrelated. Waiting for Response is an artistic installation that employs the above transformation in the generation of sonic events that enter into an acoustical "dialogue" with the materiality of the exhibition space.
... The events' duration can be decomposed into several layers, starting with the tempo, which depends on the tactus duration and the stratification of the remaining metrical layers (Sioros et al., 2013). Furthermore, inter-onset intervals (IOIs), i.e., the intervals between the onset times of sequential note events (Toussaint, 2019), provide a good indication of the structural characteristics of a rhythm, given by the temporal distribution of the events' onsets, while discarding rests (i.e., silences) and performance traits such as legato or staccato. ...
Chapter
In this paper, we review computational methods for the representation and similarity computation of musical rhythms in both symbolic and sub-symbolic (e.g., audio) domains. Both tasks are fundamental to multiple application scenarios from indexing, browsing, and retrieving music, namely navigating musical archives at scale. Stemming from the literature review, we identified three main rhythmic representations: string (sequence of alpha-numeric symbols to denote the temporal organization of events), geometric (spatio-temporal pictorial representation of events), and feature lists (transformation of audio into a temporal series of features or descriptors), and twofold categories of feature- and transformation-based distance metrics for similarity computation. Furthermore, we address the gap between explicit (symbolic) and implicit (musical audio) rhythmic representations stressing that a greater interaction across modalities would promote a holistic view of the temporal music phenomena. We conclude the article by unveiling avenues for future work on (1) hierarchical, (2) multi-attribute and (3) rhythmic layering models grounded in methodologies across disciplines, such as perception, cognition, mathematics, signal processing, and music.
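As a concrete instance of the representations and metrics surveyed in this chapter, a rhythm can be encoded as a string ('x' for onset, '.' for rest) and compared with a transformation-based metric such as Levenshtein edit distance. The encoding, the clave patterns, and the 16-pulse grid below are illustrative choices, not taken from the chapter itself.

```python
def to_string(pattern):
    """String representation: 'x' marks an onset, '.' a silent pulse."""
    return ''.join('x' if p else '.' for p in pattern)

def edit_distance(a, b):
    """Levenshtein distance, a simple transformation-based metric."""
    m, n = len(a), len(b)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i
    for j in range(n + 1):
        d[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[m][n]

# Son and rumba claves on an assumed 16-pulse grid; they differ
# in the placement of a single onset.
son = to_string([1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 1, 0, 0, 0])
rumba = to_string([1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 0, 1, 0, 0, 0])
distance = edit_distance(son, rumba)
```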
... Recently we presented an algorithm that allows for the manipulation of syncopation, i.e. a computer algorithm that is able to remove or introduce syncopation in a certain rhythmic pattern in a controlled way [20]. Here, we extend the algorithm to a set of formalized generic transformations that can analyze, generate and manipulate the syncopation in binary patterns. ...
Conference Paper
Full-text available
Syncopation is a rhythmic phenomenon present in various musical styles and cultures. We present here a set of simple rhythmic transformations that can serve as a formalized model for syncopation. The transformations are based on fundamental features of the musical meter and syncopation, as seen from a cognitive and a musical perspective. Based on this model, rhythmic patterns can be organized in tree structures where patterns are interconnected through simple transformations. A Max4Live device is presented as a creative application of the model. It manipulates the syncopation of midi “clips” by automatically de-syncopating and syncopating the midi notes.
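A minimal sketch of the de-syncopation transformation this abstract describes, assuming a binary pattern on an eighth-note grid and an illustrative metrical template (the actual device operates on MIDI notes with a template derived from the meter):

```python
# Metrical strength per pulse for one 4/4 bar on an 8th-note grid
# (illustrative values: downbeat strongest, offbeats weakest).
TEMPLATE = [3, 0, 1, 0, 2, 0, 1, 0]

def desyncopate(pattern, template):
    """One de-syncopation pass: an onset followed by a silent, metrically
    stronger pulse is moved forward onto that pulse (the pattern is
    treated as cyclic)."""
    out = list(pattern)
    n = len(out)
    for i in range(n):
        j = (i + 1) % n
        if out[i] == 1 and out[j] == 0 and template[j] > template[i]:
            out[i], out[j] = 0, 1
    return out

# Onsets anticipating beats 3 and 1 snap back onto the beat:
desyncopated = desyncopate([0, 0, 0, 1, 0, 0, 0, 1], TEMPLATE)
```

Running the pass in reverse (moving onsets from strong to preceding weak pulses) would syncopate the pattern instead.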
... As we want to eliminate all other expressive factors of a musical performance, we developed a computer algorithm that generates syncopation in monophonic sound sequences relying on changing metrical positions only, without altering other expressive or structural characteristics. The algorithm is an adaptation of the general syncopation transformations developed by Sioros et al. (2013). While in the recent Witek study (2014) syncopation was measured in pre-existing drum loops, in our study we can generate and vary it in piano melodies using an automatic algorithm over which we have complete control. ...
Article
Full-text available
In order to better understand the musical properties which elicit an increased sensation of wanting to move when listening to music (groove), we investigate the effect of adding syncopation to simple piano melodies, under the hypothesis that syncopation is correlated with groove. Across two experiments we examine listeners' experience of groove in response to synthesized musical stimuli covering a range of syncopation levels and densities of musical events, according to formal rules implemented by a computer algorithm that shifts musical events from strong to weak metrical positions. Results indicate that moderate levels of syncopation lead to significantly higher groove ratings than melodies without any syncopation or with the maximum possible syncopation. A comparison between the various transformations and the way they were rated shows that there is no simple relation between syncopation magnitude and groove.
... There is a need for machine musicianship (Rowe 2001) to assist in the construction of musically interesting and stylistically appropriate performances. To cite just a few representative studies, drumming (Sioros et al. 2013), chord voicings (Hirata 1996), bass lines (Dias and Guedes 2013), and vocal technique (Nakano and Goto 2009) have all been explored and automated to some extent. Even more difficult is the problem of adjusting styles according to other musicians. ...
Article
Full-text available
Computers are often used in performance of popular music, but most often in very restricted ways, such as keyboard synthesizers where musicians are in complete control, or pre-recorded or sequenced music where musicians follow the computer's drums or click track. An interesting and yet little-explored possibility is the computer as highly autonomous performer of popular music, capable of joining a mixed ensemble of computers and humans. Considering the skills and functional requirements of musicians leads to a number of predictions about future human–computer music performance (HCMP) systems for popular music. We describe a general architecture for such systems and describe some early implementations and our experience with them.
Conference Paper
This paper examines the computational problem of taking a classical music composition and algorithmically recomposing it in a ragtime style. Because ragtime music is distinguished from other musical genres by its distinctive syncopated rhythms, our work is based on extracting the frequencies of rhythmic patterns from a large collection of ragtime compositions. We use these frequencies in two different algorithms that alter the melodic content of classical music compositions to fit the ragtime rhythmic patterns, and then combine the modified melodies with traditional ragtime bass parts, producing new compositions which melodically and harmonically resemble the original music. We evaluate these algorithms by examining the quality of the ragtime music produced for eight excerpts of classical music alongside the output of a third algorithm run on the same excerpts; results are derived from a survey of 163 people who rated the quality of the ragtime output of the three algorithms.
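The pattern-frequency extraction step described in this abstract amounts to counting per-measure onset patterns over a corpus. A toy version follows; the binary-grid encoding and eight pulses per measure are assumptions for illustration, not the paper's representation.

```python
from collections import Counter

def pattern_frequencies(pieces, pulses_per_measure=8):
    """Count how often each per-measure onset pattern occurs in a corpus.
    Each piece is a binary list with one entry per grid pulse."""
    counts = Counter()
    for grid in pieces:
        for m in range(0, len(grid), pulses_per_measure):
            measure = tuple(grid[m:m + pulses_per_measure])
            if len(measure) == pulses_per_measure:  # skip ragged tails
                counts[measure] += 1
    return counts

corpus = [[1, 0, 1, 0, 0, 1, 0, 0] * 2]   # one toy piece, two identical measures
freqs = pattern_frequencies(corpus)
```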
Conference Paper
Full-text available
Rhythmic syncopation is one of the most fundamental features that can be used to characterize music. Therefore it can be applied in a variety of domains such as music information retrieval and style analysis. During the past twenty years a score of different formal measures of rhythmic syncopation have been proposed in the music literature. Here we compare eight of these measures with each other and with human judgements of rhythmic complexity. A data set of 35 rhythms ranked by human subjects was sorted using the eight syncopation measures. A Spearman rank correlation analysis of the rankings was carried out, and phylogenetic trees were calculated to visualize the resulting matrix of coefficients. The main finding is that the measures based on perception principles agree well with human judgements and very well with each other. The results also yield several surprises and open problems for further research.
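For orientation, one of the perception-based measures compared in this paper, that of Longuet-Higgins and Lee, scores a syncopation whenever a rest (or tied note) falls on a metrically stronger position than the preceding onset. The simplified binary-pattern version below, including the weight vector and the per-rest summing convention, is an assumed sketch rather than the paper's implementation:

```python
# Metrical weights for one 4/4 bar on an 8th-note grid: 0 for the
# downbeat, more negative for weaker positions (illustrative values).
WEIGHTS = [0, -3, -2, -3, -1, -3, -2, -3]

def lhl_syncopation(pattern, weights):
    """Sum the weight difference for every rest that is metrically
    stronger than the most recent onset (pattern treated as cyclic)."""
    if 1 not in pattern:
        return 0
    n = len(pattern)
    score = 0
    for j in range(n):
        if pattern[j] == 0:              # a rest (or tied note)
            i = (j - 1) % n
            while pattern[i] == 0:       # find the most recent onset
                i = (i - 1) % n
            if weights[j] > weights[i]:
                score += weights[j] - weights[i]
    return score

on_beat = lhl_syncopation([1, 0, 0, 0, 1, 0, 0, 0], WEIGHTS)   # unsyncopated
off_beat = lhl_syncopation([0, 0, 0, 1, 0, 0, 0, 1], WEIGHTS)  # syncopated
```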
Conference Paper
Full-text available
This paper presents a drum transcription algorithm adjusted to the constraints of real-time audio. We introduce an instance filtering (IF) method using sub-band onset detection, which improves the performance of a system having at its core a feature-based K-nearest neighbor classifier (KNN). The architecture proposed allows for adapting different parts of the algorithm for either bass drum, snare drum or hi-hat cymbals. The open-source system is implemented in the graphic programming languages Pure Data (PD) and Max MSP, and aims to work with a large variety of drum sets. We evaluated its performance on a database of audio samples generated from a well known collection of midi drum loops randomly matched with a diverse collection of drum sets. Both of the evaluation stages, testing and validation, show an improvement in the performance when using the instance filtering algorithm.
Article
Full-text available
A study of syncopation in American popular music was carried out based on analyses of sound recordings spanning the period 1890 to 1939. Sample measures were randomly selected and the presence and rhythmic character of syncopations were tabulated. Several trend-related hypotheses were tested. While some changes in the patterns of syncopation were evident over the 50-year period of the study, the principal change was an increase in the quantity of syncopation rather than an increase in the variety of syncopated patterns.
Article
Full-text available
The assignment of a rhythmic interpretation to a piece of metrical music calls for the postulation of an underlying meter and the parsing of the note values according to this meter. In this article we develop the implications of this view, which include the following propositions. 1. Any given sequence of note values is in principle rhythmically ambiguous, although this ambiguity is seldom apparent to the listener. 2. In choosing a rhythmic interpretation for a given note sequence the listener seems to be guided by a strong assumption: if the sequence can be interpreted as the realization of an unsyncopated passage, then that is how he will interpret it. 3. Phrasing can make an important difference to the rhythmic interpretation that the listener assigns to a given sequence. Phrasing can therefore serve a structural function as well as a purely ornamental one.
Article
This edition has been replaced by a newer 2003 edition. This classic reference work is simply the best one-volume music dictionary available today. Its nearly 6,000 entries, written by more than 70 top musicologists, are consistently lucid and based on recent scholarship. "The New Harvard Dictionary of Music" contains among its riches superb articles on music of the 20th century, including jazz, rock, and mixed media as well as twelve-tone, serial, and aleatory music; comprehensive articles on the music of Africa, Asia, Latin America, and the Near East; entries on all the styles and forms in Western art music; and descriptions of instruments enriched by historical background. Short entries for quick reference--definitions and identifications--alternate with encyclopedia-length articles written by experts in each field. More than 220 drawings and 250 musical examples enhance the text. Combining authoritative scholarship with concise, lively prose, "The New Harvard Dictionary of Music" is the essential guide for musicians, students, and everyone who listens to music.
Article
The cognitive strategies by which humans process complex, metrically ambiguous rhythmic patterns remain poorly understood. We investigated listeners' abilities to perceive, process and produce complex, syncopated rhythmic patterns played against a regular sequence of pulses. Rhythmic complexity was varied along a continuum; complexity was quantified using an objective metric of syncopation suggested by Longuet-Higgins and Lee. We used a recognition memory task to assess the immediate and longer-term perceptual salience and memorability of rhythmic patterns. The tasks required subjects to (a) tap in time to the rhythms, (b) reproduce these same rhythm patterns given a steady pulse, and (c) recognize these patterns when replayed both immediately after the other tasks, and after a 24-hour delay. Subjects tended to reset the phase of their internally generated pulse with highly complex, syncopated rhythms, often pursuing a strategy of reinterpreting or "re-hearing" the rhythm as less syncopated. Thus, greater complexity in rhythmic stimuli leads to a reorganization of the cognitive representation of the temporal structure of events. Less complex rhythms were also more robustly encoded into long-term memory than more complex, syncopated rhythms in the delayed memory task.
Article
In Experiment 1, six cyclically repeating interonset interval patterns (1,2:1,2:1:1,3:2:1,3:1:2, and 2:1:1:2) were each presented at six different note rates (very slow to very fast). Each trial began at a random point in the rhythmic cycle. Listeners were asked to tap along with the underlying beat or pulse. The number of times a given pulse (period, phase) was selected was taken as a measure of its perceptual salience. Responses gravitated toward a moderate pulse period of about 700 ms. At faster tempi, taps coincided more often with events followed by longer interonset intervals. In Experiment 2, listeners heard the same set of rhythmic patterns, plus a single sound in a different timbre, and were asked whether the extra sound fell on or off the beat. The position of the downbeat was found to be quite ambiguous. A quantitative model was developed from the following assumptions. The phenomenal accent of an event depends on the interonset interval that follows it, saturating for interonset intervals greater than about 1 s. The salience of a pulse sensation depends on the number of events matching a hypothetical isochronous template, and on the period of the template—pulse sensations are most salient in the vicinity of roughly 100 events per minute (moderate tempo). The metrical accent of an event depends on the saliences of pulse sensations including that event. Calculated pulse saliences and metrical accents according to the model agree well with experimental results (r > 0.85). The model may be extended to cover perceived meter, perceptible subdivisions of a beat, categorical perception, expressive timing, temporal precision and discrimination, and primacy/recency effects. The sensation of pulse may be the essential factor distinguishing musical rhythm from nonrhythm.
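The quantitative model summarized in this abstract can be caricatured in a few lines: a phenomenal accent that saturates with the following inter-onset interval, an isochronous-template match, and a moderate-tempo preference. All constants and functional forms below (the 1 s saturation point, the log-Gaussian tempo weighting, the matching tolerance) are illustrative stand-ins, not the published model:

```python
import math

def phenomenal_accent(ioi_after):
    """Accent grows with the following inter-onset interval and
    saturates for IOIs beyond roughly 1 s."""
    return min(ioi_after, 1.0)

def tempo_weight(period, preferred=0.7, spread=0.6):
    """Log-Gaussian preference for moderate pulse periods (~700 ms)."""
    return math.exp(-0.5 * (math.log(period / preferred) / spread) ** 2)

def pulse_salience(onsets, period, phase, tol=0.05):
    """Sum the accents of onsets matching an isochronous template
    (period, phase), weighted by the tempo preference."""
    total = 0.0
    for i, t in enumerate(onsets):
        ioi = onsets[i + 1] - t if i + 1 < len(onsets) else 1.0
        k = round((t - phase) / period)
        if abs(t - (phase + k * period)) <= tol:
            total += phenomenal_accent(ioi)
    return total * tempo_weight(period)

taps = [0.0, 0.7, 1.4, 2.1]   # isochronous onsets, 700 ms apart
moderate = pulse_salience(taps, period=0.7, phase=0.0)
fast = pulse_salience(taps, period=0.35, phase=0.0)
# Both templates match every onset, but the moderate period is preferred.
```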
Article
While study of the social and cultural aspects of popular music has been flourishing for some time, it is only in the last few years that serious efforts have been made to analyse the music itself: what Allan Moore has called ‘the primary text’ (1993, p. 1). These efforts include general studies of styles and genres (Moore, 1993; Bowman, 1995); studies of specific aspects of popular styles such as harmony and improvisation (Winkler 1978; Moore 1992, 1995; Walser 1992), as well as more intensive analyses of individual songs (Tagg 1982; Hawkins 1992). In this paper I will investigate syncopation, a phenomenon of great importance in many genres of popular music and particularly in rock.