Syncopalooza: Manipulating the Syncopation in
Rhythmic Performances
George Sioros1, Marius Miron2, Diogo Cocharro2, Carlos Guedes1 and Fabien Gouyon2
1 Faculdade de Engenharia da Universidade do Porto
2 INESC-TEC Porto
gsioros@gmail.com
Abstract. In this paper we present a novel method that can effectively
manipulate the syncopation in rhythmic performances. The core of the method
consists of a set of formalized transformations that can effectively remove and
create new syncopations in a binary pattern. One can obtain a multitude of
rhythmic patterns with different degrees of syncopation by successively
applying the transformations to a single input pattern. The “Syncopalooza”
software application implements the method as two Max4Live devices: an
audio device that can transcribe drum performances in real time and a MIDI
device that manipulates the syncopation of the transcribed patterns. In addition,
we evaluate the syncopation transformations and show that they perform as
they are intended.
Keywords: rhythm, syncopation, transformations, meter, automatic rhythm
generation, real time.
1 Introduction
Syncopation is an essential musical phenomenon found in many diverse styles
of western music, as well as in certain non-western music. It is commonly related to
rhythmic complexity and music tension [1, 2 p. 310]. Definitions of syncopation
often relate syncopation to the musical meter, describing it, for example, as a
contradiction to the prevailing meter [3].
Syncopation is a matter of magnitude and the subject of comparison between
rhythmic patterns: patterns are considered to be more or less syncopated than others.
This idea led to numerous models oriented towards the analysis and measurement of
syncopation [1, 4–6]. However, only a few algorithms have been proposed to generate
syncopation, such as the one by Sioros and Guedes [7]. To our knowledge, no
formalized algorithm has been proposed that allows for the manipulation of
syncopation, e.g. a computer algorithm that is able to remove or introduce
syncopation in a certain rhythmic pattern in a controlled way.
In this paper we present a method and a software tool for musicians which can
manipulate the syncopation of rhythmic performances. In the core of the software lies
a set of novel transformations that systematically remove existing syncopations or
introduce new ones. The software is developed as two Max4Live devices and is
intended to be used as a tool to explore syncopation both in live performance and in
off-line composition.
The first Max4Live device is an audio device that takes a real time audio stream of
a drum performance and transcribes it into three streams of event onsets, effectively
separating the kick-drum, snare-drum and cymbals.
The second Max4Live device is a MIDI device. It takes either one of the streams of
onsets produced by the first device or a user defined pattern and 1) removes all
syncopation, and 2) introduces new syncopations, different from the ones in the
original stream. The de-syncopation and re-syncopation are achieved by shifting onsets
to stronger or weaker metrical positions based on a certain metrical template. Finally,
the transformed streams are converted into MIDI note events which can be fed to any
standard MIDI synthesizer, generating new rhythmic performances in real time that
are always in synchrony with the original one. In order to enhance the metrical and
the syncopation feel, we assign dynamic accents to the onsets, either on the beat (thus
enhancing the metrical feel) or off the beat (thus enhancing the syncopation feel). The
transformations are applied in such a way that the syncopation is removed and re-
introduced gradually, resulting in a smooth transition from original pattern, to the de-
syncopated one, to the re-syncopated one.
In section 2, we provide a brief overview of the definitions and models of
syncopation that are the most relevant to the syncopation transformations introduced
in this paper. In section 3, we present the syncopation transformations. In section 4,
we evaluate the syncopation transformations. In section 5, we describe the Max4Live
software devices. In section 6, we close with the conclusions and future work.
2 Background
Longuet-Higgins and Lee presented a syncopation model that identifies the
syncopation in the pairs of notes and the rests or tied notes that follow them [4].
Accordingly, a rest or tied note in a strong metrical position preceded by an event in a
weaker metrical position constitutes a syncopation. David Huron, in his study of American popular music [6], used a similar definition, employing the term “lacuna” to
describe syncopation. Behind both of the definitions lies the same expectation
principle: an event in a weak metrical position is bound to an event in the following
strong metrical position and when the expected strong note does not occur the weak
one is left hanging [2 p. 295].
Longuet-Higgins and Lee defined syncopation with the aid of metrical weights.
The metrical weights correspond directly to the metrical level that each metrical
position in a bar initiates (see section 3.1 for a more detailed description). The slower
the metrical level the higher the metrical weight. In Fig. 1, the metrical levels of a 4/4
meter are numbered from 0 to 4. The metrical weights are the negative of those
indexes (i.e. from 0 to -4). Syncopations are found when the metrical position of a rest
or tied note has a higher weight than the position of the preceding note onset.
W. T. Fitch and A. J. Rosenfeld derived a syncopation measure [5] (hereafter
LHL) from the Longuet-Higgins and Lee definition. The LHL measure attributes to
each syncopation a score that is the difference between the two metrical weights
described above. The total syncopation in a rhythmic pattern is the sum of all the
syncopation scores. In this paper, when a rhythmic pattern consists of more than one
bar, we divide the sum by the number of bars in order for all the results to have
comparable scales.
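For concreteness, the following is a minimal Python sketch of an LHL-style score on a sixteenth-note 4/4 grid, with the metrical weights taken as the negatives of the level indexes as described above. It is one straightforward reading of the definition, not necessarily the exact implementation of [5] or of this paper, and all names are ours.

# Metrical level of each sixteenth-note pulse of one 4/4 bar (0 = whole bar level).
LEVELS_44 = [0, 4, 3, 4, 2, 4, 3, 4, 1, 4, 3, 4, 2, 4, 3, 4]
WEIGHTS_44 = [-lvl for lvl in LEVELS_44]

def lhl_score(pattern, weights=WEIGHTS_44):
    """Sum of (rest weight - preceding onset weight) over all syncopating
    rest/onset pairs, divided by the number of bars."""
    n, bar = len(pattern), len(weights)
    total = 0
    for i, is_onset in enumerate(pattern):
        if is_onset:
            continue                       # syncopation is scored at silent pulses
        w_rest = weights[i % bar]
        for back in range(1, n):           # most recent onset, wrapping around the loop
            j = (i - back) % n
            if pattern[j]:
                w_onset = weights[j % bar]
                if w_rest > w_onset:       # the silent pulse is metrically stronger
                    total += w_rest - w_onset
                break
    return total / (n // bar)

print(lhl_score([0, 0, 1, 0] * 4))         # a simple off-beat pattern, one bar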
David Temperley developed a model for exploring the uses of syncopation in rock
[8], which bears a strong resemblance to both the LHL definition of syncopation and
Huron’s expectation principle. In his model, syncopation is defined as the
displacement of events from metrically strong positions to preceding weaker ones.
The syncopation transformations that we introduce in this paper are inspired by this
model.
3 Syncopation Transformations
In order to remove the syncopation or introduce new syncopations in a stream of
onsets, we constructed a metrical template that describes the meter in which the
rhythmic patterns are heard. The template consists of pulses with different metrical
strength values that represent the alternating strong and weak beats commonly found
in a musical meter (Fig. 1, top). The pulses constitute a metrical grid which quantizes
the time positions of the onsets in a rhythmic input. The pulses belong to the various
metrical levels according to the time signature, e.g. the quarter notes or the sixteenth
notes found in a 4/4 bar. This way, a continuous stream of onsets is converted into a
binary string consisting of 1s and 0s.
Two kinds of transformations are then applied by shifting the onsets (1s) to empty
pulses (0s) similarly to the shifts described by Temperley [8]. First, the onsets are
shifted forward to strong metrical positions to remove the existing syncopations. Each
shift of an event results in a new pattern with decreased syncopation. When all events
have been shifted to their non-syncopating positions the resulting de-syncopated
pattern has no syncopation. Second, new syncopations are introduced. To that end, the
onsets found in strong metrical positions are anticipated to weaker positions. This
time, each shift of an event results in a new pattern with increased syncopation. When
all “strong” onsets have been moved to weaker positions the resulting re-syncopated
pattern has maximum syncopation.
Pulses that belong to slower metrical levels are considered to be stronger, so that
the de-syncopation process moves events from fast metrical levels to slower ones.
Conversely, the re-syncopation process “pushes” events to the faster metrical levels.
Although the initial and final patterns of each transformation, i.e. the original, the de-
syncopated, and the re-syncopated, are uniquely defined, the path that leads from one
to the other depends on the order in which the events are shifted.
Finally, dynamic accents are applied to the onsets. Dynamic accents can enhance
the perception of meter for the onsets that do not syncopate and at the same time
increase the feeling of syncopation for those onsets that are anticipated. The accents
are applied to the de-syncopated pattern but once an onset receives an accent it carries
it to any position it is shifted to, in the original and the re-syncopated version as well
as in any step in between. In that way, accents can be introduced both on the strong
beats as well as in weaker pulses when syncopating.
At a secondary level, “extra” syncopation can also be introduced without the need
to shift onsets. Instead, onsets that are found in weak metrical positions (in the fast
metrical levels) could be accented, while at the same time, onsets in the following
strong metrical positions (in the slower metrical levels) would be attenuated. This
secondary mechanism for generating syncopation is especially useful in cases where
moving onsets is either impossible or ineffective due to the specific characteristics of
a rhythmic pattern. Such patterns usually have a high density of events per bar, like
drum rolls where all metrical subdivisions, even the fastest one, are fully occupied by
onsets. In that case, one can reveal the underlying meter and syncopation only by
accenting certain events.
We begin the detailed description of the transformations, in section 3.1, with the
construction of the metrical template needed for the transformations. We continue, in
section 3.2, with the algorithm for removing the syncopation in binary patterns. In
section 3.3, we present the algorithm for generating syncopation by shifting the
onsets. Finally, in section 3.4, we describe how dynamic accents can be used to
enhance the metrical feel and generate syncopation.
3.1 Metrical Template.
The metrical template is constructed automatically for each meter and tempo that the
rhythmic patterns are heard in. The meter determines a hierarchical structure like the
one found in the Generative Theory of Tonal Music by Lerdahl and Jackendoff [9].
Such a structure can be constructed by successively subdividing the bar into faster
metrical levels. For example, the 4/4 meter can be subdivided first into two half notes,
then each half note into two quarter notes, each quarter note into two eighth notes and so on, until the fastest metrical subdivision is reached. The metrical levels are indexed by numbers, referred to as metrical indexes for easy reference, starting with the number 0 for the slowest one and increasing as one goes to faster levels. The process
of successively subdividing the bar results in alternating weak and strong pulses,
forming a pattern characteristic of each time signature. The stronger a pulse is, the slower the metrical level it belongs to (lower index), so that weak pulses belong to faster metrical levels (higher indexes). In Fig. 1 an example of such a metrical template is given. A detailed description of an automatic way of generating such a metrical template for any given time signature can be found in [7].

Fig. 1: Top: Example of the construction of a metrical template for a 4/4 meter. The meter is successively subdivided generating the 5 metrical levels. The metrical strength (grey rectangles) of each metrical position corresponds to the metrical levels it belongs to. Bottom: A continuous stream of onsets is quantized to the fastest subdivision of the metrical template. A binary pattern is created containing 0s (empty boxes) and 1s (filled boxes) according to the position of the onsets (vertical lines).
The duration of each metrical subdivision depends on the tempo, e.g. the quarter
note at 100bpm has a duration of 600ms. The lower threshold for the duration of a metrical subdivision has been estimated in several studies to be roughly around 100ms [10–12]. The fastest metrical subdivision that we include in the metrical
template is the fastest subdivision above that threshold.
We intend to use this template as the basis for the transformations that will remove
or create syncopation in binary patterns. The definition of syncopation, as originally
formulated by Longuet-Higgins and Lee [4] or Huron [2 p. 295] and also adopted here, attributes syncopation to the pair of an onset on a weak pulse and the following silent strong pulse. One can, therefore, talk about and define the duration of the
syncopation as the duration between those two pulses. The duration of the
syncopation depends, on one hand, on the metrical levels of the corresponding pulses,
and on the other hand, on the tempo. One question that arises is: how does the
duration of the syncopation affect the feeling of syncopation?
Think of the example of Fig. 2, where a quarter note is tied to the next half note.
Although, according to the definition of syncopation, this is a clear case of
syncopation, the actual syncopation felt by the listener depends on the tempo at which
this pattern is performed. For example, at 100 bpm (q.n. = 600ms) the pattern is not
felt as syncopated since all events fall on the beat. Nevertheless, if the pattern is
played at double speed (q.n. = 300ms), it is felt as strongly syncopated. The actual
beat level would now be that of the half note, i.e. on the 1st and 3rd quarter notes, so
that the 2nd and 4th would be off-beat. The metrical hierarchy in both cases is the same; what changes is the salience of each metrical level.
Fig. 2: Syncopation at slow metrical levels. Top: Metrical template for a 4/4 meter. The most
salient pulse is notated i) at 100bpm with black rectangles and ii) at 200bpm with grey
rectangles. Bottom: A syncopated rhythmic pattern. Syncopation is depicted with the curved arrow line.
Fig. 3: Example of how tempo affects the metrical template. Very fast metrical levels (below
100ms duration) and very slow ones (above 1s duration) are disregarded.
This aspect of syncopation and its relation to pulse salience has not been
thoroughly studied. However, based on our experience, we understand that
syncopation involving only slower metrical levels is not felt as strong. Metrical levels
have a peak salience in the region between 500ms and 1s [11]. Syncopations with
duration longer than that of the most salient level seem to be felt significantly less
strong. The LHL syncopation measure [4] takes this effect into account indirectly, by
giving to the syncopation that involves adjacent metrical levels a relatively small
weight. Other syncopation measures, such as the weighted note-to-beat distance (WNBD) [13], relate syncopation directly to the beat level, ignoring all slower
metrical levels.
In order to take the above effect of tempo into consideration in the construction of
the template, we employ a similar approach to the WNBD by essentially “chopping off” the slow metrical levels. We choose the level that falls in the range between
500ms and 1s as the slowest metrical level represented in our structure. For example,
in the case of a 4/4 meter at 160bpm (q.n. = 375ms), the metrical template of Fig. 1
will become as in Fig. 3. At this tempo, only 3 metrical levels survive with
corresponding durations of 750ms (0), 375ms (1) and 187.5ms (2).
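As an illustration, the following Python sketch builds such a tempo-dependent template for a 4/4 meter by successive subdivision, discarding levels faster than roughly 100ms and keeping, as the slowest level, the one whose duration falls between 500ms and 1s. The function and parameter names are ours, not those of the Syncopalooza code.

def template_44(bpm, min_dur=0.1, slow_range=(0.5, 1.0)):
    """Metrical level index (0 = slowest kept level) of every pulse of the
    fastest subdivision of one 4/4 bar at the given tempo."""
    quarter = 60.0 / bpm
    bar = 4 * quarter
    durations = []                        # candidate levels: bar, half, quarter, ...
    d = bar
    while d >= min_dur:                   # discard levels faster than ~100 ms
        durations.append(d)
        d /= 2.0
    # keep only the levels from the "slowest salient" one (500 ms - 1 s) downwards
    slowest = next(i for i, dur in enumerate(durations)
                   if slow_range[0] <= dur <= slow_range[1])
    kept = durations[slowest:]            # re-indexed: 0 = slowest surviving level
    step = kept[-1]                       # duration of the fastest subdivision
    levels = []
    for p in range(int(round(bar / step))):
        t = p * step
        # a pulse gets the index of the slowest kept level whose grid it falls on
        levels.append(next(i for i, dur in enumerate(kept)
                           if abs(t / dur - round(t / dur)) < 1e-9))
    return levels

print(template_44(100))   # 4/4 at 100bpm: three of the five candidate levels survive
print(template_44(160))   # as in Fig. 3: surviving levels of 750, 375 and 187.5 ms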
3.2 De-syncopation.
The de-syncopation algorithm is a direct consequence of the operational definition of
syncopation followed in this article [2 p. 295, 4]: syncopation in the binary patterns is
found in those events that are placed in a relatively weak metrical position (faster
metrical levels, higher metrical indexes) and are not followed by an event in the
following stronger position (slow metrical level, low metrical index). In order to de-
syncopate such an event, we shift its onset to a position where it no longer syncopates, i.e. to the following strong metrical position which was initially empty and was thus evoking the feeling of syncopation.
An onset is not shifted if one or more other onsets block its way. This rule ensures that the order in which the events are performed is preserved. In such cases, the onset that
is blocking the shifting is also syncopating and should be de-syncopated first. For that
reason, the de-syncopation process is a recursive process that stops when all
syncopating events have been moved to positions where they no longer syncopate.
The above rule might at first glance seem to have no purpose. After all, the
onsets in the binary string are indistinguishable from one another, so that the order in
which they are played back is unimportant. Nevertheless, we will later attribute to the
onsets dynamic accents. Each one will receive a specific amplitude value and it will
carry it in all the generated variations. Thus, onsets will be eventually distinct and
therefore their order must be preserved. Moreover, this way the process is more
general and can be expanded to include other types of phenomenal accents, such as
pitch or timbre changes, or other properties of the events that can make them distinct.
Fig. 4 illustrates the process through an example. In the example the pattern is
scanned from right to left and each event is examined and shifted when appropriate.
However, the order in which the onsets are examined and shifted can be, in principle,
freely chosen. The first event found, denoted ① in the figure, belongs to the fastest metrical level, in pulse 2. Since pulse 3 belongs to a slower metrical level and is not occupied by another event, onset ① is pushed forward to pulse 3. The next event, ②, is found in pulse 7. The following pulse belonging to a slower metrical level is pulse 9, but it is already occupied by an event, so event ② is not shifted, as it is not syncopating. The third event, ③, found in pulse 9, already belongs to the slowest metrical level, so it has no pulse to be shifted to. The process re-starts from the beginning. Event ① is now found in pulse 3 and is pushed to pulse 5. This last shift ends the process, since there is no longer any event in the pattern that syncopates. The following pseudocode illustrates the basic steps of the de-syncopation algorithm:
Repeat
    For each onset N
        P = position of N
        Find next stronger position R
        if positions P to R are empty
            Shift onset N to position R
            Output Pattern
Until no onsets can be shifted
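To make the procedure concrete, here is a minimal runnable sketch of the same pass in Python. The pattern is a list of 0s and 1s on the template grid and levels[p] is the metrical level index of pulse p (0 for the slowest); the names are ours and the scanning order is illustrative, since the order of examination may be freely chosen.

def next_stronger(p, levels):
    """First pulse after p that belongs to a slower (stronger) metrical level."""
    for r in range(p + 1, len(levels)):
        if levels[r] < levels[p]:
            return r
    return None

def desyncopate(pattern, levels):
    """Return the list of intermediate patterns, one per applied shift."""
    pattern = list(pattern)
    steps = [list(pattern)]
    moved = True
    while moved:                              # Repeat ... Until no onsets can be shifted
        moved = False
        for p, onset in enumerate(pattern):
            if not onset:
                continue
            r = next_stronger(p, levels)
            # shift only if every pulse from p+1 up to and including r is empty
            if r is not None and not any(pattern[p + 1:r + 1]):
                pattern[p], pattern[r] = 0, 1
                steps.append(list(pattern))
                moved = True
    return steps

LEVELS = [0, 2, 1, 2, 0, 2, 1, 2]             # e.g. a 4/4 template at 160bpm (Fig. 3)
for s in desyncopate([0, 1, 0, 0, 0, 0, 1, 0], LEVELS):
    print(s)                                  # the original, then one pattern per shift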
The described process guarantees that the resulting pattern will not syncopate, something that is not true for other similar transformations. In some cases, one could
de-syncopate off-beat events by shifting them to the preceding on-beat position
instead of the following one. However, the two directions are not equivalent. Our
operational definition of syncopation attributes the syncopation feel to the lack of an
event in the following strong position. The forward shifting of events always
guarantees that the resulting pattern will not syncopate, as the “empty” strong pulse that was previously causing the syncopation will now receive an onset.
Fig. 4: Illustration of a de-syncopation process. The metrical template shown corresponds to a 4/4 meter at 100bpm (q.n. = 600ms). Each onset of the binary pattern is numbered in a circle. The de-syncopation process is shown as arrows.

3.3 Re-syncopation.

The re-syncopation process takes a binary pattern and creates syncopations by anticipating the events found in strong metrical positions. It is essentially the reverse process of de-syncopation. Onsets in pulses that belong to strong metrical positions (slow metrical levels, low metrical indexes) are shifted to preceding pulses belonging to weaker metrical positions (faster metrical levels, higher level indexes). Again, as in the de-syncopation process, the order in which events are heard should be preserved, so that an onset cannot be shifted when it is blocked by some other onset(s) found “on the way” of the shift (in between the initial and destination pulse, or at the destination pulse).
Contrary to the de-syncopation process, in the re-syncopation there might be more
than one option for where to shift the onsets in question. In other words, there might
be more than one way to syncopate at a certain metrical position. A strong pulse
might be preceded by more than one weak pulse, e.g. a pulse that belongs to the
quarter note level is preceded by a pulse at the eighth note level and another one at the
sixteenth note level, both belonging to faster metrical levels than the initial one (Fig. 5
A). The decision of where such an onset should be shifted to is a “stylistic” one, as all
the options result in syncopating patterns but each one syncopates in a different way.
Several “stylistic” rules can be invented in order to take such decisions in an
automatic way. Stylistic rules can be “learned”, e.g. by analyzing a collection of
rhythms in a certain music style. The de-syncopation process can serve as an analysis
that associates metrical positions with particular syncopation shifts. In that way, one
can apply the particular syncopation of a music style in any rhythmic pattern.
Stylistic rules can also be created by a performer, even during the performance. In
that case the rules must be simpler and should have a generic form that can be easily
controlled.
We decided to include in our algorithm one such rule that we believe can work
well in a variety of music styles and can be controlled in real time through a single
parameter. Our stylistic rule allows shifts only between certain metrical levels
according to the difference of the metrical indexes. For example, if the difference is
set to 1, onsets would be shifted to the immediately faster metrical subdivision, while if it is set to 2, onsets would be shifted two metrical subdivisions faster. An onset found at the quarter note level, in the first case, would be shifted to the preceding eighth note metrical position. In the second case, it would be shifted to the preceding sixteenth note position. In cases where the level difference results in a metrical level faster than the fastest metrical subdivision included in the metrical template, this fastest metrical subdivision is used, disregarding the rule. For example, if the level difference is set to 2, an onset at the eighth note level should be shifted to the preceding 32nd note.
If the template only goes as fast as the sixteenth note level, then the onset should be
shifted to the preceding sixteenth note, disregarding the rule.
The re-syncopation process comprises two main stages: 1) the pattern is scanned
and the events that are not blocked and can be shifted are marked along with their
possible shifts, and 2) the actual shifts are applied. In Fig. 5 B we present an example
illustrative of the process. In the example, the pattern is scanned from right to left.
However, the order in which the onsets are examined and shifted can be, in principle,
freely chosen. The stylistic rule is set so that the metrical level difference should
always be equal to 1. In the first stage, the pattern is scanned without making any
transformations. First, the last onset of the pattern, which lies on a pulse of metrical level 0, is examined. According to the rule (difference = 1) it should be shifted to pulse 6, which belongs to level 1. Since pulse 6 is already occupied, this onset cannot be shifted. Second, the onset at pulse 6 is examined. It can be moved from pulse 6 (level 1) to pulse 5 (level 2). The shift is stored but not applied until all events have been examined. Next, the onset at pulse 4 is examined. It can be moved from pulse 4 (level 0) to pulse 2 (level 1). It can be further moved, in a second syncopation step, to pulse 1 (level 2). Its examination ends the first stage of the process. In the second stage, the transformations are applied. First, the onset at pulse 6 is shifted to pulse 5. Then, the onset at pulse 4 is shifted to pulse 2. Finally, it is moved further, to pulse 1.
The two separate stages are used to ensure that syncopations created by shifting one onset are not cancelled by shifting another onset to the silent pulse of the syncopation. For example, if the last onset, which was blocked in the first stage, were shifted to pulse 6 in the final pattern, then the syncopation of the onset shifted to pulse 5 would be weakened.

Fig. 5: A: Example of multiple choices of syncopation shifts for a single onset. B: Illustration of a re-syncopation process. Each onset of the binary pattern is numbered in a circle. The re-syncopation process is shown as arrows.
The following pseudocode illustrates the basic steps of the re-syncopation algorithm:
T = Array of Transformations
For each onset N
    P = position of N
    Repeat
        Find previous weaker position R
        if positions P to R are empty
            Store transformation [P → R] into T
            Set P = R
    Until P does NOT change
end
For each element [P → R] in array T
    Apply transformation [P → R]
    Output Pattern
end
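A corresponding runnable sketch of the two-stage pass is given below, using the same pattern/levels representation as the de-syncopation sketch; diff is the level difference of the stylistic rule, and the names are ours, not those of the Syncopalooza code.

def find_prev_weaker(pos, levels, diff):
    """Nearest preceding pulse whose level is `diff` levels faster than pos's
    level (clamped to the fastest level in the template), or None."""
    wanted = min(levels[pos] + diff, max(levels))
    if wanted <= levels[pos]:                 # already at the fastest level: no shift
        return None
    for r in range(pos - 1, -1, -1):
        if levels[r] == wanted:
            return r
    return None

def resyncopate(pattern, levels, diff=1):
    pattern = list(pattern)
    shifts = []                               # stage 1: collect shifts, do not apply
    for p in range(len(pattern) - 1, -1, -1): # scan right to left
        if not pattern[p]:
            continue
        pos = p
        while True:                           # Repeat ... Until P does not change
            r = find_prev_weaker(pos, levels, diff)
            # blocked if the destination or any pulse between it and the onset
            # is occupied in the original (untransformed) pattern
            if r is None or any(pattern[r:pos]):
                break
            shifts.append((pos, r))
            pos = r
    for pos, r in shifts:                     # stage 2: apply the stored shifts
        pattern[pos], pattern[r] = 0, 1
    return pattern

LEVELS = [0, 2, 1, 2, 0, 2, 1, 2]
print(resyncopate([1, 0, 0, 0, 1, 0, 1, 0], LEVELS, diff=1))
# the first onset has no preceding pulse in this one-bar sketch and stays put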
3.4 Dynamic Accents.
We introduce dynamic accents to the binary patterns for two main reasons. First, the
metrical feel is enhanced by accenting the onsets that exist on strong pulses. Second,
we can syncopate onsets that could not be shifted during the re-syncopation process
by moving the accents to off-beat onsets. Each onset is assigned a specific dynamic
accent in the range between 0 (meaning a silent onset) and 1 (loudest possible). In that
way, the binary string that is created based on the drum transcription now becomes a sequence of amplitudes, which can take values other than just 0s and 1s. In the
paragraphs that follow we will describe first, how the onsets are assigned amplitudes
that enhance the metrical feel, and second, the generation of syncopation by
manipulating the amplitudes of the onsets.
Enhancing the Metrical Feel. The metrical feel is enhanced when phenomenal
accents coincide with the metrical accents, i.e. when changes in the loudness, pitch
etc. coincide with the alternation of strong and weak pulses characteristic of the
meter. Based on this principle, each onset is assigned an amplitude value relative to
the metrical strength of the pulse that the onset belongs to. However, the same onset
can be found in different pulses in the different versions of the patterns, i.e. the
original, the de-syncopated or the re-syncopated versions. Syncopation is described as
a contradiction to the underlying meter [10], which means that the version of our
binary pattern that represents best the meter would be the fully de-syncopated one.
We assign the amplitudes to the de-syncopated positions of the onsets. The
amplitudes are assigned to the onsets themselves and not to the pulses, so that when
the onsets are shifted to different positions in the various syncopated versions, they
still carry their accents from the de-syncopated version. For example, an onset found
at the beat or quarter note level in the de-syncopated version will still be a loud onset
with a high amplitude value even when it is shifted to a faster metrical level. In that
way, the feeling of syncopation is intensified. The syncopating event is distinct from
other events that are found in fast metrical levels but that do not syncopate. Those
non-syncopating events receive lower amplitudes since they belong in fast metrical
levels in the de-syncopated version.
The following mathematical formula is used to calculate the amplitudes of onsets:

A(i) = C^L(pi)
where A(i) (range: 0 to 1) is the amplitude of event i, C is a parameter that controls the
contrast between strong and weak metrical positions, pi is the pulse that event i
belongs to in the de-syncopated version and L(pi) is the metrical level of that position.
The parameter C ranges between 0 (only the slowest metrical level survives) and 1
(all metrical levels receive the same amplitude).
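As a small illustration, and assuming the amplitude rule A(i) = C^L(pi) stated above, the accents could be assigned as in the following sketch; the optional anticipate flag implements the one-pulse template shift used for generating syncopation, which is described next. The names are ours.

def accents(desync_pattern, levels, contrast, anticipate=False):
    """Amplitude for every onset of the de-syncopated pattern (None for rests)."""
    amps = []
    for p, onset in enumerate(desync_pattern):
        if not onset:
            amps.append(None)
            continue
        # optionally read the metrical level one pulse later, i.e. shift the
        # template one pulse to the left so off-beat onsets receive the strong accents
        lvl = levels[(p + 1) % len(levels)] if anticipate else levels[p]
        amps.append(contrast ** lvl)
    return amps

LEVELS = [0, 2, 1, 2, 0, 2, 1, 2]
print(accents([1, 0, 1, 0, 1, 0, 1, 0], LEVELS, contrast=0.5))
# C = 0 keeps only the slowest level; C = 1 gives all onsets the same amplitude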
Generating Syncopation. Dynamic accents can also be employed to create
syncopation when events cannot be shifted to faster metrical levels. Imagine the
extreme case where all pulses, even at the fastest metrical subdivision, have an onset.
That would be, for example, the case of drum rolls. In such cases, onsets cannot be
shifted since there are no empty pulses available and the only way to reveal the meter
or contradict it is through accents. In order to introduce syncopation to such a pattern,
we stress certain off the beat onsets while attenuating the ones that follow them on the
beat. For those onsets, the phenomenal accents would not coincide with the metrical
accents, and that mismatch of phenomenal to metrical accents would be felt as
syncopation.
Looking closer at the metrical template and the way it is constructed, one can see
that strong and weak pulses are always alternating. Strong pulses, which belong to
slow metrical levels, are always followed by weak pulses, which belong to faster
metrical levels. Noticing that, and in order to syncopate certain onsets, we shift the metrical template to the left by one pulse before assigning them amplitudes. In effect,
the dynamic accents get anticipated. In that way, the events occurring off the beat
would be accented relative to the events on the beat.
The generation of syncopation based on dynamic accents is a secondary
mechanism, needed only when the onsets themselves cannot be shifted. When the
onsets can be shifted, they carry with them the dynamic accents, so that onsets and
accents are anticipated together.
The example in Fig. 6 illustrates how dynamic accents are used. Metrical accents
are calculated according to the metrical template (A). They are, then, assigned to the
onsets of a de-syncopated pattern (B and C). Finally, syncopation is generated in two
ways (D): 1) by shifting the onsets which also carry their accents, and 2) by shifting
the metrical accents instead of the onsets themselves.

Fig. 6: Example illustrating the process of assigning dynamic accents. A: Metrical accents calculated based on the metrical template. B: A binary, de-syncopated pattern. C: Dynamic accents on the events in their de-syncopated positions. D: Re-syncopated pattern. The event on pulse 2 is now shifted to pulse 1 (dashed arrow). The events on pulses 3 and 4 cannot be shifted. Instead, the accents on pulses 3 and 4 are shifted (solid arrow).
4 Evaluation of Syncopation Transformations
In this section we evaluate the syncopation transformations described in 3.2 and 3.3 in
practice, i.e. with actual musical material and not just theoretical binary patterns. We
applied the transformations on a collection of MIDI drum loops. Then, we used the
LHL syncopation metric described in section 2 to measure the syncopation in the
original loops and in three more versions that have undergone the syncopation
transformations: 1) the de-syncopated version, 2) a re-syncopated version with 30%
of the total transformations applied, and 3) a re-syncopated version with 70% of the
total transformations applied. The onsets that were shifted in the re-syncopated
versions were chosen randomly from the total of onsets that could be shifted in each
pattern.
The MIDI collection was taken from [14] and consists of 160 drum loops of
different music styles. The MIDI files were segregated into three separate MIDI
streams according to the MIDI note numbers that correspond to the kick drums,
snare drums and hi-hats. All 480 loops (3 x 160) were in a 4/4 meter and quantized
to the 16th note grid and they were converted to binary patterns before applying any
transformation. In all cases we used a metrical template that corresponds to a 4/4
meter at 100bpm. First, they were de-syncopated according to the process described
in 3.2. Second, the resulting de-syncopated patterns were re-syncopated twice (with 30% and 70% of the transformations applied) according to the process described in 3.3. The “stylistic” rule was set to allow syncopation shifts only to two metrical
subdivisions faster. We used the LHL metric to measure the syncopation in all
versions of each of the 480 loops. An overview of the results is shown in Fig. 7.
The de-syncopation and re-syncopation algorithms performed as expected. The de-
syncopation algorithm removed completely the syncopation in all 480 binary patterns
(syncopation score = 0). This was expected as the de-syncopation process directly
corresponds to the way onsets are matched to following silent pulses in the definition
of syncopation in the LHL algorithm.
The re-syncopation process gradually increased the syncopation. The 30% transformation increased the syncopation in the majority of the patterns (in 469 out of 480 patterns). The 70% transformation further increased the syncopation for the majority of the patterns (in 445 out of 480 patterns). No pattern had less syncopation in
the 70% than in the 30% transformations. In a few exceptions, the patterns had the same
syncopation score for the de-syncopated and 30% transformation (11 patterns), or for
the 30% and 70% transformations (35 patterns) (horizontal lines in Fig. 7). In 6 patterns
out of the total 480 the re-syncopation transformations did not create any syncopation.
The exact relation between the syncopation scores and the percentage of the
applied transformations depends on two factors: 1) the number of onsets per bar in the
pattern that can actually be shifted and are not blocked by other onsets and 2) their
metrical positions. When only a small number of events can be shifted there are
accordingly few steps between the de-syncopated version and the fully re-syncopated
version (100% of the transformations applied). The number of events that can be
shifted in the pattern strongly depends on the density of events per bar. For very low
density (e.g. only a couple of events per bar) or very high density (when almost all
metrical positions are occupied) the onsets available for syncopation are very few.
The evaluation shows that the re-syncopation transformation increases the syncopation with each step, even when the available steps are very few. In those cases, the dynamic accents are employed to further increase the syncopation.
Fig. 7: Syncopation measurements using the LHL algorithm on the 4 different versions of each of the 480 binary patterns. Each circle represents a single measurement. The grey lines
connect the measurements for the 4 different versions of each pattern. As the number of
measurements is large, several lines naturally overlap. The darker the line is, the more the overlaps.
5 The Syncopalooza Max4Live Devices
Our system comprises two Max4Live devices: 1) an audio device, loaded in an audio
track and 2) a MIDI device loaded in a MIDI track. The audio device transcribes a
real time audio stream of a drum performance into three separate streams of onsets,
corresponding to the kick drum, snare drums and hi-hats. The MIDI device can
receive these streams of onsets and store one bar of one of the streams. It then
transforms it according to the syncopation transformations of section 3. After
applying the transformations, it plays back the resulting pattern as a sequence of MIDI
note events that can be fed to any standard virtual or external synthesizer.
Alternatively, the MIDI device can be used independently of the audio device. In that
case, the user can directly input a rhythmic pattern into the device, using the user
interface.
In section 5.1, we provide a description of the audio device together with the drum
transcription algorithm. In section 5.2, we describe the MIDI device.
5.1 Audio Device: Drum Transcription
Our audio device is driven by live audio comprising drum sounds. It analyzes the input signal and labels each detected event as either bass drum (BD), snare drum (SD) or hi-hat cymbal (HH), from a mixture of bass drums, snare drums,
hi-hat cymbals and toms. Each time an event is detected, it is “broadcasted” to any
other Max4Live device loaded in the same Ableton Live Set.
We use the drum transcription algorithm described in [14]. Alongside the event detection and k-nearest neighbor classification, the method includes an additional instance filtering (IF) stage, which filters events in the corresponding frequency bands for BD,
SD and HH, before feeding them to the classifier.
The transcription comprises four steps. First, onset candidates are detected with a
high frequency content onset detector (HFC). The window size is 11.6 ms and the hop
size is 2.9 ms. Thus, this initial onset detector has the advantage that it can detect
onsets quickly, with a delay of approximately 11 ms from the original onset.
Second, for every event detected by the HFC onset detector, a set of spectral
features are extracted for BD, SD and HH.
In the third step, three additional sub-band onset detectors with higher time
resolution are used to filter onsets for each of the BD, SD or HH classes. The audio is
filtered with a low-pass filter with a cut-off frequency of 90 Hz for BD, a band-pass
filter with a central frequency of 280Hz and a bandwidth of 20 Hz for SD, and a high-
pass filter with a cut-off frequency of 9000 Hz for HH. Three complex onset detection
functions detect onsets in these frequency bands. Because the window size is 23.2 ms and the hop size is 5.8 ms, these additional onsets always arrive after the
initial HFC onset. An HFC onset must be matched with one of the sub-band onsets, in
order to be fed to the classifier.
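For illustration only, the three bands could be isolated with standard filters as sketched below, here with SciPy Butterworth designs using the cut-off values quoted above. The sketch covers only the band splitting; the complex-domain onset detection functions that the device runs on each band are not reproduced.

import numpy as np
from scipy.signal import butter, sosfilt

def drum_bands(audio, sr=44100):
    """Return the BD, SD and HH band signals of a mono audio array."""
    bd_sos = butter(4, 90, btype="lowpass", fs=sr, output="sos")            # kick
    sd_sos = butter(4, [270, 290], btype="bandpass", fs=sr, output="sos")   # snare
    hh_sos = butter(4, 9000, btype="highpass", fs=sr, output="sos")         # hi-hat
    return (sosfilt(bd_sos, audio),
            sosfilt(sd_sos, audio),
            sosfilt(hh_sos, audio))

if __name__ == "__main__":
    bd, sd, hh = drum_bands(np.random.randn(44100))   # one second of noise as a stand-in
    print(len(bd), len(sd), len(hh))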
In the fourth step, the filtered events and the corresponding feature vectors are fed
to three K-nearest neighbor classifiers, which label each event as a member or a non-
member of the three classes, BD, SD and HH.
Several factors make the drum transcription task challenging. First, the system
must transcribe events as fast as possible. The IF stage optimizes the performance
with respect to real-time constraints by combining a fast onset detector with sub-band onset filtering. Furthermore, we are dealing with a mixture of sounds which
might overlap and not occur exactly at the same time, causing the system to over-
detect events. The IF stage described in [14] is useful in eliminating instances which
can be classified incorrectly. The evaluation showed that using the IF stage decreased
the number of incorrectly labeled events and increased the performance.
5.2 MIDI Device: Syncopation Transformations
Our MIDI device manipulates effectively and in real time the syncopation in a binary
pattern. It systematically removes the syncopation and creates new syncopations in
the pattern according to the transformations described in section 3.
The user interface of the device is shown in Fig. 8. There are two main control
areas on the device: 1) the step sequencer interface where the user inputs a pattern, and 2) an area for the manipulation of the syncopation consisting mainly of a
slider controlling the syncopation and a dial controlling the dynamic accents. In the
following paragraphs we describe in more detail each of these controls.
Step Sequencer Interface. The user can input a pattern directly on the user interface or
“record” the last pattern received through the MIDI input or from the drum transcription
device. The length of the pattern is always a single bar. The device constructs
automatically a metrical template that corresponds to the time signature and tempo
settings of the Ableton Live Set. The template defines the quantization grid, thus the
number of pulses or steps in the pattern. After a pattern is recorded from the received
onset stream, it is treated in exactly the same way as if it was input manually by the
user. The user can edit the input pattern at any time and in real time. The transformations
are applied automatically and playback continues without interruption.
The transformed pattern is shown directly below the input pattern. The transformed
pattern is the one that is being played back and output as MIDI notes. The user cannot
directly interact with this pattern since it is controlled by the syncopation controls
described in the following paragraph.
Fig. 8: The MIDI Max4Live device that manipulates the syncopation of a binary rhythmic
pattern. Dashed rectangles and arrows indicate the main controls.
Syncopation Controls. The user can control the syncopation transformations through
a stepped slider and a dial found on the user interface (Fig. 8). The slider controls the
syncopation transformations and the dial controls the amplitude of the dynamic
accents.
The syncopation slider ranges between the de-syncopated version (top position),
and the fully re-syncopated version (bottom position). In between those positions,
there exist versions of the rhythmic pattern in which only a certain number of the
available shifts is applied. Each step of the slider corresponds to a single shift of an
onset (sections 3.2 and 3.3). In-between the top (de-syncopated) and bottom (fully re-
syncopated) positions, the original input pattern is found. Its position varies according
to how many steps are required to de-syncopate the input pattern (moving towards the
top) or to fully re-syncopate it (moving towards the bottom). For the onsets that
cannot be shifted, their dynamic accents are shifted as described in section 3.4. When
one reaches the bottom position, all onsets and accents have been shifted to their
syncopating positions. The transformations are performed based on an automatically
constructed metrical template.
The order in which the onsets or accents are shifted is random. However, the order is constrained so that the transformations pass through the original input pattern.
The user can totally randomize the order of the transformations through the button
above the syncopation slider. In that case, the input pattern will not appear on the
slider; the available versions of the rhythmic pattern will be generated by gradually
applying the re-syncopation transformations on the de-syncopated version in a totally
random order, until a fully re-syncopated version is reached.
A secondary control, a number box found on the top left corner of the device,
controls the syncopation style. It controls the stylistic rule described in section 3.3 and
corresponds directly to the difference of metrical levels that the onsets are allowed to
be shifted to.
The dial found on the bottom left corner of the device controls the dynamic accents
applied. When the dial is turned to the left, no accents are applied and all onsets have
maximum amplitude. When the dial is turned fully right, the amplitude of the onsets
is relative to the metrical strength of their positions.
6 Conclusions / Future Work
In this paper we presented a set of formalized transformations that can manipulate
the syncopation in binary patterns in an automatic way. We also presented an
algorithm for applying dynamic accents in order to enhance the metrical and
syncopation feels. The transformations can effectively remove and create
syncopations by shifting the onsets according to a metrical template constructed for
each time signature and tempo.
The transformations were evaluated using the Longuet-Higgins and Lee
syncopation metric [4, 5]. We showed that the transformations can be used to
gradually remove and create new syncopations.
We used the above transformations in a Max4Live MIDI device that manipulates
in real time the syncopation of rhythmic performances, through a simple user
interface. The device is accompanied by a Max4Live audio device that can transcribe
in real time a drum performance into streams of drum onsets. The two devices can be
used by musicians together or independently, in order to explore the uses of
syncopations during music performances or in offline compositions.
We believe that syncopation styles can be modeled. To that end, we propose to use the
syncopation transformations to “learn” syncopation styles and reapply them to a rhythmic
pattern. The de-syncopation process can associate particular syncopation shifts to specific
metrical positions. For example, the de-syncopation of a collection of patterns in a certain
music style can build probabilistic rules about how onsets are shifted in that style.
Applying the “learned” syncopation style on a pattern would then consist in shifting onsets
according to the rules. The applications of the syncopation transformations to music and
syncopation styles are left to be explored in the future work of our group.
The Max4Live devices and externals are available for download at our website:
http://smc.inescporto.pt/shakeit/
Acknowledgments. This work is financed by the ERDF (European Regional Development Fund) through the COMPETE Program (operational program for competitiveness) and by National Funds through the FCT (Fundação para a Ciência e a Tecnologia) within project «PTDC/EAT-MMU/112255/2009-(FCOM-01-0124-FEDER-014732)».
7 References
1. Gómez, F., Thul, E., Toussaint, G.T.: An experimental comparison of formal measures of rhythmic syncopation. Proceedings of the International Computer Music Conference, pp. 101–104 (2007).
2. Huron, D.: Sweet Anticipation: Music and the Psychology of Expectation. The MIT Press (2006).
3. Randel, D.M.: The Harvard Dictionary of Music. Belknap Press of Harvard University Press, Cambridge, MA (1986).
4. Longuet-Higgins, H.C., Lee, C.S.: The rhythmic interpretation of monophonic music. Music Perception. 1, 424–441 (1984).
5. Fitch, W.T., Rosenfeld, A.J.: Perception and Production of Syncopated Rhythms. Music Perception. 25, 43–58 (2007).
6. Huron, D., Ommen, A.: An Empirical Study of Syncopation in American Popular Music, 1890–1939. Music Theory Spectrum. 28, 211–231 (2006).
7. Sioros, G., Guedes, C.: A formal approach for high-level automatic rhythm generation. Proceedings of the BRIDGES 2011 Mathematics, Music, Art, Architecture, Culture Conference, Coimbra, Portugal (2011).
8. Temperley, D.: Syncopation in rock: a perceptual perspective. Popular Music. 18, 19–40 (1999).
9. Lerdahl, F., Jackendoff, R.: A Generative Theory of Tonal Music. The MIT Press, Cambridge (1983).
10. Repp, B.H.: Rate Limits of Sensorimotor Synchronization. Advances in Cognitive Psychology. 2, 163–181 (2006).
11. Parncutt, R.: A perceptual model of pulse salience and metrical accent in musical rhythms. Music Perception. 11, 409–464 (1994).
12. London, J.: Hearing in Time. Oxford University Press (2012).
13. Gómez, F., Melvin, A., Rappaport, D., Toussaint, G.T.: Mathematical measures of syncopation. Proc. BRIDGES: Mathematical Connections in Art, Music and Science, pp. 73–84. Citeseer (2005).
14. Miron, M., Davies, M.E.P., Gouyon, F.: An open-source drum transcription system for Pure Data and Max/MSP. The 38th International Conference on Acoustics, Speech, and Signal Processing, Vancouver, Canada (2013).
... Our system assists DJs and music producers to explore the creative potential of syncopation. This work is based on a previous formalized algorithm, and corresponding application Syncopalooza [12]. Syncopalooza manipulates the syncopation in symbolic data, i.e. ...
... In the analysis stage we use onset detection to identify the start times of musical events in the audio loop and hence to extract the rhythmic structure in the context of a known time signature and tempo. Next, the transformation step applies the symbolic approach to syncopation manipulation of Sioros [12] to determine how the rhythmic structure of the audio loop can be modified. Finally, reconstruction consists of implementing this transformation in the audio domain. ...
... Recently, Sioros et al. proposed an application for the manipulation of syncopation in music performances, named Syncopalooza. The core of Syncopalooza uses an algorithm that can remove or generate syncopation in rhythmic patterns by displacing the start times of musical events (onsets) with respect to a metrical template [12]. After the metrical template is created according to the meter and tempo, the onsets are aligned to the grid by snapping their time positions to the closest pulse boundary present in the metrical grid. ...
... Our system assists DJs and music producers to explore the creative potential of syncopation. This work is based on a previous formalized algorithm, and corresponding application Syncopalooza [12]. Syncopalooza manipulates the syncopation in symbolic data, i.e. ...
... In the analysis stage we use onset detection to identify the start times of musical events in the audio loop and hence to extract the rhythmic structure in the context of a known time signature and tempo. Next, the transformation step applies the symbolic approach to syncopation manipulation of Sioros [12] to determine how the rhythmic structure of the audio loop can be modified. Finally, reconstruction consists of implementing this transformation in the audio domain. ...
... Recently, Sioros et al. proposed an application for the manipulation of syncopation in music performances, named Syncopalooza. The core of Syncopalooza uses an algorithm that can remove or generate syncopation in rhythmic patterns by displacing the start times of musical events (onsets) with respect to a metrical template [12]. After the metrical template is created according to the meter and tempo, the onsets are aligned to the grid by snapping their time positions to the closest pulse boundary present in the metrical grid. ...
Conference Paper
Full-text available
In this work we present a system that estimates and ma-nipulates rhythmic structures from audio loops in real-time to perform syncopation transformations. The core of our system is a technique for the manipulation of synco-pation in symbolic representations of rhythm. In order to apply this technique to audio signals we must first seg-ment the audio loop into musical events using onset de-tection. Then, we use the symbolic syncopation transfor-mation method to determine how to modify the rhythmic structure in order to change the syncopation. Finally we present two alternative methods to reconstruct the audio loop, one based on time scaling and the other on resampling. Our system, Loopalooza, is implemented as a freely available MaxForLive device to allow musicians and DJs to manipulate syncopation in audio loops in real-time.
... Recently, George Sioros and his colleagues have developed a series of algorithms and software for the real-time manipulation of syncopation in a given pattern [8]. While these rhythmic transformations are based on a deterministic model of musical meter [9] rhythmic transformations may also be approached as a stochastic process [10]. ...
Conference Paper
Full-text available
This paper presents an algorithm and software that implements it for the gradual transformation of musical rhythms through graphical means, as well as the artistic installation Waiting for Response where it was first used. The transformation is based on the manipulation of the time-line of the input rhythm, which is treated as geometric form in constant transformation. The aim of the algorithm is to explore rhythmic relations in an evolutionary manner by generating transformations based on graphical and geometric concepts and independently of the musical character of the initial rhythmical pattern. It provides, relates and generates a genealogy of rhythms based on an initial rhythm, which may be perceptually unrelated. Waiting for Response is an artistic installation that employs the above transformation in the generation of sonic events to enter in an acoustical "dialogue" with the materiality of the exhibition space.
... The events' duration can be decomposed into several layers, starting with the tempo, which depends on the tactus duration and the stratification of the remaining metrical layers (Sioros et al., 2013). Furthermore, inter-onset-intervals (IOIs), i.e., the intervals between the onsets times of sequential note events (Toussaint, 2019), provide a good indication of the structural characteristics of a rhythm, given by the temporal distribution of the events' onsets, while discarding rests (i.e., silences) and performance traits such as legato or staccato. ...
Chapter
In this paper, we review computational methods for the representation and similarity computation of musical rhythms in both symbolic and sub-symbolic (e.g., audio) domains. Both tasks are fundamental to multiple application scenarios from indexing, browsing, and retrieving music, namely navigating musical archives at scale. Stemming from the literature review, we identified three main rhythmic representations: string (sequence of alpha-numeric symbols to denote the temporal organization of events), geometric (spatio-temporal pictorial representation of events), and feature lists (transformation of audio into a temporal series of features or descriptors), and twofold categories of feature- and transformation-based distance metrics for similarity computation. Furthermore, we address the gap between explicit (symbolic) and implicit (musical audio) rhythmic representations stressing that a greater interaction across modalities would promote a holistic view of the temporal music phenomena. We conclude the article by unveiling avenues for future work on (1) hierarchical, (2) multi-attribute and (3) rhythmic layering models grounded in methodologies across disciplines, such as perception, cognition, mathematics, signal processing, and music.
... The events' duration can be decomposed into several layers, starting with the tempo, which depends on the tactus duration and the stratification of the remaining metrical layers [53]. Furthermore, inter-onset-intervals (IOIs), i.e., the intervals between the onsets times of sequential note events [58], provide a good indication of the structural characteristics of a rhythm, given by the temporal distribution of the events' onsets, while discarding rests (i.e., silences) and performance traits such as legato or staccato. ...
Preprint
Full-text available
In this paper, we review computational methods for the representation and similarity computation of musical rhythms in both symbolic and sub-symbolic (e.g., audio) domains. Both tasks are fundamental to multiple application scenarios from indexing, browsing, and retrieving music, namely navigating musical archives at scale. Stemming from the literature review, we identified three main rhythmic representations: string, geometric, and feature lists, and twofold categories of feature- and transformation-based distance metrics for similarity computation. Furthermore, we address the gap between explicit (symbolic) and implicit (musical audio) rhythmic representations stressing that a greater interaction across modalities would promote a holistic view of the temporal music phenomena. We conclude the article by unveiling avenues for future work on 1) hierarchical, 2) multi-attribute and 3) rhythmic layering models grounded in methodologies across disciplines, such as perception, cognition, mathematics, signal processing, and music.
... Recently we presented an algorithm that allows for the manipulation of syncopation, i.e. a computer algorithm that is able to remove or introduce syncopation in a certain rhythmic pattern in a controlled way [20]. Here, we extend the algorithm to a set of formalized generic transformations that can analyze, generate and manipulate the syncopation in binary patterns. ...
Conference Paper
Full-text available
Syncopation is a rhythmic phenomenon present in various musical styles and cultures. We present here a set of simple rhythmic transformations that can serve as a formalized model for syncopation. The transformations are based on fundamental features of the musical meter and syncopation, as seen from a cognitive and a musical perspective. Based on this model, rhythmic patterns can be organized in tree structures where patterns are interconnected through simple transformations. A Max4Live device is presented as a creative application of the model. It manipulates the syncopation of midi “clips” by automatically de-syncopating and syncopating the midi notes.
... As we want to eliminate all other expressive factors of a musical performance, we developed a computer algorithm that generates syncopation in monophonic sound sequences relying on changing metrical positions only, without altering other expressive or structural characteristics. The algorithm is an adaptation of the general syncopation transformations developed by Sioros et al. (2013). While in the recent Witek study (2014) syncopation was measured in pre-existing drum loops, in our study we can generate and vary it in piano melodies using an automatic algorithm over which we have complete control. ...
Article
In order to better understand the musical properties which elicit an increased sensation of wanting to move when listening to music (groove), we investigate the effect of adding syncopation to simple piano melodies, under the hypothesis that syncopation is correlated with groove. Across two experiments we examine listeners' experience of groove in response to synthesized musical stimuli covering a range of syncopation levels and densities of musical events, according to formal rules implemented by a computer algorithm that shifts musical events from strong to weak metrical positions. Results indicate that moderate levels of syncopation lead to significantly higher groove ratings than melodies without any syncopation or with the maximum possible syncopation. A comparison between the various transformations and the way they were rated shows that there is no simple relation between syncopation magnitude and groove.
... There is a need for machine musicianship (Rowe 2001) to assist in the construction of musically interesting and stylistically appropriate performances. To cite just a few representative studies, drumming (Sioros et al. 2013), chord voicings (Hirata 1996), bass lines (Dias and Guedes 2013), and vocal technique (Nakano and Goto 2009) have all been explored and automated to some extent. Even more difficult is the problem of adjusting styles according to other musicians. ...
Article
Computers are often used in the performance of popular music, but most often in very restricted ways, such as keyboard synthesizers where musicians are in complete control, or pre-recorded or sequenced music where musicians follow the computer's drums or click track. An interesting and yet little-explored possibility is the computer as a highly autonomous performer of popular music, capable of joining a mixed ensemble of computers and humans. Considering the skills and functional requirements of musicians leads to a number of predictions about future human–computer music performance (HCMP) systems for popular music. We describe a general architecture for such systems, some early implementations, and our experience with them.
Conference Paper
This paper examines the computational problem of taking a classical music composition and algorithmically recomposing it in a ragtime style. Because ragtime music is distinguished from other musical genres by its distinctive syncopated rhythms, our work is based on extracting the frequencies of rhythmic patterns from a large collection of ragtime compositions. We use these frequencies in two different algorithms that alter the melodic content of classical music compositions to fit the ragtime rhythmic patterns, and then combine the modified melodies with traditional ragtime bass parts, producing new compositions which melodically and harmonically resemble the original music. We evaluate these algorithms by examining the quality of the ragtime music produced for eight excerpts of classical music alongside the output of a third algorithm run on the same excerpts; results are derived from a survey of 163 people who rated the quality of the ragtime output of the three algorithms.
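The frequency-extraction step can be pictured with a small counting sketch; the corpus of bar-long binary patterns below is invented and bears no relation to the ragtime collection used in the cited study.

```python
from collections import Counter

# Hypothetical corpus: each bar encoded as a string of sixteenth-note
# onsets ("x") and rests ("."); purely illustrative, not the ragtime corpus.
corpus = [
    "x..x..x.x...x...",
    "x..x..x.x...x...",
    "x...x..x..x.x...",
    "x.x.x.x.x.x.x.x.",
]

# Relative frequency of each bar-level rhythmic pattern in the corpus.
counts = Counter(corpus)
total = sum(counts.values())
for pattern, count in counts.most_common():
    print(f"{pattern}  {count / total:.2f}")
```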
Conference Paper
Rhythmic syncopation is one of the most fundamental features that can be used to characterize music. Therefore it can be applied in a variety of domains such as music information retrieval and style analysis. During the past twenty years a score of different formal measures of rhythmic syncopation have been proposed in the music literature. Here we compare eight of these measures with each other and with human judgements of rhythmic complexity. A data set of 35 rhythms ranked by human subjects was sorted using the eight syncopation measures. A Spearman rank correlation analysis of the rankings was carried out, and phylogenetic trees were calculated to visualize the resulting matrix of coefficients. The main finding is that the measures based on perception principles agree well with human judgements and very well with each other. The results also yield several surprises and open problems for further research.
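For readers unfamiliar with the procedure, the following sketch rank-correlates two hypothetical rankings of the same rhythms with SciPy's spearmanr; the rankings are made up and do not reproduce the study's data.

```python
from scipy.stats import spearmanr

# Two hypothetical rankings of the same six rhythms (1 = least complex),
# e.g. one from a syncopation measure and one from human judgements.
measure_rank = [1, 2, 3, 4, 5, 6]
human_rank   = [2, 1, 3, 5, 4, 6]

rho, p_value = spearmanr(measure_rank, human_rank)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")
```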
Conference Paper
This paper presents a drum transcription algorithm adjusted to the constraints of real-time audio. We introduce an instance filtering (IF) method using sub-band onset detection, which improves the performance of a system having at its core a feature-based K-nearest neighbor (KNN) classifier. The proposed architecture allows different parts of the algorithm to be adapted for either bass drum, snare drum or hi-hat cymbals. The open-source system is implemented in the graphical programming languages Pure Data (PD) and Max MSP, and aims to work with a large variety of drum sets. We evaluated its performance on a database of audio samples generated from a well-known collection of MIDI drum loops randomly matched with a diverse collection of drum sets. Both of the evaluation stages, testing and validation, show an improvement in performance when using the instance filtering algorithm.
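As a toy illustration of the feature-based KNN step (not the actual Pure Data / Max MSP implementation), the sketch below classifies detected onsets as kick, snare or hi-hat from two invented spectral features using scikit-learn.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Toy sketch of the feature-based KNN step: classify detected onsets as
# kick, snare, or hi-hat from two spectral features. The features and
# training values are invented; the cited system runs in Pure Data /
# Max MSP, not scikit-learn.
X_train = np.array([
    [0.10, 55.0],    # [spectral flatness, spectral centroid (arbitrary units)]
    [0.15, 60.0],
    [0.40, 180.0],
    [0.45, 190.0],
    [0.80, 800.0],
    [0.85, 820.0],
])
y_train = ["kick", "kick", "snare", "snare", "hihat", "hihat"]

clf = KNeighborsClassifier(n_neighbors=3).fit(X_train, y_train)
print(clf.predict([[0.42, 185.0]]))  # -> ['snare']
```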
Article
A study of syncopation in American popular music was carried out based on analyses of sound recordings spanning the period 1890 to 1939. Sample measures were randomly selected and the presence and rhythmic character of syncopations were tabulated. Several trend-related hypotheses were tested. While some changes in the patterns of syncopation were evident over the 50-year period of the study, the principal change was an increase in the quantity of syncopation rather than an increase in the variety of syncopated patterns.
Article
The assignment of a rhythmic interpretation to a piece of metrical music calls for the postulation of an underlying meter and the parsing of the note values according to this meter. In this article we develop the implications of this view, which include the following propositions. 1. Any given sequence of note values is in principle rhythmically ambiguous, although this ambiguity is seldom apparent to the listener. 2. In choosing a rhythmic interpretation for a given note sequence the listener seems to be guided by a strong assumption: if the sequence can be interpreted as the realization of an unsyncopated passage, then that is how he will interpret it. 3. Phrasing can make an important difference to the rhythmic interpretation that the listener assigns to a given sequence. Phrasing can therefore serve a structural function as well as a purely ornamental one.
Article
This edition has been replaced by a newer 2003 edition. This classic reference work is simply the best one-volume music dictionary available today. Its nearly 6,000 entries, written by more than 70 top musicologists, are consistently lucid and based on recent scholarship. "The New Harvard Dictionary of Music" contains among its riches superb articles on music of the 20th century, including jazz, rock, and mixed media as well as twelve-tone, serial, and aleatory music; comprehensive articles on the music of Africa, Asia, Latin America, and the Near East; entries on all the styles and forms in Western art music; and descriptions of instruments enriched by historical background. Short entries for quick reference (definitions and identifications) alternate with encyclopedia-length articles written by experts in each field. More than 220 drawings and 250 musical examples enhance the text. Combining authoritative scholarship with concise, lively prose, "The New Harvard Dictionary of Music" is the essential guide for musicians, students, and everyone who listens to music.
Article
The cognitive strategies by which humans process complex, metrically ambiguous rhythmic patterns remain poorly understood. We investigated listeners' abilities to perceive, process and produce complex, syncopated rhythmic patterns played against a regular sequence of pulses. Rhythmic complexity was varied along a continuum; complexity was quantified using an objective metric of syncopation suggested by Longuet-Higgins and Lee. We used a recognition memory task to assess the immediate and longer-term perceptual salience and memorability of rhythmic patterns. The tasks required subjects to (a) tap in time to the rhythms, (b) reproduce these same rhythm patterns given a steady pulse, and (c) recognize these patterns when replayed both immediately after the other tasks and after a 24-hour delay. Subjects tended to reset the phase of their internally generated pulse with highly complex, syncopated rhythms, often pursuing a strategy of reinterpreting or "re-hearing" the rhythm as less syncopated. Thus, greater complexity in rhythmic stimuli leads to a reorganization of the cognitive representation of the temporal structure of events. Less complex rhythms were also more robustly encoded into long-term memory than more complex, syncopated rhythms in the delayed memory task.
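A compact sketch of the Longuet-Higgins and Lee measure as it is commonly summarized is given below: an onset on a weak metrical position followed by silence on a stronger one contributes the difference of their metrical weights. The weight template and the treatment of each rest are simplifying assumptions, not the original formulation.

```python
# Compact sketch in the spirit of the Longuet-Higgins & Lee syncopation
# measure as commonly summarized (one weight per metrical level): a sounded
# onset on a weak position followed by silence on a stronger position
# contributes the weight difference. Weights and details are simplified
# assumptions, not the original formulation.
WEIGHTS = [0, -4, -3, -4, -2, -4, -3, -4, -1, -4, -3, -4, -2, -4, -3, -4]

def lhl_syncopation(pattern):
    score = 0
    n = len(pattern)
    for j in range(n):
        if pattern[j]:
            continue                      # only rests can be syncopated against
        # Most recent sounded onset before this rest (wrapping around the bar).
        for d in range(1, n):
            i = (j - d) % n
            if pattern[i]:
                if WEIGHTS[j] > WEIGHTS[i]:
                    score += WEIGHTS[j] - WEIGHTS[i]
                break
    return score

print(lhl_syncopation([1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0]))  # 0
print(lhl_syncopation([0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0]))  # > 0
```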
Article
In Experiment 1, six cyclically repeating interonset interval patterns (1,2:1,2:1:1,3:2:1,3:1:2, and 2:1:1:2) were each presented at six different note rates (very slow to very fast). Each trial began at a random point in the rhythmic cycle. Listeners were asked to tap along with the underlying beat or pulse. The number of times a given pulse (period, phase) was selected was taken as a measure of its perceptual salience. Responses gravitated toward a moderate pulse period of about 700 ms. At faster tempi, taps coincided more often with events followed by longer interonset intervals. In Experiment 2, listeners heard the same set of rhythmic patterns, plus a single sound in a different timbre, and were asked whether the extra sound fell on or off the beat. The position of the downbeat was found to be quite ambiguous. A quantitative model was developed from the following assumptions. The phenomenal accent of an event depends on the interonset interval that follows it, saturating for interonset intervals greater than about 1 s. The salience of a pulse sensation depends on the number of events matching a hypothetical isochronous template, and on the period of the template—pulse sensations are most salient in the vicinity of roughly 100 events per minute (moderate tempo). The metrical accent of an event depends on the saliences of pulse sensations including that event. Calculated pulse saliences and metrical accents according to the model agree well with experimental results (r > 0.85). The model may be extended to cover perceived meter, perceptible subdivisions of a beat, categorical perception, expressive timing, temporal precision and discrimination, and primacy/recency effects. The sensation of pulse may be the essential factor distinguishing musical rhythm from nonrhythm.
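The two ingredients named in this summary, a phenomenal accent that saturates with the following inter-onset interval and a preference for moderate pulse periods, can be sketched as follows. The functional forms and constants are invented for illustration and are not the quantitative model described in the article.

```python
import math

# Loose sketch of the two ingredients named in the abstract; all constants
# and functional forms below are illustrative assumptions.

def phenomenal_accent(ioi_seconds):
    """Accent of an event given the IOI that follows it (saturates near 1 s)."""
    return 1.0 - math.exp(-ioi_seconds / 0.5)

def moderate_tempo_preference(period_seconds, preferred=0.7, spread=0.6):
    """Bell-shaped preference for pulse periods near a moderate tempo."""
    return math.exp(-((math.log(period_seconds / preferred)) ** 2) / (2 * spread ** 2))

def pulse_salience(onsets, period, phase, tolerance=0.03):
    """Fraction of pulse ticks matched by an onset, weighted by tempo preference."""
    ticks = [phase + k * period for k in range(int(max(onsets) // period) + 1)]
    matched = sum(any(abs(t - o) < tolerance for o in onsets) for t in ticks)
    return (matched / len(ticks)) * moderate_tempo_preference(period)

onsets = [0.0, 0.7, 1.4, 2.1, 2.45, 2.8]      # hypothetical onset times (s)
print(round(pulse_salience(onsets, period=0.7, phase=0.0), 2))
print(round(pulse_salience(onsets, period=0.35, phase=0.0), 2))
```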
Article
While study of the social and cultural aspects of popular music has been flourishing for some time, it is only in the last few years that serious efforts have been made to analyse the music itself: what Allan Moore has called ‘the primary text’ (1993, p. 1). These efforts include general studies of styles and genres (Moore, 1993; Bowman, 1995); studies of specific aspects of popular styles such as harmony and improvisation (Winkler 1978; Moore 1992, 1995; Walser 1992), as well as more intensive analyses of individual songs (Tagg 1982; Hawkins 1992). In this paper I will investigate syncopation, a phenomenon of great importance in many genres of popular music and particularly in rock.