The Cortical Topography of Tonal Structures Underlying Western Music

Petr Janata,1,2* Jeffrey L. Birk,1 John D. Van Horn,2,3 Marc Leman,4 Barbara Tillmann,1,2 Jamshed J. Bharucha1,2

1Department of Psychological and Brain Sciences, 2Center for Cognitive Neuroscience, 3Dartmouth Brain Imaging Center, Dartmouth College, Hanover, NH 03755, USA. 4Institute for Psychoacoustics and Electronic Music, Ghent University, Ghent, Belgium.

*To whom correspondence should be addressed. E-mail: petr.janata@dartmouth.edu
Western tonal music relies on a formal geometric structure that determines
distance relationships within a harmonic or tonal space. In functional magnetic
resonance imaging experiments, we identified an area in the rostromedial
prefrontal cortex that tracks activation in tonal space. Different voxels in this
area exhibited selectivity for different keys. Within the same set of consistently
activated voxels, the topography of tonality selectivity rearranged itself across
scanning sessions. The tonality structure was thus maintained as a dynamic
topography in cortical areas known to be at a nexus of cognitive, affective, and
mnemonic processing.
The use of tonal music as a stimulus for probing the cognitive machinery of the human brain has an allure that derives, in part, from the geometric properties of the theoretical and cognitive structures involved in specifying the distance relationships among individual pitches, pitch classes (chroma), pitch combinations (chords), and keys (1–3). These distance relationships shape our perceptions of music and allow us, for example, to notice when a pianist strikes a wrong note. One geometric property of Western tonal music is that the distances among major and minor keys can be represented as a tonality surface that projects onto the doughnut shape of a torus (1, 4). A piece of music elicits activity on the tonality surface, and harmonic motion can be conceptualized as displacements of the activation focus on the tonality surface (3). The distances on the surface also help govern expectations that actively arise while one listens to music. Patterns of expectation elicitation and fulfillment may underlie our affective responses to music (5).
Two lines of evidence indicate that the tonality surface is represented in the human brain. First, when one subjectively rates how well each of 12 probe tones, drawn from the chromatic scale (6), fits into a preceding tonal context that is established by a single chord, chord progression, or melody, the rating depends on the relationship of each tone to the instantiated tonal context. Nondiatonic tones that do not occur in the key are rated as fitting poorly, whereas tones that form part of the tonic triad (the defining chord of the key) are judged as fitting best (2). Probe-tone profiles obtained in this manner for each key can then be correlated with the probe-tone profile of every other key to obtain a matrix of distances among the 24 major and minor keys. The distance relationships among the keys readily map onto the surface of the torus (4). Thus, there is a direct correspondence between music-theoretic and cognitive descriptions of the harmonic organization of tonal music (7).
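The construction of this key-distance matrix is simple enough to sketch in code. The sketch below uses the published Krumhansl-Kessler probe-tone profiles (2, 4); the 24 per-key profiles are circular rotations of the major and minor reference profiles, and their pairwise correlations yield the inter-key distance matrix that maps onto the torus (the torus embedding itself, obtained in (4) by multidimensional scaling, is not reproduced here).

```python
import numpy as np

# Krumhansl-Kessler probe-tone profiles for C major and C minor: average
# goodness-of-fit ratings for the 12 chromatic tones (refs. 2, 4).
MAJOR = np.array([6.35, 2.23, 3.48, 2.33, 4.38, 4.09,
                  2.52, 5.19, 2.39, 3.66, 2.29, 2.88])
MINOR = np.array([6.33, 2.68, 3.52, 5.38, 2.60, 3.53,
                  2.54, 4.75, 3.98, 2.69, 3.34, 3.17])

# Profiles for all 24 keys are circular rotations of the reference profiles.
profiles = np.array([np.roll(MAJOR, k) for k in range(12)] +
                    [np.roll(MINOR, k) for k in range(12)])

# 24 x 24 inter-key correlation matrix: high correlation = related keys.
key_corr = np.corrcoef(profiles)

notes = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]
names = [n + " major" for n in notes] + [n.lower() + " minor" for n in notes]

# C major correlates strongly with its dominant (G major) and relative
# minor (a minor), and negatively with the distant F# major.
i = names.index("C major")
for other in ("G major", "a minor", "F# major"):
    print(f"r(C major, {other}) = {key_corr[i, names.index(other)]:+.2f}")
```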
Second, electroencephalographic studies of musical expectancy (8–11) have examined the effect of melodic and harmonic context violations on one or more components of event-related brain responses that index the presence and magnitude of context violations. Overall, the cognitive distance of the probe event from the established harmonic context correlates positively with the amplitudes of such components. These effects appear even in listeners without any musical training (9, 11). The perceptual and cognitive structures that facilitate listening to music may thus be learned implicitly (2, 12–15).
The prefrontal cortex has been implicated in the manipulation and evaluation of tonal information (10, 11, 16–18). However, the regions that track motion on the tonality surface have not been identified directly. When presented with a stimulus that systematically moves across the entire tonality surface, will some populations of neurons respond selectively to one region of the surface and other populations respond selectively to another region of the surface?
Identification of tonality-tracking brain areas. In order to identify cortical sites that were consistently sensitive to activation changes on the tonality surface, eight musically experienced listeners (see "subjects" in supporting online text) underwent three scanning sessions each, separated by 1 week on average, in which they performed two perceptual tasks during separate runs. During each run, they heard a melody that systematically modulated through all 12 major and 12 minor keys (see "stimuli and tasks" in supporting online text) (Fig. 1 and audio S1). A timbre deviance detection task required listeners to respond whenever they heard a note played by a flute instead of the standard clarinet timbre, whereas a tonality violation detection task required listeners to respond whenever they perceived notes that violated the local tonality (Fig. 1D). The use of two tasks that required attentive listening to the same melody but different perceptual analyses facilitated our primary goal of identifying cortical areas that exhibit tonality tracking that is largely independent of the specific task that is being performed (see "scanning procedures" in supporting online text). Using a regression analysis with separate sets of regressors to distinguish task effects from tonality surface tracking, we identified task- and tonality-sensitive areas (see "fMRI analysis procedures" in supporting online text). Tonality regressors were constructed from the output of a neural network model of the moment-to-moment activation changes on the tonality surface (see "tonality surface estimation" in supporting online text).
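The core of this analysis is an ordinary multiple regression in which task and tonality regressors are fit jointly to each voxel's time series. The sketch below illustrates that partitioning with placeholder regressors; the authors' actual tonality regressors came from the neural network model described in the supporting online text, and the scan counts, regressor counts, and data here are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)
n_scans = 240  # illustrative scan count, not the study's

# Placeholder regressors (illustrative only). In the study, the task
# regressors modeled the two perceptual tasks and the tonality regressors
# came from a neural network estimate of activation on the tonality surface.
task = rng.standard_normal((n_scans, 2))
tonality = rng.standard_normal((n_scans, 4))
X = np.column_stack([task, tonality, np.ones(n_scans)])  # design matrix + intercept

y = rng.standard_normal(n_scans)  # one voxel's time series (placeholder)

# Joint ordinary-least-squares fit of both regressor sets.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Partition the fitted signal to ask, separately, how much the task
# regressors and the tonality regressors account for.
task_fit = X[:, :2] @ beta[:2]
tonality_fit = X[:, 2:6] @ beta[2:6]
print("task share of variance:    ", np.var(task_fit) / np.var(y))
print("tonality share of variance:", np.var(tonality_fit) / np.var(y))
```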
Our tasks consistently activated several regions in the temporal, parietal, frontal, and limbic lobes as well as the thalamus and cerebellum. The most extensive consistent activation was along the superior temporal gyrus (STG) of both hemispheres, though the extent was greater in the right hemisphere, stretching from the planum temporale to the rostral STG and middle temporal gyrus (Fig. 2A and Table 1). Both the task and the tonality regressors correlated significantly and consistently with activity in the rostromedial prefrontal cortex, primarily in the rostral and ventral reaches of the superior frontal gyrus (SFG) (Figs. 2 and 3). The consistent modulation of this area in all of our listeners led us to focus on this region as a possible site of a tonality map.
Tonality-specific responses in the rostromedial prefrontal cortex. At the individual level, we reconstructed and categorized the tonality sensitivity surface (TSS) for each voxel that exhibited significant responses (P < 0.001) in every one of the three scanning sessions (see "tonality surface estimation" in supporting online text). The reconstructed surfaces from each session indicated that the medial prefrontal cortex maintains a distributed topographic representation of the overall tonality surface (Fig. 3).
Fig. 1. Properties of the tonality surface and behavioral response profiles. In the key names, capital letters indicate major keys and lowercase letters indicate minor keys. (A) Unfolded tori showing the average tonality surfaces for each of the 24 keys in the original melody. The top and bottom edges of each rectangle wrap around to each other, as do the left and right edges. θ and φ refer to the angular position along each of the circles comprising the torus. The color scale is arbitrary, with red and blue indicating strongest and weakest activation, respectively. Starting with C major and shifting from left to right, the activation peak in each panel reflects the melody's progression through all of the keys. (B) The circle of fifths. Major keys are represented by the outside ring of letters. Neighboring keys have all but one of their notes in common. The inner ring depicts the (relative) minor keys that share the same key signature (number of sharps and flats) with the adjacent major key. The color code refers to the three groups of keys into which tonality-tracking voxels were categorized (Fig. 3). (C) Correlations among the average tonality surface topographies for each key. The topographies of keys that are closely related in a music-theoretic sense are also highly positively correlated, whereas those that are distantly related are negatively correlated. Three groups of related keys, indicated in (B), were identified by singular value decomposition of this correlation matrix. (D) Average response profiles (eight listeners, three sessions each) from the tonality deviance detection task illustrate the propensity of specific test tones to pop out and elicit a response in some keys but not in others over the course of the melody. Error bars reflect 1 SEM.
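The caption's claim that neighboring major keys on the circle of fifths share all but one of their notes follows directly from pitch-class arithmetic; a minimal check:

```python
# Diatonic pitch-class set of a major key rooted at `root` (0 = C, 7 = G, ...).
MAJOR_STEPS = [0, 2, 4, 5, 7, 9, 11]  # major-scale intervals in semitones

def major_scale(root):
    return {(root + step) % 12 for step in MAJOR_STEPS}

c, g = major_scale(0), major_scale(7)  # C major and its neighbor G major
print(sorted(c & g))  # the six shared pitch classes
print(c - g, g - c)   # the single difference: F (5) in C vs. F# (6) in G
```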
Fig. 2. Group conjunction maps showing the consistency with which specific structures were activated across listeners. Conjunction maps of individual listeners, containing the voxels that were activated significantly (P < 0.001) in all scanning sessions for that listener, were normalized into a common space and summed together across listeners (see "spatial normalization" in supporting online text). Voxels that were consistently activated by at least four of the eight listeners are projected onto the group's mean normalized T1 image. (A) Areas sensitive to the two task regressors (Table 1). (B) The only areas whose activity patterns were significantly and consistently correlated with the tonality regressors both within and across listeners were the rostral portion of the ventromedial superior frontal gyrus and the right orbitofrontal gyrus.
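The conjunction procedure in the caption reduces to elementwise boolean operations on per-session statistical maps. A schematic sketch, with array sizes and random p-values as stand-ins for real data:

```python
import numpy as np

rng = np.random.default_rng(1)
n_listeners, n_sessions, n_voxels = 8, 3, 1000

# Placeholder per-session p-value maps in a common (normalized) space;
# random values stand in for real statistics, so few voxels will pass here.
p_maps = rng.uniform(size=(n_listeners, n_sessions, n_voxels))

# Within-listener conjunction: significant (P < 0.001) in ALL sessions.
within = (p_maps < 0.001).all(axis=1)   # boolean, (listeners, voxels)

# Group map: how many listeners consistently activate each voxel.
group_count = within.sum(axis=0)        # (voxels,)

# The paper's display criterion: at least four of the eight listeners.
consistent = np.flatnonzero(group_count >= 4)
print(f"{consistent.size} voxels pass the group criterion")
```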
Although some voxels exhibited similar TSSs from session to session, the global tonality topography varied across sessions in each of the listeners. The number of voxels falling into each of the tonality categories (Fig. 1B) was evenly distributed within each session (table S1), but the relative pattern of tonality sensitivity changed. For all listeners, we also found tonality-sensitive voxels outside of the medial prefrontal region (table S2). The precise constellations of sensitive areas differed across listeners. We found tonality-sensitive foci in the orbital and frontal gyri, primarily in the right hemisphere; the temporal pole; the anterior and posterior superior temporal sulci; the precuneus and superior parietal gyrus; the posterior lingual gyrus; and the cerebellum (19).
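The voxel categorization used in Fig. 3 (assigning each voxel to one of the three key groups of Fig. 1B) amounts to correlating each voxel's TSS with each group-average surface and taking the best match. A sketch under assumed array shapes, with random surfaces standing in for the reconstructed ones:

```python
import numpy as np

rng = np.random.default_rng(2)
n_points = 24 * 16  # assumed size of the flattened, unfolded-torus grid

# Random stand-ins: three group-average tonality surfaces (the key groups
# of Fig. 1B) and the reconstructed TSS of each significant voxel.
group_surfaces = rng.standard_normal((3, n_points))
voxel_tss = rng.standard_normal((50, n_points))

def corr(a, b):
    """Pearson correlation of two flattened surfaces."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return float((a * b).mean())

# Assign each voxel to the key group whose surface its TSS best matches.
labels = np.array([np.argmax([corr(tss, g) for g in group_surfaces])
                   for tss in voxel_tss])
print(np.bincount(labels, minlength=3))  # voxel count per key group
```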
Discussion. Central to our ability to hear music coherently are cognitive structures that maintain perceptual distance relationships among individual pitches and groups of pitches. These structures shape expectations about pitches we will hear, given a preceding musical input. Given the diversity of the music we hear, the situations in which we hear it, and our affective and motoric responses to it, it is likely that tonal contexts are maintained in cortical regions predisposed to mediating interactions between sensory, cognitive, and affective information. The medial prefrontal cortex is a nexus for such functions (20, 21) and is therefore an ideal region for maintaining a tonality map. In the macaque, connections to the medial prefrontal cortex from unimodal sensory cortices are widespread for the auditory modality and sparse for the other sensory modalities (22). In our experiments, we observed significant task-related activity in auditory association areas and the anterior STG, primarily in the right hemisphere. Reciprocal projections between these areas and the ventral medial prefrontal cortex help explain how and why a tonality map might be maintained in the medial prefrontal cortex. This region has already been implicated in assessing the degree of musical consonance or dissonance caused by a harmonic accompaniment to a melody (23). Our results suggest that the rostromedial prefrontal cortex not only responds to the general degree of consonance but actively maintains a distributed topographic representation of the tonality surface. The perception of consonance and dissonance depends on intact auditory cortices (24, 25). However, even with bilateral auditory cortex ablations, the ability to generate expectancies based on tonal contexts remains, suggesting that the cognitive structures maintaining tonal knowledge largely reside outside of temporal lobe auditory structures (24).
Dynamic topographies. In contrast to distributed cortical representations of classes of complex visual objects that appear to be topographically invariant (26), we found that the mapping of specific keys to specific neural populations in the rostromedial prefrontal cortex is relative rather than absolute. Within a reliably recruited network, the populations of neurons that represent different regions of the tonality surface are dynamically allocated from one occasion to the next. This type of dynamic topography may be explained by the properties of tonality structures. In contrast to categories of common visual objects that differ in their spatial features, musical keys are abstract constructs that share core properties. The internal relationships among the pitches defining a key are the same in each key, thereby facilitating the transposition of musical themes from one key to another. However, the keys themselves are distributed on a torus at unique distances from one another.
Fig. 3. Topography of tonality sensitivity of rostroventral prefrontal cortex in three listeners across three scanning sessions each. Each voxel's color represents the key group with which the voxel's TSS was maximally correlated (Fig. 1B). The minority of voxels that were maximally correlated with the average tonality surface are shown in white. A TSS represents how sensitive the voxel is to each point on the torus. The TSSs of selected voxels are displayed as unfolded tori. Figure 1A serves as a legend for assigning keys to the individual TSSs. The highlighted voxels were chosen to display both the consistency and heterogeneity of the tonality surfaces across sessions. For each listener, the activity of all voxels shown was significantly correlated with the tonality regressors in all sessions. Thus, what changed between sessions was not the tonality-tracking behavior of these brain areas but rather the region of tonal space (keys) to which they were sensitive. This type of relative representation provides a mechanism by which pieces of music can be transposed from key to key, yet maintain their internal pitch relationships and tonal coherence.
A dynamic topography may also arise from the interplay of short-term and long-term memory stores of tonal information and may serve a beneficial role in coupling the moment-to-moment perception of tonal space with cognitive, affective, and motoric associations, which themselves may impose constraints on the activity patterns within rostral prefrontal regions (21, 27–29).
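The transposition property invoked here and in the Fig. 3 caption, that a theme's internal pitch relationships are invariant across keys, can be stated in a few lines of code:

```python
import numpy as np

theme = np.array([60, 62, 64, 65, 67])  # MIDI pitches: C D E F G
transposed = theme + 7                  # the same theme shifted to G major

# The successive intervals (the theme's internal pitch relationships)
# are identical in every key.
assert np.array_equal(np.diff(theme), np.diff(transposed))
print(np.diff(theme))  # [2 2 1 2] semitones, key-invariant
```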
References and Notes
1. R. N. Shepard, Psychol. Rev. 89, 305 (1982).
2. C. L. Krumhansl, Cognitive Foundations of Musical
Pitch (Oxford Univ. Press, New York, 1990).
3. F. Lerdahl, Tonal Pitch Space (Oxford Univ. Press, New
York, 2001).
4. C. L. Krumhansl, E. J. Kessler, Psychol. Rev. 89, 334
(1982).
5. L. B. Meyer, Emotion and Meaning in Music (Univ. of
Chicago Press, Chicago, 1956).
6. The chromatic scale consists of 12 equally sized intervals into which an octave is divided. On a piano, a chromatic scale starting at middle C would be played by striking adjacent keys until the note C, either one octave above or below middle C, was reached.
7. The extent to which tonality representations are maintained in long-term or short-term memory stores, or a combination of the two, is a matter of debate. Self-organizing neural network models of implicit learning accurately mimic results from a wide array of experiments that assess tonal knowledge (15), and harmonic priming experiments directly highlight the influence of learned tonal structures (13, 30). However, models of short-term sensory memory account for significant proportions of the variance in probe-tone experiments (31, 32), and probe tone ratings depend, partially, on the pitch distribution statistics of the contexts that precede probes (33).
8. P. Janata, J. Cognit. Neurosci. 7, 153 (1995).
9. M. Besson, F. Faïta, J. Exp. Psychol. Hum. Percept. Perf. 21, 1278 (1995).
10. A. D. Patel, E. Gibson, J. Ratner, M. Besson, P. J.
Holcomb, J. Cognit. Neurosci. 10, 717 (1998).
11. S. Koelsch, T. Gunter, A. D. Friederici, E. Schröger, J. Cognit. Neurosci. 12, 520 (2000).
12. J. J. Bharucha, K. Stoeckig, J. Exp. Psychol. Hum.
Percept. Perf. 12, 403 (1986).
13. H. G. Tekman, J. J. Bharucha, J. Exp. Psychol. Hum.
Percept. Perf. 24, 252 (1998).
14. R. Francès, La Perception de la Musique (Vrin, Paris, 1958).
15. B. Tillmann, J. J. Bharucha, E. Bigand, Psychol. Rev.
107, 885 (2000).
16. R. J. Zatorre, A. C. Evans, E. Meyer, A. Gjedde, Science
256, 846 (1992).
17. R. J. Zatorre, A. C. Evans, E. Meyer, J. Neurosci. 14,
1908 (1994).
18. B. Maess, S. Koelsch, T. C. Gunter, A. D. Friederici,
Nature Neurosci. 4, 540 (2001).
19. The existence of a tonal map that is distributed
within and across cortical areas rather than focused
within a small cortical area may seem paradoxical,
yet this representational form is predicted by some
models of functional brain organization (27).
20. H. Barbas, Brain Res. Bull. 52, 319 (2000).
21. D. Tranel, A. Bechara, A. R. Damasio, in The New Cognitive Neurosciences, M. S. Gazzaniga, Ed. (MIT Press, Cambridge, MA, 2000), pp. 1047–1061.
22. H. Barbas, H. Ghashghaei, S. M. Dombrowski, N. L.
Rempel-Clower, J. Comp. Neurol. 410, 343 (1999).
23. A. J. Blood, R. J. Zatorre, P. Bermudez, A. C. Evans,
Nature Neurosci. 2, 382 (1999).
24. M. J. Tramo, J. J. Bharucha, F. E. Musiek, J. Cognit.
Neurosci. 2, 195 (1990).
25. I. Peretz, A. J. Blood, V. Penhune, R. Zatorre, Brain 124,
928 (2001).
26. J. V. Haxby et al., Science 293, 2425 (2001).
27. A. R. Damasio, Cognition 33, 25 (1989).
28. J. J. Eggermont, Neurosci. Biobehav. Rev. 22, 355
(1998).
29. S. Funahashi, Neurosci. Res. 39, 147 (2001).
30. E. Bigand, B. Poulain, B. Tillmann, D. D’Adamo, J. Exp.
Psychol. Hum. Percept. Perf., in press.
31. D. Huron, R. Parncutt, Psychomusicology 12, 154
(1993).
32. M. Leman, Music Percept. 17, 481 (2000).
33. N. Oram, L. L. Cuddy, Psychol. Res. 57, 103 (1995).
34. We thank T. Laroche for assistance with data collection. Supported by NIH grant P50 NS17778-18. The data and stimuli from the experiment are available on request from the fMRI Data Center at Dartmouth College (www.fmridc.org) under accession number 2-2002-1139B.
Supporting Online Material
www.sciencemag.org/cgi/content/full/298/5601/2167/
DC1
SOM Text
Figs. S1 to S3
Tables S1 and S2
References
Audio S1
17 July 2002; accepted 27 September 2002
Table 1. Loci consistently showing a main effect of task in a majority of listeners. MTG, middle temporal gyrus; IFG, inferior frontal gyrus; SPG, superior parietal gyrus. Locations are x, y, z coordinates in mm; "listeners" is the number of listeners at the peak; "cluster" is cluster size in voxels.

| Lobe / Region (Brodmann area) | Left x, y, z (mm) | Listeners (no.) | Cluster (voxels) | Right x, y, z (mm) | Listeners (no.) | Cluster (voxels) |
|---|---|---|---|---|---|---|
| Temporal | | | | | | |
| STG (22) | | | | 64, –11, 10 | 6 | 74 |
| STG/Heschl's gyrus (41/42) | –56, –19, 9 | 7 | 74 | 52, –11, 5 | 8 | 163 |
| STG/planum temporale (22) | –68, –41, 15 | 5 | 14 | 64, –30, 15 | 6 | 163 |
| | | | | 64, –26, 5 | 6 | 163 |
| Rostromedial STG | | | | 38, 15, –35 | 5 | 36 |
| Rostroventral MTG (21) | | | | 52, 0, –35 | 6 | 36 |
| Middle MTG/superior temporal sulcus (21) | | | | 56, –15, –15 | 5 | 163 |
| Ventral MTG (21) | | | | 60, –11, –25 | 6 | 163 |
| Frontal | | | | | | |
| Rostroventromedial SFG (10/14) | 0, 49, 0 | 5 | 27 | 4, 64, 0 | 5 | 27 |
| Superior frontal sulcus/frontopolar gyrus (10) | | | | 26, 64, 30 | 5 | 3 |
| Lateral orbital gyrus (11) | | | | 49, 41, –10 | 5 | 4 |
| IFG, pars orbitalis (47) | | | | 49, 45, 4 | 5 | 3 |
| IFG, pars opercularis (44) | | | | 56, 19, 5 | 6 | 3 |
| | | | | 60, 22, 20 | 4 | 11 |
| Precentral gyrus (6) | | | | 49, 4, 55 | 5 | 10 |
| Parietal | | | | | | |
| Postcentral gyrus (1) | | | | 64, –11, 25 | 6 | 163 |
| Supramarginal gyrus (40) | | | | 64, –30, 35 | 6 | 3 |
| Precuneus (7) | 0, –45, 55 | 5 | 42 | 0, –45, 55 | 5 | 42 |
| | | | | 4, –56, 75 | 6 | 42 |
| SPG (7) | | | | 11, –56, 80 | 5 | 3 |
| | | | | 19, –49, 75 | 6 | 5 |
| SPG/transverse parietal sulcus (7) | | | | 4, –71, 60 | 6 | 22 |
| Limbic | | | | | | |
| Collateral sulcus | –30, 8, –30 | 5 | 10 | | | |
| Hippocampus/collateral sulcus | | | | 26, –11, –25 | 5 | 23 |
| Other | | | | | | |
| Cerebellum | –38, –79, –25 | 6 | 19 | 4, –82, –35 | 5 | 11 |
| | | | | 26, –86, –30 | 5 | 10 |
| | | | | 45, –64, –45 | 5 | 8 |
| Mediodorsal thalamic nucleus | 0, –11, 9 | 5 | 3 | | | |