Multisensory Integration: Frequency
Tuning of Audio-Tactile Integration
Multisensory information can be crucial, yet in many circumstances we have
little, if any, awareness of the effects of multisensory inputs on what appear to
be entirely unisensory perceptions. A recent study shows robust effects of
auditory input on tactile frequency discriminations and that this auditory
cross-sensory interference has specific tuning.
John J. Foxe
Gentlemen, have you tried shaving with
your ears plugged? Of course you
haven’t and perhaps this doesn’t
even strike you as all that difficult an
undertaking, not like being asked to
do it without a mirror. And goodness
knows what it would be like to shave
if you were asked to apply topical
anesthetic to your face beforehand.
Still, the next time you shave, take
a little extra time to consider the
sensory signals that you rely on during
this tedious job. I bring up shaving here
because it has often struck me just how
much one relies on the combination of
auditory and somatosensory inputs
during this routine chore. It is a truly
multisensory task and the interplay of
the sound of the razor passing over
unshaven areas and the feel of the
blades on the skin is an excellent
demonstration of the interplay of these
two sensory systems during a very
personal tactile roughness task. I’m not
so sure that the effect is quite as strong
for the opposite sex when shaving
more distal and less innervated
aspects of the body, but I’ll assume
there are reasonable parallels.
One of the earliest formal
demonstrations of the role of auditory
inputs on tactile sensations was also
one of the most extraordinary, and is
not unrelated to my shaving example.
Jousmäki and Hari [1], writing in
Current Biology, showed that by
artificially altering the rubbing sounds
that participants heard when asked to
rub their palms together, one could
dramatically alter the tactile sensations
that subjects reported. They used
a simple setup where they placed
a microphone next to the hands and
then played the rubbing sounds the
hands made back through a pair of
headphones. In some cases, the
sounds were unaltered and in others,
all frequencies above 2000 Hz were
either enhanced or dampened by
15 decibels. Participants were asked to
rate their tactile sensation on a scale
between relative moistness
(roughness) and dryness
(smoothness). Two effects were seen.
First, the louder the rubbing sounds
were, the smoother and drier the
rubbing experience became. So, it
became clear that auditory inputs
could affect tactile roughness
judgments. More importantly for our
purposes here, they also found that by
enhancing the high-frequency
component (above 2 kHz) of the rubbing
sound, the majority of their subjects
also experienced a significant shift in
the perceived smoothness/dryness of
the skin surface. A number of the
subjects spontaneously reported the
rather extraordinary sensation of
having a leaf of parchment paper
interposed between their rubbing
hands and so the effect has entered the
vernacular in the multisensory field as
the ‘parchment-skin illusion’.
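This manipulation is straightforward to reproduce in software. Below is a minimal sketch: the 2 kHz cutoff and 15 dB gain are the values reported above, while the frequency-domain filter and the white-noise stand-in for a rubbing sound are my own illustrative choices.

```python
import numpy as np

def shelve_high_frequencies(signal, fs, cutoff_hz=2000.0, gain_db=15.0):
    """Boost (positive gain_db) or dampen (negative gain_db) every
    frequency component above cutoff_hz, leaving the rest untouched."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    gain = 10.0 ** (gain_db / 20.0)              # convert dB to linear amplitude
    spectrum[freqs > cutoff_hz] *= gain
    return np.fft.irfft(spectrum, n=len(signal))

# One second of white noise standing in for the recorded rubbing sound
fs = 44100
rng = np.random.default_rng(0)
rubbing = rng.standard_normal(fs)
enhanced = shelve_high_frequencies(rubbing, fs, gain_db=+15.0)   # 'parchment' condition
dampened = shelve_high_frequencies(rubbing, fs, gain_db=-15.0)
```

Played back over headphones, the enhanced version corresponds to the condition that produced the parchment-like percept in the majority of subjects.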
A subsequent study by Guest et al. [2]
confirmed the role of auditory inputs as
an important adjunct to tactile
roughness perception, and hinted that
there might be a tuning function
underlying this effect, as it was the
manipulation of the high-versus-low
frequency ratios of the auditory inputs
that altered those tactile perceptions.
Participants in their study made
forced-choice discriminations regarding
the roughness of abrasive surfaces, and
their data showed that roughness
perception was modulated by the
frequency content of the auditory
feedback, with attenuation of high-frequency
inputs causing a shift in perceived roughness.
Yau et al. [3] have revisited this issue
in a series of carefully conducted
psychophysical experiments. As they
reported recently in Current Biology,
they have shown convincingly that
auditory inputs interfere with tactile
frequency-discrimination, but not with
judgments of intensity, thus ruling out
an attentional confound, and that this
interference only occurs when the
auditory inputs are at or near the same
frequency as the tactile inputs. In other
words, the frequency of vibrations at
the skin surface and of vibrations in
the hair cells of the cochlea, two
entirely separate sensory epithelia
transducing two very different forms
of energy, is a critical feature
dimension for subsequent multisensory
integration. Participants in their study performed
a two-alternative forced-choice task
where they judged which of two
sequentially presented vibro-tactile
stimulations of the index finger was
higher in frequency. Vibratory stimuli
ranged in frequency from 100 to 300 Hz
in 40 Hz increments, and the second of
the vibratory stimuli was accompanied
by an auditory ‘distractor’ which could
either be the same frequency as the
tactile stimulus or a different frequency,
ranging between 100 and 1500 Hz. The
data indicated a clear decrement in the
sensitivity of subjects to tactile
frequency differences, but only for
auditory distractors in the low
frequency range. Another intriguing
finding was that the perceived tactile
frequency appeared to be pulled towards the lowest-frequency
auditory stimulus (100 Hz).
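The task just described can be captured in a small trial-list generator. The tactile frequencies (100–300 Hz in 40 Hz increments) and the 100–1500 Hz distractor range come from the study as summarized above; the specific distractor values, repeat count, and field names below are illustrative assumptions.

```python
import itertools
import random

TACTILE_HZ = list(range(100, 301, 40))       # 100, 140, 180, 220, 260, 300 Hz
DISTRACTOR_HZ = [100, 200, 400, 800, 1500]   # spanning the reported 100-1500 Hz range

def make_trials(n_repeats=2, seed=0):
    """Randomized 2AFC trial list: two vibro-tactile frequencies are
    presented in sequence, the second accompanied by an auditory
    distractor, and the subject judges which felt higher in frequency."""
    rng = random.Random(seed)
    pairs = list(itertools.permutations(TACTILE_HZ, 2))  # ordered, unequal pairs
    trials = [
        {"first_hz": a, "second_hz": b, "distractor_hz": d,
         "correct": "second" if b > a else "first"}
        for _ in range(n_repeats)
        for (a, b) in pairs
        for d in DISTRACTOR_HZ
    ]
    rng.shuffle(trials)
    return trials

trials = make_trials()
```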
It remained possible that these
interference effects were only driven by
low-frequency auditory inputs, so in
a follow-up experiment, the authors
used a higher frequency vibro-tactile
comparator stimulus (400 Hz) and
showed that the interference effects
followed, now occurring for higher-frequency auditory distractors.
Lastly, they showed that the effects
were also present for band-pass noise
distractors centered at the frequency of
the vibro-tactile stimulus and so, the
perception of pure pitch was not
necessary for the interference effect.
The authors argued that this latter
finding pointed to a sensory rather than
decisional locus for the effect.
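A band-pass noise distractor of this kind is easy to synthesize by zeroing a white-noise spectrum outside a band around the tactile frequency. In the sketch below, the 50 Hz bandwidth, duration, and sample rate are illustrative assumptions, not values from the study.

```python
import numpy as np

def bandpass_noise(center_hz, fs=44100, dur_s=0.5, bandwidth_hz=50.0, seed=0):
    """Noise whose energy is confined to a band around center_hz,
    made by brick-wall filtering white noise in the frequency domain."""
    n = int(fs * dur_s)
    rng = np.random.default_rng(seed)
    spectrum = np.fft.rfft(rng.standard_normal(n))
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    spectrum[np.abs(freqs - center_hz) > bandwidth_hz / 2.0] = 0.0
    noise = np.fft.irfft(spectrum, n=n)
    return noise / np.max(np.abs(noise))     # normalize to unit peak

# A distractor centered on a 200 Hz vibro-tactile stimulus
distractor = bandpass_noise(center_hz=200.0)
```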
One obvious question that arises is
why this cross-sensory effect is
manifest as an impairment in tactile
frequency perception. Why did
same-frequency auditory inputs not
aid performance if they were
integrated? Aren’t performance
improvements what we have come
to expect when there is redundant
but congruent sensory input
[4–7]? Jousmäki and Hari’s [1]
parchment-skin findings may hold
the answer. In that study, the rubbing
sounds were the actual sounds of the
participants own hands, and yet tactile
roughness perception was changed
not only by shifts in the passband of the
rubbing sounds, but also by shifts in
the loudness of the auditory inputs,
even when the frequencies were not
manipulated. In the Yau et al. [3] study,
there is no pre-existing lawful
relationship between the intensity of
the auditory and vibro-tactile stimuli
used. Rather, the levels chosen were
experimenter controlled and were
essentially arbitrary, not necessarily
particularly ‘realistic’ in terms of
similar stimuli in the natural
environment. As such, it is highly
likely that the auditory inputs shifted
the perceived frequency of the tactile
inputs, and this is in fact what is
seen in the data where there is a clear
shift in the perceived vibro-tactile
frequency towards the lower-frequency auditory distractors.
One of the excellent aspects of the
work by Yau et al. [3] is that some clear
and testable predictions are made
about the underlying neurophysiology.
First, they argue that the effects likely
occur at a sensory processing level,
pointing to a region of the auditory
cortex, the so-called caudo-medial belt
area (CM), as the most likely substrate.
This area was first implicated in
auditory-somatosensory integration by
Charles Schroeder [8], who showed
feedforward auditory and
somatosensory inputs to layer 4 in area
CM of awake behaving macaques. This
was considered a remarkable finding at
the time, since CM was a classical region of auditory
cortex, and yet recordings there often uncovered larger
somatosensory than auditory
responses. It bears emphasizing that
CM is just one synapse from primary
auditory cortex at the second stage
of the auditory hierarchy (Figure 1).
Thus, robust somatosensory
responses are evident at an auditory
processing stage that is at the
equivalent hierarchical level that V2
occupies in the visual hierarchy.
Further, these inputs are just as fast as
the auditory inputs to this region [9].
At the same time, my own lab had
begun to uncover similarly early
auditory-somatosensory interactions
in early auditory cortex using both EEG
and fMRI in humans [10,11]. Our initial
thought was that this early integration
must surely be to do with spatial
mapping, but a formal test of this
assumption proved us wrong [12].
Current Biology Vol 19 No 9

Figure 1. The organization of early auditory cortex in the macaque.
(A) Organization of monkey auditory cortex. Three primary auditory fields comprise the core
region. These are surrounded by the secondary fields, the so-called belt region, and in turn
by higher association areas of the so-called parabelt region. Electrophysiological studies have
shown that several of these fields contain an ordered representation of sound frequency (tonotopic
map, indicated on the left), and that core and belt fields prefer narrow and broadband
sounds, respectively. These two functional properties can be exploited to map the layout of
these auditory fields in individual subjects using functional imaging. (B) Single-slice fMRI data
showing frequency-selective BOLD responses to low and high tones (left panel) and a complete
(smoothed) frequency map obtained from stimulation using six frequency bands. Combining the
frequency map with an estimate of the core region, as well as anatomical landmarks to
delineate the parabelt, results in a full parcellation of auditory cortex in individual subjects.
(Figure kindly provided by Christoph Kayser and reprinted with permission from [14].)

Much work has been done since to try
to understand the role of CM in
multisensory processing, and Figure 1 shows
a nice illustration of work by Christoph
Kayser and colleagues [14]. This group
used both intracranial recordings and
functional imaging in macaques to
create detailed individualized maps of
the tonotopic organization of the
auditory regions. One thing to note
from this illustration is that area CM,
which is the region implicated by Yau
et al. [3] as the putative site for the cross-frequency
coupling effects between
broad-band low-frequency auditory and
vibro-tactile stimuli,
shows little or no obvious tonotopic
map [14]. Thus, it may be the case that
CM does not have the requisite
frequency resolution to underlie these
effects whereas the neighbouring CL
shows a higher degree of tonotopy and
may be the more likely candidate.
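The tonotopic mapping logic at issue here — labelling each recording site or voxel with the frequency band that drives it most strongly — reduces to an argmax over measured responses. Below is a toy sketch with simulated data; the band centers and the crude tuning-width index are illustrative assumptions, not the actual Kayser et al. procedure.

```python
import numpy as np

# Simulated response amplitudes: 100 voxels x 6 frequency bands
band_centers_hz = np.array([250, 500, 1000, 2000, 4000, 8000])
rng = np.random.default_rng(1)
responses = rng.random((100, 6))

# Best-frequency (tonotopic) map: each voxel gets the band it prefers
best_band = np.argmax(responses, axis=1)
tonotopic_map_hz = band_centers_hz[best_band]

# Crude tuning-width index: a small range across bands suggests broad
# tuning (belt/CM-like, weak tonotopy); a large range suggests narrow,
# frequency-selective tuning (core-like)
tuning_width = responses.max(axis=1) - responses.min(axis=1)
```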
We have concentrated very much on
these very early feedforward phases of
cortical integrations in our own work
[13], in no small part because these
early integrations were considered by
many to be so very surprising when
they first came to light just a decade
ago. But a reasonable question is just
how ‘clever’ these early
integrations are. How complex is the
information that can be combined at these
earliest stages? And what of the large-scale complex of
integrations that unfolds over the
next 150 milliseconds of processing
time? For example, high-density
electrical mapping has been used to
assess the integration of extremely
basic tone-pips and monochromatic
flash inputs while subjects perform
nothing more complex than a reaction
time task [16]. In this very simple
paradigm, no fewer than six distinct
phases of multisensory processing
were dissociated across a wide
network of cortical regions, including
early extrastriate cortex, early auditory
cortex, parietal regions and frontal
regions. All of these distinct phases
occurred within just 130 milliseconds
of initial afference in area V1. So,
while this study showed very early
audio-visual integration in extrastriate
visual areas at just 45 milliseconds,
it is simply not clear what aspect of
the integrative process this initial
multisensory effect represents. It is
certainly reasonable to think that it
can hardly be a very complex
operation given how early it occurs.
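The analysis logic behind such studies is the additive model: the response evoked by the bisensory stimulus is compared against the sum of the two unisensory responses, and any residual difference indexes a multisensory interaction. Below is a toy sketch with simulated event-related potentials; the array sizes and the 45 ms interaction onset are illustrative, chosen only to echo the timing reported above.

```python
import numpy as np

def additive_model_residual(av, a, v):
    """AV minus (A + V): zero wherever the bisensory response is just
    the sum of the unisensory ones, nonzero where they interact."""
    return av - (a + v)

# Simulated ERPs: 64 electrodes x 150 time points (1 ms resolution)
rng = np.random.default_rng(2)
a = rng.standard_normal((64, 150))
v = rng.standard_normal((64, 150))
av = a + v                      # purely additive response ...
av[:, 45:] += 0.5               # ... plus an interaction starting at 45 ms

residual = additive_model_residual(av, a, v)
interacting = np.any(np.abs(residual) > 1e-9, axis=0)   # time points with an effect
onset_ms = int(np.argmax(interacting))                  # first such time point: 45
```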
Much remains to be done to delineate
what aspects of multisensory
integration are achieved across this
temporal hierarchy of integration
phases. The work of Yau et al. [3]
provides some very nice predictions
that can now be tested by the
neurophysiology and neuroimaging communities.
1. Jousmäki, V., and Hari, R. (1998). Parchment-
skin illusion: sound-biased touch. Curr. Biol. 8,
2. Guest, S., Catmur, C., Lloyd, D., and Spence, C.
(2002). Audiotactile interactions in roughness
perception. Exp. Brain Res. 146, 161–171.
3. Yau, J.M., Olenczak, J.B., Dammann, J.F., and
Bensmaia, S.J. (2009). Temporal frequency
channels are linked across audition and touch.
Curr. Biol. 19, 561–566.
4. Sumby, W.H., and Pollack, I. (1954). Visual
contribution to speech intelligibility in noise.
J. Acoust. Soc. Am. 26, 212–215.
5. Hairston, W.D., Laurienti, P.J., Mishra, G.,
Burdette, J.H., and Wallace, M.T. (2003).
Multisensory enhancement of localization
under conditions of induced myopia. Exp. Brain
Res. 152, 404–408.
6. Ross, L.A., Saint-Amour, D., Leavitt, V.,
Javitt, D.C., and Foxe, J.J. (2007). Do you see
what I’m saying? Optimal visual enhancement
of speech comprehension in noisy
environments. Cereb. Cortex 17,
7. Rowland, B.A., and Stein, B.E. (2008). Temporal
profiles of response enhancement in
multisensory integration. Front Neurosci. 2,
8. Schroeder, C.E., Lindsley, R.W., Specht, C.,
Marcovici, A., Smiley, J.F., and Javitt, D.C.
(2001). Somatosensory input to auditory
association cortex in the macaque monkey.
J. Neurophysiol. 85, 1322–1327.
9. Schroeder, C.E., and Foxe, J.J. (2002). The
timing and laminar profile of converging inputs
to multisensory areas of the macaque
neocortex. Brain Res. Cognit. Brain Res. 14,
10. Foxe, J.J., Morocz, I.A., Higgins, B.A.,
Murray, M.A., Javitt, D.C., and Schroeder, C.E.
(2000). Multisensory auditory-somatosensory
interactions in early cortical processing. Brain
Res. Cognit. Brain Res. 10, 77–83.
11. Foxe, J.J., Wylie, G.R., Martinez, A.G.,
Schroeder, C.E., Javitt, D.C., Guilfoyle, D., and
Murray, M.M. (2002). Auditory-somatosensory
multisensory processing in auditory association
cortex: An fMRI study. J. Neurophysiol. 88,
12. Murray, M.M., Molholm, S., Michel, C.M.,
Ritter, W., Heslenfeld, D.J., Schroeder, C.E.,
Javitt, D.C., and Foxe, J.J. (2005). Grabbing
your ear: Rapid auditory-somatosensory
multisensory interactions in low-level sensory
cortices are not constrained by stimulus
alignment. Cereb. Cortex 15, 963–974.
13. Foxe, J.J., and Schroeder, C.E. (2005). The case
for feedforward multisensory convergence
during early cortical processing. Neuroreport
14. Kayser, C., Petkov, C.I., and Logothetis, N.K.
(2009). Multisensory interactions in primate
auditory cortex: fMRI and electrophysiology.
Hear. Res. doi: 10.1016/j.heares.2009.02.011.
15. Lakatos, P., Pincze, Z., Fu, K.M., Javitt, D.C.,
Karmos, G., and Schroeder, C.E. (2005). Timing
of pure tone and noise-evoked responses in
macaque auditory cortex. Neuroreport 16,
16. Molholm, S., Ritter, W., Murray, M.M.,
Javitt, D.C., Schroeder, C.E., and Foxe, J.J.
(2002). Multisensory auditory-visual interactions
during early sensory processing in humans:
A high-density electrical mapping study.
Brain Res. Cognit. Brain Res. 14, 121–134.
Program in Cognitive Neuroscience,
Departments of Psychology & Biology, The
City College of the City University of New
York, 138th Street and Convent Avenue, NAC
Building – Room 7/202, New York, NY 10031,
USA; and The Cognitive Neurophysiology
Laboratory, Nathan S. Kline Institute
for Psychiatric Research, Program in
Cognitive Neuroscience and Schizophrenia,
140 Old Orangeburg Road, Orangeburg,
NY 10962, USA.