Multisensory Integration: Frequency
Tuning of Audio-Tactile Integration
Multisensory information can be crucial, yet in many circumstances we have
little, if any, awareness of the effects of multisensory inputs on what appear to
be entirely unisensory perceptions. A recent study shows robust effects of
auditory input on tactile frequency discriminations and that this auditory
cross-sensory interference has specific tuning.
John J. Foxe
Gentlemen, have you tried shaving with
your ears plugged? Of course you
haven’t and perhaps this doesn’t
even strike you as all that difficult an
undertaking, not like being asked to
do it without a mirror. And goodness
knows what it would be like to shave
if you were asked to apply topical
anesthetic to your face beforehand.
Still, the next time you shave, take
a little extra time to consider the
sensory signals that you rely on during
this tedious job. I bring up shaving here
because it has often struck me just how
much one relies on the combination of
auditory and somatosensory inputs
during this routine chore. It is a truly
multisensory task and the interplay of
the sound of the razor passing over
unshaven areas and the feel of the
blades on the skin is an excellent
demonstration of the interplay of these
two sensory systems during a very
personal tactile roughness task. I’m not
so sure that the effect is quite as strong
for the opposite sex when shaving
more distal and less innervated
aspects of the body, but I’ll assume
there are reasonable parallels.
One of the earliest formal
demonstrations of the role of auditory
inputs on tactile sensations was also
one of the most extraordinary, and is
not unrelated to my shaving example.
Jousmäki and Hari [1], writing in
Current Biology, showed that by
artificially altering the rubbing sounds
that participants heard when asked to
rub their palms together, one could
dramatically alter the tactile sensations
that subjects reported. They used
a simple setup where they placed
a microphone next to the hands and
then played the rubbing sounds the
hands made back through a pair of
headphones. In some cases, the
sounds were unaltered and in others,
all frequencies above 2000 Hz were
either enhanced or dampened by
15 decibels. Participants were asked to
rate their tactile sensation on a scale
between relative moistness
(roughness) and dryness
(smoothness). Two effects were seen.
First, the louder the rubbing sounds
were, the smoother and drier the
rubbing experience became. So, it
became clear that auditory inputs
could affect tactile roughness
judgments. More importantly for our
purposes here, they also found that by
enhancing the high-frequency
component (above 2 kHz) of the rubbing
sound, the majority of their subjects
also experienced a significant shift in
the perceived smoothness/dryness of
the skin surface. A number of the
subjects spontaneously reported the
rather extraordinary sensation of
having a leaf of parchment paper
interposed between their rubbing
hands and so the effect has entered the
vernacular in the multisensory field as
the ‘parchment-skin illusion’.
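The frequency manipulation at the heart of the illusion is easy to sketch. The following toy example is not the authors' actual apparatus (they worked on live audio through headphones); it simply applies the 15 dB boost or attenuation above 2 kHz described above to a recorded signal in the FFT domain, with white noise standing in for a hand-rubbing recording:

```python
import numpy as np

def shelve_high(signal, fs, cutoff=2000.0, gain_db=-15.0):
    """Scale every spectral component above `cutoff` (Hz) by `gain_db`,
    a crude FFT-domain version of the parchment-skin manipulation."""
    spec = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spec[freqs > cutoff] *= 10 ** (gain_db / 20.0)
    return np.fft.irfft(spec, n=len(signal))

# Demo: white noise standing in for one second of rubbing sound
fs = 44100
rng = np.random.default_rng(0)
noise = rng.standard_normal(fs)
smooth_dry = shelve_high(noise, fs, gain_db=+15.0)   # "smoother/drier" condition
damped = shelve_high(noise, fs, gain_db=-15.0)       # attenuated condition
```

A real-time version would use a streaming high-shelf filter rather than a whole-signal FFT, but the perceptual manipulation, selectively changing the level of the high-frequency band, is the same.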
This early demonstration of the importance of auditory inputs as an adjunct to touch also hinted that there might be a tuning function underlying the effect, since it was the manipulation of the high-versus-low frequency ratios of the auditory inputs that altered those tactile perceptions. Guest et al. [2] pursued this possibility with real textured surfaces. Participants in their study made forced-choice discriminations regarding the roughness of abrasive surfaces, and their data showed that roughness perception was modulated by the frequency content of the auditory feedback, with attenuation of high-frequency inputs causing a bias towards increased perceived smoothness.
Yau et al. [3] have revisited this issue
in a series of carefully conducted
psychophysical experiments. As they
reported recently in Current Biology,
they have shown convincingly that
auditory inputs interfere with tactile
frequency-discrimination, but not with
judgments of intensity, thus ruling out
an attentional confound, and that this
interference only occurs when the
auditory inputs are at or near the same
frequency as the tactile inputs. In other
words, the frequency of vibrations at
the skin surface and of vibrations in
the hair cells of the cochlea, two
entirely separate sensory epithelia
transducing two very different forms
of energy, is a critical feature dimension for subsequent multisensory processing. Participants performed a two-alternative forced-choice task where they judged which of two sequentially presented vibro-tactile stimulations of the index finger was higher in frequency. Vibratory stimuli ranged in frequency from 100 to 300 Hz in 40 Hz increments, and the second of the vibratory stimuli was accompanied by an auditory ‘distractor’ which could either be the same frequency as the tactile stimulus or a different one, ranging between 100 and 1500 Hz. The
data indicated a clear decrement in the
sensitivity of subjects to tactile
frequency differences, but only for
auditory distractors in the low
frequency range. Another intriguing finding was that the perceived tactile frequency tended to be pulled towards the lowest-frequency auditory stimulus (100 Hz).
It remained possible that these
interference effects were only driven by
low-frequency auditory inputs, so in
a follow-up experiment, the authors
used a higher frequency vibro-tactile
comparator stimulus (400 Hz) and
showed that the interference effects again tracked the tactile frequency rather than being confined to the low-frequency range. Lastly, they showed that the effects
were also present for band-pass noise
distractors centered at the frequency of
the vibro-tactile stimulus and so, the
perception of pure pitch was not
necessary for the interference effect.
The authors argued that this latter
finding pointed to a sensory rather than
decisional locus for the effect.
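The logic of this frequency-specific interference can be captured in a toy simulation. To be clear, the tuning width, pull strength, and internal-noise values below are invented for illustration, not parameters from the study; the sketch merely shows how a distractor near the tactile frequency degrades two-alternative forced-choice discrimination while a remote one (e.g. 1500 Hz) leaves it intact:

```python
import numpy as np

def perceived_freq(tactile_hz, distractor_hz, pull=0.3, tuning_octaves=1.0):
    """Toy model: perceived tactile frequency is pulled toward the auditory
    distractor, weighted by a Gaussian of their distance in octaves."""
    dist = np.log2(distractor_hz / tactile_hz)
    w = pull * np.exp(-dist**2 / (2 * tuning_octaves**2))
    # The pull operates on log-frequency, mirroring octave-based tuning
    return 2 ** ((1 - w) * np.log2(tactile_hz) + w * np.log2(distractor_hz))

def pct_correct(standard=200.0, comparison=240.0, distractor_hz=1500.0,
                noise_hz=15.0, n=20000, seed=1):
    """Simulated 2AFC block: which interval felt higher in frequency?
    As in the experiment, only the comparison interval carries the
    auditory distractor."""
    rng = np.random.default_rng(seed)
    s = standard + rng.standard_normal(n) * noise_hz
    c = perceived_freq(comparison, distractor_hz) + rng.standard_normal(n) * noise_hz
    return float(np.mean(c > s))
```

With these toy parameters, a 100 Hz distractor both drags the perceived frequency of a 240 Hz comparison downward and lowers discrimination accuracy, whereas a 1500 Hz distractor, nearly three octaves away, changes neither appreciably.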
One obvious question that arises is
why this cross-sensory effect is
manifest as an impairment in tactile
frequency perception. Why did same-frequency auditory inputs not aid performance if they were integrated? Aren’t performance improvements what we have come to expect when there is redundant but congruent sensory input [4–7]? Jousmäki and Hari’s [1]
parchment-skin findings may hold
the answer. In that study, the rubbing
sounds were the actual sounds of the
participants’ own hands, and yet tactile roughness perception was changed not only by shifts in the passband of the
rubbing sounds, but also by shifts in
the loudness of the auditory inputs,
even when the frequencies were not
manipulated. In the Yau et al. [3] study,
there is no pre-existing lawful
relationship between the intensity of
the auditory and vibro-tactile stimuli
used. Rather, the levels chosen were
experimenter controlled and were
essentially arbitrary, not necessarily
particularly ‘realistic’ in terms of
similar stimuli in the natural
environment. As such, it is highly
likely that the auditory inputs shifted
the perceived frequency of the tactile
inputs, and this is in fact what is
seen in the data where there is a clear
shift in the perceived vibro-tactile
frequency towards the lower-frequency auditory distractors.
One of the excellent aspects of the
work by Yau et al. [3] is that some clear
and testable predictions are made
about the underlying neurophysiology.
First, they argue that the effects likely
occur at a sensory processing level,
pointing to a region of the auditory
cortex, the so-called caudo-medial belt
area (CM), as the most likely substrate.
This area was first implicated in
auditory-somatosensory integration by
Charles Schroeder [8], who showed
feedforward auditory and
somatosensory inputs to layer 4 in area
CM of awake behaving macaques. This was considered a remarkable finding at the time, as recordings in what was a classical region of auditory cortex often uncovered larger somatosensory than auditory responses. It bears emphasizing that
CM is just one synapse from primary
auditory cortex at the second stage
of the auditory hierarchy (Figure 1).
Thus, robust somatosensory
responses are evident at an auditory
processing stage that is at the
equivalent hierarchical level that V2
occupies in the visual hierarchy.
Further, these inputs are just as fast as
the auditory inputs to this region [9].
At the same time, my own lab had begun to uncover similarly early auditory-somatosensory interactions in early auditory cortex using both EEG and fMRI in humans [10,11]. Our initial thought was that this early integration must surely be to do with spatial mapping, but a formal test of this assumption proved us wrong [12].
Much work has been done since to try
to understand the role of CM in
Figure 1. The organization of early auditory cortex in the macaque.
(A) Organization of monkey auditory cortex. Three primary auditory fields comprise the core
region. These are surrounded by the secondary fields, the so-called belt region, and in turn
by higher association areas of the so-called parabelt region. Electrophysiological studies have
shown that several of these fields contain an ordered representation of sound frequency (tono-
topic map, indicated on the left), and that core and belt fields prefer narrow and broadband
sounds, respectively. These two functional properties can be exploited to map the layout of
these auditory fields in individual subjects using functional imaging. (B) Single-slice fMRI data
showing frequency-selective BOLD responses to low and high tones (left panel) and a complete
(smoothed) frequency map obtained from stimulation using six frequency bands. Combining the frequency map with an estimate of the core region, as well as anatomical landmarks to delineate the parabelt, results in a full parcellation of auditory cortex in individual subjects.
(Figure kindly provided by Christoph Kayser and reprinted with permission from [14].)
Current Biology Vol 19 No 9
multisensory processing, and Figure 1 shows a nice illustration of work by Christoph Kayser and colleagues [14]. This group
used both intracranial recordings and
functional imaging in macaques to
create detailed individualized maps of
the tonotopic organization of the
auditory regions. One thing to note from this illustration is that area CM, which is the region implicated by Yau et al. [3] as the putative site for cross-frequency coupling effects between broad-band low-frequency auditory stimuli and vibro-tactile inputs, shows little or no obvious tonotopic map [14]. Thus, it may be the case that
CM does not have the requisite
frequency resolution to underlie these
effects whereas the neighbouring CL
shows a higher degree of tonotopy and
may be the more likely candidate.
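The mapping logic in the figure caption, assigning each voxel the frequency band that drives it best, can be sketched in a few lines. This is a deliberately crude stand-in (the band centers, tuning shape, and noise level are invented for illustration), not the Kayser et al. analysis pipeline:

```python
import numpy as np

bands_hz = np.array([250, 500, 1000, 2000, 4000, 8000])  # six illustrative bands

def best_frequency_map(responses):
    """responses: (n_voxels, n_bands) array of response amplitudes.
    Returns each voxel's best frequency -- the crude core of a tonotopic map."""
    return bands_hz[np.argmax(responses, axis=1)]

# Synthetic "voxels": each is tuned to one band, with a triangular
# falloff across neighbouring bands plus measurement noise.
rng = np.random.default_rng(0)
true_best = rng.integers(0, 6, size=100)
tuning = -np.abs(np.arange(6)[None, :] - true_best[:, None]).astype(float)
responses = tuning + rng.normal(0.0, 0.1, size=(100, 6))
bf_map = best_frequency_map(responses)
```

In a region with an ordered tonotopic map, neighbouring voxels would yield smoothly varying best frequencies; in a region like CM, the same procedure would return a patchy, disordered map, which is precisely why frequency resolution there is in question.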
We have concentrated very much on
these very early feedforward phases of
cortical integrations in our own work [13], in no small part because these
early integrations were considered by
many to be so very surprising when
they first came to light just a decade
ago. But a reasonable question is just how ‘clever’ these early integrations are. How complex is the information that is actually being combined? And what of the large-scale complex of integration effects that unfolds over the next 150 milliseconds of processing time? For example, high-density
electrical mapping has been used to
assess the integration of extremely
basic tone-pips and monochromatic
flash inputs while subjects perform
nothing more complex than a reaction
time task [16]. In this very simple
paradigm, no fewer than six distinct
phases of multisensory processing
were dissociated across a wide
network of cortical regions, including
early extrastriate cortex, early auditory
cortex, parietal regions and frontal
regions. All of these distinct phases
occurred within just 130 milliseconds
of initial afference in area V1. So,
while this study showed very early
audio-visual integration in extrastriate
visual areas at just 45 milliseconds,
it is simply not clear what aspect of
the integrative process this initial
multisensory effect represents. It is
certainly reasonable to think that it
can hardly be a very complex
operation given how early it occurs.
Much remains to be done to delineate
what aspects of multisensory
integration are achieved across this
temporal hierarchy of integration
phases. The work of Yau et al. [3] provides some very nice predictions that can now be tested by the neurophysiology and neuroimaging communities.
1. Jousmäki, V., and Hari, R. (1998). Parchment-
skin illusion: sound-biased touch. Curr. Biol. 8,
2. Guest, S., Catmur, C., Lloyd, D., and Spence, C.
(2002). Audiotactile interactions in roughness
perception. Exp. Brain Res. 146, 161–171.
3. Yau, J.M., Olenczak, J.B., Dammann, J.F., and
Bensmaia, S.J. (2009). Temporal frequency
channels are linked across audition and touch.
Curr. Biol. 19, 561–566.
4. Sumby, W.H., and Pollack, I. (1954). Visual
contribution to speech intelligibility in noise.
J. Acoust. Soc. Am. 26, 212–215.
5. Hairston, W.D., Laurienti, P.J., Mishra, G.,
Burdette, J.H., and Wallace, M.T. (2003).
Multisensory enhancement of localization
under conditions of induced myopia. Exp. Brain
Res. 152, 404–408.
6. Ross, L.A., Saint-Amour, D., Leavitt, V.,
Javitt, D.C., and Foxe, J.J. (2007). Do you see
what I’m saying? Optimal visual enhancement
of speech comprehension in noisy
environments. Cereb. Cortex 17,
7. Rowland, B.A., and Stein, B.E. (2008).
profiles of response enhancement in
multisensory integration. Front Neurosci. 2,
8. Schroeder, C.E., Lindsley, R.W., Specht, C.,
Marcovici, A., Smiley, J.F., and Javitt, D.C.
(2001). Somatosensory input to auditory
association cortex in the macaque monkey.
J. Neurophysiol. 85, 1322–1327.
9. Schroeder, C.E., and Foxe, J.J. (2002). The
timing and laminar profile of converging inputs
to multisensory areas of the macaque
neocortex. Brain Res. Cognit. Brain Res. 14,
10. Foxe, J.J., Morocz, I.A., Higgins, B.A.,
Murray, M.A., Javitt, D.C., and Schroeder, C.E.
(2000). Multisensory auditory-somatosensory
interactions in early cortical processing. Brain
Res. Cognit. Brain Res. 10, 77–83.
11. Foxe, J.J., Wylie, G.R., Martinez, A.G.,
Schroeder, C.E., Javitt, D.C., Guilfoyle, D., and
Murray, M.M. (2002). Auditory-somatosensory
multisensory processing in auditory association
cortex: An fMRI study. J. Neurophysiol. 88,
12. Murray, M.M., Molholm, S., Michel, C.M.,
Ritter, W., Heslenfeld, D.J., Schroeder, C.E.,
Javitt, D.C., and Foxe, J.J. (2005). Grabbing
your ear: Rapid auditory-somatosensory
multisensory interactions in low-level sensory
cortices are not constrained by stimulus
alignment. Cereb. Cortex 15, 963–974.
13. Foxe, J.J., and Schroeder, C.E. (2005). The case
for feedforward multisensory convergence
during early cortical processing. Neuroreport
14. Kayser, C., Petkov, C.I., and Logothetis, N.K.
(2009). Multisensory interactions in primate
auditory cortex: fMRI and electrophysiology.
Hear. Res. doi: 10.1016/j.heares.2009.02.011.
15. Lakatos, P., Pincze, Z., Fu, K.M., Javitt, D.C.,
Karmos, G., and Schroeder, C.E. (2005). Timing
of pure tone and noise-evoked responses in
macaque auditory cortex. Neuroreport 16,
16. Molholm, S., Ritter, W., Murray, M.M.,
Javitt, D.C., Schroeder, C.E., and Foxe, J.J.
(2002). Multisensory auditory-visual interactions
during early sensory processing in humans:
A high-density electrical mapping study.
Brain Res. Cognit. Brain Res. 14, 121–134.
Program in Cognitive Neuroscience,
Departments of Psychology & Biology, The
City College of the City University of New
York, 138th Street and Convent Avenue, NAC
Building – Room 7/202, New York, NY 10031,
USA; and The Cognitive Neurophysiology
Laboratory, Nathan S. Kline Institute
for Psychiatric Research, Program in
Cognitive Neuroscience and Schizophrenia,
140 Old Orangeburg Road, Orangeburg,
NY 10962, USA.