Article

Seeing or not seeing where your hands are.

Department of Psychology, University of Bologna, Italy.
Experimental Brain Research (Impact Factor: 2.22). 05/2000; 131(4):458-67. DOI: 10.1007/s002219900264
Source: PubMed

ABSTRACT: Previous findings have demonstrated the existence of a visual peripersonal space centered on the hand in humans and its modulatory effects on tactile perception. A strong modulatory effect of vision on touch perception was found when a visual stimulus was presented near the hand, whereas only a weak modulatory effect was found when the visual stimulus was presented far from the hand. The aim of the present study was to verify whether such cross-modal links between touch and vision in the peripersonal space centered on the hand are mediated by proprioceptive signals specifying the current hand position, or whether they directly reflect an interaction between the two sensory modalities, i.e., vision and touch. To this aim, cross-modal effects were studied in two experiments: one in which patients could see their hands and one in which vision of their hands was prevented. The results showed strong modulatory effects of vision on touch perception when the visual stimulus was presented near the seen hand, and only mild effects when vision of the hand was prevented. These findings are explained by reference to the activity of bimodal neurons in the premotor and parietal cortex of the macaque, which have tactile receptive fields on the hand and corresponding visual receptive fields in the space immediately adjacent to the tactile fields. One important feature of these bimodal neurons is that their responsiveness to visual stimuli delivered near the body part is reduced, or even extinguished, when the view of that body part is prevented. This implies that, at least for the hand, seeing the hand is crucial for determining the spatial mapping between vision and touch that takes place in peripersonal space. In contrast, the proprioceptive signals specifying the current hand position in space do not seem to be relevant in determining the cross-modal interaction between vision and touch.
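
The abstract reports stronger or weaker visual modulation of touch across conditions without specifying the measure. Purely as an illustration, the Python sketch below uses hypothetical detection rates and condition labels (not the paper's data or method) to show one common way such modulation can be expressed: the drop in tactile detection when a competing visual stimulus is added, compared across visual-near/far and hand-seen/unseen conditions.

```python
# Illustrative sketch only: hypothetical detection rates, not the paper's data.
# Visual modulation of touch expressed as the drop in tactile detection when a
# competing visual stimulus is added, relative to a touch-only baseline.

baseline_detection = 0.92  # hypothetical tactile detection rate with no visual stimulus

detection_with_visual = {
    ("hand seen",   "visual near hand"):     0.55,  # strong interference expected here
    ("hand seen",   "visual far from hand"): 0.84,
    ("hand unseen", "visual near hand"):     0.80,  # only mild interference when the hand is not visible
    ("hand unseen", "visual far from hand"): 0.86,
}

for condition, rate in detection_with_visual.items():
    modulation = baseline_detection - rate  # larger value = stronger visual modulation of touch
    print(f"{condition}: modulation = {modulation:.2f}")
```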

  • ABSTRACT: The brain's ability to integrate information from different modalities (multisensory integration) is fundamental for accurate sensory experience and efficient interaction with the environment: it enhances the detection of external stimuli, disambiguates conflicting situations, speeds up responsiveness, and facilitates memory retrieval and object recognition. Multisensory integration operates at several brain levels: in subcortical structures (especially the Superior Colliculus), in higher-level associative cortices (e.g., posterior parietal regions), and even in early cortical areas (such as the primary cortices) traditionally considered purely unisensory. Because brain integrative phenomena rely on complex, non-linear mechanisms, neurocomputational models are a key tool for understanding them. This review examines different modelling principles and architectures, distinguishing the models on the basis of their aims: (i) Bayesian models, based on probabilities, that realize optimal estimators of external cues; (ii) biologically inspired models of multisensory integration in the Superior Colliculus and in the cortex, both at the level of single neurons and of networks of neurons, with emphasis on physiological mechanisms and architectural schemes; among the latter, some models exhibit synaptic plasticity and reproduce the development of integrative capabilities via Hebbian learning rules or self-organizing maps; (iii) models of semantic memory that implement object meaning as a fusion of sensory-motor features (embodied cognition). This overview paves the way for future challenges, such as reconciling neurophysiological and Bayesian models into a unifying theory, and stimulates upcoming research in both theoretical and applied domains. (A toy numerical sketch of such a Bayesian optimal estimator appears after this list.)
    Neural Networks: The Official Journal of the International Neural Network Society (Impact Factor: 1.88). 08/2014; 60C:141-165.
  • ABSTRACT: We investigated the effects of seen and unseen within-hemifield posture changes on crossmodal visual-tactile links in covert spatial attention. In all experiments, a spatially nonpredictive tactile cue was presented to the left or the right hand, with the two hands placed symmetrically across the midline. Shortly after the tactile cue, a visual target appeared at one of two eccentricities within either hemifield. For half of the trial blocks, the hands were aligned with the inner visual target locations, and for the remainder, with the outer target locations. In Experiments 1 and 2, the inner and outer eccentricities were 17.5° and 52.5°, respectively. In Experiment 1, the arms were completely covered, and visual up-down judgments were better for targets on the same side as the preceding tactile cue; these cueing effects were not significantly affected by hand or target alignment. In Experiment 2, the arms were in view, and some cueing effects now depended on alignment: cueing for outer targets was significant only when the hands were aligned with them. In Experiment 3, we tested whether any unseen posture change could alter the cueing effects by widely separating the inner and outer target eccentricities (now 10° and 86°). In this case, hand alignment did affect some of the cueing effects: cueing for outer targets was now significant only when the hands were in the outer position. Although these results confirm that proprioception can, in some cases, influence tactile-visual links in exogenous spatial attention, they also show that its spatial precision is severely limited, especially when posture is unseen. (A toy sketch of how such cueing effects are commonly quantified appears after this list.)
    Attention, Perception, & Psychophysics (Impact Factor: 1.97). 01/2014.
  • ABSTRACT: The ability to localize nociceptive stimuli on the body surface is essential for an organism to respond appropriately to potential physical threats. This ability requires a representation not only of the space of the observer's body but also of the external space with respect to the body. Localizing nociceptive stimuli therefore requires coordinating multiple senses into an integrated frame of reference. The peripersonal frame of reference allows for the coding of the position of somatosensory stimuli on the body surface and of the position of stimuli occurring close to the body (e.g., visual stimuli). Intensively studied for touch, this topic has been largely ignored when it comes to nociception. Here, we investigated, using a temporal order judgment task, whether the spatial perception of nociceptive stimuli is coordinated with that of proximal visual stimuli within an integrated representation of peripersonal space. Participants judged which of two nociceptive stimuli, one applied to each hand, had been presented first. Each pair of nociceptive stimuli was preceded by lateralized visual cues presented either unilaterally or bilaterally, and either close to or far from the participant's body. The perception of nociceptive stimuli was biased in favor of the stimulus delivered to the hand adjacent to the unilateral visual cue, especially when the cue was presented near the participant's hand. These results therefore suggest that a peripersonal frame of reference is used to map the position of nociceptive stimuli in multisensory space. We propose that peripersonal space constitutes a kind of margin of safety around the body, alerting the organism to possible threats. (A toy sketch of one common way to summarize temporal order judgment biases appears after this list.)
    Neuropsychologia (Impact Factor: 3.48). 01/2014.
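
The first entry above describes Bayesian models that realize optimal estimators of external cues. As a hedged illustration of that principle (not code from any of the reviewed models), the Python sketch below fuses two independent Gaussian cues, e.g. a visual and a tactile position estimate, weighting each by its reliability (inverse variance); all numbers are made up.

```python
def fuse_gaussian_cues(mu_v, var_v, mu_t, var_t):
    """Reliability-weighted (maximum-likelihood) fusion of two independent Gaussian cues."""
    w_v = (1 / var_v) / (1 / var_v + 1 / var_t)  # weight of the visual cue
    w_t = 1 - w_v                                # weight of the tactile cue
    mu_hat = w_v * mu_v + w_t * mu_t             # fused position estimate
    var_hat = 1 / (1 / var_v + 1 / var_t)        # fused variance: never larger than either cue's
    return mu_hat, var_hat

# Hypothetical hand-position estimates (cm): vision more reliable than touch/proprioception.
print(fuse_gaussian_cues(mu_v=10.0, var_v=1.0, mu_t=14.0, var_t=4.0))
# -> (10.8, 0.8): the fused estimate is pulled toward the more reliable (visual) cue.
```

Under Gaussian assumptions, this reliability-weighted rule is the standard sense in which such estimators are called "optimal".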
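
The second entry reports crossmodal cueing effects on visual up-down judgments. The abstract does not give the analysis, so the sketch below is only a generic, hypothetical illustration of how an exogenous cueing effect is commonly quantified: performance for targets on the tactually cued side versus the uncued side, here as a difference in mean correct response times.

```python
from statistics import mean

# Hypothetical trials: (tactile cue side, visual target side, judgment correct?, RT in ms).
trials = [
    ("left", "left", True, 412), ("left", "right", True, 447),
    ("right", "right", True, 405), ("right", "left", False, 463),
    ("left", "left", True, 398), ("right", "left", True, 452),
]

def cueing_effect(trials):
    """Mean correct RT on uncued-side trials minus cued-side trials (ms)."""
    cued = [rt for cue, target, correct, rt in trials if correct and cue == target]
    uncued = [rt for cue, target, correct, rt in trials if correct and cue != target]
    return mean(uncued) - mean(cued)  # positive = faster responses on the cued side

print(f"Cueing effect: {cueing_effect(trials):.1f} ms")  # 44.5 ms with the toy data above
```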
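
The third entry uses a temporal order judgment (TOJ) task. One common way to summarize the reported bias, sketched below with entirely hypothetical data and a deliberately simple linear interpolation rather than a full psychometric fit, is the point of subjective simultaneity (PSS): the stimulus onset asynchrony (SOA) at which the two hands are reported first equally often. A shift of the PSS toward the hand near the visual cue corresponds to the bias described in the abstract.

```python
def estimate_pss(soas, p_right_first):
    """SOA (ms) at which 'right hand first' responses reach 50%, by linear interpolation."""
    points = list(zip(soas, p_right_first))
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        if (y0 - 0.5) * (y1 - 0.5) <= 0 and y0 != y1:  # the 50% point lies on this segment
            return x0 + (0.5 - y0) * (x1 - x0) / (y1 - y0)
    return None  # 50% point not bracketed by the tested SOAs

# Hypothetical data. SOA > 0 means the right-hand stimulus was delivered first.
soas = [-90, -55, -30, -10, 10, 30, 55, 90]
p_right_first_cue_near_right = [0.08, 0.15, 0.30, 0.48, 0.66, 0.82, 0.93, 0.97]
p_right_first_cue_near_left  = [0.03, 0.08, 0.18, 0.33, 0.52, 0.72, 0.88, 0.95]

# A negative PSS with the cue near the right hand means the right-hand stimulus is
# reported first even when it objectively lags: perception is biased toward the cued hand.
print(estimate_pss(soas, p_right_first_cue_near_right))  # ~ -7.8 ms
print(estimate_pss(soas, p_right_first_cue_near_left))   # ~ +7.9 ms
```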