Article

Seeing or not seeing where your hands are.

Department of Psychology, University of Bologna, Italy.
Experimental Brain Research. 05/2000; 131(4):458-67. DOI: 10.1007/s002219900264
Source: PubMed

ABSTRACT Previous findings have demonstrated the existence of a visual peripersonal space centered on the hand in humans and its modulatory effects on tactile perception. A strong modulatory effect of vision on touch perception was found when a visual stimulus was presented near the hand, whereas only a weak effect was found when the visual stimulus was presented far from the hand. The aim of the present study was to verify whether such cross-modal links between touch and vision in the peripersonal space centered on the hand are mediated by proprioceptive signals specifying the current hand position, or whether they directly reflect an interaction between the two sensory modalities, vision and touch. To this aim, cross-modal effects were studied in two different experiments: one in which patients could see their hands and one in which vision of their hands was prevented. The results showed strong modulatory effects of vision on touch perception when the visual stimulus was presented near the seen hand, and only mild effects when vision of the hand was prevented. These findings are explained by referring to the activity of bimodal neurons in the premotor and parietal cortices of the macaque, which have tactile receptive fields on the hand and corresponding visual receptive fields in the space immediately adjacent to the tactile fields. One important feature of these bimodal neurons is that their responsiveness to visual stimuli delivered near the body part is reduced, or even extinguished, when the view of that body part is prevented. This implies that, at least for the hand, vision of the hand is crucial in determining the spatial mapping between vision and touch that takes place in peripersonal space. In contrast, proprioceptive signals specifying the current hand position in space do not seem to be relevant in determining the cross-modal interaction between vision and touch.
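To make the mechanism described above concrete, here is a toy firing-rate sketch in Python (not from the paper; the function, parameters, and values are all illustrative assumptions) of a bimodal neuron whose visual receptive field is anchored to the hand and whose visual response is gated by whether the hand is visible:

import math

def bimodal_response(tactile_input, visual_distance_cm, hand_visible,
                     rf_size_cm=30.0):
    """Toy firing-rate model of a parieto-premotor bimodal neuron.

    tactile_input      strength of touch on the hand (0..1)
    visual_distance_cm distance of the visual stimulus from the hand
    hand_visible       whether the hand can currently be seen
    rf_size_cm         extent of the hand-centered visual field
    """
    # Tactile drive does not depend on seeing the hand.
    tactile_drive = tactile_input

    # Visual drive falls off with distance from the hand and, per the
    # findings summarized above, is strongly reduced when the hand
    # cannot be seen (leaving only a mild, proprioceptive residue).
    proximity = math.exp(-(visual_distance_cm / rf_size_cm) ** 2)
    gate = 1.0 if hand_visible else 0.2  # illustrative residual factor
    visual_drive = gate * proximity

    return tactile_drive + visual_drive

# A visual stimulus 5 cm from the hand boosts the response strongly
# only when the hand is seen.
print(bimodal_response(0.5, 5.0, hand_visible=True))   # ~1.47
print(bimodal_response(0.5, 5.0, hand_visible=False))  # ~0.69

Under this sketch, the same nearby visual stimulus produces a strong cross-modal boost when the hand is seen and only a mild one when it is not, mirroring the pattern of results reported in the abstract.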

Related publications:

  • ABSTRACT: The brain's ability to integrate information from different modalities (multisensory integration) is fundamental for accurate sensory experience and efficient interaction with the environment: it enhances detection of external stimuli, disambiguates conflicting situations, speeds up responsiveness, and facilitates memory retrieval and object recognition. Multisensory integration operates at several brain levels: in subcortical structures (especially the superior colliculus), in higher-level associative cortices (e.g., posterior parietal regions), and even in early cortical areas (such as primary cortices) traditionally considered purely unisensory. Because of the complex, non-linear mechanisms of these integrative phenomena, neurocomputational models are a key tool for understanding them. This review examines different modelling principles and architectures, distinguishing the models on the basis of their aims: (i) Bayesian models based on probabilities, realizing optimal estimators of external cues (a minimal Python sketch of this case is given after this list); (ii) biologically inspired models of multisensory integration in the superior colliculus and in the cortex, at the level of both single neurons and networks of neurons, with emphasis on physiological mechanisms and architectural schemes; among the latter, some models exhibit synaptic plasticity and reproduce the development of integrative capabilities via Hebbian learning rules or self-organizing maps; (iii) models of semantic memory that implement object meaning as a fusion of sensory-motor features (embodied cognition). This overview paves the way for future challenges, such as reconciling neurophysiological and Bayesian models into a unifying theory, and stimulates upcoming research in both theoretical and applicative domains.
    Neural Networks (the official journal of the International Neural Network Society), 08/2014; 60C:141-165.
  • ABSTRACT: The complexities of bodily experience are outlined, and its spatial phenomenology is specified as the explanatory target. The mereological structure of body representation is discussed; it is claimed that global spatial representations of the body are not necessary, since structural features of the actual body can be exploited in partial internal representations. The spatial structure of bodily experience is then discussed and a structural affordance theory is introduced; it is claimed that bodily experience and subpersonal representation have action-oriented content, and that egocentric terms continue to make sense when applied to bodily experience.
    In: Embodiment, Ego-space and Action, 1st Edition, edited by R. L. Klatzky, M. Behrmann, and B. MacWhinney. Psychology Press, 06/2008; pages 247-274. ISBN: 0805862889
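The first model class named above, Bayesian optimal estimation, has a simple closed form for two independent Gaussian cues: the optimal (maximum-likelihood) estimate is the reliability-weighted average of the single-cue estimates. A minimal sketch in Python (the variable names and example numbers are illustrative, not from the review):

def fuse_gaussian_cues(mu_v, var_v, mu_t, var_t):
    """Maximum-likelihood fusion of two independent Gaussian cues.

    Each cue is weighted by its reliability (inverse variance); the
    fused estimate has lower variance than either cue alone.
    """
    w_v = (1.0 / var_v) / (1.0 / var_v + 1.0 / var_t)  # visual weight
    w_t = 1.0 - w_v                                    # tactile weight
    fused_mean = w_v * mu_v + w_t * mu_t
    fused_var = 1.0 / (1.0 / var_v + 1.0 / var_t)
    return fused_mean, fused_var

# Vision locates a stimulus at 10 cm (variance 1); touch says 14 cm
# (variance 4). The fused estimate sits closer to the more reliable
# visual cue and is more precise than either cue alone.
print(fuse_gaussian_cues(10.0, 1.0, 14.0, 4.0))  # (10.8, 0.8)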