Seeing or not seeing where your hands are.

Department of Psychology, University of Bologna, Italy.
Experimental Brain Research (Impact Factor: 2.22). 05/2000; 131(4):458-67. DOI: 10.1007/s002219900264
Source: PubMed

ABSTRACT: Previous findings have demonstrated the existence of a visual peripersonal space centered on the hand in humans and its modulatory effects on tactile perception. A strong modulatory effect of vision on touch perception was found when a visual stimulus was presented near the hand; in contrast, when the visual stimulus was presented far from the hand, only a weak modulatory effect was found. The aim of the present study was to verify whether such cross-modal links between touch and vision in the peripersonal space centered on the hand are mediated by proprioceptive signals specifying the current hand position, or whether they directly reflect an interaction between the two sensory modalities, i.e., vision and touch. To this aim, cross-modal effects were studied in two experiments: one in which patients could see their hands and one in which vision of their hands was prevented. The results showed strong modulatory effects of vision on touch perception when the visual stimulus was presented near the seen hand, and only mild effects when vision of the hand was prevented. These findings are explained by referring to the activity of bimodal neurons in the premotor and parietal cortices of the macaque, which have tactile receptive fields on the hand and corresponding visual receptive fields in the space immediately adjacent to the tactile fields. One important feature of these bimodal neurons is that their responsiveness to visual stimuli delivered near the body part is reduced, or even extinguished, when the view of that body part is prevented. This implies that, at least for the hand, vision of the hand is crucial for determining the spatial mapping between vision and touch that takes place in peripersonal space. In contrast, proprioceptive signals specifying the current hand position in space do not seem to be relevant in determining the cross-modal interaction between vision and touch.

  • Source
    ABSTRACT: How do we distinguish "self" from "other"? The correlation between willing an action and seeing it occur is an important cue. We exploited the fact that this correlation needs to occur within a restricted temporal window in order to obtain a quantitative assessment of when a body part is identified as "self". We measured the threshold and sensitivity (d') for detecting a delay between movements of the finger (of both the dominant and non-dominant hands) and visual feedback as seen from four visual perspectives (the natural view, and mirror-reversed and/or inverted views). Each trial consisted of one presentation with minimum delay and another with a delay of between 33 and 150 ms; participants indicated which presentation contained the delayed view. We varied the amount of efference copy available for the task by comparing performance for discrete movements, which are associated with a stronger efference copy, with performance for continuous movements. Sensitivity to asynchrony between visual and proprioceptive information was significantly higher when movements were viewed from a "plausible" self perspective than when the view was reversed or inverted. Further, we found differences in performance between dominant- and non-dominant-hand finger movements across the continuous and discrete movements. Performance varied with the viewpoint from which the visual feedback was presented and with the efferent component, such that optimal performance was obtained when the presentation was in the normal, natural orientation and clear efferent information was available. Variations in sensitivity to visual/non-visual temporal incongruence with the viewpoint in which a movement is seen may help determine the arrangement of the underlying visual representation of the body.
    Experimental Brain Research 08/2012; 222(4):389-97 (Impact Factor: 2.22).
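The delay-detection task above is a two-interval forced choice, so the reported sensitivity (d') can be derived from the proportion of correct responses under the standard signal-detection model. A minimal sketch of that conversion (the function name and the example proportions are illustrative, not values from the paper):

```python
from statistics import NormalDist

def dprime_2ifc(p_correct: float) -> float:
    """Sensitivity for a two-interval forced-choice (2IFC) task.

    Under the standard equal-variance signal-detection model,
    d' = sqrt(2) * z(PC), where z is the inverse standard normal CDF.
    """
    # Clamp to avoid infinite z-scores at proportions of exactly 0 or 1.
    p = min(max(p_correct, 1e-6), 1 - 1e-6)
    return 2 ** 0.5 * NormalDist().inv_cdf(p)

# Chance performance (PC = 0.5) gives d' = 0; better-than-chance gives d' > 0.
print(round(dprime_2ifc(0.5), 3))             # 0.0
print(dprime_2ifc(0.84) > dprime_2ifc(0.69))  # True
```

The sqrt(2) factor reflects that in 2IFC the observer compares two noisy intervals, so the decision variable has twice the variance of a single observation.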
  • Source
    ABSTRACT: The ability to localize nociceptive stimuli on the body surface is essential for an organism to respond appropriately to potential physical threats. This ability requires not only a representation of the space of the observer's body, but also of the external space with respect to their body. Localizing nociceptive stimuli therefore requires coordinating multiple senses into an integrated frame of reference. The peripersonal frame of reference allows for coding the position of somatosensory stimuli on the body surface together with the position of stimuli occurring close to the body (e.g., visual stimuli). Intensively studied for touch, this topic has been largely ignored when it comes to nociception. Here, we investigated, using a temporal order judgment task, whether the spatial perception of nociceptive stimuli is coordinated with that of proximal visual stimuli into an integrated representation of peripersonal space. Participants judged which of two nociceptive stimuli, one applied to each hand, had been presented first. Each pair of nociceptive stimuli was preceded by lateralized visual cues, presented either unilaterally or bilaterally, and either close to or far from the participant's body. The perception of nociceptive stimuli was biased in favor of the stimulus delivered to the hand adjacent to the unilateral visual cue, especially when the cue was presented near the participant's hand. These results suggest that a peripersonal frame of reference is used to map the position of nociceptive stimuli in multisensory space. We propose that peripersonal space constitutes a kind of margin of safety around the body, alerting the organism to possible threats.
    Neuropsychologia 01/2014 (Impact Factor: 3.48).
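In a temporal order judgment task like the one described, the cueing bias is usually quantified as a shift in the point of subjective simultaneity (PSS): the stimulus-onset asynchrony (SOA) at which the two stimuli are reported first equally often. A minimal sketch, estimating the PSS by linear interpolation over response proportions; the SOAs and proportions below are hypothetical illustrations, not the paper's data:

```python
def estimate_pss(soas, p_cued_first):
    """Estimate the point of subjective simultaneity (PSS).

    soas: sorted stimulus-onset asynchronies in ms (positive = the
          cued-hand stimulus was physically presented first).
    p_cued_first: proportion of trials on which the cued-hand stimulus
          was judged first, one value per SOA.
    Returns the SOA at which p crosses 0.5, via linear interpolation.
    """
    points = list(zip(soas, p_cued_first))
    for (s0, p0), (s1, p1) in zip(points, points[1:]):
        # Find the first adjacent pair that brackets p = 0.5.
        if (p0 - 0.5) * (p1 - 0.5) <= 0 and p0 != p1:
            return s0 + (0.5 - p0) * (s1 - s0) / (p1 - p0)
    raise ValueError("proportions never cross 0.5")

# Hypothetical data: a negative PSS means the cued-hand stimulus is
# judged first even when it physically lags, i.e. a bias toward the cue.
soas = [-90, -60, -30, 0, 30, 60, 90]
p    = [0.08, 0.15, 0.35, 0.58, 0.78, 0.90, 0.97]
print(round(estimate_pss(soas, p), 1))  # -10.4
```

Interpolation at the 50% crossing is a deliberately simple stand-in; studies of this kind typically fit a cumulative Gaussian or logistic psychometric function to the full response curve instead.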
  • Source
    ABSTRACT: The Rubber Hand Illusion (RHI) is an established paradigm for studying body ownership, and several studies have implicated premotor and temporo-parietal brain regions in its neuronal foundation. Here we used an automated setup to induce a novel multi-site version of the RHI in healthy human participants inside an MR scanner, with an RHI condition and a control condition matched in terms of the synchrony of visual and tactile stimulation. Importantly, since previous research has shown that most of the ownership-related brain areas also respond to observed human actions and touch, or to body parts of others, such potential effects of the experimenter were eliminated by the automated procedure. The RHI condition induced a strong ownership illusion; correspondingly, we found stronger brain activity during the RHI versus the control condition in the contralateral middle occipital gyrus (mOCG) and bilateral anterior insula, which have previously been related to illusory body ownership. Using independent functional localizers, we confirmed that the activity in mOCG was located within the body-part-selective extrastriate body area (EBA). Crucially, activity differences in participants' peak voxels within left EBA correlated strongly and positively with their behavioral illusion scores; EBA activity thus also reflected interindividual differences in the experienced intensity of illusory limb ownership. Moreover, psychophysiological interaction (PPI) analyses revealed that the contralateral primary somatosensory cortex had stronger connectivity with EBA during the RHI versus the control condition, while EBA interacted more strongly with temporo-parietal multisensory regions. In sum, our findings demonstrate a direct involvement of EBA in limb ownership.
    NeuroImage 10/2013 (Impact Factor: 6.25).