Vision in the palm of your hand

Department of Psychology, Trent University, Peterborough, Ontario, Canada.
Neuropsychologia (Impact Factor: 3.3). 12/2008; 47(6):1621-6. DOI: 10.1016/j.neuropsychologia.2008.11.021
Source: PubMed


Here we show that pointing movements made to visual targets projected onto the palm of the hand are more precise and accurate than those made to targets projected onto the back of the hand. This advantage may be related to the fact that the number of cortical bimodal neurons coding both visual and tactile stimuli increases with tactile receptor density, which is known to be higher in glabrous than in hairy skin.

Available from: Melvyn Goodale
    • "Other studies indicate that people are slower to disengage from visual targets when they appear near the hands (Abrams et al., 2008; Thura et al., 2008; Tseng and Bridgeman, 2011), and that nearby hands slow switching between the global and local levels of a stimulus (Davoli et al., 2012). Some evidence suggests that these psychophysical effects are stronger in the presence of the participants' real hand than a fake one (Reed et al., 2006; Brown et al., 2009), while other findings indicate that near-hand effects can be linked to the presence of an avatar hand whose movements mirror the actions of the participants' real hand, but not to an unmoving avatar (Short and Ward, 2009). Together, this evidence suggests that visual stimuli are processed differently when the observer's own hand(s) is placed near the stimulus than when the hand is placed elsewhere. "
    ABSTRACT: Visual targets can be processed more quickly and reliably when a hand is placed near the target. Both unimodal and bimodal representations of hands are largely lateralized to the contralateral hemisphere, and since each hemisphere demonstrates specialized cognitive processing, it is possible that targets appearing near the left hand may be processed differently than targets appearing near the right hand. The purpose of this study was to determine whether visual processing near the left and right hands interacts with hemispheric specialization. We presented hierarchical-letter stimuli (e.g., small characters used as local elements to compose large characters at the global level) near the left or right hands separately and instructed participants to discriminate the presence of target letters (X and O) from non-target letters (T and U) at either the global or local levels as quickly as possible. Targets appeared at either the global or local level of the display, at both levels, or were absent from the display; participants made foot-press responses. When discriminating target presence at the global level, participants responded more quickly to stimuli presented near the left hand than near either the right hand or in the no-hand condition. Hand presence did not influence target discrimination at the local level. Our interpretation is that left-hand presence may help participants discriminate global information, a right hemisphere (RH) process, and that the left hand may influence visual processing in a way that is distinct from the right hand.
    Full-text · Article · Oct 2013 · Frontiers in Psychology
    • "This can possibly be explained with a functional account. Emphasizing this functional aspect, Brown, Morrissey, and Goodale (2009) found that faster target detection was observed only when stimuli were projected onto the palm, but not the back, of the hand. Similarly, Davoli and Brockmole (2012) elegantly showed that hands, with palms facing inward, can ''shield'' attention from distractors outside the palm areas. "
    ABSTRACT: An exciting new line of research that investigates the impact of one's own hands on visual perception and attention has flourished in the past several years. Specifically, several studies have demonstrated that the nearness of one's hands can modulate visual perception, visual attention, and even visual memory. These studies together shed new light on how the brain prioritizes certain information to be processed first. This review first outlines the recent progress that has been made to uncover various characteristics of the nearby-hand effect, including how they may be transferred to a familiar tool. We then summarize the findings into four specific characteristics of the nearby-hand effect, and conclude with a possible neural mechanism that may account for all the findings.
    Full-text · Article · Sep 2012 · Vision research
    • "If the target falls within the visual RFs of bimodal cells, these cells may be recruited to help represent and process the target. In general, these benefits are not reliable when the patient or healthy participant sees a fake hand near the visual target [12], [15], [16]. This explanation for hand-proximity effects is reminiscent of the statistical facilitation that appears to explain redundancy effects, in which two identical stimuli are processed more quickly than one [19], [20]. "
    ABSTRACT: Some visual-tactile (bimodal) cells have visual receptive fields (vRFs) that overlap and extend moderately beyond the skin of the hand. Neurophysiological evidence suggests, however, that a vRF will grow to encompass a hand-held tool following active tool use but not after passive holding. Why does active tool use, and not passive holding, lead to spatial adaptation near a tool? We asked whether spatial adaptation could be the result of motor or visual experience with the tool, and we distinguished between these alternatives by isolating motor from visual experience with the tool. Participants learned to use a novel, weighted tool. The active training group received both motor and visual experience with the tool, the passive training group received visual experience with the tool, but no motor experience, and finally, a no-training control group received neither visual nor motor experience using the tool. After training, we used a cueing paradigm to measure how quickly participants detected targets, varying whether the tool was placed near or far from the target display. Only the active training group detected targets more quickly when the tool was placed near, rather than far, from the target display. This effect of tool location was not present for either the passive-training or control groups. These results suggest that motor learning influences how visual space around the tool is represented.
    Full-text · Article · Dec 2011 · PLoS ONE