Multisensory space representations in the macaque ventral intraparietal area.
ABSTRACT: Animals can use different sensory signals to localize objects in the environment. Depending on the situation, the brain either integrates information from multiple sensory sources or selects the modality conveying the most reliable information to direct behavior. This suggests that the brain has access to a modality-invariant representation of external space. Accordingly, neural structures encoding signals from more than one sensory modality are best suited for spatial information processing. In primates, the posterior parietal cortex (PPC) is a key structure for spatial representations. One substructure within human and macaque PPC is the ventral intraparietal area (VIP), known to represent visual, vestibular, and tactile signals. In the present study, we show for the first time that macaque area VIP neurons also respond to auditory stimulation. Interestingly, the strength of the responses to acoustic stimuli depended strongly on their spatial location: most auditory-responsive neurons had surprisingly small, spatially restricted auditory receptive fields (RFs). Given this finding, we compared the auditory RF locations with the corresponding visual RF locations of individual area VIP neurons. In the vast majority of neurons, the auditory and visual RFs largely overlapped. Additionally, neurons with well-aligned visual and auditory receptive fields tended to encode multisensory space in a common reference frame. This suggests that area VIP forms part of a neuronal circuit involved in computing a modality-invariant representation of external space.
ABSTRACT: The position of gaze (eye plus head position) relative to the body is known to alter the perceived locations of sensory targets. This effect suggests that perceptual space is at least partially coded in a gaze-centered reference frame. However, the direction of the effects reported has not been consistent. Here, we investigate the cause of a discrepancy between reported directions of shift in tactile localization related to head position. We demonstrate that head eccentricity can cause errors in touch localization either in the same direction as the head is turned or in the opposite direction, depending on the procedure used. When head position is held eccentric during both the presentation of a touch and the response, there is a shift in the direction opposite to the head. When the head is returned to center before reporting, the shift is in the same direction as head eccentricity. We rule out a number of possible explanations for the difference and conclude that when the head is moved between touch and response, the touch is coded in a predominantly gaze-centered reference frame, whereas when the head remains stationary, a predominantly body-centered reference frame is used. The mechanism underlying these displacements in perceived location is proposed to involve an underestimated gaze signal. We propose a model demonstrating how this single neural error could cause localization errors in either direction depending on whether the gaze or body midline is used as a reference. This model may be useful in explaining gaze-related localization errors in other modalities.
Experimental Brain Research 09/2012; 222(4):437-45.
ABSTRACT: Numerosity, the number of elements in a set, is an abstract quantitative category. As such, it is independent of the sensory modality of its elements, i.e., supramodal. Because neuronal numerosity selectivity had never been compared directly across different sensory modalities, it remained unclear whether and where single neurons encode numerosity irrespective of the items' modality. Here, monkeys were trained to discriminate both the number of auditory sounds and the number of visual items within the same session. While the monkeys performed this task, the activity of neurons was recorded in the lateral prefrontal cortex and the ventral intraparietal area, structures critically involved in numerical cognition. Groups of neurons in both areas encoded either the number of auditory pulses, the number of visual items, or both. The finding of neurons responding to numerosity irrespective of the sensory modality supports the idea of a nonverbal, supramodal neuronal code of numerical quantity in the primate brain.
Proceedings of the National Academy of Sciences 07/2012; 109(29):11860-5.
ABSTRACT: Many neurons in the macaque ventral intraparietal area (VIP) are multimodal, i.e., they respond not only to visual but also to tactile, auditory, and vestibular stimulation. Anatomical studies have shown distinct projections between area VIP and a region of premotor cortex controlling head movements. A specific function of area VIP could be to guide movements toward, and/or to avoid, objects in near extrapersonal space. This behavioral role would require a consistent representation of visual motion within 3-D space and enhanced activity for nearby motion signals. Accordingly, in the present study we investigated whether neurons in area VIP are sensitive to moving visual stimuli containing depth signals from horizontal disparity. We recorded single-unit activity from area VIP of two awake, behaving monkeys (Macaca mulatta) fixating a central target on a projection screen. Sensitivity of neurons to horizontal disparity was assessed by presenting large-field moving images (random dot fields) stereoscopically to the two eyes by means of LCD shutter goggles synchronized with the stimulus computer. During an individual trial, stimuli had one of seven different disparity values ranging from 3° uncrossed (far) to 3° crossed (near) disparity in 1° steps. Stimuli moved at constant speed in all simulated depth planes. Different disparity values were presented across trials in pseudo-randomized order. Sixty-one percent of the motion-sensitive cells had a statistically significant selectivity for the horizontal disparity of the stimulus (p < 0.05, distribution-free ANOVA). Seventy-five percent of them preferred crossed-disparity values, i.e., moving stimuli in near space, with the highest mean activity for the nearest stimulus. At the population level, the preferred direction of visual stimulus motion was not affected by horizontal disparity.
Thus, our findings are in agreement with the proposed behavioral role of area VIP in representing movement in near extrapersonal space.
Frontiers in Behavioral Neuroscience 01/2013; 7:8.