Article

Visual Guidance of Smooth-Pursuit Eye Movements: Sensation, Action, and What Happens in Between

Howard Hughes Medical Institute, Department of Physiology, and W.M. Keck Foundation Center for Integrative Neuroscience, University of California, San Francisco, San Francisco, CA 94143-0444, USA.
Neuron (Impact Factor: 15.98). 05/2010; 66(4):477-91. DOI: 10.1016/j.neuron.2010.03.027
Source: PubMed

ABSTRACT: Smooth-pursuit eye movements transform 100 ms of visual motion into a rapid initiation of smooth eye movement followed by sustained accurate tracking. Both the mean and variation of the visually driven pursuit response can be accounted for by the combination of the mean tuning curves and the correlated noise within the sensory representation of visual motion in extrastriate visual area MT. Sensory-motor and motor circuits have both housekeeping and modulatory functions, implemented in the cerebellum and the smooth eye movement region of the frontal eye fields. The representation of pursuit is quite different in these two regions of the brain, but both regions seem to control pursuit directly with little or no noise added downstream. Finally, pursuit exhibits a number of voluntary characteristics that happen on short timescales. These features make pursuit an excellent exemplar for understanding the general properties of sensory-motor processing in the brain.
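The abstract's claim that pursuit behavior can be read out from MT tuning curves plus response noise can be illustrated with a toy population-decoding sketch. All numbers here (unit count, tuning width, peak rate, noise level) are illustrative placeholders, not values from the article, and the noise is drawn independently per unit rather than with the correlated structure the article emphasizes:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical population of direction-tuned "MT" units with Gaussian
# tuning curves; preferred directions tile 0-360 deg.
n_units = 64
preferred = np.linspace(0.0, 360.0, n_units, endpoint=False)
sigma = 40.0  # tuning width in deg (illustrative value)

def tuning(direction):
    """Mean response of each unit to a motion direction (spikes/s)."""
    d = np.abs((preferred - direction + 180.0) % 360.0 - 180.0)
    return 30.0 * np.exp(-0.5 * (d / sigma) ** 2)

def decode(rates):
    """Vector-average readout: population estimate of motion direction."""
    ang = np.deg2rad(preferred)
    x = np.sum(rates * np.cos(ang))
    y = np.sum(rates * np.sin(ang))
    return np.rad2deg(np.arctan2(y, x)) % 360.0

# Noisy single-trial responses: mean tuning plus additive Gaussian noise.
# Correlated noise across units would inflate the trial-to-trial scatter
# of the decoded direction relative to this independent-noise case.
true_dir = 90.0
trials = np.array([decode(tuning(true_dir) + rng.normal(0.0, 3.0, n_units))
                   for _ in range(200)])
print(trials.mean(), trials.std())
```

The mean decoded direction lands near the true direction, while the trial-to-trial scatter plays the role of the behavioral variation that the article traces back to sensory noise.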

    • "When considering these predictions, it is important to consider the limitations of our model—some of which are similar to those of previous feedforward network models (Blohm 2012; Blohm et al. 2009). First, the network only performs the transformation for the initiation, or "open-loop," portion of smooth pursuit (Blohm and Lefèvre 2010; see, e.g., Ilg 2008 or Lisberger 2010). Therefore, the transformation during the minimization of retinal slip during ongoing smooth pursuit (i.e., once pursuit is driven primarily by extraretinal signals) is beyond the scope of this model."
    ABSTRACT: Smooth pursuit eye movements are driven by retinal motion and enable us to view moving targets with high acuity. Complicating the generation of these movements is the fact that different eye and head rotations can produce different retinal stimuli while giving rise to identical smooth pursuit trajectories. However, because our eyes accurately pursue targets regardless of eye and head orientation (Blohm and Lefèvre 2010), the brain must somehow take these signals into account. To learn about the neural mechanisms potentially underlying this visual-to-motor transformation, we trained a physiologically inspired neural network model to combine 2D retinal motion signals with 3D eye and head orientation and velocity signals to generate a spatially correct 3D pursuit command. We then simulated conditions of (1) head roll-induced ocular counter-roll, (2) oblique gaze-induced retinal rotations, (3) eccentric gazes (invoking the half-angle rule), and (4) optokinetic nystagmus to investigate how units in the intermediate layers of the network accounted for different 3D constraints. Simultaneously, we simulated electrophysiological recordings (visual and motor tunings) and microstimulation experiments to quantify the reference frames of signals at each processing stage. We found a gradual retinal-to-intermediate-to-spatial feedforward transformation through the hidden layers. Our model is the first to describe the general 3D transformation for smooth pursuit mediated by eye- and head-dependent gain modulation. Based on several testable experimental predictions, our model provides a mechanism by which the brain could perform the 3D visuomotor transformation for smooth pursuit. Copyright © 2014, Journal of Neurophysiology.
    Journal of Neurophysiology 12/2014; 113(5):jn.00273.2014. DOI:10.1152/jn.00273.2014 · 3.04 Impact Factor
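The gain-modulation mechanism this abstract describes — hidden units whose visual drive is multiplicatively scaled by eye/head signals — can be sketched in a few lines. This is not the cited model: the weights below are random placeholders, the layer sizes are arbitrary, and only eye orientation (not head signals or velocity) is included, purely to show the multiplicative gain-field structure:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy gain-field layer: each hidden unit's response to 2D retinal motion
# is scaled multiplicatively by a sigmoidal function of 3D eye
# orientation, the kind of eye-dependent gain modulation the abstract
# credits with the visuomotor transformation.
n_hidden = 32
w_retinal = rng.normal(size=(n_hidden, 2))  # weights on 2D retinal motion
w_eye = rng.normal(size=(n_hidden, 3))      # weights on 3D eye orientation

def hidden_layer(retinal_motion, eye_orientation):
    visual = w_retinal @ retinal_motion                       # visual drive
    gain = 1.0 / (1.0 + np.exp(-(w_eye @ eye_orientation)))   # sigmoidal gain
    return gain * visual                                      # multiplicative modulation

out = hidden_layer(np.array([1.0, 0.5]), np.array([0.1, -0.2, 0.0]))
print(out.shape)
```

In a trained network, a downstream readout of such gain-modulated units can implement the retinal-to-spatial coordinate change, because the same retinal input yields different hidden-layer patterns for different eye orientations.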
    • "Makin and Poliakoff (2011) elaborated on the tracking hypothesis. They noted that the oculomotor system is relatively well understood on cognitive and neural levels (for reviews see Barnes, 2008; Lisberger, 2010), and this knowledge can be used to help understand motion extrapolation. One putative feature of the pursuit system is that velocity information can be retained in a short-term velocity memory store when targets disappear. "
    ABSTRACT: People can estimate the current position of an occluded moving target. This is called motion extrapolation, and it has been suggested that performance in such tasks is mediated by the smooth-pursuit system. Experiment 1 contrasted a standard position extrapolation task with a novel number extrapolation task. In the position extrapolation task, participants saw a horizontally moving target become occluded, and then responded when they thought the target had reached the end of the occluder. Here the stimuli can be tracked with pursuit eye movements. In the number extrapolation task, participants saw a rapid countdown on the screen that disappeared before reaching zero. Participants responded when they thought the hidden counter would have reached zero. Although this stimulus cannot be tracked with the eyes, performance was comparable on both tasks. The response times were also found to be correlated. Experiments 2 and 3 extended these findings, using extrapolation through color space as well as number space, while Experiment 4 found modest evidence for similarities between color and number extrapolation. Although more research is certainly needed, we propose that a common rate controller guides extrapolation through physical space and feature space. This functions like the velocity store module of the smooth-pursuit system, but with a broader function than previously envisaged.
    Journal of Vision 11/2014; 14(13). DOI:10.1167/14.13.10 · 2.73 Impact Factor
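The velocity-store idea behind this abstract — a remembered rate that keeps updating a predicted position while the target is hidden — reduces to a simple integrator. The function name, signature, and decay constant below are illustrative assumptions, not quantities from the cited study:

```python
import math

# Toy "velocity store": after the target disappears, a stored velocity
# estimate continues to drive the predicted position. With decay=0 the
# store holds velocity perfectly; a positive decay models gradual fading
# of the velocity memory.
def extrapolate(last_position, stored_velocity, occlusion_time, decay=0.0):
    """Predicted position (deg) after `occlusion_time` s of occlusion."""
    if decay == 0.0:
        return last_position + stored_velocity * occlusion_time
    # A velocity decaying as v0*exp(-decay*t) integrates to a
    # saturating position offset of v0*(1 - exp(-decay*t))/decay.
    return last_position + stored_velocity * (
        1.0 - math.exp(-decay * occlusion_time)) / decay

# A target at 10 deg moving at 5 deg/s, occluded for 2 s:
print(extrapolate(10.0, 5.0, 2.0))  # perfect store -> 20.0
```

On the abstract's proposal, the same rate controller could integrate a rate through number or color space rather than physical position, which is what makes the tracked and untrackable tasks comparable.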
    • "…neuronal mechanisms (Lisberger, 2010)…"
    ABSTRACT: Perceptual learning improves detection and discrimination of relevant visual information in mature humans, revealing sensory plasticity. Whether visual perceptual learning affects motor responses is unknown. Here we implemented a protocol that enabled us to address this question. We tested a perceptual response (motion direction estimation, in which observers overestimate motion direction away from a reference) and a motor response (voluntary smooth pursuit eye movements). Perceptual training led to greater overestimation and, remarkably, it modified untrained smooth pursuit. In contrast, pursuit training did not affect overestimation in either pursuit or perception, even though observers in both training groups were exposed to the same stimuli for the same time period. A second experiment revealed that estimation training also improved discrimination, indicating that overestimation may optimize perceptual sensitivity. Hence, active perceptual training is necessary to alter perceptual responses, and an acquired change in perception suffices to modify pursuit, a motor response.
    Journal of Vision 07/2014; 14(8). DOI:10.1167/14.8.8 · 2.73 Impact Factor
