The movement of motion-defined contours can bias perceived position.
ABSTRACT: Illusory position shifts induced by motion suggest that motion processing can interfere with perceived position, perhaps because accurate position representation is lost over successive visual processing steps. We found that complex motion patterns, which can only be extracted at a global level by pooling and segmenting local motion signals and integrating over time, can influence perceived position. We used motion-defined Gabor patterns containing motion-defined boundaries, which themselves moved over time. This 'motion-defined motion' induced position biases of up to 0.5 degrees, much larger than has been found with luminance-defined motion. The size of the shift correlated with how detectable the direction of the motion-defined motion was, suggesting that the bias increased with the magnitude of this complex directional signal. However, positional shifts occurred even when participants were not aware of the direction of the motion-defined motion. The perceptual position shift was greatly reduced, but not eliminated, when the position judgement was made relative to the location of a static luminance-defined square. These results suggest that motion-induced position shifts result from general mechanisms that match dynamic object properties with spatial location.
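As a rough illustration of the kind of stimulus the abstract describes, the sketch below builds a movie of binary noise in which alternating stripes carry opposite local motion, so the stripe boundaries are defined only by motion, and the stripe pattern itself drifts over time ('motion-defined motion'). This is a simplified square-wave version; the study used Gabor (Gaussian-windowed sinusoidal) modulation envelopes, and all parameter names and values here are invented for illustration.

```python
import numpy as np

def motion_defined_frames(n_frames=8, size=64, stripe_period=16,
                          carrier_step=2, envelope_step=0.5):
    """Frames of binary noise whose stripes move left/right; the stripe
    boundaries themselves drift over time (motion-defined motion)."""
    rng = np.random.default_rng(0)
    noise = rng.integers(0, 2, (size, size)).astype(float)
    cols = np.arange(size)
    frames = []
    for t in range(n_frames):
        # alternating stripes of opposite motion direction; shifting the
        # stripe phase over time moves the motion-defined boundaries
        stripe_phase = t * envelope_step
        rightward = ((cols + stripe_phase) // stripe_period) % 2 == 0
        frame = np.where(rightward[None, :],
                         np.roll(noise, t * carrier_step, axis=1),
                         np.roll(noise, -t * carrier_step, axis=1))
        frames.append(frame)
    return np.stack(frames)

movie = motion_defined_frames()
print(movie.shape)  # (8, 64, 64)
```

Note that any single frame is just static noise; both the boundaries and their displacement exist only across frames, which is why extracting them requires pooling local motion signals and integrating over time.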
Available from: Szonya Durant, May 28, 2015
Source available from: Alan Johnston
ABSTRACT: Objects in motion appear shifted in space. For global motion stimuli we can ask whether the shift depends on the local or the global motion. We constructed arrays of randomly oriented, Gaussian-enveloped drifting sine gratings (dynamic Gabors) whose speeds were set such that the normal component of each element's motion was consistent with a single global velocity. The array appears shifted in space in the direction of the global motion. The size of the shift is the same as for arrays of uniformly oriented dynamic Gabors moving in the same direction at the same global speed. Arrays made up of vertically oriented gratings whose speeds were set to the horizontal component of the random array elements were shifted less far. This shows that motion-induced position shifts of coherently moving surface patches are generated after the completion of the global motion computation.
Journal of Vision 12/2009; 9(13):8.1-8. DOI: 10.1167/9.13.8 · 2.73 Impact Factor
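The construction above — setting each randomly oriented element's drift speed to the normal component of one global velocity — can be sketched numerically. Each grating only signals motion along its normal (the aperture problem), so its speed is the projection of the global velocity onto that normal; recovering the global velocity from the set of constraints is a least-squares intersection. Variable names are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

global_velocity = np.array([2.0, 0.0])   # e.g. 2 deg/s rightward global motion

n_gabors = 8
# random carrier orientations; each element's drift direction is the
# normal to its stripes
normal_angles = rng.uniform(0, np.pi, n_gabors)
normals = np.stack([np.cos(normal_angles), np.sin(normal_angles)], axis=1)

# each element drifts at the projection of the global velocity onto its normal
drift_speeds = normals @ global_velocity

# the constraint lines v . n = s all intersect at the global velocity;
# recover it by least squares (intersection of constraints)
recovered, *_ = np.linalg.lstsq(normals, drift_speeds, rcond=None)
print(np.allclose(recovered, global_velocity))  # True
```

The finding that the position shift follows this recovered global velocity, not the individual drift speeds, is what places the effect after the global motion computation.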
ABSTRACT: The perceived position of stationary objects can appear shifted in space due to the presence of motion in another part of the visual field (motion drag). We investigated this phenomenon with global-motion Gabor arrays. These arrays consist of randomly oriented Gabors (Gaussian-windowed sinusoidal luminance modulations) whose speeds are set such that the normal component of each Gabor's motion is consistent with a single 2D global velocity. Global motion arrays were shown to alter the perceived position of nearby stationary objects. The size of this shift was the same as that induced by arrays of Gabors uniformly oriented in the direction of global motion and drifting at the global motion speed. Both types of array were robust to large changes in array density and exhibited the same time course of effect. The motion drag induced by the global motion arrays was consistent with the estimated 2D global velocity, rather than with the component of the local velocities in the global motion direction. This suggests that the motion signal that induces motion drag originates at or after the stage at which local motion signals are integrated to produce a global motion estimate.
Journal of Vision 08/2010; 10(5):14. DOI: 10.1167/10.5.14 · 2.73 Impact Factor
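For readers unfamiliar with the stimulus, a Gabor is just a sinusoidal luminance grating multiplied by a Gaussian window, and "drifting" it means advancing the carrier phase over time. A minimal sketch, with illustrative parameter values not taken from the experiments:

```python
import numpy as np

def gabor(size=64, wavelength=8.0, orientation=0.0, sigma=10.0, phase=0.0):
    """Return a size x size Gabor patch with luminance values in [-1, 1]."""
    half = size // 2
    y, x = np.mgrid[-half:half, -half:half]
    # rotate coordinates so the carrier grating has the given orientation
    xr = x * np.cos(orientation) + y * np.sin(orientation)
    carrier = np.cos(2 * np.pi * xr / wavelength + phase)       # sinusoid
    envelope = np.exp(-(x**2 + y**2) / (2 * sigma**2))          # Gaussian window
    return carrier * envelope

# drifting the grating = advancing the carrier phase frame by frame,
# while the Gaussian envelope (and hence the patch location) stays fixed
frame0 = gabor(phase=0.0)
frame1 = gabor(phase=0.5)
print(frame0.shape)  # (64, 64)
```

Because the envelope never moves, any perceived displacement of the patch or of nearby objects is illusory, which is what makes these stimuli useful for isolating motion drag.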
ABSTRACT: Visual figures may be distinguished based on elementary motion or on higher-order, non-Fourier features, and flies track both. The canonical elementary motion detector, a compact computation of Fourier motion direction and amplitude, can also encode higher-order signals, provided elaborate preprocessing. However, the way in which a fly tracks a moving figure containing both elementary and higher-order signals has not been investigated. Using a novel white-noise approach, we demonstrate that (1) the composite response to an object containing both elementary motion (EM) and uncorrelated higher-order figure motion (FM) reflects the linear superposition of each component; (2) the EM-driven component is velocity-dependent, whereas the FM component is driven by retinal position; (3) retinotopic variations in EM and FM responses differ from one another; (4) the FM subsystem superimposes saccadic turns upon smooth pursuit; and (5) the two systems in combination are necessary and sufficient to predict the full range of figure-tracking behaviors, including those that generate no EM cues at all. This analysis requires extending the model that fly motion vision is based solely on simple elementary motion detectors, and it provides a novel method for characterizing the subsystems responsible for the pursuit of visual figures.
Current Biology 02/2012; 22(6):482-7. DOI: 10.1016/j.cub.2012.01.044 · 9.92 Impact Factor
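The logic of the white-noise approach can be sketched in a toy linear model: if the composite response is the superposition of two subsystems driven by uncorrelated white-noise inputs, cross-correlating the response with each input recovers that subsystem's kernel in isolation. The kernels below are invented placeholders, not measured fly data, and the simulation assumes pure linearity.

```python
import numpy as np

rng = np.random.default_rng(1)
T, K = 20000, 15   # samples of stimulation; kernel length

em_kernel = np.exp(-np.arange(K) / 4.0)         # stand-in for the EM subsystem
fm_kernel = 0.5 * np.exp(-np.arange(K) / 8.0)   # stand-in for the FM subsystem

em_input = rng.standard_normal(T)   # white-noise EM stimulus
fm_input = rng.standard_normal(T)   # uncorrelated white-noise FM stimulus

# composite response = linear superposition of the two subsystem responses
response = (np.convolve(em_input, em_kernel)[:T]
            + np.convolve(fm_input, fm_kernel)[:T])

# first-order kernel estimates by cross-correlating the single composite
# response with each input; uncorrelated inputs keep the estimates separate
est_em = np.array([np.dot(response[k:], em_input[:T - k]) / (T - k)
                   for k in range(K)])
est_fm = np.array([np.dot(response[k:], fm_input[:T - k]) / (T - k)
                   for k in range(K)])

print(np.max(np.abs(est_em - em_kernel)),
      np.max(np.abs(est_fm - fm_kernel)))   # both small, up to sampling noise
```

This separability under uncorrelated inputs is what lets a single behavioral recording characterize the velocity-driven EM and position-driven FM subsystems at the same time.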