The Role of Stereo Vision in Visual–Vestibular Integration

Max-Planck Institute for Biological Cybernetics, Spemannstrasse 38, Tübingen 72076, Germany.
Seeing and Perceiving (Impact Factor: 1.32). 09/2011; 24(5):453-70. DOI: 10.1163/187847511X588070
Source: PubMed


Self-motion through an environment stimulates several sensory systems, including the visual system and the vestibular system. Recent work in heading estimation has demonstrated that visual and vestibular cues are typically integrated in a statistically optimal manner, consistent with Maximum Likelihood Estimation predictions. However, there has been some indication that cue integration may be affected by characteristics of the visual stimulus. Therefore, the current experiment evaluated whether presenting optic flow stimuli stereoscopically, rather than presenting both eyes with the same image (binocularly), affects combined visual–vestibular heading estimates.
Participants performed a two-interval forced-choice task in which they were asked which of two presented movements was more rightward. They were presented with either visual cues alone, vestibular cues alone, or both cues combined. Measures of reliability were obtained for both the binocular and stereoscopic conditions.
Group level analyses demonstrated that when stereoscopic information was available there was clear evidence of optimal integration, yet when only binocular information was available weaker evidence of cue integration was observed. Exploratory individual analyses demonstrated that for the stereoscopic condition 90% of participants exhibited optimal integration, whereas for the binocular condition only 60% of participants exhibited results consistent with optimal integration. Overall, these findings suggest that stereo vision may be important for self-motion perception, particularly under combined visual–vestibular conditions.
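The Maximum Likelihood Estimation prediction referenced above has a simple closed form: each cue is weighted by its relative reliability (inverse variance), and the combined variance is the harmonic combination of the single-cue variances, so the combined estimate is always at least as reliable as the better single cue. A minimal sketch of these generic formulas (the threshold values below are illustrative, not data from the study):

```python
# Maximum Likelihood Estimation (MLE) prediction for two-cue integration.
# Each cue contributes an estimate with standard deviation sigma; the optimal
# combined estimate weights the cues by inverse variance.

def mle_combine(est_vis, sigma_vis, est_vest, sigma_vest):
    """Return the MLE combined estimate and its predicted standard deviation."""
    w_vis = (1 / sigma_vis**2) / (1 / sigma_vis**2 + 1 / sigma_vest**2)
    w_vest = 1 - w_vis
    est_comb = w_vis * est_vis + w_vest * est_vest
    # Predicted combined variance is the harmonic combination of the two:
    var_comb = (sigma_vis**2 * sigma_vest**2) / (sigma_vis**2 + sigma_vest**2)
    return est_comb, var_comb**0.5

# Illustrative (made-up) single-cue heading estimates and thresholds, in degrees:
est, sigma = mle_combine(2.0, 3.0, 0.0, 4.0)
# The predicted combined threshold is lower than either single-cue threshold.
```

At the group level, evidence for optimal integration amounts to the empirically measured combined-cue threshold matching this prediction, rather than simply tracking the better single cue.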

Available from: John S Butler
    • "Specifically, Gu and colleagues (2007, 2008) showed a functional link between medial superior temporal (MST) activity and behavioral heading discrimination for purely vestibular signals, which suggests that vestibular cues can be processed in a unisensory manner as well as in a multisensory manner, akin to other sensory modalities. In accord with these results in animal models, recent human behavioral studies have shown that vestibular-alone cues can be used to accurately discriminate heading (Telford et al. 1995; Ohmi 1996; Fetsch et al. 2009; Butler et al. 2010; de Winkel et al. 2010; MacNeilage et al. 2010), although some evidence also points to interference effects from entirely veridical vestibular inputs under certain multisensory stimulation conditions (Telford et al. 1995; de Winkel et al. 2010; Butler et al. 2011). "
    ABSTRACT: The perception of self-motion is a product of the integration of information from both visual and non-visual cues, to which the vestibular system is a central contributor. It is well documented that vestibular dysfunction leads to impaired movement and balance, dizziness and falls, and yet our knowledge of the neuronal processing of vestibular signals remains relatively sparse. In this study, high-density electroencephalographic recordings were deployed to investigate the neural processes associated with vestibular detection of changes in heading. To this end, a self-motion oddball paradigm was designed. Participants were translated linearly 7.8 cm on a motion platform using a one-second motion profile, at a 45° angle leftward or rightward of straight ahead. These headings were presented with a stimulus probability of 80–20%. Participants responded via button-press when they detected the infrequent direction change. Event-related potentials (ERPs) were calculated in response to the standard (80%) and target (20%) movement directions. Statistical parametric mapping showed that ERPs to standard and target movements differed significantly from 490 to 950 ms post-stimulus. Topographic analysis showed that this difference had a typical P3 topography. Individual-participant bootstrap analysis revealed that 93.3% of participants exhibited a clear P3 component. These results indicate that a perceived change in vestibular heading can readily elicit a P3 response, wholly similar to that evoked by oddball stimuli presented in other sensory modalities. This vestibular-evoked P3 response may provide a readily and robustly detectable objective measure for the evaluation of vestibular integrity in various disease models.
    Full-text · Article · Mar 2012 · Experimental Brain Research
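Reduced to its core operation, the ERP computation described above is condition-wise averaging of time-locked epochs, followed by a target-minus-standard difference wave that isolates the oddball (P3-like) effect. The sketch below is a generic toy illustration of that step (single channel, made-up numbers), not the authors' analysis pipeline:

```python
def average_erp(epochs):
    """Average a list of equal-length EEG epochs (one per trial) into an ERP."""
    n = len(epochs)
    return [sum(samples) / n for samples in zip(*epochs)]

# Toy single-channel epochs (3 time samples each), split by condition:
standard_epochs = [[0.0, 1.0, 0.5], [0.0, 3.0, 0.5]]  # frequent heading (80%)
target_epochs = [[0.0, 4.0, 2.0], [0.0, 6.0, 2.0]]    # rare heading (20%)

erp_standard = average_erp(standard_epochs)
erp_target = average_erp(target_epochs)

# Target-minus-standard difference wave isolates the oddball effect:
difference = [t - s for t, s in zip(erp_target, erp_standard)]
```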
    ABSTRACT: When walking through space, both dynamic visual information (optic flow) and body-based information (proprioceptive and vestibular) jointly specify the magnitude of distance travelled. While recent evidence has demonstrated the extent to which each of these cues can be used independently, less is known about how they are integrated when simultaneously present. Many studies have shown that sensory information is integrated using a weighted linear sum, yet little is known about whether this holds true for the integration of visual and body-based cues for travelled distance perception. In this study using Virtual Reality technologies, participants first travelled a predefined distance and subsequently matched this distance by adjusting an egocentric, in-depth target. The visual stimulus consisted of a long hallway and was presented in stereo via a head-mounted display. Body-based cues were provided either by walking in a fully tracked free-walking space (Exp. 1) or by being passively moved in a wheelchair (Exp. 2). Travelled distances were provided either through optic flow alone, body-based cues alone or through both cues combined. In the combined condition, visually specified distances were either congruent (1.0×) or incongruent (0.7× or 1.4×) with distances specified by body-based cues. Responses reflect a consistent combined effect of both visual and body-based information, with an overall higher influence of body-based cues when walking and a higher influence of visual cues during passive movement. When comparing the results of Experiments 1 and 2, it is clear that both proprioceptive and vestibular cues contribute to travelled distance estimates during walking. These observed results were effectively described using a basic linear weighting model.
    No preview · Article · Mar 2012 · Experimental Brain Research
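The basic linear weighting model can be sketched as below. The weights here are illustrative free parameters (in the study they would be estimated from the response data), with a larger body-based weight standing in for the walking condition and a smaller one for passive movement:

```python
def combined_distance(d_visual, d_body, w_body):
    """Weighted linear sum of visually and body-specified travelled distances.
    w_body is the relative weight of body-based cues (0..1)."""
    return w_body * d_body + (1 - w_body) * d_visual

# Congruent condition: both cues specify 10 m, so any weighting returns 10 m.
assert abs(combined_distance(10.0, 10.0, 0.7) - 10.0) < 1e-9

# Incongruent condition: visual gain 1.4x (14 m) vs. body-based cues (10 m).
# A higher body-based weight (walking) pulls the estimate toward 10 m;
# a lower one (passive movement) pulls it toward the visual 14 m.
walking = combined_distance(14.0, 10.0, 0.7)
passive = combined_distance(14.0, 10.0, 0.3)
```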
    ABSTRACT: When illusory self-motion is induced in a stationary observer by optic flow, the perceived distance traveled is generally overestimated relative to the distance of a remembered target (Redlick, Harris, & Jenkin, 2001): subjects feel they have gone further than the simulated distance and indicate that they have arrived at a target's previously seen location too early. In this article we assess how the radial and laminar components of translational optic flow contribute to the perceived distance traveled. Subjects monocularly viewed a target presented in a virtual hallway wallpapered with stripes that periodically changed color to prevent tracking. The target was then extinguished and the visible area of the hallway shrank to an oval region 40° (h) × 24° (v). Subjects either continued to look centrally or shifted their gaze eccentrically, thus varying the relative amounts of radial and laminar flow visible. They were then presented with visual motion compatible with moving down the hallway toward the target and pressed a button when they perceived that they had reached the target's remembered position. Data were modeled by the output of a leaky spatial integrator (Lappe, Jenkin, & Harris, 2007). The sensory gain varied systematically with viewing eccentricity while the leak constant was independent of viewing eccentricity. Results were modeled as the linear sum of separate mechanisms sensitive to radial and laminar optic flow. Results are compatible with independent channels for processing the radial and laminar flow components of optic flow that add linearly to produce large but predictable errors in perceived distance traveled.
    Full-text · Article · Sep 2012 · Journal of Vision
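A leaky spatial integrator of this kind accumulates perceived distance from optic flow with a sensory gain k while the running total decays at leak rate α, i.e. dp/dx = k − α·p, which integrates to p(x) = (k/α)(1 − e^(−αx)). The sketch below is a generic illustration of that form with made-up parameter values, not the fitted model from the study:

```python
import math

def leaky_integrator(x_total, k, alpha, dx=0.001):
    """Euler-integrate dp/dx = k - alpha * p over travelled distance x_total."""
    p = 0.0
    for _ in range(round(x_total / dx)):
        p += (k - alpha * p) * dx
    return p

def leaky_closed_form(x, k, alpha):
    """Closed-form solution: p(x) = (k / alpha) * (1 - exp(-alpha * x))."""
    return (k / alpha) * (1.0 - math.exp(-alpha * x))

# Illustrative parameters: with gain k > 1, perceived distance initially
# outruns actual distance, the signature of reaching the remembered target
# location "too early".
p_sim = leaky_integrator(10.0, k=1.5, alpha=0.05)
p_true = leaky_closed_form(10.0, 1.5, 0.05)
```

In this framework, the eccentricity-dependent gain with a fixed leak reported above corresponds to varying k across viewing conditions while α stays constant.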