Article

The role of stereo vision in visual-vestibular integration.

Max Planck Institute for Biological Cybernetics, Spemannstrasse 38, 72076 Tübingen, Germany.
Seeing and Perceiving (Impact Factor: 1.14). 09/2011; 24(5):453-70. DOI: 10.1163/187847511X588070
Source: PubMed

ABSTRACT: Self-motion through an environment stimulates several sensory systems, including the visual and vestibular systems. Recent work on heading estimation has demonstrated that visual and vestibular cues are typically integrated in a statistically optimal manner, consistent with Maximum Likelihood Estimation (MLE) predictions. However, there has been some indication that cue integration may be affected by characteristics of the visual stimulus. The current experiment therefore evaluated whether presenting optic flow stimuli stereoscopically, rather than presenting both eyes with the same image (binocularly), affects combined visual-vestibular heading estimates. Participants performed a two-interval forced-choice task in which they judged which of two presented movements was more rightward. They were presented with visual cues alone, vestibular cues alone, or both cues combined, and measures of reliability were obtained for both the binocular and the stereoscopic conditions. Group-level analyses demonstrated clear evidence of optimal integration when stereoscopic information was available, but only weaker evidence of cue integration when the information was merely binocular. Exploratory individual analyses showed that 90% of participants exhibited optimal integration in the stereoscopic condition, whereas only 60% did so in the binocular condition. Overall, these findings suggest that stereo vision may be important for self-motion perception, particularly under combined visual-vestibular conditions.
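
For context, the Maximum Likelihood Estimation account tested here makes two quantitative predictions, which can be stated compactly. This is the textbook formulation, not an equation quoted from the paper; the symbols (S-hat for heading estimates, sigma for their standard deviations, "vv" for the combined visual-vestibular condition) are our notation:

```latex
% Reliability-weighted combination of the unimodal estimates:
\hat{S}_{vv} = w_{vis}\,\hat{S}_{vis} + w_{vest}\,\hat{S}_{vest},
\qquad
w_{vis} = \frac{1/\sigma_{vis}^{2}}{1/\sigma_{vis}^{2} + 1/\sigma_{vest}^{2}},
\qquad
w_{vest} = 1 - w_{vis}

% Variance reduction: the combined estimate is more reliable than
% either single cue, which is the signature of optimal integration:
\sigma_{vv}^{2} \;=\; \frac{\sigma_{vis}^{2}\,\sigma_{vest}^{2}}{\sigma_{vis}^{2} + \sigma_{vest}^{2}} \;\leq\; \min\!\left(\sigma_{vis}^{2},\, \sigma_{vest}^{2}\right)
```

"Clear evidence of optimal integration" in the abstract corresponds to the combined-condition reliability matching the second prediction, rather than merely equalling that of the better single cue.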

Related publications

ABSTRACT: The brain is able to determine angular self-motion from visual, vestibular, and kinesthetic information. There is compelling evidence that both humans and non-human primates integrate visual and inertial (i.e., vestibular and kinesthetic) information in a statistically optimal fashion when discriminating heading direction. In the present study, we investigated whether the brain integrates information about angular self-motion in a similar manner. Eight participants performed a two-interval forced-choice (2IFC) task in which they discriminated yaw rotations (2 s sinusoidal acceleration profiles) on peak velocity. Just-noticeable differences (JNDs) were determined as a measure of precision in unimodal inertial-only and visual-only trials, as well as in bimodal visual-inertial trials. The visual stimulus was a moving stripe pattern, synchronized with the inertial motion. The peak velocity of the comparison stimuli was varied relative to that of the standard stimulus. Individual analyses showed that the data of three participants exhibited an increase in bimodal precision, consistent with the optimal integration model, while the data of the other participants did not conform to maximum-likelihood integration schemes. We suggest that either the sensory cues were not perceived as congruent, that integration might be achieved with fixed weights, or that estimates of visual precision obtained from non-moving observers do not accurately reflect visual precision during self-motion.
Experimental Brain Research (Impact Factor: 2.17). 09/2013.
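
As a rough illustration of the precision test applied in these studies, the sketch below compares an observed bimodal JND against the MLE prediction computed from unimodal JNDs. The function name and all numbers are our own illustrative choices, not values from the paper:

```python
def predicted_bimodal_jnd(jnd_visual: float, jnd_inertial: float) -> float:
    """MLE prediction: squared JNDs (proportional to variances) combine
    as a harmonic sum, so the bimodal JND falls below both unimodal JNDs."""
    return (jnd_visual**2 * jnd_inertial**2
            / (jnd_visual**2 + jnd_inertial**2)) ** 0.5

# Hypothetical unimodal JNDs for peak yaw velocity (deg/s):
jnd_vis, jnd_inr = 4.0, 6.0
jnd_observed = 3.5  # hypothetical measured bimodal JND

prediction = predicted_bimodal_jnd(jnd_vis, jnd_inr)
print(f"predicted {prediction:.2f} deg/s, observed {jnd_observed:.2f} deg/s")
# Consistency with optimal integration requires the observed bimodal JND
# to be close to the prediction (here ~3.33 deg/s), i.e., smaller than
# the better unimodal JND; fixed-weight integration does not guarantee this.
```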

ABSTRACT: Vection is the illusion of self-motion in the absence of real physical movement. The aim of the present study was to analyze how multisensory inputs (visual and auditory) contribute to the perception of vection. Participants were seated in a stationary position in front of a large, curved projection display and were exposed to a virtual scene that constantly rotated around the yaw axis, simulating a 360° rotation. The virtual scene contained either only visual, only auditory, or a combination of visual and auditory cues. Additionally, simulated rotation speed (90°/s vs. 60°/s) and the number of sound sources (1 vs. 3) were varied for all three stimulus conditions. All participants were exposed to every condition in a randomized order. Data on vection latency, vection strength, the severity of motion sickness (MS), and postural steadiness were collected. Results revealed reduced vection onset latencies and increased vection strength when auditory cues were added to the visual stimuli, whereas MS and postural steadiness were not affected by the presence of auditory cues. Half of the participants reported experiencing auditorily induced vection, although the sensation was rather weak and less robust than visually induced vection. Results demonstrate that the combination of visual and auditory cues can enhance the sensation of vection.
Experimental Brain Research (Impact Factor: 2.17). 12/2013.

ABSTRACT: Recent research has provided evidence that visual and body-based cues (vestibular, proprioceptive, and efference copy) are integrated using a weighted linear sum during walking and passive transport. However, little is known about the specific weighting of visual information when combined with proprioceptive inputs alone, in the absence of vestibular information about forward self-motion. Therefore, in this study, participants walked in place on a stationary treadmill while dynamic visual information was updated in real time via a head-mounted display. The task required participants to travel a predefined distance and subsequently match this distance by adjusting an egocentric, in-depth target using a game controller. Travelled distance information was provided either through visual cues alone, proprioceptive cues alone, or both cues combined. In the combined-cue condition, the relationship between the two cues was manipulated by changing either the visual gain across trials (0.7×, 1.0×, 1.4×; Exp. 1) or the proprioceptive gain across trials (0.7×, 1.0×, 1.4×; Exp. 2). Results demonstrated an overall higher weighting of proprioception over vision. These weights were scaled, however, as a function of which sensory input provided more stable information across trials. Specifically, when visual gain was constantly manipulated, proprioceptive weights were higher than when proprioceptive gain was constantly manipulated. These results therefore reveal interesting characteristics of cue weighting within the context of unfolding spatio-temporal cue dynamics.
Experimental Brain Research (Impact Factor: 2.17). 06/2014.
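
The weighted-linear-sum model referenced in this abstract can be sketched in a few lines. The weight value and distances below are illustrative assumptions, not parameters fitted in the study:

```python
def perceived_distance(d_visual: float, d_proprio: float,
                       w_proprio: float) -> float:
    """Weighted linear combination of visually and proprioceptively
    specified travel distances; w_proprio lies in [0, 1]."""
    return w_proprio * d_proprio + (1.0 - w_proprio) * d_visual

walked = 10.0    # metres of walking in place (proprioceptive distance)
w_proprio = 0.7  # hypothetical proprioception-dominant weight

# Visual gains as manipulated across trials in Exp. 1 (0.7x, 1.0x, 1.4x):
for visual_gain in (0.7, 1.0, 1.4):
    d_vis = visual_gain * walked  # visually specified distance
    estimate = perceived_distance(d_vis, walked, w_proprio)
    print(f"visual gain {visual_gain}: perceived distance {estimate:.1f} m")
```

With a proprioceptive weight above 0.5, the perceived distance stays closer to the walked distance than to the gain-distorted visual distance, which matches the overall proprioceptive dominance the study reports.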
