978-1-6654-9032-0/23/$31.00 ©2023 IEEE
Human Orientation Perception during Transitions in the
Presence of Visual Cues
Abstract— Vestibular cues are critical for human perception of
self-motion, and, when available, visual cues also influence
these perceptions. However, it is poorly understood how
perception of self-motion is affected by transitions in the
presence of visual cues (e.g., when visual cues suddenly become
available, such as when a pilot flies out of clouds). To
investigate this, 11 subjects (3 female,
mean 25 ± 4 years) were seated and rotated about an Earth-
vertical yaw axis, while asked to report their perception of
angular rotation by pressing a left/right button every time they
felt like they had rotated 90 degrees to the left/right. A head
mounted display provided visual rotation cues (specifically
angular velocity cues only). When present, visual cues were
always congruent with inertial rotation. We used 4 different
visual cue conditions: no visual cues, visual angular velocity
cues, visual angular velocity cues transitioning to no visual cues,
and no visual cues transitioning to visual angular velocity cues.
We experimented with 2 different rotation profiles per visual
cue condition. Based on the timing between subject button press
inputs, we inferred their perception of angular velocity.
During and immediately after a sudden loss of visual cues, we
found a transition in perception of angular velocity on the order
of 30 seconds. When visual cues appeared after vestibular cues
had deviated from reality, there was a 10 second delay before
angular velocity perception converged to that associated with
the provided visual angular velocity cues. Both of these time
periods are longer than expected and longer than the time delay
associated with the psychophysical task.
Our results indicate that the brain does not discard the influence
of past visual motion cues immediately after suddenly losing
visual cues. Suddenly gaining visual cues is also associated with
some time delay in integrating the cues into a central perception
of motion. By quantifying the time course of self-motion
perception following transitions in the presence of visual cues,
we can better understand pilot spatial disorientation. For
example, a terrestrial pilot flying out of clouds or an astronaut
in the final stages of landing on the moon with dust blowback,
each experience a transition in the presence of visual cues
affecting spatial orientation perception.
TABLE OF CONTENTS
1. INTRODUCTION
2. METHODS
3. DATA PROCESSING
4. RESULTS
5. DISCUSSION
6. CONTRIBUTIONS
ACKNOWLEDGEMENTS
REFERENCES
BIOGRAPHY
1. INTRODUCTION
Spatial orientation in humans refers to our understanding of
where we are in space and our motion [1], [2]. While humans
generally perceive orientation well [3], unusual motions [4],
[5] or environments [6]–[9] can cause misperception of
orientation, or spatial disorientation. Spatial disorientation in
pilots is considered a leading source of aviation mishaps in
both fixed-wing and rotary-wing aircraft [10], [11]. One
example of spatial disorientation arises during take-off on a
commercial flight. During take-off, a commercial jet aircraft
must accelerate linearly. Such forward linear acceleration is
indistinguishable (to otoliths of the vestibular system and any
other graviceptors) from pitching backwards and can create a
sensation of pitch tilt while the aircraft is still level on the
runway. In such a scenario, the pilot must refrain from
pitching the aircraft downwards in order to counter the
spurious sensation of backwards tilt elicited by linear
acceleration.
We have a thorough understanding of the functionality and
limitations of the vestibular system [2], [3], [6], [12] and a
general understanding of typical spatial disorientation
illusions which occur when visual cues are unavailable [1],
[10], [13]–[15].

Jamie Voros
Ann & H.J. Smead Department of Aerospace Engineering Sciences
The University of Colorado, Boulder, CO
jamie.voros@colorado.edu

Torin K Clark
Ann & H.J. Smead Department of Aerospace Engineering Sciences
The University of Colorado, Boulder, CO
torin.clark@colorado.edu

However, our existing understanding lacks
the ability to accurately predict spatial orientation in
scenarios that occur in more operational environments [16].
For example, aviation mishaps are often associated with
scenarios where visual cues rapidly degrade (e.g., flying into
a cloud) despite the presence of flight instruments. This
motivates studying orientation perception in scenarios where
the presence of visual cues suddenly changes. To our
knowledge no existing study has sought to quantify motion
perception in humans when visual cues are suddenly gained
or lost.
Although humans use cues from multiple sensing modalities
in order to perceive orientation [17]–[21], a large body of
prior work has focused on the integration of only visual and
vestibular cues to predict orientation perception [20]–[23]. It
has been possible to accurately predict orientation perception
in many scenarios with only visual and vestibular cues [4],
[6], [24]–[29]. Therefore, the current study quantifies motion
perception during sudden transitions in the availability of
visual cues, considering vestibular and visual cues only.
2. METHODS
Subjects were seated on a Rotochair that rotates about an
Earth-vertical yaw axis only. Subjects wore a virtual reality
head mounted display (HMD), specifically an HTC Vive Pro with
Wireless Tracking. The experimental hardware is shown in
Figure 1. The Rotochair rumbled with a white noise profile in
order to help mask potential tactile cues. Similarly, subjects
heard auditory white noise via the HTC Vive Pro’s inbuilt
headphones to reduce external auditory cues. The experiment
took place in a darkroom to mitigate light leak, a known
limitation of the HTC Vive Pro HMD.
Figure 1: Model (not actual subject) sitting in Rotochair
with HTC Vive Pro with Wireless Tracking HMD donned.
Subject holding two HTC Vive Controllers used for button
press task. Head restraint to limit head movement during
testing. Testing occurred in a darkroom, the lights are on
(and operator’s chamber door open) for the purposes of
taking the photograph.
Psychophysical Task
Subjects were tasked with pressing a button every time they
felt they had rotated 90 degrees in one direction. Subjects had
a controller (with button) in each hand and could thus also
indicate in which direction they felt they had rotated. If they
felt they had rotated 90 degrees to the right (left) after the last
button press, they were instructed to press the button in their
right (left) hand. The task has been used successfully
previously in existing literature [30], [31]. In order to report
perceptions of starting or stopping rotation, subjects were
instructed to hold down the triggers on the back of the two
controllers when stationary and then to release the triggers
once they felt they were in (angular) motion.
Motion Profiles and Visual Cues
We limited motion to Earth-vertical yaw rotation in order to
isolate one sensory processing pathway. Subjects’ heads were
fixed in place with a head restraint (as shown in Figure 1)
such that they always faced the same direction as the
chair. Earth-vertical yaw motion primarily stimulates the
semicircular canals. We used several motion profiles and
ultimately present data from two motion profiles (shown in
Figure 2).
The motion profiles contained different accelerations rotating
to the left and to the right. Each subject who completed
testing experienced each motion profile 4 times, each time
with different visual cues. The order in which each subject
experienced each trial was randomized prior to the start of
testing.
Figure 2: Plots to show the two motion profiles used in the
present study. Top: a motion profile designed to trigger a
decay in perception during the no visual cues condition.
Bottom: a motion profile designed to be unpredictable, bi-
directional and require subjects to constantly be
considering their perception of motion (as opposed to
falling into a pattern of pressing the button consistently).
The four different visual cue conditions were as follows:
1. Optical flow delivered by the head mounted display
for the duration of the motion profile
2. No visual cues at all (HMD displaying black)
3. Optical flow for the initial part of the motion and
none for the remainder
4. No visual cues for the initial part of the motion
(HMD displaying black) and optical flow for the
remainder
Optical flow was delivered in the form of dots on the inside of
a virtual sphere within which the subject rotated (shown
in Figure 3). The dots moved across the HMD opposite to the
subject’s direction of rotation in order to generate a
sensation of motion. The visual cues delivered were always
congruent with the physical rotation of the subject, as our
goal was not to confuse subjects with erroneous visual
information. We chose to use a large number of identical,
small dots in order to stimulate sensation of angular velocity
only (specifically, we did not provide any angular position
cues).
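The velocity-only property of this stimulus can be sketched as follows: identical dots sampled uniformly on a sphere carry no positional landmarks, so rotating the field conveys only angular velocity. This is an illustrative sketch, not the study's implementation; the dot count, radius, and function names are our own choices.

```python
import numpy as np

def uniform_sphere_dots(n, radius=1.0, seed=0):
    """Sample n identical dots uniformly on a sphere around the viewer."""
    rng = np.random.default_rng(seed)
    v = rng.normal(size=(n, 3))                    # isotropic Gaussian samples...
    v /= np.linalg.norm(v, axis=1, keepdims=True)  # ...normalized = uniform on sphere
    return radius * v

def yaw_rotate(dots, angle_rad):
    """Rotate the dot field about the vertical (yaw) axis. Because the
    dots are identical, no angular-position landmark exists; only the
    rotation's velocity (frame-to-frame optical flow) is informative."""
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    R = np.array([[c, -s, 0.0],
                  [s,  c, 0.0],
                  [0.0, 0.0, 1.0]])
    return dots @ R.T
```

Since any yaw rotation of such a field is statistically indistinguishable from any other static configuration, the display provides angular velocity cues without angular position cues, matching the condition described above.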
Figure 3: View inside HMD while visual cues were being
delivered.
Experimental Procedure
This research complied with the American Psychological
Association Code of Ethics and was approved by the
Institutional Review Board at The University of Colorado
Boulder under protocol #19-002. Informed consent was
obtained from each participant. Eleven unique subjects
participated in the study, but not all subjects completed
testing. Subjects were between the ages of 18 and 40 years
because age over 40 is associated with reduced vestibular
functioning [32], [33]. Subjects self-reported no known
history of vestibular dysfunction and self-reported 20/20
vision (corrective contact lenses were accepted; glasses were
not, due to the physical constraints of the HMD). Subjects also
completed the motion sickness susceptibility questionnaire
and would have been screened out had they scored above the
90th percentile. No subjects were screened out as a result of
their motion sickness susceptibility score. However, two
subjects did not complete testing as a result of cybersickness.
Subjects completed 4 practice trials prior to the beginning of
testing to become familiar with the visual scene and practice
performing the psychophysical task. Subjects then completed
up to 24 test trials (not all data is presented here). Subjects
were asked for their sleepiness level and if they were feeling
motion sick after each trial. If a subject reported feeling
motion sick for 3 trials in a row, their testing session ended.
Subjects were able to take a break (and remove the HMD) at
any point between trials and most did so at least once during
testing.
In order to account for potential differences in perception due
to rotation direction, positive angular velocity was randomly
assigned to represent either clockwise or anticlockwise
rotation. The random assignment of positive angular velocity
representing motion in the clockwise or anticlockwise
direction was done per motion profile (and not per subject or
per trial). This means that each time a subject performed the
task for the same motion profile (with different visual cue
conditions), they rotated in the same direction. However, a
subject may have had positive angular velocity assignment to
anticlockwise for all trials with one motion profile but
positive angular velocity assignment to clockwise for all the
trials with another motion profile.
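The per-profile (rather than per-subject or per-trial) assignment described above amounts to a single random draw per motion profile. A minimal sketch, with illustrative names ("multistep 3" is a profile name from this study; the rest are hypothetical):

```python
import random

def assign_directions(profile_names, seed=42):
    """Randomly map positive angular velocity to clockwise (CW) or
    anticlockwise (CCW) once per motion profile, so a subject rotates
    in the same physical direction on every repeat of that profile,
    regardless of visual cue condition."""
    rng = random.Random(seed)  # seeded only so the sketch is reproducible
    return {name: rng.choice(("CW", "CCW")) for name in profile_names}
```

Because the draw is keyed on the profile, two trials of the same profile under different visual cue conditions always share a direction, while different profiles may differ, exactly as described above.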
3. DATA PROCESSING
Figure 4: Example plot showing each individual subject's
perception of angular motion (6 subjects). In green is the
average perception at each instant in time, with light
green as the standard error bounds.
In order to infer perception of angular velocity, we calculated
the time between each button press (or between trigger release,
signaling the perceived start of motion, and the following
button press). We divided 90 degrees by
this time to obtain the average angular velocity perception in
degrees per second, over the time period between successive
button presses. We then averaged angular velocity perception
at each instant in time across all subject reports per motion
profile per visual cue condition as shown in Figure 4. A two-
second smoothing filter (Gaussian) was applied to the
resulting average perception. Each visual cue condition per
motion profile had at least six subjects (although we tested 11
unique subjects in total) because not all study participants
completed the full course of testing.
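The inference step described above can be sketched as follows, assuming press times in seconds and signed press directions (+1 right, -1 left). The 2-second Gaussian smoothing mirrors the filter described in the text, though the authors' exact implementation is not specified:

```python
import numpy as np

def perceived_velocity(press_times, press_dirs):
    """Infer piecewise-constant perceived angular velocity (deg/s):
    each 90-degree report divided by the interval since the previous
    report (or since the trigger release marking perceived onset)."""
    dt = np.diff(press_times)
    return press_dirs[1:] * 90.0 / dt   # signed: +right, -left

def gaussian_smooth(y, dt, sigma_s=2.0):
    """Smooth a uniformly sampled (spacing dt seconds) perception trace
    with a Gaussian kernel of width sigma_s seconds (2 s, as in the text)."""
    half = int(np.ceil(4 * sigma_s / dt))   # kernel half-width in samples
    t = np.arange(-half, half + 1) * dt
    k = np.exp(-0.5 * (t / sigma_s) ** 2)
    k /= k.sum()                            # unit-area kernel
    return np.convolve(y, k, mode="same")
```

For example, presses at 2, 4, and 7 seconds after trigger release yield perceived velocities of 45, 45, and 30 deg/s over the corresponding intervals.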
Approximately half of our data had positive angular velocity
associated with clockwise motion and the other half with
anticlockwise motion. In order to collate data where the
assignment of positive angular velocity was not necessarily
in the same direction, we multiplied inferred angular velocity
by -1 where positive angular velocity had been assigned
anticlockwise motion. Qualitatively, we did not see any
differences in subject responses associated with which
direction (clockwise vs anticlockwise) they had completed
the task in.
4. RESULTS
Figure 5: Angular velocity and angular velocity
perception for the two control conditions (visual cues
throughout and no visual cues throughout) on top of
angular velocity perception during test condition (green).
6 subjects. The shaded part of the graph indicates where no
visual cues were provided during the test condition; the light
part shows where visual cues were provided. The sudden
transition
happens just before 50 seconds into the motion. Before the
transition, the test condition (green) tracks the visual cues
provided control condition (light yellow). Beyond the
sudden transition, we see the test condition (green) slowly
approach the perception of the no visual cue control
condition (dark blue) over the course of about 30 seconds.
As expected, on either side of the visual transition, motion
perception in each profile consistently follows the
corresponding control condition. For example, if a test
condition started with visual cues and ended without them,
perception begins by following the “with visual cues”
perceptual pattern and ends by following the “without visual
cues” perceptual pattern. Our data quantifies the time taken
to transition from one perceptual pattern to another. The
length of the transition period differs between the “with to
without” visual cues and “without to with” visual cues
conditions.
Figure 5 shows subjects’ average perceived motion for one test
condition in green (with to without visual cues) and both
control conditions in yellow and blue (with visual cues the
entire time and without visual cues the entire time,
respectively). During the visual cues to no visual cues
transition shown in Figure 5, it takes around 30 seconds for
the test condition (green) to shift from tracking the with
visual cues (yellow) control condition to the without visual
cues (blue) control condition. This is longer than the
transition period (or “perceptual decay”) [25] we would expect
had visual cues never been present in the first place.
During the no visual cues to visual cues transitions, we see an
approximately 10 second delay before motion perception moves
from within the error bounds of the initial (no visual cues)
condition to those of the latter (visual cues) condition. Based on
angular velocities of 40-60 degrees per second across visual
transitions and one click every 90 degrees of perceived
rotation, we anticipated less than a 2.5 second delay as a
limitation of the psychophysical task. We note that 10
seconds is substantially longer than the 2.5 second delay we
might expect as a result of the psychophysical task.
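The 2.5-second bound follows directly from the task geometry: a change in perceived velocity can go unreported for at most one inter-press interval, i.e., the time needed to accumulate 90 degrees of perceived rotation. A quick check at the stated 40-60 deg/s:

```python
def max_report_delay(omega_deg_s, click_angle_deg=90.0):
    """Worst-case reporting latency of the button-press task: one full
    inter-press interval at angular velocity omega_deg_s (deg/s)."""
    return click_angle_deg / omega_deg_s

slow = max_report_delay(40.0)   # 2.25 s at 40 deg/s
fast = max_report_delay(60.0)   # 1.50 s at 60 deg/s
```

Both values fall under 2.5 s, so the observed 10-second delay cannot be explained by the resolution of the reporting task alone.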
Therefore, subject’s perception does not undergo a
discontinuous transition when accurate visual cues are
suddenly gained during a period of spatial disorientation.
Not all data that we collected is presented in this paper. We
also collected data on several highly predictable motion
profiles. These motion profiles (one particular profile is
shown in Figure 7) did not result in spatial disorientation
during the no visual cues control condition. This means that
there was no difference between subjects’ orientation
perception in the two control conditions. The test condition
could not, therefore, transition between the two. We defined
no substantial difference in perception to be where average
perception of one condition remained within the error bounds
of another condition. For example, in Figure 7, the yellow
line (with visual cues control condition average) stays within
the bounds of the light blue shaded area (error bounds of the
no visual cues control condition). If a motion profile produces
no differences in perception between conditions preceding
and following the visual transition, we cannot expect a
transition in orientation perception for that motion.
Therefore, data where the control condition averages were
within the other’s error bounds were not used for this
analysis.
Figure 6: Two motion profiles and corresponding human
subject data collected (10 and 6 subjects respectively).
Both top and bottom are plots of the suddenly gaining
visual cues scenario. The top plot clearly shows the 10
second delay between gaining visual cues and orientation
perception matching orientation perception where the
visual cues had been available the whole time (the green
line takes 10 seconds to match the yellow line).
5. DISCUSSION
Based on the data from the present study, perception does not
change instantaneously following a sudden transition in visual
cues. After a sudden gain of visual cues, we see a gradual
convergence towards true motion. Similarly, after a sudden
loss, orientation perception does not decay as fast as we would
expect [20], [25], [34] when visual cues have previously been
present, indicating that past visual information may be used to
inform subsequent spatial orientation.
Existing models of orientation perceptions are not robust to
sudden changes in visual cues (as far as we are aware). Future
work will focus on integrating our results into existing
models of motion perception which do not currently
accurately predict orientation perception during sudden
visual transitions. Sensory-conflict-based models of
orientation perception [20], [25], [35], [36] may be updated
with the addition of low pass filtering in order to reconcile
the sudden changes in sensory information (large error terms
within the model). Dynamic reweighting of sensory cues may
also be necessary for accurate modelling of perception when
unexpected cues become available [37], [38].
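As one illustration of the low-pass-filtering idea (a sketch, not a validated model update), a first-order filter applied to the sensory conflict term would spread a sudden transition over a time constant; the tau used here is a hypothetical placeholder that the ~10 s and ~30 s transitions observed in this study would constrain.

```python
def lowpass(conflict, dt, tau=10.0):
    """First-order low-pass filter of a sensory-conflict signal.
    tau (seconds) is a hypothetical time constant, not a fitted value."""
    alpha = dt / (tau + dt)      # discrete-time smoothing factor
    y, out = 0.0, []
    for c in conflict:
        y += alpha * (c - y)     # exponential approach to the input
        out.append(y)
    return out
```

Under a step change in conflict (as at a sudden visual transition), the filtered error rises gradually rather than discontinuously, which is qualitatively the behavior our data suggest a model should reproduce.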
Figure 7: One of the highly consistent motion profiles we
collected data for (9 subjects). There is no substantial
difference in perception between the two control or test
conditions. We, therefore, do not see (and neither do we
expect to see) a change in perception following a visual
cue availability transition. We believe that there is no
perceptual decay in the no visual cue control condition
because the motion profile was too predictable and
subjects were simply pressing the button at a consistent
rate.
A limitation of our work, however, is that there is substantial
variation between subjects when visual cues are not
presented. As shown in Figure 5 and Figure 6, the
standard error tends to enlarge after visual cues are lost and
is larger when visual cues are never present, particularly for
consistent motion profiles. Further, we note that we have not
presented all of our experimental data: several of our motion
profiles appeared too predictable for our subjects.
Figure 7 shows a consistent motion profile where we would
expect to see signal decay when motion is perceived with
vestibular cues only. However, we note that our subjects were
able to fairly accurately continue to report a sensation of
motion. We hypothesize that unidirectional and consistent
motion profiles became predictable to subjects over the
course of their two-hour testing session. If a motion profile
was too consistent, we believe that subjects simply started
pressing the button (to indicate perception) at a constant rate
rather than at the rate they felt they were spinning. Notably,
during the highly variable and bidirectional motion profile
(see Figure 6, bottom), we do not see such predictive
behavior. Instead, we see evidence of velocity storage [39],
[40] towards the end of the profile. By the end of the
“multistep 3” motion profile (without visual cues) subjects
reported that they felt they were rotating in the opposite
direction from their true motion (shown in Figure 5 and
Figure 6). An additional limitation is the use of VR to deliver
visual cues. VR is an extremely powerful tool but is
associated with reduced vestibular ocular reflex (VOR) gain
for a given amount of rotational optical flow [41]. While VR
can deliver interpretable motion cues, altered VOR responses
suggest the VR cues may be processed by the brain
differently than naturalistic visual cues.
We have measured orientation perception during a transition
in the availability of visual cues, which occurs regularly in
flight but has not been previously studied. We have
shown that the sudden gain of visual cues during a period of
misperception of rotation in the dark does not lead to
immediate correction of orientation perception. Similarly, we
have shown that after suddenly losing visual cues, it takes
many seconds before rotation perception converges to that
observed when visual cues are unavailable throughout.
Our results are applicable to both space and aircraft piloting
tasks. Flying into and out of clouds is a common occurrence
in aviation and is associated with spatial disorientation.
Further, future space missions to other celestial bodies (such
as the Moon or Mars) will likely require human pilots who
may lose visual references from dust blowback as they are
landing. During flight (both air and space), transitions in the
presence of visual cues may also occur if the pilot is not
looking through the canopy and instead directs their gaze
towards a distraction inside the cockpit. Our results will allow
for more robust design of flight operations or spatial
disorientation mitigation procedures. This is because we are
now aware of how long it takes the pilot’s perception to
change following restoration or sudden loss of visual cues.
6. CONTRIBUTIONS
We are the first to quantify motion perception in humans
during a sudden transition in the availability of visual cues.
By using motion profiles that result in spatial disorientation,
we have captured motion perception in both sudden gain of
visual cues and sudden loss of visual cues. Our dataset
indicates that there may be some processing of visual
information when visual cues are suddenly gained because
the perception changes gradually, even after accounting for
the physical limitations of the psychophysical task.
Perception during the opposite transition (suddenly losing
visual cues) slowly deviates from reality, indicating that there
may be sustained influence of recent visual information, even
after it is no longer present.
ACKNOWLEDGEMENTS
This work is supported by the Office of Naval Research under
a Multi University Research Initiative. PI: Daniel Merfeld.
REFERENCES
[1] A. J. Benson, Spatial disorientation—general aspects
J. Ernsting, A.N. Nicholson, D.J. Rainford (Eds.),
Aviation Medicine, Butterworth. Oxford, England,
UK, 1978.
[2] D. M. Merfeld, Vestibular Sensation. Sensation and
Perception. Sinauer Associates, Inc, M. A.
Sunderland, 2017.
[3] F. E. Guedry, “Psychophysics of Vestibular
Sensation,” in Vestibular System Part 2:
Psychophysics, Applied Aspects and General
Interpretations, vol. 6 / 2, H. H. Kornhuber, Ed. Berlin,
Heidelberg: Springer Berlin Heidelberg, 1974, pp. 3–
154. doi: 10.1007/978-3-642-65920-1_1.
[4] T. N. Clark, “Systematizing Global and Regional
Creativity,” in Handbook of Science and Technology
Convergence, W. S. Bainbridge and M. C. Roco, Eds.
Cham: Springer International Publishing, 2015, pp. 1–
13. doi: 10.1007/978-3-319-04033-2_77-1.
[5] A. Tribukait, A. Ström, E. Bergsten, and O. Eiken,
“Vestibular Stimulus and Perceived Roll Tilt During
Coordinated Turns in Aircraft and Gondola
Centrifuge,” Aerosp. Med. Hum. Perform., vol. 87, no.
5, pp. 454–463, May 2016, doi:
10.3357/AMHP.4491.2016.
[6] T. K. Clark, “Effects of Spaceflight on the Vestibular
System,” in Handbook of Space Pharmaceuticals, Y.
Pathak, M. Araújo dos Santos, and L. Zea, Eds. Cham:
Springer International Publishing, 2019, pp. 1–39. doi:
10.1007/978-3-319-50909-9_2-1.
[7] G. Clément, P. Denise, M. F. Reschke, and S. J. Wood,
“Human ocular counter-rolling and roll tilt perception
during off-vertical axis rotation after spaceflight,” J.
Vestib. Res. Equilib. Orientat., vol. 17, no. 5–6, pp.
209–215, 2007.
[8] K. N. de Winkel, G. Clément, E. L. Groen, and P. J.
Werkhoven, “The perception of verticality in lunar and
Martian gravity conditions,” Neurosci. Lett., vol. 529,
no. 1, pp. 7–11, Oct. 2012, doi:
10.1016/j.neulet.2012.09.026.
[9] C. Oman, Spatial Orientation and Navigation in
Microgravity. Springer, Boston, MA, 2007.
[10] A. Bellenkes, R. Bason, and D. W. Yacavone, “Spatial
disorientation in naval aviation mishaps: a review of
class A incidents from 1980 through 1989,” Aviat.
Space Environ. Med., vol. 63, no. 2, pp. 128–131, Feb.
1992.
[11] L. R. Young, K. H. Sienko, L. E. Lyne, H. Hecht, and
A. Natapoff, “Adaptation of the vestibulo-ocular
reflex, subjective tilt, and motion sickness to head
movements during short-radius centrifugation,” J.
Vestib. Res. Equilib. Orientat., vol. 13, no. 2–3, pp. 65–
77, 2003.
[12] NASA, “Human Vestibular System in Space.” 2004.
Accessed: Nov. 22, 2021. [Online]. Available:
https://www.nasa.gov/audience/forstudents/9-
12/features/F_Human_Vestibular_System_in_Space.h
tml
[13] M. Braithwaite, S. Groh, and E. Alvarez, “Spatial
Disorientation in U.S. Army Helicopter Accidents: An
Update of the 1987-92 Survey to Include 1993-95.”
DTIC, 1997. [Online]. Available:
https://apps.dtic.mil/sti/citations/ADA323898
[14] B. Cheung, K. Money, H. Wright, and W. Bateman,
“Spatial disorientation-implicated accidents in
Canadian forces, 1982-92,” Aviat. Space Environ.
Med., vol. 66, no. 6, pp. 579–585, Jun. 1995.
[15] M. Cohen, “Disorienting effects of aircraft catapult
launchings: III. Cockpit displays and piloting
performance,” Aviat Space Env. Med, vol. 48, no. 9,
pp. 797–804, Sep. 1977.
[16] J. B. Dixon, C. A. Etgen, D. S. Horning, T. K. Clark,
and R. V. Folga, “Integration of a Vestibular Model for
the Disorientation Research Device Motion Algorithm
Application,” Aerosp. Med. Hum. Perform., vol. 90,
no. 10, pp. 901–907, Oct. 2019, doi:
10.3357/AMHP.5416.2019.
[17] M. J. Dai, I. S. Curthoys, and G. M. Halmagyi, “A
model of otolith stimulation,” Biol. Cybern., vol. 60,
no. 3, Jan. 1989, doi: 10.1007/BF00207286.
[18] C. R. Fetsch, A. Pouget, G. C. DeAngelis, and D. E.
Angelaki, “Neural correlates of reliability-based cue
weighting during multisensory integration,” Nat.
Neurosci., vol. 15, no. 1, pp. 146–154, Jan. 2012, doi:
10.1038/nn.2983.
[19] F. Karmali, K. Lim, and D. M. Merfeld, “Visual and
vestibular perceptual thresholds each demonstrate
better precision at specific frequencies and also exhibit
optimal integration,” J. Neurophysiol., vol. 111, no. 12,
pp. 2393–2403, Jun. 2014, doi:
10.1152/jn.00332.2013.
[20] M. C. Newman, “A multisensory observer model for
human spatial orientation perception,” Massachusetts
Institute of Technology, Cambridge, MA, 2009.
[Online]. Available:
http://hdl.handle.net/1721.1/51636
[21] K. E. Cullen, “The neural encoding of self-motion,”
Curr. Opin. Neurobiol., vol. 21, no. 4, pp. 587–595,
Aug. 2011, doi: 10.1016/j.conb.2011.05.022.
[22] J. Borah and L. Young, “Spatial Orientation and
Motion Cue Environment Study in the Total In-Flight
Simulator.” DTIC, 1983. [Online]. Available:
https://apps.dtic.mil/sti/citations/ADA129391
[23] J. Borah, L. Young, and R. E. Curry, Sensory
Mechanism Modeling. Air Force Human Resources
Laboratory, Air Force Systems Command, 1977.
[24] D. Merfeld, “Rotation otolith tilt-translation
reinterpretation (ROTTR) hypothesis: a new
hypothesis to explain neurovestibular spaceflight
adaptation,” J Vestib Res, vol. 13, no. 4–6, pp. 309–20,
2003.
[25] D. Merfeld, L. Young, C. Oman, and M. Shelhamer,
“A multidimensional model of the effect of gravity on
the spatial orientation of the monkey.,” J. Vestib. Res.
Equilib. Orientat., vol. 3, no. 2, pp. 141–61, 1993.
[26] L. H. Zupan and D. M. Merfeld, “Neural Processing of
Gravito-Inertial Cues in Humans. IV. Influence of
Visual Rotational Cues During Roll Optokinetic
Stimuli,” J. Neurophysiol., vol. 89, no. 1, pp. 390–400,
Jan. 2003, doi: 10.1152/jn.00513.2001.
[27] C. R. Fetsch, G. C. DeAngelis, and D. E. Angelaki,
“Visual-vestibular cue integration for heading
perception: applications of optimal cue integration
theory: Mechanisms of visual-vestibular cue
integration,” Eur. J. Neurosci., vol. 31, no. 10, pp.
1721–1729, May 2010, doi: 10.1111/j.1460-
9568.2010.07207.x.
[28] J. Laurens and D. E. Angelaki, “How the
Vestibulocerebellum Builds an Internal Model of Self-
motion,” in The Neuronal Codes of the Cerebellum,
Elsevier, 2016, pp. 97–115. doi: 10.1016/B978-0-12-
801386-1.00004-6.
[29] C. Oman, “A heuristic mathematical model for
dynamics of sensory conflict and motion sickness,”
Acta Otolaryngol Suppl, vol. 392, pp. 1–44, 1982.
[30] J. J. Groen and L. B. W. Jongkees, “The turning test
with small regulable stimuli; the cupulogram obtained
by subjective angle estimation,” J. Laryngol. Otol.,
vol. 62, no. 4, pp. 236–240, Apr. 1948, doi:
10.1017/s0022215100008926.
[31] F. E. Guedry and L. S. Lauver, “Vestibular reactions
during prolonged constant angular acceleration,” J.
Appl. Physiol., vol. 16, no. 2, pp. 215–220, Mar. 1961,
doi: 10.1152/jappl.1961.16.2.215.
[32] M. C. Bermúdez Rey, T. K. Clark, and D. M. Merfeld,
“Balance Screening of Vestibular Function in Subjects
Aged 4 Years and Older: A Living Laboratory
Experience,” Front. Neurol., vol. 8, p. 631, Nov. 2017,
doi: 10.3389/fneur.2017.00631.
[33] M. C. Bermúdez Rey, T. K. Clark, W. Wang, T.
Leeder, Y. Bian, and D. M. Merfeld, “Vestibular
Perceptual Thresholds Increase above the Age of 40,”
Front. Neurol., vol. 7, Oct. 2016, doi:
10.3389/fneur.2016.00162.
[34] J. Laurens, D. Straumann, and B. J. M. Hess,
“Processing of Angular Motion and Gravity
Information Through an Internal Model,” J.
Neurophysiol., vol. 104, no. 3, pp. 1370–1381, Sep.
2010, doi: 10.1152/jn.00143.2010.
[35] T. K. Clark, M. C. Newman, F. Karmali, C. M. Oman,
and D. M. Merfeld, “Mathematical models for
dynamic, multisensory spatial orientation perception,”
in Progress in Brain Research, vol. 248, Elsevier,
2019, pp. 65–90. doi: 10.1016/bs.pbr.2019.04.014.
[36] H. P. Williams, J. L. Voros, D. M. Merfeld, and T. K.
Clark, “Extending the Observer Model for Human
Orientation Perception to Include In-Flight Perceptual
Thresholds,” NAVAL MEDICAL RESEARCH UNIT
DAYTON, 2021. [Online]. Available:
https://apps.dtic.mil/sti/pdfs/AD1124219.pdf
[37] J. X. Brooks and K. E. Cullen, “The Primate
Cerebellum Selectively Encodes Unexpected Self-
Motion,” Curr. Biol., vol. 23, no. 11, pp. 947–955, Jun.
2013, doi: 10.1016/j.cub.2013.04.029.
[38] C. R. Fetsch, A. H. Turner, G. C. DeAngelis, and D. E.
Angelaki, “Dynamic Reweighting of Visual and
Vestibular Cues during Self-Motion Perception,” J.
Neurosci., vol. 29, no. 49, pp. 15601–15612, Dec.
2009, doi: 10.1523/JNEUROSCI.2574-09.2009.
[39] J. Laurens and D. E. Angelaki, “The functional
significance of velocity storage and its dependence on
gravity,” Exp. Brain Res., vol. 210, no. 3–4, pp. 407–
422, May 2011, doi: 10.1007/s00221-011-2568-4.
[40] Th. Raphan, V. Matsuo, and B. Cohen, “Velocity
storage in the vestibulo-ocular reflex arc (VOR),” Exp.
Brain Res., vol. 35, no. 2, Apr. 1979, doi:
10.1007/BF00236613.
[41] Stefano Di Girolamo, Pasqualina Pic, “Vestibulo-
Ocular Reflex Modification after Virtual Environment
Exposure,” Acta Otolaryngol. (Stockh.), vol. 121, no.
2, pp. 211–215, Jan. 2001, doi:
10.1080/000164801300043541.
BIOGRAPHY
Jamie L Voros is affiliated with the Ann
& H.J. Smead Department of Aerospace
Engineering Sciences, University of
Colorado Boulder. Highest degrees
obtained: M. S. Aerospace Engineering
Sciences, 2020, University of Colorado
Boulder. M. S. Computer Science, 2022,
University of Colorado Boulder. Prior to the University of
Colorado Boulder, Jamie completed her B. S. at the
Massachusetts Institute of Technology.
Torin K Clark is currently affiliated with
the Ann & H.J. Smead Department of
Aerospace Engineering Sciences,
University of Colorado Boulder. Highest
degree obtained: Ph. D. Humans in
Aerospace, 2013, Massachusetts Institute
of Technology. Prior to the
Massachusetts Institute of Technology,
Torin completed his B. S. at the University of Colorado
Boulder.