Virtual reality in neuroscience research and therapy

Corey J. Bohil, Bradly Alicea and Frank A. Biocca

Department of Psychology, University of Central Florida, Orlando, Florida, USA; Department of Animal Science, Michigan State University, East Lansing, Michigan, USA; Department of Interaction Science, Sungkyunkwan University, Seoul, Republic of Korea; Department of Communication, Syracuse University, Syracuse, New York, USA.
Correspondence to C.J.B. e-mail: corey.bohil@ucf.edu
doi:10.1038/nrn3122 | Published online 3 November 2011

Abstract | Virtual reality (VR) environments are increasingly being used by neuroscientists to simulate natural events and social interactions. VR creates interactive, multimodal sensory stimuli that offer unique advantages over other approaches to neuroscientific research and applications. VR’s compatibility with imaging technologies such as functional MRI allows researchers to present multimodal stimuli with a high degree of ecological validity and control while recording changes in brain activity. Therapists, too, stand to gain from progress in VR technology, which provides a high degree of control over the therapeutic experience. Here we review the latest advances in VR technology and its applications in neuroscience research.
An enduring tension exists between ecological validity and
experimental control in psychology and neuroscience
research. Experimentalists have long used text-, graphic-
or computer-based abstractions of real-world objects or
situations when submitting experimental variables to
study. Such highly controlled but contextually impover-
ished stimuli greatly simplify the world for research but
leave us guessing as to their generalizability. Conversely,
therapists and practising psychologists often relinquish
control in order to observe or influence behaviour in
complex real-world surroundings.
Virtual reality (VR) provides a middle ground, sup-
porting naturalistic and contextually rich scenarios along
with an exacting degree of control over key variables.
VR has value for studying processes such as neuronal
connectivity, developmental dynamics, neuromuscular
output and perhaps even the initiation of molecular cas-
cades, and as reviewed below, VR continues to garner
validation as a therapeutic application. There have been
several reviews on the uses of VR for neuroscience-
related work1–8. Here, we focus on the most recent appli-
cations of VR, highlighting those that combine VR with
brain imaging, as well as developments in VR systems
for animal research.
State of the art
VR system components work in concert to create sen-
sory illusions that produce a more or less believable
simulation of reality9. The goal is to foster brain and
behavioural responses in the virtual world that are
analogous to those that occur in the real world.
Sensory stimulation comes in many forms (BOX 1).
VR systems are best at displaying visual and auditory
information. Increasingly, these are approaching the sen-
sory vividness of the physical environment. In addition,
VR systems may provide limited but compelling haptic
(tactile) feedback that simulates the feel of forces, sur-
faces and textures as users interact with virtual objects.
VR systems also include a way of interacting with the
simulation. In fully ‘immersive’ VR systems (BOX 2),
movement of the body and the sensory flow of the vir-
tual environment are coupled10. Movements of the head
and body are often tracked so that the visual experience
changes in a way that corresponds to real-world head
and body movements.
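To make this coupling concrete, the sketch below shows the essence of an immersive rendering loop: each frame, the latest tracked head pose is read and the virtual camera is pointed accordingly. It is a minimal Python illustration under assumed conventions (a yaw/pitch head pose and a stand-in render_frame function), not the interface of any particular VR toolkit.

```python
import math

def view_direction(yaw_rad, pitch_rad):
    """Convert a tracked head yaw/pitch (radians) into a unit gaze vector."""
    return (math.cos(pitch_rad) * math.sin(yaw_rad),
            math.sin(pitch_rad),
            math.cos(pitch_rad) * math.cos(yaw_rad))

def render_frame(head_pose):
    """Stand-in for a real renderer: here we only report where the camera points."""
    gaze = view_direction(head_pose["yaw"], head_pose["pitch"])
    print(f"camera at {head_pose['position']}, looking along "
          f"{tuple(round(g, 2) for g in gaze)}")

# Simulated tracker samples (in a real system these would stream from the HMD tracker).
tracker_samples = [
    {"position": (0.0, 1.7, 0.0), "yaw": 0.0, "pitch": 0.0},
    {"position": (0.1, 1.7, 0.0), "yaw": 0.3, "pitch": -0.1},
    {"position": (0.2, 1.7, 0.1), "yaw": 0.6, "pitch": -0.1},
]

for pose in tracker_samples:   # the render loop: one frame per tracker sample
    render_frame(pose)
```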
Miniaturization of VR technologies and their grow-
ing affordability are helping to address some common
criticisms of VR for neuroscience research (BOX 3).
As technology continues to improve, the barriers to
widespread adoption of VR are constantly diminishing.
Why use VR?
The use of VR in neuroscience research offers several
unique advantages. First, and perhaps most impor-
tantly, VR allows naturalistic interactive behaviours to
take place while brain activity is monitored via imaging
or direct recording. This allows researchers to directly
address many questions in a controlled environment
that would simply not be possible by studying perfor-
mance ‘in the wild’. Second, VR environments allow
researchers to manipulate multimodal stimulus inputs,
so the user’s sensorimotor illusion of being ‘present’
in the represented environment is maximized (BOX 2).
By providing realistic stimulation to multiple sensory
channels at once, VR engages the sensorimotor system
more fully than the simple stimuli used in most
Ecological validity
Refers to experimental
conditions that are reasonably
similar to those in a real-world
setting. In virtual environments,
contextually rich simulations
with multiple sensory cues
might be considered to have
greater ecological validity than
environments that are limited
to only the necessary and
sufficient features for an
experiment.
0CVWTG4GXKGYU^0GWTQUEKGPEG
CE
F
D
\5KIJV*/&D[5GPUKEU */&YKVJQRVKECN
UGGVJTQWIJD[6TKXKUKQ
/QPQEWNCT*/&D[
.KVG[G5[UVGOU
psychological research, increasing the potential to elicit
realistic psychological and behavioural responses.
VR also offers maximal control over multisen-
sory stimulation. This kind of control is beneficial for
understanding sensorimotor interactions between, for
example, proprioception and visual experience (that is,
interactions between brain regions responding simul-
taneously). In some studies, parts of the represented
world are transformed between eye saccades to explore
how consciousness retains models of the world while
engaged in action11.
VR also increases the role of motor activation dur-
ing simulated experience, as users can move through
and physically interact with virtual objects. Virtual
environments can present combinations of stimuli that
are not found in the natural world and researchers can
execute changes in the environment that would not be
possible physically. VR might be used to decouple visual
and vestibular sensation, revealing the roles of separate
brain systems that are usually enlisted simultaneously
(for example, postural responses may reflect input from
visual perception more than from motion perception,
or vice versa).
Last, the equipment used to create interactive simula-
tions is readily leveraged for fine-grained recording and
analysis of behavioural responses that can be used to
monitor or produce change over time. In immersive VR,
tracking devices affixed to the head or hands sample the
wearer’s body coordinates in space very rapidly, and this
information can be recorded and analysed to assess very
minute improvements or changes in muscle control over
a period of time. For basic neuroscience researchers,
multisensory stimulation and embodied interaction are
difficult or impossible to achieve otherwise. Likewise,
Box 1 | Anatomy of a virtual environment
There are key technical
components that are found
in most virtual reality (VR)
systems. The most
commonly used forms of
sensory stimulation are
visual displays (see the
figure). Stereoscopic vision is
accomplished by presenting
horizontally displaced
images to the left and right
eyes, mimicking the natural
disparity in visual images
registered by each eye owing to horizontal displacement in the
head. The brain treats computer-generated images as any other
optical input, fusing the images to create a sensation of
three-dimensional space. The perspective from which a viewer
experiences the computer-generated image is controlled by a
virtual camera (unseen by the viewer). Changing the location or
direction of the camera changes the view, as does viewing the world
through a real camera. To ensure that viewpoint changes according
to where the user is looking, it is necessary to track the location of
the user’s head. The images can be delivered either by a closed
(personal) head-mounted display (HMD; parts a–c of the figure) or
by an open display such as a computer monitor or projection screen
(part d of the figure). HMDs may be more immersive, but open
displays are often easier to engineer and work with (although HMDs
are becoming highly compact and affordable).
Auditory stimulation is commonly used in conjunction with visual
display, often in the form of realistic three-dimensional spatial surround sound. Haptic (tactile) feedback is sometimes provided
using devices called tactors — actuators that vibrate against the skin or within input devices. Haptic feedback devices are
increasingly able to deliver a strikingly compelling sense of physical contact with the virtual world115. The real power inherent
in virtual environments, however, is their ability to present synchronized simulations to multiple sensory channels116.
Interactivity is another key component of VR. Immersive VR environments incorporate highly sensitive head- and
body-tracking systems. Sensors monitor the user’s position to provide an egocentric reference frame for the simulation
(that is, a first-person perspective). A popular approach is inertial tracking, which uses accelerometers that behave in a
similar way to the vestibular system (accelerometers are electromechanical devices with moving parts that use gravity to
detect orientation, movement and vibration and then send this information to a computer). Inertial tracking also uses
gyroscopes for maintaining orientation and magnetometers for maintaining accurate direction information. Other
tracking alternatives make use of cameras, changes in the magnetic field orientation of a body-worn sensor, changes in
the time taken to receive an ultrasonic frequency by a body-worn sensor, or some hybrid of these.
Less immersive means of retrieving input from users include common keyboard, mouse and joystick devices. Although
these control devices are easy to work with, naturalistic user interfaces that replicate real-world interactions (such as
reaching, grasping or pushing) are becoming the norm.
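The two ingredients described in this box, horizontally displaced viewpoints for stereoscopy and inertial sensing for head tracking, can be sketched in a few lines. The Python below is purely illustrative: the 64 mm interpupillary distance, the coordinate conventions and the complementary-filter gain are assumptions, not values taken from any specific system.

```python
import numpy as np

IPD = 0.064  # assumed interpupillary distance in metres

def eye_positions(head_position, yaw_rad):
    """Left/right virtual-camera positions: the head position displaced by half the
    interpupillary distance along the head's lateral (right-pointing) axis."""
    right_axis = np.array([np.cos(yaw_rad), 0.0, -np.sin(yaw_rad)])
    head = np.asarray(head_position, dtype=float)
    return head - 0.5 * IPD * right_axis, head + 0.5 * IPD * right_axis

def complementary_filter(pitch_prev, gyro_rate, accel_pitch, dt, alpha=0.98):
    """Fuse a gyroscope rate (rad/s) with an accelerometer-derived pitch (rad): the
    gyro term tracks fast motion, the accelerometer term corrects slow drift."""
    return alpha * (pitch_prev + gyro_rate * dt) + (1.0 - alpha) * accel_pitch

left, right = eye_positions((0.0, 1.7, 0.0), yaw_rad=0.2)
print("left eye:", left.round(3), "right eye:", right.round(3))

pitch = 0.0
for gyro, accel in [(0.5, 0.01), (0.4, 0.02), (0.0, 0.02)]:  # simulated sensor stream
    pitch = complementary_filter(pitch, gyro, accel, dt=0.01)
print("estimated pitch (rad):", round(pitch, 4))
```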
Morris water maze
A classic experimental
paradigm used to assess
spatial navigation abilities.
Traditionally, an animal swims
around a pool for a number of
trials, freely exploring the
space. In later trials, the goal is
to find the fastest route to a
submerged platform.
practitioners such as clinical and rehabilitation therapists
gain unique benefits from being able to control stimuli in
VR environments and gather data on patients’ responses
(for example, they can create environments that stimulate
phobic responses).
In the next section we review the contribution of
VR systems to various areas of neuroscience, including
spatial cognition, social cognition and other research
domains.
Experimental domains
VR environments meet the needs of several research
domains related to cognition and perception, and VR
environments have been created for studies in both
humans and animals. In this section, we highlight sev-
eral areas in which VR has become an established part of
research instrumentation and methodology.
Spatial cognition and navigation. VR environments ena-
ble researchers to study human navigation traits using
tasks that are directly comparable to those that have
been used in animal research for many years. Before the
advent of VR, researchers were forced to seek alterna-
tives (for example, mental rotation or map learning)
to these tasks in order to relate human brain activity to
results from animal studies. VR’s compatibility with
functional MRI (fMRI) further encourages explorations
inspired by neuroscience research in animals.
An environment that has long been used in animal
spatial cognition research is the radial maze. Analogous
research in which humans navigate a virtual radial maze
reveals evidence for hippocampal activity (as expected
from animal studies), but also evidence of frontal cor-
tex activity, suggesting the additional contribution of
working memory circuits12. Similar work using a virtual
water maze confirms the involvement of areas that are
external to the hippocampus (for example, the parahip-
pocampal gyrus, precuneus and fusiform)13. Such stud-
ies would not have been feasible without VR. With VR,
matching environments and tasks can be used for ani-
mal and human studies even when they differ in scale
or mobility.
VR systems also allow researchers to rapidly change
or eliminate landmarks or pathways. This is used in
many studies to probe what was learned during naviga-
tion. For example, landmarks such as distant buildings
or other environmental cues can be altered to explore
how users rely on them to navigate.
Virtual mazes have proven useful in identifying dis-
tinct navigation strategies, along with their underlying
neural substrates. Two prominent strategies have emerged
using a virtual radial maze with recognizable features and
patterns on outside walls14. ‘Spatial’ learners rely on rela-
tionships between identifiable landmarks — such as the
patterns on the wall of the environment — to form a cog-
nitive map. So-called ‘response’ learners use a non-spatial
strategy, remembering a series of turns at each decision
point (for example, counting the turns and directions
without forming a cognitive map). In this work, neural
data revealed patterns related to the virtual behaviour.
Spatial learners exhibited more hippocampal grey matter
compared to response learners, whereas response learn-
ers exhibited comparatively more caudate nucleus grey
matter. In this case, landmarks were removed from view
and maze pathways were altered to probe for navigation
deficits belying each strategy for navigating the space.
A virtual Morris water maze has been used to demon-
strate age-dependent differences in navigation strategy.
In the (physical) animal version, rats dropped into a
water tank use the patterns on the walls to find, swim to
and remember the location of hidden platforms under
the water. With a (virtual) human equivalent, researchers
find that the time to locate a hidden platform increases
with age, and that young participants spend more time
looking in the correct location for a previously learned
target than older participants15. A related study, using a
similar task, found that hippocampal volume positively
correlates with performance differences in young but
not old participants16. The authors speculate that older
participants may compensate for lack of hippocampal
contribution by adopting a non-spatial strategy that
relies more on the caudate nucleus and prefrontal cortex.
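Analyses in virtual Morris water maze studies typically reduce the logged trajectory to a few summary measures, such as the latency to reach the hidden platform location and the proportion of time spent in the platform's quadrant. A minimal sketch of that kind of analysis is shown below, assuming a simple (time, x, y) log format; it is illustrative rather than the analysis pipeline used in the cited studies.

```python
import math

def analyse_probe_trial(samples, platform_xy, platform_radius=0.1, arena_centre=(0.0, 0.0)):
    """Compute two standard water-maze measures from a logged (t, x, y) trajectory:
    latency to first reach the platform location, and the fraction of time spent in
    the quadrant of the arena that contains the platform."""
    latency = None
    time_in_target_quadrant = 0.0
    total_time = 0.0
    px, py = platform_xy
    cx, cy = arena_centre
    target_quadrant = (px >= cx, py >= cy)          # which quadrant holds the platform
    for (t0, x0, y0), (t1, x1, y1) in zip(samples, samples[1:]):
        dt = t1 - t0
        total_time += dt
        if (x0 >= cx, y0 >= cy) == target_quadrant:
            time_in_target_quadrant += dt
        if latency is None and math.hypot(x0 - px, y0 - py) <= platform_radius:
            latency = t0 - samples[0][0]
    return latency, time_in_target_quadrant / total_time if total_time else 0.0

# A short, made-up trajectory: (time in s, x, y) with the platform in the (+,+) quadrant.
trajectory = [(0.0, -0.8, -0.8), (1.0, -0.2, -0.4), (2.0, 0.3, 0.2),
              (3.0, 0.55, 0.45), (4.0, 0.6, 0.5)]
print(analyse_probe_trial(trajectory, platform_xy=(0.6, 0.5)))
```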
Spatial cognition researchers have also used VR
with patient populations. For example, patients with
Box 2 | Immersion and presence
When considering the technical components of virtual
reality systems, it helps to distinguish between the
concepts of ‘immersion’ and ‘presence’117. Immersion,
sometimes called sensorimotor immersion, refers to the
degree of physical stimulation impinging on the sensory
systems and the sensitivity of the system to motor inputs.
The level of immersion is determined by the number and
range of sensory and motor channels connected to the
virtual environment, and the extent and fidelity of
sensory stimulation and responsiveness to motor inputs
(for example, head and body movement, and hand
gestures to make commands). Immersion can be
increased by: increasing the range of visual stimuli, such
as the amount of visual field engaged and the fidelity of
visual displays; providing three-dimensional spatialized
sound, such as sound that is fixed around a moving body;
using interfaces with a tight sensorimotor coupling for
which changes in sensory stimulation respond naturally
to body movements, such as head, hand and other
motions; and through other techniques that increase the
sensorimotor realism of objects and settings in the virtual
environment.
The psychological product of technological immersion
is presence — the psychological sensation of being in the
virtual environment instead of the physical environment
and interacting with media. A commonly cited definition
of presence is “the perceptual illusion of
nonmediation”118, but it is often simply described as the
sensation of ‘being there’ in the virtual space117. Although
commonly measured by self-report, researchers have
begun looking for physiological indicators of a user’s
degree of presence. For instance, placing a person in a
virtual situation that is known to be stressful (for
example, a high place) leads to bodily responses similar
to those expected in a real-world analogue, such as
increased heart rate and skin conductance, and
decreased skin temperature119.
Place cell
Hippocampal cell that encodes
different components of the
relationships between spatial
locations.
Huntington’s disease (which is characterized by degraded
caudate function) have shown a compensatory increase
in hippocampal activity during tasks that are normally
associated with caudate activity, so that their observable
navigation behaviour appears normal17. In patients with
epilepsy18, post-surgery lateralization of medial tempo-
ral lobe (MTL) activation is determined by the side of
pathology (for example, patients with right-side MTL
epilepsy showed increased left-lateralized hippocampal
activation during a VR navigation task) rather than by
gender, as suggested by studies of healthy subjects19.
VR systems can also be used in conjunction with
invasive recording of brain activity. This has been
invaluable for demonstrating human place-cell activity.
Although neuroimaging studies suggest that humans
may have place cells that are analogous to those
reported in animal studies, this is difficult to verify
because hippocampal and parahippocampal regions
respond to both visual stimuli and to specific locations.
Directly recording from neurons in MTL and frontal
lobes to separate the input of these factors using VR20
reveals human place-cell activity specifically related to
navigation (see also REFS 21–23 for similar work using
electroencephalography (EEG) to measure theta activity).
Finally, studies on gender differences, a popular
topic among navigation researchers, have also benefited
from the use of VR with fMRI scanning. There is evi-
dence of increased activation of the posterior cingulate/
retrosplenial cortex in males while navigating a virtual
environment24. The cingulate/retrosplenial cortex may
have a role in our ability to orient ourselves in space.
However, females were shown to demonstrate relatively
more activity in the parahippocampal gyrus, which has
been linked to our ability to identify and remember
landmarks. These findings coincide with suggestions
that females make relatively more use of landmarks than
men do during navigation.
Spatial navigation systems for animals. Animal stud-
ies also inform our knowledge of spatial cognition and
navigation. Organisms with simpler nervous systems,
such as bees and ants, must navigate space to survive
(for example, to remember the location of food or to
avoid predators and other dangers). Understanding
how insect neural substrates handle navigation prob-
lems using landmarks, path integration or possibly even
some form of cognitive map formation can shed light
on necessary and sufficient informational and cognitive
processing requirements. Creating virtual environments
in which animals behave as they do in the real world and
that allow researchers to study variables such as smells,
sounds or sights has been a technical challenge. Here we
review some of the progress that has been made towards
a new era of VR systems with hardware and software that
has been designed for animal research.
Virtual environments have been used to study flight
control in both tethered and untethered insects (FIG. 1).
Recent work details the design of a VR system in which
tethered moths are presented with visual informa-
tion on a small, dome-shaped rear projection screen25.
Along with visual information, olfactory stimulation in
the form of female pheromone is provided to induce
changes in flight direction (inferred from adjustments in
wing shape and abdomen movements for ruddering).
In open-loop studies (in which the experimenter con-
trols the visual stimulation as well as the pheromonal
stimuli independent of insect behaviour), visual display
changes had the greatest effect on wing responses. When
the experimenters changed the visual information to
indicate to the animal that it was veering away from a
straight flight path, it would adjust its wing pattern and
abdominal position to compensate and correct its flight
path. In closed-loop conditions (for example, where the
insect’s wing and abdomen responses to visual stimuli
drive the simulation display), abdominal movements
produced changes in visual heading (that is, direction
of flight) and orientation with respect to the ground.
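In closed-loop rigs of this kind, the logic is essentially a feedback loop: a sensor reading of the insect's steering effort (for example, abdomen angle) is converted into a turning rate that rotates the displayed scene. The sketch below illustrates one such update under assumed units and an assumed proportional gain; the coupling functions used in the cited systems differ in detail.

```python
def closed_loop_step(heading_deg, abdomen_angle_deg, gain=2.0, dt=0.01):
    """One closed-loop update: the insect's abdominal ruddering (read from a position
    sensor) is converted into a turning rate that rotates the displayed visual scene."""
    yaw_rate = gain * abdomen_angle_deg          # deg/s, proportional coupling (assumed)
    return (heading_deg + yaw_rate * dt) % 360.0

heading = 0.0
abdomen_trace = [5.0, 5.0, 2.0, 0.0, -1.0]       # simulated sensor readings (degrees)
for angle in abdomen_trace:
    heading = closed_loop_step(heading, angle)
    # In a real rig the projected scene would be re-drawn at this heading each frame.
print("final displayed heading (deg):", round(heading, 3))
```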
Some researchers argue that untethered flight is often
more appropriate for insect navigation studies because
mechanosensory feedback provided by specialized bal-
ance organs is undermined by the tethered approach26.
To this end, these researchers have designed a small
Box 3 | Common criticisms of virtual reality
Early virtual reality (VR) equipment suffered from many inadequacies, such as being
large and unwieldy, difficult to operate and very expensive to build and run. Early
experiences with these systems may have soured public enthusiasm for VR, and have
led to a range of criticisms that are likely to have slowed adoption. Nevertheless,
researchers have steadily progressed in making VR hardware and software more
reliable, cost effective and acceptable in terms of size and appearance.
Cost
The cost of advanced VR systems remains relatively high. For example, a
wide-field-of-view head-mounted display (HMD) can cost tens of thousands of US dollars,
a tracking system capable of covering a large area can cost upwards of a hundred
thousand dollars, and exploratory new systems can cost millions to develop. Nevertheless,
the trend follows that of computer equipment in general towards a rapid decrease in size
and price, and an increase in computational power and ease of operation.
Requirement for specialist technology skills
Creating virtual worlds and characters continues to require specialized skills in
three-dimensional modelling, texturing, character animation and programming.
However, increasingly powerful tools are becoming available — some at no cost — that
simplify these tasks. Furthermore, large repositories of object and character models are
available, and programming environments for inserting these models into VR systems
are becoming easier to use. Finally, creating interactivity is also becoming easier thanks
to visual programming and scripting languages (such as Virtools and Vizard).
Bulkiness of equipment
The earliest incarnations of VR used HMD helmets that engulfed the user’s entire head
and face, and weighed several pounds. This problem has steadily diminished thanks to
progress in the design of HMDs (some are approaching the size of an ordinary, albeit
heavier, pair of sunglasses).
Cybersickness
A lingering concern for users of VR is simulation sickness, or ‘cybersickness’, which
acutely threatens the widespread adoption of VR for therapeutic or training
applications requiring repeated use over time. Some users are reported to experience
nausea after using VR. A widely accepted explanation for this is the incongruity
between sensory inputs: as visual information provides users with the sense of motion,
vestibular feedback can indicate a degree of movement that is not matched by vision.
Although this continues to be a problem, some potential sources of cybersickness, such
as a lag between the timing of tracked movements and updating of
computer-generated imagery, are being eliminated with technical advances. However,
some cybersickness may persist when aspects of stimulation from the physical
environment, such as gravitational or inertial force, remain in conflict with what is being
experienced in the virtual environment (for example, flying in a plane).
[Figure 1 image: panel labels include moth, abdomen position sensor, tether, wind source, multichannel silicon microprobe and curved projection screen.]
Place fields
Populations of hippocampal
place cells that enable the
formation of spatial memories.
Collectively, these ‘fields’
enable the encoding and
recall of complex spatial
relationships.
flight arena equipped with light-emitting diode (LED)
visual displays, olfactory stimulation and motion track-
ing26,27. Regardless of whether the insects are tethered or
untethered, the results from these studies demonstrate
that insects respond to these virtual environments in
ways that match their behaviour in the real world, imply-
ing that they perceive these virtual stand-ins as in some
way equivalent to the natural environment.
The navigation skills of small mammals are also
increasingly studied using specially tailored VR sys-
tems. For instance, a 360 degree enclosure that is totally
isolated from smells, sounds and external visual infor-
mation can be used to stimulate visual, olfactory and
proprioceptive pathways in rats and mice28. It can like-
wise distort what is normally experienced, so that envi-
ronmental context can be enriched or impoverished on
demand. Virtual stimuli may change the behavioural or
biochemical condition of the organism, which can be
measured during or after an experimental session.
Researchers have described a VR system used to
study mouse navigation that promotes similar cell firing
rates and spike timings to those recorded in real envi-
ronments29. The system includes a spherical treadmill
on which the head-stabilized animal runs, while a visual
display is projected onto the inner portion of a curved
screen. These researchers use this VR setup in conjunc-
tion with subcellular-resolution microscopy to examine
hippocampal place-cell activity30. The authors report
finding signatures of place fields as the animals moved
along a virtual linear track, including asymmetric ramp-
like membrane depolarization and increased amplitude
of theta oscillations. These results lend empirical support
to a ‘soma–dendritic interference’ model positing excita-
tory dendritic input and inhibitory input near the soma.
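A common first step in place-field analysis, in real or virtual environments, is to build a spatial firing-rate map by counting spikes per position bin and dividing by the time spent in each bin. The sketch below illustrates this generic procedure on simulated linear-track data; it is not the subcellular-resolution analysis of the study described above.

```python
import numpy as np

def firing_rate_map(positions, spike_positions, track_length=2.0, n_bins=20, dt=0.01):
    """Spatial firing-rate map along a linear track: spikes per bin divided by the
    time spent (occupancy) in that bin. Peaks in this map are candidate place fields."""
    edges = np.linspace(0.0, track_length, n_bins + 1)
    occupancy, _ = np.histogram(positions, bins=edges)
    spikes, _ = np.histogram(spike_positions, bins=edges)
    with np.errstate(divide="ignore", invalid="ignore"):
        rate = np.where(occupancy > 0, spikes / (occupancy * dt), 0.0)
    return edges, rate

# Simulated data: the animal runs the track repeatedly; the cell fires near 1.2 m.
rng = np.random.default_rng(0)
positions = np.tile(np.linspace(0.0, 2.0, 200), 10)        # position at each 10 ms sample
spike_positions = rng.normal(1.2, 0.08, size=150)            # spike locations clustered at 1.2 m
edges, rate = firing_rate_map(positions, spike_positions)
print("peak rate bin centre (m):", round(float(edges[np.argmax(rate)] + 0.05), 2))
```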
In summary, spatial cognition and navigation
research shows the benefits of using VR for stimulus
presentation. Interactive virtual environments have
enabled us to study human navigation in behavioural
and neural contexts simultaneously, and to rapidly make
changes to environments to explore a host of theoretically
important questions in both humans and animals.
Multisensory integration. Virtual environments are
designed for multimodal sensory stimulation, making
them ideal for multisensory integration research (for
example, binding disparate inputs such as sight, sound
and touch into a unified perceptual experience).
This multimodal stimulus capacity is exemplified
by research on the body-transfer illusion. The percep-
tual integration of multimodal information that arrives
simultaneously at our sensory organs is vital to our
perception of the world and of ourselves. Although we
experience our sense of self as a stable, durable percept,
this experience is actually surprisingly modifiable, sug-
gesting continual updating of our perception of bodily
and conscious state. This is strikingly demonstrated by
the effect known as the body-transfer illusion31, which
has classically been demonstrated by the rubber-hand
illusion32. When a rubber hand (visible in the position
normally occupied by our actual hand) and our actual
hand (hidden from view by a screen) synchronously
receive touch feedback (that is, we feel stroking on the
real hand and simultaneously see it on the rubber hand),
participants begin to experience the sensation that the
imposter limb is part of their own body.
Several researchers have used VR to demonstrate a
full-body version of this effect33,34. Initial studies used
simple displays to show participants a stereoscopic view
of their own video-recorded body displaced spatially
from its actual position (effectively changing their
perspective from first to third person). Synchronous
visual and tactile feedback led participants to indi-
cate (through questionnaire responses or physiologi-
cal measures) a sense of ownership over their spatially
displacedself.
More recent work has shown the power of VR by
using more elaborate virtual worlds in which par-
ticipants control, observe and interact with computer-
generated avatars31. This technique enabled the research-
ers to analyse which variables are important for the
body-transfer illusion and to compare their results with
studies that used less-sophisticated stimuli. By manipu-
lating perspective (first versus third person) as well as
the timing of visuotactile stimulation (synchronous ver-
sus asynchronous), they were able to determine that a
Figure 1 | Virtual reality environments for studying insect navigation. Virtual reality
(VR) environments for animals contain the same essential components as VR systems
developed for humans (BOX 1). Sensory stimulation must be provided while the animal’s
location is tracked in space (at least for closed-loop simulations where the animal’s
behaviour influences the displayed information). This figure shows specialized chambers
for studying insect navigation. a | A cylindrical enclosure that allows untethered flight. The
sides of the chamber are covered with light-emitting diodes that display changing patterns
to the insect’s compound eye. The insect’s location is tracked with cameras, and the
direction of flight can be influenced by releasing a puff of odorant (for example, female
pheromone) into the chamber. b | A tethered insect in front of a curved rear-projection
screen. c | The geometric shapes displayed to the animal on the screen. The animal ‘flies’
around the space and adjusts its wing and abdomen positions to change course or to avoid
the virtual obstacles. Part c is modified from REF. 25 © (2002) Elsevier.
Binding problem
The integration of sensory cues
and information in higher-level
cortical regions underlies
cognition and consciousness.
Binding requires large-scale
synchronization of cortical
activity to create a unified
perceptual experience.
Theory of mind
The ability to empathize with
another individual. It involves
the tendency of humans to
attribute mental states — such
as goals, beliefs and knowledge
— to another individual that
are in some way analogous
with our own mental state.
Mentalizing
Mentalizing is the process
of interpreting the intention of
others, allowing one to
anticipate the behaviour of
objects and individuals.
first person perspective is key for obtaining the effect,
whereas visuotactile synchronicity might actually be
dispensable.
Multimodal stimulus control is also important for
inducing a sense of ‘presence’ (BOX 2) in virtual environ-
ments, which is believed to be of crucial importance for
the effectiveness of VR training in medical, military and
other educational simulations, as well as for therapeutic
applications in which users respond to environments that
simulate troubling situations from the physical world.
The value of multimodal control has been demonstrated
in studies showing that combined visual and propriocep-
tive feedback leads to a stronger sense of presence than
using a joystick to control responses35,36. VR’s multimodal
stimulus capabilities may ultimately shed light on the bind-
ing problem, as researchers have found that the precisely
coordinated synthesis of separate sensory input channels
is necessary to achieve and maintain a sense of presence.
Social neuroscience. VR allows imaging of brain activity
during naturalistic, face-to-face social interactions, and
has shed light on the interpretation of biological motion
cues, theory of mind development and responses to
displays of distress.
For example, a series of VR studies has helped to
identify brain regions that are involved in interpret-
ing others’ face and eye movements35,36. Participants
approached by a virtual character exhibiting an angry
expression consistently display activation of the sup-
erior temporal sulcus (STS), as well as the lateral fusi-
form gyrus and a region of the middle temporal gyrus.
Similar results have been reported for judgements of gaze
avoidance or engagement.
This research has potential for providing an under-
standing of the theory of mind deficit that is thought to
occur in autism spectrum disorders37. For example, in
normal controls, when a virtual other shifts gaze in an
unexpected direction (for example, looking in the oppo-
site direction of a suddenly appearing virtual object) the
result is increased right posterior STS (pSTS) activation.
In children with autism, however, there is no difference in
activation between expected- and unexpected-direction
shifts. These findings highlight the importance of the
pSTS for interpreting others’ intentions, and could
ultimately prove valuable for treating children with
autism. VR’s high ecological validity is an asset to such
potentially translational research.
The interactive realism of VR also aids research
on mentalizing. In such studies, participants normally
respond to stories, cartoons or movies about others,
and the simple, repetitive tasks that are typically used
are very different from the spontaneous, occasional
mentalizing we do in real life38. Combining VR with
brain imaging allows the examination of brain activ-
ity during spontaneous mentalizing. For example, in a
taxi-driving task, participants were made to ferry unseen
passengers to various destinations in a virtual replica of
London. Subjects responded to audio cues from cus-
tomers along with other irrelevant audio cues, and they
also had to interpret the behaviour of visible others on
the street (for example, people about to cross the street
and cars moving in traffic). During mentalizing events
(regardless of whether considering the intentions of the
unseen passengers or the visible others on the street),
the authors consistently found increased right pSTS
activation. During events involving visible others, they
also found medial prefrontal cortex (mPFC) activity. The
authors suggest that the pSTS might be involved not only
in detecting bodily cues of intent but also in analysing
the goals of that behaviour, whereas the mPFC may be
involved in predicting future actions of visible others.
This combination of realistic mentalizing and fMRI
would have been unfeasible before the advent of virtual
environments.
VR also enables researchers to ask questions that
might otherwise be limited by ethical concerns. For
example, VR has been used to replicate the famous
obedience study of Milgram, complete with palpable
distress on the part of participants tasked with ‘shocking’
virtual confederates39. This result begs the question: why
would participants be averse to supplying virtual shocks
to non-existent people? A replication of the Milgram
study in conjunction with fMRI sheds light on whether
behaviour is related to empathic concern for the virtual
character, or rather is based on personal distress cre-
ated in the participant by the sight of another in pain40.
Although they did find activation in areas known to be
involved in affective processing, the researchers found
no activation in the anterior cingulate cortex and insula
— areas known to be associated with empathic response.
This result suggests that observing a virtual other in dis-
tress creates personal discomfort for the observer, rather
than empathy for the virtual character. This of course
does not necessarily imply a similar response to pain in
the original Milgram study. Prior studies have reported
evidence of activation of cortical pain centres during the
observation of real faces expressing pain41.
Finally, a recent application of VR involves ‘hyper-
scanning’ — observing the interaction of more than one
participant as they are each being scanned by a separate
fMRI system. Thus, hyperscanning allows researchers to
measure the reactions of multiple participants — each
within their own imaging system — to shared social sit-
uations in a VR environment. This approach has been
applied to neuroeconomics research using participants
in separate fMRI scanners42. The authors find periods at
various points throughout the task when regions of each
brain are active at the same time (coherent activation)
across subjects. Scanning each individual separately as
they perform a task would require identifying observ-
able events in the task environment that can be used to
locate synchronized neural responses between partici-
pants during analysis. But responses to environmental
events are often too weak to be identified in the data (for
example, when interacting participants are trying to pre-
dict each other’s behaviour), making it likely that several
periods of activation coherence across participants will
be missed. The possibility of using internet-connected
virtual interactive scenarios in which several subjects
can carry out interactive tasks while in scanners at dis-
tant locales makes the use of VR particularly attractive
for this kind of research.
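One simple way to look for the coherent activation that hyperscanning affords is to correlate the same region's time series across the two participants within a sliding window, flagging periods of coupling that are not locked to any observable task event. The sketch below illustrates this generic inter-subject correlation idea on simulated signals; the cited neuroeconomics work used its own analysis methods.

```python
import numpy as np

def windowed_coupling(ts_a, ts_b, window=20, step=5):
    """Sliding-window Pearson correlation between the same region's time series in
    two simultaneously scanned participants; high-correlation windows flag periods
    of coherent activation that task events alone might not reveal."""
    out = []
    for start in range(0, len(ts_a) - window + 1, step):
        a = ts_a[start:start + window]
        b = ts_b[start:start + window]
        out.append((start, float(np.corrcoef(a, b)[0, 1])))
    return out

rng = np.random.default_rng(1)
shared = np.sin(np.linspace(0, 6 * np.pi, 200))               # a shared task-driven component
subj1 = shared + 0.5 * rng.standard_normal(200)               # simulated BOLD signal, subject 1
subj2 = shared + 0.5 * rng.standard_normal(200)               # simulated BOLD signal, subject 2
coupled_windows = [w for w, r in windowed_coupling(subj1, subj2) if r > 0.6]
print("windows with coherent activation start at samples:", coupled_windows[:5])
```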
Therapeutic applications
In addition to basic research into brain function, sev-
eral researchers have demonstrated the effectiveness
of VR for therapeutic applications. VR has successfully
been applied in at least three domains: psychiatric dis-
orders, pain management and neurorehabilitation. VR
offers some distinct advantages over standard therapies,
including precise control over the degree of exposure to
therapeutic scenarios (for example, treating fear of flying
without requiring patients to fly in a plane), the possibil-
ity of tailoring scenarios to individual patient needs, and
even the capacity to provide therapies that might other-
wise be impossible. For example, one team has included
artefacts and images that are directly related to a person’s
past inside virtual environments43.
Psychiatric disorders. VR offers a controlled user experi-
ence that is akin to dosage control in psychiatric treat-
ments, along with a potentially high degree of realism
to bolster the transfer of results to the real world (FIG. 2).
VR treatment has been applied to a range of disorders,
including fear conditioning44, anxiety disorders45 and
brain damage46.
One of the most widely explored applications of
VR to psychiatric rehabilitation is in the area of phobia
treatment. Phobias are commonly treated with exposure
therapy, which systematically introduces a feared object
or situation to the patient, beginning with a small ‘dose’,
such as imagining the phobic stimulus, and graduating
to more anxiety-provoking situations. Over time, the
patient may gain a sense of control over the environ-
ment and thus over their fear. VR has the potential to
solve many problems that are common to real-world
exposure therapy and has generally produced favourable
results19,47. The virtual environment permits therapists
to adjust the degree of exposure and attain a high level
of consistency across sessions. In addition, therapy
involving real-world exposure (for example, handling
real spiders) is simply not an option for some patients.
Simulations may also provide easier access to difficult-
to-arrange real-world situations (such as airplane flights,
or facing animals or large audiences).
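In practice, the ‘dosage control’ that VR exposure therapy offers can be driven by the patient's own distress ratings: the scenario is intensified when distress is low, held steady in a productive range and eased back when it becomes overwhelming. The sketch below illustrates such a rule with assumed thresholds and a 0–10 rating scale; it is a schematic illustration, not a validated clinical protocol.

```python
def next_exposure_level(current_level, distress_rating, max_level=10,
                        comfortable_below=4, overwhelmed_above=7):
    """Adjust the intensity of the next virtual exposure (for example, floor height in
    an acrophobia scenario) from a 0-10 subjective distress rating: step up once the
    patient is comfortable, hold in the productive middle range, step back if overwhelmed."""
    if distress_rating < comfortable_below:
        return min(current_level + 1, max_level)
    if distress_rating > overwhelmed_above:
        return max(current_level - 1, 0)
    return current_level

level = 1
for rating in [2, 3, 5, 8, 6, 3]:        # simulated end-of-trial distress ratings
    level = next_exposure_level(level, rating)
print("exposure level after six trials:", level)
```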
Several studies have compared outcomes from real-
world and VR exposure therapy for acrophobics43,48,49
using real locales (for example, a rooftop or balcony)
and VR equivalents43,48,49. Although overall anxiety was
slightly lower in virtual environments, the amount of
decline in anxiety from pre- to post-test was similar for
real and virtual locales. Studies of VR exposure treat-
ment for other phobias, such as arachnophobia45,47, fear
of flying (aviophobia)50,51, agoraphobia52–54, claustropho-
bia45,47 and fear of public speaking (glossophobia)47,55,
have produced similar results47. It remains to be seen
whether VR will ultimately be able to produce the stress
levels — and improvements in patients’ phobias — that
are equivalent to those produced by real-world exposure
therapy; however, currently available VR may be a valu-
able therapeutic starting point for those who are unable
to withstand the greater stress of real-world exposure
therapy.
VR therapy has been compared with another com-
mon type of therapy — imaginal exposure — in patients
with a fear of flying56,57. A major problem with imaginal
exposure is that not all patients can imagine the stress-
ful situation realistically enough to inspire high anxiety.
In these studies, VR was found to aid in recreating the
psychological experience of flying, and patients experi-
encing VR exhibited more anxiety and a correspond-
ingly greater decline in anxiety over time than patients
undergoing imaginal therapy.
Imaginal exposure therapy is also used to treat
post-traumatic stress disorder (PTSD), subject to the
same limitations described above. In a case study of
PTSD brought on by exposure to the terrorist attacks
of 11September 2001 (REFS58,59), researchers found a
large reduction in PTSD symptoms in survivors after
VR exposure therapy. They report a 90% reduction in
symptoms after six (approximately 1-hour-long) VR
sessions over several weeks. Encouraged by results like
these, VR is increasingly being used in hospitals to evalu-
ate soldiers on active duty and to diminish the response
to traumatic memories and environments in returning
soldiers60–68.
Pain remediation. One neurological application of VR
is to aid analgesia. Virtual environments provide per-
ceptual representations of one’s body and the world that
can shift the patient’s attention and slightly alter the
perceived properties of pain69–72. VR pain relief results
from VR’s capacity for multimodal stimulation and
interactivity.
For example, consider Ramachandran’s famous
demonstration of phantom limb pain reduction using a
mirror box to provide visual input from the remaining
symmetric limb73. This provides visual feedback that is
Figure 2 | Examples of virtual environments for therapeutic application.
a | An example of a simulation used in exposure therapy to treat fear of flying. This type of
simulation allows observers to experience the sensation of flying in a commercial jet,
including turbulence and landing, from a first-person perspective. Other examples
include simulations for acrophobia, public speaking and cue reactivity (reaction to
drug-related environmental cues). b | A simulation used in pain remediation. Specifically,
it has been applied to burn victims for distraction to reduce the pain of bandage
changing. A user navigates the environment, which is designed to conjure thoughts of
cold, during treatment. The distraction created with this simulation has yielded
impressive pain reduction results, over and above the pain reduction produced by opioid
pain medications. Image a is courtesy of WorldViz. Image b is courtesy of Stephen
Dagadakis © Hunter Hoffman (Worldbuilding by Jeff Bellinghausen and Chuck Walter,
Brian Stewart, Howard Abrams and Duff Hendrickson).
analogous to moving the missing limb to a more com-
fortable position. However, the effect may be limited
because sensorimotor signals from the non-amputee
side of the brain are activated rather than the disor-
dered signals occurring on the actual amputation side.
The problem can be remedied using a VR version of this
treatment. By placing location sensors on the limb stump
and allowing the patient to move it, the correct side of
the brain receives kinaesthetic feedback while the visual
system receives feedback of a virtual limb moving to a
more comfortable position74,75.
The sense of ‘presence’ afforded by virtual environ-
ments also seems to underlie effective analgesia. A widely
publicized application has been the use of VR for pain
remediation in patients with burns. Interacting with
a virtual winter terrain (FIG. 2) has reduced subjective
pain in burn victims by inducing thoughts of ‘cold’76,77.
It must be pointed out, however, that this pain reduction
is limited to the period during which the patient is
actually engaged with the VR environment and does not
seem to persist once the session ends. Although not yet widespread, this
approach has found application in some hospital settings.
Maximizing sensory immersion and hence one’s
sense of presence in the virtual environment seems to
strengthen analgesic effectiveness. A review of studies
of VR pain analgesia for burn victims finds that more
highly immersive VR equipment (for example, a high-
end head-mounted display (HMD)) corresponds with
greater levels of relief78. There is little evidence of pain
relief when patients view VR stimuli on a computer
screen, in monoscopic three-dimensional video or using
a limited-field-of-view HMD. Furthermore, the content
of the VR simulation must be compelling (for example,
virtual scuba diving versus strolling around a virtual
room) to be effective for pain relief.
VR’s interactivity also has a role. In a recent study,
participants either passively watched video game foot-
age through an HMD or actively played the game while
experiencing a cold pressor to induce discomfort79.
Those actually playing the game were able to tolerate
higher levels of discomfort. Both immersion and interac-
tivity have the effect of increasing presence, and indeed
several researchers have demonstrated that increased
presence correlates with more effective pain relief80–82.
Neurorehabilitation. Neurorehabilitation applications
have been focused on two areas: balance disorders
and their underlying multisensory integration mecha-
nisms83,84, and recovery of function after stroke11,41,85,86.
VR simulations can be highly engaging, which provides
crucial motivation for rehabilitative applications that
require consistent, repetitive practice. Furthermore, the
tracking systems used in VR provide an excellent tool for
recording and following minute changes and improve-
ments over time. Indeed, immersive multimodal VRs
that link head, hand and body movement to changes in
visual and auditory stimuli have proven useful for the
recovery of motor function and postural stability83,84,87,88.
The timing of multimodal stimulation has been linked
to recovery from postural and gait disorders. Several
studies have shown that postural sway exhibits greater
variance with age and in patients with balance disorders.
Selective modification of one or more sensory channels
has been found to reduce the amount of variance exhib-
ited. This has been done by presenting selectively timed
tactile and visual motion cues. Specifically, training with
synchronized haptic, auditory and visual cues has been
shown to foster reductions in unintended postural sway
over trials, particularly in patients with acquired brain
injuries, such as stroke89. There is evidence that VR helps
to engage primary and secondary motor areas related to
recovery of muscle control after stroke90,91.
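Because the same trackers that drive the simulation also log position continuously, a simple sway summary (the variance of displacement about the mean position) can be computed per trial and followed across training. The sketch below assumes two-dimensional tracker samples and is meant only to illustrate the kind of measure such studies report.

```python
import numpy as np

def sway_variance(xy_samples):
    """Total variance of medio-lateral and anterior-posterior displacement about the
    mean position: a simple summary of postural sway from tracker samples."""
    xy = np.asarray(xy_samples, dtype=float)
    return float(xy.var(axis=0).sum())

# Simulated head-tracker traces (metres) for an early and a late training trial.
rng = np.random.default_rng(2)
early_trial = rng.normal(0.0, 0.02, size=(500, 2))
late_trial = rng.normal(0.0, 0.01, size=(500, 2))
print("sway variance early:", round(sway_variance(early_trial), 6))
print("sway variance late: ", round(sway_variance(late_trial), 6))
```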
Similar studies have examined children with gait dis-
order due to cerebral palsy92. After walking on a track
while observing a virtual tile floor for 20 minutes, par-
ticipants showed improvements in walking speed and
stride length, particularly those with the lowest baseline
speed and stride lengths (as measured before the VR
task). Similar trends have been reported for patients
with gait disturbances related to multiple sclerosis93.
Again, the results seem to indicate that the timing of
multimodal stimulation in VR (seeing a virtual tile floor
under foot while hearing footsteps on the floor) pro-
vides feedback that helps the patient to understand that
they are currently walking steadily, and helps the brain
to bypass damaged areas to some extent in those cases
where the sensorimotor vividness of the environment
engages reflexive responses.
VR has also found promise in stimulating the recov-
ery of function in patients who have suffered a stroke.
To interact with the virtual environment, patients were
given a force-feedback-enabled data glove (containing
an exoskeleton of computer-controlled finger actua-
tors that modify forces to simulate surface resistance),
and after 2weeks of desktop-VR tasks, improvements
in individual finger control, thumb and finger range of
motion, and thumb and finger speed were observed94,95.
These results were retained after a week, highlighting the
benefits of VR for rehabilitation. These authors attrib-
ute much of the improvement to increased motivation
to engage in rehabilitative exercises. The exercises are
embedded in a real-world context or a game and can be
more engaging than a sterile medical office. Several stud-
ies on the use of VR for upper-body exercise with feed-
back (for example, visual, auditory or haptic information
indicating how close a patient has come to a desired
performance goal) show significant improvement in the
movement, use and control of patients’ hands, relative to
baseline and to other rehabilitation approaches such as
patient-guided exercise and group physical therapy96,97.
Some caveats. Although the studies reviewed above
point to the promise of VR therapy for psychiatric reha-
bilitation, there are limitations. For one, cost remains
an issue. Depending on the situation, an immersive
VR system can cost tens of thousands of US dollars,
although many of the studies reported here highlight
results obtained with commonly available computer
equipment. Another problem is the lack of standardiza-
tion of VR solutions. A more uniform approach to VR
system design would probably simplify and speed up the
adoption of VR therapies. This problem may soon be
resolved, as companies are striving to offer turnkey VR
systems. However, perhaps the most substantial problem
is the programming requirements for making and modi-
fying virtual environments. This is a major roadblock to
widespread adoption, although standardization of VR
content would greatly ameliorate the problem, and this
situation too is gradually improving.
Although there is healthy growth in the use of VR
solutions among researchers, and a growing body of
supporting results, members of the mainstream medi-
cal community are probably still many years from
widespread adoption. This may change once sufficient
clinical evidence has been accrued and systems are sim-
ple and robust to use. BOX 3 contains a more complete
discussion of VR’s current limitations.
Connecting the brain to virtual environments
Most applications of VR in neuroscience focus on influ-
encing and measuring changes in brain activity, but
another application is the creation of brain–computer
interfaces that establish a direct link between the nerv-
ous system and virtual environment properties98. For
example, electrical recordings from the central nervous
system and muscle activity can be used to control digital
objects (ordinarily done by using a joystick or mouse)
inVR.
Brain–computer interfaces range in their degree of
invasiveness. Implantable brain–machine interfaces,
such as those being developed with non-human pri-
mates99,100 and humans101,102, use virtual environments
as a medium to present movement-related feedback in
a closed-loop system (for example, for training primates
to reach and grasp virtual objects with a robotic arm
and for training quadriplegics to manipulate virtual
switches to control aspects of the environment, respec-
tively). Some brain–computer interface developers use
EEG to achieve similar results103. Recent research has
shown that humans have a remarkable ability to learn
to focus attention and voluntarily influence activity in
MTL neurons (by increasing or decreasing their firing
rate) to control on-screen images104. These findings may
ultimately lead to technologies that respond to user
intentions105.
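A minimal example of non-invasive brain-driven control is to map the power of an EEG rhythm onto a continuous control signal, for instance a cursor or virtual-object velocity. The sketch below computes 8–12 Hz band power from a short segment and scales it against an assumed resting baseline; real brain-computer interfaces rely on calibrated, subject-specific decoders rather than this fixed mapping.

```python
import numpy as np

def band_power(signal, fs, low, high):
    """Mean power of an EEG segment inside a frequency band, via the FFT periodogram."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    band = (freqs >= low) & (freqs <= high)
    return float(power[band].mean())

def cursor_velocity(signal, fs=250.0, baseline=1.0, gain=0.5):
    """Map 8-12 Hz power (relative to an assumed resting baseline) onto a
    one-dimensional cursor velocity, the simplest form of continuous EEG-driven control."""
    relative = band_power(signal, fs, 8.0, 12.0) / baseline
    return gain * (relative - 1.0)

rng = np.random.default_rng(3)
t = np.arange(0, 1.0, 1.0 / 250.0)
segment = np.sin(2 * np.pi * 10.0 * t) + 0.3 * rng.standard_normal(t.size)  # strong 10 Hz rhythm
print("cursor velocity:", round(cursor_velocity(segment, baseline=5.0), 3))
```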
A primary benefit of linking brains to VR environ-
ments is to provide safe practice environments. For
example, virtual environments can serve as a surrogate
for training patients to use neuromotor prosthetics
before attempting to use a new prosthetic in the real
world106.
Recent work aimed at developing methods of study-
ing sensorimotor disorders using a combination of
robotic arms, simple virtual environments and fMRI107
is further bridging the gap between brain–machine
interfaces and rehabilitation research. Participants lie in
an fMRI scanner and control a plastic robotic arm near
their waist, while viewing (on a monitor) graspable vir-
tual objects and the location of the virtual arm. Although
the number of papers and applications in this area is
increasing, the underlying neuroscientific principles
are just beginning to be understood108.
Conclusions and future trends
As we have discussed, VR makes it possible to examine
brain activity during dynamic, complex and realistic
situations. In applied domains such as rehabilitation,
VR methods continue to accrue validating results. This
trend should continue as these methods become widely
adopted and are extended to the study of different
neuroscience areas and a wider range of therapies.
The future of VR in neuroscience is strongly tied
to developments in technology that help to immerse
the user in convincing, life-like sensorimotor illu-
sions. For example, panoramic high-resolution HMDs
are now available with a field of view greater than 120
degrees (the human field of view encompasses nearly
180 degrees). However, the widespread adoption of
VR is likely to involve smaller, less-expensive systems,
and will be bolstered by the increasing proliferation of
consumer devices.
A likely trend for VR will be towards greater user
mobility. This will be valuable for understanding neural
activity ‘in the wild’ (that is, in the real world) and
for understanding the role of the body in cognitive per-
formance (that is, embodied cognition). To this end, a
promising variant of VR that lends itself to movement
outside the laboratory is known as augmented reality
(AR). In AR, the user views the real world through a
display (either head-mounted or screen-based) equipped
with a position-sensing device and a camera (although
some HMDs allow the user to see directly through,
rather like a pair of sunglasses). This view of the world
is augmented by the addition of computer-generated,
location-specific objects and information (that is, digi-
tal items are overlaid on the natural environment where
and when they are needed). AR may prove valuable for
spatial cognition research, as it allows participants to
navigate in real-world locales while learning the loca-
tions of virtual landmarks and features. These virtual
details could easily be removed or changed to evaluate
what has been learned. This would also allow researchers
to study the contributions of body movement through
a space of unlimited size. AR may also prove valuable
for rehabilitation. For example, displaying a virtual
limb whose movements are to be matched by a patient
trying to recover function could allow for incremental
challenge and heighten motivation to practice.
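To illustrate the kind of computation an AR display performs when anchoring a virtual landmark to a real-world location, the sketch below projects a landmark's world coordinates into screen coordinates from the user's tracked position and heading, using a simple pinhole camera model. The axis convention, focal length and display resolution are assumptions for the example, not parameters of any particular AR system.

```python
"""Illustrative sketch (not from any specific AR toolkit) of overlaying a
location-specific virtual landmark: the landmark is transformed into the
user's head frame from the tracked pose, then projected to pixels."""
import numpy as np

def project_landmark(landmark_xyz, head_xyz, yaw_rad,
                     focal_px=800.0, width=1280, height=720):
    """Return (u, v) pixel coordinates of the landmark, or None if it lies
    behind the viewer. Convention: x rightward, y upward, z viewing direction."""
    c, s = np.cos(yaw_rad), np.sin(yaw_rad)
    # World-to-head rotation for a head turned yaw_rad about the vertical axis.
    world_to_head = np.array([[  c, 0.0,  -s],
                              [0.0, 1.0, 0.0],
                              [  s, 0.0,   c]])
    p = world_to_head @ (np.asarray(landmark_xyz, float) -
                         np.asarray(head_xyz, float))
    if p[2] <= 0.0:
        return None                     # landmark is behind the user: do not draw
    u = width / 2.0 + focal_px * p[0] / p[2]
    v = height / 2.0 - focal_px * p[1] / p[2]
    return u, v

# A landmark 10 m ahead and 2 m to the right of a user at the origin (yaw = 0).
print(project_landmark([2.0, 0.0, 10.0], [0.0, 0.0, 0.0], yaw_rad=0.0))
```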
Technologies for observing neural activity while the
subject is mobile already exist. For example, functional
near-infrared imaging (fNIR) is an evolving technology
that, like fMRI, measures cortical activity; however,
whereas fMRI requires a strong magnet, fNIR relies
on beams of light, making it small and mobile. Near-
infrared light is delivered to the surface of the scalp, and
light scatter varies depending on oxygenation level in
the assessed regions, providing an indirect measure of
neural activity. Although limited to a few centimetres
of cerebral cortex, and not as precise as fMRI, fNIR is
promising for brain–computer interface technology,
as well as for use in adaptive training simulations that
adjust to the user’s cognitive or emotional state in real
time109.
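The principle behind such fNIR-driven adaptation can be sketched with the modified Beer–Lambert law: optical density changes at two wavelengths are converted into oxy- and deoxy-haemoglobin concentration changes, and the resulting oxygenation estimate can feed a simple difficulty rule. The extinction coefficients, pathlength factor and thresholds below are illustrative placeholders rather than values from any published system.

```python
"""Hedged illustration of fNIR-based adaptive training. The modified
Beer-Lambert law is used to recover haemoglobin concentration changes
from two-wavelength optical density changes; all numerical constants
are placeholders chosen only for the example."""
import numpy as np

# Rows: wavelengths (roughly 760 nm and 850 nm); columns: HbO2, HbR.
EXTINCTION = np.array([[1.4, 3.8],    # ~760 nm: deoxy-haemoglobin dominates
                       [2.5, 1.8]])   # ~850 nm: oxy-haemoglobin dominates
SOURCE_DETECTOR_CM = 3.0
PATHLENGTH_FACTOR = 6.0               # differential pathlength factor (assumed)

def haemoglobin_changes(delta_od_760, delta_od_850):
    """Solve delta_OD = E @ [dHbO2, dHbR] * distance * DPF for both chromophores."""
    rhs = (np.array([delta_od_760, delta_od_850])
           / (SOURCE_DETECTOR_CM * PATHLENGTH_FACTOR))
    d_hbo2, d_hbr = np.linalg.solve(EXTINCTION, rhs)
    return d_hbo2, d_hbr

def adapt_difficulty(d_hbo2, level, high=0.02, low=0.005):
    """Toy rule: ease the task when the activation proxy suggests high workload,
    make it harder when the proxy is low."""
    if d_hbo2 > high:
        return max(1, level - 1)
    if d_hbo2 < low:
        return level + 1
    return level

d_hbo2, d_hbr = haemoglobin_changes(delta_od_760=0.01, delta_od_850=0.03)
print(f"dHbO2={d_hbo2:.4f}, dHbR={d_hbr:.4f}, next level={adapt_difficulty(d_hbo2, 3)}")
```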
In addition to fNIR, EEG systems can be used in mobile contexts. In fact, several companies have developed wireless EEG systems for use in brain–computer interaction applications (for example, wheelchair control or video-game control). These systems are very inexpensive and may prove robust enough to be used in a variety of research applications (for example, brain–computer interface development). However, the precision of these devices is currently an open question.
Other likely developments include improvements to sensory displays that have lagged behind visual and auditory displays, including displays for taste, smell and touch. This area has received limited attention over the past 20 years110–112, with many challenges still ahead113. However, research in this area is on the rise. For example, researchers are currently working on a head-worn system that can supply smell and taste to the audio-visual experience, potentially creating a new level of immersion and presence114.
Overall, the trend appears to be for computer-generated media and consumer devices to become more like the VR systems once found only in advanced VR and simulation laboratories. As these software and hardware components become ubiquitous, VR may increasingly be viewed as an ordinary part of neuroscience research and therapy.
1. Loomis, J.M. & Blascovich, J.J. Immersive virtual
environment technology as a basic research tool in
psychology. Behav. Res. Methods Instrum. Comput.
31, 557–564 (1999).
2. Tarr, M.J. & Warren, W. H. Virtual reality in behavioral
neuroscience and beyond. Nature Neurosci. 5,
1089–1092 (2002).
3. Schultheis, M.T. & Rizzo, A.A. The application of
virtual reality technology in rehabilitation. Rehabil.
Psychol. 46, 296–311 (2001).
4. Holden, M.K. Virtual environments for motor
rehabilitation: review. Cyberpsychol. Behav. 8,
187–211 (2005).
5. Rizzo, A.A. & Kim, G.J. A SWOT analysis of the field of
virtual reality rehabilitation and therapy. Presence 14,
119–146 (2005).
6. Sveistrup, H. Motor rehabilitation using virtual reality.
J.Neuroeng. Rehabil. 1, 10 (2004).
7. Henderson, A., Korner-Bitensky, N. & Levin, M. Virtual
reality in stroke rehabilitation: a systematic review of
its effectiveness for upper limb motor recovery. Top.
Stroke Rehabil. 14, 52–61 (2007).
8. Adamovich, S.V., Fluet, G.G., Tunik, E. & Merians,
A.S. Sensorimotor training in virtual reality: a review.
NeuroRehabilitation 25, 29–44 (2009).
9. Biocca, F. & Levy, M. Communication in the Age of
Virtual Reality (Lawrence Erlbaum Associates,
Hillsdale, 1995).
10. Gibson, J.J. The Senses Considered as Perceptual
Systems (Houghton-Mifflin, Boston, 1966).
11. Henderson, J. & Hollingsworth, A. The role of fixation
position in detecting scene changes across saccades.
Psychol. Sci. 10, 438–443 (1999).
12. Astur, R. etal. fMRI hippocampal activity during a
virtual radial arm maze. Appl. Psychophysiol.
Biofeedback 30, 307–317 (2005).
By combining a virtual radial arm maze with fMRI,
this paper shows that human navigation may rely
on frontal cortex activity in addition to
hippocampal activity.
13. Shipman, S. & Astur, R. Factors affecting the
hippocampal BOLD response during spatial memory.
Behav. Brain Res. 187, 433–441 (2008).
14. Bohbot, V., Lerch, J., Thorndycraft, B., Iaria, G. &
Zijdenbos, A. Gray matter differences correlate with
spontaneous strategies in a human virtual navigation
task. J.Neurosci. 27, 10078–10083 (2007).
Using a virtual radial maze to study human
navigation strategies, this paper shows that
individual differences in amount of hippocampal
and caudate grey matter correspond to preferred
navigation strategy.
15. Driscoll, I., Hamilton, D., Yeo, R., Brooks, W. &
Sutherland, R. Virtual navigation in humans: the
impact of age, sex, and hormones on place learning.
Horm. Behav. 47, 326–335 (2005).
16. Moffat, S., Kennedy, K., Rodrigue, K. & Raz, N.
Extrahippocampal contributions to age differences in
human spatial navigation. Cereb. Cortex 17,
1274–1282 (2007).
This study uses a virtual water maze to study age
differences in human navigation, and suggests an
age-related shift towards a non-spatial strategy to
compensate for changes in hippocampal activity.
17. Voermans, N. etal. Interaction between the human
hippocampus and the caudate nucleus during route
recognition. Neuron 43, 427–435 (2004).
18. Frings, L. etal. Lateralization of hippocampal activation
differs between left and right temporal lobe epilepsy
patients and correlates with postsurgical verbal learning
decrement. Epilepsy Res. 78, 161–170 (2008).
19. Frings, L. etal. Gender-related differences in
lateralization of hippocampal activation and cognitive
strategy. Neuroreport 17, 417–421 (2006).
20. Ekstrom, A. etal. Cellular networks underlying human
spatial navigation. Nature 425, 184–187 (2003).
Using a virtual navigation task, this study records
place fields in the human hippocampus.
21. Weidemann, C., Mollison, M. & Kahana, M.
Electrophysiological correlates of high-level perception
during spatial navigation. Psychon. Bull. Rev. 16,
313–319 (2009).
22. Jacobs, J. etal. Right-lateralized brain oscillations in
human spatial navigation. J.Cogn. Neurosci. 22,
824–836 (2010).
23. Jacobs, J., Kahana, M., Ekstrom, A., Mollison, M. &
Fried, I. A sense of direction in human entorhinal cortex.
Proc. Natl Acad. Sci. USA 107, 6487–6492 (2010).
24. Nowak, N. T., Resnick, S. M., Elkins, W. & Moffat, S. D.
Sex differences in brain activation during virtual
navigation: a functional MRI study. Proc. of the 33rd
Annual Meeting of the Cognitive Science Soc. (Boston,
Massachusetts, USA) [online], http://csjarchive.cogsci.
rpi.edu/proceedings/2011/papers/0638/paper0638.
pdf (2011).
25. Gray, J., Pawlowski, V. & Willis, M. A method for
recording behavior and multineuronal CNS activity
from tethered insects flying in virtual space.
J.Neurosci. Methods 120, 211–223 (2002).
This paper describes one of the first successful
attempts at creating a VR system for studying flight
behaviour and neural activity in tethered insects.
26. Fry, S., Rohrseitz, N., Straw, A. & Dickinson, M.
TrackFly: virtual reality for a behavioral system
analysis in free-flying fruit flies. J.Neurosci. Methods
171, 110–117 (2008).
This paper describes a free-flight VR environment
designed for studying the flight behaviour of
untethered insects.
27. Fry, S.N. etal. Context-dependent stimulus
presentation to freely moving animals in 3D.
J.Neurosci. Methods 135, 149–157 (2004).
28. Holscher, C., Schnee, A., Dahmen, H., Setia, L. &
Mallot, H.A. Rats are able to navigate in virtual
environments. J.Exp. Biol. 208, 561–569 (2005).
This paper details a VR system for studying rodent
navigation and demonstrates for the first time that
rats can learn spatial tasks in a virtual
environment.
29. Harvey, C.D., Collman, F., Dombeck, D.A. & Tank, D.W.
Intracellular dynamics of hippocampal place cells during
virtual navigation. Nature 461, 941–946 (2009).
This study combines invivo neural recording with a
track-ball VR system for studying rodent
navigation, and reports hippocampal place-cell
activity during movement.
30. Dombeck, D.A., Harvey, C. D., Tian, L., Looger, L.L. &
Tank, D.W. Functional imaging of hippocampal place
cells at cellular resolution during virtual navigation.
Nature Neurosci. 13, 1433–1440 (2010).
31. Slater, M., Spanlang, B., Sanchez-Vives, M.V. &
Blanke, O. First person experience of body transfer in
virtual reality. PLoS ONE 5, e10564 (2010).
This paper demonstrates the power of VR for
providing simultaneous realism and control. The
authors find that viewer-perspective is more
important than visuotactile stimulation in
producing the body-transfer illusion.
32. Botvinick, M. & Cohen, J. Rubber hands ‘feel’ touch
that eyes see. Nature 391, 756 (1998).
33. Ehrsson, H.H. The experimental induction of out-of-
body experiences. Science 317, 1048 (2007).
34. Lenggenhager, B., Tadi, T., Metzinger, T. & Blanke, O.
Video ergo sum: manipulating bodily self-
consciousness. Science 317, 1096–1099 (2007).
In this influential paper, the authors demonstrate
that the body-transfer illusion can be produced for
full-body perception with virtual stimuli.
35. Slater, M., Usoh, M. & Steed, A. Taking steps: the
influence of a walking technique on presence in virtual
reality. ACM Trans. Comput. Hum. Interact. 2,
201–219 (1995).
36. Slater, M. & Steed, A. A virtual presence counter.
Presence 9, 413–434 (2000).
37. Pelphrey, K.A. & Carter, E.J. Charting the typical and
atypical development of the social brain. Dev.
Psychopathol. 20, 1081–1102 (2008).
38. Spiers, H. & Maguire, E. Spontaneous mentalizing
during an interactive real world task: an fMRI study.
Neuropsychologia 44, 1674–1682 (2006).
39. Slater, M. etal. A virtual reprise of the Stanley Milgram
obedience experiments. PLoS ONE 1, e39 (2006).
40. Cheetham, M., Pedroni, A.F., Antley, A., Slater, M. &
Jancke, L. Virtual Milgram: empathic concern or personal
distress? Evidence from functional MRI and dispositional
measures. Front. Hum. Neurosci. 3, 29 (2009).
41. Botvinick, M. etal. Viewing facial expressions in pain
engages cortical areas involved in the direct
experience of pain. Neuroimage 25, 312–319 (2005).
42. Montague, P.R., Berns, G.S. & Cohen, J.D.
Hyperscanning: simultaneous fMRI during linked social
interactions. NeuroImage 16, 1159–1164 (2002).
43. Riva, G. etal. Interreality in practice: bridging virtual
and real worlds in the treatment of posttraumatic
stress disorders. Cyberpsychol. Behav. Soc. Netw. 13,
55–65 (2010).
44. Alvarez, R.P., Johnson, L. & Grillon, C. Contextual-
specificity of short-delay extinction in humans: renewal
of fear-potentiated startle in a virtual environment.
Learn. Mem. 14, 247–253 (2007).
45. Gorini, A. & Riva, G. Virtual reality in anxiety
disorders: the past and the future. Expert Rev.
Neurother. 8, 215–233 (2008).
46. Rose, F.D., Brooks, B.M. & Rizzo, A.A. Virtual reality
in brain damage rehabilitation: a review.
CyberPsychol. Behav. 8, 241–262 (2005).
47. Riva, G. Virtual reality in psychotherapy: review.
CyberPsychol. Behav. 8, 220–230 (2005).
48. Emmelkamp, P.M., Bruynzeel, M., Drost, L. & van der
Mast, C.A. Virtual reality treatment in acrophobia: a
comparison with exposure invivo. CyberPsychol.
Behav. 4, 335–339 (2001).
49. Emmelkamp, P.M. etal. Virtual reality treatment
versus exposure invivo: a comparative evaluation in
acrophobia. Behav. Res. Ther. 40, 509–516 (2002).
This paper demonstrates that VR exposure therapy
rivals insitu exposure therapy for acrophobia, and
that the results can be achieved with low-cost,
readily available equipment.
50. Maltby, N., Kirsch, I., Mayers, M. & Allen, G.J. Virtual
reality exposure therapy for the treatment of fear of
flying: a controlled investigation. J.Consult. Clin.
Psychol. 70, 1112–1118 (2002).
51. Rothbaum, B.O., Hodges, L., Smith, S., Lee, J.H. &
Price, L. A controlled study of virtual reality exposure
therapy for the fear of flying. J.Consult. Clin. Psychol.
68, 1020–1026 (2000).
52. Viaud-Delmon, I., Warusfel, O., Seguelas, A., Rio, E. &
Jouvent, R. High sensitivity to multisensory conflicts in
agoraphobia exhibited by virtual reality. Eur.
Psychiatry 21, 501–508 (2006).
53. Cardenas, G., Munoz, S., Gonzalez, M. & Uribarren, G.
Virtual reality applications to agoraphobia: a protocol.
CyberPsychol. Behav. 9, 248–250 (2006).
54. Vincelli, F. etal. Virtual reality assisted cognitive
behavioral therapy for the treatment of panic
disorders with agoraphobia. Stud. Health Technol.
Inform. 85, 552–559 (2002).
55. de Carvalho, M.R., Freire, R.C. & Nardi, A.E. Virtual
reality as a mechanism for exposure therapy. World
J.Biol. Psychiatry 11, 220–230 (2010).
56. Reger, G. etal. Effectiveness of virtual reality exposure
therapy for active duty soldiers in a military mental
health clinic. J.Trauma. Stress 24, 93–96 (2011).
57. Wiederhold, B.K. etal. The treatment of fear of flying:
a controlled study of imaginal and virtual reality
graded exposure therapy. IEEE Trans. Inf. Technol.
Biomed. 6, 218–223 (2002).
58. Difede, J., Hoffman, H. & Jaysinghe, N. Innovative use
of virtual reality technology in the treatment of PTSD
in the aftermath of September 11. Psychiatr. Serv. 53,
1083–1085 (2002).
59. Difede, J. etal. Virtual reality exposure therapy for the
treatment of posttraumatic stress disorder following
September 11, 2001. J.Clin. Psychiatry 68,
1639–1647 (2007).
60. Wood, D.P. etal. Combat-related post-traumatic
stress disorder: a case report using virtual reality
graded exposure therapy with physiological
monitoring with a female Seabee. Mil. Med. 174,
1215–1222 (2009).
61. Reger, G.M., Gahm, G.A., Rizzo, A.A., Swanson, R. &
Duma, S. Soldier evaluation of the virtual reality Iraq.
Telemed. J.e-Health 15, 101–104 (2009).
62. Macedonia, M. Virtual worlds: a new reality for
treating post-traumatic stress disorder. IEEE Comput.
Graph. Appl. 29, 86–88 (2009).
63. Gorrindo, T. & Groves, J.E. Computer simulation and
virtual reality in the diagnosis and treatment of
psychiatric disorders. Acad. Psychiatry 33, 413–417
(2009).
64. Wood, D.P. etal. Combat related post traumatic
stress disorder: a multiple case report using virtual
reality graded exposure therapy with physiological
monitoring. Stud. Health Technol. Inform. 132,
556–561 (2008).
65. Reger, G.M. & Gahm, G.A. Virtual reality exposure
therapy for active duty soldiers. J.Clin. Psychol. 64,
940–946 (2008).
66. Parsons, T.D. & Rizzo, A.A. Affective outcomes of
virtual reality exposure therapy for anxiety and
specific phobias: a meta-analysis. J.Behav. Ther. Exp.
Psychiatry 39, 250–261 (2008).
67. Gerardi, M., Rothbaum, B.O., Ressler, K., Heekin, M.
& Rizzo, A. Virtual reality exposure therapy using a
virtual Iraq: case report. J.Trauma. Stress 21,
209–213 (2008).
68. Beck, J.G., Palyo, S.A., Winer, E.H., Schwagler, B.E.
& Ang, E.J. Virtual reality exposure therapy for PTSD
symptoms after a road accident: an uncontrolled case
series. Behav. Ther. 38, 39–48 (2007).
69. Rutter, C.E., Dahlquist, L.M. & Weiss, K.E. Sustained
efficacy of virtual reality distraction. J.Pain 10,
391–397 (2009).
70. Mahrer, N.E. & Gold, J.I. The use of virtual reality for
pain control: a review. Curr. Pain Headache Rep. 13,
100–109 (2009).
71. Gold, J.I., Belmont, K.A. & Thomas, D.A. The
neurobiology of virtual reality pain attenuation.
CyberPsychol. Behav. 10, 536–544 (2007).
72. Magora, F., Cohen, S., Shochina, M. & Dayan, E.
Virtual reality immersion method of distraction to
control experimental ischemic pain. Isr. Med. Assoc. J.
8, 261–265 (2006).
73. Ramachandran, V.S. & Rogers-Ramachandran, D.
Synaesthesia in phantom limbs induced with mirrors.
Proc. R.Soc. Lond. B 263, 377–386 (1996).
74. Murray, C., Patchick, E., Caillette, F., Howard, T. &
Pettifer, S. Can immersive virtual reality reduce
phantom limb pain? Stud. Health Technol. Inform.
119 , 407–412 (2006).
75. Cole, J., Crowle, S., Austwick, G. & Slater, D.H.
Exploratory findings with virtual reality for phantom
limb pain: from stump motion to agency and
analgesia. Disabil. Rehabil. 31, 846–854 (2009).
76. Hoffman, H.G. etal. Water-friendly virtual reality pain
control during wound care. J.Clin. Psychol. 60,
189–195 (2004).
This study involved burn victims, and showed that
patients interacting with a virtual environment
designed to induce thoughts of ‘cold’ reported less
pain than control patients.
77. Hoffman, H.G. etal. Modulation of thermal-pain
related brain activity with virtual reality: evidence from
fMRI. NeuroReport 15, 1245–1248 (2004).
78. Malloy, K.M. & Milling, L.S. The effectiveness of
virtual reality distraction for pain reduction: a
systematic review. Clin. Psychol. Rev. 30, 1011–1018
(2010).
79. Law, E.F. etal. Videogame distraction using virtual
reality technology for children experiencing cold pressor
pain: the role of cognitive processing. J.Pediatr.
Psychol. 23Jul 2010 (doi:10.1093/jpepsy/jsq063).
80. Gutierrez-Maldonado, J., Gutierrez-Martinez, O.,
Loreto, D., Penaloza, C. & Nieto, R. Presence,
involvement and efficacy of a virtual reality
intervention on pain. Stud. Health Technol. Inform.
154, 97–101 (2010).
81. Wender, R. etal. Interactivity influences the magnitude
of virtual reality analgesia. J.Cyber. Ther. Rehabil. 2,
27–33 (2009).
82. Hoffman, H.G. etal. Virtual reality pain control during
burn wound debridement in the hydrotank. Clin.
J.Pain 24, 299–304 (2008).
83. Jeka, J. Light touch contact: not just for surfers.
Neuromorphic Engineer 3, 5–6 (2006).
84. Jeka, J.J., Kiemel, T., Creath, R., Horak, F.B. &
Peterka, R. Controlling human upright stance: velocity
information is more accurate than position or
acceleration. J.Neurophysiol. 92, 2368–2379
(2004).
85. Cameirão, M. S., Badia, S. B., Oller, E. D. & Verschure,
P. F. M. J. Neurorehabilitation using the virtual reality
based Rehabilitation Gaming System: methodology,
design, psychometrics, usability and validation.
J.Neuroeng. Rehabil. 7, 48 (2010).
86. Gaggioli, A., Meneghini, A., Morganti, F., Alcaniz, M.
& Riva, G. A strategy for computer-assisted mental
practice in stroke rehabilitation. Neurorehabil. Neural
Repair 20, 503–507 (2006).
87. Earhart, G.M., Henckens, J.M., Carlson-Kuhta, P. &
Horak, F.B. Influence of vision on adaptive postural
responses following standing on an incline. Exp. Brain
Res. 203, 221–226 (2010).
88. Dozza, M., Horak, F.B. & Chiari, L. Auditory biofeedback
substitutes for loss of sensory information in maintaining
stance. Exp. Brain Res. 178, 37–48 (2007).
89. Holden, M.K., Dyar, T. A., Schwamm, L. & Bizzi, E.
Virtual-environment-based telerehabilitation in
patients with stroke. Presence 14, 214–233
(2005).
90. August, K. etal. fMRI analysis of neural mechanisms
underlying rehabilitation in virtual reality: activating
secondary motor areas. Conf. Proc. IEEE Eng. Med.
Biol. Soc. 3692–3695 (2006).
91. Adamovich, S.V., August, K., Merians, A.S. & Tunik, E.
A virtual reality-based system integrated with fMRI to
study neural mechanisms of activation observation-
execution: a proof of concept study. Restor. Neurol.
Neurosci. 27, 209–223 (2009).
92. Baram, Y. & Lenger, R. Virtual reality visual feedback
cues for gait improvement in children with gait
disorders due to cerebral palsy. Proc. of the 19th
Meeting of the European Neurological Soc. (Milan,
Italy) [online], http://registration.akm.ch/einsicht.
php?XNABSTRACT_ID=89948&XNSPRACHE_
ID=2&XNKONGRESS_ID=97&XNMASKEN_
ID=900 (2009).
93. Baram, Y. & Miller, A. Virtual reality cues for
improvement of gait in patients with multiple sclerosis.
Neurology 66, 178–181 (2006).
94. Merians, A.S., Poizner, H., Boian, R., Burdea, G. &
Adamovich, S. Sensorimotor training in a virtual
reality environment: does it improve functional
recovery poststroke? Neurorehabil. Neural Repair 20,
252–267 (2006).
95. Adamovich, S.V. etal. Design of a complex virtual
reality simulation to train finger motion for persons
with hemiparesis: a proof of concept study.
J.Neuroeng. Rehabil. 6, 28 (2009).
96. Henderson, A., Korner-Bitensky, N. & Levin, M. Virtual
reality in stroke rehabilitation: a systematic review of
its effectiveness for upper limb motor recovery. Top.
Stroke Rehabil. 14, 52–61 (2007).
97. Merians, A.S. etal. Virtual reality — augmented
rehabilitation for patients following stroke. Phys. Ther.
82, 898–915 (2002).
98. Lecuyer, A. etal. Brain-computer interfaces, virtual
reality, and videogames. Computer 41, 66–72 (2008).
99. Carmena, J.M. et al. Learning to control a brain-
machine interface for reaching and grasping by
primates. PLoS Biol. 1, e42 (2003).
100. Lebedev, M.A. & Nicolelis, M.A. Brain–machine
interfaces: past, present and future. Trends Neurosci.
29, 536–546 (2006).
101. Donoghue, J., Nurmikko, A., Friehs, G. & Black, M.
Development of a neuromotor prosthesis for humans.
Suppl. Clin. Neurophysiol. 57, 588–602 (2004).
102. Donoghue, J.P. Bridging the brain to the world: a
perspective on neural interface systems. Neuron 60,
511–521 (2008).
103. Wolpaw, J.R., McFarland, D.J., Vaughan, T.M. &
Schalk, G. The Wadsworth Center Brain-Computer
Interface (BCI) research and development program.
IEEE Trans. Neural Syst. Rehabil. Eng. 11, 204–207
(2003).
104. Cerf, M. etal. On-line, voluntary control of human
temporal lobe neurons. Nature 467, 1104–1108
(2010).
105. Ma, C. & He, J. A novel experimental system for
investigation of cortical activities related to lower limb
movements. Conf. Proc. IEEE Eng. Med. Biol. Soc. 1,
2679–2682 (2006).
106. Scott, S.H. Converting thoughts into action. Nature
442, 141–142 (2006).
107. Shadmehr, R. & Wise, S.P. The Computational
Neurobiology of Reaching and Pointing: A Foundation
for Motor Learning. (MIT Press, Cambridge, USA, 2005).
108. Helms-Tillery, S.I., Taylor, D.M. & Schwartz, A.B.
Training in cortical control of neuroprosthetic devices
improves signal extraction from small neuronal
ensembles. Rev. Neurosci. 14, 107–119 (2003).
109. Bunce, S.C., Izzetoglu, M., Izzetoglu, K. & Onaral, B.
Functional near-infrared spectroscopy: an emerging
neuroimaging modality. IEEE Eng. Med. Biol. Mag. 25,
54–62 (2006).
110. Barfield, W. & Danas, E. Comments on the use of
olfactory displays for virtual environments. Presence
5, 109–121 (1995).
111. Cater, J.P. The nose have it! Presence 1, 493–494
(1992).
112. Keller, P.E., Kouzes, R.T. & Kangas, L.J. in Interactive
Technology and the New Paradigm for Healthcare
(Studies in Health Technology and Informatics) (eds
Satava, R. M., Morgan, K., Sieburg, H. B.,
Mattheus, R. & Christensen, J. P.) 168–172 (IOS
Press, Washington DC, USA, 1995).
113. Yanagida, Y., Kawato, S., Noma, H., Tomono, A. &
Tetsutani, N. Projection-based olfactory display with
nose tracking. Proc. of the IEEE Virtual Reality Conf.
2004 [online], http://ieeexplore.ieee.org/xpl/freeabs_
all.jsp?arnumber=1310054 (2004).
114. Zimmer, H., Mecklinger, A. & Lindenberger, U. (eds)
Handbook of Binding and Memory: Perspectives from
Cognitive Neuroscience (Oxford Univ. Press, USA,
2006).
115. Cholewiak, R.W. & Collins, A.A. Vibrotactile pattern
discrimination and communality at several body sites.
Percept. Psychophys. 57, 724–737 (1995).
116. Krueger, M. Artificial Reality (Addison-Wesley, New
York, 1991).
117. Biocca, F. The cyborg’s dilemma: progressive
embodiment in virtual environments. J.Comput.
Mediat. Commun. 23Jun 2006
(doi:10.1111/j.1083-6101.1997.tb00070.x).
118. Lombard, M. & Ditton, T. At the heart of it all: the concept
of presence. J.Comput. Mediat. Commun. 23Jun 2006
(doi:10.1111/j.1083-6101.1997.tb00072.x).
119. Meehan, M., Insko, B., Whitton, M. & Brooks F. P. Jr.
Physiological measures of presence in stressful virtual
environments. ACM Trans. Graph. 21, 645–652 (2002).
Acknowledgements
This work was supported in part by grant number R31-10062
to F.A.B. from the World Class University (WCU) project of the
Korean Ministry of Education, Science and Technology
(MEST) and the Korea National Research Foundation (NRF)
through Sungkyunkwan University. The project was also sup-
ported in part by the AT&T and Newhouse endowments
awarded to F.A.B.
Competing interests statement
The authors declare no competing financial interests.
FURTHER INFORMATION
Corey J. Bohil’s homepage: http://psychology.cos.ucf.edu/
faculty_bohil.php