RESEARCH ARTICLE
Cognitive consequences of playing brain‐training games in
immersive virtual reality
Jocelyn Parong | Richard E. Mayer
Psychological and Brain Sciences Department,
University of California, Santa Barbara, Santa
Barbara, California
Correspondence
Jocelyn Parong, Psychological and Brain
Sciences Department, University of California,
Santa Barbara, Building 251, Santa Barbara, CA 93106.
Email: parong@ucsb.edu
Funding information
Office of Naval Research, Grant/Award Number: N000141612046
Summary
The goal of the present study was to examine the effects of playing an immersive
virtual reality game that included a collection of gamified cognitive tasks, Cerevrum,
on specific components of cognition, including perceptual attention, mental rotation,
working memory, visualization, visual field of view, and visual processing speed.
Participants completed a pretest of cognitive assessments, played one of the two
mini-games within Cerevrum (Stardust or Heroes) for 1.5 hr over three 30-min sessions,
and then completed a posttest of cognitive assessments and a questionnaire about
interest and engagement during the game. An inactive control group completed only
the pretest and posttest. Results showed no significant differences among the Heroes
group, Stardust group, and control group on the posttest scores, even when
pretest scores were controlled for. These findings do not support the claim that playing
brain‐training games for a short period results in transfer of cognitive training to
nongame venues.
KEYWORDS
immersive virtual reality, brain training games, cognitive skill training, game‐based learning, video
games
1 | INTRODUCTION
1.1 | Objective and rationale
Can you improve your cognitive skills by playing brain‐training games?
This is the question that motivates the present study. Brain‐training
games are games that are intended to improve players' cognitive skills,
such as perceptual attention, spatial cognition, or executive function
(Mayer, 2014). More specifically, brain‐training games add gamified
elements to cognitive tasks in an effort to improve some component
of cognition. The gamification of cognitive training often includes aes-
thetics such as more detailed art or music, mechanics such as a reward
system often based on levels and points, and an immersive storyline
designed to entertain the players, which may induce more motivation
in users than other cognitive training programs (Anguera & Gazzaley,
2015; Kapp, 2012). Some examples of brain‐training games include
Lumos Lab's Lumosity and Nintendo's Brain Age, which both include
a suite of gamified cognitive tasks, each aimed at improving a specific
cognitive skill.
Brain‐training games more properly could be called cognitive skill
training games (or cognitive training games), and they fit more broadly
into the category of cognitive training programs. Cognitive training
programs are defined as “mentally challenging training regimens that
aimed to train cognitive mechanisms or skills on objective (i.e., not
reported or self‐reported) behavioral measures of cognitive skills or
academic achievement" (Sala et al., 2018). Some examples include
the ACTIVE trial, which included training in reasoning, memory, and
speed‐of‐processing; the IHAMS and SKILL studies, which both
included a useful field of view training task; CogMed, which included
working memory training particularly for those with learning and
attention problems, such as ADHD; and other working memory
training programs that have included training in the n‐back task or
visuospatial memory tasks (Ball et al., 2002; Edwards et al., 2005;
Jaeggi, Buschkuehl, Jonides, & Perrig, 2008; Klingberg, Forssberg, &
Westerberg, 2002; Rebok et al., 2014; Willis et al., 2006; Wolinsky
et al., 2011; Wolinsky, Vander Weg, Howren, Jones, & Dotson, 2013).
In particular, the game we focus on in this study is Cerevrum, which
is a suite of brain‐training games played in immersive virtual reality
(IVR). The cognitive skills we focus on are the educationally relevant
skills of perceptual attention, mental rotation, working memory, visual-
ization, visual field of view, and visual processing speed. The research
method is what can be called cognitive consequences research in
which we compare the change in cognitive skill of a group that plays
the target game (e.g., Cerevrum) versus a group that engages in a
control activity (e.g., not playing a game). Overall, the goal of this study
is to determine whether playing Cerevrum for a short period causes an
improvement in the educationally relevant cognitive skills required in
the game.
In recent years, as video games have become more ubiquitous in
households and classrooms, researchers and video game developers
have envisioned using them as tools to make consumers smarter or
enhance consumers' cognitive skills (Shaffer, 2006; Squire, 2011).
Strong claims have been made by companies promoting these so-called "brain-training" video games. For example, Nintendo's Brain Age slogan was "train your brain in minutes a day," and Lumos Lab's Lumosity claimed that "Lumosity exercises are engineered to train a variety of core cognitive functions" (Simons et al., 2016). Additionally,
a survey by AARP found that over half of consumers believed that
brain‐training could improve memory, sharpen intellectual skills, pre-
vent memory loss, improve attention, or increase IQ (David & Gelfeld,
2015). Because of the implications of these claims, it is important to
rigorously examine the cognitive effects of these brain‐training games,
so consumers, as well as educators, can make informed choices in
using their programs.
Overall, the idea of using video games or other programs to
enhance cognitive skills sounds promising and may be imperative for
improving everyday skills, as well as skills needed for academic or
career success. For example, executive function, a set of cognitive
skills required for focused, goal‐directed behavior, has been shown
to have implications in the classroom and be a strong predictor of aca-
demic success (Best, 2014; St. Clair‐Thompson & Gathercole, 2006).
Because of this, brain‐training games may be seen as educationally rel-
evant. If the claims of brain‐training game companies are correct, the
implications for students and teachers could be substantial. However,
at the present time, there is mixed support for these claims and more
thorough testing of these games is needed.
1.2 | Cognitive theories of transfer
Theories of transfer posit various cognitive outcomes of playing brain‐
training games. As a player plays a game, he or she repeatedly prac-
tices a particular cognitive skill or skills over time. The degree to which
the skill practiced in the game can be applied to a novel task outside
the game context is the amount of transfer of cognitive skill that
occurs (Mayer & Wittrock, 1996). Anderson and Bavelier (2011) argue
that playing video games places high demands on certain cognitive
components, which should facilitate increases in those components.
Researchers have defined two types of transfer: near transfer and far
transfer (Barnett & Ceci, 2002; Mayer, 2014). Near transfer refers to
the generalization of practice on a task to a similar task; as a learner
learns a skill, it only transfers to a new task to the extent that the tasks
have common elements. An example of near transfer occurs when a
player engages in a brain‐training game aimed at a particular skill and
shows more improvement than a control group on tests of that skill
outside of the game environment. On the other hand, far transfer refers to a player improving on unrelated tasks or improving cognitive skills in general after practicing a task. An example of far transfer
would be when playing a video game results in improving the mind
in general (Singley & Anderson, 1989). For example, if a player were
to play Tetris for a certain amount of training time, near transfer theo-
ries would predict that he or she would only improve in mentally rotat-
ing 2D Tetris shapes but not other cognitive skills (Sims & Mayer,
2002). However, far transfer theories would predict that not only would 2D mental rotation skill be enhanced but that other cognitive skills not trained in the game, such as spatial visualization or perceptual attention, would be enhanced as well (Sims & Mayer, 2002).
A middle ground theory between near and far transfer is the the-
ory of specific transfer of general skills, which predicts that the under-
lying cognitive skills trained in video games should transfer to other
tasks that require the same underlying cognitive skills (Mayer,
2014; Sims & Mayer, 2002). For example, the theory of specific trans-
fer of general skills would predict that playing Tetris would enhance
mental rotation in general, which should transfer to mental rotation
of other shapes in novel tasks, but not other cognitive skills, such as
general intelligence or working memory. Thus, this theory would pre-
dict that brain‐training games that target a specific cognitive skill
should improve performance on other tasks outside the game that
require the same skill (which reflects near transfer) but not on other
tasks that require different skills (which reflects far transfer). In the
present study, we test for both near and far transfer but recognize
that the larger literature in problem‐solving transfer suggests that
effective cognitive training is unlikely to create far transfer (Mayer &
Wittrock, 1996, 2006).
1.3 | Research on the effectiveness of cognitive training on near and far transfer
1.3.1 | Cognitive training evidence
The underlying assumption of cognitive training is that at least some
cognitive mechanisms can be improved by repeated exposure to cog-
nitively demanding exercises due to neural plasticity (Karbach & Schu-
bert, 2013; Sala et al., 2018). Although some cognitive training
programs have shown positive effects (e.g., Ball et al., 2002; Jaeggi
et al., 2008), others have found null effects (e.g., Redick et al., 2013;
Rickard, Bambrick, & Gill, 2012). Meta‐analyses of various cognitive
training interventions have revealed some evidence for near transfer
to related cognitive skills and mixed evidence for far transfer to less
related cognitive skills. One meta‐analysis reported that working
memory training led to near-transfer improvements on verbal and visuospatial memory tasks, but no significant effects on far-transfer tasks, such as nonverbal ability, verbal ability, word decoding, reading comprehension, and arithmetic (Melby-Lervåg, Redick, & Hulme, 2016).
However, another meta‐analysis of working memory training specifi-
cally using the n‐back paradigm, in which participants must identify
whether a stimulus was the same as a stimulus presented n trials back,
found a significant positive effect on far transfer measures of fluid
intelligence, including the Abstract Reasoning from the Differential
Aptitude Test, Raven's Advanced Progressive Matrices, and Blocks
and Matrix Reasoning from the Wechsler Adult Intelligence Scale
(Au et al., 2015).
A second-order meta-analysis found that working memory training induced near transfer to memory tasks, an effect moderated by the type of population, but not far transfer to reasoning, speed, or lan-
guage tasks. Additionally, other types of cognitive training showed null
effects, particularly when placebo effects and publication bias were
controlled for (Sala et al., 2018). Thus, these meta‐analyses generally
reveal that cognitive training programs have small to medium effects
for near transfer tasks and small to null effects for far transfer.
1.3.2 | Brain-training game evidence
Evidence for the effectiveness of brain-training games has been similar to that for other cognitive training programs; there is some evidence for near transfer to tasks similar to those trained and less evidence for far transfer to dissimilar tasks. In a meta-analysis,
Adams and Mayer (2014) reported a large near‐transfer effect of
brain‐training games, including Dr. Kawashima's Brain Training and
Brain Age, on executive function skills (d = 1.04), and null to small far-transfer effects on spatial cognition skills (d = 0.03) and perceptual attention (d = 0.31). Sala et al. (2018) reported that video game cogni-
tive training led to a small effect for far transfer, and when placebo
effects and publication bias were controlled for, the effect equaled
zero. Simons et al. (2016) corroborated these findings, reporting evidence that brain-training programs improve performance on the trained tasks themselves, whereas there is less evidence that they improve performance on similar untrained tasks (near transfer) and little evidence that they improve performance on dissimilar tasks outside the game (far transfer).
One specific example of the discrepancy in evidence for brain-training games involves the brain-training game Lumosity. Some
researchers have found that Lumosity was effective (Hardy et al.,
2015), whereas others have not found the same effects (Bainbridge
& Mayer, 2018; Kable et al., 2017). Hardy et al. (2015) compared a
group that played Lumosity for at least 15 min five times per week
for 10 weeks to an active control group that completed crossword
puzzles for the same amount of time. The groups completed a battery
of seven cognitive assessments, including the forward span, backward
span, Raven's Progressive Matrices, grammatical reasoning, arithmetic
reasoning, go/no‐go, and a search task, before and after training. The
results showed that those who played Lumosity had significantly
greater improvements in a composite score of the cognitive assess-
ments and subjectively reported better cognitive functioning than
those who completed crossword puzzles. However, the participants
in the study had different expectations, as the participants in the
Lumosity group were those who already had a free Lumosity account
and were compensated with a 6‐month subscription to the program;
thus, they were presumably more apt to perform better than those
in the control group, which may explain the improvements on the
self-report and objective performance measures. Simons et al. (2016) warn that these results should be interpreted with caution because of serious methodological flaws and the potential conflict of interest created by the fact that five of the seven authors were employed by the company that sells Lumosity. It is also worth noting that a lawsuit
by the U.S. Federal Trade Commission against Lumos Labs claiming
deceptive advertising resulted in a $2 million fine for the company
(Simons et al., 2016).
In contrast to these findings, Bainbridge and Mayer (2018) trained
groups in specific games in Lumosity, which targeted attention skills or
cognitive flexibility skills for at least 15 hr over at least 73 sessions.
They compared the groups' improvements between pre- and post-
training assessments in attention and flexibility to an inactive control
group that played no game and found only that the flexibility group
improved on the Stroop task and the useful field of view (UFOV) task,
which does not provide strong evidence for transfer of cognitive skills.
Even though the cognitive skills tapped by the assessments were quite
similar to the cognitive skills practiced in the games, students who
played the game did not generally outperform those who did not. In
another study, 10 weeks of playing Lumosity games did not cause
improvements on tests of cognitive skill or brain activity during deci-
sion making as compared with a control group (Kable et al., 2017).
In part, the discrepancy may come from a lack of standardized
research methods for evaluating brain‐training programs (Green
et al., 2019). The support cited by brain‐training companies often
has methodological shortcomings, such as comparing pre- and post-intervention performance on a cognitive test within a single intervention group rather than comparing the gain in performance between an intervention group and a control group, or failing to randomly assign participants to treatments (Simons et al., 2016).
1.4 | Current study
In the current study, we aim to further close the gaps between the
claims made by brain‐training companies, the theories that favor the
use of brain‐training games, and the empirical evidence to support
them. The video game of interest, Cerevrum, is a relatively new
brain‐training game played in IVR. Cerevrum claims that it “definitely
will improve your intelligence" and that it targets "the entire spectrum of cognitive ability: memory, perceptual speed, multitasking, executive function, and attention" (Cerevrum, Inc., 2017a). Similar to other brain-
training type games, it consists of two mini‐games, each with gamified
tasks that are intended to target general intelligence and specific cog-
nitive skills, including multiple object tracking, working memory, 2D
and 3D mental rotation, and visualization. Using IVR rather than tradi-
tional media, such as desktop VR, may also offer unique affordances
for cognitive skill training. For example, immersion has been shown to
increase a learner's feeling of presence, or the feeling of “being there”
in a virtual world, which may increase motivation for learning or atten-
tion on the target material (Bailenson et al., 2008; Kafai, 2006).
Cerevrum Inc. (2017b) has reported that students aged 18 to 24
who played both mini-games, including all six gamified tasks, for fifteen 30-min sessions over 3 months improved from pretest to posttest
in tests of abstraction ability, conceptual‐logical thinking, figural syn-
thesis, spatial thinking, and operational logical memory. The focus of
the present study is to determine the effectiveness of playing each
mini‐game individually on the specific cognitive skills they target.
Based on the specific transfer of general skills theory, we predicted
that there would be transfer to nongame tasks that require the cogni-
tive skills that were trained in the games (near transfer) but not far
transfer of the cognitive skills to nongame tasks that were not trained
in the games. Specifically, based on the specific transfer of general
skills theory, we predicted that compared with a control group, the
Stardust group would improve in multiple object tracking, working
memory, and mental rotation, and the Heroes group would improve
in visualization, working memory, and mental rotation as those are
the cognitive skills targeted by each task in the two mini‐games,
respectively.
In light of the fact that this is the first experiment to test the
effectiveness of Cerevrum and in deference to the practical logistics
of having individual participants in IVR, our approach was to provide
exposure for a short duration of three 30‐min learning episodes. Our
goal was to provide preliminary evidence concerning the effects of
playing Cerevrum games for a short duration. We have obtained
significant effects with this level of exposure in a previous series of studies involving learning executive function skills with custom-designed computer games (Mayer, Parong, & Bainbridge, 2019;
Parong et al., 2017). We also have used this level of exposure in sev-
eral similar studies with non‐VR games such as Lumosity (Bainbridge &
Mayer, 2018), Portal (Adams, Pilegard, & Mayer, 2016), and Tetris
(Pilegard & Mayer, 2018).
2 | METHOD
2.1 | Participants and design
Adams and Mayer (2014) reported an effect size of d = 1.04 for brain-training games improving executive function skills. Based on this effect size, an a priori power analysis revealed that 39 total participants would be required to detect the effect. Participants were 81 undergraduate students from the University of California, Santa Barbara (65 females; ages 18–24, M = 19.40, SD = 1.09). Twenty-one
participants were assigned to play the Stardust mini‐game, 22 were
assigned to play the Heroes mini‐game, and 38 were assigned to an
inactive control group. Participants were recruited through a partici-
pant recruitment website and were compensated 30 dollars for
completing the study.
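The paper does not state which software produced this sample-size estimate. As a worked check, the sketch below reproduces a total sample size of roughly 39 in Python, under the assumption that the reported d = 1.04 is converted to Cohen's f (f ≈ d/2 for equal-sized groups) for a three-group one-way ANOVA with α = .05 and power = .80; statsmodels is our choice here, not necessarily the authors'.

    # Hedged sketch of the a priori power analysis (assumptions: f = d/2,
    # alpha = .05, power = .80; the paper does not specify its software).
    from statsmodels.stats.power import FTestAnovaPower

    f = 1.04 / 2  # approximate conversion from Cohen's d to Cohen's f
    n_total = FTestAnovaPower().solve_power(
        effect_size=f, k_groups=3, alpha=0.05, power=0.80
    )
    print(round(n_total))  # total N in the high 30s, in line with the 39 reported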
2.2 | Materials
The materials used in this experiment included an IVR brain-training game, Cerevrum, a set of computer-based cognitive assessments, and a short participant questionnaire to solicit the participants' age, gender, gaming experience, and interest and enjoyment of playing Cerevrum. (The questionnaire also included questions about the game-playing experience, but these data were not analyzed because of concerns about possible confusing wording of the questions.) Cerevrum consisted of two mini-games, each with three
gamified tasks that required a specific cognitive skill. In the first
mini‐game, Stardust, the player was placed on a spaceship, and his or
her overall goal was to destroy enemy spaceships using three weapons
before his or her own spaceship was destroyed. Each of the three
weapons, shown in Figure 1, involved a different subgame that
required players to use a cognitive skill: Laser Drones, which required
multiple object tracking; Pew‐Pew, which required working memory;
and Firestorm, which required 2D mental rotation. Laser Drones pre-
sented the player with a number of colored spheres, which turned
gray and randomly shuffled around the player's field of view. The
player was then shown one color, and he or she had to identify a
sphere matching that color. In Pew‐Pew, the player was briefly pre-
sented with a row of characters with identifying features (e.g., color,
shape, and number of spikes). Two characters were then marked,
and the player's task was to indicate whether the two characters were
identical. The third weapon, Firestorm, presented colored spheres in
four concentric circles. The player's task was to create as many adjacent pairs of same-colored spheres as possible by rotating the circles or swapping adjacent circles.
In the second mini‐game, Heroes, the player's goal was to protect a
prized gem from incoming enemies by sending heroes to fight the ene-
mies. Similar to the first mini‐game, as shown in Figure 2, each of the
three heroes involved a different subgame that required a cognitive
skill: Excessive Cubes, which required mental rotation; Constellation
Memory, which required working memory; and Polygons, which
required visualization. In Excessive Cubes, the player was shown a 3D
arrangement of cubes, each with identifying marks (e.g., squares on
one side and triangles on another side). The player's task was to
memorize the configuration of arrays by rotating the whole figure.
Then, the figure disappeared and reappeared with one or more new
cubes, which the player had to identify. In Constellation Memory, the
player first memorized a set of colored spheres with shapes. The
player was then shown a new set of spheres that included one sphere
that matched the original set, which the player had to identify. Finally,
in Polygons, a rotating target polygon (octahedron or cube) with a
shape on each face was presented to the player. The player's
task was to identify the matching polygon from a selection of three
other polygons.
The cognitive assessments used for the pretest and posttest included
a multiple‐object tracking task (MOT) to measure perceptual attention,
a mental‐rotation task to measure spatial processing, the n‐back task
to measure working memory, a paper folding task to measure visuali-
zation, a race task to measure visual processing speed, and a UFOV task
to measure visual field of view. In the MOT, the participant was pre-
sented with 10 white cross‐shaped objects on a black background. A
number of objects (2, 3, or 4) would then briefly flash before all 10
objects randomly moved around the screen. The participant's task
was to track the objects that flashed and identify their end positions
on the screen. The objects moved around the screen for a total of
10 to 20 s. Participants were presented with six trials in a random
order. The mental‐rotation task was adapted from Shepard and
Metzler's (1971) mental rotation task. Two 3D figures were presented,
and the participant's task was to indicate whether they were the
same or different. Trials were the same when one figure could be
rotated to be superimposed on the other figure. Participants were
presented with 24 trials in a random order, with 12 same and 12 different trials.
The n-back task sequentially presented individual letters to participants for 1.5 s each, and their task was to press the spacebar when the current letter matched the letter presented n letters back. In
this experiment, a two‐back task with 66 letters, including 18 target
letters, was used. The paper folding task was a computerized version
of the same task from the Kit of Factor‐Referenced Cognitive Tests
(Ekstrom, French, & Harman, 1979). Each trial presented a set of
images depicting a square piece of paper being folded and punched
with a single hole. The participant's task was to identify which image
represented what the piece of paper would look like unfolded from
a set of five images. The 10 trials were presented in a fixed order from
easiest to hardest. In the race task, two objects were displayed at dif-
ferent locations on the left side of the screen. The objects then moved
at different speeds in a straight line towards a vertical line on the right
side of the screen. The participant's task was to identify as quickly and
as accurately as possible which object would cross the vertical finish
line first. The task had 16 trials presented in a random order. Finally,
the UFOV first briefly displayed a fixation cross. Then, a star was
briefly displayed along with distractor square images for 80 ms,
followed by a white noise mask screen. The participant's task was to
identify the location of the star on eight spokes around the fixation
cross. The UFOV task had 16 trials presented in a random order.
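To make the 2-back logic concrete, the sketch below classifies each letter in a sequence as a hit, miss, false alarm, or correct rejection; the function and data are illustrative only and are not taken from the study's software.

    # Hedged sketch of 2-back scoring: a trial is a target when the current
    # letter matches the letter presented n positions earlier.
    def score_n_back(letters, responses, n=2):
        counts = {"hit": 0, "miss": 0, "fa": 0, "cr": 0}
        for i, letter in enumerate(letters):
            is_target = i >= n and letter == letters[i - n]
            pressed = responses[i]  # True if the participant pressed the spacebar
            if is_target:
                counts["hit" if pressed else "miss"] += 1
            else:
                counts["fa" if pressed else "cr"] += 1
        return counts

    letters = list("ABABCCDED")  # illustrative sequence, not study stimuli
    responses = [False, False, True, True, False, False, False, False, False]
    print(score_n_back(letters, responses))  # {'hit': 2, 'miss': 1, 'fa': 0, 'cr': 6}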
FIGURE 1 Screenshots of the gamified tasks in the Stardust mini-game in Cerevrum, including Laser Drones (left), Pew-Pew (center), and Firestorm (right)

FIGURE 2 Screenshots of the gamified tasks in the Heroes mini-game in Cerevrum, including Excessive Cubes (left), Constellation Memory (center), and Polygons (right)

2.3 | Apparatus
Cerevrum was displayed on an HTC Vive, which included a head-mounted display and two wireless hand controllers, using Steam software on an Alienware desktop computer. The controllers allowed the user to interact with the virtual environment using intuitive gestures,
and users received haptic feedback (i.e., vibrations) for certain interac-
tions. The console also included wall‐mounted sensors in the room to
allow the software to map the space in which the user could move.
2.4 | Procedure
Participants were randomly assigned to the Stardust,Heroes, or control
conditions. In the two VR game conditions, participants completed
three sessions over 9 days. In the first session, participants completed
a pretest that consisted of the six cognitive assessments and played
30 min of their assigned video game. In the second session, they com-
pleted 30 min of their assigned game. In the final session, participants
completed 30 min of their assigned game, a participant questionnaire
soliciting demographic information, and a posttest of the same six cog-
nitive assessments from the pretest. Thus, there were no tests after
each session in order to avoid the possibility of a testing effect in
which the act of taking a test is a form of instruction. Participants were tested individually in a lab with a large space of approximately 12 × 12 ft to allow for moving around in the IVR environment. Although previous brain-training studies have used longer training durations, upwards of 20 hr, a 90-min training duration was chosen for this study to examine the immediate effects of a short amount of training; moreover, 20 or more hours may be impractical for players.
Participants in the control condition completed two sessions a
week apart. They completed the pretest of cognitive assessments in
the first session and the posttest and participant questionnaire in the
second session. The data that support the findings of this study are
available from the corresponding author upon reasonable request.
3 | RESULTS
3.1 | Scoring
For the MOT, mental rotation, paper folding, and UFOV tasks, accuracy was calculated as the number of correct trials. For the n-back task, d′, a sensitivity index reflecting how well the participant discriminated target letters from nontarget letters, was calculated. For the race task, the average response time
for all trials was calculated. Due to computer errors, data were not
collected from one participant in the Stardust condition for the mental
rotation task, two participants in the control condition in the n‐back
task, and one participant in the control condition in the race task.
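As a worked example of the d′ computation, the sketch below converts the hit rate and false-alarm rate to z scores via the inverse normal CDF; the clipping of extreme rates is a common convention that the paper does not specify.

    # Hedged sketch: d' = z(hit rate) - z(false-alarm rate).
    from scipy.stats import norm

    def d_prime(hits, misses, false_alarms, correct_rejections):
        hit_rate = hits / (hits + misses)
        fa_rate = false_alarms / (false_alarms + correct_rejections)
        # Clip extreme rates so the z transform stays finite (an assumed,
        # commonly used correction; the paper does not state its approach).
        hit_rate = min(max(hit_rate, 0.01), 0.99)
        fa_rate = min(max(fa_rate, 0.01), 0.99)
        return norm.ppf(hit_rate) - norm.ppf(fa_rate)

    # Example: 15 of 18 targets detected, 4 false alarms on 48 nontargets
    print(d_prime(15, 3, 4, 44))  # approximately 2.35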
3.2 | Do the groups differ on basic characteristics?
One-way analyses of variance were conducted to test for preexisting differences among the Stardust, Heroes, and control groups. The results showed that the groups did not differ significantly in age, F(2, 77) = 2.75, p = .070, or in pretest scores on the MOT task, F(2, 77) = 0.73, p = .486, mental rotation task, F(2, 76) = 1.87, p = .161, n-back task, F(2, 75) = 2.21, p = .116, paper folding task, F(2, 77) = 0.65, p = .523, race task, F(2, 76) = 0.09, p = .919, or UFOV task, F(2, 77) = 1.90, p = .156. A chi-square test showed that the groups also did not differ in the proportion of men and women in each group, χ²(2, N = 78) = 1.50, p = .471. We conclude that the groups did not differ on basic characteristics.
3.3 | Does playing Cerevrum improve cognitive skills?
Analyses of covariance were run to test for differences among the Stardust, Heroes, and control groups on each cognitive assessment posttest score, with the respective pretest score as a covariate. As seen in Table 1, scores for the three groups did not differ significantly on the MOT task, F(2, 77) = 0.56, p = .575, mental rotation task, F(2, 76) = 1.95, p = .149, n-back task, F(2, 75) = 2.81, p = .067, paper folding task, F(2, 77) = 0.44, p = .647, race task, F(2, 76) = 0.41, p = .668, or UFOV task, F(2, 77) = 0.77, p = .468. The scores for each posttest cognitive assessment were also combined into an overall cognitive composite score using an averaged z-score for each task, and there were no significant differences among the three groups, F(2, 74) = 1.06, p = .351.
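For readers who want to see the shape of this analysis, the sketch below fits the same kind of model (posttest regressed on group plus pretest) on simulated data; the column names, simulated values, and choice of statsmodels are illustrative assumptions, not the authors' actual pipeline.

    # Hedged sketch of an ANCOVA: posttest ~ group + pretest.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf
    from statsmodels.stats.anova import anova_lm

    rng = np.random.default_rng(0)
    n = 81  # total sample size in the study
    df = pd.DataFrame({
        "group": rng.choice(["Stardust", "Heroes", "Control"], size=n),
        "pretest": rng.normal(12, 3, size=n),
    })
    df["posttest"] = df["pretest"] + rng.normal(0.5, 2.0, size=n)  # fake data

    model = smf.ols("posttest ~ C(group) + pretest", data=df).fit()
    print(anova_lm(model, typ=2))  # F test for group, controlling for pretest

    # The composite score described above would average z-scored task scores,
    # e.g., df[task_columns].apply(scipy.stats.zscore).mean(axis=1)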
TABLE 1 Cognitive assessment performance between each VR game group and the control group. Values are M (SD).

Task                                Stardust pretest    Stardust posttest   Heroes pretest      Heroes posttest     Control pretest     Control posttest
MOT (score out of 18)               11.81 (1.78)        12.81 (2.09)        11.82 (2.75)        12.41 (2.30)        12.32 (2.96)        12.47 (2.08)
Mental rotation (score out of 24)   17.75 (3.98)        21.30 (2.41)        15.32 (4.31)        18.68 (4.82)        16.66 (3.93)        19.13 (3.49)
n-back (d′)                         3.10 (0.68)         3.02 (0.99)         2.72 (0.90)         3.37 (0.63)         2.61 (1.04)         3.05 (0.77)
Paper folding (score out of 10)     6.90 (2.26)         7.19 (2.73)         7.18 (2.06)         7.45 (2.30)         6.58 (2.02)         7.34 (1.76)
Race task (average RT, ms)          3630.73 (2039.92)   2881.86 (1939.11)   3889.59 (1528.18)   3305.94 (1230.26)   3623.75 (2016.28)   2902 (1986.35)
UFOV (score out of 16)              15.04 (1.46)        15.29 (1.52)        14.39 (1.88)        15.00 (1.45)        14.59 (1.38)        15.13 (1.23)
Composite score (z)                 0.15 (0.46)         0.04 (0.62)         −0.07 (0.45)        0.01 (0.40)         −0.04 (0.44)        −0.04 (0.47)

Finally, as seen in Table 2, when we combined the two game-playing groups (Stardust and Heroes) and compared the combined group with the control group, analyses of covariance showed that students who played the game did not differ significantly from those who did not on posttest scores (with pretest scores as covariates) on the MOT task, F(1, 77) = 1.01, p = .318, mental rotation task, F(1, 77) = 2.08, p = .153, n-back task, F(1, 77) = 0.04, p = .837, paper folding task, F(1, 77) = 1.05, p = .308, race task, F(1, 77) = 0.18, p = .669, or UFOV task, F(1, 77) = 0.97, p = .327. There were also no differences in the composite score between the combined VR game group and the control group, F(1, 75) = 0.01, p = .947. We conclude that playing Cerevrum for 1.5 hr over three sessions did not cause improvements on cognitive skills assessed outside the game context.

TABLE 2 Cognitive assessment performance between the combined VR game group and the control group. Values are M (SD).

Task                                Cerevrum pretest    Cerevrum posttest   Control pretest     Control posttest
MOT (score out of 18)               11.81 (2.30)        12.60 (2.18)        12.32 (2.96)        12.47 (2.08)
Mental rotation (score out of 24)   16.58 (4.29)        19.93 (4.04)        16.66 (3.93)        19.13 (3.49)
n-back (d′)                         2.90 (0.82)         3.20 (0.83)         2.61 (1.04)         3.05 (0.77)
Paper folding (score out of 10)     7.05 (2.14)         7.33 (2.50)         6.58 (2.02)         7.34 (1.76)
Race task (average RT, ms)          3763.17 (1779.43)   3098.83 (1610.39)   3623.75 (2016.28)   2902 (1986.35)
UFOV (score out of 16)              14.81 (1.69)        15.14 (1.47)        14.59 (1.38)        15.13 (1.23)
Composite score (z)                 0.04 (0.46)         0.02 (0.51)         −0.04 (0.44)        −0.04 (0.47)
3.4 | Is game performance correlated with cognitive task performance?
Correlations between game scores from each of the six games and scores on each cognitive task from the pretest were calculated. Performance on the working memory game, Pew-Pew, correlated significantly with performance on the race task, r(24) = −.43, p = .037 (lower scores indicate faster reaction times). Scores on the working memory game, Constellation Memory, correlated significantly with performance on the MOT task, r(22) = .50, p = .019, paper folding task, r(22) = .43, p = .047, and UFOV task, r(22) = .43, p = .043. Performance on the visualization game, Polygons, correlated significantly with performance on the race task, r(22) = .57, p = .006. All other correlations were not significant. We conclude that there are no strong relationships between the games designed to target specific cognitive skills and the cognitive assessments designed to measure them.
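The correlations reported here are ordinary Pearson correlations; a minimal sketch with made-up arrays (not study data) shows the computation.

    # Hedged sketch: Pearson r between a game score and a pretest measure.
    from scipy.stats import pearsonr

    pew_pew_scores = [1200, 950, 1430, 800, 1100, 1250]  # hypothetical scores
    race_rt_ms = [2600, 3400, 2300, 3900, 3000, 2700]    # hypothetical mean RTs

    r, p = pearsonr(pew_pew_scores, race_rt_ms)
    print(f"r = {r:.2f}, p = {p:.3f}")  # negative r: higher score, faster RT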
4 | DISCUSSION
4.1 | Empirical contributions
The results from this experiment do not provide strong evidence that
playing an IVR brain‐training game for 1.5 hr enhances specific com-
ponents of cognition. This work is consistent with previous work
showing a lack of effectiveness of the brain‐training game, Lumosity
(Bainbridge & Mayer, 2018), as well as brain‐training games in general
with healthy adults (Mayer, 2014). A new contribution of this study is
that the lack of evidence for the effectiveness of brain‐training games
extends to games played in IVR. Based on these findings, we suggest
that "brain training" may not be an appropriate name for the genre
of games to which Cerevrum and Lumosity belong.
4.2 | Theoretical contributions
Neither near nor far transfer theories were supported by the results.
Previous research has shown more evidence for near transfer than
far transfer. According to the identical elements theory, the likelihood
of transfer in our study should have been related to degree to which
the tasks in the game tasks and cognitive assessments had overlapping
elements (Thorndike, 1906; Thorndike & Woodworth, 1901). How-
ever, our results may suggest that the games and tasks may not be
identical enough for near transfer.
In particular, the theory of specific transfer of general skills was not
supported by these results. In short, cognitive skills practiced in a
game context did not transfer to performing those same skills in a
nongame context. Additionally, there was no evidence that cognitive
skills practiced in a game context transferred to performing related
but different cognitive skills in a nongame context. In contrast to find-
ings involving brain‐training games, Bediou et al. (2018) reported that
playing action video games can cause changes in the cognitive skills
practiced in the games that transfer to nongame contexts, such as
perceptual attention. Thus, although the theory of specific transfer
of general skills was not supported for brain‐training games, there is
evidence that it can apply to action video games.
Why do we fail to see transfer of cognitive skill training with brain‐
training games? One explanation is that for cognitive skills to transfer,
certain research‐based design criteria need to be met (Anderson &
Bavelier, 2011; Bediou et al., 2018; Mayer, 2014): The player must
engage in repeated practice on the target cognitive skill, the player
must practice the target cognitive skill in a variety of contexts, the
player must be engaged in increasing levels of challenge that maintain
high challenge throughout the game, the player must receive feedback
embedded within the game, and the player should not be distracted by
activity that is not relevant to the instructional goal. For example,
Anderson and Bavelier (2011) and Bediou et al. (2018) have shown
how first‐person shooter games require repeated practice on a target
cognitive skill within a variety of settings. Ericsson (2006) has summa-
rized research showing the value of deliberate practice, that is, prac-
tice with embedded feedback at increasing levels of challenge as is
implemented in action video games (Green, Sugarman, Medford,
Klobusicky, & Bavelier, 2012). Mayer (2009) has summarized research
showing the benefits of designing multimedia instruction that does
not cause the learner to engage in extraneous activity. Overall, many
action video games meet these evidence‐based criteria, whereas some
suites of brain‐training games do not.
4.3 | Practical contributions
The present study adds to the growing literature questioning the
effectiveness of off‐the‐shelf brain‐training games. A practical implica-
tion of the results of this study is that playing a suite of gamified cog-
nitive tasks for a short period may not be the best way to train
cognitive skills. In contrast, more success has been reported—even in
short play durations—for cognitively designed games that adhere to
the design principles outlined in the foregoing section by providing
repeated and focused practice on a specific executive function skill
in varied contexts with increasing levels of challenge and embedded
feedback (Anguera et al., 2013; Parong et al., 2017). Overall, the
present study suggests a shift from suites of brain‐training exercises
on gamified cognitive tasks to more focused games designed based
on cognitive principles of skill learning. Although strong claims are
made for the value of brain‐training games, educational policy
decisions should be informed by research evidence concerning what
works with computer games (Hilton & Honey, 2011; Mayer, 2016).
4.4 | Limitations and future directions
The lack of evidence for game‐playing in the present study could be
due to a number of reasons. First, 1.5 hr may not have been long
enough to sufficiently improve the targeted cognitive skills. It is possi-
ble that playing the game for 10 or 20 or even 50 hr would create
stronger effects, although investing this much time in IVR may be
impractical. Future research should vary the dosage in order to deter-
mine whether longer exposure to a brain‐training game would produce
positive effects.
Second, the game itself may not have been challenging enough for
the college-aged students in the study. Although we did not have access to in-game metrics on progress in the game, we assumed that the player improved at each task as he or she earned points and advanced through the stages. However, although the game difficulty was adaptive, increasing incrementally as long as player performance was adequate, 30 min with each task may not have been enough time for players to reach and stay at an appropriate level of
challenge to enhance the cognitive skill targeted. In the original study
reported by the company that produced Cerevrum, participants com-
pleted over 20 hr of training spread across the six gamified tasks. Challenge level is important in training a cognitive skill, as a task that is too easy may not enhance the target cognitive skill, whereas a task that is too difficult may cause cognitive fatigue and, in turn, hurt performance.
Third, it is important to train a cognitive skill in varied settings, as variability may help the skill transfer to other novel tasks. Future research should examine the variability that brain-training companies build into the training they assign to their target audiences, to further help explain how these skills might transfer to novel tasks.
Fourth, brain‐training games appear to focus on improving the
mind in general through exposing the player to a variety of different
gamified cognitive tasks. However, research on the nature of human
intelligence suggests that cognitive growth comes through developing
specific cognitive skills (Hunt, 2011; Martinez, 2000), so cognitive
training should focus on a specific targeted skill rather than improving
the mind in general. Future research is needed to address the relative
benefits of brain‐training suites aimed at improving cognition in gen-
eral and focused games aimed at concentrated practice on a targeted
cognitive skill.
The lack of correlations between game performance and cognitive task performance, particularly between the cognitive skill seemingly trained in a game and a near-transfer task of the same skill, is consistent with the finding that playing the game did not transfer to tasks outside of the game. However, this may be explained by the notion
that the games simply did not load onto the cognitive tasks that they
were intended to train. Although the mini‐games and cognitive tasks
shared similar surface elements (e.g., the multiple object tracking game
looked the same as the multiple object tracking task), future research
should carefully determine the degree to which the games intended
to train a cognitive skill actually load onto the cognitive tasks to
measure that skill.
As noted by Green et al. (2019), there may be difficulties in extrap-
olating from one brain-training platform to another. The degree to which the results obtained with Cerevrum can be extrapolated to other platforms in this space depends on the degree to which the games share common features, a question that is subject to empirical research. Given
the many potential differences in games that are intended for training
of cognitive skills, it is important that all brain‐training games not be
inappropriately lumped together.
Additionally, brain training in IVR may not be any more effective
than in other media, such as desktop or tablet. The sense of presence
offered by IVR games may not further bolster any effects expected to
be seen from brain‐training games, although this study did not directly
compare brain‐training in virtual reality to more traditional media,
such as playing on a desktop computer. Alternatively, IVR may be
distracting and cause the player to engage in extraneous activity, as
has been found in studies of science simulations (Makransky,
Terkildsen, & Mayer, 2019). Future research should examine the
unique affordances of virtual reality and whether they can be
effectively utilized to improve brain‐training programs.
ACKNOWLEDGEMENT
Preparation of this paper was supported by Grant N000141612046
from the Office of Naval Research.
COMPLIANCE WITH ETHICAL STANDARDS
We adhered to guidelines for ethical treatment of human subjects and
obtained IRB approval.
CONFLICT OF INTEREST
The authors declare that they have no conflict of interest.
ORCID
Jocelyn Parong https://orcid.org/0000-0001-7076-2535
Richard E. Mayer https://orcid.org/0000-0003-4055-6938
REFERENCES
Adams, D. M., & Mayer, R. E. (2014). Cognitive consequences approach:
What is learned from playing a game? In R. E. Mayer (Ed.), Computer
games for learning: An evidence-based approach (pp. 171–224). Cam-
bridge, MA: The MIT Press.
Adams, D. M., Pilegard, C., & Mayer, R. E. (2016). Evaluating the cognitive
consequences of playing Portal for a short duration. Journal of Educa-
tional Computing Research,54, 173–195. https://doi.org/10.1177/
0735633115620431
Anderson, A. F., & Bavelier, D. (2011). Action game play as a tool to
enhance perception, attention, and cognition. In S. Tobias, & J. D.
Fletcher (Eds.), Computer games and instruction (pp. 307–330). Char-
lotte, NC: Information Age Publishing.
Anguera, J. A., Boccanfuso, J., Rintoul, J. L., Al‐Hashimi, O., Faraji, F.,
Janowich, J., …Gazzaley, A. (2013). Video game training enhances cog-
nitive control in older adults. Nature,501(7465), 97–101. https://doi.
org/10.1038/nature12486
Anguera, J. A., & Gazzaley, A. (2015). Video games, cognitive exercises, and
the enhancement of cognitive abilities. Current Opinion in Behavioral
Sciences,4, 160–165. https://doi.org/10.1016/j.cobeha.2015.06.002
Au, J., Sheehan, E., Tsai, N., Duncan, G. J., Buschkuehl, M., & Jaeggi, S. M. (2015). Improving fluid intelligence with training on working memory: A meta-analysis. Psychonomic Bulletin & Review, 22, 366–377. https://doi.org/10.3758/s13423-014-0699-x
Bailenson, J., Yee, N., Blascovich, J., Beall, A. C., Lundblad, N., & Jin, M.
(2008). The use of immersive virtual reality in the learning sciences:
Digital transformations of teachers, students, and social context. Jour-
nal of the Learning Sciences,17(1), 102–141. https://doi.org/10.1080/
10508400701793141
Bainbridge, K., & Mayer, R. E. (2018). Shining the light of research on
Lumosity. Journal of Cognitive Enhancement,2,43–62. https://doi.org/
10.1007/s41465‐017‐0040‐5
Ball, K., Berch, D. B., Helmers, K. F., Jobe, J. B., Leveck, M. D., Marsiske, M.,
& Willis, S. L. (2002). Effects of cognitive training interventions with
older adults: A randomized controlled trial. Journal of the American
Medical Association,288, 2271–2281. https://doi.org/10.1001/
jama.288.18.2271
Barnett, S. M., & Ceci, S. J. (2002). When and where do we apply what we
learn? A taxonomy for far transfer. Psychological Bulletin,128,
612–637. https://doi.org/10.1037//0033‐2909.128.4.612
Bediou, B., Adams, D. M., Mayer, R. E., Tipton, E., Green, C. S., & Bavelier,
D. (2018). Meta‐analysis of action video game impact on perceptual,
attentional, and cognitive skills. Psychological Bulletin,144(1), 77–110.
https://doi.org/10.1037/bul0000130
Best, J. R. (2014). Relations between video gaming and children's executive
functions. In F. C. Blumberg (Ed.), Learning by playing: Video gaming in
education (pp. 42–53). New York, NY, US: Oxford University Press.
https://doi.org/10.1093/acprof:osobl/9780199896646.003.0004
Cerevrum, Inc (2017a). Cerevrum [Computer software]. San Francisco, CA:
Cerevrum, Inc.
Cerevrum, Inc. (2017b). Cerevrum Game Research. Retrieved from https://
www.cerevrum.com/research
David, P., & Gelfeld, V. (2015, January 20). 2014 Brain Health Research
Study. Retrieved from http://www.aarp.org/research/topics/health/
info‐2015/staying‐sharper‐study.html
Edwards, J. D., Vance, D. E., Wadley, V. G., Cissell, G. M., Roenker, D. L., &
Ball, K. K. (2005). Reliability and validity of the useful field of view test
scores as administered by personal computer. Journal of Clinical and
Experimental Neuropsychology,27, 529–543. https://doi.org/10.1080/
13803390490515432
Ekstrom, R. B., French, J. W., & Harman, H. H. (1979). Cognitive factors:
Their identification and replication. Multivariate Behavioral Research
Monographs,79(2), 3–84.
Ericsson, K. A. (2006). The influence of experience and deliberate practice
on the development of superior expert performance. In K. A. Ericsson,
N. Charness, P. J. Feltovich, & R. R. Hoffman (Eds.), The Cambridge
handbook of expertise and expert performance (pp. 683–703). New York:
Cambridge University Press.
Green, C. S., Bavelier, D., Kramer, A. F., Vinogradov, S., Ansorge, U., Ball, K.
K., …Witt, C. M. (2019). Improving methodological standards in behav-
ioral interventions for cognitive enhancement. Journal of Cognitive
Enhancement,3(1), 2–29.
Green, C. S., Sugarman, M. A., Medford, K., Klobusicky, E., & Bavelier, D. (2012). The effect of action video game experience on task-switching. Computers in Human Behavior, 28, 984–994. https://doi.org/10.1016/
j.chb.2011.12.020
Hardy, J. L., Nelson, R. A., Thomason, M. E., Sternberg, D. A., Katovich, K.,
Farzin, F., & Scanlon, M. (2015). Enhancing cognitive abilities with com-
prehensive training: A large, online, randomized, active‐controlled trial.
PLoS ONE, 10(9), e0134467.
Hilton, M. A., & Honey, M. I. (2011). Learning science through computer
games and simulations. Washington, DC: National Academies Press.
Hunt, E. (2011). Human intelligence. New York: Cambridge University Press.
Jaeggi, S. M., Buschkuehl, M., Jonides, J., & Perrig, W. J. (2008). Improving
fluid intelligence with training on working memory. Proceedings of the
National Academy of Sciences, USA,105, 6829–6833. https://doi.org/
10.1073/pnas.0801268105
Kable, J. W., Caulfield, M. K., Falcone, M., McConnell, M., Bernardo, L., Parthasarathi, T., … Lerman, C. (2017). No effect of commercial cogni-
tive training on brain activity, choice behavior, or cognitive
performance. The Journal of Neuroscience,37, 7390–7402. https://
doi.org/10.1523/JNEUROSCI.2832‐16.2017
Kafai, Y. B. (2006). Playing and making games for learning: Instructionist
and constructionist perspectives for game studies. Games and Culture,
1(1), 36–40. https://doi.org/10.1177/1555412005281767
Kapp, K. M. (2012). The gamification of learning and instruction. San
Francisco: Pfeiffer.
Karbach, J., & Schubert, T. (2013). Training‐induced cognitive and neural
plasticity. Frontiers in Human Neuroscience,7, 48. https://doi.org/
10.3389/fnhum.2013.00048
Klingberg, T., Forssberg, H., & Westerberg, H. (2002). Training of working
memory in children with ADHD. Journal of Clinical and Experimental
Neuropsychology,24, 781–791. https://doi.org/10.1076/jcen.
24.6.781.8395
Makransky, G., Terkildsen, T. S., & Mayer, R. E. (2019). Adding immersive
virtual reality to a science lab simulation causes more presence but less
learning. Learning and Instruction,60, 225–236. https://doi.org/
10.1016/j.learninstruc.2017.12.007
Martinez, M. E. (2000). Education as the cultivation of intelligence. Mahwah,
NJ: Erlbaum.
Mayer, R. E. (2009). Multimedia learning (2nd ed.). New York: Cambridge
University Press.
Mayer, R. E. (2014). Computer games for learning: An evidence‐based
approach. Cambridge, MA: MIT Press.
Mayer, R. E. (2016). What should be the role of computer games in educa-
tion? Policy Insights From the Behavioral and Brain Sciences,3(1), 20–26.
https://doi.org/10.1177/2372732215621311
Mayer, R. E. (2019). Computer games in education. Annual Review of Psy-
chology,70, 531–549. https://doi.org/10.1146/annurev‐psych‐
010418‐102744
Mayer, R. E., Parong, J., & Bainbridge, K. (2019). Young adults learning
executive function skills by playing focused video games. Cognitive
Development,49,43–50. https://doi.org/10.1016/j.
cogdev.2018.11.002
Mayer, R. E., & Wittrock, M. C. (1996). Problem‐solving transfer. In D. C.
Berliner, & R. C. Calfee (Eds.), Handbook of educational psychology (pp.
47–62). New York, NY: Prentice Hall International.
Mayer, R. E., & Wittrock, M. C. (2006). Problem solving. In P. Alexander, P.
Winne, & G. Phye (Eds.), Handbook of educational psychology (pp.
287–303). Mahwah. NJ: Erlbaum.
Melby-Lervåg, M., Redick, T. S., & Hulme, C. (2016). Working memory training does not improve performance on measures of intelligence or other measures of "far transfer": Evidence from a meta-analytic review. Perspectives on Psychological Science, 11(4), 512–534. https://doi.org/10.1177/1745691616635612
Parong, J., Mayer, R. E., Fiorella, L., MacNamara, A., Plass, J., & Homer, B.
(2017). Learning executive function skills by playing focused video
games. Contemporary Educational Psychology,51, 141–151. https://
doi.org/10.1016/j.cedpsych.2017.07.002
Pilegard, C., & Mayer, R. E. (2018). Game over for Tetris as a platform for
cognitive skill training. Contemporary Educational Psychology,54,
29–41. https://doi.org/10.1016/j.cedpsych.2018.04.003
Rebok, G. W., Ball, K., Guey, L. T., Jones, R. N., Kim, H. Y., King, J. W., &
Willis, S. L. (2014). Ten-year effects of the Advanced Cognitive Training for Independent and Vital Elderly cognitive training trial on cognition
and everyday functioning in older adults. Journal of the American Geri-
atrics Society,62,16–24. https://doi.org/10.1111/jgs.12607
Redick, T. S., Shipstead, Z., Harrison, T. L., Hicks, K. L., Fried, D. E.,
Hambrick, D. Z., & Engle, R. W. (2013). No evidence of intelligence
improvement after working memory training: A randomized, placebo‐
controlled study. Journal of Experimental Psychology: General,142,
359–379. https://doi.org/10.1037/a0029082
Rickard, N. S., Bambrick, C. J., & Gill, A. (2012). Absence of widespread psy-
chosocial and cognitive effects of school‐based music instruction in 10‐
13‐year‐old students. International Journal of Music Education,30,
57–78. https://doi.org/10.1177/0255761411431399
Sala, G., Aksayli, N. D., Tatlidil, K. S., Tatsumi, T., Gondo, Y., & Gobet, F. (2018). Near and far transfer in cognitive training: A second-order meta-analysis. Collabra: Psychology, 5(1). https://doi.org/10.31234/osf.io/9efqd
Shaffer, D. W. (2006). How computer games help children learn. New York,
NY: Palgrave Macmillan. https://doi.org/10.1057/9780230601994
Shepard, R. N., & Metzler, J. (1971). Mental rotation of three‐dimensional
objects. Science,171(3972), 701–703. https://doi.org/10.1126/
science.171.3972.701
Simons, D. J., Boot, W. R., Charness, N., Gathercole, S. E., Chabris, C. F., Hambrick, D. Z., & Stine-Morrow, E. A. L. (2016). Do "brain-training" programs work? Psychological Science in the Public Interest, 17(3), 103–186.
Sims, V. K., & Mayer, R. E. (2002). Domain specificity of spatial expertise:
The case of video game players. Applied Cognitive Psychology,16,
97–115. https://doi.org/10.1002/acp.759
Singley, M. K., & Anderson, J. R. (1989). The transfer of cognitive skill. Cam-
bridge, MA: Harvard University Press.
Squire, K. (2011). Video games and learning: Teaching and participatory cul-
ture in the digital age. New York, NY: Teachers College Press.
St. Clair‐Thompson, H. L., & Gathercole, S. E. (2006). Executive functions
and achievements in school: Shifting, updating, inhibition, and working
memory. The Quarterly Journal of Experimental Psychology,59(4),
745–759. https://doi.org/10.1080/17470210500162854
Thorndike, E. L. (1906). Principles of teaching. New York: Seiler.
Thorndike, E. L., & Woodworth, R. S. (1901). The influence of improvement
in one mental function upon efficiency of other functions. Psychological
Review,8, 247–261.
Willis, S. L., Tennstedt, S. L., Marsiske, M., Ball, K., Elias, J., Koepke, K. M., &
Wright, E. (2006). Long‐term effects of cognitive training on everyday
functional outcomes in older adults. Journal of the American Medical
Association,296, 2805–2814. https://doi.org/10.1001/jama.
296.23.2805
Wolinsky, F. D., Vander Weg, M. W., Howren, M. B., Jones, M. P., &
Dotson, M. M. (2013). A randomized controlled trial of cognitive train-
ing using a visual speed of processing intervention in middle aged and
older adults. PLoS ONE,8(5), e61624. https://doi.org/10.1371/journal.
pone.0061624
Wolinsky, F. D., Vander Weg, M. W., Howren, M. B., Jones, M. P., Martin,
R., Luger, T. M., & Dotson, M. M. (2011). Interim analyses from a
randomised controlled trial to improve visual processing speed in older
adults: The Iowa Healthy and Active Minds Study. BMJ Open,1(2),
e000225. https://doi.org/10.1136/bmjopen‐2011‐000225
How to cite this article: Parong J, Mayer RE. Cognitive consequences of playing brain-training games in immersive virtual reality. Appl Cognit Psychol. 2020;34:29–38. https://doi.org/10.1002/acp.3582