How and when do students use flashcards?
Kathryn T. Wissman, Katherine A. Rawson, and Mary A. Pyc
Department of Psychology, Kent State University, Kent, OH, USA
Department of Psychology, Washington University in St. Louis, St. Louis, MO, USA
Previous survey research has documented students’ use of self-regulated study strategies, with a
particular interest in self-testing. These surveys indicate that students frequently use flashcards to self-
test and that self-testing is primarily used as a way to monitor learning. Whereas previous surveys
provide information about whether and why students self-test, they provide minimal information about
how and when students choose to self-test. Accordingly, the primary purpose of the current survey was
to explore how and when students engage in self-testing. We surveyed 374 undergraduates about the
amount of practice and the timing of practice, two factors that strongly affect the efficacy of self-
testing. Results indicate that students understand the benefits of practising to higher criterion levels
(amount of practice) but do not typically implement or understand the benefits of practising with
longer lags (timing of practice). We discuss practical implications for supporting more successful
student learning.
Keywords: Self-testing; Criterion learning; Lag effects; Survey.
A wealth of previous research has shown that
testing is beneficial for learning and memory (for
recent reviews, see Rawson & Dunlosky, 2011;
Roediger & Butler, 2011). However, the effec-
tiveness of self-testing depends on several factors,
including the amount and timing of practice.
Although the majority of research on testing
effects has been conducted in laboratory settings
in which practice tests are under experimental
control, in real-world contexts decisions of how
and when to self-test are left up to students. Thus,
the actual benefits of self-testing in most learning
environments will depend on the extent to which
students’ choices reflect those conditions that
facilitate learning. Given the sizeable amount of
laboratory research exploring conditions under
which self-testing is most effective, it is perhaps
surprising that little is known about how students
implement self-testing on their own. Accordingly
the main focus of the current survey was to
explore how and when students self-test, with a
particular interest in students’ self-reported use of
flashcards (given that students often report using
flashcards during self-regulated learning).
Two critical factors that affect the efficacy of self-
testing are the amount of practice and the timing
of practice. Concerning the amount of practice,
increasing the number of times an item is
correctly recalled during practice benefits later
retention (e.g., Karpicke, 2009; Karpicke &
Roediger, 2008; Pyc & Rawson, 2009; Vaughn & Rawson, 2011). For example, Vaughn and Rawson (2011) presented learners with Lithuanian–English word pairs for test–restudy practice until items were correctly recalled one to five times. Correctly recalling an item more than once during encoding significantly improved final cued recall two days later (e.g., 31% versus 71% for items correctly recalled once versus four to five times).

MEMORY, 2012, 20 (6), 568–579
© 2012 Psychology Press, an imprint of the Taylor & Francis Group, an Informa business
Address correspondence to: Kathryn Wissman, Kent State University, Department of Psychology, P.O. Box 5190, Kent, OH 44242-0001, USA.
The research reported here was supported by a Collaborative Award from the James S. McDonnell Foundation 21st Century Science Initiative in Bridging Brain, Mind and Behavior. Thanks to Mike Appleman, Nicole Gonzalez, Jeremy Meduri, Caitlin Metelko, Dan Molnar, Rochelle O'Neil, Katie Senko, and Sara Smith for assistance with data collection and data scoring.
In addition to effects of the amount of practice
within a given learning session, increasing the
number of learning sessions benefits long-term
retention. For example, Rawson and Dunlosky
(2011) presented students with key concept defi-
nitions for testrestudy practice until items were
correctly recalled during initial learning, followed
by anywhere from one to five relearning sessions
on subsequent days. Final cued recall 1 month
later improved substantially as the number of
relearning sessions increased (e.g., 35% versus
62% after one versus four to five relearning sessions).

Concerning the timing of practice, self-testing
is most effective when using longer versus shorter
lags between practice trials for a given item, both
within and between sessions (e.g., Bahrick & Hall,
2005; Cepeda, Vul, Rohrer, Wixted, & Pashler,
2008; Karpicke & Bauernschmidt, 2011; Kornell,
2009; Pashler, Zarow, & Triplett, 2003; Pyc &
Dunlosky, 2010; Pyc & Rawson, 2007, 2009, 2012).
Pyc and Rawson (2009) presented learners with
Swahili–English word pairs for test–restudy practice
trials that were separated by either 6 or 34
other items. Items practised with a longer versus
shorter lag were more likely to be recalled on a
final test 25 minutes later (76% versus 55%) or
one week later (30% versus 5%). In sum, self-
testing is most effective when implemented with
higher criterion levels within and across sessions
and longer lags between practice trials.
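To make these two factors concrete, the practice regime described above can be sketched as a simple simulation. This is a hypothetical illustration, not the procedure used in any of the cited experiments; the `criterion` and `min_lag` parameters and the always-successful `recall` function are illustrative assumptions.

```python
from collections import deque

def practice_order(items, criterion=3, min_lag=6, recall=lambda item: True):
    """Simulate test-restudy practice: each item stays in rotation until
    it has been correctly recalled `criterion` times (amount of practice),
    and successive trials of the same item are separated by at least
    `min_lag` other trials when possible (timing of practice)."""
    queue = deque(items)
    correct = {item: 0 for item in items}
    trials = []
    while queue:
        item = queue.popleft()
        trials.append(item)
        if recall(item):
            correct[item] += 1
        if correct[item] < criterion:
            # Reinsert so that at least `min_lag` other trials intervene,
            # or at the end of the queue if fewer items remain.
            queue.insert(min(min_lag, len(queue)), item)
    return trials

# With recall always succeeding, each item appears exactly `criterion` times.
order = practice_order(["a", "b", "c"], criterion=2, min_lag=1)
print(order)
```

Raising `criterion` or `min_lag` in this sketch corresponds directly to the two manipulations discussed above: practising to a higher criterion and spacing practice trials at longer lags.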
In contrast to the sizeable amount of laboratory
research in which practice tests are under experi-
menter control, minimal research has explored
how students implement self-testing on their own.
A few studies have used experimenter-controlled
environments to examine students' decisions
about the amount or timing of self-testing
(Karpicke, 2009; Kornell & Bjork, 2008a; Kornell
& Metcalfe, 2006). Kornell and Bjork (2008a,
Experiment 3) gave learners 10 minutes to learn
20 Swahili–English word pairs (in anticipation of
a final cued recall test after a 5-minute delay).
After initial study, items were presented for test–
restudy trials. After each trial learners were given
the option to remove a word pair or keep it in the
list for further practice. Learners dropped 63% of
word pairs after one correct recall rather than
practising to a higher criterion. These results
appear to suggest that students do not understand
the benefits of higher criterion levels. However, a
time limit of 10 minutes might have compelled
students to drop items after only one correct
recall to ensure enough time was left to practise
unlearned items. More generally, all previous
experimental studies have only given students
control over limited aspects of self-testing in
which much of the implementation of testing is
still under experimenter control.
Survey methods provide a complementary
approach to investigating students' use of self-
testing (Carrier, 2003; Hartwig & Dunlosky, 2012;
Karpicke, Butler, & Roediger, 2009; Kornell &
Bjork, 2007; Kornell & Son, 2009; McCabe, 2010).
Surveys avoid the disadvantage of limited control
in experimental studies by providing students
with opportunities to report decisions used in
real-world contexts. Several recent survey studies
have established that students use self-testing and
frequently report using flashcards in particular.
Hartwig and Dunlosky (2012) had students report
which study strategies they use on a regular basis,
and 62% of learners reported using flashcards.
Likewise Karpicke et al. (2009) had students
freely list strategies used during study; 40% of
learners reported using flashcards. Prior surveys
also provide information about why students self-
test. Kornell and Son (2009) asked learners why
they choose to self-test during study; 66% of
learners reported self-testing to determine how
well they knew the information, whereas only
20% said because they learn more than with
restudy. Similarly, in surveys by Hartwig and
Dunlosky (2012) and Kornell and Bjork (2007),
54% and 68% of learners reported self-testing to
figure out how well they had learned information,
and only 27% and 18% said because they learned
more that way than rereading. These findings
suggest that students primarily self-test to moni-
tor how well target information has been learned
rather than as a learning strategy.
Although prior survey research is informative
about whether and why students choose to
self-test, prior survey studies provide minimal
information about how and when students engage
in self-testing. Given that the efficacy of self-
testing is highly dependent on how and when it is
used, the present survey focuses on the details of
these factors in students' flashcard use.
METHOD

Surveys were administered via computer as part
of six larger studies. Across all six studies, 374
undergraduates from Kent State University com-
pleted the survey in return for course credit. Each
participant was given 20 minutes to complete the
survey. Due to the time limit some participants
did not respond to every question. Furthermore
the particular set and sequence of questions
varied somewhat across studies. Accordingly the
number of respondents varies across questions;
sample size for each question is indicated in the
tables below.
For the majority of the survey questions,
response format was open-ended. The few ques-
tions involving an alternate response format are
denoted in the tables below. For scoring purposes,
all open-ended responses were coded by one of
two raters. Data were initially coded by frequency
of response exemplars. The tally for each re-
sponse exemplar included verbatim or close
paraphrases of a particular idea. For example, Question 1A asked learners why they use flashcards to study for classes; responses of "easy for me to learn", "learning is easier this way", and "easiest way to learn" were all coded under "easy to learn". Two authors then reviewed the list of response exemplars and decided which exemplars could be collapsed into response categories. Response categories were developed for exemplars that indicated a similar kind of information (e.g., for Question 2C, we collapsed responses of "quickly recalled" and "easily recalled" into one response category) or to combine infrequent responses (e.g., for Question 2G, we created the category "≥6 times"). Overall, response categories were used infrequently; all responses presented in the tables are exemplars unless otherwise denoted as a response category.
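The two-pass coding scheme described above (tally responses under exemplar labels, then collapse similar exemplars into categories) can be sketched as follows. The mappings and example responses here are hypothetical stand-ins for the raters' judgements, not the actual coding scheme used in the study.

```python
from collections import Counter

# Pass 1: map each verbatim response (or close paraphrase) to an exemplar.
# Hypothetical mapping illustrating the survey's Question 1A example.
EXEMPLAR_OF = {
    "easy for me to learn": "easy to learn",
    "learning is easier this way": "easy to learn",
    "easiest way to learn": "easy to learn",
}

def tally_exemplars(responses):
    """Count how many responses fall under each exemplar label."""
    return Counter(EXEMPLAR_OF.get(r, r) for r in responses)

# Pass 2: collapse exemplars conveying the same kind of information into
# a broader response category (as with Question 2C in the survey).
CATEGORY_OF = {
    "quickly recalled": "quickly/easily recalled",
    "easily recalled": "quickly/easily recalled",
}

def collapse_categories(exemplar_counts):
    """Merge exemplar tallies into response categories."""
    collapsed = Counter()
    for exemplar, count in exemplar_counts.items():
        collapsed[CATEGORY_OF.get(exemplar, exemplar)] += count
    return collapsed

responses = ["easy for me to learn", "easiest way to learn",
             "quickly recalled", "easily recalled"]
print(collapse_categories(tally_exemplars(responses)))
```

The point of the two passes is that exemplar tallies stay faithful to the raw wording, while categories are only introduced afterwards, when the authors judge two exemplars to carry the same information.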
RESULTS AND DISCUSSION

Before reporting the primary outcomes of interest
concerning how and when students self-test, we
briefly discuss relevant background information
about respondents and replications of findings
from prior survey studies.
Background information
Appendix A of the online supplement reports
background information about the survey respon-
dents. Most students were freshmen, and a
majority of the sample maintained above a 3.00
grade point average. A significant number of
students reported that teachers and/or professors
taught them study strategies (A13). Interest-
ingly, although a large majority of students
reported having received instruction from tea-
chers about how to study, students reported that
only 23% of instructors recommended flashcards,
and only 25% recommended completing practice
problems or quizzes.
Whether and why students use flashcards
Replicating prior survey research, Table 1 shows
that a large number of students report using
flashcards to study (Question 1A). In contrast to
previous findings, our results differ somewhat in
regard to reasons students gave for why they use
flashcards. Whereas the most common reason
students reported self-testing in prior studies
was to evaluate how well target information has
been learned, the most common reason reported
in the current study was to help memorise
information (although see Question 2J for some
evidence that students revisit dropped flashcards
for purposes of checking memory). These diver-
gent outcomes may be due to methodological
differences between studies. For example,
whereas we asked students why they use flash-
cards in particular, prior surveys asked students
why they choose to quiz themselves more gen-
erally (which might or might not involve the use
of flashcards). Additionally, the three prior survey
studies all used a multiple choice format in which
possible reasons were provided but that only
permitted selection of one response option. In
contrast we used an open-ended question. Given
the open-ended nature of the responses it is
unclear whether students meant the act of testing
in and of itself helps memorisation or if testing
helps determine what information still needs to be
memorised. Further clarification on this distinc-
tion awaits future research.
Extending beyond prior research, Table 1 provides information about what kinds of materials students use with flashcards (an overwhelming 80% of students report using flashcards to learn vocabulary; Question 1B) and also shows that a majority of students (53%) use flashcards in more than one class (Question 1C).
How students use flashcards: Amount
of practice
Learning to criterion. Table 2 includes questions
about how students use flashcards, focusing
TABLE 1
Whether and why students use flashcards

1A. Do you use flashcards to study for your classes? (n = 247)
Yes 67.6%
No 32.4
Why or why not? (n = 154)*
Why (n = 114):
Helps to memorise 60.5%
Easy way to learn 22.8
Lots of vocabulary 14.9
Convenient to carry 8.8
Isolates information 6.1
Helps organisation 2.6
Breaks up information 1.7
Forces rereading 1.7
Helps/forces rewriting 0.9
Why not (n = 40):
Not helpful 60.0%
Not useful 35.0
Didn't need to study 5.0
1B. Explain what type of information you put on them
Vocabulary 82.9%
Factual information from notes (people/places/…)
Key concepts 28.2
Formulas 7.4
Difficult information 3.7
Information from study guide 3.7
1C. In reference to the classes you took last semester, how often did you use flashcards to study? (n = 264)†
0 classes 35.6%
1 class 11.4
2 classes 12.1
3 classes 22.3
4 classes 18.6
*For these questions the sum of response percentages exceeds 100% because participants could report more than one response.
†For this question participants were given five corresponding response buttons.
TABLE 2
How students use flashcards: Amount of practice

2A. Imagine that you are studying with a stack of flashcards. How do you decide when you have studied a given flashcard enough? (n = 327)*
I recall it correctly 39.4%
I recall it correctly more than once 25.7
I don't have to look at the other side 17.4
I don't have to think about it 15.6
I recall it quickly 12.5
I understand the information 8.6
Mix-up and still recall 5.5
Other 3.1
2B. Imagine you are studying a stack of flashcards. Which of the following would influence how many times you studied an item? (n = 272)*
How easily you can remember the item 80.0%
How many times you can correctly recall
Importance of the particular item 58.9
How long you planned to study that day 40.1
Other 4.0
2C. Do you ever remove cards from the stack of flashcards when you are studying? (n = 327)
Yes 73.7%
No 26.3
If yes, please explain when and why you would drop out cards (n = 296)*
Need to focus on unknown 40.0%
Remembered multiple times 33.8
Don't need to study what is known 29.8
Quickly/easily recalled $ 12.0
Ones I need to study more 8.4
Helps divide concepts 3.1
Helps organise 2.7
Other 1.8
If no, please explain why you would not drop out cards (n = 72)*
Need to know all the material 55.6%
Will remember better more times studied 36.1
Messes up the order 8.3
Don't see the point 8.3
2D. When you take a card out, how often is it because you tested yourself and were able to remember it correctly?
0.0–19.9% 8.3%
20.0–39.9% 3.8
40.0–59.9% 13.5
60.0–79.9% 21.8
80.0–100.0% 52.6
2E. When you take a card out, how often is it because it's too hard and you don't feel that you can learn it? (n = 135)
0.0–19.9% 65.0%
20.0–39.9% 16.3
40.0–59.9% 5.9
60.0–79.9% 8.9
80.0–100.0% 3.0
2F. When you take a card out, how often is it because it's too easy and you don't feel you need to practise it? (n = 135)
0.0–19.9% 20.7%
specifically on students' self-reported use and beliefs about amount of practice. Question 2A demonstrates that a large majority of students (65%) report practising until items are correctly recalled (collapsing across "I recall it correctly" and "I recall it correctly more than once"; these two response exemplars were treated as mutually exclusive for coding purposes). Although the number of correct recalls was not always specified, this finding suggests that many students learn to a criterion of at least one correct recall when using flashcards. With that said, a disconcerting number of students report making decisions without checking to make sure they are correct ("I don't have to look at the other side", 17%) or based on cue familiarity ("I don't have to think about it", 16%) and/or fluency of retrieval ("I recall it quickly", 13%). Although fluency of retrieval can accurately predict subsequent
TABLE 2 (Continued)

20.0–39.9% 11.1
40.0–59.9% 8.1
60.0–79.9% 25.2
80.0–100.0% 34.8
2G. When using flashcards to self-test your memory, how many times do you think you should be able to correctly recall an item during flashcard practice so that you will be able to remember it later for an exam? (n = 243)
1 time 0.8%
2 times 4.5
3 times 22.2
4 times 8.2
5 times 22.0
≥6 times $ 29.6
Non-numeric response $ 12.7
2H. Imagine that you are self-testing using flashcards and you correctly recall an item once. How much more likely would you be to remember that item later if you continued practising until you correctly recall that item again a second time? (n = 143)
0.0–19.9% 2.1%
20.0–39.9% 7.7
40.0–59.9% 9.1
60.0–79.9% 32.3
80.0–100.0% 49.0
2I. Do you think there is a point at which correctly recalling an item more times during practice would no longer benefit your ability to remember that item later? If so, what do you think that point is? (n = 143)
No, there is not a point 44.8%
Yes, there is a point 55.2
If so, what is the point? (n = 78)*
I recall it X number of times 52.6%
I know answer without thinking about it 30.8
I remember the item the next day 12.8
I start to become unfocused 11.5
Other 2.6
2J. If you were to drop out cards from your stack of flashcards when you are studying, once an item is dropped from practice, does it ever get put back in the stack of flashcards within that same session? (n = 315)
Yes 84.1%
No 15.9
Why or why not?*
Why (n = 257):
To check memory 41.0%
Always useful to restudy 37.7
Need to keep all to learn
Helps to understand
Forget otherwise 7.0
Why not:
I know it 64.9%
Use it in different session 37.9
Need to focus on unknown 13.5
Confuses if re-added 8.1
TABLE 2 (Continued)

2K. Suppose you were using flashcards to prepare for an upcoming exam, and you wanted to get an A on the exam. On how many different days would you study with your stack of flashcards? (n = 126)
1 day 4.0%
2 days 13.5
3 days 15.1
4 days 13.5
5 days 15.1
6 days 4.8
7 days 14.3
8–10 days $ 12.7
Other/non-numeric $ 7.2
2L. When using flashcards to self-test, how do you decide if you have correctly recalled an item? (n = 274)*
Self check by looking at external information
Self check in mind 27.4
Continuously recall correctly 16.4
Answer correctly without hesitation 12.0
Partner check 3.6
Write down correctly 2.5
*For these questions the sum of response percentages exceeds 100% because participants could report more than one response.
For this question participants were provided with the list of response options.
For Questions 3D–3F participants were given a sliding scale with endpoints marked as "0% of the time" and "100% of the time". For Question 3H participants were given a sliding scale with endpoints marked as "100% more likely" to "0% more likely".
$Response category: see text for details.
performance in some situations, fluency can be
misleading in other cases (e.g., if the fluently
retrieved information is incorrect, or if fluency is
due to retrieval from working memory rather
than long-term memory). Relying only on the
fluency with which information comes to mind
without evaluating the correctness of the fluently
retrieved information may be particularly proble-
matic if it leads students to incorrectly believe
they know more than they do, which is particu-
larly likely for more complex material (e.g.,
Dunlosky, Hartwig, Rawson, & Lipko, 2011;
Rawson, O'Neil, & Dunlosky, 2011).
Nonetheless, most students seem to understand the importance of correctly recalling target information during study, which is one key component
of maximising the benefits of self-testing. Conver-
ging evidence for this conclusion comes from
Question 2B, with a majority of students (69%)
reporting that they practise to criterion when
studying with flashcards. Additionally, Question
2D shows that students commonly report dropping
flashcards in their stack because they reached
criterion. Along the same lines Question 2E
indicates that students are not likely to drop cards
prior to reaching criterion. Students also reported
removing items from practice when the item was
easy (Question 2F), but these results are somewhat
harder to interpret given that students might have
interpreted ‘‘too easy’’ to mean they were able to
recall target material correctly, quickly, or based on
some other source (e.g., cue familiarity).
Higher criterion is better. Results thus far
indicate that students typically choose to reach
criterion before dropping flashcards from study.
The next question of interest concerns the extent to
which students practise beyond a single correct
recall. In Question 2A, 26% of students sponta-
neously reported that they practise until informa-
tion is correctly recalled more than once. Question
2C indicates that of the 74% of students who report
dropping flashcards from their stack during study,
34% of these students explicitly stated that they
practised recalling information multiple times be-
fore dropping it. However, these outcomes may
underestimate the prevalence of practising to high-
er criterion levels, in that some students might
simply not have thought to specify a particular
criterion level. Indeed, when asked directly, almost
every single student indicated belief that practising
beyond one correct recall will help later memory of
target material for an exam (Question 2G). With that said, by asking "how many times", the phrasing of Question 2G might have biased some students to report more than one correct recall versus only one (although students' preferences for more versus fewer correct recalls are harder to explain as due to biasing from the prompt).
Question 2H shows that a majority of students
expect to benefit substantially from correctly
recalling information for a second time. Question
2I suggests that some students do not anticipate
diminishing returns from continuing to correctly
recall target material (although a majority of
students do correctly believe that once information
has been recalled a certain number of times, there is
no need to continue to practise it). Finally, Ques-
tion 2J indicates almost all students (84%) report
putting dropped flashcards back into their stack
during a practice session. Moreover, when students
report not putting a flashcard back into their stack,
it is interesting to note that a common reason is
because they intend to practise it at a later time
(38%). Lastly, Question 2K indicates that students
believe it is important to restudy flashcards on
more than one day in order to obtain an A on an
exam, suggesting that students understand the
benefit of engaging in multiple learning sessions.
Concerning criterion learning and amount of
practice, one related aspect is important to
consider. Students report learning to criterion
and that a higher criterion is beneficial, but how
do students know when they have correctly
recalled information? Question 2L shows that
fewer than half of students report checking
answers against external information (e.g., turn-
ing the flashcard over or checking against notes).
Importantly, even when students check their
answers against external information, they often
still overestimate the correctness of their own
answer (e.g., Dunlosky et al., 2011; Lipko
et al., 2009; Rawson et al., 2011). Even more
disconcerting is the fact that 27% of students
reported checking answers "in their mind", given
that research has shown that evaluating the
correctness of a response without comparison to
external information produces even higher levels
of overconfidence (Dunlosky & Rawson, in press;
Lipko et al., 2009). In either case, overconfidence
can produce detrimental effects on learning
because students do not effectively regulate
practice (Dunlosky & Rawson, in press; Rawson
et al., 2011). For present purposes this outcome
suggests that students may not actually achieve
their intended criterion level if they prematurely
terminate practice due to overconfidence in the
accuracy of their responses.
When students use flashcards: Timing
of practice
Table 3 includes questions about when students
use flashcards with specific respect to the timing
of practice. Most strikingly, several outcomes
indicate that students do not understand the
benefits of longer lags during study. For example,
Question 3A indicates that a majority of students
prefer to use a smaller versus larger stack of
flashcards when studying. Of those students who
reported a preference for using a smaller stack, a
TABLE 3
How students use flashcards: Timing of practice

3A. Imagine that you have a total of 60 flashcards for a class. You need to know all 60 items for the next exam. When studying the flashcards, how many cards would you practise at a time? In other words, would you practise all 60 in one big stack, or would you divide them up into smaller stacks and then study one stack at a time? Below indicate how many flashcards you would practise at a time (n = 279)
60 28.7%
30 9.0
20 21.0
15 14.0
10 18.3
Other 9.0
Why? (n = 253)*
10–15 (n = 87) 20–30 (n = 72) 60 (n = 74)
Need to know them all 4.6% 11.1% 63.5%
Easier to remember a smaller stack 67.8 44.4 2.7
Less overwhelming 34.5 40.3
Divides into ones I know/don't know 15.3 20.3
Influences how information is encoded/learned 4.6 4.2 8.1
Needs to be somewhat difficult 1.1 6.9 6.8
Confusing to divide 1.1 1.4 2.7
Split into categories/chapters $ 5.7 2.8 1.3
See card more times 4.6 4.2
3B. Do you think having a larger or smaller stack of flashcards is better for studying? (n = 263)*
Small 72.2%
Medium 1.9
Large 16.3
Doesn't matter 6.8
Depends 4.2
Why? (n = 229)* Small (n = 176) Medium/Large (n = 42)
Less to remember 53.7% 9.5%
Learn quicker 23.9 4.8
Narrows focus 19.1 11.9
See card more times 8.5 2.4
Gets confusing 8.0
Helps to feel accomplished 6.4 4.8
Brain only holds so much 6.4
Need to know all the information 3.2 64.3
Time between cards better 1.1 4.8
3C. If you were going to study a stack of flashcards, what do you think would work best and why? Study them one at a time
repeatedly (for example, 1, 1, 1, 2, 2, 2, 3, 3, 3, 4, 4, 4). Study them mixed together (for example, 1, 2, 3, 4, 1, 2, 3, 4, 1, 2, 3, 4).
Mixed 80.9%
Repeated 17.1
Other 3.9
*For these questions the sum of response percentages exceeds 100% because participants could report more than one response.
For this question participants were given six corresponding response buttons.
$Response category: see text for details.
substantial percentage (68%) said the reason was
because it is easier. In addition, even when
students said they would use a larger stack,
none of them reported doing so because it would
improve learning. Question 3B provides conver-
ging evidence showing that when students are
specifically asked whether using a smaller versus
larger stack of flashcards is better for studying, an
overwhelming 72% believe smaller is superior,
whereas research has shown that retention is
better after practising with a larger versus smaller
stack (e.g., Kornell, 2009). Furthermore, not a
single student indicated that the reason for their
preference on stack size was because it is better
for retention. However, although students do not
appear to understand the advantage of long
versus short spacing, Question 3C shows that at
least most students (81%) believe that spaced
study is better than massed study.
Although students' understanding of lag effects
was of primary interest, Appendix A of the online
supplement provides results for questions that are
relevant to another aspect of timing. Of interest, a
recently emerging issue in the testing effect
literature focuses on interleaving versus blocking
of practice for items from different categories
(e.g., Kornell & Bjork, 2008b; Taylor & Rohrer,
2010). Despite frequent reports of using flash-
cards for more than one class (Question 1C and
A14), Question A15 indicates that almost no
one said they would mix flashcards of different
topics when studying.
Putting the pieces together: Calendar
Tables 2 and 3 report outcomes from fine-grained
questions that prompted students to report either
on amount of practice or on timing of practice.
Table 4 reports outcomes from a more global
survey question that was designed to explore how
students would make decisions about both the
amount and timing of practice and how flashcard
use might be combined with the use of other
strategies. For this question students were shown
a visual calendar display for the month of
February along with the informational prompt
and questions shown in Table 4.
Not surprisingly almost all students said that
they would study the night before an exam (93%)
and a fair number reported studying the morning of
an exam (38%). Although prior research has shown
that massed study can enhance performance on
immediate tests, cramming before an exam is not
sufficient for longer-term retention (e.g., Balota,
Duchek, & Paullin, 1989; Karpicke & Roediger,
2007; Rawson & Kintsch, 2005). Responses to
Question 4A are consistent with outcomes de-
scribed above, in that almost all students reported
that they would study on more than one day.
However, results also extend the finding that
students do not understand lag effects. For students
who reported they would study in two or more
sessions, the majority reported that they would only
separate their study sessions by about one day.
Responses to Question 4B indicate that a majority of students (65%) said they would use two or three different strategies when studying for the exam. However, students also reported that they would rely more heavily on restudying than on self-testing. Restudying notes had the highest reported percentage for strategies used the morning of an exam, the night before an exam, or any other day, and 52% of students said they would restudy notes during every study session. In contrast, only 34% said they would use flashcards the morning of an exam and 38% said they would use flashcards the night before the exam. Likewise, only 34% said they would use flashcards on other days, and only a minority (24%) said they would use flashcards on every other day they studied. Although the percentage of students who said they would use flashcards is lower than in other survey questions described above (e.g., 68% in Question 1A), note that Question 4B asked students specifically about preparing for an exam in General Psychology. To revisit Question 1B, a majority of students reported using flashcards for vocabulary (83%) versus information from notes or key concepts (29% and 28%, respectively). Given that General Psychology typically involves learning key concepts and related material rather than vocabulary (as in foreign language classes), the lower percentage of students who reported they would use flashcards in Question 4B is consistent with the percentage who reported using flashcards for key concepts in Question 1B.
The current survey extends beyond previous survey research by exploring how and when students self-test, with an emphasis on flashcard use. Of primary interest, two key patterns emerged: First, most students report engaging in criterion learning and understand the benefits of practising to higher criterion levels. Second, students do not appear to implement or understand the benefits of longer lags when self-testing with flashcards. Concerning learning to criterion, a majority of students choose to reach at least one correct recall during practice before dropping a given flashcard (Questions 2A, 2B, and 2D), although it is interesting to note that students do not always check their answer against external information (Question 2L). Concerning criterion level, some students spontaneously reported going beyond one correct recall during practice (Questions 2A and 2C) and almost all students believe that going beyond one correct recall is important (Questions 2G and 2H). In addition, a majority of students report using more than one practice session (Question 2K). Concerning lag, responses indicate that a majority of students opt for smaller stacks of flashcards, and thus shorter lags, when self-testing (Questions 3A and 3B). Of critical importance is that not one student (regardless of preferred stack size) indicated awareness of the relationship between lag and learning (Questions 3A and 3B). Overall, the results suggest that when students self-test with flashcards they make advantageous decisions regarding the amount of practice but detrimental decisions regarding the timing of practice.
Note that some questions in the current survey tapped students' beliefs whereas others tapped students' behaviours. We used this multi-faceted approach because what students believe about effective study strategies and what they actually do when studying likely do not completely align (e.g., due to time pressure). For example, nearly all students believe that correctly recalling an item more than once has a substantial benefit on later retrieval (Question 2H), whereas only 34% of students spontaneously reported practising to a higher criterion when studying (Question 2C).

TABLE 4
Responses to calendar question

Here is a calendar for the month of February. Imagine that today is Feb 1st and you have an exam in General Psychology on Feb 27th. Suppose you wanted to get an A on this exam. In the field to the right, answer these questions in as much detail as possible:

4A. Which day or days would you study for the exam (write exact dates)? (n = 286)

Participants reporting they would study the morning of the exam: 38%
Participants reporting they would study the night before the exam: 93%

Not counting the morning of the exam or the night before the exam, number of other days participants reported they would study:
0 days      6.3%
1 day      13.0
2 days      9.1
3 days     13.3
4 days      8.7
5 days      7.3
6 days      9.8
7–8 days   10.8
9+ days    21.7

For participants who reported studying in two or more sessions (n = 273), the mean lag between sessions:
1 day (1.0–1.5)    66.3%
2 days (1.6–2.5)   21.2
3 days (2.6–3.5)    7.0
4 days (3.6–4.7)    3.3
7 days (6.5–7.5)    2.2

4B. What exactly would you be doing in each of these study sessions to prepare for the exam? Next to each date you wrote down, write what you would be doing during that session to prepare for the exam (n = 269)

Number of strategies participants reported using to study across sessions:
1 strategy     20.0%
2 strategies   32.6
3 strategies   31.9
4 strategies   11.8
5 strategies    3.3
6 strategies    0.4

For participants who indicated they would study the morning of the exam (n = 108), how did they say they would study? For participants who indicated they would study the night before the exam (n = 247), how did they say they would study? For participants who indicated they would study on at least one other day (n = 252), how did they say they would study?*

                     Morning   Night   Other day(s)
Restudy notes        70.4%     68.8%   65.6%
Restudy textbook     30.6      36.4    40.0
Flashcards           34.3      38.4    34.0
Practice test/quiz   15.7      14.6    10.2
Other                21.3      25.5    21.2

How many sessions on other day(s) participants reported they would use flashcards, restudy the textbook, and restudy notes to prepare for the exam (n = 252)

                            Flashcards   Restudy textbook   Restudy notes
Every session               24.2%        30.6%              51.6%
No session                  55.6         48.8               23.0
Some but not all sessions   20.2         20.6               25.4

*This question only includes participants who provided responses for both how and when they would study. The sum of response percentages exceeds 100% because participants could report more than one response.
Additionally, if students do not report engaging in a particular behaviour, questions about beliefs help to diagnose whether suboptimal behaviour is due to a metacognitive knowledge deficit or an implementation deficit. For example, 71% of students report dividing up information into smaller stacks (Question 3A); the finding that 72% of students believe using a smaller versus larger stack of flashcards is better (Question 3B) suggests that the suboptimal behaviour is primarily due to a metacognitive knowledge deficit. More generally, we would advocate for the inclusion of both belief and behaviour questions in future survey research.
Our findings concerning how and when students use flashcards have several important implications for educational practice. Perhaps most importantly, the results showing a general lack of understanding about how lag affects learning, and that relatively few students effectively implement lag either within or between practice sessions, indicate that students should be educated about the benefits of lag and how to effectively implement lag when self-testing. Another unanticipated but important outcome concerns the extent to which students reported using flashcards for a relatively restricted range of materials (in Question 1B, 83% of students report using flashcards for vocabulary whereas only 29% report using flashcards for key concepts). Thus students would also likely benefit from education about the utility of flashcards for learning a wider range of material. In addition, responses concerning how students evaluate the correctness of their answers (Question 2A) suggest that students should also be educated about the importance of explicitly checking their answers, particularly when using flashcards to learn complex material.
Beyond implications for more effective flashcard use in particular, the current survey results have practical implications for supporting successful student learning with the use of self-testing more generally. Although a majority of students choose to self-test during study, the current results suggest that students would benefit from more information or training about how to effectively implement self-testing. One straightforward approach would be for instructors to teach students about the importance of the amount and timing of self-testing. Unfortunately, the current results suggest that relatively few instructors advise students to use self-testing as a learning strategy (Question A13), much less how to self-test effectively. Given that few instructors recommend self-testing to students, educators themselves may be unaware of the robust effects of self-testing and the factors that influence the efficacy of self-testing. Ideally, teacher training programmes could educate instructors on the benefits of self-testing, factors that influence the effectiveness of self-testing, and ways to incorporate self-testing methods into curricula. Another approach to enhance the effective use of self-testing is through study skills classes or workshops increasingly offered by universities to support student achievement. Ideally, these programmes would target first-semester freshmen, who may have never been taught about the use of self-testing. The goal would be to educate students on how and when to self-test in ways that facilitate learning (e.g., higher criterion levels and longer lags). More generally, promoting students' awareness of the benefits of self-testing and providing knowledge of how to effectively self-test in real-world contexts will support more successful student learning.
Manuscript received 30 November 2011
Manuscript accepted 12 April 2012
First published online 8 June 2012
REFERENCES

Bahrick, H. P., & Hall, L. K. (2005). The importance of retrieval failures to long-term retention: A metacognitive explanation of the spacing effect. Journal of Memory and Language, 52, 566–577.
Balota, D. A., Duchek, J. M., & Paullin, R. (1989). Age-related differences in the impact of spacing, lag, and retention interval. Psychology and Aging, 4, 3–9.
Carrier, M. L. (2003). College students' choices of study strategies. Perceptual and Motor Skills, 96, 54–56.
Cepeda, N. J., Vul, E., Rohrer, D., Wixted, J. T., & Pashler, H. (2008). Spacing effects in learning: A temporal ridgeline of optimal retention. Psychological Science, 19, 1095–1102.
Dunlosky, J., Hartwig, M. K., Rawson, K. A., & Lipko, A. R. (2011). Improving college students' evaluation of text learning using idea-unit standards. Quarterly Journal of Experimental Psychology, 64,
Dunlosky, J., & Rawson, K. A. (in press). Overconfidence produces underachievement: Inaccurate self evaluations undermine students' learning and retention. Learning and Instruction.
Hartwig, M. K., & Dunlosky, J. (2012). Study strategies of college students: Are self-testing and scheduling related to achievement? Psychonomic Bulletin & Review, 19, 126–134.
Karpicke, J. D. (2009). Metacognitive control and strategy selection: Deciding to practice retrieval during learning. Journal of Experimental Psychology: General, 138, 469–486.
Karpicke, J. D., & Bauernschmidt, A. (2011). Spaced retrieval: Absolute spacing enhances learning regardless of relative spacing. Journal of Experimental Psychology: Learning, Memory, and Cognition, 37,
Karpicke, J. D., Butler, A. C., & Roediger, H. L. III. (2009). Metacognitive strategies in student learning: Do students practice retrieval when they study on their own? Memory, 17, 471–479.
Karpicke, J. D., & Roediger, H. L. III. (2007). Expanding retrieval practice promotes short-term retention, but equally spaced retrieval enhances long-term retention. Journal of Experimental Psychology: Learning, Memory, and Cognition, 33, 704–719.
Karpicke, J. D., & Roediger, H. L. III. (2008). The critical importance of retrieval for learning. Science, 319, 966–968.
Kornell, N. (2009). Optimising learning using flashcards: Spacing is more effective than cramming. Applied Cognitive Psychology, 23, 1297–1317.
Kornell, N., & Bjork, R. A. (2007). The promise and perils of self-regulated study. Psychonomic Bulletin & Review, 14, 219–224.
Kornell, N., & Bjork, R. A. (2008a). Optimising self-regulated study: The benefits and costs of dropping flashcards. Memory, 16, 125–136.
Kornell, N., & Bjork, R. A. (2008b). Learning concepts and categories: Is spacing the "enemy of induction"? Psychological Science, 19, 585–592.
Kornell, N., & Metcalfe, J. (2006). Study efficacy and the region of proximal learning framework. Journal of Experimental Psychology: Learning, Memory, and Cognition, 32, 609–622.
Kornell, N., & Son, L. K. (2009). Learners' choices and beliefs about self-testing. Memory, 17, 493–501.
Lipko, A. R., Dunlosky, J., Hartwig, M. K., Rawson, K. A., Swan, K., & Cook, D. (2009). Using standards to improve middle-school students' accuracy at evaluating the quality of their recall. Journal of Experimental Psychology: Applied, 15, 307–318.
McCabe, J. (2010). Metacognitive awareness of learning strategies in undergraduates. Memory & Cognition, 39, 462–476.
Pashler, H., Zarow, G., & Triplett, B. (2003). Is temporal spacing of tests helpful when it inflates error rates? Journal of Experimental Psychology: Learning, Memory, and Cognition, 29, 1051–1057.
Pyc, M. A., & Dunlosky, J. (2010). Toward an understanding of students' allocation of study time: Why do they decide to mass or space their practice? Memory & Cognition, 38, 431–440.
Pyc, M. A., & Rawson, K. A. (2007). Examining the efficiency of schedules of distributed retrieval practice. Memory & Cognition, 35, 1917–1927.
Pyc, M. A., & Rawson, K. A. (2009). Testing the retrieval effort hypothesis: Does greater difficulty correctly recalling information lead to higher levels of memory? Journal of Memory and Language, 60,
Pyc, M. A., & Rawson, K. A. (2012). Why is test–restudy practice beneficial for memory? An evaluation of the mediator shift hypothesis. Journal of Experimental Psychology: Learning, Memory, and Cognition, 38, 737–746.
Rawson, K. A., & Dunlosky, J. (2011). Optimising schedules of retrieval practice for durable and efficient learning: How much is enough? Journal of Experimental Psychology: General, 140, 283–302.
Rawson, K. A., & Kintsch, W. (2005). Rereading effects depend upon time of test. Journal of Educational Psychology, 97, 70–80.
Rawson, K. A., O'Neil, R. L., & Dunlosky, J. (2011). Accurate monitoring leads to effective control and greater learning of patient education materials. Journal of Experimental Psychology: Applied, 17,
Roediger, H. L. III., & Butler, A. C. (2011). The critical role of retrieval practice in long-term retention. Trends in Cognitive Sciences, 15, 20–27.
Taylor, K., & Rohrer, D. (2010). The effects of interleaved practice. Applied Cognitive Psychology, 24, 837–848.
Vaughn, K. E., & Rawson, K. A. (2011). Diagnosing criterion-level effects on memory: What aspects of memory are enhanced by repeated retrieval? Psychological Science, 22, 1127–1131.
APPENDIX A
TABLE 5
Background information and outcomes of secondary interest

A11. What is your college standing, based on number of credit hours completed? (n = 372)
Freshman    66.1%
Sophomore   15.9
Junior       8.9
Senior       9.1

A12. What was your approximate GPA in high school? (n = 370) What is your approximate GPA in college? (n = 308)
              High school   College
Below 2.00     0.8%          2.9%
2.00–2.49      4.1          10.7
2.50–2.99     18.4          25.3
3.00–3.49     35.1          32.5
3.50–4.00     38.4          28.6
Above 4.00     3.2           n/a

A13. Has anyone ever given you advice or taught you how you should study? (n = 135)
Yes   66.7%
No    33.3

When was it (what grade were you in)? (n = 67)*
Grade school    13.4%
Middle school   11.0
High school     47.8
College         40.3
All grades       6.0

Who was it (for example, a teacher or a friend)? (n = 86)*
Teacher/professor   75.6%
Friend              23.3
Parent              19.8
Other                7.0

What did they tell you to do when studying? (n = 73)*
                                        Overall   Teacher/professor
Flashcards                              37.0%     22.5%
Make outline                            27.4      27.5
Read/reread material                    27.4      30.0
Practice problems/quizzes               24.7      25.0
Find out best method for self           17.8      17.5
Make up system (rhymes, colour code)    12.3      12.5
Work in groups                           6.8       5.0
Make charts/graphs                       4.1       5.0
Other                                    4.1       5.0

A14. Have you ever been in a situation in which you were using flashcards to study two topics at the same time (for example, might have had two exams, in Spanish and Chemistry)? (n = 149)
Yes   59.1%
No    40.9

A15. Suppose you had sets of flashcards for two different classes. Would you mix the cards from the two topics and study them together or would you keep the cards from the two topics separate? (n = 150)
Don't mix   97.8%
Mix          2.2

If you said don't mix, explain why you would study them separately (n = 141)*
Confusing                      63.1%
Different subjects             53.9
Mixed hurts memorisation       17.0
Separate helps memorisation     6.4
Makes more challenging          3.5
Classes coincide                2.1

*For these questions the sum of response percentages exceeds 100% because participants could report more than one response.
Copyright of Memory is the property of Psychology Press (UK) and its content may not be copied or emailed to
multiple sites or posted to a listserv without the copyright holder's express written permission. However, users
may print, download, or email articles for individual use.
... Indeed, students report dropping cards from study to make better use of study time [20][21]. However, students often overestimate how much they have learned [22] and thus make suboptimal decisions on when to drop an item from study. ...
... Several studies have shown participants will drop a card from study after only one correct recall [9,10,20] which is not enough to build long-term knowledge [23][24][25]. Thus, learners fail to recognize the benefits of repeated and spaced practice [20][21]. ...
... The research is clear that learners make many sub-optimal decisions when using flashcards [9,10,[20][21][22]. How can we improve flashcards-based learning to overcome these pitfalls? ...
Full-text available
Flashcards are a popular study tool, however learner decisions can lower their effectiveness. One such decision is whether or not to drop a concept from study. Using objective mastery criteria that adaptively determine when to add or drop an item from study based on performance may improve learning outcomes in flashcard-based tasks. The effectiveness of adaptive flashcard-based learning may also vary based on the cognitive ability of the learner. The current study examined the impact of adaptive mastery instructional strategies on learning butterfly species and whether or not the impact of adaptive mastery varies by cognitive ability. Three learning conditions were compared: a No Add/Drop group (all items remain in the deck throughout study), a Mastery Drop group (start with all items, then drop after an item is mastered), and a Mastery Add group (start with three items, add items once mastered). A pre-post-transfer test design was used both immediately after training and one week later. Participants also completed the symmetry span task and a change detection task to evaluate cognitive ability. Results show the worst overall immediate pre-post learning gains in the Mastery Drop condition compared to the Mastery Add and No Add/Drop conditions which showed similar learning gains. This pattern went away when looking at delayed pre-post learning gains. Cognitive ability did not have any impact on learning performance, suggesting that similar strategies work equally well across all levels of cognitive ability. These results suggest adaptively adding cards is better than dropping them, though if there are no time constraints, leaving all concepts in the deck leads to the best overall learning in the short term.
... Although the retrieval practice literature does not focus on flashcards per se, it is plausible that its findings can be extrapolated to flashcard use given that the retrieval practice effect is robust across different test formats and methods of engaging in self-testing (e.g., Rowland, 2014). Surveys of paper flashcard use indicate that self-testing is a common activity among students when using this study tool (Wissman et al., 2012). The present survey investigated whether similar patterns occur for digital flashcards. ...
... The retrieval practice effect is enhanced by correct answer feedback (e.g., Pan, Hutter, et al., 2019; for review see Rowland, 2014), as it can help learners to maintain their own correct response (Butler et al., 2008) or adjust their response if they made errors (Kang et al., 2007). However, surveys of paper flashcard use suggest that about one-third of students infrequently check the back of their flashcards after testing themselves (Wissman et al., 2012), and in one empirical study, students even dropped flashcards from study after no correct retrievals if they felt so unduly confident in their retrieved response that they declined to check the back of the flashcard (Kornell & Bjork, 2008). The present survey explored the possibility that feedback may also be underutilised among users of digital flashcards. ...
... They might also avoid dropping flashcards from further study, thereby maintaining the spacing between items and strengthening learning via additional practice. However, students do not typically capitalise on the potential benefits of spacing when using paper flashcards, as they often prefer to use smaller flashcard sets when studying and believe that smaller sets are better for learning than larger flashcard sets (Wissman et al., 2012). (For a more detailed discussion and comparison of implementing successive relearning in different flashcard programmes, see Dunlosky & O'Brien, 2020). ...
Over the past two decades, digital flashcards—that is, computer programmes, smartphone apps, and online services that mimic, and potentially improve upon, the capabilities of traditional paper flashcards—have grown in variety and popularity. Many digital flashcard platforms allow learners to make or use flashcards from a variety of sources and customise the way in which flashcards are used. Yet relatively little is known about why and how students actually use digital flashcards during self-regulated learning, and whether such uses are supported by research from the science of learning. To address these questions, we conducted a large survey of undergraduate students (n = 901) at a major U.S. university. The survey revealed insights into the popularity, acquisition, and usage of digital flashcards, beliefs about how digital flashcards are to be used during self-regulated learning, and differences in uses of paper versus digital flashcards, all of which have implications for the optimisation of student learning. Overall, our results suggest that college students commonly use digital flashcards in a manner that only partially reflects evidence-based learning principles, and as such, the pedagogical potential of digital flashcards remains to be fully realised.
... Given these findings, we advise practitioners to use short-answer testing rather than multiple-choice testing to foster learning in university teaching and that practitioners encourage learners to use short-answer questions when learning on their own. Wissman et al., 2012). Digital flashcards and online quizzes have the common purpose of responding to questions about the learning content. ...
... Chapter V Learners and lecturers often use computer-assisted techniques to revise learning content. Conventional techniques include the use of (electronic) flashcards and clicker questions in offline courses Wissman, Rawson, & Pyc, 2012) immediately answered by the learners. Learners using these technologies, knowingly or unknowingly benefit from the testing effect, also known as retrieval practice effect or testenhanced learning. ...
Full-text available
Improving retention of learned content by means of a practice test is a learning strategy that has been researched since a century and has been consistently found to be more effective than comparable learning strategies such as restudy (i.e., the testing effect). Most importantly, practicing test questions has been found to outperform restudy even when no additional information about the correct answers was provided to practice test takers, rendering practice tests effective and efficient in fostering retention of learning content. Since 15 years, additional scientific attention is devoted to this memory phenomenon and additional research investigated to what extend practicing test questions is relevant in real-world educational settings. This dissertation first presents the evidence for testing effects in applied educational settings by presenting key publications and presenting findings from a methodological review conducted for this purpose. Within this dissertation, theories are presented why practicing test questions should benefit learning in real-world educational settings even without the provision of additional information and key variables for the effectiveness of practicing test questions are presented. Four studies presented in this dissertation aimed at exploring these assumptions in actual university classrooms while also trying to implement new methods of practicing learning content and thus augment course procedures. Findings from these studies—although not often consistent—will be incorporated and interpreted in the light of the theoretical accounts on the testing effect. The main conclusion that can be drawn from this dissertation is that, given the right circumstances, practicing test questions can elicit beneficial effects on the retention of learning content that are independent of additional information and thus taking a practice test per se, can foster retention of real-world learning content.
... How do students attempt to learn declarative concepts in real-world contexts? Survey research suggests that students practice recalling the definitions of these concepts (Wissman et al., 2012), with empirical evidence indicating that using retrieval practice for definitions is an effective learning technique (Dunlosky & Rawson, 2015;Tauber et al., 2018). Interestingly, other research suggests that provided examples may be an effective learning technique for enhancing understanding of declarative concepts, but these studies have treated examples as supplementary support for learning the abstract concepts (rather than focused on learning examples directly as primary targets of learning; e.g., Rawson et al., 2015;Zamary & Rawson, 2018a, 2018b. ...
... The current research evaluated the utility of treating examples as primary targets of learning for supporting declarative concept application. We compared this approach of learning examples to targeting definitions for study, which students indicate doing when using retrieval practice as a study strategy (Wissman et al., 2012) and is the common approach implemented in empirical research investigating the effects of retrieval practice on retention of definitions (e.g., Dunlosky & Rawson, 2015;Tauber et al., 2018). Across both experiments, application of declarative concepts was greater following examples practice compared to definitions practice (see Figure 3). ...
Declarative concepts are abstract concepts denoted by key terms and short definitions that can be applied in a variety of scenarios (e.g., positive reinforcement in psychology; Rawson et al., 2015). One common learning goal for declarative concepts is to instill knowledge that students can use to support the application of content in novel scenarios. Given theoretical perspectives and empirical evidence from related literatures, one promising approach for supporting declarative concept application is learning examples. The purpose of the current research was to evaluate the utility of using examples as primary targets of learning for declarative concept application. In two experiments, participants read a textbook passage that included the definition and an example of 10 declarative concepts. Participants then learned the target material by recalling either the definition or the example of each concept. Across both experiments, declarative concept application was greater following practice focused on learning examples versus definitions. Results suggest that using this strategy may be an effective technique for supporting the application of definitions, which are foundational to many introductory courses. (PsycInfo Database Record (c) 2022 APA, all rights reserved).
... Hartwig and Dunlosky (2012); Geller et al. (2018), and Morehead et al. (2016) found even higher levels (more than 70%) of students' reported use of self-testing by questions or practice problems. Yet self-testing and practice testing were rarely the reported prime method (except, for example, when studying with flashcards for vocabulary or key concepts, Wissman et al., 2012). ...
We doubt the prevailing interpretation of lower Judgments of Learning (JOLs) for testing over rereading to reflect learners' favoritism of an ineffective activity. We argue that JOLs for testing are biased due to a negative feedback effect. In three preregistered experiments (Nfinal = 306), we eliminated the feedback effect by asking students to only imagine learning with the described activities (rereading/testing) after reading a text and by capturing offline-JOLs (off-JOLs = being decoupled from the current learning experience) as a function of an imaginary final test delay (5 min/1 week/2 weeks). In 5-min conditions, off-JOLs consistently reflected no differences between rereading and testing; in 1-week and 2-week conditions, two (of three) experiments demonstrated an advantage of testing over rereading. These results are consistent with actual learning outcomes in an experiment using the same text and activities (Rummer et al., 2017, Exp. 1). Learners’ metacognitive judgments resembled actual learning outcomes more accurately than suggested by previous research.
... The choice of e-learning assisted by flashcards is because flashcards are an effective learning method so that students can learn factual knowledge and increase motivation in active learning. Flashcards have several important implications for the learning process, where the use of flashcard media can be embedded into connected applications as interactive quizzes [14], as a modified game-based-teaching material in the form of a digital quiz [15]. The use of flashcards makes it easier for students to understand the material being taught so as to make students not feel bored, and can deepen students' understanding. ...
Full-text available
Since the outbreak of the pandemic, it requires educators and students at all levels of education to be able to adapt to virtual learning. Based on the needs analysis that the researchers conducted through an online questionnaire for science class of XI students, their obstacle during distance learning was that 64% of students still had difficulty understanding physics material. Students have difficulty analyzing problems in physics problems because there are too many formulas that must be understood, as well as learning factors for students who are less effective. Most of them are interested in interactive learning methods by emphasizing audiovisual, as well as animations and learning videos. Therefore, the study aims to develop e-learning assisted by flashcards to improve the scientific literacy in high school students, because it is easily accessible on any device, can be via a laptop or smartphone. The average validation results by media experts are 89.58%, material experts are 90.65%, learning experts are 94.15% and by class XI physics teachers are 96.00%. The average value of the overall validation test by experts is 92.6% with the interpretation of “very feasible” as a learning medium with an N-Gain of 0.44 on a medium increase interpretation.
... The extensive flashcard learning literature has demonstrated that flashcards help students during knowledge acquisition, but the way students engage with the materials can have a large impact on learning. In general, students use flashcards to prepare for exams by memorizing the information, testing themselves on the information, and assessing their proficiency (Hartwig and Dunlosky 2012;Karpicke, Butler, and Roediger 2009;Kornell and Bjork 2007;Wissman, Rawson and Pyc 2012). The way students use flashcards to self-test is related to the classic finding in the literature called the testing effect, which states that testing oneself is more beneficial to long-term retention than just studying or re-reading the material. ...
Full-text available
In an effort to modernize training, the United States Navy and Marine Corps have placed an emphasis on identifying effective, learner-centric instructional methods. One avenue is to apply individualized training techniques, such as adaptive sequencing to flashcard-based study, a popular tool used for independent study. Therefore, the goal of this research was to compare two adaptive sequencing methods to identify the most efficient and effective approach for long-term retention. Participants learned armored vehicle identification in one of three conditions, 1) Adaptive Response-Time Based-Sequencing (ARTS), which uses accuracy and reaction time to prioritize flashcards adaptively, 2) Leitner, which uses accuracy to create decks and prioritize flashcards adaptively, and 3) Random, which was the control condition that sequenced the flashcards randomly. We found no differences between the three conditions in terms of learning efficiency and delayed learning gains. However, we found that participants in the Leitner condition completed training significantly faster those in the other conditions. After controlling for in-training measures, we found participants in the Leitner condition had the lowest delayed learning gains and there were no significant differences between the ARTS and Random conditions. These data suggest that the “efficiency” associated with the Leitner condition translated to the worst long-term retention. Additionally, neither adaptive approach outperformed the random sequencing condition, suggesting that random sequencing may provide its own spacing due to the number of cards in the deck. More research is needed to determine whether adaptive sequencing provides added value as an adaptive approach for single-session flashcard study.
... split the picture and its description on Fig. 3). This approach is based on the so-called flashcard method (Wissman, et al., 2012), where on one side of the card is written a fact, a title or a task and on the other side is a solution or related fact (e.g. English and Spanish word, mathematical example and its result, historical event name and its year, chemical symbol and an element name, etc.). ...
The article deals with a technique for memorizing a larger number of pairs of related facts (e.g. words in the mother tongue and a foreign language) using the flashcard method transformed into the classic card Memory Game. For this purpose, the new authoring mobile application Own Memory is introduced, whose major advantage is support for creating custom card sets, complemented by many original, useful options. In addition to images, the application adds support for text and sound cards that can be used both separately and combined. These features make it easier to create sets with an effective educational component, allowing users to learn paired facts with the long-proven flashcard method, but in a much more fun form than the classic version. This game-based learning can be enjoyed alone, against an artificial intelligence of the selected level, or with friends or family, even on devices with a larger screen. Thanks to the open format of the card sets, they can be prepared outside of this application in any favourite software. Development of Own Memory continues, but already in its current release it has proven to be a high-quality and useful tool for a modern way of effective learning, where it is important to store a larger number of different paired facts in long-term memory.
Student engagement and its measurement remain contested by different research disciplines that aim to evaluate education. In computer analytics, engagement is momentary, and longer-term engagement arises from accumulated momentary engagement. By contrast, policymakers see engagement as fundamentally longer-term, emerging from the student’s identification and sense of belonging. The definition of engagement is a long-debated topic, and this lack of agreement is a major problem for the field. Behavioral engagement is dedicated time and effort, which may be necessary and is certainly contributory to the outcomes of learning and development. Various engagement theories have included behavioral, cognitive, emotional, and social elements. Some engagement paradigms have subsumed other contributors to learning, such as motivation, cognitive strategies, and social communities. Despite much recent progress in our understanding of learning science, the relationship of the benefits of momentary to longer-term engagement is insufficiently researched. We and others have attempted to tease apart engagement’s causes and consequences. Some technological approaches (e.g., gamification) boost momentary engagement to kindle longer-term engagement; this may be the mechanism of active learning and frequent formative assessments. The opposite pedagogical approach uses longer-term, internalized motivation as a substrate that allows consistent triggering of momentary engagement by traditional pedagogies. Whereas metrics of behavioral engagement are sometimes agreed upon, the emotional-cognitive-behavioral meta-construct of engagement may cause more confusion than clarity when evaluating outcomes such as development and learning. 
Thus, using engagement as an outcome, instead of measuring learning and development directly, solves some policy problems, but it exacerbates many scientific problems regarding mechanism. Keywords: Student engagement; Student engagement in higher education; SEHE; Short-term engagement; Moment-to-moment engagement; Micro engagement
The paper highlights the psychological and pedagogical conditions for the use of ICT tools in teaching syntax and reveals ways of introducing the experience of using them into the professional training of future language teachers within the course "Modern Ukrainian Literary Language: Syntax". The research literature on the didactic principles of ICT use in the educational process of future teachers, and on the application of ICT in the language training of philology students, was reviewed. The study stresses the significant role of syntax in the linguistic hierarchy: syntax helps students become aware of the laws of phrase and sentence structure, their expressive potential and means, and the ways these are realized in any discourse. It is important to follow the provisions developed by M. Zhaldak, S. Selian, N. Berger, and Z. Dovedan. The paper outlines didactic principles, methods, and techniques of ICT use in the professional training of future language teachers, and suggests criteria for building a system of exercises and assignments with the help of ICT. It describes ways of applying ICT tools in blended and distance learning for philology students, and presents a system of cognitive, training, research, and creative exercises. The study established that the use of digital narratives, mental maps, infographics, flashcards, and electronic dictionaries in the training of future teachers of Ukrainian language and literature has a positive effect on students' mastery of the educational material: it motivates and stimulates students; encourages self-cognition, self-learning, reflection, and self-improvement; develops critical and creative thinking; offers different strategies for memorizing information; enables interactive interaction among all participants in the educational process; provides the possibility of interval revision; and makes the learning process active and interesting. Prospects for further research are also outlined.
Previous research has shown better text learning after rereading versus 1 reading of a text. However, rereading effects have only been explored using immediate tests, whereas most students face delays between study and test. In 2 experiments, 423 college students read a text once, twice in massed fashion, or twice with 1 week between trials. Students were tested either immediately or 2 days after study. On an immediate test, performance was greater after massed versus single reading, whereas performance for distributed rereading was not significantly greater than after single reading. On a delayed test, performance was greater after distributed versus single reading, whereas performance for massed rereading and single reading no longer differed significantly.
The spacing effect—that is, the benefit of spacing learning events apart rather than massing them together—has been demonstrated in hundreds of experiments, but is not well known to educators or learners. I investigated the spacing effect in the realistic context of flashcard use. Learners often divide flashcards into relatively small stacks, but compared to a large stack, small stacks decrease the spacing between study trials. In three experiments, participants used a web-based study programme to learn GRE-type word pairs. Studying one large stack of flashcards (i.e. spacing) was more effective than studying four smaller stacks of flashcards separately (i.e. massing). Spacing was also more effective than cramming—that is, massing study on the last day before the test. Across experiments, spacing was more effective than massing for 90% of the participants, yet after the first study session, 72% of the participants believed that massing had been more effective than spacing.
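Why stack size changes spacing can be sketched with a toy schedule (an illustrative assumption, not the study's programme: cards are simply cycled in a fixed order). In one large stack, every card recurs only after all the other cards intervene; splitting the deck into small stacks shrinks that lag dramatically.

```python
def lags(schedule):
    # Number of intervening trials between successive study events per item.
    last_seen, out = {}, []
    for i, card in enumerate(schedule):
        if card in last_seen:
            out.append(i - last_seen[card] - 1)
        last_seen[card] = i
    return out

# One large stack of 20 cards, cycled twice (spaced practice):
large = list(range(20)) * 2
# Four stacks of 5 cards, each cycled twice back-to-back (massed practice):
small = [c for s in range(4) for c in list(range(s * 5, s * 5 + 5)) * 2]

assert min(lags(large)) == 19  # each card recurs after 19 other trials
assert max(lags(small)) == 4   # within a small stack, only 4 trials intervene
```

The total number of study trials is identical in both schedules; only the lag between repetitions of a given card differs, which is the variable the abstract identifies as driving the spacing benefit.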
Although the memorial benefits of testing are well established empirically, the mechanisms underlying this benefit are not well understood. The authors evaluated the mediator shift hypothesis, which states that test-restudy practice is beneficial for memory because retrieval failures during practice allow individuals to evaluate the effectiveness of mediators and to shift from less effective to more effective mediators. Across a series of experiments, participants used a keyword encoding strategy to learn word pairs with test-restudy practice or restudy only. Robust testing effects were obtained in all experiments, and results supported predictions of the mediator shift hypothesis. First, a greater proportion of keyword shifts occurred during test-restudy practice versus restudy practice. Second, a greater proportion of keyword shifts occurred after retrieval failure trials versus retrieval success trials during test-restudy practice. Third, a greater proportion of keywords were recalled on a final keyword recall test after test-restudy versus restudy practice.
Although substantial research has demonstrated the benefits of retrieval practice for promoting memory, very few studies have tested theoretical accounts of this effect. Across two experiments, we tested a hypothesis that follows from the desirable difficulty framework [Bjork, R. A. (1994). Memory and metamemory considerations in the training of human beings. In J. Metcalfe, A. Shimamura, (Eds.), Metacognition: Knowing about knowing (pp. 185–205). Cambridge, MA: MIT Press], the retrieval effort hypothesis, which states that difficult but successful retrievals are better for memory than easier successful retrievals. To test the hypothesis, we set up conditions under which retrieval during practice was successful but differentially difficult. Interstimulus interval (ISI) and criterion level (number of times items were required to be correctly retrieved) were manipulated to vary the difficulty of retrieval. In support of the retrieval effort hypothesis, results indicated that as the difficulty of retrieval during practice increased, final test performance increased. Longer versus shorter ISIs led to more difficulty retrieving items, but higher levels of final test performance. Additionally, as criterion level increased, retrieval was less difficult, and diminishing returns for final test performance were observed.
Applied a self-report approach to 2 examinations of different types of study strategies in an upper-division college course. The 1st examination was a closed-book multiple-choice test of 60 items; 46 students returned questionnaires. The 2nd examination was an open-book, open-note multiple-choice test of 30 items. For the closed-book examination, study strategies that could promote deep processing correlated positively with scores but were not likely to be used by the students. For the open-book, open-note examination, strategies that might have led to confusion regarding the locations of material in the textbook and lecture notes correlated negatively with scores, although they were not likely to be used by the 58 students.
Previous research shows that interleaving rather than blocking practice of different skills (e.g. abcbcacab instead of aaabbbccc) usually improves subsequent test performance. Yet interleaving, but not blocking, ensures that practice of any particular skill is distributed, or spaced, because any two opportunities to practice the same task are not consecutive. Hence, because spaced practice typically improves test performance, the previously observed test benefits of interleaving may be due to spacing rather than interleaving per se. In the experiment reported herein, children practiced four kinds of mathematics problems in an order that was interleaved or blocked, and the degree of spacing was fixed. The interleaving of practice impaired practice session performance yet doubled scores on a test given one day later. An analysis of the errors suggested that interleaving boosted test scores by improving participants' ability to pair each problem with the appropriate procedure.
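The blocked versus interleaved orderings mentioned above (aaabbbccc vs. abcbcacab) can be generated with a short hypothetical sketch; the function names and the rejection-sampling approach are illustrative, not the study's materials:

```python
import random

def blocked_schedule(skills, reps):
    # aaabbbccc: finish all trials for one skill before starting the next.
    return [s for s in skills for _ in range(reps)]

def interleaved_schedule(skills, reps, seed=0):
    # abcbcacab-style: reshuffle until no skill appears on consecutive trials.
    rng = random.Random(seed)
    trials = [s for s in skills for _ in range(reps)]
    while True:
        rng.shuffle(trials)
        if all(trials[i] != trials[i + 1] for i in range(len(trials) - 1)):
            return trials
```

Note that an interleaved order automatically spaces repetitions of each skill apart, which is exactly the confound this experiment addressed by holding the degree of spacing fixed across conditions.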
Encoding strategies vary in their duration of effectiveness, and individuals can best identify and modify strategies that yield effects of short duration on the basis of retrieval failures. Multiple study sessions with long inter-session intervals are better than massed training at providing discriminative feedback that identifies encoding strategies of short duration. We report two investigations in which long intervals between study sessions yield substantial benefits to long-term retention, at a cost of only moderately longer individual study sessions. When individuals monitor and control encoding over an extended period, targets yielding the largest number of retrieval failures contribute substantially to the spacing advantage. These findings are relevant to theory and to educators whose primary interest in memory pertains to long-term maintenance of knowledge.
Previous studies, such as those by Kornell and Bjork (Psychonomic Bulletin & Review, 14:219-224, 2007) and Karpicke, Butler, and Roediger (Memory, 17:471-479, 2009), have surveyed college students' use of various study strategies, including self-testing and rereading. These studies have documented that some students do use self-testing (but largely for monitoring memory) and rereading, but the researchers did not assess whether individual differences in strategy use were related to student achievement. Thus, we surveyed 324 undergraduates about their study habits as well as their college grade point average (GPA). Importantly, the survey included questions about self-testing, scheduling one's study, and a checklist of strategies commonly used by students or recommended by cognitive research. Use of self-testing and rereading were both positively associated with GPA. Scheduling of study time was also an important factor: low performers were more likely to engage in late-night studying than were high performers; massing (vs. spacing) of study was associated with the use of fewer study strategies overall; and all students, but especially low performers, were driven by impending deadlines. Thus, self-testing, rereading, and scheduling of study play important roles in real-world student achievement.
Effective management of chronic diseases (e.g., diabetes) can depend on the extent to which patients can learn and remember disease-relevant information. In two experiments, we explored a technique motivated by theories of self-regulated learning for improving people's learning of information relevant to managing a chronic disease. Materials were passages from patient education booklets on diabetes from NIDDK. Session 1 included an initial study trial, Session 2 included self-regulated restudy, and Session 3 included a final memory test. The key manipulation concerned the kind of support provided for self-regulated learning during Session 2. In Experiment 1, participants either were prompted to self-test and then evaluate their learning before selecting passages to restudy, were shown the prompt questions but did not overtly self-test or evaluate learning prior to selecting passages, or were not shown any prompts and were simply given the menu for selecting passages to restudy. Participants who self-tested and evaluated learning during Session 2 had a small but significant advantage over the other groups on the final test. Secondary analyses provided evidence that the performance advantage may have been modest because of inaccurate monitoring. Experiment 2 included a group who also self-tested but who evaluated their learning using idea-unit judgments (i.e., by checking their responses against a list of key ideas from the correct response). Participants who self-tested and made idea-unit judgments exhibited a sizable advantage on final test performance. Secondary analyses indicated that the performance advantage was attributable in part to more accurate monitoring and more effective self-regulated learning. An important practical implication is that learning of patient education materials can be enhanced by including appropriate support for learners' self-regulatory processes.