Research-Based Practice: Communique, Volume 48, Issue 5, pp. 1, 20–25
Belief-Based Versus Evidence-Based Math Assessment and Instruction: What School
Psychologists Need to Know to Improve Student Outcomes
By Amanda M. VanDerHeyden & Robin S. Codding
Many school psychologists work in schools that have low proficiency rates on the year-
end test of mathematics, which is concerning because math proficiency is a powerful indicator
of long-term academic success. For example, Duncan et al. (2007) found that early numeracy
measures forecast later academic proficiency even better than early literacy measures among
young students. Children who meet the college-readiness benchmarks in mathematics tend to
complete 2-year and 4-year degrees at a higher rate and experience higher lifetime earnings
(Lee, 2012). Helping schools ensure that all students meet the college-readiness benchmarks in math is socially meaningful work that can serve as an economic gateway to students' future lives.
Fortunately for school psychologists, whether students are on track toward the college-readiness benchmarks is knowable from certain milestone indicators: early numeracy mastery by kindergarten, whole number operation mastery by grade 3, proportion quantity and operation mastery by grade 6, and linear function mastery by grade 8.
Unfortunately, many systems fail to notice lagging progress until children fail to master
proportions, by which time children are sorted into math tracks (e.g., advanced math, remedial
math). Being sorted into a remedial track at grade 7 or 8 when lagging performance could have
been addressed and prevented in grades 1 to 5 leads many children to unfairly miss out on the
lifetime economic benefit of mathematical proficiency (Lee, 2012). In fact, a recent study found
that 5th grade math proficiency was the strongest basis for predicting who would meet the ACT
college readiness benchmark for math, not the sequence of instruction followed in secondary
schooling (e.g., remedial versus advanced math course sequences; Koon & Davis, 2019).
Addressing math instructional problems is not easy work. One reason that the work is so
challenging for school psychologists is that there is a great deal of philosophy that is at odds
with contemporary evidence, yet is embraced by the teachers, school coaches, and leaders who
observe the large and persistent achievement gaps, care deeply about their students, and want
to avoid harm. The result is tension between evidence-based and philosophy-based practices in
math education. Although consensus building documents, such as Adding it Up (National
Research Council [NRC], 2001) and the National Mathematics Advisory Panel Report (NMAP,
2008), have been published, conflicting recommendations around key approaches to
mathematics instruction and intervention are promoted through websites, blogs, and formal
organizations (Doabler et al., 2015; Rittle-Johnson, Schneider, & Star, 2015). This wealth of
information circulating in both traditional and modern outlets makes it difficult to distinguish
between pseudoscience and scientific approaches to quality mathematics instruction and
intervention (Kratochwill, 2012; Lilienfeld, Ammirati, & David, 2012) and ultimately leaves
students vulnerable to ineffective approaches by well-meaning adults. We believe that if school
psychologists can understand the origin of this tension, they can be key actors in diminishing it, finding common ground, and moving educators toward practices that are more beneficial for students.
What is the Source of This Tension?
How can adults who all share a common goal of helping children experience more
success in mathematics be so at odds about what practices work and which practices should be
avoided? In 1986, Hiebert and LeFevre wrote a chapter distinguishing conceptual understanding from procedural understanding. It is important to recognize the context in which
they wrote this chapter. At the time, algorithm-only instruction was the rule, not the exception,
in classrooms. We believe Hiebert and LeFevre were deliberately challenging the field of math
education to aspire for teachers to understand the underlying coherent structure of
mathematics, so that they could assist students to attain more substantial and lasting
mathematical proficiency. In their chapter, Hiebert and LeFevre defined procedural knowledge as superficial and sequential (as opposed to rich) syntax, steps, conventions, and rules for manipulating symbols, in effect reducing procedural knowledge to little more than memorization of algorithms.
By 2001, the “Adding it Up” report asserted that pitting conceptual against procedural
understanding created a “false dichotomy” (NRC, 2001, p. 122) that ultimately detracted from
the goal of helping U.S. students attain greater mathematical proficiency. The NMAP (2008)
report reiterated that conceptual understanding and procedural fluency are mutually beneficial
and equally important. Yet, by this time, math teachers were hearing the message that
conceptual understanding had to precede procedural skill building and that directly teaching
algorithms could be harmful to students, even from reputable sources like the National Council of Teachers of Mathematics (e.g., https://www.nctm.org/Publications/Teaching-Children-Mathematics/Blog/Strategies-Are-Not-Algorithms, published online in 2016).
In 2005, Star challenged the dichotomization of procedural and conceptual knowledge,
asserting that “depth” is a dimension that can be applied logically to both procedural and
conceptual knowledge. He further argued that Hiebert and LeFevre’s (1986) treatment of
procedural versus conceptual understanding missed the notion of heuristics (i.e., tactics used to
solve problems) and underplayed the hugely important role of flexibility in mathematical
problem solving. Star (2005) defined procedural knowledge as:
… order of steps, the goals and subgoals of steps, the environment or type of situation in
which the procedure is used, the constraints imposed upon the procedure by the
environment or situation, and any heuristics or common sense knowledge that are
inherent in the environment or situation. (p. 409)
Star (2005) argued that proficient learners could use heuristics and demonstrate flexibility in
choosing which procedures to use to solve problems specific to a given context. These
parameters of proficiency had been ignored in Hiebert and LeFevre’s original treatment.
Flexibility is a recognized element of proficiency in mathematics and it is one that
cannot occur without what Star (2005) defined as deep procedural knowledge. One example of
such flexibility that proficient problem solvers use is choosing the format of a proportion given
the type of problem they are trying to solve (e.g., using .58 instead of 33/57, or 3/5 instead of
60%). In math, flexibility is a life skill that allows problem solvers to turn challenging problems
into easier problems in the context in which a problem must be solved (e.g., what level of
precision is required? What operation or operations will I need to conduct?). For example, when solving the problem 6 × (14 ÷ 6) + 10, flexible problem solvers will recognize immediately that the factor of 6 multiplied by (14 ÷ 6) is the same as 14. Problem solvers with only superficial procedural knowledge may simply apply the PEMDAS rule (parentheses, exponents, multiplication, division, addition, subtraction) for order of operations, solve 14 ÷ 6 first, and then try to multiply that quantity by 6, which will produce an imprecise decimal quantity because the repeating decimal that results when 14 is divided by 6 must be rounded. A flexible problem solver will simply choose to represent 14 divided by 6 as a fraction quantity (i.e., 14/6), and then the answer is apparent: 14 + 10, or 24. Mathematical problem solving is rife with these types of examples, beginning with very simple skills (e.g., using a nearby known addition fact to solve a more challenging addition fact) and continuing through more advanced skills like solving linear equations. Given 2
(x + 1) + 3 (x + 1) = 10, a flexible problem solver will recognize that there are two ways to proceed: collect then distribute, 5 (x + 1) = 10, or distribute then collect, 2x + 2 + 3x + 3 = 10, and can choose the method that seems easiest given the problem context. However, given 2 (x + 1) + 3 (x + 2) = 10, the flexible problem solver will recognize that one must distribute and then collect to solve because (x + 1) and (x + 2) are different terms. Flexibility requires understanding the relationships between operations, facility with creating equivalent quantities, constructing problems to solve for an unknown, and choosing the problem-solving step or steps that are easiest in a given problem-solving context.
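To make the cost of superficial procedure-following concrete, the short sketch below (ours, not the authors'; a minimal illustration using Python's standard fractions module) contrasts the rounded-decimal route through 6 × (14 ÷ 6) + 10 with exact fraction arithmetic, and verifies that both solution paths for 2 (x + 1) + 3 (x + 1) = 10 yield x = 1.

```python
from fractions import Fraction

# Superficial route: follow PEMDAS literally, rounding 14 / 6 to two decimals.
rounded = round(14 / 6, 2)            # 2.33 (the repeating decimal 2.333... is cut off)
superficial_answer = 6 * rounded + 10
print(superficial_answer)             # approximately 23.98, not 24

# Flexible route: keep 14/6 as a fraction so 6 * (14/6) collapses exactly to 14.
exact_answer = 6 * Fraction(14, 6) + 10
print(exact_answer)                   # 24

# Flexibility with the linear equation 2(x + 1) + 3(x + 1) = 10:
x_collect_first = Fraction(10, 5) - 1     # collect first: 5(x + 1) = 10
x_distribute_first = Fraction(10 - 5, 5)  # distribute first: 5x + 5 = 10
assert x_collect_first == x_distribute_first == 1  # both routes give x = 1
```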
Some teachers may be unaware of the National Council of Teachers of Mathematics
(NCTM) position statement on procedural fluency that can be found at
https://www.nctm.org/Standards-and-Positions/Position-Statements/Procedural-Fluency-in-
Mathematics, which is consistent with Star (2005). But this very position is seemingly at odds
with other recommendations offered by the same organization which is so influential among
math teachers. For example, a recent president of NCTM addressed fluency with advice that
can be found here (https://www.nctm.org/News-and-Calendar/Messages-from-the-
President/Archive/Linda-M_-Gojak/Fluency_-Simply-Fast-and-Accurate_-I-Think-Not!/). On the
surface, there is nothing objectionable in this advice. In fact, the advice is seductive because it seems so reasonable. However, the recommended approach for building fluency is misleading because it is wholly disconnected from the empirical body of work on how to build fluency and is, in fact, at odds with best practices. A savvy school psychologist
needs to know about these disconnects and find ways to hybridize classroom practices to
ensure that children develop fluent performances that are built upon and in turn benefit
conceptual understanding and reflect generalizable and flexible problem-solving skills. The key
take-away for school psychologists is to understand that (a) procedural fluency and conceptual
understanding emerge in concert around specific and connected skills, (b) high-quality fluency-building instruction requires many opportunities to respond (e.g., practice with feedback), delivered in frequent doses that ensure high student engagement, and (c) knowing
whether students have attained fluency requires the use of brief timed assessments (Burns,
Riley-Tilman, & VanDerHeyden, 2012). It is also important for school psychologists to know that
class-wide fluency-building intervention can be a powerful supplement to typical classroom
practices in math even though teachers may be wary of these evidence-based practices
(VanDerHeyden & Codding, 2015).
In the rest of this paper, we will discuss some common misunderstandings of math
practices and summarize available evidence that school psychologists may use to advise
systems in the thoughtful implementation of evidence-based practices in ways that bring
mathematical success to more children, which is sorely needed in the United States.
Why Timed Assessment Is Important
Teachers may view timed assessment and practice as tantamount to rote memorization,
but the evidence makes a case that timed assessment is an important component of
instructional decision making and that timed practice is a necessary active ingredient of
fluency-building intervention. Why do we rely on timed assessment in mathematics? First, it is
important to use timed assessment at certain decision points because timed assessment provides better information than untimed performance for knowing whether students have attained mastery and whether they are ready for more challenging content. If
timed assessment were not necessary to make meaningful instructional decisions, then perhaps
it could be avoided altogether. So first, let’s understand why we must have timed assessment.
As part of a randomized control trial of class-wide math intervention (VanDerHeyden,
McLaughlin, Algina, & Snyder, 2012), 4th and 5th grade students participated in math screening
in the fall (N = 209) and the spring (N = 218). Procedural controls were followed to ensure
administration fidelity and interscorer agreement. Each measure was scored for digits correct
per 2 minutes for the original study. Raw data were rescored for digits correct per minute and
accuracy of responses (computed as the number of correct digits divided by the total number of
digits attempted and multiplied by 100%). The data provided for this example come from the
fact families measure for multiplication and division facts with numerals 0–12 administered at
both time points. In the scatterplot depicted in Figure 1, for each of 427 administered and
scored measures, the digits correct score is plotted against the accuracy score.
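As a concrete sketch of this scoring (the helper function and example numbers are ours, not from the study), digits correct per minute and accuracy follow directly from the definitions above:

```python
def score_probe(correct_digits: int, attempted_digits: int, minutes: float = 2.0):
    """Score one timed math probe (hypothetical helper).

    Returns digits correct per minute (fluency) and accuracy as a percentage,
    following the definitions given in the study description above.
    """
    dcpm = correct_digits / minutes                       # digits correct per minute
    accuracy = 100.0 * correct_digits / attempted_digits  # % of attempted digits correct
    return dcpm, accuracy

# Example: 46 correct digits out of 50 attempted on a 2-minute probe.
print(score_probe(46, 50))  # (23.0, 92.0)
```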
One pattern that readers should notice right away is the natural tendency of errors to
diminish as performance becomes more fluent. This pattern should resonate with readers,
because with oral reading fluency, students making the most errors while reading are typically
those students reading at lower rates. It is a natural pattern of behavior for errors to diminish
as speeded performance improves.
The level of accuracy that teachers typically require to consider students proficient
might be 90%. That accuracy criterion is reflected by the horizontal line connecting to the y-axis
at 90%. The score in digits correct per 2 minutes (fluency score, which is accuracy plus speed;
Binder, 1996) has two criteria shown as vertical, dashed bars. The one closest to the y-axis
represents the instructional range and the one farther to the right represents the mastery
range of performance (Deno & Mirkin, 1977). Functionally, we know that children who attain
the instructional range of performance in digits correct per two minutes are likely to be making
fewer errors (i.e., they have acquired the skill) and visually we can see that is true in this graph.
Students in this stage of learning will benefit the most (i.e., grow the most rapidly) given
instructional tactics that are designed to build fluency (e.g., increasing opportunities to respond, removing prompts and cues, setting goals, providing rewards, encouraging self-monitoring of performance gains, and using delayed error correction). Students to the right of the
mastery line are students who are ready for generalization opportunities and more challenging
problem types. We also know that students who are to the left of the mastery line, and especially to the left of the instructional line, are highly unlikely to retain the skill after only a few weeks, are highly likely to make errors that compromise understanding, are unlikely to be able to use the skill to solve more complex or novel problems, and will not experience faster learning of more complex related content (Burns, VanDerHeyden, & Jiban, 2006). So here is the
punchline: When a teacher uses untimed assessment to judge whether students have mastered
important mathematical content and understandings, the teacher will be wrong in all those
cases that fall above the 90% criterion but to the left of the instructional line. This is a large number of cases (n = 157, or 55% of cases, in the frustrational range) about whom the teacher would reach an incorrect conclusion, depriving those students of the additional instruction needed to truly attain mastery.
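The decision rule described above can be summarized in a brief sketch. The cut scores below are placeholders standing in for the instructional and mastery criteria represented by the dashed lines in Figure 1 (Deno & Mirkin, 1977); actual criteria vary by grade and measure, so treat these numbers as assumptions.

```python
def classify_performance(dcpm: float,
                         instructional_cut: float = 20.0,   # assumed placeholder
                         mastery_cut: float = 40.0) -> str:  # assumed placeholder
    """Classify a fluency score (digits correct per minute) into the three
    ranges discussed above; cut scores are illustrative assumptions only."""
    if dcpm >= mastery_cut:
        return "mastery: ready for generalization and more challenging problems"
    if dcpm >= instructional_cut:
        return "instructional: will benefit most from fluency-building tactics"
    return "frustrational: needs acquisition support, even if accuracy is high"

# A student at 95% accuracy but only 12 digits correct per minute is still in
# the frustrational range; an untimed, accuracy-only judgment would miss this.
print(classify_performance(12.0))
```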
This problem is an expected limitation of the accuracy metric. Once a student reaches 100% accuracy, there is nothing more that the accuracy metric can tell you about proficiency, and yet there is valuable information still to know. If you have two students who both score 100% on an addition task, but one student has to draw and count hashmarks while the other student can solve problems immediately or employ a variety of efficient strategies to arrive at the correct answer (e.g., 5 + 4 = 5 + 5 − 1), the second student is more proficient, and the only way to detect that superior proficiency is to time the performance. Under timed conditions, the second student would answer more problems correctly than would the first. This truth of assessment (and the limits of accuracy on untimed measures) is exactly why college readiness batteries use timed assessment. It is the timing that separates the 30s from the 36s on the ACT, for example.
Tension 1: Teachers Might Believe That Math Instruction and Assessment Cause Anxiety
Teachers and parents worry about math anxiety and some math education experts
caution against tactics used in math class, such as timed tasks and tests, that might theoretically
stoke anxiety (Boaler, 2012). First, the evidence does not support the idea that people are innately anxious (or not anxious) in the context of math assessment and instruction (Hart & Ganley, 2019). Second, simply avoiding math or certain math tactics should not be expected to
ameliorate anxiety in the long term. Third, preventing a student from full exposure to math
assessment and intervention costs the student the opportunity to develop adaptive coping
mechanisms to deal with possible anxiety in the face of challenging academic content. Fourth,
focusing solely on math anxiety overlooks the important role that schools and teachers can play
in reducing anxiety so children can participate fully in math instruction.
Given this important rationale for timed assessment, what do we know about anxiety in
math? Gunderson, Park, Maloney, Beilock, and Levine (2018) found a reciprocal relationship
between skill proficiency and anxiety, such that weak skill reliably preceded anxiety and anxiety
further contributed to weak skill development. They found that anxiety could be attenuated by
two strategies: improving skill proficiency (this cannot be done by avoiding challenging math
work and timed assessment) and promoting a growth mindset (as opposed to a fixed-ability mindset) using specific language and instructional arrangements that promote ideas such as “I can work hard and beat my score,” “I can grow today,” and “my brain is like a muscle that gets stronger when I work it with challenging math content.” A recent meta-analysis that included
131 studies also found a negative correlation between anxiety and math performance (r = -.34)
and the negative relationship between math anxiety and math performance was stronger when
performance was measured on complex, multistep math tasks and when students believed that
the math task would impact their grades (Namkung, Peng, & Lin, 2019).
Research by Hart and Ganley (2019) found the same association between math skill and adult math anxiety. Most adults in their study of about 1,000 participants reported low to moderate math anxiety. Self-reported adult math anxiety was negatively correlated with fluent
addition, subtraction, multiplication, and division performance (r = - .25 to - .27) and probability
knowledge (r = - .31 to - .34). Self-reported test taking anxiety was negatively correlated with
math skill fluency and probability knowledge, too (r = -.22 to - .26). One must wonder with
these emerging data whether math anxiety has been oversimplified in the press.
What does the evidence say about math anxiety? There is very little empirical evidence
examining whether timed tests have a causal impact on anxiety and the existing few studies
that include school-age participants do not support the idea (Grays, Rhymer, & Swartzmiller,
2017; Tsui & Mazzocco, 2006). What is clear is that there is a modest, negative, bidirectional relationship between math anxiety and math performance (Namkung et al., 2019). These
correlational data suggest that poor mathematics performance can lead to high math anxiety
and that high math anxiety can lead to poor mathematics performance. The remedy that school
psychologists can advocate for is to identify, through effective and efficient screening, the
presence of high math anxiety and determine which students would benefit from supplemental
and targeted mathematics supports. Intervention approaches should target math skill deficits,
address high anxiety, and promote a growth mindset as well as monitor progress toward clearly
defined objectives using tools that are brief (often timed), reliable, and valid.
Tension 2: Teachers Might Believe That Instruction of Algorithms Is Harmful and Conceptual Understanding Must Precede Procedural Knowledge
A substantial body of longitudinal and experimental research has emerged examining
exactly this question (see Rittle-Johnson, 2017 for a review). The empirical evidence has
demonstrated that the purported unidirectional relationship between conceptual
understanding and procedural knowledge is untrue. For example, Hecht and Vagi (2010) found
that procedural knowledge with fractions among fourth graders predicted their conceptual
knowledge with fractions as fifth graders and vice versa even after controlling for prior
knowledge. Schneider, Star, and Rittle-Johnson (2011) also demonstrated that procedural
knowledge predicted conceptual knowledge and vice versa across a broad array of skills and
concepts. Knowledge development is iterative: understanding, which can be thought of as more conceptual than procedural, facilitates procedural knowledge, and procedural knowledge facilitates deeper conceptual understanding. Effective instruction includes both, not in a linear fashion but in a way that facilitates bidirectional input and opportunity and results in understanding and performance that is flexible, retained, adaptable, and useful in learning new, more complex content.
By the time of the “Adding it Up” report in 2001, the research was clear that avoiding
algorithm-only instruction was and is distinct from not teaching the algorithm at all. Despite
this evidence and clear recommendations from the Common Core State Standards for Mathematics (2010) and NMAP (2008) indicating that children should master the standard algorithm, such
instruction currently takes a back seat to alternative approaches to problem solving. The
standard algorithm serves as a link between conceptual understanding and procedural
knowledge. Without understanding the logic of why the standard algorithm works, students will
be unable to determine when to appropriately apply the standard algorithm (Fuson &
Beckmann, 2012–2013; Wu, 2011). However, given that conceptual understanding and
procedural knowledge are bidirectional, students may better understand the conceptual basis
for the standard algorithm by applying the procedure. Standard algorithms work because of
mathematical laws; they can, and should, be proofed and unpacked. Standard algorithms
require students to decompose numbers into base-10 units and complete a series of simple
computations. They are systematic, efficient, and transferable. For example, if students understand how to use the standard algorithm to solve 2-digit by 2-digit problems, they can generalize this approach to larger whole number problems as well as operations with decimals
(Fuson & Beckmann, 2012–2013). The standard algorithm is one of several key approaches that
students need to be explicitly taught in order to engage in more flexible and efficient problem
solving.
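As a worked illustration of that base-10 decomposition (the numbers 34 and 27 are our own, chosen only for illustration), the standard algorithm's partial products can be unpacked as follows:

```latex
\begin{align*}
34 \times 27 &= 34 \times (20 + 7)            && \text{decompose 27 into base-10 units} \\
             &= (34 \times 20) + (34 \times 7) && \text{distributive law} \\
             &= 680 + 238                      && \text{two simple computations} \\
             &= 918.
\end{align*}
```

The same decomposition applies unchanged to 3-digit factors or to decimals, which is the sense in which the algorithm is transferable.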
What does the evidence say about procedural and conceptual knowledge acquisition?
In order to solve problems flexibly and efficiently, students need to be exposed to instruction
that is both conceptual and procedural (NMAP, 2008; Schneider et al., 2011). The empirical
data indicate that conceptual understanding and procedural knowledge are bidirectional (e.g.,
Rittle-Johnson et al., 2015). Therefore, school psychologists can advocate for core instructional approaches and the adoption of curricula that interleave conceptual and procedural lessons.
Additionally, school psychologists can assist teachers in recognizing that the standard algorithm
is an important tool to explicitly teach in order to illustrate to students the relationship
between concepts and procedures.
Tension 3: Teachers Might Believe That Explicit Instruction Is Beneficial Only for Struggling
Learners
The most robust finding demonstrated across multiple meta-analyses is that explicit
instruction is the most effective mathematics instructional practice (e.g., Gersten, Chard, et al.,
2009; Hattie, 2009; Swanson, 2009). In fact, the strongest effects reported in experimental
research on mathematics achievement are for explicit instruction (d = .55; Hattie, 2009). It has
also been demonstrated that students with mathematics difficulties, as well as students with
other types of disabilities, benefit more from explicit instruction than discovery-oriented
approaches (Kroesbergen & Van Luit, 2003). The finding that explicit instruction is essential for
students with mathematics disabilities or difficulties is plainly evident in the literature and led
to recommendations from a panel within the Institute of Education Sciences that tiered
intervention supports embed explicit instruction (Gersten, Beckmann et al., 2009). The
importance of explicit instruction for typically performing students has been less often studied
(Doabler et al., 2015). Although the existing evidence has been mixed, the NMAP (2008) report
indicated that the current data warrant the inclusion of explicit instruction along with student-
centered approaches in core instruction. A recent study demonstrated that the rate and quality of student–teacher interactions, embedded within an explicit instruction approach to core kindergarten instruction, were related to student achievement (Doabler et al., 2015).
Collectively, these data suggest that explicit instruction should be incorporated in universal
teaching practices, and if it is not, access to core mathematics instruction is going to be limited
for many students.
Because of the ubiquitous use of the phrase, confusion about the precise definition of
explicit instruction is common among educators. Sometimes explicit instruction is confused
with Direct Instruction (Stein, Kinder, Rolf, Silbert, & Carnine, 2018), which is a specific, highly
effective, instructional curriculum that employs explicit instruction. Explicit instruction may
also be confused with didactic lectures. Explicit instruction is a systematic approach that
incorporates previewing of previous skills and concepts, precise instructions, modeling, guided
and independent practice, immediate feedback, and checks for maintenance of skills. This
methodology is aligned with the student's stage of proficiency (acquisition, fluency, generalization/application), with scaffolded support for frequent student responding. Explicit instruction provides highly engaging learning volleys between content, teachers, and students that build many opportunities for practice, verbalization, feedback, and demonstration of mathematical thinking, which, in turn, sets the stage for successful skill acquisition and mastery and enables creative expression and curious exploration. Teachers can easily differentiate
instruction to provide enrichment for advanced content or address foundational skill deficits
that are impeding grade-level performance. Step-by-step routines incorporate opportunities for
children to respond in multiple formats (verbally, drawing, constructing, and writing). Explicit
instruction anticipates, prevents, and detects misunderstandings with carefully engineered
lesson content. Explicit instruction draws direct connections between what a student already
knows and what a student is currently learning. As a result, the use of explicit instruction
provides opportunities to reason, speculate, estimate, justify, predict, conclude, and ask new
questions. In sum, explicit instruction is a cornerstone of effective mathematics instruction
(VanDerHeyden & Allsopp, 2014).
What does the evidence say about the value of explicit instruction in math? The scientific evidence on the benefits of explicit instruction is clear and robust, suggesting that
school psychologists should feel comfortable supporting teachers in the use of explicit
instruction in their general education classrooms during core instruction. School psychologists
should also advocate for the adoption of tiered intervention supports that incorporate explicit
instruction. The IRIS Center (https://iris.peabody.vanderbilt.edu/module/math) and National
Center for Intensive Intervention (NCII; https://intensiveintervention.org/intensive-
intervention-features-explicit-instruction) have online modules describing explicit mathematics
instruction to which school psychologists can direct administrators and educators. Notably, systematic reviews of different mathematics textbooks across grades 1, 2, and 4 identified the need for more embedded explicit instruction, among other critical evidence-based instructional principles (Doabler, Fien, Nelson-Walker, & Baker, 2012; Sood & Jitendra, 2007).
Therefore, school psychologists should also participate in school-level discussions regarding the
adoption of curriculum and be sure that adequate opportunities for explicit instruction are
embedded. If school psychologists work in schools where explicit instruction is not embedded
into the curriculum or instructional routines, then consultation with teachers can be used to
develop activities that embed explicit instruction to supplement core instruction.
Tension 4: Teachers Might Believe That Executive Functioning Tools and Interventions
Improve Math Performance
The relevance of cognitive measures for intervention planning has long been debated
despite consistent evidence indicating that cognitive measures are not helpful for intervention
planning or associated with intervention outcomes (Miciak, Williams, Taylor, Cirino, Fletcher, &
Vaughn, 2016; Stuebing et al., 2015). The aptitude by treatment interaction theory suggested
that instructional interventions are more or less effective depending upon students’ measured
cognitive aptitudes. Although this theory was refuted by Cronbach and Snow in a meta-analysis published in 1977, the notion that instruction should be matched to cognitive aptitudes, and more recently to executive functions, has persisted. The evidence summarized and analyzed
in meta-analytic studies illustrates that (a) although cognitive measures correlate with
mathematics achievement, these measures do not correlate with student responsiveness to
intervention; (b) using cognitive assessment tools does not provide the information necessary
to improve academic skill weaknesses; and (c) cognitive interventions do very little to improve
academic performance outcomes (Burns, 2016). In their meta-analysis, Jacob and Parkinson (2015) found a moderate association, consistent across developmental levels, between executive function skills and math and reading achievement. Importantly, the authors
noted that most studies failed to control for IQ or background characteristics and when studies
did control for these factors, the strength of the relationship between executive functioning
and achievement was reduced. This meta-analysis also examined intervention studies that
employed executive function interventions and measured outcomes on both executive
functioning and academic achievement. These authors concluded that there are very few
rigorous intervention studies examining the causal link between executive function
interventions and academic outcomes. The authors then indicated that these existing studies
showed improvements on measures of executive function but no improvements on academic
achievement. Thus, the notion that executive function training can bring about gains in
mathematics proficiency is not consistent with existing evidence.
The evidence serves as a reminder that the most effective way to address a math skill
deficit is to directly remediate math skills rather than trying to improve working memory or
executive functioning as a means to address math skill deficits. This conversation is not
intended to suggest that individual differences do not matter. In fact, there are some preliminary data suggesting that differences in working memory and reasoning affect intervention responsiveness (Fuchs et al., 2013; Fuchs et al., 2014). For example, Fuchs and colleagues (2013) compared versions of a number knowledge intervention, delivered to first-grade students at risk for mathematics difficulties, that included either a fluency-building activity or a conceptual-knowledge activity. The findings suggested that whether students had weak or strong cognitive reasoning ability did not matter when they received the intervention with the fluency activity, but for students with low reasoning ability, the intervention with the conceptual activity led to poorer outcomes than those of peers with better reasoning ability. Similarly, Fuchs and colleagues (2014)
evaluated a fraction intervention provided to at-risk fourth graders that either embedded
fluency or conceptual practice activities. The findings suggested better performance in either
intervention group compared to the control. However, students with very poor scores on a
working memory task did better with the intervention variation that included a conceptual
activity whereas children with more adequate scores (all students had relatively low working
memory scores) did better on the intervention variation including the fluency activity. It is important to recognize that, in both studies, the way the specific math activities were constructed within the academic intervention was altered to address individual cognitive differences; the emphasis remained on the math skill itself.
What does the evidence say about executive functioning and math performance? The
evidence supports providing intensified instruction for students who struggle in the context of
general education mathematics instruction. Interventions should be tailored and intensified
according to student needs using direct evaluation of students’ skills to make low-inference
decisions about intervention tactics. It has long been recognized that students at-risk for
mathematics learning disabilities may also have difficulties with attention, motivation, self-
regulation, and working memory (e.g., Compton, Fuchs, Fuchs, Lambert, & Hamlett, 2012;
Montague, 2007). Thus, when building intensive interventions, it is useful to include self-
regulation and reinforcement strategies, minimize cognitive load on working memory and
reasoning by including explicit instruction and breaking down problems into smaller, more manageable parts, minimize excessive language load by incorporating visual representations, and provide fluency practice (Fuchs, Fuchs, & Malone, 2018; Powell & Fuchs, 2015). School
psychologists can help establish clear, systematic guidelines within school intervention teams
for adapting interventions and intensifying instruction in order to address students with the
largest gaps between expected and present performance.
Conclusion
In conclusion, we have summarized incorrect beliefs that create tension with evidence-based practices, attempted to contextualize those beliefs to create common ground, and summarized the contemporary evidence (see Table 1). School psychologists
are, first and foremost, advocates for all students, which necessitates a strong commitment to
the use of practices that can be expected to work if correctly implemented (i.e., evidence-based
practices). But school psychologists must be highly savvy implementation allies to support the
adoption of effective practices in schools. In math, many children are vulnerable to the use of
ineffective practices because teachers and teacher leaders have been influenced by modern
math myths. The school psychologist can play a pivotal and much-needed role in inspiring, equipping, and reinforcing the use of effective practices to help more students realize the lifetime social and economic benefits of math proficiency.
References
Binder, C. (1996). Behavioral fluency: Evolution of a new paradigm. Behavior Analyst, 19, 163–
197.
Boaler, J. (2012). Commentary: Timed tests and the development of math anxiety: Research
links “torturous” timed testing to underachievement in math. Education Week. Retrieved
from https://www.edweek.org/ew/articles/2012/07/03/36boaler.h31.html
Burns, M. K., Riley-Tilman, T. C., & VanDerHeyden, A. M. (2012). RTI applications (Volume 1):
Academic and behavioral interventions. New York, NY: Guilford Press.
Burns, M. K., VanDerHeyden, A. M., & Jiban, C. (2006). Assessing the instructional level for
mathematics: A comparison of methods. School Psychology Review, 35, 401–418.
Burns, M. K. (2016). Effects of cognitive processing outcomes and interventions on academic
outcomes: Can 200 studies be wrong? Communique, 44(5), 1, 26–29.
Compton, D. L., Fuchs, L. S., Fuchs, D., Lambert, W., & Hamlett, C. (2012). The cognitive and
academic profiles of reading and mathematics learning disabilities. Journal of Learning
Disabilities, 45(1), 79–95. doi:10.1177/0022219410393012
Cronbach, L. J., & Snow, R. E. (1977). Aptitudes and instructional methods: A handbook for
research on interactions. New York, NY: Irvington.
Deno, S. L., & Mirkin, P. K. (1977). Data-based program modification: A manual. Reston, VA:
Council for Exceptional Children.
Doabler, C. T., Fien, H., Nelson-Walker, N. J., & Baker, S. K. (2012). Evaluating three elementary
mathematics programs for presence of eight research-based instructional design principles.
Learning Disability Quarterly, 35(4), 200–211. doi:10.1177/0731948712438557
Doabler, C. T., Baker, S. K., Kosty, D. B., Smolkowski, K., Clarke, B., Miller, S. J., & Fien, H. (2015).
Examining the association between explicit mathematics instruction and student
mathematics achievement. The Elementary School Journal, 115, 303–333.
Duncan, G. J., Dowsett, C. J., Claessens, A., Magnuson, K., Huston, A. C., Klebanov, P., … Japel, C.
(2007). School readiness and later achievement. Developmental Psychology, 43, 1428–1446.
doi:10.1037/0012-1649.43.6.1428
Fuchs, L. S., Compton, D. L., Fuchs, D., Hamlett, C. L., DeSelms, J., Seethaler, P. M., … Changas, P.
(2013). Effects of first-grade number knowledge tutoring with contrasting forms of practice.
Journal of Educational Psychology, 105(1), 58–77. doi:10.1037/a0030127.
Fuchs, L. S., Fuchs, D., & Malone, A. S. (2018). The taxonomy of intervention intensity. Teaching
Exceptional Children, 50(4), 194–202. doi:10.1177/0040059918758166
Fuchs, L. S., Schumacher, R. F., Sterba, S. K., Long, J., Namkung, J., Malone, A., … Changas, P. (2014). Does working memory moderate the effects of fraction intervention? An aptitude–treatment interaction. Journal of Educational Psychology, 106(2), 499–514. doi:10.1037/a0034341
Fuson, K. C., & Beckmann, S. (2012–2013). Standard algorithms in the common core state
standards, NCSM Journal (fall/winter), Retrieved from
https://www.mathedleadership.org/docs/resources/journals/NCSMJournal_ST_Algorithms
_Fuson_Beckmann.pdf
Gersten, R., Beckmann, S., Clarke, B., Foegen, A., Marsh, L., Star, J. R., & Witzel, B. (2009).
Assisting students struggling with mathematics: Response to Intervention (RtI) for
elementary and middle schools. A practical guide (NCEE 2009-4060). Washington, DC:
National Center for Education Evaluation and Regional Assistance, Institute of Education
Sciences, U.S. Department of Education.
Gersten, R., Chard, D. J., Jayanthi, M., Baker, S. K., Morphy, P., & Flojo, J. (2009). Mathematics
instruction for students with learning disabilities: A meta-analysis of instructional
components. Review of Educational Research, 79, 1202–1242.
Grays, S., Rhymer, K., & Swartzmiller, M. (2017). Moderating effects of mathematics anxiety on
the effectiveness of explicit timing. Journal of Behavioral Education, 26(2), 188–200.
doi:10.1007/s10864-016-9251-6
Gunderson, E. A., Park, D., Maloney, E. A., Beilock, S. L. & Levine, S. C. (2018). Reciprocal
relations among motivational frameworks, math anxiety, and math achievement in early
elementary school. Journal of Cognition and Development, 19, 21–46.
doi:10.1080/15248372.2017.1421538
Hart, S. A., & Ganley, C. M. (2019). The nature of math anxiety in adults: Prevalence and
correlates. Journal of Numerical Cognition, 5, 122–139.
Hattie, J. A. C. (2009). Visible learning: A synthesis of 800 meta-analyses relating to
achievement. Oxon, UK: Routledge.
Hecht, S. A., & Vagi, K. J. (2010). Sources of group and individual differences in emerging
fraction skills. Journal of Educational Psychology, 102, 843–859. doi:10.1037/a0019824
Hiebert, J., & Lefevre, P. (1986). Conceptual and procedural knowledge in mathematics: An
introductory analysis. In J. Hiebert (Ed.), Conceptual and procedural knowledge: The case of
mathematics (pp. 1–27). Hillsdale, NJ: Erlbaum.
Jacob, R., & Parkinson, J. (2015). The potential for school-based interventions that target
executive function to improve academic achievement: A review. Review of Educational
Research, 85(4), 512–552.
Koon, S., & Davis, M. (2019). Math course sequences in grades 6–11 and math achievement in
Mississippi (REL 2019–007). Washington, DC: U.S. Department of Education, Institute of
Education Sciences, National Center for Education Evaluation and Regional Assistance,
Regional Educational Laboratory Southeast. Retrieved from http://ies.ed.gov/ncee/edlabs
Kratochwill, T. R. (2012). Comments on “Distinguishing science from pseudoscience in school
psychology:” Evidence-based interventions for grandiose bragging. Journal of School
Psychology, 50(1), 37–42. doi:10.1016/j.jsp.2011.11.003
Kroesbergen, E. H., & Van Luit, J. E. H. (2003). Mathematics interventions for children with
special educational needs: A meta-analysis. Remedial and Special Education, 24, 97–114.
Lee, J. (2012). College for all: Gaps between desirable and actual P–12 math achievement
trajectories for college readiness. Educational Researcher, 41(2), 43–55.
Lilienfeld, S. O., Ammirati, R., & David, M. (2012). Distinguishing science from pseudoscience in
school psychology: Science and scientific thinking as safeguards against human error.
Journal of School Psychology, 50(1), 7–36. doi:10.1016/j.jsp.2011.09.006
Miciak, J., Williams, J. L., Taylor, W. P., Cirino, P. T., Fletcher, J. M., & Vaughn, S. (2016). Do
processing patterns of strengths and weaknesses predict differential treatment response?
Journal of Educational Psychology, 108(6), 898–909.
Montague, M. (2007). Self-regulation and mathematics instruction. Learning Disabilities
Research & Practice (Wiley-Blackwell), 22(1), 75–83. doi:10.1111/j.1540-5826.2007.00232.x
Namkung, J. M., Peng, P., & Lin, X. (2019). The relation between mathematics anxiety and
mathematics performance among school-aged students: A meta-analysis. Review of Educational Research, 89(3), 459–496. doi:10.3102/0034654319843494
National Governors Association Center for Best Practices & Council of Chief State School
Officers. (2010). Common core state standards. Washington, DC: Authors.
National Research Council. (2001). Adding it up: Helping children learn mathematics.
Washington, DC: National Academy Press.
National Mathematics Advisory Panel. (2008). Foundations for success: The final report of the
national mathematics advisory panel. Washington, DC: U.S. Department of Education.
Retrieved from https://www2.ed.gov/about/bdscomm/list/mathpanel/report/final-
report.pdf
Powell, S. R., & Fuchs, L. S. (2015). Intensive intervention in mathematics. Learning Disabilities
Research & Practice (Wiley-Blackwell), 30(4), 182–192. doi:10.1111/ldrp.12087
Rittle-Johnson, B. (2017). Developing mathematics knowledge. Child Development Perspectives,
11(3), 184–190. doi:10.1111/cdep.12229
Rittle-Johnson, B., Schneider, M., & Star, J. (2015). Not a one-way street: Bidirectional relations
between procedural and conceptual knowledge of mathematics. Educational Psychology
Review, 27(4), 587–597. doi:10.1007/s10648-015-9302-x
Schneider, M., Star, J. R., & Rittle-Johnson, B. (2011). Relations among conceptual knowledge,
procedural knowledge, and procedural flexibility in two samples differing in prior
knowledge. Developmental Psychology, 47(6), 1525–1538. doi:10.1037/a0024997
Sood, S., & Jitendra, A. K. (2007). A comparative analysis of number sense instruction in reform-
based and traditional mathematics textbooks. Journal of Special Education, 41(3), 145–157.
doi:10.1177/00224669070410030101
Star, J. (2005). Reconceptualizing procedural knowledge. Journal for Research in Mathematics
Education, 36, 404–411.
Stein, M., Kinder, D., Rolf, K., Silbert, J., & Carnine, D. W. (2018). Direct instruction mathematics
(5th Ed.). New York, NY: Pearson.
Stuebing, K. K., Barth, A. E., Trahan, L. T., Reddy, R. R., Miciak, J., & Fletcher, J. M. (2015). Are
child cognitive characteristics strong predictors of responses to intervention? A meta-
analysis. Review of Educational Research, 85, 395–429.
Swanson, H. L. (2009). Science-supported math instruction for children with math difficulties:
Converting a meta-analysis to practice. In S. Rosenfield, V. Berninger, S. Rosenfield, & V.
Berninger (Eds.), Implementing evidence-based academic interventions in school settings
(pp. 85–106). New York, NY: Oxford University Press.
Tsui, J. M., & Mazzocco, M. M. M. (2006). Effects of math anxiety and perfectionism on timed
versus untimed math testing in mathematically gifted sixth graders. Roeper Review,
29(2), 132–139. doi:10.1080/02783190709554397
VanDerHeyden, A. M., & Allsopp, D. (2014). Evidence-based practices for mathematics
(Document No. IC-6). Retrieved from University of Florida, Collaboration for Effective
Educator, Development, Accountability, and Reform Center,
http://ceedar.education.ufl.edu/tools/innovation-configuration
VanDerHeyden, A. M., & Codding, R. (2015). Practical effects of classwide mathematics
intervention. School Psychology Review, 44, 169–190. doi:10.17105/spr-13-0087.1
VanDerHeyden, A. M., McLaughlin, T., Algina, J., & Snyder, P. (2012). Randomized evaluation of
a supplemental grade-wide mathematics intervention. American Educational Research
Journal, 49, 1251–1284.
Wu, H. (2011). Phoenix rising: Bringing the common core state mathematics standards to life.
American Educator, Fall 2011, 3–13.
Amanda VanDerHeyden, PhD, is founder of Spring Math and a frequent contributor to the
research literature in school psychology. Robin Codding, PhD, is an associate professor in the
school psychology program at Northeastern University and author of Effective Math
Interventions: A Guide to Improving Whole Number Knowledge.
Table 1. Commonly Held Beliefs, Their Origins, and Evidence-Based Positions
Incorrect belief: Timed assessment causes anxiety.
Where does it come from and what harm may it cause? Fear that timed assessment will cause anxiety that will be harmful for the child and justifies avoidance of timed assessment (i.e., a belief that risks outweigh benefits). When teachers avoid timed assessment, they lose the ability to accurately estimate skill mastery, which means they cannot bring formative assessment to bear on learning.
What is the empirically supported position? Anxiety dissipates with supported exposure. Weak skill is a precursor to anxiety. Growth mindset practices reduce anxiety and increase skill.

Incorrect belief: You cannot teach a child how to procedurally solve a problem until you have established conceptual understanding.
Where does it come from and what harm may it cause? An incorrect belief that teaching procedural knowledge will be a barrier to conceptual knowledge and that understanding progresses linearly. Failure to teach problem-solving procedures and strategies actually inhibits conceptual understanding.
What is the empirically supported position? The relationship between procedural knowledge and conceptual knowledge is bidirectional.

Incorrect belief: It is harmful to explicitly teach a child how to use an algorithm to solve a problem.
Where does it come from and what harm may it cause? Fear that teachers will teach the algorithm without cultivating understanding, which will result in forgetting. Fear that algorithm instruction destroys curiosity. Children are deprived of a sure-fire strategy to arrive at a correct solution.
What is the empirically supported position? Anxiety follows weak skill, and anxiety is a likely barrier to curiosity. Procedural and conceptual knowledge are bidirectional, so withholding the algorithm worsens conceptual knowledge. Algorithms are like phonics skills in reading: they work because of mathematical laws, and they can be proofed and unpacked.

Incorrect belief: Explicit instruction is beneficial only for struggling learners.
Where does it come from and what harm may it cause? Lack of understanding of what explicit instruction actually entails. Many believe that explicit or direct instruction involves a teacher providing didactic lectures to students. Some teachers fear that using explicit instruction will harm learners, and so they avoid it.
What is the empirically supported position? Over 50 years, explicit or direct instruction has yielded the strongest stable effect size on learning for students at all performance levels.

Incorrect belief: Executive functioning tools and interventions improve academic performance.
Where does it come from and what harm may it cause? Belief that executive function skills, which are correlated with academic proficiency, are causal rather than merely correlated with academic proficiency. Unnecessary additional assessments are administered instead of making low-inference instructional decisions based upon direct evaluation of skill strengths and weaknesses, leading to the recommendation of intervention tactics that will not produce the desired academic gains.
What is the empirically supported position? Executive functioning (and cognitive) interventions do not improve academic outcomes.
Figure 1. Fluency by Accuracy