Editorial
Using Data to Understand and Improve Students'
Learning: Empowering Teachers and Researchers
Through Building and Using a Knowledge Base
Jinfa Cai, Anne Morris, Charles Hohensee, Stephen Hwang,
Victoria Robison, and James Hiebert
University of Delaware
In our May editorial (Cai et al., 2018a), we explored how collaborations among
teacher–researcher partnerships could harness emerging technological resources
to address the problem of isolation in the work of teachers and researchers. In
particular, we described a professional knowledge base (Cai et al., 2018b) and a
mechanism by which that knowledge base could be continuously populated,
updated with data and resources that are useful to teachers and researchers, and
shared among partnerships thereby enabling them to work on the same instruc-
tional problems. In this editorial, we shift our focus to discuss how data on
students’ thinking and classroom experiences could be leveraged within such a
system to improve instructional practice. We will explore how the knowledge base
could serve as a tool to (a) gather, process, and analyze data from individual
students; (b) increase our understanding of the effects of students’ mathematical
learning experiences; and (c) help teacher–researcher partnerships understand and
improve students’ learning.
Developing an Explanatory Theory That Connects Teaching With
Students’ Learning
An overarching theme of our editorials has been addressing the persistent gap
between research and practice in mathematics education. We have acknowledged
that if research is to have a greater impact on practice, it must address the problems
of practice that teachers grapple with, and it must do so in a way that produces
knowledge that teachers can use. The professional knowledge base we have
described attempts to do this by engaging teacher–researcher partnerships in
collaborative efforts to create and share lessons whose effectiveness is iteratively
refined over the course of many cycles of design and implementation. This process
is based on the assumption that specific instructional activities can be connected
to students’ opportunities to learn and the degree to which students are able to
take advantage of those opportunities.
This assumption echoes Nuthall’s (2004) call, in his critique of research on
teaching effectiveness, for “research that actually answers the question of how
teaching is related to learning in a way that is comprehensible and practically
useful for teachers” (p. 273). Nuthall proposed six considerations that must be
taken into account for research on the teaching–learning relationship to produce
useful findings. Of these, he considered “complete, continuous data on individual
student experience” (p. 296) to be the most critical because this kind of data is
fundamental to developing an explanatory theory of how different ways of
teaching are related to student learning outcomes. Without data on how students
experience and respond to teaching, research cannot fully illuminate the under-
lying processes that connect the choices made by teachers in the classroom to
students’ learning.
In his own research, Nuthall (2004) made use of different technologies,
including “miniature video cameras with zoom lenses mounted on the ceiling of
the classroom” and “miniature individual broadcast microphones” (p. 300), to
systematically capture continuous data on each individual student in a classroom.
These data allowed him to follow the development of each student’s understanding
of particular concepts and to trace the origins and consequences of particular
misconceptions that students developed during lessons. We agree with Nuthall
about the value of collecting continuous data on the learning experiences of each
student. We have further extended this view to include collecting continuous data
about noncognitive aspects of each student’s learning experiences (Cai et al.,
2017b).
Challenges to Understanding Students’ Learning Experiences
Taking such a broad view of students’ learning experiences comes with a cost.
In typical mathematics classrooms, teachers (and researchers) face a number of
obstacles to collecting continuous data from (and studying) students’ mathematical
learning experiences. Accessing how all students think about and make meaning
of mathematics in the moment would be daunting, to say the least. Teachers often
gain insights into students’ thinking by talking with students or by examining
artifacts of their work. However, keeping track of every student’s thinking through
an entire lesson or over several lessons is an overwhelming task for a teacher.
Collecting so much information about every student, and keeping that information
up-to-date, could easily collapse under its own weight. Although Nuthall (2004)
found the comprehensive data he collected to be a powerful resource for under-
standing the cognitive relationships between teaching and learning, he acknowl-
edged that the process of obtaining and processing the data was time-consuming
and labor-intensive.
Including noncognitive aspects of students’ experiences adds yet more
complexity to data collection and processing. For example, comprehensively
assessing students’ engagement and motivation might require a combination of
classroom observation, video analysis, surveys, and interviews (Middleton,
Jansen, & Goldin, 2017). Nevertheless, we believe it is worth pursuing the creation
and use of technologically aided professional knowledge bases because of the
considerable power such knowledge offers for building a usable explanatory theory
that connects teaching with students’ learning.
In response to the challenges of creating and using large databases and to calls
for greater adoption of data-driven instruction (Hamilton et al., 2009), a number
of digital tools to collect and manage student data have been created and marketed
to teachers and school districts. These tools include digital gradebooks and dash-
boards, learning management systems, applications that generate assessments,
software and online platforms for individual student instruction, and digital
remediation tools. Indeed, in a nationally representative survey of 4,600 teachers in
the United States, the Bill & Melinda Gates Foundation (2015) found that "virtually all teachers (93%) regularly use some form of digital tool to guide instruction"
(p. 3). However, the same survey revealed that 67% of those teachers were “not
fully satisfied with the effectiveness of the data or the tools for working with data
that they have access to on a regular basis” (p. 3). The teachers identified key
challenges presented by the tools, reaching the consensus that, despite the help of
existing digital tools, it remains too overwhelming to collect, analyze, and use
data to support data-driven instruction. Current offerings such as digital dash-
boards that track and display student progress remain subject to the fundamental
problem of communicating too much information about too many students at once.
Information overload is a very real phenomenon (Ingram, Louis, & Schroeder,
2004). Moreover, the use of the data provided by current digital tools is often
constrained by the incompatibility of different technological platforms and incon-
sistency in reporting the data. Connecting student data from different sources into
a single platform often requires much time and effort. Finally, teachers are hard-
pressed to react to data effectively and to adjust their instruction based on feedback
from digital systems because these systems often do not provide timely informa-
tion in a usable form. Therefore, it is not surprising that the promise of data-driven
instruction has, to this point, not been fully realized.
In the future world we envision, however, it is not difficult to imagine solving
the technical difficulties of gathering and managing such complex and large data
sets in ways that could provide timely insights in a form that teachers could use
on a daily basis. Even today, portable video cameras and audiorecording equip-
ment are ubiquitous in the form of smartphones. In addition, the spread of one-to-
one technology initiatives that provide every student in a school district with a
laptop or tablet computer means that many students are rarely far from a device
that can gather the continuous student data that Nuthall (2004) described.
Moreover, the technology to automatically process, transcribe, parse, and filter
these data is rapidly developing. Online services already routinely process huge
collections of image data, automatically indexing pictures by faces and objects.
The presence of these technologies in the classroom can also facilitate the collec-
tion of data relevant to noncognitive outcomes and affective factors by making it
easier to capture real-time data directly from students using methods such as
experience sampling (Zirkel, Garcia, & Murphy, 2015). In other words, the
capacity to capture, process, and store comprehensive cognitive and noncognitive
data longitudinally for every student either already exists or is on the near horizon.
Thus, a critical consideration for our vision is how these kinds of data on students’
classroom experience, coupled with detailed student assessment data and teachers’
own observations, could enable teachers and researchers to gain insights into
students’ mathematical learning experiences that have a real impact on practice.
The Power of the Knowledge Base for Collecting, Analyzing,
and Using Data
Our vision of the use of data on students’ thinking and experiences is based on
three assumptions about these data and the relationship between teaching and
learning. The first assumption is that conceptual models based on longitudinal
data on individual students or groups of students with similar learning profiles,
often called learning trajectories, are incomplete without descriptions of instruc-
tional activities or learning experiences associated with changes in student
thinking and learning. In other words, data on students’ experiences must be
paired with data on instruction to make connections between teachers’ teaching
and students’ learning. This is a point we have emphasized in our descriptions of
how teacher–researcher partnerships could work with a professional knowledge
base (Cai et al., 2018a). The second assumption is that teaching can greatly improve
students’ learning if teachers understand students’ thinking and learning experi-
ences. The work of Cognitively Guided Instruction has already provided ample
evidence to support this assumption (Carpenter, Franke, Jacobs, Fennema, &
Empson, 1998). The final assumption is that a professional knowledge base offers
the potential, through the effective application of technology, to provide timely
and useful information to teachers about students’ thinking and learning in ways
that do not further burden them.
What makes this level of student data important? Why would we, as researchers
and teachers, want to have this flood of information? What would researchers and
teachers actually do with this information? How could the data be collected,
analyzed, and used efficiently and productively? In this section, we propose a
framework for supporting teacher–researcher partnerships' use of data for instruc-
tion. As we indicated in Cai et al. (2017b), we believe that students’ learning
experiences include both cognitive and noncognitive aspects in both the short and
the long term. Thus, data have the potential to be useful to teachers and researchers
at different times relative to any individual lesson. We will therefore consider how
data can be useful in the moment (during a lesson), in the short term (shortly after
a single or multiple lessons), and in the long term (across years).
Table 1 outlines our proposed framework for envisioning the collection, anal-
ysis, and use of student data, indicating the kinds of data on students’ experiences
that could be useful at different points in time. Although not explicitly listed in
the table, our first assumption implies that any data on a student’s experiences
collected within this framework would necessarily be coupled with a description
of the instr uctional activities associated with those experiences. It is also important
to note that the data and the tools that support teachers’ use of data must work
together to avoid the time-consuming, manual aggregation of information often
required today (Bill & Melinda Gates Foundation, 2015). Moreover, it remains an open question what kinds of information teachers can use effectively, especially while they are actively engaged in instruction. Rather than an exhaustive list, we see this proposed framework as a potential guide for research in this area, providing some examples of the data that could be relevant and the goals for using that data at different time frames.

Table 1
Framework for Collecting, Analyzing, and Using Data on Students' Mathematical Learning Experiences

In the moment
Data (cognitive): students' conceptions and misconceptions; students' unexpected responses.
Data (noncognitive): students' engagement with tasks; students' affect or frustration level; students' participation in discourse.
Goals (cognitive): address, in the moment, particular misconceptions among subgroups of students and provide immediate supports.
Goals (noncognitive): enact supports for students who are disengaged or discouraged; identify how students are being positioned within the classroom and shape classroom discourse to provide them with a voice.

Short term
Data (cognitive): students' conceptions, misconceptions, and unexpected responses; students' solution strategies; students' ways of thinking; students' insights.
Data (noncognitive): factors that affect students' engagement with a task; students' confidence both before and after solving a problem; classroom norms of participation.
Goals (cognitive): identify groups of students with similar conceptions, misconceptions, or ways of thinking to inform the next lesson plan.
Goals (noncognitive): identify groups of students who are experiencing different levels of motivation or engagement with the lesson to inform the next lesson plan.

Long term
Data (cognitive): data across classrooms and research sites.
Data (noncognitive): connections between affect and achievement.
Goals (cognitive): longitudinally examine changes in students' cognitive learning outcomes so that teachers can track the progress of individual students; develop explanatory theories that connect teaching and learning for particular groups of students.
Goals (noncognitive): longitudinally examine changes in students' affect related to their learning.
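As a concrete but purely illustrative rendering of the framework's first assumption, the sketch below shows one way a knowledge-base record might couple an observation of a student's experience with the instructional activity that occasioned it, tagged by time frame and by cognitive or noncognitive aspect. The field names and record structure are assumptions for illustration, not an existing schema.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Literal

# Hypothetical record types for the kind of knowledge base described in the text.

@dataclass
class InstructionalActivity:
    task_id: str       # identifier of the task in the knowledge base
    lesson_id: str     # lesson in which the task was enacted
    description: str   # brief description of the activity

@dataclass
class ExperienceRecord:
    student_id: str
    activity: InstructionalActivity   # every observation is paired with its activity
    time_frame: Literal["in_the_moment", "short_term", "long_term"]
    aspect: Literal["cognitive", "noncognitive"]
    observation: str                  # e.g., a misconception, a confidence rating
    collected_at: datetime = field(default_factory=datetime.now)

# Example: an in-the-moment cognitive observation tied to a specific task.
record = ExperienceRecord(
    student_id="S017",
    activity=InstructionalActivity("task-042", "lesson-07", "Graphing exponential growth"),
    time_frame="in_the_moment",
    aspect="cognitive",
    observation="Produced a linear graph for an exponential relationship",
)
print(record.student_id, record.aspect, record.observation)
```

Pairing every observation with its activity in this way is what would later allow students' experiences to be connected back to teaching.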
Data in the Moment
In the classroom, teachers engage in a complex interaction with students
wherein they continuously assess their students’ responses and make pedagogical
decisions in the moment based on those assessments, their own knowledge, and
their instructional plan. What data would be useful in the moment to help teachers
make these decisions more effectively as they teach? How could those data be
presented to teachers in such a way that it is not just another distraction or demand
on their time?
Suppose that all students were equipped with a tablet device onto which they
recorded their mathematical work as they would on paper. The device’s hand-
writing recognition algorithms would read and process the data, and the data
would be uploaded to the knowledge base for analysis, resulting in immediate
feedback provided to teachers about each student’s understanding and strategy
use. For example, a teacher–researcher partnership could identify potential attri-
butes of interest for each instructional task that they stored in the knowledge base.
These attributes would be “dimensions of reasoning or understanding in a given
domain” (Izsák & Templin, 2016, p. 20) that would be needed to complete the task.
The system would provide feedback about students’ performance with respect to
those attributes. Developments in diagnostic classification models (de la Torre,
Carmona, Kieftenbeld, Tjoe, & Lima, 2016) and computer adaptive testing
(Chang, 2015) as well as advances in technology could contribute to designing a
system to assess students’ mathematical thinking in such ways. This combination
of technology and psychometrics would give teachers a window into each student’s
understanding and allow them to use students’ responses to immediately inform
instruction.
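As a rough sketch of what attribute-level feedback might look like computationally, the example below (assuming student responses have already been coded against task attributes by some upstream process such as handwriting recognition and scoring) tallies how many students demonstrate each attribute. The attribute names and coding scheme are hypothetical and are not drawn from any particular diagnostic classification model.

```python
# A minimal sketch of attribute-level feedback, assuming a teacher-researcher
# partnership has tagged a task with attributes of interest and that each
# student's response has already been coded against those attributes.

TASK_ATTRIBUTES = ["equal_grouping", "unit_rate_reasoning", "proportional_comparison"]

# Hypothetical coded responses: 1 = attribute evident, 0 = not evident.
coded_responses = {
    "S01": {"equal_grouping": 1, "unit_rate_reasoning": 0, "proportional_comparison": 0},
    "S02": {"equal_grouping": 1, "unit_rate_reasoning": 1, "proportional_comparison": 0},
    "S03": {"equal_grouping": 0, "unit_rate_reasoning": 0, "proportional_comparison": 0},
}

def attribute_summary(responses: dict) -> dict:
    """Return, for each attribute, the fraction of students whose responses show it."""
    summary = {}
    for attribute in TASK_ATTRIBUTES:
        shown = sum(profile.get(attribute, 0) for profile in responses.values())
        summary[attribute] = shown / len(responses)
    return summary

# The teacher sees which attributes most students are not yet demonstrating.
for attribute, fraction in attribute_summary(coded_responses).items():
    print(f"{attribute}: {fraction:.0%} of students")
```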
As another example of using data in the moment, the system of data collection
and analysis could provide the teacher with an initial clustering of student
responses to a task based on similarities along particular attributes. Different
categories of student responses could be easily compared to illustrate different
strategies or to address misconceptions. As a third example, if a task involved
drawing a diagram, the system could classify the students’ pictures and present
the main types to the teacher in a side-by-side comparison. If the teacher were
working with a well-designed lesson and this allowed him or her to see that the
students had used only two of four expected responses, the teacher could adjust
the remainder of the lesson to focus on the two responses that students generated
(or find a way to bring out the other two responses).
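A deployed system would presumably rely on a statistical classification model, but the basic clustering idea can be sketched simply: group responses that share the same attribute profile so that the teacher sees a handful of response types rather than a pile of individual papers. The profiles below are hypothetical.

```python
from collections import defaultdict

# A simplified sketch of clustering student responses by attribute profile.
# Responses with identical profiles are treated as the same "type" of response.

def cluster_by_profile(coded_responses: dict) -> dict:
    """Group student IDs by their tuple of attribute codes."""
    clusters = defaultdict(list)
    for student_id, profile in coded_responses.items():
        key = tuple(sorted(profile.items()))  # profile as a hashable signature
        clusters[key].append(student_id)
    return dict(clusters)

coded_responses = {
    "S01": {"drew_diagram": 1, "used_table": 0},
    "S02": {"drew_diagram": 1, "used_table": 0},
    "S03": {"drew_diagram": 0, "used_table": 1},
}

for profile, students in cluster_by_profile(coded_responses).items():
    print(dict(profile), "->", students)
```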
Many kinds of data on students’ noncognitive learning experiences could also
inform in-the-moment teacher decision making. For example, students could rate
their confidence level before and after working on a particular problem. Research
has shown that students’ confidence for solving a particular problem is highly
correlated with their success in solving the problem (Pajares, 1996; Zimmerman,
1995). Information about how confident students are when approaching a task
could signal teachers that less confident students might need additional support
to engage in productive struggle with the task. With respect to student engagement
and participation in classroom discourse, a system could monitor each student’s
talk and process it on the fly to produce classroom “heat maps” indicating which
students are contributing to mathematical discussions and which students are
silent. If a teacher were equipped with such visualizations, he or she could quickly
gain important insights into which students are being positioned as mathematically
powerful and which students are playing more passive roles (Esmonde & Langer-
Osuna, 2013; Herbel-Eisenmann, Meaney, Bishop, & Heyd-Metzuyanim, 2017).
This would then allow the teacher to shape the classroom discourse to give all
students an opportunity to have a voice. Similarly, the system could report in-the-
moment data on student frustration based on image and voice analysis, helping
the teacher judge when students are productively struggling with a task versus
when students are becoming too frustrated. Another possibility is a tablet device
equipped to collect data to determine how engaged the students are with a task or
to which aspects of the task they are attending. This could also involve on-the-fly
voice analysis or other technologies such as eye tracking. Real-time displays of
these data could, again, be provided to teachers for their use in the moment.
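A crude version of the participation display could be computed from timestamped talk turns, as sketched below. The turn data, roster, and threshold are invented for illustration; a real system would need to capture and attribute classroom talk automatically.

```python
from collections import Counter

# A minimal sketch of a discourse "heat map": total talk time per student,
# with silent or near-silent students flagged for the teacher's attention.

talk_turns = [
    ("S01", 12.0), ("S02", 4.5), ("S01", 20.0), ("S04", 2.0),
    ("S02", 8.0), ("S01", 15.5),
]
roster = ["S01", "S02", "S03", "S04"]
SILENCE_THRESHOLD_SECONDS = 5.0  # assumed cutoff for flagging low participation

talk_time = Counter()
for student_id, seconds in talk_turns:
    talk_time[student_id] += seconds

for student_id in roster:
    seconds = talk_time.get(student_id, 0.0)
    flag = "  <- consider inviting into the discussion" if seconds < SILENCE_THRESHOLD_SECONDS else ""
    print(f"{student_id}: {seconds:5.1f} s of talk{flag}")
```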
Of course, many teachers already gather some information of this type through
their own observation in the classroom. Noticing what students do and listening
to what they say is a powerful tool, as expert teachers have long recognized. But
no teacher (and no researcher) has the time or resources to collect and make sense
of these data for every student during every lesson. The difference in the type of
system that we describe is that data from every student would be gathered simul-
taneously and automatically, and the system itself would surface those data that
would be most helpful at any given moment to support teachers’ pedagogical
decision making—a just-in-time resource for instruction.
Data in the Short Term
In our framework, analyzing and using data in the short term refers to using
data reflectively after a lesson or unit has been taught to inform subsequent
instruction with the same students. Data recorded in the knowledge base on each
student’s strategy use, conceptions and misconceptions, and affective responses
to a lesson could guide teachers and researchers as they decide what needs to be
addressed in the next lesson and what new concepts are feasible for students given
their current understanding. Similarly, teachers and researchers could access
students’ performance on previous instructional tasks to help them predict how
those students would think about tasks in the next lesson. For example, following
a lesson introducing exponential growth and graphs of exponential functions, the
knowledge base would contain data on the kinds of graphs students produced. If
some students produced graphs that did not show equal growth factors over equal
intervals (perhaps producing linear graphs or graphs with irregular growth
factors), the system could alert the teacher and researcher of this development and
make predictions about how those students would engage with the next lesson’s
tasks, allowing the teacher and researcher to plan how to address the misconcep-
tion in the next lesson.
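For the exponential-growth example, one simple way such an alert might work is to check whether a student's plotted values show an approximately constant growth factor over equal intervals. The tolerance and sample data in the sketch below are assumptions.

```python
# A minimal check of whether a student's plotted values show an approximately
# constant growth factor over equal x-intervals, as an exponential graph should.

def has_constant_growth_factor(y_values: list, tolerance: float = 0.1) -> bool:
    """Return True if successive ratios y[i+1]/y[i] are roughly equal."""
    ratios = [b / a for a, b in zip(y_values, y_values[1:]) if a != 0]
    if len(ratios) < 2:
        return False
    return max(ratios) - min(ratios) <= tolerance * max(ratios)

# Values sampled at equal intervals from three students' graphs (invented data).
student_graphs = {
    "S01": [2, 4, 8, 16, 32],   # doubling: consistent with exponential growth
    "S02": [2, 4, 6, 8, 10],    # constant differences: looks linear
    "S03": [2, 5, 7, 20, 22],   # irregular growth factors
}

for student_id, values in student_graphs.items():
    if not has_constant_growth_factor(values):
        print(f"Flag {student_id}: growth factors are not constant over equal intervals")
```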
Such data would also reveal individual students’ learning progressions in the
unit. The data could be used to identify students who had difficulty with particular
concepts in the unit. The system could then spotlight clusters of students who were
experiencing similar difficulties, perhaps identifying those clusters in another
type of heat map display, so that the teacher and researcher could plan how to
address those difficulties. Data on noncognitive aspects of students’ experiences
could also be used by the teacher and researcher to build targeted noncognitive
supports into the next lesson. For example, the teacher and researcher could look
specifically at students who were not participating much during a given lesson
and check that they still were engaged and not “falling through the cracks.” Or the
system could highlight productive and unproductive classroom norms, allowing
the teacher and researcher to plan for supports in subsequent lessons that would
promote productive norms and discourage unproductive ones. By analyzing these
kinds of data from the lessons in a unit, the system could help teacher–researcher
partnerships to identify key aspects of how each student’s affect and cognitive
aspects of learning mutually influence each other.
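The "falling through the cracks" check could, in a minimal form, combine a participation measure with a performance measure across the unit and flag students who are low on both, as in the sketch below. All data and cutoffs are hypothetical.

```python
# A sketch of a short-term check across a unit: flag students whose recorded
# participation and task success are both low, so the teacher-researcher pair
# can plan targeted supports. Data and cutoffs are invented for illustration.

unit_records = {
    # student_id: (average talk turns per lesson, fraction of tasks completed correctly)
    "S01": (6.0, 0.85),
    "S02": (0.5, 0.40),
    "S03": (1.0, 0.90),
    "S04": (4.0, 0.35),
}

LOW_PARTICIPATION = 2.0
LOW_SUCCESS = 0.5

quietly_struggling = [
    student_id
    for student_id, (turns, success) in unit_records.items()
    if turns < LOW_PARTICIPATION and success < LOW_SUCCESS
]
print("Plan supports for:", quietly_struggling)  # only S02 meets both criteria
```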
Data in the Long Term
The professional knowledge base that we have described (Cai et al., 2018b)
would provide teachers and researchers with a powerful tool suitable for a variety
of needs ranging from large scale (across classrooms or schools) to small scale
(across particular groups of students or individual students). Teachers and teacher–researcher partnerships will likely want to study data from their own classroom
or a few classrooms in which students are trying to achieve the same learning
goals. Moreover, with access to longitudinal data on each student’s mathematical
thinking, teachers and researchers could become increasingly familiar with how
their students think about certain concepts and, ultimately, could begin accurately
predicting how particular students will respond. By connecting classroom data
sets from teachers who have used the same instructional task or sequence of tasks,
the system could begin to make useful connections between students’ under-
standing and conceptions and their subsequent learning experiences with those
tasks. These connections would generate an explanatory theory of the kind envi-
sioned by Nuthall (2004), a theory that would predict how other students will
respond to the activity and, along with data from the teacher’s own classroom,
enable a kind of data-based planning not previously possible. This kind of long-
term use of data could have a strong impact on equity by affording teacher–researcher partnerships the ability to tailor implementation to create similar
learning opportunities for all groups of students.
Teacher–researcher partnerships might also be interested in studying students
who respond in different ways to hypothesized cause-and-effect relationships
between a task and student learning. Learning more about these local cause-and-
effect relationships would allow tweaking of the explanatory theory, as well as
tweaking of the instructional activity for future implementation. Moreover, these
data could aid the planning of follow-up activities to build on students’ thinking
as revealed by the data. Fundamentally, the long-term work of teaching (conducted
by teachers and teacher–researcher partnerships) would not lie in redesigning
activities (i.e., curriculum development) but in studying tendencies of students and
making systematic incremental improvements in teaching and learning that, over
time, accumulate into big improvements.
Researchers would likely have a special interest in accumulating long-term data
on a sequence of tasks that develop a particular learning goal or network of goals.
Teacher–researcher partnerships at different sites might use different tasks or
sequences of tasks for a particular mathematical topic, and the data on students’
experiences with different tasks and sequences would help shed light on the more
promising sequences of tasks for maximizing students’ learning. Stepping back
and looking at larger data sets (across more students and connected sequences of
activities) would allow building more ambitious explanatory theories based on
models of students’ thinking or learning trajectories that provide new insights into
how students with different backgrounds develop their thinking connected to
particular kinds of instructional tasks. The knowledge base would open new
possibilities for formulating and testing both local and more general theories about
cause-and-effect relationships between teaching and learning. These explanatory
theories could, for example, specify relationships that are contingent on the devel-
opment of particular prerequisite knowledge.
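In its simplest form, comparing the promise of different task sequences across sites might look like the aggregation sketched below, where the sequence labels and gain scores stand in for whatever learning measures a partnership actually uses.

```python
from statistics import mean

# A minimal sketch of a long-term, cross-site comparison: average learning
# gains for classrooms that used different task sequences toward the same goal.
# Sequence names and gain scores are illustrative assumptions.

classroom_results = [
    {"site": "A", "sequence": "seq-contextual-first", "mean_gain": 0.42},
    {"site": "B", "sequence": "seq-contextual-first", "mean_gain": 0.38},
    {"site": "C", "sequence": "seq-symbolic-first", "mean_gain": 0.27},
    {"site": "D", "sequence": "seq-symbolic-first", "mean_gain": 0.31},
]

gains_by_sequence = {}
for result in classroom_results:
    gains_by_sequence.setdefault(result["sequence"], []).append(result["mean_gain"])

for sequence, gains in gains_by_sequence.items():
    print(f"{sequence}: average gain {mean(gains):.2f} across {len(gains)} classrooms")
```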
With respect to teaching, and specifically the pedagogical decisions that
teachers make as they teach, the system could collect data across research sites
about the different kinds of in-the-moment decisions that teachers make when
confronted with unexpected situations in a given task or lesson. Over time,
collecting and analyzing those data along with the student outcomes that followed
particular pedagogical choices could help populate the knowledge base with
information on what kinds of decisions are best for students’ learning on the topic.
The same kind of analyses could be conducted on the effects of using particular
planned questions, follow-up responses to students’ anticipated solution strategies,
and practice exercises after the concept was developed. Were the predicted
outcomes confirmed, or are changes to the predictions warranted? As data are
collected across multiple classrooms with diverse groups of students, explanatory
theories can be refined to guide the planning of instruction that reaches more and
more students.
The Roles of Teachers and Researchers
If we assume the existence of a system that could efficiently collect, analyze,
and share data on student experiences linked to instructional activities to create
usable knowledge bases, we are confronted with the fact that teachers and
researchers are likely to play quite different roles. We have already described some
of the radical changes in the work of teachers and researchers in this new system
in this and earlier editorials (Cai et al., 2017a, 2017b, 2017c). In our next editorial,
we will further explore these new roles and consider how we might move from
our present reality to this future reality.
References
Bill & Melinda Gates Foundation. (2015). Teachers know best: Making data work for teachers and students. Retrieved from http://k12education.gatesfoundation.org/resource/teachers-know-best-making-data-work-for-teachers-and-students-2/
Cai, J., Morris, A., Hohensee, C., Hwang, S., Robison, V., & Hiebert, J. (2017a). A future vision of mathematics education research: Blurring the boundaries of research and practice to address teachers' problems. Journal for Research in Mathematics Education, 48(5), 466–473. doi:10.5951/jresematheduc.48.5.0466
Cai, J., Morris, A., Hohensee, C., Hwang, S., Robison, V., & Hiebert, J. (2017b). Clarifying the impact of educational research on students' learning. Journal for Research in Mathematics Education, 48(2), 118–123. doi:10.5951/jresematheduc.48.2.0118
Cai, J., Morris, A., Hohensee, C., Hwang, S., Robison, V., & Hiebert, J. (2017c). Making classroom implementation an integral part of research. Journal for Research in Mathematics Education, 48(4), 342–347. doi:10.5951/jresematheduc.48.4.0342
Cai, J., Morris, A., Hohensee, C., Hwang, S., Robison, V., & Hiebert, J. (2018a). Building and structuring knowledge that could actually improve instructional practice. Journal for Research in Mathematics Education, 49(3), 238–246.
Cai, J., Morris, A., Hohensee, C., Hwang, S., Robison, V., & Hiebert, J. (2018b). Data in a brave new world: Reducing isolation to amplify the impact of educational research on practice. Journal for Research in Mathematics Education, 49(2), 118–124. doi:10.5951/jresematheduc.49.2.0118
Carpenter, T. P., Franke, M. L., Jacobs, V. R., Fennema, E., & Empson, S. B. (1998). A longitudinal study of invention and understanding in children's multidigit addition and subtraction. Journal for Research in Mathematics Education, 29(1), 3–20. doi:10.2307/749715
Chang, H.-H. (2015). Psychometrics behind computerized adaptive testing. Psychometrika, 80(1), 1–20. doi:10.1007/s11336-014-9401-5
de la Torre, J., Carmona, G., Kieftenbeld, V., Tjoe, H., & Lima, C. (2016). Diagnostic classification models and mathematics education research: Opportunities and challenges. In A. Izsák, J. T. Remillard, & J. Templin (Eds.), Psychometric methods in mathematics education: Opportunities, challenges, and interdisciplinary collaborations (Journal for Research in Mathematics Education Monograph No. 15, pp. 53–71). Reston, VA: National Council of Teachers of Mathematics.
Esmonde, I., & Langer-Osuna, J. M. (2013). Power in numbers: Student participation in mathematical discussions in heterogeneous spaces. Journal for Research in Mathematics Education, 44(1), 288–315. doi:10.5951/jresematheduc.44.1.0288
Hamilton, L., Halverson, R., Jackson, S. S., Mandinach, E., Supovitz, J. A., & Wayman, J. C. (2009). Using student achievement data to support instructional decision making (NCEE 2009-4067). Washington, DC: National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education.
Herbel-Eisenmann, B., Meaney, T., Bishop, J. P., & Heyd-Metzuyanim, E. (2017). Highlighting heritages and building tasks: A critical analysis of mathematics classroom discourse literature. In J. Cai (Ed.), Compendium for research in mathematics education (pp. 722–765). Reston, VA: National Council of Teachers of Mathematics.
Ingram, D., Louis, K. S., & Schroeder, R. G. (2004). Accountability policies and teacher decision making: Barriers to the use of data to improve practice. Teachers College Record, 106(6), 1258–1287. doi:10.1111/j.1467-9620.2004.00379.x
Izsák, A., & Templin, J. (2016). Coordinating conceptualizations of mathematical knowledge with psychometric models: Opportunities and challenges. In A. Izsák, J. T. Remillard, & J. Templin (Eds.), Psychometric methods in mathematics education: Opportunities, challenges, and interdisciplinary collaborations (Journal for Research in Mathematics Education Monograph No. 15, pp. 5–30). Reston, VA: National Council of Teachers of Mathematics.
Middleton, J., Jansen, A., & Goldin, G. A. (2017). The complexities of mathematical engagement: Motivation, affect, and social interactions. In J. Cai (Ed.), Compendium for research in mathematics education (pp. 667–699). Reston, VA: National Council of Teachers of Mathematics.
Nuthall, G. (2004). Relating classroom teaching to student learning: A critical analysis of why research has failed to bridge the theory-practice gap. Harvard Educational Review, 74(3), 273–306. doi:10.17763/haer.74.3.e08k1276713824u5
Pajares, F. (1996). Self-efficacy beliefs in academic settings. Review of Educational Research, 66(4), 543–578. doi:10.3102/00346543066004543
Zimmerman, B. J. (1995). Self-efficacy and educational development. In A. Bandura (Ed.), Self-efficacy in changing societies (pp. 202–231). New York, NY: Cambridge University Press.
Zirkel, S., Garcia, J. A., & Murphy, M. C. (2015). Experience-sampling research methods and their potential for education research. Educational Researcher, 44(1), 7–16. doi:10.3102/0013189X14566879
... Inquirybased professional development communities (IPDCs) represent a way of organizing the partnership between researchers and practitioners to improve the quality of mathematics instruction (Cobb et al., 2003). In mathematics education, the collaboration between teachers and educational researchers around inquiry in IPDCs has become an increasingly widespread form of RPP (Cai et al., 2018;Coburn et al., 2013;Farrel et al., 2022). IPDCs involve mathematics teachers and researchers in collaborative work around educational research across boundaries of their respective cultural, professional, and organizational orientations, exchanging ideas from research and practice and learning from and with each other through this collaboration (Farrel et al., 2022;Pinto & Koichu, 2021). ...
... Through collaborative work, educational researchers gain a deeper understanding of mathematics teaching practices, equipping them to directly address practitioners' day-today professional challenges (Cai et al., 2018;Matthews et al., 2021). In turn, collaboration with researchers empowers teachers with the systematic exploration of genuine professional dilemmas and the implementation of innovative research findings in their classrooms. ...
... In turn, collaboration with researchers empowers teachers with the systematic exploration of genuine professional dilemmas and the implementation of innovative research findings in their classrooms. The integration of ideas from the IPDC into teachers' knowledge and practice enhances the relevance and impact of their work (Cai et al., 2018;Jaworski & Potari, 2021). ...
Article
Full-text available
Research–practice partnerships around educational research may have beneficial outcomes but also present tensions. By considering the dynamics of manifested tensions, our study aims to understand how teachers engage with the various stages of the research in an inquiry-based professional development community consisting of eleven in-service teachers and three mathematics education researchers. In light of Heider's Balance Theory, we identify and analyze tensions expressed by teachers in the community discourse. Findings indicate that epistemic tensions related to teachers' and researchers' different cultural orientations act as powerful generators of inclusionary and exclusionary actions shaping community members' participation paths. While downplaying epistemic tensions can evoke individual actions detrimental to learning and destructive to the community's existence, awareness of and well-timed coping with tensions can become a springboard for community development.
... One way to move the knowledge base forward is a consideration of the results of the past decade with an increasing availability of wide-ranging technological methodologies that can provide data about students' engagement in mathematics learning activities to both teachers and researchers (Type CB research). To address the gap between research and practice for understanding and improving students' mathematical learning experiences, Cai et al. (2018) proposed the collection, analysis, and use of "continuous data on the learning experiences of each student" (p. 363) to facilitate researchers' understanding of explicit connections between teaching practices (Type C) and student learning activities (Type B). ...
... Yet, questions need to be considered if technological and methodological tools exist without overwhelming both researchers and teachers with too many data? According to Cai et al. (2018), the "capacity to capture, process, and store comprehensive cognitive and noncognitive data longitudinally for every student either already exists or is on the near horizon" (p. 364). ...
... 392) are still under development. Still, examining the future potential of technology to access student mathematical thinking for each student in the next decade, Cai et al. (2018) have proposed a framework for collecting, analyzing, and using data on students' mathematical experiences that uses a three-part time frame: (1) in the moment, (2) short term, and (3) long term (p. 366). ...
Book
Full-text available
Mathematics teaching is subject to cultural and temporal conditions. Not only do school and societal conditions shift, and with them the composition of the student body, but also curricular regulations and new mathematical and pedagogical insights determine the content to be taught and the approach to learning used in mathematics classes. To reflect on mathematics teaching in a changing world, there is a need for continuous scientific research into this process of teaching mathematics. Results of this research also have a retrospective impact on mathematics teacher education insofar as the conditions of education need to be continuously adapted to the professional requirements of teachers in practice. Research on teaching mathematics thus bears a great responsibility and is a constantly evolving field of research for scholars around the globe. This book comes at the time when the world is facing an ongoing global pandemic and experiencing violence and unrest due to active war. This publication symbolizes a professional commitment and international collaboration par excellence apropos teaching mathematics. The editors from three different continents and researchers who represent sixteen institutions and eight countries worked constructively and collaboratively with utmost respect for each other, with intentions to reflect on existing research knowledge and to create new knowledge that can be shared and used by other educators and researchers across the world. In preparation for this book, our international group of researchers shared current issues related to the evolution of research on teaching mathematics. We examined the present state of research on mathematics teaching and discussed the theoretical and methodological challenges associated with it, including issues related to conceptualization, instrumentation, and design. Additionally, we explored the likely direction of future research developments. In our literature review and discussions on this project, it became evident that studies on teaching frequently establish direct relationships between units of analysis that, at first glance, cannot be assumed to be directly related in a chain of effects. There are examples of studies presented in this book that directly relate teacher competencies to student achievements using empirical measurement models in a causal or relational way. Without criticizing these studies across the board, however, it seems reasonable to consider moderating or intermediate variables in this chain of effects (Baron & Kenny, 1986), such as the initiated student learning activities observable by teachers in the classroom, aspects of instructional quality (e.g., classroom management or cognitive activation), or corresponding student variables such as attention and cooperation in class or students’ prior knowledge (e.g., Fig. 1). Although there are researchers who do indeed study mediating variables (e.g., Blömeke et al., 2022), it became clear to us that there is a lack of a systematic scientific overview of the complete chain of effects between teacher characteristics, activities, and students’ learning processes. Overviews of precisely these aspects of research on teaching and respective studies are scarce, which inspired this book.
... One way to move the knowledge base forward is a consideration of the results of the past decade with an increasing availability of wide-ranging technological methodologies that can provide data about students' engagement in mathematics learning activities to both teachers and researchers (Type CB research). To address the gap between research and practice for understanding and improving students' mathematical learning experiences, Cai et al. (2018) proposed the collection, analysis, and use of "continuous data on the learning experiences of each student" (p. 363) to facilitate researchers' understanding of explicit connections between teaching practices (Type C) and student learning activities (Type B). ...
... Yet, questions need to be considered if technological and methodological tools exist without overwhelming both researchers and teachers with too many data? According to Cai et al. (2018), the "capacity to capture, process, and store comprehensive cognitive and noncognitive data longitudinally for every student either already exists or is on the near horizon" (p. 364). ...
... 392) are still under development. Still, examining the future potential of technology to access student mathematical thinking for each student in the next decade, Cai et al. (2018) have proposed a framework for collecting, analyzing, and using data on students' mathematical experiences that uses a three-part time frame: (1) in the moment, (2) short term, and (3) long term (p. 366). ...
Chapter
Full-text available
This chapter provides an overview of different conceptualizations of student engagement with mathematical ideas in studies that occur in mathematics classrooms and teaching experiment environments, and the types of quality student mathematics learning activities that result in desired learning outcomes. Over the last three decades, mathematics curriculum initiatives have called for the development of student behaviors and dispositions (i.e., mathematical competencies, processes, proficiencies, and practices) that actively engage all students in knowing and doing mathematics. According to Medley (1987), it is axiomatic that all learning depends on the activity of the learner. One of the main purposes of teaching is to provide students with effective and equitable experiences that will result in successful learner outcomes. Given the complexity of studying student engagement with learning activities, including the “constraint-support system” (Kaput in Handbook of research on mathematics teaching and learning, Macmillan, 1992) of technology-based mathematics activities, and the persistent challenges of conducting research in mathematics classrooms, this chapter describes the evolution of research examining student learning experiences through the lens of multiple theoretical perspectives that provide explanations relevant to how and why student behaviors and dispositions develop in the way they do within different learning environments.
... One way to move the knowledge base forward is a consideration of the results of the past decade with an increasing availability of wide-ranging technological methodologies that can provide data about students' engagement in mathematics learning activities to both teachers and researchers (Type CB research). To address the gap between research and practice for understanding and improving students' mathematical learning experiences, Cai et al. (2018) proposed the collection, analysis, and use of "continuous data on the learning experiences of each student" (p. 363) to facilitate researchers' understanding of explicit connections between teaching practices (Type C) and student learning activities (Type B). ...
... Yet, questions need to be considered if technological and methodological tools exist without overwhelming both researchers and teachers with too many data? According to Cai et al. (2018), the "capacity to capture, process, and store comprehensive cognitive and noncognitive data longitudinally for every student either already exists or is on the near horizon" (p. 364). ...
... 392) are still under development. Still, examining the future potential of technology to access student mathematical thinking for each student in the next decade, Cai et al. (2018) have proposed a framework for collecting, analyzing, and using data on students' mathematical experiences that uses a three-part time frame: (1) in the moment, (2) short term, and (3) long term (p. 366). ...
Chapter
Full-text available
In this chapter we investigate the evolution of research in mathematics education related to digital resources as an essential element of the external context for mathematics teachers’ professional activity. In the relevant research literature, we identified different themes and different kinds of evolution. We investigate the evolution of research with respect to educational policies related to digital resources, and to teacher integration of digital resources, including digital assessment. We also analyze the evolution of research concerning the quality of digital curriculum resources, and discuss emerging research questions related to mathematics and programming; to collective dimensions of teachers’ work with digital resources; and about the COVID-19 pandemic consequences. The different kinds of research developments are a result of evolution in the external context, or from more general trends in the research in mathematics education. We finally discuss possible directions for future research.
... One way to move the knowledge base forward is a consideration of the results of the past decade with an increasing availability of wide-ranging technological methodologies that can provide data about students' engagement in mathematics learning activities to both teachers and researchers (Type CB research). To address the gap between research and practice for understanding and improving students' mathematical learning experiences, Cai et al. (2018) proposed the collection, analysis, and use of "continuous data on the learning experiences of each student" (p. 363) to facilitate researchers' understanding of explicit connections between teaching practices (Type C) and student learning activities (Type B). ...
... Yet, questions need to be considered if technological and methodological tools exist without overwhelming both researchers and teachers with too many data? According to Cai et al. (2018), the "capacity to capture, process, and store comprehensive cognitive and noncognitive data longitudinally for every student either already exists or is on the near horizon" (p. 364). ...
... 392) are still under development. Still, examining the future potential of technology to access student mathematical thinking for each student in the next decade, Cai et al. (2018) have proposed a framework for collecting, analyzing, and using data on students' mathematical experiences that uses a three-part time frame: (1) in the moment, (2) short term, and (3) long term (p. 366). ...
Chapter
Full-text available
Lesson planning, assessment, and reflection constitute the key actions that teachers perform when students are not present in the classroom (henceforth, “Type D” variable). These “pre- and post-”actions are the most direct ways through which teachers shape their observable teaching work as mediated by their goals for their teaching. These goals are representations of teachers’ epistemological commitments apropos of teaching mathematics, whether those commitments be consciously espoused or unconsciously reproduced due to constraints within which they work. In this chapter, we survey the literature on lesson planning, assessment, and reflection according to eight epistemological paradigms that are widely known in the field of mathematics teaching. These epistemological paradigms are: Situated Learning Theory, Behaviorism, Cognitive Learning Theory, Social Constructivism, Structuralism, Problem Solving, Culturally Relevant Pedagogy, and Project- and Problem-Based Learning. We situate other perspectives on learning theory, which are derivatives of these prevailing paradigms, within this overarching frame. Our literature search revealed that some of the theoretical perspectives are well-reported in the literature whilst others have not received the same amount of attention from researchers. We detail each perspective, providing a definition, goals for teaching, pros and cons, and examples from the literature. We posit that, with the advent of the digital era of mathematics education, researchers must engage more explicitly with the theoretical perspectives we identified as underserved and must reckon with their own epistemological commitments more intentionally when reporting on studies regarding Type D.
... As models of teaching and learning change, as in the case of the response to COVID-19, teachers must be able to adapt their professional knowledge for teaching on a rapid, iterative basis. Cai et al. (2018b) proposed how the use of a professional knowledge base storing lessons and instructional adaptations that are aggregated over time and that involve teacherresearcher partnerships could have direct implications for developing professional learning. Cai et al. (2020) discussed how researchers must work to supplement and build teachers' specific, lesson-level professional knowledge to create learning opportunities for students as well as how to share this knowledge and make it accessible. ...
... Elsewhere, we have discussed the need for artifacts-tangible products-that can store professional knowledge and that can form the foundation of a knowledge base for the profession (Cai et al., 2018b). Such artifacts are a way to give a physical reality to the dual processes of theory for teaching and teaching for theory; they act as "carriers" that facilitate the storing, sharing and growth of professional knowledge. ...
Book
Full-text available
This open access book seeks to create a forum for discussing key questions regarding theories on teaching: Which theories of teaching do we have? What are their attributes? What do they contain? How are they generated? How context-sensitive and content-specific do they need to be? Is it possible or even desirable to develop a comprehensive theory of teaching? The book identifies areas of convergence and divergence among the answers to these questions by prominent international scholars in research on teaching. Initiating exchanges among the authors, it then evaluates whether consensus can be reached on the areas of divergence. The book concludes by discussing lessons learned from this endeavor and outlines steps that need to be taken for advancing future work on theorizing teaching. As such, the book is aimed at readers interested in an overview of the theorizing of teaching and key open questions that, if addressed, help to move the field forward.
... As models of teaching and learning change, as in the case of the response to COVID-19, teachers must be able to adapt their professional knowledge for teaching on a rapid, iterative basis. Cai et al. (2018b) proposed how the use of a professional knowledge base storing lessons and instructional adaptations that are aggregated over time and that involve teacherresearcher partnerships could have direct implications for developing professional learning. Cai et al. (2020) discussed how researchers must work to supplement and build teachers' specific, lesson-level professional knowledge to create learning opportunities for students as well as how to share this knowledge and make it accessible. ...
... Elsewhere, we have discussed the need for artifacts-tangible products-that can store professional knowledge and that can form the foundation of a knowledge base for the profession (Cai et al., 2018b). Such artifacts are a way to give a physical reality to the dual processes of theory for teaching and teaching for theory; they act as "carriers" that facilitate the storing, sharing and growth of professional knowledge. ...
Chapter
Full-text available
In discussing theories of teaching, we take the position that there is a two-way street between what we call theory for teaching and teaching for theory . We articulate the linkages between these two dynamic processes through a particular conceptualization of professional knowledge for teaching carried by tangible artifacts. Within this context we have tried to answer a set of questions about theory and teaching: (1) What is a theory (of teaching)? (2) What should it contain and why? (3) Can such a theory accommodate differences across subject matters and student populations taught? If so, how? If not, why? (4) Do we already have a theory or theories on teaching? If so, which are they? (5) In the future, in what ways might it be possible, if at all, to create a (more comprehensive) theory of teaching? To answer these questions, we draw on the lens of Confucian learning as well as examples from Chinese and U.S. mathematics education to elaborate on understanding, assessing, and accumulating professional knowledge for teaching.
... Such studies are usually done to help them in their school environment and not to do a job search. [8] The most common approach is classifying students into clusters created using basic clustering methods such as k-means, apriori algorithm or decision trees. ...
Chapter
Modern technical universities help students get practical experience. They educate thousands of students and it is hard for them to connect individual students with relevant industry experts and opportunities. This article aims to solve this problem by designing a matchmaking procedure powered by a recommendation system, an ontology, and knowledge graphs. We suggest improving recommendations and reducing the cold-start problem with a re-ranking module based on student educational profiles for students who opt-in. Each student profile is represented as a knowledge graph derived from the successfully completed courses of the individual. The system was tested in an online experiment and demonstrated that recommendations based on student educational profiles and their interaction history significantly improve conversion rates over non-personalised offers.Keywordscooperation with industryknowledge graphsrecommender systemstudent profilingjob recommendationontology
... Therefore, using such an approach at higher learning institutions may compromise student experiences, success, and retention, as well as institutional performance. According to (Cai et al. 2018) data may be used to understand and improve the student experience. ...
Chapter
Full-text available
The Fourth Industrial Revolution (4IR) brought disruptive technologies, dramatically changing the way businesses operate. Higher education institutions make use of learning management systems (LMS) primarily for teaching, learning, and assessment. The COVID-19 pandemic has pushed the use of technology for academic continuity, resulting in institutions using the LMS for virtual engagement with students, student collaboration, and assessment, and as a repository for resources. Student behaviour on the LMS can be tracked, yielding learning analytics that may be used to improve student success, retention, experience, and institutional performance. This paper is an exploration of institutional readiness for learning analytics. We adopted a qualitative approach, using purposive sampling to select the institution and initial participants. We used the snowball technique to recruit further participants. The personality traits stated in the Technology Readiness Index model were used to formulate interview questions. The findings show that the institution has systems in place to support students, which were launched to address insights from LMS-based learning analytics. The institution is ready to use learning analytics, with participants innovatively using the LMS, showing enthusiasm, and working to realise the full potential of learning analytics. We recommend the use of learning analytics to develop effective student support. Keywords: Fourth Industrial Revolution (4IR), data-driven decision support, higher education, learning analytics, technology readiness
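As a concrete, if simplified, picture of the LMS-based learning analytics the paper refers to, the sketch below aggregates an invented event log into per-student engagement indicators and flags students who may need follow-up. The log format, the metrics, and the threshold are assumptions made for illustration; real LMS exports and early-warning rules vary by platform and institution.

```python
# Illustrative learning-analytics sketch: aggregate an (invented) LMS event log
# into per-student engagement indicators and flag students who may need support.
import pandas as pd

events = pd.DataFrame({
    "student":  ["s1", "s1", "s2", "s3", "s3", "s3", "s2", "s1"],
    "action":   ["login", "view_resource", "login", "login",
                 "submit_quiz", "post_forum", "view_resource", "submit_quiz"],
    "timestamp": pd.to_datetime([
        "2024-03-01", "2024-03-02", "2024-03-01", "2024-03-01",
        "2024-03-03", "2024-03-04", "2024-03-10", "2024-03-05",
    ]),
})

summary = events.groupby("student").agg(
    total_actions=("action", "size"),
    active_days=("timestamp", lambda ts: ts.dt.date.nunique()),
    last_seen=("timestamp", "max"),
)

# A crude, illustrative flag: fewer than three recorded actions.
summary["needs_follow_up"] = summary["total_actions"] < 3
print(summary)
```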
Article
Full-text available
We began our editorials in 2017 seeking answers to one complex but important question: How can we improve the impact of research on practice? In our first editorial, we suggested that a first step would be to better define the problem by developing a better understanding of the fundamental reasons for the divide between research and practice (Cai et al., 2017). This sparked subsequent editorials in which we delved deeper into some of the many complicated facets of this issue. In our March (Cai et al., 2017b) editorial, we argued that impact needs to be defined more broadly than it often has been, notably, to include cognitive and noncognitive outcomes in both the near term and longitudinally. This led us to focus our May (Cai et al., 2017a) editorial on the ways that research might have a greater impact on the learning opportunities that help students reach broader learning goals. We argued that it is not enough to identify learning goals; it is also necessary to conduct research that breaks those learning goals into subgoals that can be appropriately sequenced. We highlighted research on learning trajectories as an example of this sort of work but also emphasized the need to work at a grain size that is compatible with teachers' classroom practice. Finally, in our July (Cai et al., 2017c) editorial, we argued that the implementation of learning opportunities in the classroom is an integral element of research that has an impact on practice.
Article
Full-text available
In our May editorial (Cai et al., 2017), we argued that a promising way of closing the gap between research and practice is for researchers to develop and test sequences of learning opportunities, at a grain size useful to teachers, that help students move toward well-defined learning goals. We wish to take this argument one step further. If researchers choose to focus on learning opportunities as a way to produce usable knowledge for teachers, we argue that they could increase their impact on practice even further by integrating the implementation of these learning opportunities into their research. That is, researchers who aim to impact practice by studying the specification of learning goals and productively aligned learning opportunities could add significant practical value by including implementation as an integral part of their work.
Article
Full-text available
In our first editorial (Cai et al., 2017), we highlighted the long-standing, critical issue of improving the impact of educational research on practice. We took a broad view of impact, defining it as research having an effect on how students learn mathematics by informing how practitioners, policymakers, other researchers, and the public think about what mathematics education is and what it should be. As we begin to dig more deeply into the issue of impact, it would be useful to be more precise about what impact means in this context. In this editorial, we focus our attention on defining and elaborating exactly what we mean by “the impact of educational research on students' learning.”
Article
Full-text available
As educators face increasing pressure from federal, state, and local accountability policies to improve student achievement, the use of data has become more central to how many educators evaluate their practices and monitor students' academic progress (Knapp et al., 2006). Despite this trend, questions about how educators should use data to make instructional decisions remain mostly unanswered. In response, this guide provides a framework for using student achievement data to support instructional decision making. These decisions include, but are not limited to, how to adapt lessons or assignments in response to students' needs, how to alter classroom goals or objectives, and how to modify student-grouping arrangements. The guide also provides recommendations for creating the organizational and technological conditions that foster effective data use. Each recommendation describes action steps for implementation, as well as suggestions for addressing obstacles that may impede progress. In adopting this framework, educators will be best served by implementing the recommendations in this guide together rather than individually.
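One of the decisions the guide mentions is modifying student-grouping arrangements in response to achievement data. The sketch below is a minimal, hypothetical example of that kind of data-informed regrouping using quantiles of a pre-assessment score; the scores, group labels, and three-way split are invented and are not drawn from the guide.

```python
# Minimal, hypothetical sketch of one data-informed decision the guide mentions:
# regrouping students based on a pre-assessment score. Scores are invented.
import pandas as pd

scores = pd.DataFrame({
    "student": ["Ana", "Ben", "Caro", "Dee", "Eli", "Fay"],
    "pre_assessment": [42, 78, 55, 90, 61, 48],
})

# Split into three equal-sized groups by score rank; labels are illustrative.
scores["group"] = pd.qcut(scores["pre_assessment"], q=3,
                          labels=["needs reteaching", "developing", "extending"])
print(scores.sort_values("pre_assessment"))
```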
Article
In our March editorial (Cai et al., 2018), we considered the problem of isolation in the work of teachers and researchers. In particular, we proposed ways to take advantage of emerging technological resources, such as online archives of student data linked to instructional activities and indexed by learning goals, to produce a professional knowledge base (Cai et al., 2017b, 2018). This proposal would refashion our conceptions of the nature and collection of data so that teachers, researchers, and teacher-researcher partnerships could benefit from the accumulated learning of ordinarily isolated groups. Although we have discussed the general parameters for such a system in previous editorials, in this editorial, we present a potential mechanism for accumulating learning into a professional knowledge base, a mechanism that involves collaboration between multiple teacher-researcher partnerships. To illustrate our ideas, we return once again to the collaboration between fourth-grade teacher Mr. Lovemath and mathematics education researcher Ms. Research, who are mentioned in our previous editorials (Cai et al., 2017a, 2017b).
Article
In our November 2017 editorial (Cai et al., 2017), we presented a vision of a future in which research has a significant impact on practice. In the world we described, researchers and teachers work together, sharing similar goals and incentive structures. A critical feature of this brave new world is the existence of an online professional knowledge base comprising "useful findings and artifacts that are continuously refined over time, indexed by specific learning goals and subgoals, and that assist teachers and researchers in implementing learning opportunities in their classrooms" (p. 469). Moreover, we argued that teacher-researcher partnerships are a necessary condition for greater impact on practice.
Article
In this article, Graham Nuthall critiques four major types of research on teaching effectiveness: studies of best teachers, correlational and experimental studies of teaching-learning relationships, design studies, and teacher action and narrative research. He gathers evidence about the kind of research that is most likely to bridge the teaching-research gap, arguing that such research must provide continuous, detailed data on the experience of individual students, in-depth analyses of the changes that take place in the students' knowledge, beliefs, and skills, and ways of identifying the real-time interactive relationships between these two different kinds of data. Based on his exploration of the literature and his research on teaching effectiveness, Nuthall proposes an explanatory theory for research on teaching that can be directly and transparently linked to classroom realities.
Article
Experience-sampling methods (ESM) enable us to learn about individuals' lives in context by measuring participants' feelings, thoughts, actions, context, and/or activities as they go about their daily lives. By capturing experience, affect, and action in the moment and with repeated measures, ESM approaches allow researchers to expand the areas and aspects of participants' experiences they can investigate and describe and to better understand how people and contexts shape these experiences. We argue that ESM approaches can be particularly enriching for education research because they enable us to ask new and interesting questions about how students, teachers, and school leaders engage with education as they live their lives, and thus help us to better understand how education contexts shape learning and other outcomes. In this article, we highlight the value of these approaches and the new and exciting questions they may help education researchers answer as they allow us to uncover experience in new ways.
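To make the repeated-measures character of ESM data concrete, the sketch below summarises invented momentary engagement reports into within-person means and variability, plus a simple comparison across classroom contexts. The variables, scale, and summaries are assumptions for illustration, not an analysis prescribed by the article.

```python
# Illustrative ESM analysis sketch: repeated momentary reports per student,
# summarised into within-person means and variability. Data are invented.
import pandas as pd

esm = pd.DataFrame({
    "student":    ["s1"] * 4 + ["s2"] * 4,
    "prompt":     [1, 2, 3, 4] * 2,
    "engagement": [3, 4, 2, 5, 1, 2, 2, 3],   # momentary self-reports, 1-5 scale
    "context":    ["math", "math", "science", "math",
                   "math", "science", "math", "math"],
})

# Person-level summaries: how engaged is each student on average, and how much
# does that engagement fluctuate across moments?
person_level = esm.groupby("student")["engagement"].agg(["mean", "std"])

# Context-level summary: does reported engagement differ by class context?
context_level = esm.groupby("context")["engagement"].mean()

print(person_level)
print(context_level)
```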
Article
This 3-year longitudinal study investigated the development of 82 children's understanding of multidigit number concepts and operations in Grades 1-3. Students were individually interviewed 5 times on a variety of tasks involving base-ten number concepts and addition and subtraction problems. The study provides an existence proof that children can invent strategies for adding and subtracting and illustrates both what that invention affords and the role that different concepts may play in that invention. About 90% of the students used invented strategies. Students who used invented strategies before they learned standard algorithms demonstrated better knowledge of base-ten number concepts and were more successful in extending their knowledge to new situations than were students who initially learned standard algorithms.