ISSN: 2292-8588 Volume 35, No. 1, 2020
This work is licensed under a Creative Commons Attribution 3.0 Unported License
Exploring the Creation of Instructional Videos to Improve
the Quality of Mathematical Explanations for Pre-Service
Teachers
Dr. Robin Kay and Dr. Robyn Ruttenberg-Rozen
Abstract: One of the primary skills required by mathematics teachers is the ability to provide
effective explanations to their students. Using Kay’s (2014) theory-based framework for creating
instructional videos, this study explored the quality and growth of explanations embedded in
mathematical instructional videos created by 37 pre-service teachers (female = 26, male = 11).
The Instructional Video Evaluation Scale (IVES), comprised of four constructs (establishing
context, explanation heuristics, minimizing cognitive load, engagement), was used to assess the
quality of two videos (pre-feedback and post-feedback). The initial video created by pre-service
teachers (pre-feedback) revealed a number of problem areas, including providing a clear
problem label, using visual supports, noting potential errors that might occur, writing legibly,
highlighting key areas, listing key terms and formulas, being concise, and using a clear,
conversational voice. After receiving detailed feedback based on the IVES, the ratings of the
second video (post-feedback) for each of the initial problem areas increased significantly. The
IVES scale, grounded on Kay’s (2014) framework, helped identify and improve the
effectiveness of pre-service teachers’ explanations of mathematics concepts.
Keywords: pre-service teachers, instructional videos, mathematics teaching,
explanation
Résumé: One of the primary skills required of mathematics teachers is the ability to provide effective explanations to their students. Using Kay's (2014) theory-based framework for creating instructional videos, this study explored the quality and growth of explanations embedded in mathematics instructional videos created by 37 pre-service teachers (female = 26, male = 11). The Instructional Video Evaluation Scale (IVES), composed of four constructs (establishing context, explanation heuristics, minimizing cognitive load, engagement), was used to assess the quality of two videos (pre-feedback and post-feedback). The initial video created by the pre-service teachers (pre-feedback) revealed a number of problem areas, including providing a clear problem label, using visual supports, noting potential errors that might occur, writing legibly, highlighting key areas, listing key terms and formulas, being concise, and using a clear, conversational voice. After receiving detailed feedback based on the IVES, the ratings of the second video (post-feedback) increased significantly for each of the initial problem areas. The IVES scale, grounded in Kay's (2014) framework, helped identify and improve the effectiveness of pre-service teachers' explanations of mathematics concepts.
Mots-clés: pre-service teachers, instructional videos, mathematics teaching, explanation
Introduction
Instructional explanations are developed to help learners understand or apply concepts
related to a specific subject area (Leinhardt, 2001). Exemplification, or providing
examples, is critical for effective mathematics explanations (Bills et al., 2006; Inoue,
2009) and helps simplify abstract mathematical concepts (Rowland, 2008). Kirschner et
al. (2006) provide substantial evidence that direct instruction through the use of well-
explained worked examples is particularly useful when students have a limited
understanding of concepts to be learned. Learning to provide clear and explicit
explanations of specific mathematical examples, then, is an essential skill to develop for
secondary school pre-service teachers (Atkinson et al., 2000). Developing those
explanation skills, especially within pre-service mathematics teacher education, is a
complex process (Kay, 2014). A teacher's pedagogical decisions are multifaceted, involving which mathematical elements to highlight, which procedures to use, which technologies to employ, and which representations to present. Each of these decisions strongly influences student learning (Kay, 2014).
Educational research on creating effective instructional explanations in mathematics
and other subject areas has been side-stepped for many years due to a heavy emphasis on
problem-based and exploratory methods (Schopf et al., 2019). Consequently,
comprehensive, evidence-based frameworks on developing explanatory competency
are limited (Kay, 2014; Schopf et al., 2019). However, a set of general guidelines has
emerged including referencing previous knowledge, providing clear objectives and
structures, demonstrating use of knowledge, presenting examples and general rules,
offering visual aids, using straightforward language, following an appropriate pace,
and creating a positive atmosphere of humour and enthusiasm (Schopf et al., 2019;
Wittwer & Renkl, 2008).
An amalgam of heuristics for creating effective mathematics explanations includes
providing an overview at the beginning, following a series of steps in logical order,
offering clear definitions, incorporating adequate visualizations, and using appropriate
language to match the learner (Schopf et al., 2019). However, a coherent framework for
designing and delivering effective mathematical explanations through technology has
yet to be developed. Furthermore, data collection on the quality of mathematics
explanations is predominantly passive and does not adequately assess the process of
explaining.
Literature Review
An alternative approach to examining and fostering high-quality mathematical
explanations is to investigate how technology can be leveraged with video-based
worked examples. With over 1.3 billion users and 5 billion videos watched each day on
YouTube (MerchDope, 2020), an argument could be made that video explanations are
becoming more relevant and dominant than face-to-face explanations. The use of videos
to explain worked examples has been examined by researchers under different labels,
including podcasts (e.g., Crippen & Earl, 2004; Kay, 2014; Loomes et al., 2002), flipped
learning (e.g., Long et al., 2016; Sahin et al., 2015; Triantafyllou & Timcenko, 2015), and
video lectures (e.g., Giannakos et al., 2015; Ljubojevic et al., 2014). A video format is
useful for critically investigating the quality of explanations because it allows both
students and instructors to carefully review, replay, compare, and reflect upon critical
elements presented.
Consistent with previous research heuristics on explanatory competence (Schopf et al.,
2019; Wittwer & Renkl, 2008), Kay (2014) developed an evidence-based framework to
guide the creation of effective video-based explanations of worked examples. This
framework includes four areas: establishing the context of the problem, explanation
heuristics, minimizing cognitive load, and engaging students. Establishing context
includes providing a clear problem label (Bransford et al., 2000), explaining what the
problem is asking (Willingham, 2009), and identifying the type of problem presented
(Ball & Bass, 2000). Providing effective explanations involves breaking a problem into
meaningful steps (e.g., Mason et al., 2010; Polya, 2004), explaining the reasoning for
each step, and using visual supports (Atkinson et al., 2000; Clark & Mayer, 2008; Renkl,
2005). Minimizing cognitive load encompasses factors such as presenting problems in a
well-organized layout, writing clearly, and drawing students’ attention to key aspects
of the problem using visual highlighting (Clark & Mayer, 2008; Willingham, 2009).
Finally, engaging students while explaining worked examples refers to using a clear,
personalized voice and proceeding at a pace that is suitable for learning (not too fast,
not too slow), and minimizing distractions (Atkinson et al., 2005; Clark & Mayer, 2008;
Kester et al., 2006).
This study explored the evolution of pre-service teachers’ video-based explanations of
Grade 7 and 8 mathematical concepts using feedback provided by Kay’s (2014)
instructional framework.
Research Design and Methods
Participants
Thirty-seven pre-service teachers (female = 26, male = 11) participated in this study.
They were enrolled in a 12-week course focusing on teaching mathematics taught in
Grades 7 to 12 (intermediate/senior level). This course was part of a 1-year Bachelor of
Education program, situated in a small university (8,000 students) within a community
of 650,000 people. English was the second language for 32% (n = 12) of the participants.
Data Collection
The Instructional Video Evaluation Scale (IVES), based on Kay’s (2014) framework, was
used to analyze the video-based mathematical explanations of the pre-service teachers.
The IVES, consisting of 19 items, focuses on four constructs: establishing context (n = 3
items), creating effective explanations (n = 7 items), minimizing cognitive load (n = 4
items), and engagement (n = 5 items). Each item in the IVES was rated on a three-point
scale (0 = No, 1 = Sort of, 2 = Yes) assessing whether a pre-service teacher demonstrated
a specific explanation quality.
The first construct, establishing context (problem label, type, key elements), had an
internal reliability coefficient of 0.77. The second construct, creating effective
explanations (all key steps, clear reasoning, mathematical conventions, appropriate
strategy, tips, visuals, potential errors), had an internal reliability coefficient of 0.85. The
third construct, minimizing cognitive load (organized layout, readability, highlighting,
support information), had an internal reliability coefficient of 0.60. The final construct,
engagement (limiting distractions, pace, voice, length, tone), had an internal reliability
coefficient of 0.69. With the exception of the cognitive load construct, the internal
reliability values are considered acceptable for scales used in social sciences (Kline,
1999; Nunnally, 1978).
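For readers who wish to compute comparable reliability estimates for their own rating data, the sketch below implements the standard Cronbach's alpha formula in Python; the 37 × 3 score matrix is randomly generated for illustration only and does not reproduce the study's data.

```python
import numpy as np

def cronbach_alpha(ratings):
    """Cronbach's alpha for an (n_videos x n_items) matrix of item scores."""
    ratings = np.asarray(ratings, dtype=float)
    k = ratings.shape[1]                           # number of items in the construct
    item_vars = ratings.var(axis=0, ddof=1)        # sample variance of each item
    total_var = ratings.sum(axis=1).var(ddof=1)    # variance of construct totals
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Illustrative data: 37 videos scored 0/1/2 on the three establishing-context items,
# generated from a shared latent quality so that the items correlate.
rng = np.random.default_rng(0)
latent = rng.normal(1.0, 0.5, size=(37, 1))
context_scores = np.clip(np.round(latent + rng.normal(0, 0.3, size=(37, 3))), 0, 2)
print(f"alpha = {cronbach_alpha(context_scores):.2f}")
```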
Procedure
During a 2-hour teaching session, we introduced pre-service teachers to screencasting
software (Camtasia) and how to use the required hardware (laptop computer with a
Wacom tablet). At the end of the teaching session, we asked them to create instructional
videos covering one or more mathematics concepts in the Grade 7 or 8 Ontario
Elementary Mathematics Curriculum (Ontario Ministry of Education, 2005). We then
provided pre-service teachers with a detailed description of the criteria for each of the
19 items in the IVES to help guide the creation of their instructional mathematics
videos.
Each student created one 4- to 6-minute instructional video on a self-selected topic
within the Grade 7 or 8 Ontario Mathematics Curriculum (Ontario Ministry of
Education, 2005). These videos addressed four of the five strands in the Ontario
Mathematics Curriculum: (i) number sense and numeration (n = 10 videos), (ii) geometry
and spatial sense (n = 13 videos), (iii) patterning and algebra (n = 6 videos), and (iv) data
management and probability (n = 8 videos).
After students created the first video (pre-feedback), they received detailed feedback
based on the IVES framework. We then asked the students to create a second 4- to 6-
minute instructional video (post-feedback) on the same topic as the first.
Data Analysis
For the first video (pre-feedback), we calculated means and standard deviations for
each item on the IVES to provide an overview of pre-service teachers’ initial explanation
skills. Next, we compared the percentage of pre-service teachers who fully achieved
items on the IVES to identify where students excelled and struggled with mathematical
explanations. Finally, we conducted paired t-tests for the entire IVES scale (total score),
individual IVES items, and the length of videos to determine whether the quality of
mathematics explanations changed based on feedback provided by the IVES.
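As a concrete sketch of this analysis pipeline (the paper does not specify its statistical software, and the simulated scores below are hypothetical), the item descriptives, "fully achieved" percentages, paired t-test, and effect size could be computed as follows:

```python
import numpy as np
from scipy import stats

# Hypothetical data: 37 teachers x 19 IVES items scored 0/1/2 for the
# pre-feedback and post-feedback videos (illustration only).
rng = np.random.default_rng(42)
pre = rng.integers(0, 3, size=(37, 19)).astype(float)
post = np.clip(pre + rng.integers(0, 2, size=(37, 19)), 0, 2)

# Item-level means, standard deviations, and percentage fully achieved (score == 2).
item_means = pre.mean(axis=0)
item_sds = pre.std(axis=0, ddof=1)
fully_achieved = (pre == 2).mean(axis=0) * 100

# Paired t-test on total scores (mean across the 19 items per teacher), df = n - 1.
pre_total, post_total = pre.mean(axis=1), post.mean(axis=1)
t, p = stats.ttest_rel(post_total, pre_total)

# Effect size: the paper cites Cohen (1988, 1992) without naming the exact d
# variant; pooling the pre/post SDs approximately reproduces the reported values.
pooled_sd = np.sqrt((pre_total.var(ddof=1) + post_total.var(ddof=1)) / 2)
d = (post_total.mean() - pre_total.mean()) / pooled_sd
print(f"t(36) = {t:.2f}, p = {p:.3f}, d = {d:.2f}")
```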
Research Questions
We addressed two research questions in this study:
1. What strengths and challenges do pre-service teachers demonstrate when
creating instructional video explanations of Grade 7 and 8 mathematics content?
2. How does the quality of mathematical explanations provided by pre-service
teachers change based on feedback from the IVES?
Results
Overview
A paired t-test revealed that the average IVES post-feedback score (M = 1.51, SD = 0.48) was significantly higher than the average IVES pre-feedback score (M = 1.37, SD = 0.45), t(36) = 2.54, p < .05, with a medium effect size of 0.29 (Cohen, 1988, 1992). In other words, the overall quality of explanations increased significantly after pre-service teachers received feedback based on the IVES framework. A second paired t-test indicated that the average post-feedback mathematics instructional video (M = 352.5 seconds, SD = 147.4) was significantly longer than the average pre-feedback video (M = 234.0 seconds, SD = 73.7), with a large effect size of 1.02 (Cohen, 1988, 1992).
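These effect sizes are consistent with a Cohen's d computed from the pooled pre- and post-feedback standard deviations (the paper does not state the exact variant used); for the video-length comparison, for instance:

$$
d = \frac{M_{\text{post}} - M_{\text{pre}}}{\sqrt{\left(SD_{\text{pre}}^{2} + SD_{\text{post}}^{2}\right)/2}}
  = \frac{352.5 - 234.0}{\sqrt{\left(73.7^{2} + 147.4^{2}\right)/2}}
  \approx \frac{118.5}{116.5}
  \approx 1.02
$$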
Establishing Context
The average initial (pre-feedback) context item score was 1.53 (SD = 0.56), the highest of
the four IVES themes. Seven out of ten students were able to articulate the context and
type of problem addressed in the video, and six out of ten were successful at noting the
key elements for solving the problem, but only half of the students fully achieved the
criteria of correctly labelling the problem (Table 1).
A paired t-test comparing average pre- and post-feedback total context scores was not
significant (t(36) = 0.09, ns). Paired t-tests conducted on the three individual context
items on the IVES revealed no significant differences between pre-feedback and post-
feedback scores.
Table 1
Pre- vs. Post-Feedback Video Scores for Establishing Context Items (n = 37)

Item                                       Pre-Feedback   Item Fully   Post-Feedback   Item Fully
                                           Mean (SD)      Achieved     Mean (SD)       Achieved
Context and type of problem articulated    1.59 (0.64)    68%          1.57 (0.65)     65%
Key elements of problem explained          1.54 (0.61)    60%          1.54 (0.69)     65%
Clear problem label                        1.46 (0.65)    54%          1.51 (0.73)     65%
Total context score                        1.53 (0.56)    NA           1.54 (0.62)     NA
Creating Effective Explanations
The average initial (pre-feedback) explanation item score was 1.27 (SD = 0.52), the lowest
among the four IVES constructs. Approximately six out of ten students were successful
at showing and explaining all the key steps in their mathematics problems and using
the correct mathematical conventions. About half the students were able to offer a
suitable strategy or tip for solving problems. Only one quarter of the students offered
visuals to support their explanations or noted potential errors one might make when
solving the problem addressed in the
video (Table 2).
The average post-feedback total explanation score was significantly higher than the
average pre-feedback score, with a medium effect size (Table 2). Paired t-tests
comparing pre- and post-feedback scores for the seven individual explanation items on
the IVES indicated significant gains for two items: the use of visuals to support
explanations, and communicating potential errors that students could make when
trying to solve the mathematics problem (Table 2). The effect size for the use of visual
supports item is considered medium, according to Cohen (1988, 1992). The effect size
for the noting potential errors item is considered small, according to Cohen (1988, 1992).
It is worth noting that these two items were the lowest-rated explanation items on pre-
feedback videos. The remaining five explanation construct items showed no significant
increases between pre- and post-feedback scores.
Table 2
Pre- vs. Post-Feedback Video Scores for Effective Explanation Items (n = 37)

Item                                        Pre-Feedback   Item Fully   Post-Feedback   Item Fully
                                            Mean (SD)      Achieved     Mean (SD)       Achieved
Show all key steps                          1.59 (0.60)    65%          1.65 (0.59)     70%
Explain reasoning behind each step          1.49 (0.65)    57%          1.62 (0.64)     70%
Use correct mathematics conventions         1.46 (0.69)    57%          1.54 (0.61)     60%
Use appropriate strategy to solve problem   1.41 (0.64)    49%          1.54 (0.65)     62%
Offer tips for solving problems             1.35 (0.75)    51%          1.46 (0.80)     65%
Use visuals to support explanation          0.89 (0.74)    22%          1.27 (0.87)¹    54%
Note potential errors that could be made    0.70 (0.88)    27%          0.92 (0.92)²    38%
Total effective explanation score           1.27 (0.52)    NA           1.43 (0.56)³    NA

¹ t(36) = 2.90, p < .01
² t(36) = 2.10, p < .05
³ t(36) = 2.37, p < .05
Minimizing Cognitive Load
The average cognitive load item score was 1.43 (SD = 0.42), the second highest among the
four IVES themes. Eight out of ten students provided a clear, organized layout for
presenting their problems. Half the students had legible writing and visually
highlighted key points in the video explanations. Only one third of the students listed
key supportive elements like key terms or formulas needed to solve the problems (Table
3).
The average post-feedback total cognitive load score was significantly higher than the
average pre-feedback score, with a medium effect size (Table 3). Paired t-tests
comparing pre- and post-feedback scores for the four cognitive load items on the IVES
revealed a significant increase in listing key supportive elements when providing a
mathematical explanation (Table 3). The effect size for this increase is considered
medium, according to Cohen (1988, 1992). It is worth noting that listing key supportive
elements was the lowest-rated cognitive load item in the pre-feedback videos (Table 3).
The remaining three cognitive load items increased, but not significantly (Table 3).
Table 3
Pre- vs. Post-Feedback Video Scores for Cognitive Load Items (n = 37)

Item                                  Pre-Feedback   Item Fully   Post-Feedback   Item Fully
                                      Mean (SD)      Achieved     Mean (SD)       Achieved
Clear, organized layout of problem    1.78 (0.48)    81%          1.81 (0.46)     84%
Readability of writing                1.43 (0.60)    49%          1.54 (0.69)     65%
Visually highlighting key points      1.41 (0.69)    51%          1.46 (0.77)     62%
Listing supportive elements           1.11 (0.74)    32%          1.46 (0.73)¹    59%
Total cognitive load score            1.43 (0.43)    NA           1.57 (0.52)²    NA

¹ t(36) = 2.10, p < .05
² t(36) = 2.22, p < .05
Engagement
The average engagement item score was 1.37 (SD = 0.52), the third highest among the
four IVES themes. Six out of ten students limited distracting behaviour (e.g., saying
“uhm” too often, clearing throat, poor sound quality) and proceeded at a pace that was
effective for learning a new concept. Half the students were successful at using a clear,
engaging conversational voice to present concepts and provide an explanation that was
not too long or too short.
The average post-feedback engagement score was significantly higher than the average
pre-feedback score, with a medium effect size (Table 4). All five individual engagement
items increased from pre- to post-feedback scores. However, paired t-tests conducted
on the five individual engagement items on the IVES revealed that these gains were not
significant (Table 4).
Table 4
Pre- vs. Post-Feedback Video Scores for Engagement Items (n = 37)

Item                                 Pre-Feedback   Item Fully   Post-Feedback   Item Fully
                                     Mean (SD)      Achieved     Mean (SD)       Achieved
Limiting distractions                1.46 (0.80)    65%          1.73 (0.60)     76%
Effective pace for learning          1.43 (0.80)    62%          1.57 (0.60)     62%
Clear voice                          1.41 (0.73)    54%          1.57 (0.69)     58%
Appropriate length of explanation    1.32 (0.78)    51%          1.46 (0.80)     65%
Engaging conversational voice        1.24 (0.80)    46%          1.46 (0.69)     46%
Total engagement score               1.37 (0.52)    NA           1.56 (0.52)¹    NA

¹ t(36) = 2.68, p < .05
Discussion
Providing effective explanations is an essential skill for mathematics teachers (Bills et
al., 2006). In support of developing this skill, we assessed changes in video-based
explanations created by pre-service secondary teachers using four constructs:
establishing context, creating effective explanations, minimizing cognitive load, and
engagement.
Establishing Context
Regarding establishing context in their initial (pre-feedback) video explanations, most
pre-service teachers were able to communicate the type of problem presented.
However, 40% struggled with presenting the key elements required to solve the
problem. Pre-service teachers are just beginning to unpack their previously automatized
mathematical knowledge to make it useful for teaching (Ball & Forzani, 2009). The more
automatized their knowledge, the harder it is to unpack and communicate the required
steps to explain the mathematics concepts. Pre-service teachers might need more direct
guidance with elementary mathematics problems, or they may need to observe students
trying to solve these problems to better understand which elements are essential for
naïve or new learners (Santagata & Bray, 2016).
Almost half of the pre-service teachers were unable to provide a clear label for their
problem, an issue that may be related to new teachers not having an evolved schema of
how to organize and categorize problems. Without entirely unpacking their knowledge,
beginning teachers can solve mathematics problems, but may not understand the big
picture well enough to provide adequate labels or descriptions. Further research,
perhaps in the form of interviews, could be used to understand the challenges that pre-
service teachers have with establishing context.
Effective Explanations
Half of the pre-service teachers had difficulty selecting an appropriate strategy and
offering tips to solve a problem in their first video (pre-feedback). This finding
highlights the need for teacher education programs to spend more time explicitly
focusing on the connections between problem solving and making thinking explicit.
After making these connections, pre-service teachers can use that knowledge to provide
more effective explanations for their students. Pre-service teachers need to be cognizant
of the difference between solving and explaining problems.
Additionally, only 22% of pre-service teachers used visual aids to support their video-
based mathematical explanations. This result signifies a need for teacher education
programs to focus on how representations and visual aids might improve the quality of
explanations (Arcavi, 2003). After receiving direct feedback from the IVES, though, pre-
service teachers’ scores on providing visual aids increased significantly for the second
video (post-feedback). Directing attention to the value of visual aids can lead to short-
term changes.
Finally, only about one quarter of pre-service teachers were able to note potential errors in the
pre-feedback videos. This finding is in line with other research showing that pre-service
and novice teachers require explicit instruction in noticing (Mason, 2002) and
anticipating student misconceptions and errors (e.g., Lee & Francis, 2018; Son, 2013).
Again, the advanced mathematical knowledge that pre-service teachers have, especially
when solving relatively straightforward Grade 7 and 8 problems, undermines their
ability to identify potential errors because they do not make these errors and therefore
have difficulty anticipating them. Consequently, this research highlighted the continued
necessity of including mathematical error analysis to create effective explanations in
teacher education programs. Drawing attention to this weakness resulted in significant
improvements on this item in the second video (post-feedback).
Cognitive Load
For cognitive load, most pre-service teachers presented problems in a clear, organized
format. Half the pre-service teachers had legible writing and highlighted key points;
however, feedback based on the IVES scale did not significantly improve these qualities.
Future research endeavours could focus on why these two features are resistant to
change.
Writing down supportive elements such as the definition of terms or formulas proved
to be challenging for the pre-service teachers. Similar to establishing context, writing
down supportive elements is connected to pre-service teachers’ unpacking of their
automatized mathematical knowledge. Our finding aligns with recent calls by Krupa et al. (2017) for more research on supporting secondary teachers in noticing student
thinking. Pre-service teachers could improve their explanations by identifying,
highlighting, and writing down key elements to support student thinking. Making
pre-service teachers aware of this problem through feedback from the IVES resulted in significant increases in their listing of supportive elements.
Engagement
Overall, pre-service students scored highest on creating engaging mathematical
explanations for their first videos (pre-feedback). Many pre-service teachers limited
distracting behaviours and explained the mathematical examples at a pace that was
neither too fast nor too slow. Using a clear and engaging conversational tone was a little
more challenging for the students. Voice may be more critical in a video than in a face-to-face explanation, but a more personalized tone is likely more effective regardless of the environment (Kay, 2014). Changing voice may be one of the more challenging
skills to develop with pre-service teachers. Finally, about half the students struggled
with creating a video that was long enough to provide sufficient detail. The
automatization of knowledge might make pre-service teachers somewhat blind to the
level of detail required for a novice learner (Ball & Forzani, 2009). However, feedback
from the IVES led students to create significantly longer post-feedback videos. It is also
worthwhile noting that although no individual item in the engagement construct
improved significantly as a result of receiving feedback from the IVES, the overall
average construct score did increase significantly. Again, pre-service teachers appeared
to be responsive to direct, explicit feedback on their explanation skills.
Conclusions
Overall, the IVES appeared to be a useful metric for analyzing instructional
mathematics videos created by pre-service teachers and identifying potential
opportunities for improvement in explanatory competence. Feedback based on the IVES
led to significant improvements in particularly weak problem areas such as
providing visual supports (representations), noting potential errors that students could
make, and listing supportive elements. However, the majority of the 19 items assessed
by the IVES did not show a significant improvement after pre-service teachers received
feedback. Developing high-quality explanation skills takes time because pre-service
teachers have to unpack their automatized mathematical knowledge (Ball & Forzani,
2009), observe and understand naïve learners (Santagata & Bray, 2016), and notice
student misconceptions and thinking (Krupa et al., 2017; Lee & Francis, 2018; Son, 2013).
This study is a first step in systematically exploring the use of videos to improve the
quality of pre-service teachers’ mathematical explanations. However, the sample was
small, and the period for examining improvements in these explanations was relatively
short. Future research might explore the progression of pre-service teachers’
explanation skills over an entire semester, year, or program to identify the rate at which
specific skills develop. In addition, interviews would help identify the source and
progression of acquiring specific explanation skills identified by the IVES. Finally,
student ratings of explanations would help validate the criteria noted in the IVES and
possibly add new essential elements to aid the development of mathematics
explanation skills.
References
Arcavi, A. (2003). The role of visual representations in the learning of mathematics. Educational
Studies in Mathematics, 52(3), 215–241. https://doi.org/10.1023/A:1024312321077
Atkinson, R. K., Derry, S. J., Renkl, A., & Wortham, D. (2000). Learning from examples:
Instructional principles from the worked examples research. Review of Educational
Research, 70(2), 181–214. https://doi.org/10.3102/00346543070002181
Atkinson, R. K., Mayer, R. E., & Merrill, M. M. (2005). Fostering social agency in multimedia
learning: Examining the impact of an animated agent’s voice. Contemporary Educational
Psychology, 30(1), 117–139. https://doi.org/10.1016/j.cedpsych.2004.07.001
Ball, D. L., & Bass, H. (2000). Interweaving content and pedagogy in teaching and learning to
teach: Knowing and using mathematics. In J. Boaler (Ed.), Multiple perspectives on
mathematics of teaching and learning (pp. 83–104). Ablex Publishing.
Ball, D. L., & Forzani, F. M. (2009). The work of teaching and the challenge for teacher
education. Journal of Teacher Education, 60(5), 497–511.
https://doi.org/10.1177/0022487109348479
Bills, L., Dreyfus, T., Mason, J., Tsamir, P., Watson, A., & Zaslavsky, O. (2006). Exemplification
in mathematics education. In J. Novotná, H. Moraová, M. Krátká, & N. Stehlíková (Eds.),
Proceedings of the 30th Conference of the International Group for the Psychology of Mathematics
Education (Vol. 1, pp. 126–154). Charles University.
Bransford, J. D., Brown, A. L., & Cocking, R. R. (2000). How people learn: Brain, mind, experience,
and school (Expanded ed.). National Academy Press.
Clark, R. C., & Mayer, R. E. (2008). e-Learning and the science of instruction. Pfeiffer.
Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Academic Press.
Cohen, J. (1992). A power primer. Psychological Bulletin, 112(1), 155–159.
https://doi.org/10.1037/0033-2909.112.1.155
Crippen, K. J., & Earl, B. L. (2004). Considering the efficacy of web-based worked examples in
introductory chemistry. Journal of Computers in Mathematics and Science Teaching, 23(2),
151–167. https://www.learntechlib.org/primary/p/12876/
Giannakos, M. N., Jaccheri, L., & Krogstie, J. (2015). Exploring the relationship between video
lecture usage patterns and students’ attitudes. British Journal of Educational Technology,
47(6), 1259–1275. https://doi.org/10.1111/bjet.12313
Inoue, N. (2009). Rehearsing to teach: Content-specific deconstruction of instructional
explanations in pre-service teacher training. Journal of Education for Teaching, 35(1), 47–60.
https://doi.org/10.1080/02607470802587137
Kay, R. H. (2014). Developing a framework for creating effective instructional video
podcasts. International Journal of Emerging Technologies in Learning, 9(1), 22–30.
http://dx.doi.org/10.3991/ijet.v9i1.3335
Kester, L., Lehnen, C., Van Gerven, P. W., & Kirschner, P. A. (2006). Just-in-time, schematic
supportive information presentation during cognitive skill acquisition. Computers in
Human Behavior, 22(1), 93–112. https://doi.org/10.1016/j.chb.2005.01.008
Kirschner, P. A., Sweller, J., & Clark, R. E. (2006). Why minimal guidance during instruction
does not work: An analysis of the failure of constructivist, discovery, problem-based,
experiential, and inquiry-based teaching. Educational Psychologist, 41(2), 75–86.
https://doi.org/10.1207/s15326985ep4102_1
Kline, P. (1999). The handbook of psychological testing (2nd ed.). Routledge.
Krupa, E. E., Huey, M., Lesseig, K., Casey, S., & Monson, D. (2017). Investigating secondary
preservice teacher noticing of students’ mathematical thinking. In E. O. Schack, M. H.
Fisher, & J. Wilhelm (Eds.), Research in mathematics education (Vol. 6, pp. 49–72). Springer
International Publishing.
Lee, M. Y., & Francis, D. C. (2018). Investigating the relationships among elementary teachers’
perceptions of the use of students’ thinking, their professional noticing skills, and their
teaching practices. The Journal of Mathematical Behavior, 51, 118–128.
https://doi.org/10.1016/j.jmathb.2017.11.007
Leinhardt, G. (2001). Instructional explanations: A commonplace for teaching and location for
contrast. In V. Richardson (Ed.), Handbook of research on teaching (4th ed., pp. 333–357).
American Educational Research Association.
Ljubojevic, M., Vaskovic, V., Stankovic, S., & Vaskovic, J. (2014). Using supplementary video in
multimedia instruction as a teaching tool to increase efficiency of learning and quality of
experience. International Review of Research in Open and Distributed Learning, 15(3), 275–
291. https://doi.org/10.19173/irrodl.v15i3.1825
Loomes, M., Shafarenko, A., & Loomes, M. (2002). Teaching mathematical explanation through
audiographic technology. Computers & Education, 38(1–3), 137–149.
https://doi.org/10.1016/S0360-1315(01)00083-5
Long, T., Logan, J., & Waugh, M. (2016). Students’ perceptions of the value of using videos as a
pre-class learning experience in the flipped classroom. TechTrends, 60(3), 245–252.
http://dx.doi.org/10.1007/s11528-016-0045-4
Mason, J. (2002). Researching your own practice: The discipline of noticing. Routledge-Falmer.
Mason, J., Burton, L., & Stacey, K. (2010). Thinking mathematically (2nd ed.). Pearson.
MerchDope. (2020, February 26). 37 mind blowing YouTube facts, figures and statistics – 2020.
https://merchdope.com/youtube-stats
Nunnally, J. C. (1978). Psychometric theory. McGraw-Hill.
Ontario Ministry of Education. (2005). The Ontario curriculum, grades 1–8: Mathematics (revised).
Queen’s Printer for Ontario.
Polya, G. (2004). How to solve it: A new aspect of mathematical method (Expanded Princeton Science
Library ed.). Princeton University Press.
Renkl, A. (2005). The worked-out examples principle in multimedia learning. In R. E. Mayer
(Ed.), The Cambridge handbook of multimedia learning (pp. 229–245). Cambridge University
Press.
Rowland, T. (2008). The purpose, design and use of examples in the teaching of elementary
mathematics. Educational Studies in Mathematics, 69(2), 149–163.
https://doi.org/10.1007/s10649-008-9148-y
Sahin, A., Cavlazoglu, B., & Zeytuncu, Y. E. (2015). Flipping a college calculus course: A case
study. Educational Technology & Society, 18(3), 142–152.
https://www.jstor.org/stable/jeductechsoci.18.3.142
Santagata, R., & Bray, W. (2016). Professional development processes that promote teacher
change: The case of a video-based program focused on leveraging students’
mathematical errors. Professional Development in Education, 42(4), 547–568.
https://doi.org/10.1080/19415257.2015.1082076
Schopf, C., Raso, A., & Kahr, M. (2019). How to give effective explanations: Guidelines for
business education, discussion of their scope and their application to teaching
operations research. RISTAL, 2, 32–50. https://doi.org/10.23770/rt1823
Son, J.-W. (2013). How preservice teachers interpret and respond to student errors: Ratio and
proportion in similar rectangles. Educational Studies in Mathematics, 84(1), 49–70.
https://www.jstor.org/stable/43589772
Triantafyllou, E., & Timcenko, O. (2015). Student perceptions on learning with online resources in a
flipped mathematics classroom [Paper presentation]. 9th Congress of European Research in
Mathematics Education (CERME9), Prague, Czech Republic.
https://www.researchgate.net/publication/277016310_Student_perceptions_on_learning_
with_online_resources_in_a_flipped_mathematics_classroom
Willingham, D. T. (2009). Why don’t students like school? A cognitive scientist answers questions
about how the mind works and what it means for the classroom. John Wiley & Sons.
Wittwer, J., & Renkl, A. (2008). Why instructional explanations often do not work: A framework
for understanding the effectiveness of instructional explanations. Educational
Psychologist, 43(1), 49–64. https://doi.org/10.1080/00461520701756420
Authors
Dr. Robin Kay is currently a full professor and dean in the Faculty of Education at Ontario Tech
University in Oshawa, Ontario, Canada. He has published over 160 articles, chapters, and
conference papers in the area of technology in education, is a reviewer for five prominent
computer education journals, and has taught in the fields of computer science, mathematics,
and educational technology for over 25 years at the high school, college, undergraduate, and
graduate levels. Current projects include research on laptop use in higher education, BYOD in
K-12 education, web-based learning tools, e-learning and blended learning in secondary and
higher education, video podcasts, scale development, emotions and the use of computers, the
impact of social media tools in education, and factors that influence how students learn with
technology. Dr. Kay received his MA in Computer Applications in Education and his PhD in
Cognitive Science (Educational Psychology) at the University of Toronto. Email:
Robin.Kay@uoit.ca
Dr. Robyn Ruttenberg-Rozen is an assistant professor of STEAM education and graduate
program director in the Faculty of Education at Ontario Tech University in Oshawa, Ontario,
Canada. She explores pedagogical practices and current discourses in STEAM education around
typically underserved, linguistically and culturally diverse, and exceptional populations of
learners and their teachers. At the centre of her research is the study of change, innovation, and
access in pedagogical spaces (virtual and face-to-face) with a focus on strategies and
interventions. Her graduate and undergraduate teaching includes courses in integrated STEAM
learning, mathematics methods and content, qualitative research methods, curriculum theory,
and theories of learning. Email: Robyn.Ruttenberg-Rozen@uoit.ca