EXPLORING STUDENT PERCEPTIONS OF VIDEO FEEDBACK:
A REVIEW OF THE LITERATURE
T. Bahula, R. Kay
University of Ontario Institute of Technology (CANADA)
Abstract
Feedback is an integral component of learning and attempts to provide students with information
about a perceived gap between their performance and desired outcomes. The standard format,
particularly in higher education, is text-based feedback, despite significant advances in the ease of
recording and distributing video-based feedback in digital learning environments. While recent studies
have investigated the experimental use of video-based feedback, the perceptions of students who
have received video-based feedback are not well understood. The purpose of the current study, then,
was to conduct a systematic literature review of research on the use of video-based feedback in
higher education from 2009-2019. Sixty-seven peer-reviewed articles, selected from a systematic
search of electronic databases, were organized and examined through the lenses of Diffusion of
Innovation and Community of Inquiry theory. An area of research that emerged as common to many
studies was how students perceived the video feedback they received and video feedback in general.
Analysis of the literature revealed that students preferred this form of feedback over text-based
feedback. Students perceived video-based feedback positively, seeing it as more detailed, clearer,
and richer, noting that it improved higher-order thinking skills and prepared them for future work.
Video-based feedback also had a positive influence on their perceptions of cognitive and social
presence. When students perceived video-based feedback negatively, they cited accessibility
problems, the linear nature of video, and the evocation of negative emotions as adverse effects of
receiving video feedback. This paper concludes with some educational implications arising from the
perceptions of students and a discussion of research opportunities.
Keywords: video feedback; screencast feedback; assessment; higher education; systematic review.
1 INTRODUCTION
Feedback is an integral component of learning and involves communication about a gap between
actual performance and desired outcomes [1]. Narrowly construed, feedback provides a justification
for an assigned grade, in which case student engagement with the comments becomes perfunctory
[2], [3]. However, a broader conception is that feedback facilitates understanding and future
performance through dialogue among participants in learning communities [4]. As such, the provision
of feedback that engages students and encourages high-quality dialogue is one of the primary roles of
instructors in higher education [4]. Research has confirmed the importance of feedback. A synthesis of
over 500 meta-analyses identified feedback as one of the most critical factors in improving student
achievement [5]. However, the synthesis also found a high degree of variance in feedback effect sizes, indicating that not all feedback had the same effect on learning [5]. Furthermore, some
feedback interventions had a negative effect [6], highlighting the need for educators to think carefully
about the quality and format of feedback.
One-on-one tutorial instruction is thought of as the “gold standard” of education [7]. Similarly, face-to-face conferences appear to be one of the best methods of receiving feedback [8], [9] and are often necessary to clarify written feedback [10]. However, text-based feedback has become the norm in higher education.
Before computers were widely used, instructors provided feedback as handwritten comments and codes on students’ written submissions [11]. The practice of writing extensive corrections and comments in red pen led to disappointment and discouragement [12]. The association of red ink with negative
emotions led to the recommendation that instructors use a neutral colour of ink for marking [13].
However, the limitations of handwritten markup went beyond the colour of the ink. Students found
much of the feedback they received to be unhelpful because the comments were not specific, lacked
guidance, focused on the negative, or did not align with the learning goals for the assessment [14],
[15].
With the advent of digital submissions, feedback shifted from a handwritten to a digital format with text
typed in the digital margins [9], [16]. This change to digital markup helped students overcome the
challenge of deciphering illegible scratches [2], [17], [18]. However, other problems remained,
including the lack of detail [19], the absence of pedagogical training for instructors [20], the difficulty
students encountered making connections between grades, feedback, and assessment criteria [17],
and the negative emotional responses that feedback can elicit [21].
According to a review of 37 empirical studies on assessment feedback in higher education, students expect feedback that is timely, personal, explicable, criteria-referenced, objective, and useful for improving future work [22]. While higher-quality feedback could address some of these expectations,
large class sizes and media constraints make meeting students’ expectations with text-based
feedback challenging. Instructors have experimented with other forms of media to provide feedback to
students as far back as the days of reel-to-reel tape [23]. More recently, instructors have used video-
based media, including screencast and webcam video, to provide feedback. The purpose of the
current study was to explore student perceptions by reviewing the literature about the use of video-
based feedback in higher education.
2 METHODOLOGY
2.1 Overview
We conducted a systematic literature review on the use of video-based feedback in higher education
using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) framework
[24]. The PRISMA process attempts to create a reproducible, comprehensive, and reliable overview of
a topic by identifying, screening, analyzing, and synthesizing primary research sources [25]. The
identification and screening phases were conducted iteratively and included establishing selection
criteria, testing search terms, and using those terms to search targeted databases. The search was
extended to high-quality educational journals and supplemented by scanning eligible articles for additional sources. Articles that met the eligibility criteria were analyzed through careful reading,
extracting characteristics, describing methodologies, and coding emergent themes. Results were
synthesized by aggregating quantitative data and configuring qualitative results [25]. The PRISMA
process yielded 67 peer-reviewed articles on the use of video-based feedback, of which 58 reported on student perceptions of the feedback received.
2.2 Data Analysis
To frame an understanding of the context of video-based feedback use, we collected and analyzed the
key characteristics of each article. Data items included the year of publication, country, academic
level, academic discipline, assessment type, media used, and length of feedback. Descriptive
frequency statistics were calculated for each item. Further, beginning with five highly relevant articles, we identified emergent themes by carefully reading the results and discussion sections and recording key findings. We then employed a constant comparative method [26] to review and code the remaining articles for consistency and alignment with those themes.
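As a concrete illustration of the descriptive frequency tabulation described above, the following minimal Python sketch counts coded characteristics across article records. The field names and sample records are hypothetical and do not reproduce the review's actual dataset.

from collections import Counter

# Each reviewed article is coded with characteristics such as year of
# publication, country, and feedback medium (hypothetical sample records).
articles = [
    {"year": 2015, "country": "United States", "media": "screencast"},
    {"year": 2016, "country": "United Kingdom", "media": "webcam"},
    {"year": 2017, "country": "United States", "media": "screencast"},
]

def frequency(records, field):
    # Count how often each value of a coded field occurs across articles.
    return Counter(record[field] for record in records)

for field in ("year", "country", "media"):
    print(field, dict(frequency(articles, field)))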
2.3 Context
The 58 articles included in this review were published between 2009 and 2019, with a majority
published since 2014. Most studies occurred in the United States or the United Kingdom. The studies
primarily focused on undergraduate students, with a few focused on graduate students and pre-
service teachers. Video-based feedback, both formative and summative, was used in various academic disciplines, including education, language learning, humanities, business, and STEM. The
principal medium used was screencast, followed by webcam videos of the instructor. The average length of recorded feedback was seven minutes, with recordings ranging from two to 26 minutes.
3 RESULTS
The synthesis of the 58 articles that reported on student perceptions of video-based feedback revealed positive perceptions about its quality and the cognitive and social presence it produced, alongside negative perceptions about accessibility problems, the linearity limitations of video, and the negative emotions it evoked. Notwithstanding the negative perceptions, students generally preferred video-based feedback to text-based feedback.
3.1 Positive Perceptions
3.1.1 High Quality
Multiple studies reported that students perceived the quality of video-based feedback to be better than
[27]–[31] or the same as [32], [33] text-based feedback. Three factors contributed to this perception.
The first factor was that students considered video-based feedback to include a higher degree of detail
[9], [29], [34]–[40]. The second factor was that students found the content of video-based feedback clearer than that of text-based feedback [33], [38], [40]–[44]. The third factor was that students
appreciated the multimedia nature of video-based feedback because visuals and audio helped them
better understand the feedback [28], [41], [45]–[47]. Video-based feedback accomplished this, in part,
by extending the available palette to convey emphasis [34]. The visual aspect helped students
understand the feedback received [48]. For example, pharmacology students reported that the
synchronization of audio and video enabled them to understand calculations better [49]. Media arts
students considered screencast feedback to be the most appropriate type of feedback for the
predominantly visual assignments being assessed [38]. Finally, the audio component of screencast
feedback made parallel processing of feedback possible [50] and also reduced the anxiety of one
student with dyslexia [51].
Students also reported that the video-based feedback they received addressed higher-order thinking
[40], [45], [52], [53]. Video-based feedback addressed important issues, provided supporting explanations and examples, and helped students prioritize revisions [45]. Students reported that
screencast feedback addressed problems in their thesis, research questions, organization, and
supporting evidence [53].
In addition, students believed that the content of video-based feedback would feed forward, resulting in improvements to their future work [8], [27], [29], [31], [37], [52], [54]–[56]. However, in one study, graduate communications students rated the usefulness of screencast and digital markup feedback equally [45].
3.1.2 Better Cognitive Presence
In a large number of studies (n = 24), students indicated that they found the content of video-based
feedback easy to understand [29], [31], [32], [34], [37], [40], [42], [50], [52]–[54], [56]–[64] and that it
consequently increased understanding [29], [31], [55], [65]. Two studies linked student understanding
explicitly to the perceived clarity of video-based feedback [40], [42]. Video-based feedback was
considered less prone to misunderstanding because of the visual and vocal cues [32], [61], [62].
The engagement of students in the feedback process is sought after in higher education [16], [18],
[66]. Students in numerous studies (n = 18) reported feeling more engaged when receiving video-
based feedback [28], [30], [34]–[37], [39], [41], [47], [55]–[57], [59], [60], [62], [67]–[69]. Students
reported that video-based feedback helped to increase their motivation [28], [57], [62], [67] and
engagement in the revision process [39], [41], [56], [57], [70]. In several studies, students mentioned
that the social presence they experienced motivated them to engage with the feedback [28], [39], [67].
Students manifested their engagement with video-based feedback in at least two ways. Students
reported [30], [39], [40], [53], [59], [60], [71], [72] or were observed [50] viewing feedback multiple
times. Students also reported spending more time reviewing feedback compared to their typical
pattern with text-based feedback [27], [29], [31], [63]. They also reported discussing the feedback with their peers [60] and paying as much attention to the feedback as to the grade awarded [35], [47].
Learning was another way in which students reported their engagement in the feedback process. In
some studies, learning was evidenced by revising submissions through the application of the feedback
received [30], [47], [73]. Students reported that they incorporated more of the comments provided in
video-based feedback than in digital markup [30]. Students reported an increase in their
comprehension of the topic due to screencast feedback [68]. In other studies, however, learning was vaguely defined and evidence of it was lacking [28], [36], [68]; students perceived learning dividends but could not articulate what those dividends were [28].
Although not a dominant theme across the literature, students perceived that their instructors were
more engaged when providing video-based feedback [42], [48].
3.1.3 Better Social Presence
Students also reported experiencing a sense of social presence when provided with video-based
feedback in nearly three-quarters of studies (n = 43, 74%). Students experienced social presence
even when instructors provided video-based feedback without any physical presence, such as in
asynchronous online courses. Many articles reported that students perceived video-based feedback to
be personal in a general sense (n = 29), while some reported positive perceptions of affective (n = 14),
cohesive (n = 15), and interactive (n = 11) expression. However, a small number of studies reported contrasting results. Borup et al. [32], [74] found no significant difference in perception
of social presence between students who received video feedback and those who received digital
markup. Further, students (n = 37) in asynchronous online courses rated detailed text-based feedback
as significantly more effective at establishing social presence than video-based feedback [75].
Nonetheless, there was a general sense that video-based feedback was personal [9], [27], [29], [33],
[36], [37], [40], [42]–[48], [53]–[56], [59], [61], [63], [65], [67], [76], [77]. The perception of video-based
feedback as personal was reported with screencast, video, pencast, and VoiceThread feedback. Even
general video-based feedback, such as pencast worked solutions in mathematics sent to the whole
class, was perceived as personal [56]. Mayhew [37] noted that students considered screencasts with
embedded video more personal than a screencast alone.
Video-based feedback was perceived by students to be effective at conveying affective expression.
Students reported that videos revealed their instructors’ emotions more accurately [8], [74] and helped
them judge the instructor’s authenticity [74]. Students commented that hearing the instructor’s tone of voice was an important factor [44], [47], [51], [52], [78], [79]: it helped them perceive feedback as friendly [47], understand the feedback [51], [52], interpret it more positively [78], and see it as a constructive opportunity for improvement [44], [79].
Video-based feedback was considered by students to be effective at promoting group cohesion.
Students noted that this type of feedback helped them feel closer to their instructor [8], [28], [36], [47],
[52], [54], [61], [72], [74] and increased rapport [31], [47]. Students also perceived instructors to be
more caring [43], [48], [62], encouraging [43], [48], [70], supportive [32], [43], [64], and respectful [80].
Video-based feedback was considered by students to be effective at inviting interaction. Students
found screencast feedback to be interactive, although they recognized that communication was
unidirectional [34]. Students commented that video-based feedback felt conversational [8], [47], [53],
[59], [74], promoted dialogue [81], and encouraged open communication [30]. Some students
indicated that video-based feedback was like a face-to-face feedback meeting [36], [40], [51], [81].
However, a few students stated that video-based feedback created an expectation of a conversation
without providing the opportunity to have one [36], [52]. In the study of Borup et al. [32], students
indicated that video feedback inhibited further communication with their instructor because they felt
like they needed to reply with a video. Those students were less likely to respond to video feedback
because they lacked the confidence and technical proficiency to record a video.
3.2 Negative Perceptions
3.2.1 Accessibility Problems
Students perceived that video-based feedback was not easily accessible in some cases. Twenty
percent of students in one study admitted that they did not know how to access screencast feedback
despite being provided with written instructions and access to student technology support [47]. In another study, teacher education students preferred text-based feedback because
accessing it did not require headphones or a private location [32]. Radiography students on placement
reported that they could not access or hear video-based feedback because of technological limitations
found in some medical environments, including internet restrictions and the absence of speakers or
headphones [35]. Media arts students indicated that they appreciated text feedback because it could
be printed and did not require an electronic device to review [38]. International students and students
learning a new language were negatively affected by the pace of spoken feedback because of limited
listening skills [62]. Further, some students reported problems such as difficulty finding video files in the LMS [52], slow downloads [38], media files incompatible with their devices [41], [55], and low audio quality [41], [52], [82]. However, the number of students who encountered these
complications in each of these studies was relatively low. One exception was a study that reported
considerable difficulties with the audio quality (69% of students) and playing Flash files (27% of
students) [41].
3.2.2 Linearity Limitations
Students in several studies reported on limitations they encountered because of the linear nature of
video-based feedback [32], [40], [41], [45], [47], [53], [81], [83]. Students noted that they found it difficult to scan the feedback [32], [45], [47], which in turn made revision harder [53] because they could not easily locate specific comments [32], [47]. As a result, they had to review the videos repeatedly to act on the feedback [40], [81], [83] or needed
to take notes to keep track of the oral feedback comments [32], [81]. Further, students who received
feedback with only a video of their instructor talking reported that they found it difficult to match the
instructor’s comments to the appropriate location in their submission [43]. Even students who received
screencast feedback had difficulty matching oral comments to their submission when the instructor did
not also provide annotations on their submission [47].
3.2.3 Evocation of Negative Emotions
Seven studies noted that students experienced negative feelings when receiving video-based
feedback [35], [40]–[43], [45], [52]. Several studies did not detail the extent of these negative feelings;
however, those that provided details reported that more than 20% of students identified negative
feelings about video-based feedback on surveys [41] and in open-text responses [42], [43], [52].
Students expressed feeling anxiety [41], [43], nervousness [45], discomfort [40], [42], awkwardness
[52], and hesitancy to watch the feedback [35]. These findings seem to corroborate the thesis that
video-based feedback can be “scarily personal” for some students [43].
3.3 Preference
In the end, notwithstanding the negative perceptions, the majority of studies reported a widespread
preference for video-based over text-based feedback. This preference was noted among students
across educational environments, including in face-to-face classroom courses [29]–[31], [34], [38]–
[42], [44], [46], [50], [54], [59], [73], [79], [82], [84], blended courses [43], [81], [83], synchronous online
courses [33], and asynchronous online courses [27], [36], [45], [75]. Generic video-based feedback
addressed to a group of students was also preferred to other forms of feedback [60], as was generic
pencast feedback illustrating solutions for mathematical calculations [56], [84]. McCarthy [38] found
that male respondents and respondents under the age of 25 were slightly more inclined to prefer
video-based feedback to text-based feedback. Henderson & Phillips [43] found no discernible
relationship between variables such as gender, degree level, or ESL ability and a preference for video
or text-based feedback. A couple of studies reported nuance in students’ preferences, with some
preferring video-based feedback for comments about higher-order concerns such as structure and
text-based feedback for lower-order concerns such as grammar, spelling, and punctuation corrections
[53], [85].
4 CONCLUSIONS
Six issues, three with positive valence and three with negative, surfaced in this systematic review of the
literature on students’ perceptions of receiving video-based feedback in higher education. On the
positive side, students perceived video-based feedback to be of high quality and to promote cognitive
and social presence. On the negative side, students encountered problems accessing and using
video-based feedback, perceived limitations due to its linear nature, and found that it evoked negative
emotions. Despite the problems, limitations, and anxiety induced, students preferred video-based
feedback in most studies.
Consequently, instructors desiring to provide richer feedback to students may find video-based
feedback to be a useful tool. Research findings from this study indicate that attention should be given
to simplifying access to video-based feedback, communicating instructions for access, and ensuring
adequate audio and video quality. Further, instructors can mitigate the limitations of the linear nature
of video by providing an annotated copy of the submission to students. Increasing access to and
accuracy of video transcription may further mitigate this limitation by providing students with an easier
way to scan or search oral comments. Finally, instructors should be aware that video-based feedback
may be perceived as too personal by some students and seek to craft feedback that is sensitive while
being educative. While exploring student perceptions has provided insight, a significant opportunity
exists for research that examines the content of video-based feedback and compares it to text-based
and other forms of feedback.
REFERENCES
[1] D. Carless, “Differing perceptions in the feedback process,” Studies in Higher Education, vol.
31, no. 2, pp. 219–233, 2006, doi: 10/cktfdf.
[2] M. Price, K. Handley, J. Millar, and B. O’Donovan, “Feedback: all that effort, but what is the
effect?,” Assessment & Evaluation in Higher Education, vol. 35, no. 3, pp. 277–289, May 2010,
doi: 10/drrnc3.
[3] A. M. Rae and D. K. Cochrane, “Listening to students: How to make written assessment
feedback useful,” Active Learning in Higher Education, vol. 9, no. 3, pp. 217–230, 2008, doi:
10/dhjczz.
[4] C. Evans, “Making sense of assessment feedback in higher education,” Review of Educational
Research, vol. 83, no. 1, pp. 70–120, 2013, doi: 10/gf82tm.
[5] J. Hattie and H. Timperley, “The power of feedback,” Review of Educational Research, vol. 77,
no. 1, pp. 81–112, 2007, doi: 10/bf4d36.
[6] A. N. Kluger and A. DeNisi, “The effects of feedback interventions on performance: A historical
review, a meta-analysis, and a preliminary feedback intervention theory.,” Psychological
Bulletin, vol. 119, no. 2, pp. 254–284, 1996, doi: 10/gtw.
[7] B. S. Bloom, “The 2 sigma problem: The search for methods of group instruction as effective as
one-to-one tutoring,” Educational Researcher, vol. 13, no. 6, pp. 4–16, Jun. 1984, doi:
10/ddj5p7.
[8] C. M. Anson, D. P. Dannels, J. I. Laboy, and L. Carneiro, “Students’ perceptions of oral
screencast responses to their writing: Exploring digitally mediated identities,” Journal of
Business and Technical Communication, vol. 30, no. 3, pp. 378–411, Mar. 2016, doi:
10/gg57hm.
[9] T. Ryan, M. Henderson, and M. Phillips, “Feedback modes matter: Comparing student
perceptions of digital and non-digital feedback modes in higher education,” British Journal of
Educational Technology, vol. 50, no. 3, pp. 1507–1523, 2019, doi: 10/gg57hg.
[10] J. Sommers, “The effects of tape-recorded commentary on student revision: A case study,”
Journal of Teaching Writing, vol. 8, no. 2, pp. 49–76, 1989.
[11] N. Sommers, “Responding to student writing,” College Composition and Communication, vol.
33, no. 2, pp. 148–156, 1982, doi: 10/cz9brj.
[12] H. D. Semke, “Effects of the red pen,” Foreign Language Annals, vol. 17, no. 3, pp. 195–202,
1984, doi: 10/fnqggc.
[13] R. L. Dukes and H. Albanesi, “Seeing red: Quality of an essay, color of the grading pen, and
student reactions to the grading process,” The Social Science Journal, vol. 50, no. 1, pp. 96–
100, Mar. 2013, doi: 10/f4r7rf.
[14] C. Glover and E. Brown, “Written feedback for students: too much, too detailed or too
incomprehensible to be effective?,” Bioscience Education, vol. 7, no. 1, pp. 1–16, May 2006,
doi: 10/gg57bp.
[15] M. R. Weaver, “Do students value feedback? Student perceptions of tutors’ written responses,”
Assessment & Evaluation in Higher Education, vol. 31, no. 3, pp. 379–394, 2006, doi:
10/cjknpn.
[16] H. J. Parkin, S. Hepplestone, G. Holden, B. Irwin, and L. Thorpe, “A role for technology in
enhancing students’ engagement with feedback,” Assessment & Evaluation in Higher
Education, vol. 37, no. 8, pp. 963–973, 2012, doi: 10/d8njhq.
[17] I. Glover, H. J. Parkin, S. Hepplestone, B. Irwin, and H. Rodger, “Making connections:
technological interventions to support students in using, and tutors in creating, assessment
feedback,” Research in Learning Technology, vol. 23, no. 1, p. 27078, 2015, doi:
10.3402/rlt.v23.27078.
[18] S. Hepplestone, G. Holden, B. Irwin, H. J. Parkin, and L. Thorpe, “Using technology to
encourage student engagement with feedback: a literature review,” Research in Learning
Technology, vol. 19, no. 2, pp. 117–127, 2011, doi: 10/fx6rbz.
[19] E. Pitt and L. Norton, “‘Now that’s the feedback I want!’ Students’ reactions to feedback on
graded work and what they do with it.,” Assessment & Evaluation in Higher Education, vol. 42,
no. 4, pp. 499–516, Jun. 2017, doi: 10/gdqbvq.
[20] K. Richards, T. Bell, and A. Dwyer, “Training sessional academic staff to provide quality
feedback on university students’ assessment: Lessons from a faculty of law learning and
teaching project,” The Journal of Continuing Higher Education, vol. 65, no. 1, pp. 25–34, Jan.
2017, doi: 10/gg57fr.
[21] S. Shields, “‘My work is bleeding’: exploring students’ emotional responses to first-year
assignment feedback,” Teaching in Higher Education, vol. 20, no. 6, pp. 614–624, Aug. 2015,
doi: 10/gf9k57.
[22] J. Li and R. De Luca, “Review of assessment feedback.,” Studies in Higher Education, vol. 39,
no. 2, pp. 378–393, Mar. 2014, doi: 10/gfxd5q.
[23] J. B. Killoran, “Reel-to-reel tapes, cassettes, and digital audio media: Reverberations from a
half-century of recorded-audio response to student writing,” Computers and Composition, vol.
30, no. 1, pp. 37–49, 2013, doi: 10/gcpgwb.
[24] A. Liberati et al., “The PRISMA statement for reporting systematic reviews and meta-analyses
of studies that evaluate health care interventions: Explanation and elaboration,” PLOS
Medicine, vol. 6, no. 7, pp. 1–28, Jul. 2009, doi: 10/cw592j.
[25] D. Gough and J. Thomas, “Systematic reviews of research in education: Aims, myths and
multiple methods,” Review of Education, vol. 4, no. 1, pp. 84–102, 2016, doi: 10/gg57hx.
[26] J. M. Corbin and A. L. Strauss, Basics of Qualitative Research: Techniques and Procedures for
Developing Grounded Theory, 3rd ed. Los Angeles, CA: Sage Publications, Inc, 2008.
[27] W. Alharbi, “E-feedback as a scaffolding teaching strategy in the online language classroom,”
Journal of Educational Technology Systems, vol. 46, no. 2, pp. 239–251, Apr. 2017, doi:
10/gg57hr.
[28] P. Mathisen, “Video feedback in higher education: A contribution to improving the quality of
written feedback,” Nordic Journal of Digital Literacy, vol. 7, no. 2, pp. 97–113, 2012, Retrieved
from https://www.idunn.no/dk/2012/02.
[29] W. Turner and J. West, “Assessment for ‘digital first language’ speakers: Online video
assessment and feedback in higher education.,” International Journal of Teaching & Learning in
Higher Education, vol. 25, no. 3, pp. 288–296, Jul. 2013, Retrieved from
http://www.isetl.org/ijtlhe/past2.cfm?v=25&i=3.
[30] E. J. Vincelette and T. Bostic, “Show and tell: Student and instructor perceptions of screencast
assessment,” Assessing Writing, vol. 18, no. 4, pp. 257–277, 2013, doi: 10/gcpgkz.
[31] J. West and W. Turner, “Enhancing the assessment experience: Improving student perceptions,
engagement and understanding using online video feedback,” Innovations in Education &
Teaching International, vol. 53, no. 4, pp. 400–410, Aug. 2016, doi: 10/gg57h4.
[32] J. Borup, R. E. West, and R. A. Thomas, “The impact of text versus video communication on
instructor feedback in blended courses,” Educational Technology Research and Development,
vol. 63, no. 2, pp. 161–184, Feb. 2015, doi: 10/f65vp5.
[33] A. Grigoryan, “Audiovisual commentary as a way to reduce transactional distance and increase
teaching presence in online writing instruction: Student perceptions and preferences,” Journal of
Response to Writing, vol. 3, no. 1, pp. 83–128, 2017, Retrieved from
https://journalrw.org/index.php/jrw/article/view/77.
[34] M. Ghosn-Chelala and W. Al-Chibani, “Screencasting: Supportive feedback for EFL remedial
writing students,” The International Journal of Information and Learning Technology, vol. 35, no.
3, pp. 146–159, 2018, doi: 10/gg57hb.
[35] E. Hyde, “Talking results: Trialing an audio-visual feedback method for e-submissions,”
Innovative Practice in Higher Education, vol. 1, no. 3, 2013, Retrieved from
http://journals.staffs.ac.uk/index.php/ipihe/article/view/37.
[36] K. Mathieson, “Exploring student perceptions of audiovisual feedback via screencasting in
online courses,” American Journal of Distance Education, vol. 26, no. 3, pp. 143–156, 2012, doi:
10/gg57f7.
[37] E. Mayhew, “Playback feedback: The impact of screen-captured video feedback on student
satisfaction, learning and attainment,” European Political Science, vol. 16, no. 2, pp. 179–192,
Jun. 2017, doi: 10/gg57hk.
[38] J. McCarthy, “Evaluating written, audio and video feedback in higher education summative
assessment tasks,” Issues in Educational Research, vol. 25, no. 2, pp. 153–169, May 2015,
Retrieved from http://www.learntechlib.org/p/161352.
[39] S. Özkul and D. Ortaçtepe, “The use of video feedback in teaching process-approach EFL
writing,” TESOL Journal, vol. 8, no. 4, pp. 862–877, Dec. 2017, doi: 10.1002/tesj.362.
[40] J. Sommers, “Response 2.0: Commentary on student writing for the new millennium,” Journal of
College Literacy & Learning, vol. 39, pp. 21–37, Jan. 2013, Retrieved from https://j-
cll.org/volume-39-2013.
[41] A. D. Ali, “Effectiveness of using screencast feedback on EFL students’ writing and perception,”
English Language Teaching, vol. 9, no. 8, pp. 106–121, Jun. 2016, doi: 10/gg57f4.
[42] T. Hall, D. Tracy, and A. Lamey, “Exploring video feedback in philosophy: Benefits for
instructors and students,” Teaching Philosophy, vol. 39, no. 2, pp. 137–162, Jun. 2016, doi:
10/f3p8fc.
[43] M. Henderson and M. Phillips, “Video-based feedback on student assessment: Scarily
personal,” Australasian Journal of Educational Technology, vol. 31, no. 1, pp. 51–66, Jan. 2015,
doi: 10.14742/ajet.1878.
[44] N. S. Moore and M. Filling, “iFeedback: Using video technology for improving student writing,”
Journal of College Literacy & Learning, vol. 38, pp. 3–14, Jan. 2012, Retrieved from https://j-cll.org/volume-38-2012.
[45] K. Edwards, A.-F. Dujardin, and N. Williams, “Screencast feedback for essays on a distance
learning MA in professional communication,” Journal of Academic Writing, vol. 2, no. 1, pp. 95–
126, 2012, doi: 10/gg57hp.
[46] P. Marriott and L. K. Teoh, “Using screencasts to enhance assessment feedback: Students’
perceptions and preferences,” Accounting Education, vol. 21, no. 6, pp. 583–598, Dec. 2012,
doi: 10/gg57hh.
[47] R. Thompson and M. J. Lee, “Talking with students through screencasting: Experimentations
with video feedback to improve student learning,” The Journal of Interactive Technology and
Pedagogy, vol. 1, no. 1, 2012, Retrieved from https://jitp.commons.gc.cuny.edu/2012/02/17/.
[48] I. G. Anson, “Assessment feedback using screencapture technology in political science,”
Journal of Political Science Education, vol. 11, no. 4, pp. 375–390, 2015, doi: 10/gg57fw.
[49] M. Flood, J. C. Hayden, B. Bourke, P. J. Gallagher, and S. Maher, “Design and evaluation of
video podcasts for providing online feedback on formative pharmaceutical calculations
assessments,” American Journal of Pharmaceutical Education, vol. 81, no. 10, pp. 100–103,
Dec. 2017, doi: 10/gcvnfg.
[50] K. J. Cunningham, “Student perceptions and use of technology-mediated text and screencast
feedback in ESL writing,” Computers and Composition, vol. 52, pp. 222–241, Jun. 2019, doi:
10/gg57fv.
[51] L. C. Bissell, “Screen-casting as a technology-enhanced feedback mode,” Journal of
Perspectives in Applied Academic Practice, vol. 5, no. 1, pp. 4–12, Jan. 2017, doi: 10/gg57fz.
[52] A. Lamey, “Video feedback in philosophy,” Metaphilosophy, vol. 46, no. 4–5, pp. 691–702,
2015, doi: 10/gg57f6.
[53] M. L. Silva, “Camtasia in the classroom: Student attitudes and preferences for video
commentary or Microsoft Word comments during the revision process,” Computers and
Composition, vol. 29, no. 1, pp. 1–22, Mar. 2012, doi: 10/gcpgt7.
[54] T. B. Crews and K. Wilkinson, “Students’ perceived preference for visual and auditory
assessment with e-handwritten feedback,” Business Communication Quarterly, vol. 73, no. 4,
pp. 399–412, Nov. 2010, doi: 10/bs5nrw.
[55] S. J. Deeley, “Using technology to facilitate effective assessment for learning and feedback in
higher education,” Assessment & Evaluation in Higher Education, vol. 43, no. 3, pp. 439–448,
Jul. 2017, doi: 10/gg57f3.
[56] M. Robinson, B. Loch, and T. Croft, “Student perceptions of screencast feedback on
mathematics assessment,” International Journal of Research in Undergraduate Mathematics
Education, vol. 1, no. 3, pp. 363–385, 2015, doi: 10/gg57hj.
[57] R. Alvira, “The impact of oral and written feedback on EFL writers with the use of screencasts,”
Profile: Issues in Teachers’ Professional Development, vol. 18, no. 2, pp. 79–92, 2016, doi:
10/gg57jb.
[58] S. Armağan, O. Bozoğlu, E. Güven, and K. Çelik, “Usage of video feedback in the course of
writing in EFL: Challenges and advantages,” IJSBAR, vol. 30, no. 2, pp. 95–102, Oct. 2016,
Retrieved from http://gssrr.org/index.php?journal=JournalOfBasicAndApplied&page=article&op=view&path%5B%5D=6459.
[59] D. Cranny, “Screencasting, a tool to facilitate engagement with formative feedback?,” AISHE-J:
The All Ireland Journal of Teaching and Learning in Higher Education, vol. 8, no. 3, pp. 2911–
29127, Sep. 2016, Retrieved from http://ojs.aishe.org/index.php/aishe-j/article/view/291.
[60] A. Crook et al., “The use of video technology for providing feedback to students: Can it enhance
the feedback experience for staff and students?,” Computers & Education, vol. 58, no. 1, pp.
386–396, Jan. 2012, doi: 10.1016/j.compedu.2011.08.025.
[61] M. E. Griffiths and C. R. Graham, “Using asynchronous video to achieve instructor immediacy
and closeness in online classes: Experiences from three cases,” International Journal on E-Learning, vol. 9, no. 3, pp. 325–340, Jul. 2010, Retrieved from
https://learntechlib.org/primary/p/30315/.
[62] V. Kim, “Technology-enhanced feedback on student writing in the English-medium instruction
classroom,” English Teaching, vol. 73, no. 4, pp. 29–53, Winter 2018, doi: 10/gg57hc.
[63] J. Orlando, “A comparison of text, voice, and screencasting feedback to online students,”
American Journal of Distance Education, vol. 30, no. 3, pp. 156–166, 2016, doi: 10/gg57hn.
[64] A. S. Walker, “I hear what you’re saying: The power of screencasts in peer-to-peer review,”
Journal of Writing Analytics, vol. 1, pp. 356–391, 2017, Retrieved from
https://www.journals.colostate.edu/index.php/analytics/article/view/108.
[65] S.-T. A. Hung, “Enhancing feedback provision through multimodal video technology,”
Computers & Education, vol. 98, pp. 90–101, Jul. 2016, doi: 10/f8mgxr.
[66] K. J. Cunningham, “How language choices in feedback change with technology: Engagement in
text and screencast feedback on ESL writing,” Computers & Education, vol. 135, pp. 91–99, Jul.
2019, doi: 10/gf9mk4.
[67] J. Borup, R. E. West, and C. R. Graham, “The influence of asynchronous video communication
on learner social presence: A narrative analysis of four cases,” Distance Education, vol. 34, no.
1, pp. 48–63, 2013, doi: 10/gg57hq.
[68] J. Griesbaum, “Feedback in learning: Screencasts as tools to support instructor feedback to
students and the issue of learning from feedback given to other learners,” International Journal
of Information and Education Technology, vol. 7, no. 9, pp. 694–699, 2017, doi:
10.18178/ijiet.2017.7.9.956.
[69] F. Soltanpour and M. Valizadeh, “The effect of individualized technology-mediated feedback on
EFL learners’ argumentative essays,” International Journal of Applied Linguistics and English
Literature, vol. 7, no. 3, pp. 125–136, 2018, doi: 10/gg57h7.
[70] S. Robinson, L. Centifanti, G. Brewer, and L. Holyoak, “The benefits of delivering formative
feedback via video-casts,” UCLan Journal of Pedagogic Research, vol. 6, no. 1, 2015,
Retrieved from http://pops.uclan.ac.uk/index.php/ujpr/article/view/326/0.
[71] F. Harper, H. Green, and M. Fernandez-Toro, “Using screencasts in the teaching of modern
languages: Investigating the use of Jing® in feedback on written assignments,” The Language
Learning Journal, vol. 46, no. 3, pp. 277–292, Aug. 2015, doi: 10/gg57f2.
[72] B. S. Parton, M. Crain-Dorough, and R. Hancock, “Using flip camcorders to create video
feedback: Is it realistic for professors and beneficial to students,” International Journal of
Instructional Technology & Distance Learning, vol. 7, no. 1, pp. 15–23, 2010, Retrieved from
http://www.itdl.org/Journal/Jan_10/article02.htm.
[73] D. W. Denton, “Using screen capture feedback to improve academic performance,”
TechTrends, vol. 58, no. 6, pp. 51–56, 2014, doi: 10/gg57ft.
[74] J. Borup, R. E. West, R. A. Thomas, and C. R. Graham, “Examining the impact of video
feedback on instructor social presence in blended courses,” The International Review of
Research in Open and Distributed Learning, vol. 15, no. 3, pp. 232–256, 2014, doi: 10/gg57h2.
[75] P. R. Lowenthal and J. C. Dunlap, “Investigating students’ perceptions of instructional strategies
to establish social presence,” Distance Education, vol. 39, no. 3, pp. 281–298, 2018, doi:
10/gg57hd.
[76] F. Harper, H. Green, and M. Fernandez-Toro, “Evaluating the integration of Jing® screencasts
in feedback on written assignments,” 2012, pp. 1–7.
[77] N. Jones, P. Georghiades, and J. Gunson, “Student feedback via screen capture digital video:
Stimulating student’s modified action,” Higher Education, vol. 64, no. 5, pp. 593–607, Mar.
2012, doi: 10/gg57f8.
[78] B. Brereton and K. Dunne, “An analysis of the impact of formative peer assessment and
screencast tutor feedback on veterinary nursing students’ learning,” AISHE-J: The All Ireland
Journal of Teaching and Learning in Higher Education, vol. 8, no. 3, pp. 2941–29424, 2016,
Retrieved from http://ojs.aishe.org/aishe/index.php/aishe-j/article/view/294.
[79] P. J. O’Malley, “Combining screencasting and a tablet PC to deliver personalised student
feedback,” New Directions in the Teaching of Physical Sciences, no. 7, pp. 27–30, 2011, doi:
10/gg57fx.
[80] M. E. Griffiths and C. R. Graham, “Using asynchronous video in online classes: Results from a
pilot study,” International Journal of Instructional Technology & Distance Learning, vol. 6, no. 3,
pp. 65–76, Mar. 2009, Retrieved from http://www.itdl.org/Journal/Mar_09/Mar_09.pdf#page=69.
[81] M. Gonzalez and N. S. Moore, “Supporting graduate student writers with VoiceThread,” Journal
of Educational Technology Systems, vol. 46, no. 4, pp. 485–504, 2018, doi: 10/gg57hf.
[82] S. A. Hope, “Making movies: The next big thing in feedback?,” Bioscience Education, vol. 18,
no. 1, pp. 1–14, Dec. 2011, doi: 10/crrfhv.
[83] W. Schilling and J. K. Estell, “Enhancing student comprehension with video grading,” CoED
Journal, vol. 5, no. 1, pp. 28–39, 2014, Retrieved from http://asee-coed.org/index.php/coed/article/view/Schilling_Enhancing.
[84] E. Letón, E. M. Molanes-López, M. Luque, and R. Conejo, “Video podcast and illustrated text
feedback in a web-based formative assessment environment,” Computer Applications in
Engineering Education, vol. 26, no. 2, pp. 187–202, Mar. 2018, doi: 10/gg57hw.
[85] I. Elola and A. Oskoz, “Supporting second language writing using multimodal feedback,”
Foreign Language Annals, vol. 49, no. 1, pp. 58–74, Feb. 2016, doi: 10/gg57f5.