EXPLORING STUDENT PERCEPTIONS OF VIDEO FEEDBACK:
A REVIEW OF THE LITERATURE
T. Bahula, R. Kay
University of Ontario Institute of Technology (CANADA)
Abstract
Feedback is an integral component of learning and attempts to provide students with information
about a perceived gap between their performance and desired outcomes. The standard format,
particularly in higher education, is text-based feedback, despite significant advances in the ease of
recording and distributing video-based feedback in digital learning environments. While recent studies
have investigated the experimental use of video-based feedback, the perceptions of students who
have received video-based feedback are not well understood. The purpose of the current study, then,
was to conduct a systematic literature review of research on the use of video-based feedback in
higher education from 2009-2019. Sixty-seven peer-reviewed articles, selected from a systematic
search of electronic databases, were organized and examined through the lenses of Diffusion of
Innovation and Community of Inquiry theory. An area of research that emerged as common to many
studies was how students perceived the video feedback they received and video feedback in general.
Analysis of the literature revealed that students preferred this form of feedback over text-based
feedback. Students perceived video-based feedback positively, seeing it as more detailed, clearer,
and richer, noting that it improved higher-order thinking skills and prepared them for future work.
Video-based feedback also had a positive influence on their perceptions of cognitive and social
presence. When students perceived video-based feedback negatively, they cited accessibility
problems, the linear nature of feedback, and the evocation of negative emotions as adverse effects of
receiving video feedback. This paper concludes with some educational implications arising from the
perceptions of students and a discussion of research opportunities.
Keywords: video feedback; screencast feedback; assessment; higher education; systematic review.
1 INTRODUCTION
Feedback is an integral component of learning and involves communication about a gap between
actual performance and desired outcomes [1]. Narrowly construed, feedback provides a justification
for an assigned grade, in which case student engagement with the comments becomes perfunctory
[2], [3]. However, a broader conception is that feedback facilitates understanding and future
performance through dialogue among participants in learning communities [4]. As such, the provision
of feedback that engages students and encourages high-quality dialogue is one of the primary roles of
instructors in higher education [4]. Research has confirmed the importance of feedback. A synthesis of
over 500 meta-analyses identified feedback as one of the most critical factors in improving student
achievement [5]. However, the study also found that feedback had a high degree of variance in the
effect size, indicating that not all feedback had the same effect on learning [5]. Furthermore, some
feedback interventions had a negative effect [6], highlighting the need for educators to think carefully
about the quality and format of feedback.
One-on-one tutorial instruction is thought of as the gold standard of education [7]. Similarly, face-to-face
conferences appear to be one of the best methods for receiving feedback [8], [9] and are necessary to
clarify written feedback [10]. However, text-based feedback has become the norm in higher education.
Before using computers, instructors provided feedback as handwritten comments and codes on
students’ written submissions [11]. The practice of writing extensive corrections and comments with a
red pen has led to disappointment and discouragement [12]. The association of red ink with negative
emotions led to the recommendation that instructors use a neutral colour of ink for marking [13].
However, the limitations of handwritten markup went beyond the colour of the ink. Students found
much of the feedback they received to be unhelpful because the comments were not specific, lacked
guidance, focused on the negative, or did not align with the learning goals for the assessment [14],
[15].
With the advent of digital submissions, feedback shifted from a handwritten to a digital format with text
typed in the digital margins [9], [16]. This change to digital markup helped students overcome the
Proceedings of ICERI2020 Conference
9th-10th November 2020
ISBN: 978-84-09-24232-0
challenge of deciphering illegible scratches [2], [17], [18]. However, other problems remained,
including the lack of detail [19], the absence of pedagogical training for instructors [20], the difficulty
students encountered making connections between grades, feedback, and assessment criteria [17],
and the negative emotional responses that feedback can elicit [21].
Students expect feedback that is timely, personal, explicable, criteria-referenced, objective, and useful
for improvement in future work, according to a review of 37 empirical studies on assessment feedback
in higher education [22]. While higher quality feedback could address some of these expectations,
large class sizes and media constraints make meeting students’ expectations with text-based
feedback challenging. Instructors have experimented with other forms of media to provide feedback to
students as far back as the days of reel-to-reel tape [23]. More recently, instructors have used video-
based media, including screencast and webcam video, to provide feedback. The purpose of the
current study was to explore student perceptions by reviewing the literature about the use of video-
based feedback in higher education.
2 METHODOLOGY
2.1 Overview
We conducted a systematic literature review on the use of video-based feedback in higher education
using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) framework
[24]. The PRISMA process attempts to create a reproducible, comprehensive, and reliable overview of
a topic by identifying, screening, analyzing, and synthesizing primary research sources [25]. The
identification and screening phases were conducted iteratively and included establishing selection
criteria, testing search terms, and using those terms to search targeted databases. The search was
extended by searching high-quality educational journals directly and by scanning articles that met
eligibility requirements for additional sources. Articles that met the eligibility criteria were analyzed
through careful reading,
extracting characteristics, describing methodologies, and coding emergent themes. Results were
synthesized by aggregating quantitative data and configuring qualitative results [25]. The PRISMA
framework produced 67 peer-reviewed articles on the use of video-based feedback, of which 58
articles reported on student perceptions of the feedback received.
2.2 Data Analysis
To frame an understanding of the context of video-based feedback use, we collected and analyzed the
key characteristics of each article. Data items included the year of publication, country, academic
level, academic discipline, assessment type, media used, and length of feedback. Descriptive
frequency statistics were calculated for each item. Further, beginning with five highly relevant articles,
we discovered emerging themes by carefully reading the results and discussion sections and
recording key findings. We then employed a constant comparative method [26] to review and code the
remaining articles for consistency and alignment with emerging themes.
2.3 Context
The 58 articles reported in this study were published between 2009 and 2019, with a majority
published since 2014. Most studies occurred in the United States or the United Kingdom. The studies
primarily focused on undergraduate students, with a few focused on graduate students and pre-
service teachers. Video-based feedback, both formative and summative, was received in various
academic disciplines, including education, language learning, humanities, business, and STEM. The
principal medium used was screencast, followed by webcam videos of the instructor. The average
length of recorded feedback was seven minutes, with recordings ranging from two to 26 minutes.
3 RESULTS
The synthesis of the results of the 58 articles that reported on student perceptions of the video-based
feedback they received revealed positive perceptions of the quality of video-based feedback and of
the cognitive and social presence it produced, along with negative perceptions of the accessibility
problems encountered, the linearity limitations of video, and the negative emotions evoked.
Notwithstanding the negative perceptions, students generally preferred video-based feedback to
text-based feedback.
3.1 Positive Perceptions
3.1.1 High Quality
Multiple studies reported that students perceived the quality of video-based feedback to be better than
[27]–[31] or the same as [32], [33] text-based feedback. Three factors contributed to this perception.
The first factor was that students considered video-based feedback to include a higher degree of detail
[9], [29], [34]–[40]. The second factor was that students indicated that the content of video-based
feedback was clearer than text-based feedback [33], [38], [40]–[44]. The third factor was that students
appreciated the multimedia nature of video-based feedback because visuals and audio helped them
better understand the feedback [28], [41], [45]–[47]. Video-based feedback accomplished this, in part,
by extending the available palette to convey emphasis [34]. The visual aspect helped students
understand the feedback received [48]. For example, pharmacology students reported that the
synchronization of audio and video enabled them to understand calculations better [49]. Media arts
students considered screencast feedback to be the most appropriate type of feedback for the
predominantly visual assignments being assessed [38]. Finally, the audio component of screencast
feedback made parallel processing of feedback possible [50] and also reduced the anxiety of one
student with dyslexia [51].
Students also reported that the video-based feedback they received addressed higher-order thinking
[40], [45], [52], [53]. Video-based feedback addressed important issues, provided supporting
explanations and examples, and helped students to prioritize revisions [45]. Students reported that
screencast feedback addressed problems in their thesis, research questions, organization, and
supporting evidence [53].
In addition, students were of the opinion that the content of video-based feedback would feed-forward,
resulting in improvements to their future work [8], [27], [29], [31], [37], [52], [54]–[56]. However, in one
study, graduate communications students indicated that they rated the usefulness of screencast and
digital markup feedback equally [45].
3.1.2 Better Cognitive Presence
In a large number of studies (n = 24), students indicated that they found the content of video-based
feedback easy to understand [29], [31], [32], [34], [37], [40], [42], [50], [52]–[54], [56]–[64] and that it
consequently increased understanding [29], [31], [55], [65]. Two studies linked student understanding
explicitly to the perceived clarity of video-based feedback [40], [42]. Video-based feedback was
considered less prone to misunderstanding because of the visual and vocal cues [32], [61], [62].
The engagement of students in the feedback process is sought after in higher education [16], [18],
[66]. Students in numerous studies (n = 18) reported feeling more engaged when receiving video-
based feedback [28], [30], [34]–[37], [39], [41], [47], [55]–[57], [59], [60], [62], [67]–[69]. Students
reported that video-based feedback helped to increase their motivation [28], [57], [62], [67] and
engagement in the revision process [39], [41], [56], [57], [70]. In several studies, students mentioned
that the social presence they experienced motivated them to engage with the feedback [28], [39], [67].
Students manifested their engagement with video-based feedback in at least two ways. Students
reported [30], [39], [40], [53], [59], [60], [71], [72] or were observed [50] viewing feedback multiple
times. Students also reported spending more time reviewing feedback compared to their typical
pattern with text-based feedback [27], [29], [31], [63]. Students further reported talking about the
feedback with their peers [60] and paying as much attention to the feedback as to the grade awarded
[35], [47].
Learning was another way in which students reported their engagement in the feedback process. In
some studies, learning was evidenced by revising submissions through the application of the feedback
received [30], [47], [73]. Students reported that they incorporated more of the comments provided in
video-based feedback than in digital markup [30]. Students reported an increase in their
comprehension of the topic due to screencast feedback [68]. In other studies, however, learning was
vaguely defined and supporting evidence was lacking [28], [36], [68]. While students perceived
learning dividends, they could not articulate what those dividends were [28].
Although not a dominant theme across the literature, students perceived that their instructors were
more engaged when providing video-based feedback [42], [48].
3.1.3 Better Social Presence
Students also reported experiencing a sense of social presence when provided with video-based
feedback in nearly three-quarters of studies (n = 43, 74%). Students experienced social presence
even when instructors provided video-based feedback without any physical presence, such as in
asynchronous online courses. Many articles reported that students perceived video-based feedback to
be personal in a general sense (n = 29), while some reported positive perceptions of affective (n = 14),
cohesive (n = 15), and interactive (n = 11) expression. On the other hand, the results of a small
number of studies stand in contrast. Borup et al. [32], [74] found no significant difference in perception
of social presence between students who received video feedback and those who received digital
markup. Further, students (n = 37) in asynchronous online courses rated detailed text-based feedback
as significantly more effective at establishing social presence than video-based feedback [75].
Nonetheless, there was a general sense that video-based feedback was personal [9], [27], [29], [33],
[36], [37], [40], [42]–[48], [53]–[56], [59], [61], [63], [65], [67], [76], [77]. The perception of video-based
feedback as personal was reported with screencast, video, pencast, and VoiceThread feedback. Even
general video-based feedback, such as pencast worked solutions in mathematics sent to the whole
class, was perceived as personal [56]. Mayhew [37] noted that students considered screencasts with
embedded video more personal than a screencast alone.
Video-based feedback was perceived by students to be effective at conveying affective expression.
Students reported that videos revealed their instructors’ emotions more accurately [8], [74] and helped
them judge the instructor’s authenticity [74]. Students commented that hearing the instructor’s tone of
voice was an important factor [44], [47], [51], [52], [78], [79]. Hearing the instructor’s tone of voice
helped students perceive feedback as friendly [47], to understand the feedback [51], [52], to interpret it
more positively [78], and to see it as a constructive opportunity for improvement [44], [79].
Video-based feedback was considered by students to be effective at promoting group cohesion.
Students noted that this type of feedback helped them feel closer to their instructor [8], [28], [36], [47],
[52], [54], [61], [72], [74] and increased rapport [31], [47]. Students also perceived instructors to be
more caring [43], [48], [62], encouraging [43], [48], [70], supportive [32], [43], [64], and respectful [80].
Video-based feedback was considered by students to be effective at inviting interaction. Students
found screencast feedback to be interactive, although they recognized that communication was
unidirectional [34]. Students commented that video-based feedback felt conversational [8], [47], [53],
[59], [74], promoted dialogue [81], and encouraged open communication [30]. Some students
indicated that video-based feedback was like a face-to-face feedback meeting [36], [40], [51], [81].
However, a few students stated that video-based feedback created an expectation of a conversation
without providing the opportunity to have one [36], [52]. In the study of Borup et al. [32], students
indicated that video feedback inhibited further communication with their instructor because they felt
like they needed to reply with a video. Those students were less likely to respond to video feedback
because they lacked the confidence and technical proficiency to record a video.
3.2 Negative Perceptions
3.2.1 Accessibility Problems
Students perceived that video-based feedback was not easily accessible in some cases. Twenty
percent of students in one study admitted that they did not know how to access screencast feedback
despite being provided with written instructions and having access to student technology supports [47].
In another study, teacher education students preferred text-based feedback because
accessing it did not require headphones or a private location [32]. Radiography students on placement
reported that they could not access or hear video-based feedback because of technological limitations
found in some medical environments, including internet restrictions and the absence of speakers or
headphones [35]. Media arts students indicated that they appreciated text feedback because it could
be printed and did not require an electronic device to review [38]. International students and students
learning a new language were negatively affected by the pace of spoken feedback because of limited
listening skills [62]. Further, some students indicated that they experienced difficulty finding video files
in an LMS [52], slow downloads [38], media files that were incompatible with their device [41], [55],
and low audio quality [41], [52], [82]. However, the number of students who encountered these
complications in each of these studies was relatively low. One exception was a study that reported
considerable difficulties with the audio quality (69% of students) and playing Flash files (27% of
students) [41].
3.2.2 Linearity Limitations
Students in several studies reported on limitations they encountered because of the linear nature of
video-based feedback [32], [40], [41], [45], [47], [53], [81], [83]. Students noted that they experienced
difficulty in scanning the feedback [32], [45], [47]. This, in turn, made revising more difficult [53]
because students could not easily find specific feedback comments [32], [47]. As a result, they had to
review the videos repeatedly to act on the feedback [40], [81], [83] or take notes to keep track of the
oral comments [32], [81]. Further, students who received
feedback with only a video of their instructor talking reported that they found it difficult to match the
instructor’s comments to the appropriate location in their submission [43]. Even students who received
screencast feedback had difficulty matching oral comments to their submission when the instructor did
not also provide annotations on their submission [47].
3.2.3 Evocation of Negative Emotions
Seven studies noted that students experienced negative feelings when receiving video-based
feedback [35], [40]–[43], [45], [52]. Several studies did not detail the extent of these negative feelings;
however, those that provided details reported that more than 20% of students identified negative
feelings about video-based feedback on surveys [41] and in open-text responses [42], [43], [52].
Students expressed feeling anxiety [41], [43], nervousness [45], discomfort [40], [42], awkwardness
[52], and hesitancy to watch the feedback [35]. These findings seem to corroborate the thesis that
video-based feedback can be ‘scarily personal’ for some students [43].
3.3 Preference
In the end, notwithstanding the negative perceptions, the majority of studies reported a widespread
preference for video-based over text-based feedback. This preference was noted among students
across educational environments, including in face-to-face classroom courses [29]–[31], [34], [38]–[42],
[44], [46], [50], [54], [59], [73], [79], [82], [84], blended courses [43], [81], [83], synchronous online
courses [33], and asynchronous online courses [27], [36], [45], [75]. Generic video-based feedback
addressed to a group of students was also preferred to other forms of feedback [60], as was generic
pencast feedback illustrating solutions for mathematical calculations [56], [84]. McCarthy [38] found
that male respondents and respondents under the age of 25 were slightly more inclined to prefer
video-based feedback to text-based feedback. Henderson & Phillips [43] found no discernible
relationship between variables such as gender, degree level, or ESL ability and a preference for video
or text-based feedback. A couple of studies reported nuance in students’ preferences, with some
preferring video-based feedback for comments about higher-order concerns such as structure and
text-based feedback for lower-order concerns such as grammar, spelling, and punctuation corrections
[53], [85].
4 CONCLUSIONS
Six issues, three with positive valence and three with negative, surfaced in a systematic review of the
literature on students’ perceptions of receiving video-based feedback in higher education. On the
positive side, students perceived video-based feedback to be of high quality and to promote cognitive
and social presence. On the negative side, students encountered problems accessing and using
video-based feedback, perceived limitations due to its linear nature, and found that it evoked negative
emotions. Despite the problems, limitations, and anxiety induced, students preferred video-based
feedback in most studies.
Consequently, instructors desiring to provide richer feedback to students may find video-based
feedback to be a useful tool. Research findings from this study indicate that attention should be given
to simplifying access to video-based feedback, communicating instructions for access, and ensuring
adequate audio and video quality. Further, instructors can mitigate the limitations of the linear nature
of video by providing an annotated copy of the submission to students. Increasing access to and
accuracy of video transcription may further mitigate this limitation by providing students with an easier
way to scan or search oral comments. Finally, instructors should be aware that video-based feedback
may be perceived as too personal by some students and seek to craft feedback that is sensitive while
being educative. While exploring student perceptions has provided insight, a significant opportunity
exists for research that examines the content of video-based feedback and compares it to text-based
and other forms of feedback.
REFERENCES
[1] D. Carless, “Differing perceptions in the feedback process,” Studies in Higher Education, vol. 31, no. 2, pp. 219–233, 2006, doi: 10/cktfdf.
[2] M. Price, K. Handley, J. Millar, and B. O’Donovan, “Feedback: all that effort, but what is the effect?,” Assessment & Evaluation in Higher Education, vol. 35, no. 3, pp. 277–289, May 2010, doi: 10/drrnc3.
[3] A. M. Rae and D. K. Cochrane, “Listening to students: How to make written assessment feedback useful,” Active Learning in Higher Education, vol. 9, no. 3, pp. 217–230, 2008, doi: 10/dhjczz.
[4] C. Evans, “Making sense of assessment feedback in higher education,” Review of Educational Research, vol. 83, no. 1, pp. 70–120, 2013, doi: 10/gf82tm.
[5] J. Hattie and H. Timperley, “The power of feedback,” Review of Educational Research, vol. 77, no. 1, pp. 81–112, 2007, doi: 10/bf4d36.
[6] A. N. Kluger and A. DeNisi, “The effects of feedback interventions on performance: A historical review, a meta-analysis, and a preliminary feedback intervention theory,” Psychological Bulletin, vol. 119, no. 2, pp. 254–284, 1996, doi: 10/gtw.
[7] B. S. Bloom, “The 2 sigma problem: The search for methods of group instruction as effective as one-to-one tutoring,” Educational Researcher, vol. 13, no. 6, pp. 4–16, Jun. 1984, doi: 10/ddj5p7.
[8] C. M. Anson, D. P. Dannels, J. I. Laboy, and L. Carneiro, “Students’ perceptions of oral screencast responses to their writing: Exploring digitally mediated identities,” Journal of Business and Technical Communication, vol. 30, no. 3, pp. 378–411, Mar. 2016, doi: 10/gg57hm.
[9] T. Ryan, M. Henderson, and M. Phillips, “Feedback modes matter: Comparing student perceptions of digital and non-digital feedback modes in higher education,” British Journal of Educational Technology, vol. 50, no. 3, pp. 1507–1523, 2019, doi: 10/gg57hg.
[10] J. Sommers, “The effects of tape-recorded commentary on student revision: A case study,” Journal of Teaching Writing, vol. 8, no. 2, pp. 49–76, 1989.
[11] N. Sommers, “Responding to student writing,” College Composition and Communication, vol. 33, no. 2, pp. 148–156, 1982, doi: 10/cz9brj.
[12] H. D. Semke, “Effects of the red pen,” Foreign Language Annals, vol. 17, no. 3, pp. 195–202, 1984, doi: 10/fnqggc.
[13] R. L. Dukes and H. Albanesi, “Seeing red: Quality of an essay, color of the grading pen, and student reactions to the grading process,” The Social Science Journal, vol. 50, no. 1, pp. 96–100, Mar. 2013, doi: 10/f4r7rf.
[14] C. Glover and E. Brown, “Written feedback for students: too much, too detailed or too incomprehensible to be effective?,” Bioscience Education, vol. 7, no. 1, pp. 1–16, May 2006, doi: 10/gg57bp.
[15] M. R. Weaver, “Do students value feedback? Student perceptions of tutors’ written responses,” Assessment & Evaluation in Higher Education, vol. 31, no. 3, pp. 379–394, 2006, doi: 10/cjknpn.
[16] H. J. Parkin, S. Hepplestone, G. Holden, B. Irwin, and L. Thorpe, “A role for technology in enhancing students’ engagement with feedback,” Assessment & Evaluation in Higher Education, vol. 37, no. 8, pp. 963–973, 2012, doi: 10/d8njhq.
[17] I. Glover, H. J. Parkin, S. Hepplestone, B. Irwin, and H. Rodger, “Making connections: technological interventions to support students in using, and tutors in creating, assessment feedback,” Research in Learning Technology, vol. 23, no. 1, p. 27078, 2015, doi: 10.3402/rlt.v23.27078.
[18] S. Hepplestone, G. Holden, B. Irwin, H. J. Parkin, and L. Thorpe, “Using technology to encourage student engagement with feedback: a literature review,” Research in Learning Technology, vol. 19, no. 2, pp. 117–127, 2011, doi: 10/fx6rbz.
[19] E. Pitt and L. Norton, “‘Now that’s the feedback I want!’ Students’ reactions to feedback on graded work and what they do with it,” Assessment & Evaluation in Higher Education, vol. 42, no. 4, pp. 499–516, Jun. 2017, doi: 10/gdqbvq.
[20] K. Richards, T. Bell, and A. Dwyer, “Training sessional academic staff to provide quality feedback on university students’ assessment: Lessons from a faculty of law learning and teaching project,” The Journal of Continuing Higher Education, vol. 65, no. 1, pp. 25–34, Jan. 2017, doi: 10/gg57fr.
[21] S. Shields, “‘My work is bleeding’: exploring students’ emotional responses to first-year assignment feedback,” Teaching in Higher Education, vol. 20, no. 6, pp. 614–624, Aug. 2015, doi: 10/gf9k57.
[22] J. Li and R. De Luca, “Review of assessment feedback,” Studies in Higher Education, vol. 39, no. 2, pp. 378–393, Mar. 2014, doi: 10/gfxd5q.
[23] J. B. Killoran, “Reel-to-reel tapes, cassettes, and digital audio media: Reverberations from a half-century of recorded-audio response to student writing,” Computers and Composition, vol. 30, no. 1, pp. 37–49, 2013, doi: 10/gcpgwb.
[24] A. Liberati et al., “The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: Explanation and elaboration,” PLOS Medicine, vol. 6, no. 7, pp. 1–28, Jul. 2009, doi: 10/cw592j.
[25] D. Gough and J. Thomas, “Systematic reviews of research in education: Aims, myths and
multiple methods,” Review of Education, vol. 4, no. 1, pp. 84102, 2016, doi: 10/gg57hx.
[26] J. M. Corbin and A. L. Strauss, Basics of Qualitative Research: Techniques and Procedures for
Developing Grounded Theory, 3rd ed. Los Angeles, CA: Sage Publications, Inc, 2008.
[27] W. Alharbi, “E-feedback as a scaffolding teaching strategy in the online language classroom,”
Journal of Educational Technology Systems, vol. 46, no. 2, pp. 239251, Apr. 2017, doi:
10/gg57hr.
[28] P. Mathisen, “Video feedback in higher education: A contribution to improving the quality of
written feedback,” Nordic Journal of Digital Literacy, vol. 7, no. 2, pp. 97113, 2012, Retrieved
from https://www.idunn.no/dk/2012/02.
[29] W. Turner and J. West, “Assessment for ‘digital first language’ speakers: Online video
assessment and feedback in higher education.,” International Journal of Teaching & Learning in
Higher Education, vol. 25, no. 3, pp. 288296, Jul. 2013, Retrieved from
http://www.isetl.org/ijtlhe/past2.cfm?v=25&i=3.
[30] E. J. Vincelette and T. Bostic, “Show and tell: Student and instructor perceptions of screencast
assessment,” Assessing Writing, vol. 18, no. 4, pp. 257277, 2013, doi: 10/gcpgkz.
[31] J. West and W. Turner, “Enhancing the assessment experience: Improving student perceptions,
engagement and understanding using online video feedback,” Innovations in Education &
Teaching International, vol. 53, no. 4, pp. 400410, Aug. 2016, doi: 10/gg57h4.
[32] J. Borup, R. E. West, and R. A. Thomas, “The impact of text versus video communication on
instructor feedback in blended courses,” Educational Technology Research and Development,
vol. 63, no. 2, pp. 161184, Feb. 2015, doi: 10/f65vp5.
[33] A. Grigoryan, “Audiovisual commentary as a way to reduce transactional distance and increase
teaching presence in online writing instruction: Student perceptions and preferences,” Journal of
Response to Writing, vol. 3, no. 1, pp. 83–128, 2017, Retrieved from
https://journalrw.org/index.php/jrw/article/view/77.
[34] M. Ghosn-Chelala and W. Al-Chibani, “Screencasting: Supportive feedback for EFL remedial
writing students,” The International Journal of Information and Learning Technology, vol. 35, no.
3, pp. 146–159, 2018, doi: 10/gg57hb.
[35] E. Hyde, “Talking results: Trialing an audio-visual feedback method for e-submissions,”
Innovative Practice in Higher Education, vol. 1, no. 3, 2013, Retrieved from
http://journals.staffs.ac.uk/index.php/ipihe/article/view/37.
[36] K. Mathieson, “Exploring student perceptions of audiovisual feedback via screencasting in
online courses,” American Journal of Distance Education, vol. 26, no. 3, pp. 143–156, 2012, doi:
10/gg57f7.
[37] E. Mayhew, “Playback feedback: The impact of screen-captured video feedback on student
satisfaction, learning and attainment,” European Political Science, vol. 16, no. 2, pp. 179–192,
Jun. 2017, doi: 10/gg57hk.
[38] J. McCarthy, “Evaluating written, audio and video feedback in higher education summative
assessment tasks,” Issues in Educational Research, vol. 25, no. 2, pp. 153–169, May 2015,
Retrieved from http://www.learntechlib.org/p/161352.
[39] S. Özkul and D. Ortaçtepe, “The use of video feedback in teaching process-approach EFL
writing,” TESOL Journal, vol. 8, no. 4, pp. 862–877, Dec. 2017, doi: 10.1002/tesj.362.
[40] J. Sommers, “Response 2.0: Commentary on student writing for the new millennium,” Journal of
College Literacy & Learning, vol. 39, pp. 21–37, Jan. 2013, Retrieved from https://j-
cll.org/volume-39-2013.
[41] A. D. Ali, “Effectiveness of using screencast feedback on EFL students’ writing and perception,”
English Language Teaching, vol. 9, no. 8, pp. 106–121, Jun. 2016, doi: 10/gg57f4.
[42] T. Hall, D. Tracy, and A. Lamey, “Exploring video feedback in philosophy: Benefits for
instructors and students,” Teaching Philosophy, vol. 39, no. 2, pp. 137–162, Jun. 2016, doi:
10/f3p8fc.
[43] M. Henderson and M. Phillips, “Video-based feedback on student assessment: Scarily
personal,” Australasian Journal of Educational Technology, vol. 31, no. 1, pp. 51–66, Jan. 2015,
doi: 10.14742/ajet.1878.
[44] N. S. Moore and M. Filling, “iFeedback: Using video technology for improving student writing,”
Journal of College Literacy & Learning, vol. 38, pp. 3–14, Jan. 2012, Retrieved from https://j-
cll.org/volume-38-2012.
[45] K. Edwards, A.-F. Dujardin, and N. Williams, “Screencast feedback for essays on a distance
learning MA in professional communication,” Journal of Academic Writing, vol. 2, no. 1, pp.
95–126, 2012, doi: 10/gg57hp.
[46] P. Marriott and L. K. Teoh, “Using screencasts to enhance assessment feedback: Students’
perceptions and preferences,” Accounting Education, vol. 21, no. 6, pp. 583–598, Dec. 2012,
doi: 10/gg57hh.
[47] R. Thompson and M. J. Lee, “Talking with students through screencasting: Experimentations
with video feedback to improve student learning,” The Journal of Interactive Technology and
Pedagogy, vol. 1, no. 1, 2012, Retrieved from https://jitp.commons.gc.cuny.edu/2012/02/17/.
[48] I. G. Anson, “Assessment feedback using screencapture technology in political science,”
Journal of Political Science Education, vol. 11, no. 4, pp. 375–390, 2015, doi: 10/gg57fw.
[49] M. Flood, J. C. Hayden, B. Bourke, P. J. Gallagher, and S. Maher, “Design and evaluation of
video podcasts for providing online feedback on formative pharmaceutical calculations
assessments,” American Journal of Pharmaceutical Education, vol. 81, no. 10, pp. 100–103,
Dec. 2017, doi: 10/gcvnfg.
[50] K. J. Cunningham, “Student perceptions and use of technology-mediated text and screencast
feedback in ESL writing,” Computers and Composition, vol. 52, pp. 222–241, Jun. 2019, doi:
10/gg57fv.
[51] L. C. Bissell, “Screen-casting as a technology-enhanced feedback mode,” Journal of
Perspectives in Applied Academic Practice, vol. 5, no. 1, pp. 4–12, Jan. 2017, doi: 10/gg57fz.
[52] A. Lamey, “Video feedback in philosophy,” Metaphilosophy, vol. 46, no. 4–5, pp. 691–702,
2015, doi: 10/gg57f6.
[53] M. L. Silva, “Camtasia in the classroom: Student attitudes and preferences for video
commentary or Microsoft Word comments during the revision process,” Computers and
Composition, vol. 29, no. 1, pp. 1–22, Mar. 2012, doi: 10/gcpgt7.
[54] T. B. Crews and K. Wilkinson, “Students’ perceived preference for visual and auditory
assessment with e-handwritten feedback,” Business Communication Quarterly, vol. 73, no. 4,
pp. 399–412, Nov. 2010, doi: 10/bs5nrw.
[55] S. J. Deeley, “Using technology to facilitate effective assessment for learning and feedback in
higher education,” Assessment & Evaluation in Higher Education, vol. 43, no. 3, pp. 439–448,
Jul. 2017, doi: 10/gg57f3.
[56] M. Robinson, B. Loch, and T. Croft, “Student perceptions of screencast feedback on
mathematics assessment,” International Journal of Research in Undergraduate Mathematics
Education, vol. 1, no. 3, pp. 363–385, 2015, doi: 10/gg57hj.
[57] R. Alvira, “The impact of oral and written feedback on EFL writers with the use of screencasts,”
Profile: Issues in Teachers’ Professional Development, vol. 18, no. 2, pp. 79–92, 2016, doi:
10/gg57jb.
[58] S. Armağan, O. Bozoğlu, E. Güven, and K. Çelik, “Usage of video feedback in the course of
writing in EFL: Challenges and advantages,” IJSBAR, vol. 30, no. 2, pp. 95–102, Oct. 2016,
Retrieved from
http://gssrr.org/index.php?journal=JournalOfBasicAndApplied&page=article&op=view&path%5B%
5D=6459.
[59] D. Cranny, “Screencasting, a tool to facilitate engagement with formative feedback?,” AISHE-J:
The All Ireland Journal of Teaching and Learning in Higher Education, vol. 8, no. 3, pp.
2911–29127, Sep. 2016, Retrieved from http://ojs.aishe.org/index.php/aishe-j/article/view/291.
[60] A. Crook et al., “The use of video technology for providing feedback to students: Can it enhance
the feedback experience for staff and students?,” Computers & Education, vol. 58, no. 1, pp.
386–396, Jan. 2012, doi: 10.1016/j.compedu.2011.08.025.
[61] M. E. Griffiths and C. R. Graham, “Using asynchronous video to achieve instructor immediacy
and closeness in online classes: Experiences from three cases,” International Journal on E-
Learning, vol. 9, no. 3, pp. 325–340, Jul. 2010, Retrieved from
https://learntechlib.org/primary/p/30315/.
[62] V. Kim, “Technology-enhanced feedback on student writing in the English-medium instruction
classroom,” English Teaching, vol. 73, no. 4, pp. 29–53, Winter 2018, doi: 10/gg57hc.
[63] J. Orlando, “A comparison of text, voice, and screencasting feedback to online students,”
American Journal of Distance Education, vol. 30, no. 3, pp. 156–166, 2016, doi: 10/gg57hn.
[64] A. S. Walker, “I hear what you’re saying: The power of screencasts in peer-to-peer review,”
Journal of Writing Analytics, vol. 1, pp. 356–391, 2017, Retrieved from
https://www.journals.colostate.edu/index.php/analytics/article/view/108.
[65] S.-T. A. Hung, “Enhancing feedback provision through multimodal video technology,”
Computers & Education, vol. 98, pp. 90–101, Jul. 2016, doi: 10/f8mgxr.
[66] K. J. Cunningham, “How language choices in feedback change with technology: Engagement in
text and screencast feedback on ESL writing,” Computers & Education, vol. 135, pp. 91–99, Jul.
2019, doi: 10/gf9mk4.
[67] J. Borup, R. E. West, and C. R. Graham, “The influence of asynchronous video communication
on learner social presence: A narrative analysis of four cases,” Distance Education, vol. 34, no.
1, pp. 48–63, 2013, doi: 10/gg57hq.
[68] J. Griesbaum, “Feedback in learning: Screencasts as tools to support instructor feedback to
students and the issue of learning from feedback given to other learners,” International Journal
of Information and Education Technology, vol. 7, no. 9, pp. 694–699, 2017, doi:
10.18178/ijiet.2017.7.9.956.
[69] F. Soltanpour and M. Valizadeh, “The effect of individualized technology-mediated feedback on
EFL learners’ argumentative essays,” International Journal of Applied Linguistics and English
Literature, vol. 7, no. 3, pp. 125–136, 2018, doi: 10/gg57h7.
[70] S. Robinson, L. Centifanti, G. Brewer, and L. Holyoak, “The benefits of delivering formative
feedback via video-casts,” UCLan Journal of Pedagogic Research, vol. 6, no. 1, 2015,
Retrieved from http://pops.uclan.ac.uk/index.php/ujpr/article/view/326/0.
[71] F. Harper, H. Green, and M. Fernandez-Toro, “Using screencasts in the teaching of modern
languages: Investigating the use of Jing® in feedback on written assignments,” The Language
Learning Journal, vol. 46, no. 3, pp. 277–292, Aug. 2015, doi: 10/gg57f2.
[72] B. S. Parton, M. Crain-Dorough, and R. Hancock, “Using flip camcorders to create video
feedback: Is it realistic for professors and beneficial to students?,” International Journal of
Instructional Technology & Distance Learning, vol. 7, no. 1, pp. 15–23, 2010, Retrieved from
http://www.itdl.org/Journal/Jan_10/article02.htm.
[73] D. W. Denton, “Using screen capture feedback to improve academic performance,”
TechTrends, vol. 58, no. 6, pp. 51–56, 2014, doi: 10/gg57ft.
[74] J. Borup, R. E. West, R. A. Thomas, and C. R. Graham, “Examining the impact of video
feedback on instructor social presence in blended courses,” The International Review of
Research in Open and Distributed Learning, vol. 15, no. 3, pp. 232–256, 2014, doi: 10/gg57h2.
[75] P. R. Lowenthal and J. C. Dunlap, “Investigating students’ perceptions of instructional strategies
to establish social presence,” Distance Education, vol. 39, no. 3, pp. 281–298, 2018, doi:
10/gg57hd.
[76] F. Harper, H. Green, and M. Fernandez-Toro, “Evaluating the integration of Jing® screencasts
in feedback on written assignments,” 2012, pp. 1–7.
[77] N. Jones, P. Georghiades, and J. Gunson, “Student feedback via screen capture digital video:
Stimulating students’ modified action,” Higher Education, vol. 64, no. 5, pp. 593–607, Mar.
2012, doi: 10/gg57f8.
[78] B. Brereton and K. Dunne, “An analysis of the impact of formative peer assessment and
screencast tutor feedback on veterinary nursing students’ learning,” AISHE-J: The All Ireland
Journal of Teaching and Learning in Higher Education, vol. 8, no. 3, pp. 2941–29424, 2016,
Retrieved from http://ojs.aishe.org/aishe/index.php/aishe-j/article/view/294.
[79] P. J. O’Malley, “Combining screencasting and a tablet PC to deliver personalised student
feedback,” New Directions in the Teaching of Physical Sciences, no. 7, pp. 27–30, 2011, doi:
10/gg57fx.
[80] M. E. Griffiths and C. R. Graham, “Using asynchronous video in online classes: Results from a
pilot study,” International Journal of Instructional Technology & Distance Learning, vol. 6, no. 3,
pp. 65–76, Mar. 2009, Retrieved from http://www.itdl.org/Journal/Mar_09/Mar_09.pdf#page=69.
[81] M. Gonzalez and N. S. Moore, “Supporting graduate student writers with VoiceThread,” Journal
of Educational Technology Systems, vol. 46, no. 4, pp. 485–504, 2018, doi: 10/gg57hf.
[82] S. A. Hope, “Making movies: The next big thing in feedback?,” Bioscience Education, vol. 18,
no. 1, pp. 1–14, Dec. 2011, doi: 10/crrfhv.
[83] W. Schilling and J. K. Estell, “Enhancing student comprehension with video grading,” CoED
Journal, vol. 5, no. 1, pp. 28–39, 2014, Retrieved from http://asee-coed.org/index.php/coed/article/
view/Schilling_Enhancing.
[84] E. Letón, E. M. Molanes-López, M. Luque, and R. Conejo, “Video podcast and illustrated text
feedback in a web-based formative assessment environment,” Computer Applications in
Engineering Education, vol. 26, no. 2, pp. 187–202, Mar. 2018, doi: 10/gg57hw.
[85] I. Elola and A. Oskoz, “Supporting second language writing using multimodal feedback,”
Foreign Language Annals, vol. 49, no. 1, pp. 58–74, Feb. 2016, doi: 10/gg57f5.
6544
... It facilitates a dialogic exchange (Boud & Molloy 2013;Rowe 2017) or multidirectional exchange in which there are multiple channels of communication between instructor and student (Borup et al. 2015;Robinson et al. 2015;Mayhew, 2017;Lowenthal & Dunlap, 2018). However, though students perceived video feedback to feed-forward (Robinson et al. 2015;Bahula & Kay, 2020), leading to improvements in academic performance, several researchers posit that the feedback did not help students regulate their learning, develop better evaluative judgment (Mahoney et al. 2019), or even improve their performance (Turner & West, 2013). ...
... For some, the feedback was linear in nature (Borup et al. 2015;Henderson & Phillips, 2015), inaccessible (Ali, 2016;Deeley, 2017), evoked negative sentiments (Borup et al. 2014;Henderson and Phillips, 2015), and was time-consuming (Marriott and Teoh, 2012;Mathieson, 2012). Despite these criticisms, some students were found to perceive video feedback as having a positive impact on their learning (Bahula & Kay, 2020;McCarthy, 2015). Yet, few studies have investigated students' perceptions on video-based feedback (Alan Hung, 2016;Borup et al. 2015;Brown et al. 2016). ...
Article
Full-text available
This study examined student perceptions of screencast feedback and their learning behaviors following screencast feedback in an online graduate course. While there is widespread research on instructor feedback, there is far less literature focusing on video-based feedback and self-regulatory behavior within a Caribbean online learning environment. This case study addressed this gap by examining twelve management students' perceptions of screencast feedback on their online learning experience. Data were collected using visual documentation, student interviews and focus groups. The results suggest that students have positive perceptions of video-based feedback in adding value to the online learning experience. Emergent themes placed most value on the potential improved intimacy, communication, and timeliness of screencast feedback. The findings also corroborate preliminary research about the role video-based feedback plays in fostering self-regulated learning (SRL). This has implications for the design and development of instructor feedback to include video-based cues and feedback messages that promote SRL.
... In an extensive review of the literature exploring student perceptions on video feedback, Bahula and Kay (2020) revealed favorable attitudes towards video-based feedback among students. However, a noticeable absence from the almost 70 peerreviewed articles examined by Bahula and Kay's (2020) was the Japanese perspective. While some research has addressed the use of screencast technology in the Japanese EFL context (see Irwin, 2019;Lambacher, 1999), surprisingly none has researched student preferences for WCF or VSC feedback. ...
... While the earlier statements seemed to suggest that students perceived both forms of feedback nearly equally, here the majority of students demonstrated a clear preference for video feedback. This response also echoes previous research claiming that video feedback is preferred (Bahula & Kay, 2020;Bitchener & Ferris, 2012;Cunningham, 2017;Jones et al., 2012). ...
Article
One area English as a foreign language (EFL) teachers struggle with is providing written corrective feedback (CF) to their students. Additionally, written CF often comes in for criticism from both teachers and students. For example, students complain it lacks quality, is too brief, and often ambiguous. Similarly, teachers are frustrated that written CF is either ignored or misinterpreted by students as recurring mistakes are frequently observed. Recently, an alternative form of multimodal feedback, video screen capture (VSC) feedback, that has the potential to address these criticisms has slowly been gaining attention. Integrating the Zoom and Google Classroom platforms to deliver VSC feedback, this study examined 50 Japanese university students’ attitudes towards this feedback method. Specifically, students were asked to identify the perceived benefits of written and VSC feedback and state their preference. The results indicate that a slight majority of students preferred VSC feedback. EFL教員の主要な役割の一つに、学生への手書きの修正フィードバック(Corrective Feedback = CF)の提供が挙げられるが、残念ながら手書きのCFは、しばしば教員と学生の両方から批判を受ける。学生は手書きのCFが質と簡潔性に欠け、曖昧であることを不満に思う一方、教員は手書きのCFが、無視や勘違いをされて同じミスが繰り返されることに落胆する。近頃は代替案として、ビデオを用いたフィードバック(Video Screen Capture Feedback=VSCフィードバック)が、教員と学生の批判に対応する可能性を持つとして暫定的に注目を集めている。本研究はZoomとGoogle Classroomを用いて50人の日本人学生のVSCフィードバックに対する意見を調査した。具体的には、学生たちへ手書きとVSCフィードバックそれぞれの利点を識別し、どちらを好むか質問した。結果は若干多数の学生がVSCフィードバックを好むことが明らかになった。
... For example, research suggests that when utilizing video feedback, instructors are more likely to elaborate on specific details and notes helping to provide more constructive conceptual feedback (e. g., arguments, analysis, synthesis, judgements; Lamey, 2015;Mahoney et al., 2019;Parton et al., 2010;Thomas et al., 2017). As feedback can differ across disciplines, instructors, and institutions, it is not surprising that some research has found varied student perceptions of video feedback (see Bahula & Kay, 2020;Mahoney et al., 2019). Variations create challenges in synthesizing effective uses of video feedback. ...
Article
Feedback is an essential part of the learning process. Asynchronous online courses are marked by an abundance of text-based feedback. Yet, video feedback in asynchronous online courses is a nascent field of inquiry. This study investigated student perceptions of screencasting style of video feedback in online courses. During this course, students received video feedback from their instructor, and provided and received video feedback to their peers. A total of 84 graduate students completed an end-of-course survey between 2018 and 2020 that focused in part on student satisfaction and perceived learning with video feedback and overall perceptions of social presence. Results indicate students were satisfied with receiving video feedback, that video feedback contributed to their perceived learning, and that perceptions of social presence were comparable to previous research. Limitations and implications for practice are discussed.
... These aspects of social presence were evident in the artefacts of video-based feedback received by students. In contrast to the overwhelmingly positive perceptions of social presence in video-based feedback on the part of many instructors and students [35], one of the studies that investigated artefacts of video-based feedback found no significant difference in indicators [36], while three found positive results [37]- [39]. ...
Conference Paper
Full-text available
Feedback is essential for learning and identifies perceived gaps between students’ observed performance and desired outcomes. In higher education, feedback is often text-based, despite significant advances in the ease of recording and distributing video in digital learning environments. While recent studies have investigated student and instructor perceptions of video-based feedback, the characteristics of the videos created are only beginning to be understood. The purpose of this study was to conduct a systematic literature review (2009-2019) of research on the qualities of videos created to provide feedback to higher education students. Sixty-seven peer-reviewed articles on the use of video-based feedback, selected from a systematic search of electronic databases, were organized and examined. While most articles described the video feedback provided, only seven systematically researched the content of videos received by students. Analysis of the literature revealed that video-based feedback included more comments on thesis development, structure, and conceptual engagement. Language choices tended toward praise, growth, and relationship building. Further, the feedback was more conversational and featured more expanding language, fewer imperatives, and less proclaiming language. This paper concludes with recommendations for the provision of video-based feedback arising from the analysis of feedback artefacts and a discussion of research opportunities. Keywords: Video feedback, screencast feedback, assessment, higher education, systematic review.
Article
Research suggests that video can improve social presence in online courses. Video, though, is not a panacea; rather the success of video use depends in part on how and when it is used. Online instructors are increasingly using video in various ways, but questions remain on which types of videos students value most when it comes to establishing social presence. Given this, this mixed-methods sequential explanatory study explored student perceptions of three types of asynchronous video: video announcements, instructional videos, and video feedback. The results suggest that while video has the potential to improve social presence, it ultimately depends on both how the video is used in the online classroom as well as students’ individual preferences. Students in this study preferred instructional videos the most, followed by video feedback, and then video announcements. The paper provides implications for future research and practice.
Chapter
Das Zentrum für Lehrerbildung der Westfälischen Wilhelms-Universität Münster war vom 8. bis 10. September 2021 Gastgeber der Onlinetagung Diversität Digital Denken – The Wider View. Eine Vielzahl der Tagungsbeiträge ist in diesem Band dokumentiert. Nach der durch die SARS-CoV-2-Pandemie schnellen und teils überstürzten Digitalisierung wird die Frage nach nachhaltigen Chancen von Digitalisierung im Hinblick auf Diversität aktueller denn je. Auf der Tagung wurde diskutiert, wie Diversität an Schulen und Hochschulen mit Hilfe digitaler Methoden und Tools gewinnbringend begegnet werden kann – oder auch, wie Digitalisierung bei der Vorbereitung auf das diverse Klassenzimmer helfen kann. Der Band bietet zu dieser Fragestellung ein breites Spektrum an Theorie- und Praxisbeiträgen mit folgenden Schwerpunkten: - Diklusion als Entwicklungskonzept für Schule und Hochschule; - Barrierefreies Lernen für alle durch ganzheitliche digitale Ansätze; - Umgang mit Diversität im coronabedingten Distanzunterricht: Stärken und Schwächen; - Gestaltung eines diversitätssensiblen Fachunterrichts mit Hilfe digitaler Lernumgebungen und Lernplattformen; - Individuelle Förderung von Schüler*innen durch digitale Binnendifferenzierung; - Förderung sprachlicher Kompetenz von Schüler*innen durch digitale Hilfsmittel; - Digitale Lehrkonzepte in Hochschule / Ausbildung zum Thema Diversität sowie Digital vermittelte Förderung (fremd-) sprachlicher und selbstregulativer Kompetenzen.
Chapter
Das Zentrum für Lehrerbildung der Westfälischen Wilhelms-Universität Münster war vom 8. bis 10. September 2021 Gastgeber der Onlinetagung Diversität Digital Denken – The Wider View. Eine Vielzahl der Tagungsbeiträge ist in diesem Band dokumentiert. Nach der durch die SARS-CoV-2-Pandemie schnellen und teils überstürzten Digitalisierung wird die Frage nach nachhaltigen Chancen von Digitalisierung im Hinblick auf Diversität aktueller denn je. Auf der Tagung wurde diskutiert, wie Diversität an Schulen und Hochschulen mit Hilfe digitaler Methoden und Tools gewinnbringend begegnet werden kann – oder auch, wie Digitalisierung bei der Vorbereitung auf das diverse Klassenzimmer helfen kann. Der Band bietet zu dieser Fragestellung ein breites Spektrum an Theorie- und Praxisbeiträgen mit folgenden Schwerpunkten: - Diklusion als Entwicklungskonzept für Schule und Hochschule; - Barrierefreies Lernen für alle durch ganzheitliche digitale Ansätze; - Umgang mit Diversität im coronabedingten Distanzunterricht: Stärken und Schwächen; - Gestaltung eines diversitätssensiblen Fachunterrichts mit Hilfe digitaler Lernumgebungen und Lernplattformen; - Individuelle Förderung von Schüler*innen durch digitale Binnendifferenzierung; - Förderung sprachlicher Kompetenz von Schüler*innen durch digitale Hilfsmittel; - Digitale Lehrkonzepte in Hochschule / Ausbildung zum Thema Diversität sowie Digital vermittelte Förderung (fremd-) sprachlicher und selbstregulativer Kompetenzen.
Article
Full-text available
Computational thinking (CT) is considered an emerging competence domain linked to 21st-century competences, and educational robotics (ER) is increasingly recognised as a tool to develop CT competences. This is why researchers recommend developing intervention methods adapted to classroom practice and providing explicit guidelines to teachers on integrating ER activities. The present study thus addresses this challenge. Guidance and feedback were considered as critical intervention methods to foster CT competences in ER settings. A between-subjects experiment was conducted with 66 students aged 8 to 9 in the context of a remote collaborative robot programming mission, with four experimental conditions. A two-step strategy was employed to report students' CT competence (their performance and learning process). Firstly, the students' CT learning gains were measured through a pre-post-test design. Secondly, video analysis was used to identify the creative computational problem-solving patterns involved in the experimental condition that had the most favourable impact on the students’ CT scores. Results show that delayed feedback is an effective intervention method for CT development in ER activities. Subject to delayed feedback, students are better at formulating the robot behaviour to be programmed, and, thus, such a strategy reinforces the anticipation process underlying the CT.
Article
Full-text available
Highlights Students preferred video feedback for its ease of use, clarity and efficiency. Observations supported student perceptions of clarity & efficiency. Successful revision rates were similar across text & video feedback. Text feedback often needed clarification to be used, but video did not. Video feedback took less time to create than text feedback (2/3 the time). Abstract In an effort to expand understanding of the impact of technology choices in giving feedback, this exploratory study investigates the efficacy of screencast and text feedback given to 12 students over four assignments in an intermediate ESL writing course. Employing a series of six surveys in conjunction with screencast observations, draft comparisons, and a small group interview, it provides insight into student perceptions and use of technology-mediated screencast and text feedback. Results suggest that while students found utility in both screencast and text feedback, screencast video feedback was preferred for its efficiency, clarity, ease of use and heightened understanding. Observations supported these student assertions as students working with screencast feedback took less time to revise, remained in the target language and did not need to ask clarification questions, which was not the case with the text feedback. Successful changes were made at similar rates for both types of feedback with screencast resulting in a slightly, but not significantly, higher average percentage of successful global changes. To consider feasibility, the study also compared the length of time to create each feedback file, finding that video feedback offered a 33% time savings.
Article
Full-text available
An understanding of the impact of our technological choices in giving feedback has become a necessity for instructors. However, few studies have explored how technology choices might be influencing the nature and language of feedback. The present study investigates how the modes of video and text change the language used to give feedback and by doing so, shift its interpersonal aspects. The study employs engagement, from the appraisal framework, to investigate parallel collections of screencast and MS Word feedback from three English as a second language (ESL) writing instructors over four assignments in intact classes. This engagement analysis highlights how other voices are considered in the feedback and provides understanding of the position of the instructor and the role of the feedback itself and how they shift across modes. Text feedback was found to position the instructor as a single authority while video feedback better preserved student autonomy, offering feedback as suggestion and advice and positioning the instructor as one of many possible opinions. Understanding these differences can help instructors choose technology that will best support their pedagogical purposes.
Article
Full-text available
Assessment feedback is increasingly being provided in digital modes, from electronic annotations to digital recordings. Digitally recorded feedback is generally considered to be more detailed than text‐based feedback. However, few studies have compared digital recordings with other common feedback modes, including non‐digital forms such as face‐to‐face conversations. It is also unclear whether providing multiple feedback modes is better than a single mode. To explore these possibilities, an online survey asked 4514 Australian university students to rate the level of detail, personalisation and usability of the feedback comments they had most recently received. Of the students who received a single feedback mode only, electronic annotations and digital recordings were rated most highly on the three quality indicators. Students who received multiple modes were more likely to agree with all three indicators than those who received a single mode. Finally, students who received multiple modes were more likely to agree that the comments were detailed and usable when one of those modes was a digital recording. These findings enhance our understanding of feedback design, indicating that it is important to consider the strengths and weaknesses of particular modes, and the value of offering multiple modes.
Article
Full-text available
Kim, Victoria. (2018). Technology-enhanced feedback on student writing in the English-medium instruction classroom. English Teaching, 73(4), 29-53. High quality and timely assessment feedback is central to student learning in higher education; however, written feedback has many limitations. One of the innovative approaches to delivering feedback to EFL learners is individualized audiovisual feedback (AVF) using screencast technology. Previous research on AVF has been extensively descriptive and mostly focused on student preferences for feedback and evaluation of various screencast software. The present study employed a mixed-method design using pre-post writing tasks and pre-post questionnaires to investigate what particularly beneficial affordances this type of media-rich feedback might offer for writers in the English-Medium Instruction (EMI) classroom, to identify the effects of AVF on changes in learners' motivation, and to explore students' perceptions towards screencast feedback. The results suggest that AVF is positively received by EFL learners and that simultaneous visual cues and detailed explanations promote better understanding, engagement, and active listening. In addition, AVF significantly improves learners' writing performance and academic motivation. The paper concludes with practical implications and suggestions for further research.
Article
Full-text available
This quantitative quasi-experimental study, which followed a pretest-posttest-delayed posttest design, was aimed at investigating the effect of individualized technology-mediated feedback (henceforth, ITMF) on the overall quality of Iranian EFL learners’ argumentative essays. The effect of ITMF, as the experimental treatment, was compared with the common written corrective feedback (henceforth, CWCF) strategies as the control treatment. 57 learners, studying at general EFL courses at upper-intermediate level, formed the participants. They were assigned to two groups: ITMF and CWCF, which, in this study, is meant as the pen-and-paper form of direct and indirect feedback. Each group received six sessions of treatment. The writing tasks and tests were all of argumentative type. First, whether there was any significant difference between the ITMF and CWCF in the overall quality of the essays was investigated. The ITMF group significantly outperformed the CWCF one. Then, whether the difference between the groups varied over time was explored, and it was revealed that the ITMF was still significantly superior over the CWCF. Next, whether there would be any significant change in the ITMF in the long term was examined, and no change was seen. The study supports the advocates of screencasting feedback, revision and teacher-learner negotiation following the feedback.
Article
Social presence is a popular construct used to describe how people socially interact in online courses. Online educators continue to try different ways to establish and maintain social presence in online courses. However, research to date has not identified which strategies, or types of strategies, are best for establishing social presence. We investigated student perceptions of various strategies of establishing and maintaining social presence using a mixed methods case study approach in two different fully online courses. Results suggest that students are more interested in connecting with their instructor than their peers; different students like different social presence strategies; and students have different overall social presence needs. Various strategies and implications for practice are addressed throughout.
Article
Purpose: The purpose of this paper is to explore screencasting as a computer-mediated feedback approach for Arabic native (L1) speakers taking an English as a foreign language (EFL) college remedial writing class. Design/methodology/approach: This case study focused on an EFL remedial writing class consisting of eight Lebanese, Arabic L1 students at a private university in Lebanon. Students received screencast feedback through Jing® for one essay, intended to assist them with subsequent revision. The multimodal screencast videos included indirect corrections, annotations, and oral commentary guided by a rubric. Students then completed a survey on their perspectives of screencast feedback, and the instructor led an informal group discussion to allow further elaboration of students' responses. Findings: Students reported that screencasting's multimodality provided better engagement and support for their learning preferences. They also perceived screencast feedback to be clearer and more useful than traditional written feedback. Research limitations/implications: This study applied screencasting to address feedback challenges pertaining to clarity, learning preferences, and engagement. As this was a classroom case study, further research using a larger sample is recommended. Originality/value: The aim of research into computer-mediated human feedback is to address challenges such as increasing student engagement, improving clarity, and responding to students' preferences. Studies of screencast feedback have been few, particularly for EFL writing students. A survey of the literature indicates the need to explore contextualized classroom feedback case studies and approaches to enhance feedback. https://www.emeraldinsight.com/eprint/RUPVHC3DJ6Q9HCHW2IIC/full
Article
This qualitative case study examined the influence of VoiceThread technology on the feedback process for thesis writing in two online asynchronous graduate courses. The foci of the study were the influence on the instructors' feedback processes and graduate student writers' perceptions of the use of VoiceThread. Master's-level students (n = 18) in two different degree programs received and responded to multiple rounds of instructor feedback on their thesis papers via VoiceThread over one semester. Instructor and student comments on VoiceThread and an open-ended survey of students' experiences using VoiceThread in the course were analyzed. Findings show that VoiceThread promoted a two-way dialogue between instructors and students during the revision process, that students had a generally positive perception of the technology, and that instructors' feedback processes were affected in different ways by its use.
Article
This experimental study investigated the use of video feedback as an alternative to feedback with correction codes at an institution where the latter was commonly used for teaching process-approach English as a foreign language (EFL) writing. Over a 5-week period, the control group was provided with feedback based on comments and correction codes and the experimental group with video feedback, and the extent of feedback incorporation was analyzed through descriptive and inferential statistics. In addition, a questionnaire was administered to the experimental group to explore their perceptions of video feedback. The findings show that teacher feedback delivered in the form of videos is more effective than written feedback when EFL learners revise their written work in process writing. The study confirms that video feedback is more information-rich and, in turn, results in more corrections in learners' subsequent drafts. The findings imply that video feedback, because of its conferencing and multimodal features, is an effective method of providing EFL learners with teacher feedback and is therefore suitable for classroom practice and for future research.