USING ASYNCHRONOUS AUDIO FEEDBACK TO
ENHANCE TEACHING PRESENCE AND
STUDENTS’ SENSE OF COMMUNITY
Philip Ice
Department of Middle, Secondary and K–12 Education
College of Education, University of North Carolina Charlotte
Reagan Curtis
Department of Technology, Learning, and Culture
College of Human Resources and Education, West Virginia University
Perry Phillips
Department of Curriculum & Instruction / Literacy Studies
College of Human Resources and Education, West Virginia University
John Wells
Department of Teaching and Learning
School of Education, Virginia Polytechnic Institute and State University (Virginia Tech)
ABSTRACT
This paper reports the findings of a case study in which audio feedback replaced text-based feedback in
asynchronous courses. Previous research has demonstrated that participants in online courses can build
effective learning communities through text-based communication alone. Similarly, it has been
demonstrated that instructors for online courses can adequately project immediacy behaviors using text-
based communication. However, we believed that the inclusion of an auditory element might strengthen
both the sense of community and the instructor’s ability to effect more personalized communication with
students. Over the course of one semester, students in this study received a mixture of asynchronous audio
and text-based feedback. Our findings revealed extremely high student satisfaction with embedded
asynchronous audio feedback as compared to asynchronous text-only feedback. Four themes accounting
for this preference emerged from an iterative, inductive analysis of interview data: 1. Audio feedback was
perceived to be more effective than text-based feedback for conveying nuance; 2. Audio feedback was
associated with feelings of increased involvement and enhanced learning community interactions; 3.
Audio feedback was associated with increased retention of content; and 4. Audio feedback was associated
with the perception that the instructor cared more about the student. Document analysis revealed that
students were three times more likely to apply, in class projects, content for which audio commenting was
provided than content for which text-based commenting was provided. Audio commenting was also found
to significantly increase the level at which students applied such content. Implications of this case study
and directions for future research are addressed in the discussion and conclusions section of this paper.
KEYWORDS
Online Learning, Personalized Communication, Student Satisfaction, Embedded Asynchronous Audio
Feedback, Nuance, Retention of Content, Instructor Caring
I. INTRODUCTION
As the number of online courses continues to expand, so must the ways in which instructors engage in
active facilitation of learning among their students. This study focuses on one aspect of facilitation, the
way in which we communicate and guide students in asynchronous learning networks (ALN) and how
this process might be improved upon.
While the evolution of ALN has made it increasingly easy to involve remotely based students in two-
way communication [1] and to enable students to process more complex information [2], instructors are
often required to adapt to new roles [3]. While several frameworks have been developed to explain the
role of the instructor [3, 4, 5], a system first proposed by Berge [6] and later refined by others [7]
proposes a four-part model consisting of pedagogical, social, technical, and managerial dimensions, each
with a varying number of roles. For purposes of this study, the social dimension and three roles
(profession-inspirer, feedback-giver, and interaction-facilitator) within the pedagogical dimension are
considered the most important. These are depicted in the following table which was derived from work by
Liu and colleagues [2].
Dimension     Role                     Description of Role
------------  -----------------------  ---------------------------------------------------------------
Pedagogical   Profession-inspirer      Promote professional dialogue among online learners; relate personal experiences and cases to the discipline; point to professional organizations.
              Feedback-giver           Provide timely and high quality feedback; provide formative feedback for continuous learning engagement.
              Interaction-facilitator  Facilitate peer interaction in online discussion through a wide range of facilitation strategies.
Social        Social rapport builder   Build social rapport; establish online teams; build online learning community.

Table 1. Select Roles of Online Instructors
In the traditional face-to-face classroom setting, each of these roles would be dependent upon both verbal
and non-verbal cues. In the online environment, however, the primary form of communication is via text
and therefore devoid of traditional paralinguistic cues [2]. Arbaugh [8] suggests that the relatively low
richness of text-based communication may make interdependent, ambiguous tasks particularly
challenging.
Critics of online learning, pointing to the low richness of text-based communication, contend that
because interactions occur in disembodied form, the lack of nuance leads to a loss of meaning [9, 10,
11, 12]. As such, it is argued that asynchronous learning is not sufficiently rich in the socially mediated
practice that Vygotsky [13] described as necessary to construct knowledge. However, this narrow
interpretation of Vygotsky discounts the ability of learners to conceptualize “being” as anything other
than a physical construct.
The ability to project oneself through various media, termed social presence, was initially described by
Short, Williams and Christie [14] who proposed that, as critics of asynchronous learning contend, the
ability to project verbal and nonverbal information directly impacted the degree to which presence was
perceived. However, Rourke, Anderson, Garrison and Archer [15] and Swan [16] argued that this may not
be the case as learners in online courses appeared to build effective learning communities by projecting
their personalities through text alone.
Lombard and Ditton [17] viewed this creation of a presence in online courses as the ability to project
oneself into a virtual environment. In an extension of this concept, Laffey, Lin and Lin [18] described the social
element of asynchronous communication evolving as learners come to view their interactions with tasks
and tools as being a fluid, integrated process rather than as a series of tasks. They compared this process
to a speaker interacting with others in a foreign language. The more fluent the speaker becomes with the
new language, the less difficult interactions become. Theoretically, this would mean that the technologies
become part of the interaction itself and are therefore not viewed as objects upon which learners have to
act to create virtual embodiments [19].
Gunawardena and Zittle [20] found that the sense of “being there” was established in the online
environment through providing and interpreting emoticons as a replacement for nuance and nonverbal
cues. Using a 14-item questionnaire, they found 60% of the variance in student satisfaction was
attributable to perceptions of social comfort and presence. Rovai [21] explained that this type of
satisfaction can occur when text-based, socio-emotionally driven interactions promote a sense of
connectedness among learners in ALN.
Richardson and Swan [22] used regression analyses to determine the relationship between perceived
social presence and perceived learning. Analysis of data collected from 17 courses revealed that 46% of
the variability in perceived learning could be predicted by student perceptions of social presence.
However, the study also revealed that an even stronger relation (R² = 0.53) existed between perceived
learning and overall satisfaction with the instructor. This finding indicated that satisfaction with the
instructor was at least as important as was perceived social presence. Further, the authors found that a
strong relation (R² = 0.36) existed between students’ perceptions of social presence and satisfaction with
the instructor. Based on these findings it was concluded that “students’ perceptions of social presence
were related to the perceptions of their instructors as having a satisfactory online presence in terms of
amount of interaction and/or quality of that interaction.”
Through factor analysis, Arbaugh [23] found instructor immediacy behaviors in online courses were a
significant predictor of student learning. Based on Gorham’s [24] verbal immediacy scale, Arbaugh
defined immediacy behaviors as comprising two parts. The first, classroom demeanor, “reflected
the instructor’s use of personal examples, humor, and openness toward and encouragement of student
ideas and discussion.” The second, name recognition, referred to the “extent to which the instructor was
addressed by name by the students and vice versa.”
A. Instructional Design Features that Foster Community
Informed by the studies previously discussed, we have been improving on our design of ALN instruction
to facilitate meaningful discourse and create dynamic learning environments. Specifically, in our courses
over the past six semesters, we have attempted to incorporate recommendations found in the literature
related to the projection of teaching presence through immediacy behaviors. Surveys of student
satisfaction from these courses indicated that students were generally highly satisfied with our efforts and
students’ qualitative feedback, when provided, typically made us believe we were doing a good job of
creating a rich learning environment. However, even if social presence is strong, students may prefer even
more interactive communication [25, 26], and we questioned whether greater interaction might also apply to
the projection of teaching presence. Arbaugh [8] found media variety to be positively associated with
perceived learning among students in web-based MBA courses. In a review of the literature, Liaw and
Huang [27] suggested that presentation of web-based course content through a variety of media positively
impacted learner experiences.
Ideally, we would have liked to use an asynchronous videoconferencing mechanism similar to that
envisioned by Watt, Walther and Nowak [28]. Extending work by Walther and Burgoon [29], Watt and
colleagues wrote that such a system would take full advantage of both verbal and nonverbal cues thereby
increasing copresence: “the sense that one is actively being perceived and that one is actively perceiving
another” [28]. However, based on previous student surveys we knew that approximately one third of our
students were likely to be taking classes via dialup connections, making the use of streaming video
impractical. Thus, the only feasible alternative available to us was the use of asynchronous audio.
Research on the use of stand-alone audio in ALN, especially audio feedback, is rather limited. The study
that provided us with the most insight as to how audio feedback might be perceived by students was
conducted by Jelfs and Whitelock [30]. These researchers created a virtual environment in which various
navigational techniques were used. All of the participants indicated in follow-up interviews that the
preprogrammed auditory feedback was as important to their success and satisfaction with the environment
as was ease of navigation. Significantly, these two factors were considered to be even more important
than interactivity or previous experience.
B. Use of Audio Feedback
Use of audio commenting in the face-to-face classroom can be traced to at least 1982, when Olson [31]
reported using the technique in English courses at a two-year college. In a discussion of the technique,
Olson opined that his students believed audio commenting reflected a sense of caring on the part of the
instructor that extended beyond their written products. The ability to project through tone of voice, he
argued, enabled the instructor “to be more supportive and caring.”
Building on Olson’s work, Mellen and Summers [32] provided students in an English course with tapes
containing audio feedback and conducted surveys and interviews at the end of the semester. Results
demonstrated that students were likely to view audio feedback as being positive regardless of the context.
Additionally, 70% of students reported that they felt encouraged to revise their work as a result of
receiving auditory feedback and 54% felt more confident about their writing. These findings provide
strong, highly positive indicators of student perceptions regarding the use of audio feedback and point to
its potential as a tool in asynchronous online courses.
In a study of student-student audio based interactions in ALN, Kim [33] found that students had generally
positive perceptions of the medium, but that its use decreased motivation. However, audio did increase
social presence, a finding that supported earlier research in which Reeves and Nass [34] concluded that
human voice increased social presence. In a seeming contradiction, Bargeron and colleagues [35] found
that students preferred to use text rather than audio in threaded discussions because they found it easier
and quicker to read text messages than listen to audio.
However, the sample size in the study conducted by Bargeron and colleagues [35] was small, with only 4
of the 6 participants indicating a preference for text-based feedback. We conducted a pilot study
asking 83 of our students to complete a survey regarding the relative time required to utilize text-based
versus audio feedback. We found that 28 students believed it took longer to listen to audio feedback than
to read text-based feedback, 35 believed the time required was approximately the same, and 20 believed it
took less time to listen to audio feedback. In addition, after answering questions about the time required to
listen to audio feedback, 6 students emailed the instructor wishing to clarify their answers. The following
is representative of the emails received:
I just finished answering some questions about the time it took to listen to comments or read
comments. My answer was that it took longer. However, I wanted to clarify that a little. It took
longer because I replayed the comments a couple of times so I could really see what was being
said as it related to my work and get more out of it. I don’t do this when the comments are written
because I don’t think they are as good.
Based on these findings, we concluded that the difference in time required to listen to audio feedback
versus reading text-based feedback was not a significant factor in deciding whether the technique should
be used. In fact, based on the supplemental feedback, there was reason to believe that even though some
students perceived audio feedback to be more time consuming, they still preferred it because they
believed they got more out of it. Clearly, more research is needed in this area to explore students’
perceptions related to each type of feedback.
The research clearly shows connections between perceived learning, perceptions of social presence,
instructor satisfaction, and immediacy behaviors in building a sense of community among ALN learners.
Yet to be established, however, is the extent to which auditory feedback might further enhance teaching
presence and therefore build a stronger student sense of community.
II. METHOD
From spring 2004 through summer 2005, we served as instructors in seven asynchronous online courses.
Despite being highly satisfied with those courses and believing that our students had significant
learning experiences, we wondered if we had done all we could to make our relationships with students as
personal as possible given the constraints of the medium. While we disagree with those who view online
learning as detached and impersonal [36, 37], we were concerned about our ability to adequately convey
nuance in a manner similar to that which occurs in face-to-face classrooms. This concern prompted our
research to better understand the nature of audio feedback in an asynchronous learning network.
Specifically, in this study we sought to answer the following set of research questions (RQ):
RQ 1: Between audio and text-based student feedback in ALN, which do students believe is a
more effective means of interaction with their instructor?
RQ 2: To what degree do students believe audio feedback is an effective replacement for the
instructor/student interaction that typically occurs in traditional face-to-face classes?
RQ 3: How does the use of audio feedback impact the sense of community in ALN?
RQ 4: In what manner is perceived learning impacted by the use of audio feedback?
RQ 5: What relationship exists between the use of audio feedback and student satisfaction?
A. Instructional Setting
Curriculum and Instruction 687, Advanced Teaching Strategies, was the course through which this study
was conducted. Prior to this study, C&I 687 had been offered completely online for three consecutive
semesters.
Structurally, C&I 687 consisted of ten learning units in which students explored and evaluated advanced
teaching concepts and strategies. In the first unit, students were introduced to the philosophical
foundations of constructivist teaching and asked to evaluate a series of readings with respect to their
personal experiences in the classroom. In seven of the remaining units, students were introduced to eight
teaching strategies (concept attainment, inductive learning, cooperative learning, synectics, direct
instruction, mnemonics, and classroom discussion) through readings that addressed methodology, through
text- and video-based case study analysis, and through discussion postings in which students were asked to
apply the various models to content area lesson plans of their choice. Students then evaluated each other’s
postings and refined lesson plan strategies based on the communal knowledge constructs that emerged.
One of the two remaining units was a mid-term assessment activity in which students selected two
video-based classroom vignettes and conducted an evaluative case study for which they identified the teaching
strategies employed, explained the usage rationale and suggested how the teacher might have improved
the manner in which their students acquired knowledge. The final unit consisted of two parts: the first
consisted of six reflective activities in which students were asked to evaluate how praxis might be
impacted by contemporary and emerging societal and technical issues; the second required groups
of students to develop a series of thematic, interdisciplinary lesson plans in which
strategies explored during the semester were utilized. These plans required that students use a minimum
of three teaching strategies explored during the semester. After all projects were submitted, students were
expected to evaluate plans submitted by other groups and suggest revisions.
The course was a major elective for both master’s and doctoral level students in the Curriculum and
Instruction program. The course had no prerequisites and was taken at various times during students’
plans of study.
In previous years when this course was taught, feedback was provided to students in two ways. In the
first, the instructor would interact with the students’ text-based postings on the discussion board using
Socratic questioning to enhance and expand upon various threads that emerged. Additional group
feedback was provided at the conclusion of each thread. In the second, the instructor would provide
individualized text based feedback via email to students on each discussion topic or submission.
B. Use of Audio Commenting Within the Instructional Setting
In addition to utilizing approaches to text-based feedback from previous years, we incorporated audio
commenting in this iteration of the course. When posting audio comments to the discussion board, in
emails to the entire class, or to small groups, the instructors produced WAV files using the free Audacity software.
The files were then added to the discussion board or email as attachments.
In the case of individualized feedback, the instructors selected various discussion posts made by a student,
copied them to a Word document, inserted comments and sent the document back to the student via
course email. This type of individualized commenting was also used for the midterm case studies, final
reflections and the group project.
We provided approximately half of the individualized feedback in a text-based format and the other half
via audio. At the end of the course all students had received six documents in which text feedback was
used and five in which audio feedback was used. To avoid the introduction of bias, prior to the beginning
of the semester each assignment was given a number from 1 to 12. These numbers were then entered
into Excel and randomized. From this list, we assigned alternating text-based or audio feedback as the
modality that would be used.
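The counterbalancing procedure described above can be sketched as follows. This is an illustrative reconstruction in Python rather than the Excel worksheet actually used, and the seed value is arbitrary:

```python
import random

# Sketch of the randomization step described in the text: assignments
# are numbered, the order is shuffled, and modalities are assigned by
# alternating down the shuffled list. random.shuffle stands in for the
# Excel-based randomization; the seed is arbitrary, for reproducibility.
random.seed(42)

assignments = list(range(1, 13))  # assignments numbered 1 through 12
random.shuffle(assignments)       # randomize the order

# alternate text-based and audio feedback down the shuffled list
modality = {a: ("text" if i % 2 == 0 else "audio")
            for i, a in enumerate(assignments)}

print(sum(1 for m in modality.values() if m == "text"))   # 6
print(sum(1 for m in modality.values() if m == "audio"))  # 6
```

Because the alternation runs over an even number of assignments, the two modalities are guaranteed to be balanced regardless of the shuffle order.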
On the discussion board, we engaged in Socratic questioning as in previous semesters. At the end of the
semester, the discussion board contained a total of 1471 postings and replies. Of these, 203 were Socratic-
type questions that we posed to students on an individual basis. In addition, we provided another 59
postings that took the form of group feedback: 31 of these were text-based and 28 used audio.
Technically, the audio feedback was produced by first copying select discussion board postings into a
Word document or opening a Word document in which students had submitted individual assignments.
The Word document was then converted into a PDF document using Adobe Acrobat Pro 7. Once in this
format, the instructor used the Record Audio Comment tool within the Comment and Markup option.
Depending on a host of factors, including length, number of topics discussed and quality of the work
submitted, the instructor placed varying numbers of audio files within the document, as well as a
summary statement at the end of each document. The audio feedback was spontaneous in nature, as it was
intended to replicate the non-scripted verbal interactions that occur in face-to-face environments.
In the instances where students received text feedback, it was provided within a PDF document using the
Note tool selected from the Comment and Markup option. Text comments were placed at various points
throughout the document and at the end, in a fashion mirroring that used in the audio feedback. The same
document format and comment placement strategies were used to ensure that any difference in
perceptions of the commenting modality would not be influenced by these extraneous variables.
To determine what impact using audio commenting had on time required to provide feedback, we
maintained a log of the amount of time required to provide both text-based and audio feedback. During
the analysis of data, we also compared the volume of audio and text-based feedback that was provided to
students.
C. Participants
West Virginia University’s Institutional Review Board approved the protocol for this study to ensure
ethical treatment of all participants. For the semester in which this study occurred, enrollment consisted of
26 master’s level students and 8 doctoral students. Of the 26 master’s students, 17 were practicing
teachers and 9 were pre-service teachers. Geographically, 29 of the students who took the course were located
in West Virginia, 3 were located in Maryland, 1 in Alabama and 1 was on military deployment in
Djibouti.
An email was sent to all students during the last week of the course asking for volunteers to participate in
post-course interviews. Seven doctoral students, 15 master’s level practicing teachers and 5 master’s level
pre-service teachers volunteered to participate.
D. Design
A nested mixed methods design with both concurrent and sequential components was implemented [38].
We gave priority to the qualitative components, nesting quantitative data within them, in order to enrich
our description of participants’ perceptions related to audio feedback [39]. Three separate sets of data
were originally planned for triangulation during data analysis and interpretation: end of course survey
data, post-course interview data, and final projects. Unsolicited qualitative feedback generated throughout
the semester, though not originally part of the research design, was added as a data set because it
contained rich and compelling data that could not be ignored. The end of course survey data included both
qualitative and quantitative components collected concurrently with the final project data. Interview data
gathered sequentially allowed us to follow up on themes generated from the end of course survey results.
We selected a mixed methods research design for our work, and being guided by a “pragmatic approach”
or paradigm [40] we sought to capitalize on the strengths of both quantitative and qualitative approaches
to data collection. This clearly required following established criteria for generating high quality
quantitative and qualitative data. While criteria for judging the quality of quantitative studies are well
established, there is less agreement regarding what quality criteria are applicable to qualitative research
[41, 42]. Seale [43] argued that triangulation of data sources aimed at enriching understanding through
and of multiple perspectives should be the central criterion by which qualitative research is judged. Taking
his point, we included multiple forms of qualitative data (survey, interview, and document), blended with
quantitative (survey) and quantified (document) data, and analyzed these using strategies designed to
achieve triangulation.
1. Unsolicited Feedback
During the semester, 14 students sent a total of 16 unsolicited emails to the instructor related to the use of
audio feedback. The rich data in these emails provided early insight into how students perceived the
modality, as well as technical difficulties that a small number of students were experiencing. The emails
were coded and categorized based on thematic similarities that emerged in cross-case analyses. Although
these data were not originally part of the research design, this unanticipated feedback clearly added to our
understanding of students’ perceptions of audio feedback. Capitalizing on the emergent nature of
qualitative inquiry, this data set was included as an extra point of validation in the triangulation process.
2. End of Course Survey Data
At the end of the course, students were asked to complete a survey to assess satisfaction and perceived
learning. The survey consisted of 52 items. The first 50, derived from instruments previously developed
by Spencer and Thompson [44, 45], addressed student satisfaction with course design, perceived learning
and sense of community. Two additional items related specifically to the use of audio feedback were
added: 1) a Likert-type scale item addressing student perceptions of the relative effectiveness of audio
versus text-based feedback, and 2) an open-ended item soliciting additional comments relative to audio
feedback. The Likert-type item was analyzed using descriptive statistics. Responses to the open-ended
item were coded and thematically categorized using cross-case analysis. This analysis then informed the
semi-structured post-course interview protocols.
To guard against a novelty effect, as is often seen in student satisfaction with online courses [15, 46], we
continued to collect data from other courses in which the instructors used audio feedback. This
quantitative data consisted of responses to two questions. In the first, “I prefer audio feedback to text-
based feedback,” students were asked to respond on a five point Likert-type scale with choices ranging
from Strongly Disagree to Strongly Agree. The second question asked students how many courses they
had previously taken in which audio commenting was used (0, 1, 2, 3, or more than 3).
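As a simple illustration of how such Likert-type responses might be summarized with descriptive statistics, the following Python sketch uses invented response values, not the study's actual data:

```python
import statistics
from collections import Counter

# Hypothetical responses to "I prefer audio feedback to text-based
# feedback" on a 5-point scale (1 = Strongly Disagree ... 5 = Strongly
# Agree). These values are invented for illustration only.
responses = [5, 4, 5, 3, 5, 4, 4, 5, 2, 5, 4, 5]

print(Counter(responses))                    # frequency of each rating
print(round(statistics.mean(responses), 2))  # mean rating: 4.25
print(statistics.median(responses))          # median rating: 4.5
```

Reporting the full frequency distribution alongside the mean and median avoids overstating precision for ordinal Likert data.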
3. Post Course Semi-structured Individual Interviews
Of the 34 students enrolled in the course, 27 volunteered to participate in post course interviews. These
semi-structured interviews were conducted during the two weeks following the end of the semester.
During interviews, individual students were asked their impression of both the course and each type of
feedback using an interview protocol guide (see Appendix A) developed following principles described
by Berg [47] and Patton [48]. Two interviewers were involved in the process to ensure consistency. In-
depth probing of responses was conducted on an individualized basis to draw out more detailed data
related to why students perceived audio feedback to be more or less effective, as well as how it may have
altered their perceptions of what it meant to be a participant in an asynchronous learning network.
Interviews lasted approximately 50 minutes and were audio recorded using a portable MP3 recorder. After all
interviews were complete, transcriptions were generated for coding. The transcribed interview texts were
analyzed following suggestions by both Strauss [49] and Tesch [50] using an interpretive, iterative
approach with emphasis placed on drawing out thematic strands. Because of the data richness, both within- and cross-case analyses were utilized to more fully represent what occurred at both the individual level
and as part of a group dynamic.
To guard against a novelty effect, check for consistency in themes, and detect new themes, a total of 51
students were randomly selected from 17 courses in which the instructor had used audio commenting
since the completion of the original study. These students were emailed a questionnaire (Appendix C) in
which they were asked to reply to a series of open ended questions. The questionnaire was derived from
the interview protocol used to conduct the post-course interviews (Appendix A). Using an iterative,
interpretive process, themes were drawn out in the same manner used for the original interview transcripts.
4. Final Project Document Analysis
The final project for this course required groups of students to develop a series of thematic,
interdisciplinary lesson plans that utilized a minimum of three strategies explored during the semester.
Document analysis of final projects was conducted by first coding for the types of strategies students
chose to use for lesson plan design and then categorizing based on the type of instructor feedback (text
versus audio) used when students studied these strategies earlier in the course. The incidences of the
various categories were quantified and descriptive statistics calculated to explore how feedback modality
might have impacted content usage.
The final projects were then recoded to determine the level of Bloom’s taxonomy [51] applied to each
strategy. In this process, the lesson plans students developed were decomposed and individual activities
evaluated using a rubric derived from Slavin’s [52] application of Bloom’s taxonomy to pedagogy (see
Appendix B). Coded documents were reviewed by two researchers to ensure consistency. The reviewers
unanimously agreed on the coding. The results were presented using descriptive statistics to determine if
audio feedback impacted the level at which content was used.
5. Triangulation
After analyzing each data set in the manner described above, open coding was used to isolate prevalent
themes followed by negative case analysis to explore consistency across data sources [53]. First, the
results of the quantitative end of course survey question were compared with the findings from the post
course interviews and unsolicited feedback for additional confirmation. Next, the findings from analyzing
the qualitative question in the end of course survey were crosschecked with the interview data and
unsolicited feedback. The end of course survey did not address content retention and so could not be
crosschecked with the document analysis. Usage frequency and level counts derived from document
analysis were checked for consistency with interview data focused on content retention. The interpretive
conclusions from triangulation analyses were then compared to what is known about corresponding
elements in learning theory and social presence literature to develop grounded theory that could be
applied to future research.
III. RESULTS OF THE STUDY
This study was originally designed with three data sources: end of course survey data, semi-structured
interviews and document analysis. However, data-rich material in the form of unsolicited feedback from
students was included as we found it to lend significant insight into what students thought at the time they
were actually receiving the audio feedback. In addition, these unsolicited emails allowed us to refine
some of the potential probing areas in the interview guide. Results of analyzing each data source
separately are provided below. Triangulation, observations and conclusions follow in section IV.
A. Unsolicited Feedback
Fourteen students in the course sent a total of 16 unsolicited emails regarding the use of audio feedback.
In 14 of these, 11 of which were sent within three days of the initial use of audio feedback, students wrote
to express a high degree of satisfaction with the modality. The remaining two emails were related to
technical problems with getting the audio files to play. No unsolicited emails expressing negative
sentiments about the use of audio were received.
The following is typical of the unsolicited emails:
It is very rewarding and helpful to HEAR your comments. Now I understand more about what you
are trying to say than I did with the last set of feedback we got. Thanks!
In an email received about three weeks after audio commenting was first used a student offered the
following:
We’ve had written comments twice and verbal comments twice now. Let me guess—this is
someone’s research project right? Let me just save you some time. The verbal feedback is much,
much, much better than the written. I said the same thing when I talked to you on campus last month.
So can you just send me the voice comments from here on out, say there is no comparison between
the two at all and nix the written stuff? That’s probably not going to happen, but I thought it was
worth a shot!
B. End of Course Survey Data
The end of course survey (response rate = 91%) included two audio feedback specific items: one
quantitative Likert-type item and one qualitative open-ended item. For the quantitative item, 26 of 31
respondents indicated that they believed audio feedback was more effective than written feedback. Four
believed there was no difference between the two modalities and one responded with an N/A. The N/A
response was explained in the qualitative item as described below.
When asked for additional comment related to the use of audio feedback, 11 students responded. Of these
responses, 10 were highly positive and cited audio feedback as a primary reason for being satisfied with
the course.
I usually find online classes rather boring. That was not the case here. It was definitely because of the
way the instructor communicated with us using the audio PDF’s. That approach made me interested
for the first time in what was happening in an online class. I didn’t feel like I was just jumping
through the hoops when I got to hear the comments on my work.
No students provided negative comments related to the audio feedback. The response not categorized as
positive addressed technical problems, clarifying the single N/A response to the quantitative item.
I would definitely take an online course again, but I hope I can get this audio thing worked out if that
is the way we will get comments in other courses. Even after working with tech support I never could
get the files to play on my home computer. I did get them to play at work though. Because of this
issue I didn’t believe I could answer the question on audio commenting in the way it was intended
and therefore said it was not applicable.
After the course was over, this student contacted us regarding her technical problems. It was discovered
that a broken sound card in her home computer was at fault.
The survey data collected from other courses to address a potential novelty effect resulted in a 68%
response rate. Of the 312 respondents, the mean number of previous courses with audio commenting was
1.31 (SD = 1.29) with 99 students having at least two previous courses utilizing this feedback modality.
Responses to “I prefer audio feedback to text-based feedback” averaged 4.46 (SD = 0.78), corresponding to roughly halfway between Agree and Strongly Agree. In fact, only 9 students out of 312 strongly disagreed or
disagreed with the statement. Directly addressing any potential novelty effect, there was no significant
relation between the number of courses students had experienced with audio commenting and their
relative preference for that feedback method (Spearman rs = .07, n = 312, ns).
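The novelty-effect check above reduces to a Spearman rank correlation between exposure counts and preference ratings. The sketch below implements that computation in plain Python; the responses it uses are synthetic stand-ins, since the raw survey data are not published here, so only the method itself is illustrated.

```python
# Spearman rank correlation, as used for the novelty-effect check.
# The exposure and preference values below are synthetic stand-ins.
import random

def ranks(values):
    """Average ranks (1-based); tied values share the mean of their ranks."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg_rank = (i + j) / 2 + 1  # mean of positions i+1 .. j+1
        for k in range(i, j + 1):
            r[order[k]] = avg_rank
        i = j + 1
    return r

def spearman(x, y):
    """Spearman rho is the Pearson correlation of the two rank vectors."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

random.seed(7)
exposure = [random.randint(0, 4) for _ in range(312)]              # prior courses, 0-3+
preference = [random.choice([3, 4, 4, 5, 5]) for _ in range(312)]  # Likert ratings
print(round(spearman(exposure, preference), 2))
```

Because the synthetic preferences are generated independently of exposure, the resulting coefficient should sit near zero, mirroring the pattern reported above.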
C. Semi-Structured Interviews
Students indicated that they preferred audio feedback to written feedback in 25 of the 27 interviews. One
student had no preference and one preferred written feedback. From the 25 students who preferred audio
feedback, four general themes emerged: 1) increased ability to understand nuances that might be lost in
written communication, 2) feeling more involved in the course, 3) improved retention of content and 4) a
belief that the instructor cared more about the student’s learning. The mean number of themes expressed
per interviewee was 2.28 (SD = 0.79).
1. Ability to Understand Nuance
The most frequently expressed theme (n = 19) was the ability to detect nuance and inflection in the audio
commenting. In general, students believed that verbal feedback gave them increased insight into what the
instructor was attempting to convey and that it produced a more comfortable, less formal learning
environment.
This perspective is best illustrated by one student who said:
I have taken a couple of online classes and every time I would get these notes or critiques or
comments back from the instructor and I would be wondering exactly what they were trying to say. I
mean, I would understand what they were saying but not the way they were trying to say it.
Sometimes you would wonder if they were agreeing with you or trying to figure out how to politely
say you had it all wrong.
Now, when I first heard the audio feedback I was like wow! I get what he is saying to me. It was all in
your voice and I understood when you were saying something like well this is good, but……
I understood then that you really liked what I was doing but were trying to tell me to add a little more,
but in a good way. Now, in the first time we got feedback it was written and you said some things that
were kind of the same but I thought you were really trying to bust me for not doing a good enough,
you know, job. Then I looked at my grade and it was good so I couldn’t understand exactly what you
were thinking.
Was my work not so good and you just gave me a decent grade? Or was it ok and I just didn’t
understand what [was being] said to me. When I heard you say something similar though the whole
thing made sense.
One student, who had some online teaching experience, took an analytical approach to introspection as
revealed by the following:
To answer what I think about this I need to tell you what I did. I’ve taught one online class for my
department… well two if you count the one I am just finishing, so obviously I was fascinated when I
got the first audio files along with my work. But I didn’t want to just jump on it because it was
something new. What I did was sit down and transcribe what you sent over and then I looked at it. I
looked at it and listened to the files again and kept doing this for a while. What I realized was that its
two completely different things.
I know you were saying the same things in your [audio files] and in what I transcribed, but the
difference was you were saying them. When I looked at the transcription there was no stress placed
on any of the words or sentences. Then I tried putting the stress there by adding in caps or
exclamation marks and I wondered if I would have thought that you might have been yelling or
something if I would have read it that way. What I figured out was that there is really no way that you
could have gotten the same info across the same way.
This all made me think about the way my students have perceived me in courses when I write to them
with comments. It’s not the same is it? No, it’s really not. We lose so much in the written word
sometimes and I think maybe we haven’t thought about that enough in our online teaching. [Online
courses] are going to become ever more, uhm, you know, prevalent for all types of learners and I
think we really need to figure out the best way to get our intent across. I think this is probably a really
good first step. I know there are some things coming down the line that will make this look like we
are taking baby steps, but they are steps I think we need to start taking so we can keep moving in the
right direction. In a direction where we don’t get dehumanized and our students don’t lose what we
are trying to get to them… or the way we are trying to get it to them.
2. Feelings of Increased Involvement
The belief that audio feedback increased feelings of being more involved and “a real part” of the class
was the second most commonly expressed theme (n = 15). Though students often began their discussions
of involvement in general terms, subsequent probing revealed that this perception was usually related to
what they believed to be a lessening of social distance when audio was used.
The richest data related to this perception came from a student who cited her feeling of being more
involved as the primary reason for preferring audio feedback. Her response was as follows:
Yes, I would have to say that audio [commenting] made all the difference in the world to me. I’ve
taken several online classes here and at [another university] because they are so much more… uhm,
easier for me to get to. The downside is that I have felt like I am the girl in the bubble. Some of the
instructors have done these things like the biography postings and online groups that help you meet
other students and get to know them; some haven’t. But even where they have [used these types of
activities] you still feel like you are at home in your own little bubble and you are telegraphing out to
all these other bubbles that other people are sitting in. Then between all of you there is this cold wall
type thing. It’s the course, the technology, all of that stuff that makes the course. There is this barrier
there.
Now, some of that has went away a little when we did things like be in chats, but it’s still all kind of
unreal you know? Being an Art teacher and having done my undergrad at a [very liberal college] I
suppose I’ve always been one to seek out some of that personal interaction. So, because of that I’ve
always felt that these online classes are a little, you know, dehumanizing.
That said, I get this file where you put in this audio and boom! It was all a big change for me you
know? It was like that bubble started getting popped in all these different places and made me feel
like you were reaching in there and touching me. I know that’s probably kind of silly, but just your
voice alone made me feel like it was a real class and not this big technology construct that was
locking us into its parts.
This really changed the way I viewed the whole online learning thing. I know we aren’t looking at
learning the way that Judy Jetson might be learning but this tells me that we are moving that way. We
are starting to reach out to each other across our phone lines and I think that’s really important you
know? I wish we could be doing this with each other as well as just you sending us these clip things
and all. Like when we did our group projects, if we could have talked to each other like this it would
have been a whole Brave New World thing going on between us but in a really good way.
Guys, keep doing this kind of stuff. Next semester and I’m done with my masters and I didn’t know if
I would every take another online class or not, but if I could see a class where this was going on
between me and the instructor and me and the other [students] then I would be all about learning this
way.
Another student who cited feelings of increased involvement was less eloquent in her initial response
when she simply answered:
The audio, well, I also like it because it makes me feel like a real part of the class. You don’t feel like
a number when you get that.
However, subsequent probing revealed much more about her perceptions:
Here’s the thing, we get all these written comments back and they are all really dense and dry. At
least they seem dry. This goes back to what I meant about the inflection in the instructor’s voice.
When you get this written feedback it could be something where maybe the instructor has taught this
course lots of times before and has all of these canned responses ready on a Word file and just cuts
and pastes them into our work to save all [of their] time. I know that’s probably not what’s going on,
at least I hope it’s not, but sometimes you can feel that way. You feel you might have a robot
responding to you.
What’s different though with the audio though is that you know that its not canned. It could even be
the same comments, but the delivery makes you feel like you are part of this learning group and that
makes it all good. It makes you want to be involved, because you have this involvement level that is
going to be coming back at you.
3. Content Retention
For students (n = 12) who cited increased learning and content retention as reasons for preferring audio to
written feedback, most (n = 9) related their preference to learning style. The following is typical of
students in this category:
I think the reason I like the comments made with the audio thing is that I learn better that way. Let’s
take when I’m in a lecture class. I look around and everyone takes all these notes but I set there and
listen and record what’s being said. Then when I’m studying I listen to the recording over again. I just
retain better that way. With this feedback its just an extension of that; the audio I retain the first time,
the written I might read four or five times.
For the remaining three students who cited increased retention with audio feedback, the following is
representative:
I like this [audio feedback] because I am listening to what you are saying and scanning what I wrote. I
can see what you are talking about and it clicks that way. Now, granted, I might have to listen to it
and read it two or three times because doing both at once makes it all not stick as well, but in the end
it works better than if both parts had been written only.
Interestingly, no students expressed a dislike for audio feedback because of learning styles. However, four
did express views similar to the following:
What I find… well odd, is that I’ve taken learning style inventories and I know that I am very, very
visual. Based on that you would think that I wouldn’t like this type of feedback at all. I know that I
should be liking the written comments much more, but that wasn’t the case. I can read comments
once and I remember. Here I was listening twice, sometimes three times to what you said to make
sense of it all. However, it goes back to what we talked about earlier about feeling like I was part of
the class, a real part. That offset by far the whole learning styles issue. I guess its like when we are in
the classroom, we feel like the teacher is telling us something and bringing us into a discussion so we
don’t expect them to write it too. Maybe that’s what’s going on here. Maybe because you made me
feel more like I was part of the class I didn’t feel like I necessarily needed everything presented in the
way that I learn the best.
4. Instructor Caring
The final theme expressed by students (n = 10) was related to the degree to which they perceived the instructor to
care about their learning when audio versus written feedback was provided. In most instances (n = 8), this
perception was closely associated with nuance and feelings of involvement as eloquently expressed by
one student when she said:
The final thing is about the way I think the audio shows that you cared about us. It’s not really
something that’s out there by itself though so I need to talk about the whole picture if that’s alright
with you?
I started talking about all of this by talking about feeling the tone of your voice and knowing more
about what you were trying to say than when I got just the words on paper… err rather on screen…
well whatever. We can start there and then when I got to understand what you were saying it gave me
some idea of who you were and that made me want to be more involved. Then when I started feeling
really involved and all it made me feel like you really cared about what was going on. That’s a warm
fuzzy I haven’t gotten with online classes before.
A similar sentiment, though expressed quite differently, was provided by another student who said:
You took the time to try out this new audio file thing and actually communicate with us. Earlier I told
you how I thought that it was way better than just reading words that might be misunderstood. That’s
true and so is the part when I said it made those connections that brought the class together. But what
I left out is that it also showed that you were interested in our, in us learning what was going on.
When you take the time to establish something that’s this complex it shows you want us to really be a
class and not just a group of individuals all doing something similar. I know teaching is pretty
thankless, but I do want you to know that I appreciate what went on this semester. I can’t really say
that I’ve said that about any of my other online classes, but you talking to me, I mean really talking to
me, and everything that was built up from that, made me feel that way here.
From the 51 questionnaires (100% return rate) sent to students in 17 other courses where audio feedback
was used in order to address any potential novelty effect, the same themes emerged with slightly different
weighting than in the original study. No new themes were revealed. The prevalence of themes is
presented in the following table:
Theme                                 Prevalence
Ability to Understand Nuance          42
Feelings of Increased Involvement     26
Content Retention                     27
Instructor Caring                     32
Table 2. Prevalence of Themes in Follow-Up Questionnaires
D. Document Analysis
Final projects were analyzed in terms of relative usage of strategies for which audio or text feedback was
provided. Two measures were used in this process to assess both frequency and level of use.
The assignment required students to use a minimum of three strategies that had been covered during the
semester in completing their final project. The mean number of strategies used across five groups was 4.2
(SD = 1.09). The number of strategies incorporated into final projects after having received audio versus
written feedback is provided in Table 3.
             Total Number     Strategies for Which     Strategies for Which
             of Strategies    Audio Feedback           Written Feedback
                              Was Received             Was Received
Group 1      4                3                        1
Group 2      4                4                        0
Group 3      3                1                        2
Group 4      4                3                        1
Group 5      6                4                        2
Table 3. Comparison of Strategies Used in Final Projects by Feedback Type Received
Coding of documents revealed that students were far more likely to apply higher order thinking and
problem solving skills (Synthesis and Evaluation in Bloom’s Taxonomy) to content for which they had
received audio feedback. Table 4 depicts the level at which strategies were applied in final projects
disaggregated by the type of feedback received for those strategies.
          Knowledge   Comprehension   Application   Analysis   Synthesis   Evaluation
Written   1           2               1             1          0           1
Audio     1           1               2             0          5           6
Table 4. Comparison of Level of Application by Feedback Type Received
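The counts in Table 4 reduce to the proportions cited in the discussion: the share of strategies applied at the two highest levels of Bloom's taxonomy (Synthesis and Evaluation) under each feedback modality. A minimal sketch of that arithmetic:

```python
# Counts from Table 4, ordered Knowledge through Evaluation.
written = [1, 2, 1, 1, 0, 1]
audio = [1, 1, 2, 0, 5, 6]

def top_two_share(counts):
    """Fraction of strategies applied at Synthesis or Evaluation,
    the last two levels in the list."""
    return sum(counts[-2:]) / sum(counts)

print(f"written: {top_two_share(written):.1%}")  # 1 of 6 cases   -> 16.7%
print(f"audio:   {top_two_share(audio):.1%}")    # 11 of 15 cases -> 73.3%
```

These figures correspond to the "less than 20%" for written feedback and "slightly more than 70%" for audio feedback discussed in section IV.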
E. Comparison of Time Requirements and Quantity of Feedback for Audio vs. Text
During the course of the semester, 204 documents containing text-based feedback and 170 containing
audio feedback were generated. The mean feedback volume was 129.75 words (SD = 57.43) for text and 331.39 words (SD = 89.31) for audio. The mean time required for the instructor to provide feedback was 13.43 minutes (SD = 4.53) for text-based feedback and 3.81 minutes (SD = 0.76) for audio. The time
required to read the documents prior to / during commenting did not differ significantly as a function of
the feedback modality used. The mean time for reading the documents when text-based feedback was
used was 14.13 minutes (SD = 5.45) and 13.94 minutes (SD = 5.74) when audio feedback was used. The
average file size for audio feedback was 258 KB/min (SD = 23.21).
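The efficiency comparison above reduces to two simple ratios, recomputed here from the reported means (summary statistics only; per-document data were not published):

```python
# Reported means: feedback volume (words) and production time (minutes).
text_words, audio_words = 129.75, 331.39
text_minutes, audio_minutes = 13.43, 3.81

time_saved = 1 - audio_minutes / text_minutes  # fraction of instructor time saved
volume_ratio = audio_words / text_words        # audio volume relative to text

print(f"time reduction: {time_saved:.0%}")     # 72%, roughly three quarters
print(f"volume ratio:   {volume_ratio:.2f}x")  # audio carried about 2.55x the words
```

These ratios are consistent with the approximately 75% time reduction and the 255% relative feedback volume cited in the discussion.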
IV. DISCUSSION AND CONCLUSIONS
Our investigations revealed an overwhelming student preference for asynchronous audio feedback as
compared to traditional text based feedback, with no negative perceptions of the technique. The fact that
over one third of students cited the use of audio feedback as a key factor they would use in selecting
future online courses is significant. When these findings are combined with data comparing the use of
knowledge constructed using audio feedback and the level at which that knowledge was applied, we
believe asynchronous audio commenting merits serious consideration in the development and delivery of
future courses.
Though students can project themselves and their emotions through text based communication [15, 16,
20, 21], two thirds of students (n = 19) in this study cited the ability to understand nuance as a reason for preferring audio to text feedback. This finding is important because it builds upon Richardson and Swan’s [22] social presence research, in which a strong relation (R² = 0.36) was found to exist between
students’ perceptions of social presence and satisfaction with the instructor. In addition, it is likely that an
enhanced ability to detect nuance impacts student perceptions of the instructor’s use of humor, and
openness toward and encouragement of student ideas and discussion; key immediacy behaviors cited by
Arbaugh [23].
The second most commonly expressed theme, increased feelings of involvement, is important because it
reinforces the sense of community and perception of “being there.” In terms of how audio commenting
decreased social distance for students, the best example can be found in words offered by one student:
It was like that bubble started getting popped in all these different places and made me feel like you
were reaching in there and touching me.
We consider the role audio feedback played in developing this type of interpersonal relationship with
students in our asynchronous courses to be a compelling enough reason for its continued use even if no
other positive factors had been discovered.
Findings related to perceptions of increased caring on the part of the instructor, a theme that was
frequently tied to nuance and increased involvement, confirm opinions held by Olsen [31] from his use of
the technique in the traditional classroom. Though students were hesitant to explore this theme in great
detail during the initial interviews, it was apparent that it was of considerable importance and increased
overall satisfaction with the course and the instructor. The significant increase in the percentage of
students expressing this theme in follow-up questionnaires in subsequent courses is worth noting. We
believe that audio feedback should be considered a means by which to increase positive perceptions of the
quality of instructor interactions and, by extension, social presence in ALN.
While the preceding three themes support our contention that asynchronous audio feedback increased
teaching presence and decreased social distance, it may be even more important to examine the positive
impact the technique had on perceived learning. Though slightly less than half of all respondents, in both
the original and follow-up interviews, indicated that they retained information and were able to synthesize
instructor comments better when they received audio feedback, document analysis in the original study
indicated that the impact may have been even greater.
Random assignment was used to determine whether audio or text feedback was utilized for each topic and
our analysis revealed no differences in difficulty for topics assigned to each type of feedback. Even given
that control, information for which audio feedback was provided was used approximately 350% more
frequently than information for which text based feedback was provided. With respect to level of
application, students applied content for which audio feedback was provided at the two highest levels of
Bloom’s Taxonomy in slightly more than 70% of the cases. In contrast, content for which text based
feedback was provided was only explored at similar levels in less than 20% of cases (see Table 4). Not
only did students retain material better when they received audio commenting on it, but they applied that
content in more cognitively complex ways.
These findings indicate that audio feedback enhanced learning for our students, though much more research needs to be conducted to determine how generalizable these findings may be across subject
matter, instructors, and institutional contexts. Since the completion of this study, other early adopters in
our College have experimented with audio feedback following the techniques we employed. The
quantitative, qualitative and anecdotal evidence has been overwhelmingly positive. Over 450 students in
courses taught by these instructors have now received audio feedback. According to these instructors,
approximately one third of their students have submitted unsolicited feedback expressing a strong
preference for this technique over text based feedback. No negative feedback has been received.
From the instructors’ perspective, the ability to reduce the time required to provide feedback by
approximately 75% was a compelling reason to adopt the technique. However, it is important to note that
this reduction in time was coupled with a 255% increase in the quantity of feedback provided. While
increases in quantity of feedback delivered with less demand on instructors’ time is a strong reason to use
the technique, evidence that it also increased retention and understanding of content at deeper levels
makes it hard to argue against using audio commenting at this point. Still, more research is needed to
determine potential differences in the types of feedback provided when text-based and audio feedback are
used, and the precise mechanisms that facilitate increases in student learning.
V. REFERENCES
1. Berge, Z. L. New Roles for Learners and Teachers in Online Education, 2001.
http://www.globaled.com/articles/BergeZane2000.pdf.
2. Liu, X., C. J. Bonk, R. J. Magjuka, S. Lee, and B. Su. Exploring four dimensions of online
instructor roles: A program level case study. Journal of Asynchronous Learning Networks 9(4): 29–
48, 2005. http://www.sloan-c.org/publications/jaln/v9n4/v9n4_liu.asp.
3. Bennett, S. and L. Lockyer. Becoming an online teacher: Adapting to a changed environment for
teaching and learning in higher education. Educational Media International 41(3): 231–244, 2004.
4. Goodyear, P., G. Salmon, J. M. Spector, C. Steeples, and S. Tickner. Competences for online
teaching: A special report. Educational Technology Research & Development 49(1): 65–72, 2001.
5. Salmon, G. E-moderating: The Key to Teaching and Learning Online. London: Taylor & Francis,
2000.
6. Berge, Z. L. Facilitating Computer Conferencing: Recommendations from the Field. Educational
Technology 15(1): 22–30, 1995.
7. Bonk, C. J., J. R. Kirkley, N. Hara, and N. Dennen. Finding the Instructor in Post-secondary
Online Learning: Pedagogical, Social, Managerial, and Technological Locations. In J. Stephenson
(Ed.), Teaching and Learning Online: Pedagogies for New Technologies, 76–97. London: Kogan
Page, 2001.
8. Arbaugh, B. Is there an optimal design for online MBA courses? Academy of Management Learning
& Education 4(2): 135–149, 2005.
9. Dreyfus, H. On the Internet: Thinking in Action. London: Routledge, 2001.
10. Ward, M. and D. Newlands. Use of the Web in undergraduate teaching. Computers and Education
31(2): 171–184, 1998.
11. Bullen, M. Participation and critical thinking in online university distance education. Journal of
Distance Education 13(2): 1–32, 1998.
12. Collis, B. Tele-Learning in a Digital World: The Future of Distance Learning. London: International
Thomson Computer Press, 1996.
13. Vygotsky, L. Mind in Society: The Development of Higher Psychological Processes. Cambridge,
MA: Harvard University Press, 1978.
14. Short, J., E. Williams and B. Christie. The Social Psychology of Telecommunications. London:
John Wiley and Sons, 1976.
15. Rourke, L., T. Anderson, D. R. Garrison, and W. Archer. Assessing social presence in
asynchronous text-based computer conferencing. Journal of Distance Education 14(2): 50, 2001.
16. Swan, K. Building communities in online courses: The importance of interaction. Education,
Communication and Information 2(1): 34–49, 2002.
17. Lombard, M. and T. Ditton. At the heart of it all: The concept of presence. Journal of Computer
Mediated Communication 3(2): 1997. http://jcmc.indiana.edu/vol3/issue2/lombard.html.
18. Laffey, J., G. Lin and Y. Lin. Assessing social ability in online learning environments. Journal of
Interactive Learning Research 17(2): 163–177, 2006.
19. Dourish, P. Where the Action Is: The Foundations of Embodied Interaction. Cambridge, MA: MIT
Press, 2001.
20. Gunawardena, C. and F. Zittle. Social presence as a predictor of satisfaction within a computer-
mediated conferencing environment. The American Journal of Distance Education 11(3): 8–26, 1997.
21. Rovai, A. A preliminary look at the structural differences of higher education classroom communities
in traditional and ALN courses. Journal of Asynchronous Learning Networks 6(1): 41–56, 2002.
22. Richardson, J. and K. Swan. Examining social presence in online courses in relation to students’
perceived learning and satisfaction. Journal of Asynchronous Learning Networks 6(1): 68–88, 2002.
23. Arbaugh, J. How instructor immediacy behaviors affect student satisfaction and learning in web-
based courses. Business Communication Quarterly 64(4): 42–54, 2001.
24. Gorham, J. The relationship between verbal teacher immediacy behaviors and student learning.
Communication Education 37(1): 40–53, 1988.
25. Rice, R. Media appropriateness: Using social presence theory to compare traditional and new
organizational media. Human Communication Research 19(4): 451–484, 1993.
26. Tang, J. and E. Isaacs. Why do users like video? Studies of multimedia-supported collaboration.
Computer Supported Cooperative Work: An International Journal 1(3): 163–196, 1993.
27. Liaw, S. and H. Huang. Enhancing interactivity in Web-based instruction: A review of the literature.
Educational Technology 39(1): 41–51, 2000.
28. Watt, J., J. Walther and K. Nowak. Asynchronous videoconferencing: A hybrid communication
prototype. Proceedings of the 35th Hawaii International Conference on System Science, 2002.
29. Walther, J. and J. Burgoon. Relational communication in computer-mediated interaction. Human
Communication Research 19(1): 50–88, 1992.
30. Jelfs, A. and D. Whitelock. The notion of presence in virtual learning environments: What makes the
environment “real.” British Journal of Educational Technology 31(2): 145–152, 2000.
31. Olson, G. Beyond evaluation: The recorded response to essays. Teaching English in the Two-Year
College 8(2): 121–123, 1982.
32. Mellen, C. and J. Sommers. Audio-taped responses and the two-year-campus writing classroom:
The two-sided desk, the guy with the ax, and the chirping birds. Teaching English in the Two-Year
College 31(1): 25–39, 2003.
33. Kim, E. The effects of digital audio on social presence, motivation and perceived learning in
asynchronous learning networks. Dissertation, 2005. http://www.library.njit.edu/etd/2000s/2005/njit-etd2005-075/njit-etd2005-075.html.
34. Reeves, B. and C. Nass. The Media Equation. New York: Cambridge University Press, 1996.
35. Bargeron, D., J. Grudin, A. Gupta, E. Sanocki, F. Li and S. Leetiernan. Asynchronous
collaboration around multimedia applied to on-demand education. Journal of Management
Information Systems 18(4): 117–145, 2002.
36. Flaherty, L. and K. Pearce. Internet and face to face communication: Not functional alternatives.
Communication Quarterly 46(3): 250–268, 1998.
37. Noble, D. Digital diploma mills: The automation of higher education. First Monday 3(1): 1998.
http://firstmonday.org/issues/issue3_1/noble/index.html.
38. Creswell, J. W., V. L. Plano Clark, M. L. Gutmann and W. E. Hanson. Advanced mixed methods
research designs. In A. Tashakkori and C. Teddlie (Eds.), Handbook of Mixed Methods in Social and
Behavioral Research. Thousand Oaks, CA: Sage Publications, Inc., 2002.
39. Morse, J. M. Approaches to qualitative-quantitative methodological triangulation. Nursing Research
40: 120–123, 1991.
40. Morgan, D. L. Paradigms lost and pragmatism regained: Methodological implications of combining
qualitative and quantitative methods. Journal of Mixed Methods Research 1(1): 48–76, 2007.
41. Denzin, N. K. and Y. S. Lincoln (Eds.). Collecting and Interpreting Qualitative Materials, 2nd
Edition. Thousand Oaks, CA: Sage Publications, Inc., 2003.
42. Marshall, C. and G. B. Rossman. Designing Qualitative Research. Thousand Oaks, CA: Sage
Publications, Inc., 1989.
43. Seale, C. Quality in qualitative research. In Y. S. Lincoln and N. K. Denzin (Eds.), Turning Points in
Qualitative Research: Tying Knots in a Handkerchief. Thousand Oaks, CA: Sage Publications, Inc.,
2003.
44. Spencer, D. Student survey. 2001. http://www.sloan-c-wiki.org/wiki/index.php?title=Instructor_interview_guide.
45. Thompson, M. Student post-course questionnaire. 1999. http://www.sloan-c-wiki.org/wiki/index.php?title=World_Campus_Course_Outcomes_Survey:_Spring_2000&action=submit.
46. Gibson, C. and T. Gibson. Lessons learned from 100+ years of distance learning. Adults Learning
7(1): 15, 1995.
47. Berg, B. L. Qualitative Research Methods for the Social Sciences. Boston, MA: Pearson Education
Inc., 2004.
48. Patton, M. Q. Qualitative Evaluation and Research Methods. Thousand Oaks, CA: Sage
Publications, Inc., 1990.
49. Strauss, A. L. Qualitative Analysis for Social Scientists. New York: Cambridge University Press,
1987.
50. Tesch, R. Qualitative Research: Analysis Types and Software Tools. New York: Falmer, 1990.
51. Woolfolk, A. Educational Psychology, 10th Edition. Boston: Allyn and Bacon, 2006.
52. Slavin, R. Educational Psychology: Theory and Practice, 7th Edition. Boston: Allyn and Bacon,
2002.
53. Ryan, G. W. and H. R. Bernard. Data management and analysis methods. In N. K. Denzin and
Y. S. Lincoln (Eds.), Collecting and Interpreting Qualitative Materials, 2nd Edition. Thousand Oaks,
CA: Sage Publications, Inc., 2003.
VI. AUTHOR BIOGRAPHIES
Philip Ice is Clinical Assistant Professor in the College of Education’s Department of Middle, Secondary
and K–12 Education at the University of North Carolina Charlotte. His research interests include
immediacy behaviors, pedagogy and multimedia applications in ALN. Philip is especially interested in
the intersection of these elements as they relate to the Community of Inquiry model.
Reagan Curtis is an Assistant Professor of Educational Psychology in the College of Human Resources
and Education’s Department of Technology, Learning and Culture at West Virginia University. A
research and evaluation methodologist, his research agenda is diverse including online course
development and delivery, cognitive development in mathematics, and gender issues in science learning
among other areas.
Perry Phillips is an Associate Professor in the College of Human Resources and Education’s Department
of Curriculum & Instruction/Literacy Studies at West Virginia University. He received his doctorate in
Curriculum and Instruction with a specialization in social studies education. His current research interests
include teaching presence and pedagogy in ALN.
John Wells is an Associate Professor of Technology Education in the School of Education at Virginia
Polytechnic Institute and State University. His line of research has been in two distinct fields: Instructional
Technology Integration and Problem-Based Interdisciplinary Science and Technology Methods. John’s
current research interests are aimed at better understanding the intersection of learning theory and
interdisciplinary STEM (science/technology/engineering/mathematics) instructional practices. Prior to
Virginia Tech he was an associate professor at West Virginia University (WVU) where he served as
Director of the Trek 21: Educating Teachers As Agents Of Technological Change PT3 (US Department of
Education) project, the Technology Education Biotechnology Curriculum Project (NASA), and Director
of the Teaching and Learning Technologies Center of the College of Human Resources & Education at
WVU. While faculty at WVU he developed and taught graduate courses related to the application of
computer-mediated communication in education, web-based instructional design, transportation systems,
appropriate technology, housing and shelter design, and community development.
VII. APPENDIX A
Interview Protocol Guide
Good morning/afternoon/evening. The goal of this study is to examine some of your observations related
to the course you have just completed, C&I 687, and the auditory feedback mechanisms that were used.
The information generated by the study will be used in a research project that is designed to benefit both
students and faculty with respect to the use of this medium. With your permission, I would like to
audiotape this interview.
Before we begin, I would like to notify you of the following:
• Your participation is entirely voluntary. You may halt the interview at any time and/or choose not to answer certain questions.
• Your responses will remain anonymous. Complete confidentiality will be maintained. At no time will your identity be revealed either by the procedures of the study or during reporting of the results.
• No negative consequence will result from choosing not to participate.
Please feel free to tell us what you really think and feel; this will be the most helpful in trying to find out
how to improve things for students and faculty members in the future.
Thank you for your participation in this research.
[Note code number and start recording.]
1. What was your overall perception of C&I 687?
(probe for each one: 1. likes and dislikes 2. time required to complete assignments)
2. How did the course compare with traditional courses you have taken?
(probe for: 1. activity types 2. interaction)
3. How did the course compare with other online courses you have taken (if any)?
(probe for differences as needed)
4. How effective, in your experience, is online learning as opposed to f2f?
(probe for: 1. quality of discussion 2. quality of products 3. quality of interaction 4. other concerns)
5. What did you think of the types of feedback used in the course?
(probe for individual versus group responses and auditory versus written media)
6. When you think about the auditory feedback that was used, how would you describe your reaction to
the instructor comments as opposed to written feedback?
(probe as needed)
7. Do you think that auditory feedback is more or less personal than written feedback?
(probe as needed)
8. Other than what we have discussed, what did you like or dislike about auditory feedback?
(probe as needed)
9. Are there any ways in which you believe that audio feedback impacted your ability to construct
knowledge in this course?
(probe as needed)
10. That is all I have. Is there anything else you would like to add?
Thank you for participating.
VIII. APPENDIX B
Final Project Rubric
Knowledge: Students explain the step-wise procedures for delivering instruction using a specific teaching
strategy. Syntax is in the appropriate order; however, there is no elaboration on the methodology
employed.
Comprehension: Students expand on the syntax of various teaching strategies by describing the model,
as it is applied to their lesson plans, by explaining key concepts, predicting outcomes or identifying key
issues that influence student learning.
Application: Students clearly apply their knowledge of teaching strategies to the content area; defined as
content pedagogy.
Analysis: Students break down lesson plans into component parts and analyze the strategies employed.
For example, a student would match the syntax of a given teaching strategy to the goals and objectives
of the activity.
Synthesis: Students apply prior knowledge from content and curriculum studies to the teaching strategy.
Indicators include modifications to the primary teaching strategy that incorporate innovative designs or
combine multiple strategies into a single construct.
Evaluation: Students include, in their lesson plans, discussion elements in which judgments are made and
justified by a set of criteria. Terminology such as compare, summarize, decide and assess is likely to be
present in such discussions.
IX. APPENDIX C
Student Satisfaction Questionnaire
The goal of this study is to examine some of your observations related to the course you have just
completed and the auditory feedback mechanisms that were used. The information generated by the study
will be used in a research project that is designed to benefit both students and faculty with respect to the
use of this medium.
Before you complete the survey please be aware of the following:
• Your participation is entirely voluntary. You may choose to answer or not answer any or all questions.
• Your responses will remain anonymous. Complete confidentiality will be maintained. At no time will your identity be revealed either by the procedures of the study or during reporting of the results.
• No negative consequence will result from choosing not to participate.
Please feel free to tell us what you really think and feel; this will be the most helpful in trying to find out
how to improve things for students and faculty members in the future.
Thank you, in advance, for participating in this study.
1. What was your overall perception of (course name and number here)? Please describe what you liked
and disliked about the course.
2. How did the course compare with traditional courses you have taken? When answering this question
think about the types of activities, interaction with the instructor and interaction with fellow students.
3. How did the course compare with other online courses you have taken (if any)? Please elaborate a little
on differences (either positive or negative).
4. How effective, in your experience, is online learning as opposed to f2f? If, in your opinion, the
following are applicable, please elaborate: 1. Quality of discussion. 2. Quality of learning. 3. Quality of
interaction. 4. Any other issues you care to discuss.
5. What did you think of the types of feedback used in the course?
6. When you think about the auditory feedback that was used, how would you describe your reaction to
the instructor comments as opposed to written feedback you may have received in this course or previous
courses?
7. Do you think that auditory feedback is more or less personal than written feedback? Why?
8. Other than what we have discussed, what did you like or dislike about auditory feedback?
9. Do you have any other comments about the course or the instructor?
When you have completed the survey please save it as a Word document and email it to (insert email drop
here). Thank you once again, for agreeing to complete this survey.