#DigitalPowerups Scaffolds and Hashtags To Empower Higher-Order and Humanized Student Engagement in Online Discussions

Abstract
To engage students in online discussions, instructors must design and facilitate the student
experience in authentic and relevant ways. In this study, I consider student perceptions and
student engagement in online discussions by examining the design and implementation of an
emerging online discussion strategy called digital powerups. Digital powerups are keywords
displayed as hashtags that are associated with corresponding prompts in online discussion forums
allowing for student choice and voice. As the instructor in the course being evaluated in this
study, I implemented the digital powerups strategy to improve student engagement with 13
graduate students enrolled in an online course on diverse teaching strategies. Using the
community of inquiry framework as a lens, the research questions for this mixed-methods
sequential explanatory case study were approached in two phases (QUAN → qual). Descriptive
statistics were used to identify that students used #remember and #connect digital powerups most
often in initial posts and comments to peers. A Pearson correlation identified a statistically
significant positive relationship between the use of the #create powerup and earning a bonus
from peers in the discussion. Additionally, a stepwise linear regression analysis revealed that of
all the powerups, only the #remember independent variable predicted 46% of the variance in
earning a bonus from peers. Online instructors interested in using the digital powerups strategy
to engage students in online discussions will find particular interest in this study.
Keywords: online discussion; scaffolds; hashtags; student engagement; humanized learning
Architecture of Engagement - Dr. Travis N. Thurston, PhD
1. Introduction
Online instructors take on two important roles in the development and implementation of
online courses. First, the role of designer is important to structure the virtual learning
environment, and, second, the role of facilitator is vital because the instructor needs to inhabit the
learning environment with a social presence that facilitates student learning throughout the
course. These two roles are encompassed in the term instructor presence (Garrison, Anderson, &
Archer, 2000). An important aspect of the course development process focuses on how the
design and facilitation of a course will provide regular and substantive interactions among
students, and between instructor and students. Online discussion forums are considered to be
“the beating heart of nearly every online course” (Sull, 2014, p. 11) because the learning
environment is where these interactions and community building take place (Mazzolini &
Maddison, 2007; Nilson & Goodson, 2017; Phirangee, Demmans Epp, & Hewitt, 2016).
Research in the field has identified that when online discussions are designed and
facilitated properly, students engage in the co-construction of knowledge through discourse,
student perception of learning is higher, and students report higher satisfaction with the overall
learning experience and have much higher rates of retention (Arbaugh, 2008; Garrison &
Arbaugh, 2007; Garrison, Cleveland-Innes, & Fung, 2010; Nilson & Goodson, 2017; Richardson
& Swan, 2003; Swan, Garrison, & Richardson, 2009). However, a contrasting body of literature
suggests there are key inadequacies of and criticisms against using online discussions and the learning
management systems (LMS) used to facilitate them. These criticisms include not
engaging students in higher-order thinking, not allowing co-construction of knowledge and
reflection, and burying pertinent discussion posts in the discussion threads. Although the
literature on online discussions is replete with studies addressing general approaches to either
designing or facilitating online discussions, or recommendations for new platforms to support
online discussions, there is a need to examine new and emerging strategies for online discussions
that address both aspects and utilize features of a contemporary LMS. Specifically, the digital
powerups strategy (Gustafson, 2016) addresses these issues, as well as provides opportunities for
students to engage in substantive interactions in relevant and authentic ways. In this single-case
study I focus specifically on the emerging digital powerups strategy, which has not been
researched previously; as a single case explaining a previously unstudied phenomenon, this
study is considered revelatory in nature (Yin, 2009). As there is no common
definition of digital powerups in the literature, I define digital powerups as “keywords displayed
as hashtags that are associated with corresponding prompts in online discussion forums.”
Implementing this strategy requires the instructor to consider both design and facilitation.
1.1. Role of the instructor in online discussions
The role of the instructor in designing and facilitating online discussions is a vital
component to the success of this new strategy. In this study, I served in the role of instructor in
both the course design and facilitation of the discussion forums. I initially learned about the
strategy in preparing to teach the course and, rather than telling my students about the strategy, I
wanted to immerse them in it. In this study, instructor presence (Garrison, Anderson, & Archer,
2000) is used to refer to the role of the instructor, and I define this term as “the intentional and
hands-on approach an instructor takes in both instructional design, and course facilitation.”
1.1.1. Designing the learning environment with explicit expectations
Online instructors can design the structure of their course by implementing the concept
that Riggs and Linder (2016) refer to as an architecture of engagement. Like scaffolding,
architecture of engagement is a term borrowed into education: it was initially used as a
metaphor referring to the planning of urban neighborhoods and community spaces. As the
term implies, this includes the design of space and structures “that center human lives within
meaningful contexts of engagement” (Dotson, 2013, p. 140) and allows for “shared emotional
connection among members develop[ing] from the frequency and quality of social interactions as
well as experiencing shared events and feeling as if they and others are personally invested in the
group” (p. 145). In other words, the design of the space speaks to the intended use and the
ability or inability for individuals to authentically engage and connect with each other. The
need for a virtual learning space that is designed for interaction and social presence brings to
mind the design of brick-and-mortar school classrooms.
The architecture of school structures and classrooms has been specifically designed to
establish predetermined social norms and expectations of interactions for students in a specific
and prescribed way (Piro, 2008). Students are socialized from an early age in elementary
school to know how to behave and what to expect when they enter a classroom. The
architecture of the classroom or the physical space inherently signals expectations to students
based on these established social norms. Once students reach college, these social norms signal
certain expectations for behavior and interaction when they sit down in a large lecture hall
designed for them to listen to the instructor compared to the expectations for interaction in a
small classroom with chairs arranged in a circle designed for students to engage in peer
discussion (Riggs & Linder, 2016). Based on these established social norms, students
understand the expectations for their behavior simply based on the learning environments in
which they enter.
Although students have been socialized for many years to understand the inherent
expectations in physical spaces and brick-and-mortar classrooms, these same types of
expectations have not necessarily been established for virtual learning environments. Online
instructors do not have the benefit of simply relying on the learning environment to signal
behavioral expectations for students, thus it becomes necessary for online instructors to
explicitly establish social norms and student behavioral expectations in asynchronous courses
and facilitate student participation. This can in part be done through communication guidelines
presented in the syllabus and detailed instructions in each assignment.
1.1.2. Facilitating student engagement with constructive interactions
Approaching the design of the virtual learning environment with a focus on supporting
student success is known as humanizing the online course (Jones, 2017; Kilgore, Bartoletti, & Al
Freih, 2018; Pacansky-Brock, 2012). In this study, humanizing is defined as “designing and
facilitating an online course as a social-constructivist experience through intentional interaction
and awareness of student needs.” Structuring student expectations is merely the first step to
humanizing an online course, but it sets the stage for facilitating student engagement in authentic
and relevant ways as shown in Figure 1.
Facilitation is of utmost importance because instructor presence “is the mechanism that
bridges the transactional distance between instructor and student in a virtual classroom” (Orcutt
& Dringus, 2017, p. 16) and has been found to influence both student perception of learning and
overall student satisfaction in online courses (Afolabi, 2016; Akyol & Garrison, 2011). Cho and
Cho (2016) detailed that this instructor presence is comprised of several elements, including:
providing regular feedback on student interactions, providing positive and supporting comments
in discussions, scaffolding with guidelines and reminder messages, and monitoring interactions.
Instructors can stunt discussion interaction or influence the flow of the forums, as
Mazzolini and Maddison (2007) report, when instructors jump into the discussion threads too
often. Instructor interactions with students can take the form of posting course announcements,
participating in the online discussion forums, sending individualized messages, and providing
timely feedback on assignments for each student (Martin, Wang, & Sadaf, 2018). When
instructors intentionally focus on engaging with students through individualized feedback,
students show increased cognitive presence in online courses (Barnes, 2016), and that interaction
is vital for the perceived learning of online students (Garrison & Cleveland-Innes, 2005).

Figure 1. Sample Canvas LMS discussion prompt.
1.2. Community of inquiry as a theoretical and conceptual framework for online instruction
Although it is important to consider course content and the technology being leveraged in
online courses, it is perhaps of greater importance to consider the human elements involved.
Humanizing an online course requires the intentional planning and integration of human
interaction, and therefore requires a virtual space to facilitate interaction and reflection among
those involved (Kilgore, Bartoletti, & Al Freih, 2018). Learning environments designed to
engage students in discourse with peers and individual reflection in order to construct knowledge
are considered to be social constructivist in nature and require instructors to explicitly connect
learning activities to discussions in interesting ways to engage students (Jones, 2017).
An epistemological approach of social constructivism contributes to online learning in
that it focuses on the use of scaffolds to support learners toward meaning making. Vygotsky
(1980) viewed learning as being socially constructed through activity, communication, and
interactions with others (Swan, 2005). Additionally, the pragmatic views of Dewey (1923)
included the concept that any educational experience worth engaging in should be grounded in
the process of reflective inquiry and that inquiry itself should be a social activity (Swan, Garrison
& Richardson, 2009). These principles of social constructivism provide the groundwork for the
Community of Inquiry (CoI) framework which is described as “a comprehensive theoretical
model that can inform both research on online learning and the practice of online instruction”
(Swan, Shea, Richardson, Ice, Garrison, Cleveland-Innes, & Arbaugh, 2008, p. 1). The CoI
framework (Figure 2) is composed of three main elements: cognitive presence, social presence,
and teaching presence. Cognitive presence encompasses how students interact with content in a
course, social presence addresses aspects of community and peer-to-peer interaction, and
teaching presence addresses structure and process (or design and facilitation). Social presence
tends to get the most attention in the literature as it is the “ability of participants to identify with
the group or course of study” (Garrison, 2011, p. 34) and “exchange ideas freely, explore
different perspectives, and solve problems collectively” (Joksimović, Gašević, Kovanović,
Riecke, & Hatala, 2015, p. 640).

Figure 2. Community of inquiry framework (Barnes, 2016).
When creating online learning environments that are conducive to student engagement,
Lee, Pate, and Cozart (2015) recommended three guidelines to design for autonomy-support:
provide choices (Flowerday & Schraw, 2000; Radenski, 2009), provide rationale (Reeve & Jang,
2006; Xie & Ke, 2011), and provide opportunities for personalization (Knowles, 1986; Patall,
Cooper & Wynn, 2010). By including these three aspects, the design of an online discussion
forum can become an autonomy-supportive community space or a humanized learning
environment. The social presence aspect of discussions represents the humanized aspects, or the
personally fulfilling and affective elements of engaging in a community of learners that
contributes to students engaging cognitively (Garrison, Anderson, & Archer, 2000). Humanized
learning discussion forums allow students and instructors to pose questions and engage in
reflection and the co-construction of knowledge (Chen, deNoyelles, Patton, & Zydney, 2017;
Garrison et al., 2010; Gilbert & Dabbagh, 2005; Yang, Yeh, & Won, 2010). Although an
important starting point, a discussion forum in and of itself does not inherently provide
humanized learning without the application of constructivist instructional techniques (Covelli,
2017). However, when designed and implemented to do so, online discussions unequivocally
support social constructivist learning (Maor, 2003).
Although online discussions are designed as inherently social spaces, which seems to speak
specifically to social presence, the strength of the CoI framework is in the dynamic interplay and
overlap between the cognitive, social, and teaching presence constructs (Hora & Ferrare, 2013;
Shea & Bidjerano, 2009). Although the literature focused on social presence outweighs studies
on CoI as a whole (Lowenthal, 2010), it is worthwhile to consider that the instructor plays a
critical role in guiding students through the learning experience in an online course (Martin,
Wang, & Sadaf, 2018). Indeed, the researchers who established CoI define teaching presence as
“the design, facilitation, and direction of cognitive and social presence for the purpose of
realizing personally meaningful and educationally worthwhile learning outcomes” (Garrison et
al., p. 5). Thus, the CoI framework is an appropriate lens for this study to explain the
phenomenon of higher-order online learning experiences using digital powerups (Garrison,
Cleveland-Innes, & Fung, 2010). This case in particular focuses the research questions on the
cognitive presence and social presence domains, and the teaching presence domain is addressed
in the rich description of the digital powerups strategy.
1.3. Context of the current study
One emerging approach to student engagement in online discussions, known as digital
powerups, has recently come to the forefront of innovative strategies in higher education,
although it originated in K-12 settings (Gustafson, 2014; Gustafson, 2016). Before this approach
begins to be implemented more widely, it is important to investigate how this phenomenon was
used in the design and facilitation of online discussions, as well as to investigate how students
engaged in the strategy and their perceptions surrounding the learning experience. To date, the
digital powerups strategy does not appear in the literature making this study one that can
contribute to the knowledge base on effective online instruction. Although considered an
emerging strategy for online discussions, the digital powerups strategy is deeply connected to the
literature on social constructivist approaches to learning and the literature on online discussions.
1.3.1. Connecting digital powerups to the literature
Many online instructors question their ability to authentically engage students in online
courses (Allen, Seaman, Poulin & Straut, 2016; Herman & Nilson, 2018), which is not surprising
given the inadequacies that exist in the design and facilitation of online discussion forums that
appear in the literature. Specifically, these inadequacies include: not engaging students in higher-
order thinking (Andresen, 2009; Gao, Zhang, & Franklin, 2013; Hay, Peltier, & Drago, 2004);
not allowing co-construction of knowledge and reflection (Cho & Cho, 2016; Liambase, 2010);
and burying pertinent discussion posts in the discussion threads as new posts get most attention
(Hewitt, 2003; Rubin, Fernandes, & Avgerinou, 2013). Although these three inadequacies are
representative of the way many online discussions are implemented, they are not insurmountable
barriers. Further, these inadequacies can be mitigated by utilizing the design and instruction of a
strategy, like digital powerups.
Gustafson (2014) originally presented digital powerups to address the first inadequacy of
online discussions because the strategy can move student engagement from lower-level responses to
higher-order thinking in Bloom’s revised taxonomy (Krathwohl, 2002). Utilizing Bloom’s
taxonomy as a scaffold for enhancing student engagement, higher-order learning, and labelling
content in discussion forums aligns with the literature in online learning (Cheung, Hew, & Ng,
2008; Christopher, Thomas, & Tallent-Runnels, 2004; Darabi, Arrastia, Nelson, Cornille, &
Liang, 2011; Ertmer, Sadaf, & Ertmer, 2011; Gilbert & Dabbagh, 2005; Valcke, De Wever, Zhu,
& Deed, 2009; Whiteley, 2014). One way to use Bloom’s as a scaffold in online discussions is
by utilizing hashtags. A hashtag, which is a number sign followed by a keyword (Pacansky-
Brock, 2012), can be used as visual representations of prompts in the digital powerups strategy.
By explicitly labeling different levels of Bloom’s with specific prompts or hashtags, students are
given an immediate cue as to the level at which they are engaging in the discussion. Making
students aware of these levels is the first step in shifting the discourse toward higher-order levels
of thinking.
This concept of scaffolding also speaks to the second inadequacy in not allowing co-
construction of knowledge and reflection. According to Wood, Bruner, and Ross (1976)
“Scaffolding situations are those in which the learner gets assistance or support to perform a task
beyond his or her own reach if pursued independently" (as cited in Pea, 2004). This support can
come from the environment or the support can also come from other learners in what Vygotsky
(1980) refers to as the zone of proximal development, which emphasizes the need for learner-
learner interactions. Instructor presence is required in the development of the environment and in
the scaffolding of learner-learner interaction using specific communication expectations
(Emelyanova & Voronina, 2014; Kanuka, Rourke, & Laflamme, 2007; Zydney & Seo, 2012).
Further, Gustafson (2014) frames digital powerups as a strategy to scaffold online interactions in
the discussion forums and to empower students with both choice and voice. Providing an
autonomy-supportive learning environment (Lee et al., 2015; Reeve & Jang, 2006) allows for the
co-construction of knowledge and reflection between learners (Cho & Cho, 2016).

Figure 3. Sample Canvas LMS discussion thread with digital powerups.
The third inadequacy, that discussion posts become buried in the threads, is not addressed
with the digital powerups strategy inherently, but it can be addressed when paired with an LMS,
like Canvas, that supports specific features to address this issue.
1.3.2. Features of the LMS for online discussions
Although online discussions have the potential to provide a space for CoI, one of the
inadequacies of online discussion forums is that higher quality discussion posts can get lost in
the discussion threads. These concerns have been observed in a variety of LMSs, including
Moodle, Blackboard, D2L and others. However, there are certain affordances that an LMS can
provide to users that directly impact the success of students. For example, “An effective LMS
must make it easy for students to find what they need when they need it… [and] it must facilitate
easy communication, both through informal contact and formal feedback” (Rubin, Fernandes, &
Avgerinou, 2012, p. 50).
Along with the digital powerups strategy, there were additional features that were utilized
in the Canvas LMS specifically to help mitigate some of the common pitfalls associated with
online discussions. Threaded discussions were enabled, which impose a hierarchical
structure on the forum to show linear continuity in the interactions. In other words, as shown in
Figure 3, each student’s initial post (labeled in Figure 3 with a) appeared in the forum, along with
the comments (labeled in Figure 3 with b) from peers. Canvas LMS uses white space to indent
the comments below each initial post as a visual cue to the student of the hierarchy in the thread.
To address the constraint of not allowing co-construction of knowledge another feature
borrowed from social media was enabled. The Canvas LMS discussion feature of “liking” was
used to promote engagement and external motivation because students who earned the most
“likes” from their peers earned bonus points for the discussion assignment that week. This design
decision was made to create a sense of social responsibility and community in posting content
that could contribute to the class and appeal to peers, which provided all students with the
“opportunity to contribute to teaching presence” (Garrison, 2011, p. 62). This also allowed an
associated feature to be used that mitigated relevant content being buried in the threaded
discussion. Specifically, posts were initially sorted in the discussion thread chronologically by
date of initial post, but as posts started receiving “likes” from students, the LMS reordered the
posts from the most likes to the least likes. This allowed students to curate the discussion forums,
which along with the comments and discourse in the thread led to “co-constructing knowledge
while engaging with course content” (Moreillon, 2015, p. 44). This again not only helped to
scaffold engagement in curating the best quality posts, but was intentionally implemented to lead
students toward feelings of responsibility to the group to write quality posts.

Figure 4. Canvas LMS discussion posts including social media design elements.
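The like-driven reordering described above can be sketched as a simple sort: most likes first, ties broken chronologically by initial post date. This is illustrative only; the post records and field names below are hypothetical, and Canvas performs this sorting internally.

```python
from datetime import date

# Hypothetical post records; Canvas's actual data model is internal to the LMS.
posts = [
    {"author": "A", "likes": 2, "posted": date(2019, 1, 8)},
    {"author": "B", "likes": 5, "posted": date(2019, 1, 9)},
    {"author": "C", "likes": 2, "posted": date(2019, 1, 7)},
]

# Sort by most likes first (negated count), breaking ties by earliest post date,
# mirroring the reordering behavior described above.
ordered = sorted(posts, key=lambda p: (-p["likes"], p["posted"]))
print([p["author"] for p in ordered])  # ['B', 'C', 'A']
```

The effect is that peer-curated posts rise to the top of the thread rather than being buried by recency alone.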
Attempting to encapsulate the overarching social-constructivist learning philosophies of
Bruner, Piaget, and Vygotsky, the authors Conrad and Donaldson (2011) explained that “humans
do not learn in a vacuum but rather through interaction” (p. 2). And given that “In higher
education, quality social interaction matters for student success” (Chen, Chang, Ouyang, &
Zhou, 2018, p. 22), instructors should intentionally plan to engage students in learning using
virtual environment functionality that they are already familiar with. For digital natives taking
online courses (Thurston, 2018), the virtual environment that many are most familiar with is
social media. The discussion thread feature in Canvas LMS is designed to have the same look,
feel, and functionality as a social media platform (Lee & Bonk, 2016). As depicted in Figure 1,
media can also be embedded in discussion prompts, which further resembles social media.
As shown in Figure 4, included in a Canvas LMS discussion post are design elements that
are also found in social media platform posts. The first recognizable feature of a Canvas LMS
discussion post provides personally identifying information about the individual who made the
post (labeled in Figure 4 with a). These identifying items include an avatar, the student name,
and the date the post was made. Next, students can quickly reply (labeled in Figure 4 with c) in a
thread with additional comments and replies, and others participating in the discussion have the
opportunity to give a “thumbs-up” or “like” for each discussion post (labeled in Figure 4 with d).
The digital powerups strategy builds on the features inherent in the discussion forum as a social
media interaction space by introducing hashtags (labeled in Figure 4 with b) that serve first as a
reminder of the prompt being addressed, and secondly as a tag or marker to quickly indicate the
level of Bloom’s that is being engaged. Not only does the powerup indicate the level at which
students are engaging (lower, mid, or higher), but also the prompts associated with the powerups
scaffold or frame the student responses. Often students engage in the lower levels (#remember,
#understand) of Bloom’s based on the design and facilitation of the discussion (Gao, Zhang, &
Franklin, 2013), but the powerups nudge students into engaging in discussions in the mid-levels
(#apply, #analyze, #evaluate) and the higher-order levels as well (#create, #connect).
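The cueing described above amounts to a lookup from hashtag to Bloom’s level. The sketch below is illustrative only: the mapping follows the grouping named in this paragraph, the post text is invented, and the study itself did not automate this classification.

```python
import re

# Mapping of powerup hashtags to the Bloom's groupings named in the text
# (lower, mid, higher). This mapping is an assumption drawn from the paragraph above.
POWERUP_LEVELS = {
    "#remember": "lower", "#understand": "lower",
    "#apply": "mid", "#analyze": "mid", "#evaluate": "mid",
    "#create": "higher", "#connect": "higher",
}

def powerup_levels(post_text):
    """Return the Bloom's level of each powerup hashtag found in a post."""
    tags = re.findall(r"#\w+", post_text.lower())
    return [POWERUP_LEVELS[t] for t in tags if t in POWERUP_LEVELS]

post = "#connect This reminds me of scaffolding. #apply I used this in my class."
print(powerup_levels(post))  # ['higher', 'mid']
```

A reader scanning a thread gets the same signal at a glance: the hashtag announces the level before the paragraph is read.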
1.3.3. Planning and implementing the digital powerups strategy
Positionality and context are critical factors in the way this case study is framed because I
planned the implementation of the digital powerups strategy in the semester before it was used
with students. I served in the role of instructor and was planning the digital powerups strategy as
an instructional improvement to the course and not as a research intervention. Therefore, the
explanation of the planned implementation focuses on both the instructional design aspects of the
teaching strategy, and the instructional facilitation aspects.
As I prepared to teach an online graduate course focused on exploring a variety of
innovative teaching and leadership strategies, it became important for me to follow the
guidelines Gustafson (2016) provided as a brief conceptual overview of digital powerups and
how this strategy can be employed to progressively push students toward higher-order
engagement in online discussion forums. Gustafson (2014) also described digital powerups as
scaffolds for each level of the Bloom’s revised taxonomy action verbs. The importance of both
choice and personalization as motivational factors is central to this strategy because students are
allowed to choose two to four of the powerups in responding to prompts and use additional
powerups in comments to engage with peers. Gustafson (2016) presented digital powerups as
badges students earned when engaging at the different levels, and students included a code in
their post, but I adapted this so that students used the powerup verbs as tags or hashtags to
structure and organize their discussion posts (e.g., #create). Table 1 shows each powerup and the
corresponding prompts.
Table 1
Digital powerups for online discussions.
[Prompt text truncated in this copy; each row pairs one powerup hashtag (#remember, #understand, #apply, #analyze, #evaluate, #create, #connect) with its corresponding response prompt.]
Note. Adapted from “Figure 8.3 Digital Power-Ups” (Gustafson, 2016, p. 115).
In moving toward implementation, additional details needed to be clarified beyond
Gustafson’s recommendations for successful facilitation. Therefore, I developed additional in-
depth instructions for the students that were posted in each of the weekly discussions. This was
necessary because discussions should be “explicitly described [with] well-structured prompts [to]
support the students to interact and co-construct higher-order knowledge” (Zydney & Seo, 2012,
p. 78). Indeed, “course design with clear guidelines, expectations and scaffolds for participation
in online discussions as well as a high level of leadership by a course teacher are necessary for
students to take a deep approach to learning” (Joksimović et al., 2015, p. 642). Therefore, well-
structured prompts were provided each week based on the course content for each module, and
the seven digital powerup options shown in Table 1 were provided for students to choose from to
create their initial posts using a hashtag and the key action verb from Bloom’s.
Again, as per Gustafson’s (2014) suggestions, students were asked to use two or three
powerups in their initial post, and an additional powerup in at least one comment to a peer’s post.
Although every student was asked to respond to the same prompt, they were approaching the
prompts from different perspectives and drawing on their own learning experience paired with
their professional experience. I provided students with a rationale for participation, and the
powerups provided choice and personalization in the way students engaged in the weekly
discussions. Students would label each digital powerup used in their post with the corresponding
hashtag. When used on social media platforms, hashtags provide a construct to “efficiently
aggregate dialogue within a specific subject domain, allowing users to contribute and view
relevant content in one place” (Chiang et al., 2016). Hashtags were used in the discussion forums
as labels or tags representing the powerups or prompts being addressed. For example, if a student
chose to address the prompt #create, then they would label that portion of their discussion post
with “#create” before or after the paragraph to indicate the specific prompt they were addressing
as shown in Figure 4 with label (b).
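Because every powerup appears in post text as a literal hashtag, usage frequencies (such as the descriptive statistics reported in the abstract) can be tallied directly from the posts. The example posts below are invented for illustration and do not reproduce study data.

```python
import re
from collections import Counter

# Invented example posts; in the study, frequencies were computed over the
# actual initial posts and comments of the 13 enrolled students.
posts = [
    "#remember The key term was scaffolding. #connect This links to Vygotsky.",
    "#create I drafted a model of the forum. #remember Action verbs anchor it.",
    "#connect This parallels social media norms.",
]

# Count every hashtag occurrence across all posts.
tag_counts = Counter(
    tag for post in posts for tag in re.findall(r"#\w+", post.lower())
)
print(tag_counts.most_common(2))  # [('#remember', 2), ('#connect', 2)]
```

The resulting counts per powerup can then feed the correlation and regression analyses described later in the study.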
1.4. Purpose of this research
The aim of this study was to explain the design and implementation of the digital
powerups instructional strategy for asynchronous discussions in a graduate level online course.
Along with the intentional design decisions, this study aims to explain student engagement
through the perceptions and behaviors of students engaging with the strategy in situ. This study
considers how digital powerups impacted student engagement with course content, interactions
with other students in online discussions, and interactions with the instructor. To effectively
explain this case, two questions were established using the CoI framework. Question one
addresses cognitive presence, and question two addresses social presence. The questions were
first explored quantitatively, and then qualitatively, in a sequential manner, to allow for analysis,
explanation, and to provide a complete case. Through the lens of CoI, the following research
questions are addressed in this study:
1. How did student preferences of digital powerups and instructor presence
impact student cognitive presence in online discussions?
2. How did the digital powerups strategy impact social presence in online discussions?
2. Methodology
2.1. Mixed-methods approach
This mixed-methods sequential explanatory research design (Creswell & Plano Clark,
2011) followed a two-phase dependent case study approach as the data collected and analyzed in
the first phase were used to inform the data and analysis in the second phase (Ivankova,
Creswell, & Stick, 2006). An explanatory case presents a distinct approach to explaining a
phenomenon, and specifically explaining the student experience (Fetters, Curry, & Creswell,
2013). In this particular case study, there is an intense focus on the digital powerups instructional
strategy as a phenomenon that necessitates a clear boundary for the investigation and to
distinguish between actual evidence and my own interpretation of the results (Harder, 2010).
This empirical inquiry “investigate[d] a contemporary phenomenon in depth and within its real-
life context” (Yin, 2009, p. 18) in order to illustrate a unique case (Creswell, 2012) that took
place within an educational setting and that was bounded by both time and place (Stake, 1995).
Specifically, this case study narrowly focused on one online course during one semester within
the discussion forums where the digital powerups strategy had been implemented. Yazan (2015)
specified that case studies must have clear boundaries, and this study utilized one of the most
effective ways of bounding a case by using both time and activity as the boundary (Baxter &
Jack, 2008). Therefore, this case study focused on the time in which learners specifically
participated in the online discussion forums in which the digital powerups strategy had been implemented.
2.1.1. Data sources
The data for this study were collected within the Canvas LMS of a concluded online
graduate course at a public university in the western United States and de-identified by an honest
broker (Qayyum, Zipf, Gungor, & Dillon, 2019) to protect student identity. Data collection and
analysis followed a two-phase design (QUAN → qual). All quantitative data were collected and
analyzed in the first phase. The data and analysis from the QUAN phase were used to inform the
collection of the qual data in the second phase. Sampling in this study is convenience purposive (Creswell, 2012) and homogeneous in nature, targeting a sample of graduate students with a
common occupation and common educational background (Patton, 1990). The course used in
this case study is offered through an instructional leadership program that culminates in a
master’s degree in education and administrative licensure. Students in this course are in-service teachers or administrators in primary and secondary schools in K-12 school districts. This convenience purposive sample was an important factor in this study, as the students in
the course were especially knowledgeable about instructional practice and could provide detailed
insights into their own learning processes and engagement strategies throughout the experience
(Creswell & Plano Clark, 2011). Participants for the study included the 13 graduate students
(seven females and six males) in the course, selected as a purposeful sample.
This study utilized de-identified archival data. Data collection sources included learning
analytics of student behavior and participation in the Canvas LMS or trace data, midterm
anonymous student survey results (Appendix A), and the digital powerups used within the
boundaries of the 11 class discussion forums throughout the semester. The “trace data (also
known as log data) [was] recorded by LMS contain[ing] time-stamped events about views of
specific resources…or discussion messages viewed or posted” (Gašević, Dawson, Rogers, &
Gasevic, 2016, p. 68) and can provide interesting insights to the behaviors of students in an
online learning environment. However, Stewart (2017) insists that these trace data don’t tell the
whole story, and additional context of the course, the teaching strategies employed, and the
student perceptions of their experience should be used to triangulate findings. As the instructor
for the course, I created the midterm survey as a means to receive student feedback and insight
on the course, and the survey was not utilized as a validated instrument specific to data collection
for a research study. I utilized the midterm anonymous survey as a qualitative data source, so questions one, three, and six were not used in this study given the Likert-style nature of the
questions. Specifically, I identified questions two, four, and five to be used in Phase 2 of the
analysis as they pertained specifically to the digital powerups. I decided to exclude the other
open-ended questions in this study (seven and eight), because they were more framed toward
aspects of the entire course, and not specific to the digital powerups. In order to provide a rich
description, this study also employs triangulation as “corroborating evidence from different
sources [which was used] to shed light on a theme” (Creswell, 2012, p. 251). The use of multiple
data sources in this case study to support the interpretation of findings provides triangulation to
validate findings (Stake, 1995).
2.1.2. Data collection
The data were collected and analyzed in two main phases (QUAN → qual) in a
“quantitatively driven sequential design” (Johnson & Christensen, 2017, p. 478). As depicted in
Figure 5, this study began with collecting and analyzing the quantitative data in Phase 1 for each
of the research questions, then collecting and analyzing the qualitative data in Phase 2 for each
domain, and finally inferences were made by integrating and combining the data in the narrative
provided in the results and discussion section of the paper. This sequential-dependent design
appropriately allowed outcomes in Phase 2 to emerge from the results in Phase 1 (Onwuegbuzie
& Johnson, 2006).
Figure 5. Data collection and analysis in the two-phase research design (QUAN → qual).
In this study, descriptive statistics were used to report the digital powerups used by students in the discussion threads, separated by those used in initial posts versus those used
in comments to peers. The descriptive statistics or trace data were collected by an honest broker,
and de-identified before I reviewed the data for analysis. Additional trace data of instructor
interactions were also collected and displayed in a multi-variable graph, known as a bubble chart,
which is a visual combination of a scatterplot and a proportional area chart.
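The aggregation behind such a bubble chart maps each (date, domain) pair to the number of distinct students contacted, which then determines bubble area. A minimal sketch with hypothetical interaction records (not the actual course data):

```python
from collections import defaultdict

# Hypothetical interaction log: (date, domain, student_id) tuples
interactions = [
    ("2020-02-03", "inbox", "s1"), ("2020-02-03", "inbox", "s2"),
    ("2020-02-03", "announcement", "s1"),
    ("2020-02-10", "submission_comment", "s3"),
]

def bubble_sizes(log):
    """Map each (date, domain) pair to the count of distinct students
    contacted -- the quantity each bubble's area encodes."""
    groups = defaultdict(set)
    for date, domain, student in log:
        groups[(date, domain)].add(student)
    return {key: len(students) for key, students in groups.items()}

print(bubble_sizes(interactions))
```

Plotting these counts against date (x-axis) and domain (y-axis), with count as bubble size, yields the multi-variable view described above.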
2.2. Data Analysis
2.2.1. Quantitative procedure: Correlation and regression
In Phase 1 (reported in section 3.2.1), I used a Pearson correlation to test whether a relationship existed between the digital powerups used and earning a bonus from peers in the discussion forums. A
Cohen's standard was used to evaluate the strength of the relationship, where coefficients
between .10 and .29 represent a small effect size, coefficients between .30 and .49 represent a
moderate effect size, and coefficients above .50 indicate a large effect size (Cohen, 1988). A
Pearson correlation requires that the relationship between each pair of variables is linear
(Conover & Iman, 1981). This assumption is violated if there is curvature among the points on
the scatterplot between any pair of variables. The scatterplot I produced indicated no curvature, confirming that the relationship between each pair of variables was linear. Skewness and
kurtosis were also calculated to determine a normal distribution for the powerups (Westfall &
Henning, 2013).
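The correlation procedure and Cohen's (1988) effect-size labels can be sketched as follows; the per-student counts here are invented for illustration and are not the study's data:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def cohen_effect(r):
    """Label the effect size of a correlation per Cohen (1988)."""
    r = abs(r)
    if r >= 0.50:
        return "large"
    if r >= 0.30:
        return "moderate"
    if r >= 0.10:
        return "small"
    return "negligible"

create_uses = [0, 1, 1, 2, 3, 0, 2]  # hypothetical per-student counts
bonuses     = [0, 1, 0, 2, 3, 0, 1]
r = pearson_r(create_uses, bonuses)
print(round(r, 2), cohen_effect(r))
```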
Also in Phase 1 (reported in section 3.2.2), I conducted a linear regression analysis to test if using a particular powerup or total powerups used predicted earning a bonus from peers. I approached each question sequentially; therefore, the qualitative data were gathered from the midterm
evaluation and from the final metacognitive assignment addressing online discussions and
student engagement. Given the explanatory nature of this study, I determined that a stepwise
approach to the regression would provide the best opportunity for identifying a model of best-fit.
The dependent variable for the analysis was an earned bonus (Bonus), and the eight independent
variables were the seven individual powerups, and the total number of powerups used segmented
by each student. Therefore, the 'stepwise' variable selection algorithm was chosen to determine
the optimal combination of predictors to include in the regression model. The Akaike
Information Criterion (AIC) was used to indicate model fit (Posada & Buckley, 2004). The AIC
statistic uses the maximized log likelihood value as a baseline for model fit, and adds a penalty
for estimating additional parameters. A smaller AIC statistic indicates improved model fit.
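A minimal sketch of AIC-guided selection, comparing single-predictor models only (the study's stepwise algorithm searched over eight predictors); all counts below are hypothetical:

```python
import math

def ols_rss(x, y):
    """Residual sum of squares for the simple regression y ~ a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx
    a = my - b * mx
    return sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))

def aic(rss, n, k):
    """AIC for a Gaussian linear model: n*ln(RSS/n) + 2k, where k counts
    the estimated parameters (intercept, slope, error variance -> k = 3)."""
    return n * math.log(max(rss, 1e-12) / n) + 2 * k  # guard against RSS = 0

# Hypothetical per-student counts (not the study's data)
bonus    = [0, 1, 0, 2, 3, 0, 1]
remember = [9, 5, 8, 3, 1, 10, 4]
total    = [12, 14, 11, 15, 16, 13, 12]

candidates = {"#remember": remember, "Total": total}
n = len(bonus)
scores = {name: aic(ols_rss(x, bonus), n, 3) for name, x in candidates.items()}
best = min(scores, key=scores.get)  # smaller AIC indicates better fit
print(best, round(scores[best], 2))
```

A full stepwise procedure would repeat this comparison, adding or dropping one predictor at a time until no move lowers the AIC.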
Prior to conducting the linear regression, I examined the assumptions of normality of
residuals, homoscedasticity of residuals, absence of multicollinearity, and the lack of outliers.
Normality was evaluated using a Q-Q scatterplot (Bates, Mächler, Bolker, & Walker, 2014;
DeCarlo, 1997; Field, 2009) comparing the distribution of the residuals. Homoscedasticity was
evaluated by plotting the residuals against the predicted values (Bates et al., 2014; Field, 2009;
Osborne & Walters, 2002), and the assumption was met as the points appear randomly
distributed with a mean of zero and no apparent curvature. High VIFs indicate increased effects
of multicollinearity in the model. VIFs greater than five are cause for concern, whereas VIFs of
10 should be considered the maximum upper limit (Menard, 2009); however, VIFs for
#remember and Total were both calculated at 1.17 showing no cause for concern.
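With only two predictors retained in a model, each predictor's VIF reduces to 1 / (1 − r²), where r is the correlation between the two. A sketch with hypothetical counts:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def vif_two_predictors(x1, x2):
    """VIF for either predictor in a two-predictor model: 1 / (1 - r^2).
    (Undefined when the predictors are perfectly correlated.)"""
    r = pearson_r(x1, x2)
    return 1.0 / (1.0 - r ** 2)

remember = [9, 5, 8, 3, 1, 10, 4]      # hypothetical per-student counts
total    = [12, 14, 11, 15, 16, 13, 12]
print(round(vif_two_predictors(remember, total), 2))
```

Uncorrelated predictors yield a VIF of exactly 1; the value grows toward infinity as the predictors approach perfect collinearity, which is why thresholds of 5 and 10 are used as warning levels.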
To identify influential points, studentized residuals were calculated and the absolute
values were plotted against the observation numbers (Field, 2009; Stevens, 2009). Studentized
residuals were calculated by dividing the model residuals by the estimated residual standard
deviation. An observation with a studentized residual greater than 3.93 in absolute value, the .999 quantile of a t distribution with 12 degrees of freedom, was considered to have significant influence on the results of the model. The studentized residuals for all observations were less than two, and no data points were identified as influential. Out of the eight available independent variables, the stepwise regression analysis resulted in a model with one independent variable (#remember) explaining 46% of the variance in earning a bonus from peers in the discussions.
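The influence screen described above (residuals divided by the estimated residual standard deviation, flagged when the absolute value exceeds the cutoff) can be sketched as follows, using invented residuals:

```python
import math

def studentized_residuals(residuals):
    """Divide each residual by the estimated residual standard deviation
    (a simplified, internally studentized version)."""
    n = len(residuals)
    s = math.sqrt(sum(r ** 2 for r in residuals) / (n - 1))
    return [r / s for r in residuals]

def influential(residuals, cutoff=3.93):
    """Indices of observations whose |studentized residual| exceeds the cutoff."""
    return [i for i, t in enumerate(studentized_residuals(residuals))
            if abs(t) > cutoff]

resid = [0.3, -0.5, 0.1, 0.8, -0.4, 0.2, -0.6]  # invented model residuals
print(influential(resid))  # [] -> no influential observations
```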
2.2.2. Qualitative procedure: Explanatory weaving narrative
Using the de-identified qualitative data from the open-ended midterm survey questions
(Appendix A) I analyzed and coded student responses for emerging patterns and themes
(Creswell, 2012). I analyzed the qualitative data from question two using deductive category
application (Mayring, 2004) based on common barriers and supports that adult learners
encounter in online courses from the literature. The coding scheme included five common barriers: lack of time, lack of motivation, difficulty using technology, lack of resources, and lack of technical support (Bonk, Lee, Kou, Xu, & Sheu, 2015; Michinov, Brunot, Le Bohec, Juhel, & Delaval, 2011). The coding scheme for supports was derived from the same literature with six common supports: exploration of topics, abundant access to technology, choice and control over activity, high motivation, abundant technical support, and abundant instructor support. After I coded
the data, I grouped the responses by theme, and I also analyzed student responses from both
questions two and four for specific exemplars that were also grouped by theme (Bernard,
Wutich, & Ryan, 2016). The coding scheme I used to analyze the data from question four was
based on the “phase descriptors used to code student responses” (Garrison, Anderson, & Archer,
2001, p. 2) from the CoI framework; triggering event, exploration, integration, and resolution.
Question five allowed students to provide more freeform answers than questions two and four, so
I coded the data collected from question five inductively to allow for emerging patterns from the
student responses based on their online discussion experiences (Bernard et al., 2016). This open
coding approach of constant comparison of the similarities and differences between student
responses to question five is appropriate for the explanatory and revelatory nature of this study
(Berelson, 1952; Yin, 2009) to better understand student perceptions as they participated in the
emerging digital powerups strategy.
The final source of qualitative data was the content from the discussion forums, used to get a better sense of the social presence and peer-to-peer interaction that took place (Lee & Bonk, 2016). I
determined that the best way to explain or illustrate student use of the digital powerups strategy
in-situ would be to utilize the student post that earned the most likes from peers, and by
extension earned the bonus for the final discussion as the exemplar post to be analyzed. In the
final discussion, students were asked to address the topic of “invigorating online discussions”
while using the digital powerups strategy, which revealed the efficacy of the digital powerups
strategy from the student perspective. This also situated students in a social presence
environment encouraging meaningful interactions, which has been found to demand high levels
of learning and critical thinking for online students (Garrison, Anderson, & Archer, 2000).
After I coded and grouped the qualitative data by themes, they were integrated with the
quantitative data in the results and discussion section using the narrative weaving approach
(Fetters, Curry, & Creswell, 2013). Specifically, addressing this type of mixed-methods
approach, Fetters, Curry, and Creswell (2013) further describe integration using narrative
weaving as including the reporting of both quantitative and qualitative results together based on a
particular concept or theme, which are “synthesized through narrative both in the results and
discussion [section]” (p. 2147). Narrative weaving allows for an overarching theme, like
community of inquiry, to serve as a guide for researchers to organize the sometimes messy
complexity of explaining asynchronous social interactions and academic discourse in online
courses. For example, a weaving narrative approach to integration allows a researcher who, as in the case of this study, also served as the instructor in the intentional planning and design of an online course to integrate “designerly ways of knowing” (Cross, 1982, p. 223) into the overall
narrative of the results and discussion. This approach serves to present “an in-depth
understanding of a case” (Creswell, 2012, p. 98) to connect the dots between a variety of quantitative and qualitative data sources (Scammon, Tomoaia-Cotisel, Day, Day, Kim, Waitzman,
Farrell, & Magill, 2013).
The teaching presence aspect, or the structure of the digital powerups strategy, is an integral
part of the overall study, and is explained in great detail in section 1.3. As an explanatory case
study, it is fitting that the integration of the quantitative and qualitative data be reported in both
the results and discussion sections of this paper, and that the selected domains of the CoI
framework (cognitive presence and social presence) serve as the overarching themes to scaffold
the results and discussion. Finally, this study is revelatory (Yin, 2009), because research on the
digital powerups approach does not appear in the literature for online discussions, nor does it
appear in the teaching and learning literature for higher education. This study aligns with Yin’s
(2009) definition of revelatory studies as the digital powerups strategy is currently emerging.
2.3. Illustration and utility
The meaningful integration of quantitative and qualitative methods is a centerpiece of
mixed-methods research and allows for better illustration and utility in explanatory sequential
research designs (Guetterman, Fetters, & Creswell, 2015). This methodological approach allows
for a more complete understanding of the context of intentional course design decisions that were
made before students participated, the implementation and facilitation of the digital powerups
strategy by the instructor, and student perception of several features utilized within the Canvas
LMS including liking, sort by liking, and embedded threads. The mixed-methods approach of
this study is complementary because it seeks the illustration “of results from one method with the
results from the other method” (Schoonenboom & Johnson, 2017, p. 110). After the two main
phases (QUAN → qual), the mixing of quantitative and qualitative results took place in the final
integration or inferential phase (Teddlie & Tashakkori, 2009) to provide illustration and utility
which makes this study more useful for practitioners in both design and instruction (Greene,
Caracelli & Graham, 1989; Schoonenboom & Johnson, 2017).
3. Results and Discussion
3.1. How did student preferences of digital powerups and instructor presence impact student
cognitive presence in online discussions?
3.1.1. Students preferred to use the #remember and #connect powerups most often
To address question one in Phase 1 of the study, frequencies and percentages were calculated for each variable, differentiated by powerups used in initial posts and powerups used to respond to peers in the comments of the discussion threads.

Table 2
Summary of descriptive statistics for digital powerups in initial posts.
Note: Table represents digital powerups used by all students (n=13) in initial posts over 12 discussions. Column “n” represents number of students attempting the powerup at least once.

Table 3
Summary of descriptive statistics for digital powerups in comments.
Note: Table represents digital powerups used by all students (n=13) in the comments over 12 discussions. Column “n” represents number of students attempting the powerup at least once.

None of the 13 students utilized every powerup, so percentages are representative of those students who actually used the individual powerup throughout the semester. As represented in Table 2, the
most frequently observed powerups in initial posts were #remember (110, 31%) and #connect
(71, 20%). Nearly all of the students (n=12, 92%) used the #remember powerup most often in
initial posts when engaging in reflection, whereas in the comments the majority of students (n=7,
54%) used the #connect powerup most often.
As represented in Table 3, the most frequently observed powerups in comments were
#connect (56, 29%) and #remember (51, 26%). This change in frequencies is interesting within a social constructivist framework, because it shows a slight shift toward higher-order powerups in the comments, which is where students tend to engage with peers to co-construct meaning (Joksimović et al., 2015). Students were explicitly instructed
in the directions for the discussions to try each powerup at least once in their initial posts over
the course of the semester. However, just under half of the students (n=6, 46%) actually used
each of the powerups at least once in initial posts, and only one student attempted all of the
powerups at least once in the comments.
In question five of the survey (Appendix A), students were asked to reflect on which
powerups they were using most often, and which powerups were avoided most often. Responses
were coded inductively and then integrated with the quantitative data presented in Tables 2 and 3, showing that 85% (n=11) of the students identified #remember as their most preferred of the
powerups to use. This was not a surprising finding given that students tend to participate in
discussions using the lower levels of Bloom’s (Gao, Zhang, & Franklin, 2013), but was a
somewhat sobering finding because the powerups strategy was intended to mitigate this
behavior. However, the responses for why students chose to use #remember were coded
inductively for emerging themes from experiences using digital powerups in online discussion.
The reported student perceptions for using #remember were: ease of use (n=5, 38%), a way to label key points from the reading for later (n=4, 31%), and alignment with the way students study and reflect on readings (n=2, 14%). Similarly, student responses for why they
avoided certain powerups were coded using constant comparison of the similarities and
differences. The majority of students (n=7, 54%) reported avoiding #apply, #analyze and
#evaluate, which was supported by the descriptive statistics in Tables 2 and 3 as those three
powerups were three of the least used in initial posts and comments. The majority of students
(n=7, 54%) reported having difficulty conceptualizing how to use those specific powerups,
which shows that the mid-level or application level of the powerups were difficult for students to
illustrate in practice. Only one student reported avoiding #create, which was also underutilized in
the discussion forums but proved to be important for earning a bonus from peers.
3.1.2. Instructor presence focused on individualized interactions and assignment feedback
Next in Phase 1, the trace data from the learning analytics available in the Canvas LMS were collected to quantify instructor presence. A bubble chart was intentionally created as a visualization of all instructor interactions with students across the semester in four domains of the course: announcements, discussion posts, Canvas inbox messages, and assignment submission comments (Figure 6). Organizing the quantitative data in a visual format
provides a quick reference to the sustained interaction throughout the course. Figure 6 illustrates
the volume and frequency of instructor communication with students across the semester via
announcements, discussion posts, inbox messages and assignment comments. The larger a
bubble appears, the more students the instructor interacted with on that date and in that domain.
Of particular note were the interactions using the individualized Canvas messages, and
the individual submission comments provided to students in each of the assignments. This was
an intentional instructional decision because I viewed individual interactions with students as important in a CoI framework to keep communication lines open with students and provide timely feedback. My participation as the instructor in the discussion forums was intentionally kept to a minimum, although individualized messages via the Canvas Inbox and submission comments were utilized much more often.
Figure 6. Instructor presence: Interactions with students using announcements, discussion posts, Canvas LMS inbox messages, and submission comments.
3.1.3. Cognitive presence was best supported by digital powerups and instructor presence
Phase 2 began with analyzing student responses in the anonymous survey to provide
additional context. Students were asked to identify both their barriers and their supports in
completing the discussions in this course in question two of the anonymous survey (Appendix
A). Students identified their perceived barriers in an open-ended question format, and their
responses were coded to reflect common themes among students. Student responses for barriers
to completing the discussions included: lack of time or time constraints (n=8, 57%), and
difficulty using the LMS technology (n=6, 43%). These results are consistent with common
barriers reported by adult learners in online courses in the literature (Bonk, Lee, Kou, Xu, &
Sheu, 2015; Michinov, Brunot, Le Bohec, Juhel, & Delaval, 2011). Of particular interest were
the supports identified because the majority of the students pointed to the discussion structures or
digital powerups themselves as supports (n=8, 57%), with the remainder of the students
identifying instructor presence as their biggest supports (n=6, 43%). Comments from students
that were coded as exemplars for the assignment structures included:
[The powerups] have challenged me in ways I never thought possible while still allowing
my creativity to thrive.
I have been inspired to step outside of my comfort zone, and [the powerups] have helped
in that now I can apply new skills to support teachers, and student learning.
Comments from students that were coded as exemplars for instructor presence speak to
asynchronous communication between the instructor and the students, included:
I appreciate [instructor] reminders and written personal responses…to help me know that
I’m on the right track.
I appreciate the one-on-one feedback and quick answers to questions as I work through
the [discussions].
I explored the concept of supports further using the student responses from question four
of the midterm survey. Students were asked specifically how the powerups scaffolded the way
they prepared for and engaged in the online discussions for the course. The nine student
responses to question four were coded deductively using the way in which they approached
cognitive presence: exploration or critical thinking (n=3, 33%), resolution or application of
content (n=3, 33%), and integration or construction of meaning (n=2, 22%). These are interesting results; however, it should be noted that in CoI students often explore the various phases
iteratively (Garrison, Anderson, & Archer, 2001) and don’t remain in just one phase or another.
Further, the intended use of the digital powerups was to support both cognitive and social
presence by intentional and structured teaching presence. Student comments coded as exemplars
revealed additional insights that indicate employing the digital powerups strategy impacted their
cognitive presence:
In some ways the powerups help me to engage more specifically with the content rather
than just making vague references or going on my "gut" feelings or personal experiences.
They add focus and intent. [The powerups] help eliminate most cheap, content-poor
[The powerups] allow me to make connections that are important to me, and relevant to
me. They also help me to see different ways to consider topics.
The powerups made me more aware of how I am learning, and I’ve had to do a lot more
reflection on what I read. Honestly, I wasn't too excited about them at first, but I now
know that they are increasing my learning capacity.
Each of these student comments point to the impact of cognitive presence when utilizing
the digital powerups. These qualitative responses from students support some of the foundational
findings of CoI from Garrison and Cleveland-Innes (2005) that instructor-to-student interaction,
or the humanized elements of a course are vital in establishing cognitive presence. Further,
approaching this question by mixing quantitative and qualitative results highlights ideas from
Gašević, Dawson, Rogers, and Gasevic (2016), who concluded that utilizing data only from
learning analytics provides shallow results because of variations in the approaches students take
to utilizing technology, engagement patterns in the LMS, and overall cognitive presence. In the
integration of the quantitative and qualitative data in this section I not only gained insights into
how students were using the digital powerups, but found that the digital powerups strategy
actually impacted the way students approached engaging in the content before they began
interacting in the discussion forum. This is a point that should be further explored in future research.
3.2. How did the digital powerups strategy impact social presence in online discussions?
3.2.1. Correlation between using #create and earning bonus from peers
To address social presence using the digital powerups in Phase 1, I decided to first
investigate whether there was a statistically significant relationship between the use of the
different digital powerups and earning a bonus from peers. Therefore, the quantitative approach
began with a Pearson correlation analysis which was conducted between each of the digital
powerups and the bonus from peers. The observations for Bonus had an average of 1.08 (SD =
1.04, SEM = 0.29, Min = 0.00, Max = 3.00). The analysis showed there was a statistically
significant (p < .05) positive correlation between #create and bonus (rp = 0.56, p = .049). The
correlation coefficient between #create and bonus was 0.56 indicating a large effect size (Cohen,
1988). This indicates that students who used the #create powerup more often had a higher chance of earning a bonus from their peers.
This is an encouraging finding because one of the goals of the digital powerups strategy
is to move students out of the lower levels of Bloom’s so that “higher-order learning emerges in
a community of inquiry” learning environment (Garrison & Cleveland-Innes, 2005, p. 137).
Higher-order student engagement is often referred to as deep learning, which is characterized by
students internalizing the content and finding personalized meaning. Although it is common for
students to put forth only minimal effort toward required outcomes when instructors adopt a non-
individualized approach to instruction (Trigwell, Prosser, & Waterhouse, 1999), the results of
this correlation highlight the concept of humanized learning. When students can incorporate
relevant experiences and knowledge into their work and feel validated in doing so, students
naturally tend to shift from surface level understanding to higher-order engagement.
3.2.2. Linear regression revealed use of #remember lowers chance to earn a bonus from peers
Also in Phase 1, I conducted a stepwise linear regression analysis to understand the
relationship between the dependent variable of earning a bonus from peers (Bonus) and the
independent variables of #remember, #understand, #apply, #analyze, #evaluate, #create,
#connect, and the total digital powerups (Total) used as predictors. That is, for the regression
analyses, the dependent variable was calculated as the Bonus earned from peers, and each
individual powerup used per student and the total number of powerups used per student were
processed as the predicting variables. The best-fit model retained #remember as the only significant predictor. The overall model was significant (F(2, 10) = 4.23, p = .047), explaining approximately 46% of the variance in Bonus, and the #remember coefficient was itself significant (t = -2.87, p < .05): on average, a one-unit increase in #remember use decreased the predicted value of Bonus by 0.35 units. In other words, the students who used #remember the most had a lower chance of earning a bonus from their peers. Interestingly, the total number of powerups used overall did not significantly predict earning a bonus (t = 1.51, p > .05); based on this sample, a one-unit increase in total powerups has no significant effect on the bonus earned from peers. Table 4 summarizes the results of the regression model for #remember and for the total number of powerups used per post.
It should be noted that the #remember powerup was used in some of the posts that ended
up receiving a bonus from peers. The result of this regression highlights the fact that the three
students who used the #remember powerup the most throughout the semester were also the three
students who did not receive a bonus from their peers in any of the weekly discussions.
Causation between using #remember and not earning a bonus is not implied here. Rather, this finding suggests that students in this course found greater value in peer posts that moved away from lower-level engagement (like #remember) and toward higher-order levels, a finding also confirmed in the literature (Akyol & Garrison, 2008; Garrison et al., 2001).
Table 4
Best-fit model summary for stepwise linear regression with #remember predicting bonus.

Predictor       B       95% CI
(Intercept)     1.77    [-1.53, 5.07]
#remember      -0.35    [-0.62, -0.08]
Total           0.08    [-0.04, 0.21]

Note. Results: F(2,10) = 4.23, p = .047, R2 = 0.46, n = 345
Unstandardized Regression Equation: Bonus = 1.77 - 0.35*remember + 0.08*Total
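Read as a prediction rule, the reported unstandardized equation can be applied directly to a student's powerup counts. A minimal Python sketch (the function name and example counts are illustrative assumptions, not values from the study):

```python
# Predicted bonus from the chapter's reported unstandardized equation:
#   Bonus = 1.77 - 0.35*remember + 0.08*Total
# The example counts below are hypothetical, for illustration only.

def predicted_bonus(remember_count: int, total_powerups: int) -> float:
    """Apply the best-fit model's coefficients to one student's counts."""
    return 1.77 - 0.35 * remember_count + 0.08 * total_powerups

# A hypothetical student who used #remember 5 times out of 10 total powerups:
print(round(predicted_bonus(5, 10), 2))  # prints 0.82
```

Consistent with the result above, each additional use of #remember lowers the predicted bonus by 0.35, while each additional powerup overall contributes only a small, non-significant 0.08.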
3.2.3. Students perceived digital powerups to support social presence and authentic engagement
Although there are a number of quantitative measures typically used in discussion forum analysis, including keywords used, total words used, semantic analysis, and quality of responses (Marra, Moore, & Klimczak, 2004), I determined that the qualitative responses from students were far more meaningful for the explanatory nature of this case study. The final discussion required students to engage in a metacognitive exercise comparing
their previous experience of engaging in online discussions to the current semester and more
specifically to this course utilizing digital powerups. Therefore, Phase 2 led to qualitative
analysis of the discussion post by the student who earned the most likes from her peers, who will
be referred to as Lucia to protect her identity. Lucia prefaced her initial post to her peers by
explaining that this discussion topic intrigued her because she was taking two online courses
concurrently, and both courses relied heavily on the discussion forums as the space for students
to interact and engage with each other. She explained that although this course utilized the digital powerups strategy, the other course required every student to answer the exact same question and then forced each to comment on at least two other posts, an approach she perceived as producing inauthentic, forced discourse. Riggs and Linder (2016) describe the kind of discussion design Lucia encountered in the other class: “When asked to line up and answer in this manner, very little is said—and in great, repetitive volume” (p. 7).
Lucia further reflected on her experience with digital powerups in this course:
“I am personally learning and growing by connecting with others. I feel that people's
comments are more "real" in this class, in that there is an element of safety to say things
the way they really are and admit when things are hard or beyond our current
capabilities. I feel that our comments and contributions are authentic and that I have
learned so much from the great things all of you are doing out in your classrooms,
schools, and districts.”
After sharing a few more of her perceptions of how she felt online course discussions
should be presented to students, she succinctly shared her overall takeaways from the course.
Like many of her classmates had done throughout the semester, Lucia chose to frame those
takeaways using the #create digital powerup:
#create How can I EMPOWER my students:
Each opinion is important. Create ways for all voices to be heard.
Motivation is key to learning. Games, competition, and debate create motivation.
Participation that is meaningful is much better than participation that is required.
Online doesn't mean impersonal. Make online encounters engaging.
What works for one class or students may not work for all. Keep trying new things.
Emphasize the process, not the product.
Relinquishing some control to the learner is good teaching.
Lucia’s acrostic on how to empower students in online discussions is not only a stellar example of the #create powerup but also a meaningful culmination of her experience with the course content. Further, her final takeaway could serve as a guiding philosophy for future instructors interested in using the digital powerups strategy. Her response
appears to have resonated well with her classmates given the number of likes and comments she
received in response to her post. This final post was an ideal exhibit of how “the interactions related to social presence illustrate that the students were willing to share their experiences and encourage interactions that were related to the assignments given to them” (Annamalai & Tan, 2016, p. 12), and of how “higher-order learning emerges in a community of inquiry” (Garrison & Cleveland-Innes, 2005, p. 137).
4. Conclusion
This case study contributes to the literature on CoI and online discussions by focusing on
an instructional strategy that employs a social constructivist approach to engaging students in
online discussions by creating a humanized learning environment using scaffolds and hashtags.
Although the current literature addresses the use of Bloom’s revised taxonomy to improve
questions and prompts in online discussions (Whiteley, 2014; Yang et al., 2010) and the use of
CoI for student engagement in online courses (Garrison et al., 2010; Swan, 2005; Zydney, 2014),
this study is the first to address the design and implementation of the digital powerups strategy in
online discussions. The findings from this study contribute to practice (both design and
instruction) by explaining an effective strategy for improving student engagement in online
course discussions. The intentional course design and instructor scaffolding inherent in instructor
presence is vital to the successful implementation of CoI framed strategies like digital powerups
(Garrison, 2009; Joksimović et al., 2015; Zydney et al., 2012).
4.1. Implications for practice
Online discussions can “allow students to participate actively and interact with students
and faculty” (Baglione & Nastanski, 2007, p. 139); however, without proper course design and
teacher presence, online discussions tend to focus only on a “lower level of thinking and discourse” (Christopher et al., 2004, p. 170). Many factors contribute to this deficiency, and pushing students into higher-order thinking is challenging; it requires the instructor to provide appropriate scaffolding (Kanuka et al., 2007; Whiteley, 2014).
Through the intentional design of the online discussions for this course, it was initially
anticipated that using the digital powerups strategy would push students into the higher-order
levels of engagement, but the #remember powerup ended up being used most often by students.
At face value, this appears to be a disappointing finding; however, students were less likely to
earn the bonus from their peers when using the #remember powerup. Students also identified that
they chose to use #remember most often because it was easiest to use, it was a good way to label
key points from the course readings, and it naturally fit with the way students engaged in
cognitive presence. In other words, often students were reading for basic understanding.
However, perhaps more importantly, the #create (higher-order level) powerup was more likely to earn a bonus from peers. Although this could be attributed to peers recognizing the higher-order nature of #create and awarding a like, it is more likely because the #create posts included video, poetry, deliverables, and other media that were more appealing than written text and more personally applicable to the current practice of these students, who are in-service K-12 teachers.
This study confirmed that instructor presence (Bradley et al., 2008; Mazzolini &
Maddison, 2007; Rovai, 2007; Salmon, 2004) is a key factor in supporting student engagement
in online discussions. I recommend that future implementations take a balanced approach to both
structure and autonomy (Gilbert & Dabbagh, 2005; Jang, Reeve, & Deci, 2010) because “overall
the design of the online space is the key factor to fostering and maintaining a functional
community of inquiry” (Moore, 2011, p. 19). A balance of structure and autonomy can be
achieved by using digital powerups and creating a humanized learning environment that provides
explicit instructions and expectations but also allows for students to personalize their learning.
As evidenced in the results from question one, my pre-emptive investment in course design and the additional supports I provided as the instructor appear to have positively impacted the overall student learning experience, strengthening both cognitive presence and social presence for students. “To establish and maintain a community of inquiry requires a
thoughtful, focused and attentive teaching presence” (Garrison et al., 2010, p. 32), and although
digital powerups are one piece of the puzzle, it is also important to consider other humanizing
aspects of the course that scaffold students toward success. I found in this study that the
individual interactions with students were perceived as important supports by the students.
In examining the results of student perceptions and use of the digital powerups strategy, it
is perhaps most important to consider all of the ways in which online courses can be humanized.
Humanizing a course can be done by designing for an architecture of engagement (Riggs &
Linder, 2016), which centers on developing a sense of community in a virtual environment
(Kilgore, Bartoletti, & Al Freih, 2018), and by intentionally planning for personalized
communications from the instructor to students throughout the course (Pacansky-Brock, 2012).
Finally, in question eight of the anonymous survey, in responding to what this course gets right,
one student addressed the digital powerups approach, and profoundly responded that “Obviously,
it is not lost on me that if I love [the digital powerups] so much my own students would also
equally appreciate a change of pace once in a while.” This student response speaks to the idea
that humanizing instruction is truly about doing what is best for students.
The results from the study suggest that the implementation succeeded in engaging most students at every level of Bloom’s taxonomy in the discussion forums, and student perceptions of learning and engagement were high. This study provides a jumping-off point for
additional iterations of this strategy in higher education online courses and for future research on
the digital powerups strategy. There are many ways the digital powerups strategy can be
implemented in online or blended courses and with various approaches to instructor presence.
Future research should continue to improve upon the initial implementation and results explained
in this study.
4.2. Limitations and threats to validity
There are some limitations to the present study that need to be acknowledged. The
importance of a subjective lens (Flick, 2014) is considered in all data and results produced from this study; however, I served as the instructor of the course under study, and my interpretation of the intentional design results was an integral part of the overall study, so this must be taken into consideration when applying the strategies in a different context with a different instructor.
Reactivity must also be considered as a threat to validity in this study because students may have
responded in ways that avoided impacting their grades negatively. However, the midterm survey
was provided in an anonymous format, and the identities of students were not at risk. Although
considered a limitation, the convenient purposive sample (Creswell, 2012) was intentionally
investigated in order to gain a better understanding and insight into this instructional strategy
from the student perspective (Song, Singleton, Hill, & Koh, 2004). Also, the results come from
only one course that included only 13 students, so findings from this study should not be
considered to be generalizable. However, as a revelatory study (Yin, 2009) the results do provide
a starting point for future research on the digital powerups strategy. The methodological
decisions in coding and the coding schemes produced may have emerged differently from a
different researcher and may have led to different findings. Finally, in the stepwise regression,
although the data were cleaned to segment by student, I did not statistically account for the
nested nature of the multiple powerups used by each student in initial posts and comments.
It is also worth noting that one criticism of the effectiveness of the digital powerups
strategy is that if a student makes the first post in a given week, every other student will see that
student’s post, so it could potentially earn the most likes simply by virtue of being the first post.
To address this potential criticism, I conducted a Pearson correlation analysis between how early
an initial discussion post was made and earning a bonus from peers. There were no statistically
significant findings that connect posting early to earning a bonus from peers.
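The first-poster check described above can be outlined in a few lines; the post-order and bonus values below are fabricated placeholders for illustration (not the study's data), and pearson_r is a hand-rolled helper rather than the statistical package used in the study.

```python
# Sketch of correlating posting order with peer bonus. All data are
# hypothetical; a weak, non-significant r would mirror the chapter's finding.
from math import sqrt

def pearson_r(x: list[float], y: list[float]) -> float:
    """Pearson product-moment correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

post_order = [1, 2, 3, 4, 5, 6, 7]   # order in which each student first posted
bonus = [0, 2, 1, 3, 0, 2, 1]        # bonuses each student earned from peers

print(round(pearson_r(post_order, bonus), 2))  # prints 0.14 for these values
```

A correlation this close to zero (with a correspondingly large p-value) is the pattern the analysis above reports: posting first conferred no detectable advantage.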
4.3. Future directions
Given the results of this study, future iterations of the digital powerups strategy could provide additional incentives or structures for students to use one powerup from each of the three tiers of Bloom’s taxonomy (lower level, mid-level application, and higher-order) to ensure that every student engages at the mid-level and higher-order levels each week. This could also be facilitated by assigning differentiated point values to the three tiers: for example, a lower-level powerup would earn one point, an application-level powerup two points, and a higher-order powerup three points. I see this strategy as still emerging, so additional changes and modifications should be explored by other instructors.
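The differentiated point values proposed above can be sketched as a simple scoring scheme. The tier assignment of each powerup is my assumption based on the chapter's three Bloom's-based levels, and the function and table names are illustrative:

```python
# Hypothetical tiered scoring for digital powerups: 1 point for lower-level,
# 2 for mid-level application, 3 for higher-order. Tier groupings are an
# assumption for illustration, not a scheme defined in the study.
POWERUP_POINTS = {
    "#remember": 1, "#understand": 1,             # lower level
    "#apply": 2, "#analyze": 2,                   # mid-level application
    "#evaluate": 3, "#create": 3, "#connect": 3,  # higher-order
}

def score_post(powerups_used: list[str]) -> int:
    """Total points for one discussion post under the tiered scheme."""
    return sum(POWERUP_POINTS.get(tag, 0) for tag in powerups_used)

print(score_post(["#remember", "#create"]))  # 1 + 3 = 4
```

Under a scheme like this, a post pairing #remember with #create outscores one using #remember alone, nudging students toward the higher-order tiers each week.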
Future studies could evaluate whether the digital powerups used by students actually engage them at the corresponding level of Bloom’s taxonomy. For example, if a student uses the #apply powerup in an initial post, it would be interesting to identify whether the student is actually engaging in that mid-level range of Bloom’s in their approach to the discussion. Because the digital powerups strategy was designed specifically around Bloom’s taxonomy, this study focused on higher-order discussions; however, future studies may explore how digital powerups engage students in the four categories of cognitive presence from the CoI framework: triggering event, exploration, integration, and resolution (Rolim, Ferreira, Lins, & Gašević, 2019).
References
Afolabi, R. O. (2016). Emphasizing instructor presence in digital learning environments. In Creating teacher immediacy in online learning environments (pp. 116-117). IGI Global.
Akyol, Z., & Garrison, D. R. (2011). Understanding cognitive presence in an online and blended
community of inquiry: Assessing outcomes and processes for deep approaches to
learning. British Journal of Educational Technology, 42(2), 233-250.
Allen, I. E., Seaman, J., Poulin, R., & Straut, T. T. (2016). Online report card: Tracking online
education in the United States. Babson Park, MA: Babson Survey Research Group and
Quahog Research Group, LLC.
Andresen, M. A. (2009). Asynchronous discussion forums: Success factors, outcomes,
assessments, and limitations. Educational Technology & Society, 12(1), 249-257.
Annamalai, N., & Tan, K. E. (2016). Social presence of the Community of Inquiry (CoI) model
on an online narrative writing platform via Facebook. Malaysian Journal of ELT
Research, 10(2), 18.
Arbaugh, J. B. (2008). Does the community of inquiry framework predict outcomes in online
MBA courses? International Review of Research in Open and Distance Learning, 9(2).
Baglione, S., & Nastanski, M. (2007). The superiority of online discussions: Faculty perceptions.
The Quarterly Review of Distance Education, 8(2), 139-150.
Barnes, C. L. (2016). Where’s the Teacher? Defining the Role of Instructor Presence in Social
Presence and Cognition in Online Education. In Kilgore, Bartoletti, & Al Freih,
Humanizing online teaching and learning. Pressbooks.
Bates, D., Mächler, M., Bolker, B., & Walker, S. (2014). Fitting linear mixed-effects models
using lme4. arXiv preprint arXiv:1406.5823.
Baxter, P., & Jack, S. (2008). Qualitative case study methodology: Study design and
implementation for novice researchers. The Qualitative Report, 13(4), 544-559.
Beach, A. L., Sorcinelli, M. D., Austin, A. E., & Rivard, J. K. (2016). Faculty development
in the age of evidence: Current practices, future imperatives. Stylus Publishing, LLC.
Berelson, B. (1952). Content analysis in communication research. Glencoe, IL: Free Press.
Bernard, H. R., Wutich, A., & Ryan, G. W. (2016). Analyzing qualitative data: Systematic
approaches. SAGE publications.
Bradley, M. E., Thom, L. R., Hayes, J., & Hay, C. (2008). Ask and you will receive: How
question type influences quantity and quality of online discussions. British Journal of
Educational Technology, 39(5), 888-900.
Bonk, C. J., Lee, M. M., Kou, X., Xu, S., & Sheu, F. R. (2015). Understanding the self-directed
online learning preferences, goals, achievements, and challenges of MIT
OpenCourseWare subscribers. Journal of Educational Technology & Society, 18(2), 349-
Chen, B., Chang, Y. H., Ouyang, F., & Zhou, W. (2018). Fostering student engagement in online
discussion through social learning analytics. The Internet and Higher Education, 37, 21-
Chen, B., deNoyelles, A., Patton, K., & Zydney, J. (2017). Creating a Community of Inquiry in
Large-Enrollment Online Courses: An Exploratory Study on the Effect of Protocols
within Online Discussions. Online Learning, 21(1), 165-188.
Cheung, W. S., Hew, K. F., & Ng, C. S. L. (2008). Toward an understanding of why students
contribute in asynchronous online discussions. Journal of Educational Computing
Research, 38(1), 29-50.
Chiang, A. L., Vartabedian, B., & Spiegel, B. (2016). Harnessing the hashtag: a standard
approach to GI dialogue on social media. The American journal of
gastroenterology, 111(8), 1082.
Cho, M. H., & Cho, Y. (2016). Online Instructors’ Use of Scaffolding Strategies to Promote
Interactions: A Scale Development Study. The International Review of Research in Open
and Distributed Learning, 17(6).
Christopher, M. M., Thomas, J. A., & Tallent-Runnels, M. K. (2004). Raising the bar:
Encouraging high level thinking in online discussion forums. Roeper Review, 26(3), 166-
Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). St. Paul, MN: West Publishing Company.
Conover, W. J., & Iman, R. L. (1981). Rank transformations as a bridge between parametric and
nonparametric statistics. The American Statistician,35(3), 124-129.
Conrad, R. M., & Donaldson, J. A. (2011). Engaging the online learner: Activities and resources
for creative instruction (Vol. 36). John Wiley & Sons.
Covelli, B. J. (2017). Online Discussion Boards: The Practice of Building Community for Adult
Learners. The Journal of Continuing Higher Education, 65(2), 139-145.
Creswell, J. W. (2012). Qualitative inquiry and research design: Choosing among five
approaches. Sage publications.
Creswell, J. W., & Plano Clark, V. L. (2011). Designing and conducting mixed method research.
Thousand Oaks, CA: Sage.
Cross, N. (1982). Designerly ways of knowing. Design studies, 3(4), 221-227.
Darabi, A., Arrastia, M. C., Nelson, D. W., Cornille, T., & Liang, X. (2011). Cognitive presence
in asynchronous online learning: A comparison of four discussion strategies. Journal of
Computer Assisted Learning, 27(3), 216-227. doi:10.1111/j.1365-2729.2010.00392.x
DeCarlo, L. T. (1997). On the meaning and use of kurtosis. Psychological methods, 2(3), 292-
Dewey, J. (1923). Democracy and education: An introduction to the philosophy of education.
Dotson, T. (2013). Design for community: Toward a communitarian ergonomics. Philosophy &
Technology, 26(2), 139-157.
Emelyanova, N., & Voronina, E. (2014). Introducing a learning management system at a Russian
university: Students' and teachers' perceptions. The International Review of Research in
Open and Distributed Learning, 15(1).
Ertmer, P. A., Sadaf, A., & Ertmer, D. J. (2011). Student-content interactions in online courses:
The role of question prompts in facilitating higher-level engagement with course
content. Journal of Computing in Higher Education, 23(2-3), 157.
Fetters, M. D., Curry, L. A., & Creswell, J. W. (2013). Achieving integration in mixed methods
designs—principles and practices. Health Services Research, 48(6pt2), 2134-2156.
Field, A. (2009). Discovering statistics using SPSS. Sage publications.
Fisher, D., & Frey, N. (2013). Better learning through structured teaching: A framework for the
gradual release of responsibility. ASCD.
Flick, U. (2014). An introduction to qualitative research. Sage.
Flowerday, T., & Schraw, G. (2000). Teacher beliefs about instructional choice: A
phenomenological study. Journal of educational psychology, 92(4), 634.
Gašević, D., Dawson, S., Rogers, T., & Gasevic, D. (2016). Learning analytics should not
promote one size fits all: The effects of instructional conditions in predicting academic
success. The Internet and Higher Education, 28, 68-84.
Gao, F., Zhang, T., & Franklin, T. (2013). Designing asynchronous online discussion
environments: Recent progress and possible future directions. British Journal of
Educational Technology, 44(3), 469-483.
Garrison, D. R. (2011). E-learning in the 21st century: A framework for research and practice.
Taylor & Francis.
Garrison, R. (2009). Implications of online and blended learning for the conceptual development
and practice of distance education. International Journal of E-Learning & Distance
Education, 23(2), 93-104.
Garrison, D. R., Anderson, T., & Archer, W. (2000). Critical inquiry in a text-based
environment: Computer conferencing in higher education. The internet and higher
education, 2(2-3), 87-105.
Garrison, D. R., & Arbaugh, J. B. (2007). Researching the community of inquiry framework:
Review, issues, and future directions. Internet and Higher Education, 10(3), 157–172.
Garrison, D. R., & Archer, W. (2000). A transactional perspective on teaching and learning: A
framework for adult and higher education. UK: Emerald.
Garrison, D. R., & Cleveland-Innes, M. (2005). Facilitating cognitive presence in online
learning: Interaction is not enough. The American journal of distance education, 19(3),
Garrison, D. R., Cleveland-Innes, M., & Fung, T. S. (2010). Exploring causal relationships
among teaching, cognitive and social presence: Student perceptions of the community of
inquiry framework. The Internet and Higher Education, 13(1-2), 31-36.
Gilbert, P. K., & Dabbagh, N. (2005). How to structure online discussions for meaningful
discourse: A case study. British Journal of Educational Technology, 36(1), 5-18.
Greene, J. C., Caracelli, V. J., & Graham, W. F. (1989). Toward a conceptual framework for
mixed-method evaluation designs. Educational evaluation and policy analysis, 11(3),
Guetterman, T. C., Fetters, M. D., & Creswell, J. W. (2015). Integrating quantitative and
qualitative results in health science mixed methods research through joint displays. The
Annals of Family Medicine, 13(6), 554-561.
Gustafson, B. (2014, March 28). Power-Up Digital Learning [Blog post]. Retrieved from
Gustafson, B. (2016). Renegade Leadership: Creating Innovative Schools for Digital-Age
Students. Corwin Press.
Harder, H. (2010). Explanatory case study. In Mills, A. J., Durepos, G., & Wiebe, E. (Eds.),
Encyclopedia of case study research, 370-371. Thousand Oaks, CA: Sage.
Hay, A., Peltier, J. W., & Drago, W. A. (2004). Reflective learning and on-line management
education: A comparison of traditional and on-line MBA students. Strategic
Change, 13(4), 169.
Herman, J., & Nilson, L. (2018). Creating engaging discussions: Strategies for “avoiding
crickets” in any size classroom and online. Sterling, VA: Stylus Publishing.
Hewitt, J. (2003). How habitual online practices affect the development of asynchronous
discussion threads. Journal of Educational Computing Research, 28, 31-45.
Hora, M. T., & Ferrare, J. J. (2013). Instructional systems of practice: A multidimensional
analysis of math and science undergraduate course planning and classroom
teaching. Journal of the Learning Sciences, 22(2), 212-257.
Ivankova, N. V., Creswell, J. W., & Stick, S. L. (2006). Using mixed-methods sequential
explanatory design: From theory to practice. Field methods, 18(1), 3-20.
Jang, H., Reeve, J., & Deci, E. L. (2010). Engaging students in learning activities: It is not
autonomy support or structure but autonomy support and structure. Journal of
educational psychology, 102(3), 588.
Johnson, B., & Christensen, L. (2017). Educational research: Quantitative, qualitative, and
mixed approaches. Thousand Oaks, CA: Sage.
Joksimović, S., Gašević, D., Kovanović, V., Riecke, B. E., & Hatala, M. (2015). Social presence
in online discussions as a process predictor of academic performance. Journal of
Computer Assisted Learning, 31(6), 638-654.
Jones, H. (2017). Humanizing the Online Experience Through Effective Use and Analysis of
Discussion Forums. In Handbook of Research on Humanizing the Distance Learning
Experience (pp. 334-363). IGI Global. DOI: 10.4018/978-1-5225-0968-4.ch015
Kanuka, H., Rourke, L., & Laflamme, E. (2007). The influence of instructional methods on the
quality of online discussion. British Journal of Educational Technology, 38(2), 260-271.
Kilgore, W. K., Bartoletti, R., & Al Freih, M. (2018). Humanizing Online Teaching and
Learning. Pressbooks.
Knowles, M. S. (1986). Using learning contracts: Practical approaches to individualizing and
structuring learning. Jossey-Bass.
Könings, K. D., Seidel, T., & van Merriënboer, J. J. (2014). Participatory design of learning
environments: integrating perspectives of students, teachers, and designers. Instructional
Science, 42(1), 1-9.
Krathwohl, D. R. (2002). A revision of Bloom's taxonomy: An overview. Theory into
practice, 41(4), 212-218.
Lambiase, J. J. (2010). Hanging by a thread: Topic development and death in an online
discussion of breaking news. Language at Internet.
Lee, J. (2006). Under construction: Scaffolding the expansion of online learning communities
through computer-mediated communication. Educational Media and Technology
Yearbook, 31, 51-64.
Lee, J., & Bonk, C. J. (2016). Social network analysis of peer relationships and online
interactions in a blended class using blogs. The Internet and Higher Education, 28, 35-
Lee, E., Pate, J. A., & Cozart, D. (2015). Autonomy support for online students. TechTrends,
59(4), 54-61.
Lowenthal, P. R. (2010). Social presence. In Social computing: Concepts, methodologies, tools,
and applications (pp. 129-136). IGI Global.
Maor, D. (2003). The teacher's role in developing interaction and reflection in an online learning
community. Educational Media International, 40(1-2), 127-138.
Marra, R. M., Moore, J. L., & Klimczak, A. K. (2004). Content analysis of online discussion
forums: A comparative analysis of protocols. Educational Technology Research and
Development, 52(2), 23.
Mayring, P. (2004). Qualitative content analysis. A companion to qualitative research, 1, 159-
Mazzolini, M., & Maddison, S. (2007). When to jump in: The role of the instructor in online
discussion forums. Computers & Education, 49(2), 193–213.
Menard, S. (2009). Logistic regression: From introductory to advanced concepts and
applications. Sage Publications. Thousand Oaks, CA.
Martin, F., Wang, C., & Sadaf, A. (2018). Student perception of helpfulness of facilitation
strategies that enhance instructor presence, connectedness, engagement and learning in
online courses. The Internet and Higher Education, 37, 52-65.
Michinov, N., Brunot, S., Le Bohec, O., Juhel, J., & Delaval, M. (2011). Procrastination,
participation, and performance in online learning environments. Computers &
Education, 56(1), 243-252.
Moore, D. (2011). Using collaborative online discussion effectively for teaching. Journal of
Applied Learning Technology, 1(4), 19–23.
Moreillon, J. (2015). Increasing interactivity in the online learning environment: Using digital
tools to support students in socially constructed meaning-making. TechTrends, 59(3), 41-
Nilson, L. B., & Goodson, L. A. (2017). Online teaching at its best: Merging instructional
design with teaching and learning research. John Wiley & Sons.
Onwuegbuzie, A. J., & Johnson, R. B. (2006). The validity issue in mixed research. Research in
the Schools, 13(1), 48-63.
Orcutt, J. M., & Dringus, L. P. (2017). Beyond Being There: Practices that Establish Presence,
Engage Students and Influence Intellectual Curiosity in a Structured Online Learning
Environment. Online Learning, 21(3).
Osborne, J., & Waters, E. (2002). Four assumptions of multiple regression that researchers
should always test. Practical assessment, research & evaluation, 8(2), 1-9.
Pacansky-Brock, M. (2012). Best practices for teaching with emerging technologies. Routledge.
Patall, E. A., Cooper, H., & Wynn, S. R. (2010). The effectiveness and relative importance of
choice in the classroom. Journal of Educational Psychology, 102(4), 896.
Patton, M. Q. (1990). Qualitative evaluation and research methods. Beverly Hills, CA: Sage.
Pea, R. D. (2004). The social and technological dimensions of scaffolding and related theoretical
concepts for learning, education, and human activity. The Journal of the Learning
Sciences, 13(3), 423-451.
Phirangee, K., Demmans Epp, C., & Hewitt, J. (2016). Exploring the relationships between
facilitation methods, students' sense of community and their online behaviours. Special
issue on online learning analytics. Online Learning Journal, 20(2), 134–154.
Piro, J. M. (2008). Foucault and the architecture of surveillance: Creating regimes of power in
schools, shrines, and society. Educational Studies, 44(1), 30-46.
Posada, D., & Buckley, T. R. (2004). Model selection and model averaging in phylogenetics:
advantages of Akaike information criterion and Bayesian approaches over likelihood
ratio tests. Systematic Biology, 53(5), 793-808.
Qayyum, A., Zipf, S., Gungor, R., & Dillon, J. M. (2019). Financial aid and student persistence
in online education in the United States. Distance Education, 40(1), 20-31.
Radenski, A. (2009, July). Freedom of choice as motivational factor for active learning. In ACM
SIGCSE Bulletin (Vol. 41, No. 3, pp. 21-25). ACM.
Reeve, J., & Jang, H. (2006). What teachers say and do to support students’ autonomy during a
learning activity. Journal of Educational Psychology, 98(1), 209-218.
Richardson, J., & Swan, K. (2003). Examining social presence in online courses in relation to
students' perceived learning and satisfaction. Journal of Asynchronous Learning
Networks, 7(1), 68–88.
Riggs, S. & Linder, K. (2016). IDEA Paper: Actively engaging students in asynchronous online
classes. Manhattan, KS: The IDEA Center.
Rolim, V., Ferreira, R., Lins, R. D., & Gašević, D. (2019). A network-based analytic approach to
uncovering the relationship between social and cognitive presences in communities of
inquiry. The Internet and Higher Education, 42, 53-65.
Rovai, A. P. (2007). Facilitating online discussions effectively. The Internet and Higher
Education, 10(1), 77–88.
Rubin, B., Fernandes, R., & Avgerinou, M. D. (2013). The effects of technology on the
Community of Inquiry and satisfaction with online courses. The Internet and Higher
Education, 17, 48-57.
Salmon, G. (2004). E-moderating: The key to teaching and learning online. Psychology Press.
Scammon, D. L., Tomoaia-Cotisel, A., Day, R. L., Day, J., Kim, J., Waitzman, N. J., Farrell, T.
M., & Magill, M. K. (2013). Connecting the dots and merging meaning: Using mixed
methods to study primary care delivery transformation. Health Services Research, 48(6
Pt 2), 2181.
Schoonenboom, J., & Johnson, R. B. (2017). How to construct a mixed methods research
design. KZfSS Kölner Zeitschrift für Soziologie und Sozialpsychologie, 69(2), 107-131.
Seaman, J. E., Allen, I. E., & Seaman, J. (2018). Grade increase: Tracking distance education in
the United States. Babson Survey Research Group.
Shea, P., & Bidjerano, T. (2009). Community of inquiry as a theoretical framework to foster
“epistemic engagement” and “cognitive presence” in online education. Computers &
Education, 52(3), 543-553.
Song, L., Singleton, E. S., Hill, J. R., & Koh, M. H. (2004). Improving online learning: Student
perceptions of useful and challenging characteristics. The Internet and Higher
Education, 7(1), 59-70.
Stake, R. (1995). The art of case study research. Thousand Oaks, CA: Sage.
Stevens, J. P. (2009). Applied multivariate statistics for the social sciences (5th ed.). Mahwah,
NJ: Routledge Academic.
Sull, E. C. (2014). The (Almost) complete guide to effectively managing threaded discussions.
Distance Learning, 11(3), 11.
Swan, K. (2005). A constructivist model for thinking about learning online. In J. Bourne & J. C.
Moore (Eds), Elements of Quality Online Education: Engaging Communities. Needham,
MA: Sloan-C.
Swan, K., Garrison, D. R., & Richardson, J. (2009). A constructivist approach to online learning:
The community of inquiry framework. In C. R. Payne (Ed.), Information technology and
constructivism in higher education: Progressive learning frameworks. Hershey, PA: IGI Global.
Swan, K., Shea, P., Richardson, J., Ice, P., Garrison, D. R., Cleveland-Innes, M., & Arbaugh, J.
B. (2008). Validating a measurement tool of presence in online communities of inquiry.
E-mentor, 2(24), 1-12.
Teddlie, C., & Tashakkori, A. (2009). Foundations of mixed methods research: Integrating
quantitative and qualitative approaches in the social and behavioral sciences. Sage.
Thurston, T. N. (2018). Design case: Implementing gamification with ARCS to engage digital
natives. Journal on Empowering Teaching Excellence, 2(1), 5.
Thurston, T. N., & Schneider, K. (2019). Structuring personalized faculty development
programming with autonomy-support and microcredentials. In A. Elçi, L. Beith, & A.
Elçi (Eds.), Handbook of research on faculty development for digital teaching and
learning. Hershey, PA: IGI Global Publishing. DOI: 10.4018/978-1-5225-8476-6.ch008
Trigwell, K., Prosser, M., & Waterhouse, F. (1999). Relations between teachers' approaches to
teaching and students' approaches to learning. Higher Education, 37(1), 57–70.
Valcke, M., De Wever, B., Zhu, C., & Deed, C. (2009). Supporting active cognitive processing
in collaborative groups: The potential of Bloom's taxonomy as a labeling tool. The
Internet and Higher Education, 12(3), 165-172.
Vygotsky, L. S. (1980). Mind in society: The development of higher psychological processes.
Harvard University Press.
Westfall, P.H., & Henning, K.S.S. (2013). Texts in statistical science: Understanding advanced
statistical methods. Boca Raton, FL: Taylor & Francis.
Whiteley, T. R. (2014). Using the Socratic method and Bloom's taxonomy of the cognitive
domain to enhance online discussion, critical thinking, and student
learning. Developments in Business Simulation and Experiential Learning, 33.
Wood, D., Bruner, J. S., & Ross, G. (1976). The role of tutoring in problem solving. Journal of
Child Psychology and Psychiatry, 17(2), 89-100.
Xie, K., & Ke, F. (2011). The role of students' motivation in peer-moderated asynchronous
online discussions. British Journal of Educational Technology, 42(6), 916-930.
Yang, Y., Yeh, H., & Wong, W. (2010). The influence of social interaction on meaning
construction in a virtual community. British Journal of Educational Technology, 41(2),
Yazan, B. (2015). Three approaches to case study methods in education: Yin, Merriam, and
Stake. The Qualitative Report, 20(2), 134-152.
Yin, R. K. (2009). Case study research: Design and method (4th ed.). Thousand Oaks, CA: Sage.
Zydney, J. (2014). Strategies for creating a community of inquiry through online asynchronous
discussions. Journal of Online Learning and Teaching, 10(1), 153.
Zydney, J. M., & Seo, K. K. J. (2012). Creating a community of inquiry in online environments:
An exploratory study on the effect of a protocol on interactions within asynchronous
discussions. Computers & Education, 58(1), 77-87.
Appendix A
Midterm Course Survey
1. Please rate your satisfaction with the course overall:
a. Not Satisfied
b. Somewhat Satisfied
c. Satisfied
d. Very Satisfied
2. What barriers have you encountered in completing your discussions for this course? What
supports have helped you to complete the discussions for this course?
3. Please rate your satisfaction with the way the CIA discussions are structured:
a. Not Satisfied
b. Somewhat Satisfied
c. Satisfied
d. Very Satisfied
4. In what way(s) have the CIA powerups scaffolded the way you approach the discussions,
and the way that you engage in the course content?
5. Which of the powerups do you find yourself using most often? Why? Which of the
powerups do you find yourself avoiding most often? Why?
6. Please rate your satisfaction with the instructor:
a. Not Satisfied
b. Somewhat Satisfied
c. Satisfied
d. Very Satisfied
7. How can this course be improved? Please provide any suggestions or concerns you have
with this class.
8. What does this course get right? Which aspect of this course would you recommend not
changing at all?