Detecting Cognitive Engagement in Online Course Forums: A Review
of Frameworks and Methodologies
Nazmus Sakeef, M. Ali Akber Dewan, Fuhua Lin, Dharamjit Parmar
School of Computing and Information Systems, Faculty of Science and Technology
Athabasca University, Alberta, Canada
Email: nsakeefuoa@gmail.com, adewan@athabascau.ca, oscarl@athabascau.ca,
dparmar1@learn.athabascau.ca
Corresponding author: M. Ali Akber Dewan
Abstract— Course discussion forums are a key component of online learning in higher education. Assessing the quality of posts in these forums, including the cognitive engagement they reflect, and determining students' interest and participation is challenging yet beneficial. This research investigates the existing literature on identifying the cognitive engagement of online learners through the analysis of course discussion forums. Specifically, this review examines three educational frameworks, Van Der Meijden's Knowledge Construction in Synchronous and Asynchronous Discussion Posts (KCSA), Community of Inquiry (CoI), and Interactive, Constructive, Active, and Passive (ICAP), which have been widely used to detect students' cognitive engagement by analyzing their posts in course discussion forums. This study also examines the natural language processing and deep learning approaches that have been employed and integrated with these three frameworks in the existing literature on detecting cognitive engagement in online learning. The article provides recommendations for enhancing instructional design and fostering student engagement by leveraging cognitive engagement detection. This research underscores the significance of automating the identification of cognitive engagement in online learning and puts forth suggestions for future research directions.
Index Terms— Online learning, cognitive engagement detection, ICAP, CoI, KCSA, course forum
analysis.
1. Introduction
Online education has drawn considerable attention from researchers due to its flexible and accessible
delivery method for online learners. A key component of effective online learning is cognitive engagement,
which refers to the psychological investment and effort made by the learner to understand what is studied
and to achieve a higher level of comprehension in a specific area of study [1]. Improving cognitive
engagement can help students develop critical thinking, move from procedural tasks to problem-solving
tasks, and reflect on the learning process [2]. Detecting cognitive engagement makes it possible to gain insight into students' learning processes and to inform instructional support for better outcomes [1].
This study presents a narrative review of the educational frameworks and artificial intelligence techniques used to detect students' cognitive engagement by analyzing course discussion forums. Students in online learning engage in collaborative learning, share ideas and comments, and develop their critical thinking through course discussion forums. However, determining cognitive engagement from students' posts is challenging due to the absence of immediate visual and auditory cues, variations in how students express engagement, and the invisibility of their internal cognitive activities. Additionally, the diversity of learning objectives and the role of social and affective aspects make it difficult to gauge students' cognitive engagement accurately. Several educational frameworks, such as Community of Inquiry (CoI) [3], Interactive, Constructive, Active, and Passive (ICAP) [4], and Van Der Meijden's knowledge construction in synchronous and asynchronous discussion posts (KCSA) [5], have been widely used by researchers for detecting the cognitive engagement of online learners in course discussion forums. Along with these frameworks, recent advances in machine learning, deep learning, text mining, natural language processing, statistical analysis, and large language models have made it possible to automate the analysis of cognitive engagement in course discussion forums.
This paper presents a narrative review of the existing work on identifying online learners' cognitive engagement through the examination of course discussion forums using the CoI, ICAP, and KCSA frameworks together with machine learning and NLP techniques. We aim to advance the knowledge of cognitive engagement detection methods in online learning by synthesizing the existing literature and examining the following four research questions: (1) What theoretical frameworks are widely used for detecting students' cognitive engagement by analyzing course discussion posts? (2) How are these frameworks employed to code discussion posts to identify cognitive engagement in online courses? (3) What machine learning (ML) and artificial intelligence (AI) techniques have been used with the cognitive engagement frameworks for cognitive engagement detection? (4) What are the pros and cons of using ML and AI techniques with the cognitive engagement frameworks for cognitive engagement detection through forum post analysis? We expect that this review will further advance the knowledge of cognitive engagement analysis in online courses and offer useful information for educational institutions and academics in creating productive online courses and smart learning environments. We also hope to offer guidance to researchers, teachers, and practitioners on how to use cognitive engagement detection frameworks for effective instructional design and learner support.
The rest of the paper is organized as follows. Section 2 presents an in-depth discussion of the CoI, ICAP, and KCSA frameworks and their coding schemes from the perspective of detecting students' cognitive engagement in online learning. Section 3 reviews recent research on cognitive engagement detection and explores how machine learning models and NLP techniques have been used along with these educational frameworks, and how the results can be used to improve students' learning outcomes. Section 4 discusses several open research issues related to cognitive engagement detection that emerged from the existing literature. Finally, Section 5 concludes the review with some insightful discussions and recommendations for future work.
2. Educational Frameworks for Cognitive Engagement Detection
In an online learning environment, cognitive engagement refers to the level of online learners' active participation in the learning process and their intellectual investment in their courses. Cognitive engagement can be promoted in online learning in various ways, such as interactive course materials, discussion boards, and cooperative group activities. Cognitively engaged students are more likely to comprehend the information better, retain it for longer, and eventually perform better academically [1]. The importance of measuring students' cognitive engagement during online learning cannot be overstated. Cognitive engagement detection aids instructors in evaluating the efficiency of their lesson plans and online learning platforms. By keeping track of cognitive engagement, instructors can assess both the level of student participation in the learning process and the efficacy of their instructional approaches. Instructors can spot learners who may be struggling or are not engaged with the subject matter and, with the aid of this information, offer them extra assistance or resources to help them become active. Cognitive engagement detection can also encourage learners' deeper learning and critical thinking: by monitoring and encouraging cognitive engagement, instructors can assist students in honing their problem-solving and analytical abilities, which are crucial for success in many disciplines. The detection of cognitive engagement can also yield insightful information for course assessment and development, since analyzing cognitive engagement data lets instructors pinpoint course components that may need improvement and make changes to improve student learning results. However, measuring cognitive engagement in online learning is challenging, as it involves both observable behaviors, such as participation in online discussions, and unobservable cognitive processes, such as critical thinking and problem-solving. Various educational frameworks provide structured ways to evaluate cognitive engagement by focusing on aspects of cognitive presence and interactive learning activities. As mentioned earlier, three such frameworks are ICAP, CoI, and KCSA.
2.1 ICAP Framework
The ICAP framework [4] describes how students cognitively engage with their learning activities. It distinguishes four modes of engagement: interactive, constructive, active, and passive. Interactive engagement includes activities where students actively collaborate and communicate with one another as well as with the instructor. Discussing with peers or responding to questions on online discussion boards, collaborative projects, and peer review exercises are some examples of interactive engagement. This category typically reflects the highest level of cognitive engagement, with students actively interacting with the instructor and their peers to deepen their knowledge of the course topic. Constructive engagement covers tasks that require students to generate something new from the course material (e.g., organizing information, summarizing key points, or producing a presentation, an essay, or a project plan). Because students actively apply and synthesize information to create something new, constructive activities are considered to involve high levels of cognitive engagement. The Constructive category is further divided into two subcategories: C1 and C2. Students in the C1 subcategory display reasoning skills by elaborating on points, explaining phenomena, or providing in-depth analysis related to the course content. Students in the C2 subcategory go beyond the course materials and propose new ideas or content related to the course, demonstrating creativity and original thinking.
The Active category covers exercises in which students actively engage with and process the course material (e.g., taking notes, generating examples, answering questions, or solving problems). Given that students are actively processing and manipulating information, active activities are considered to involve moderate levels of cognitive engagement. The Active category is divided into two subcategories: A1 and A2. Students in the A1 subcategory engage actively with course materials, showing involvement and interaction with the content. In contrast, students in the A2 subcategory are active but do not specifically mention course content; their engagement may be more general or unrelated to the course material. The Passive category covers activities in which students passively take in knowledge by observing or listening (e.g., reading or watching video lectures without actively processing the information). Because learners are not actively processing or interacting with the information, passive activities are considered to involve low levels of cognitive engagement. Finally, the framework includes an additional category, "Other," which encompasses behaviors that do not fit into the categories defined above.
To detect cognitive engagement with the ICAP framework by analyzing course discussion forums, researchers use a coding scheme such as the one shown in Figure 1. This coding scheme provides a structured way to assess and categorize student engagement behaviors, allowing educators and researchers to gain insights into online courses, identify areas for improvement, and encourage deeper learning and engagement among students. The ICAP framework allows instructors to learn how their students interact with the course materials and activities, and to use this knowledge to modify their course design and teaching process.
Figure 1. Coding scheme of ICAP framework
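To make the coding idea concrete, the toy sketch below assigns ICAP categories to posts using simple keyword heuristics. The cue words and the fallback-to-passive logic are illustrative assumptions only; the published systems reviewed in Section 3 use trained classifiers rather than hand-written rules.

```python
# Toy heuristic ICAP coder: keyword cues per category, checked from the
# most demanding mode (interactive) down to passive. Purely illustrative.
CUES = {
    "interactive": ["@", "reply", "you said", "agree with", "building on"],
    "constructive": ["because", "therefore", "i think", "my analysis", "propose"],
    "active": ["i read", "i noted", "summary", "example"],
}

def code_post(text: str) -> str:
    lowered = text.lower()
    for category, cues in CUES.items():  # dicts preserve insertion order
        if any(cue in lowered for cue in cues):
            return category
    return "passive"                     # default when no cue matches

print(code_post("Building on what you said, I think the proof fails because..."))
# -> interactive
```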
2.2 CoI Framework
The CoI framework [3] emphasizes both the social and cognitive aspects of learning. It identifies social presence, cognitive presence, and teaching presence as the three key components of effective online learning, and it has been widely used for understanding and assessing the quality of students' learning in online environments. These three interrelated elements are essential for a successful online learning experience. Social Presence describes how connected and involved students feel with their instructors and peers in an online learning environment. This covers elements such as the usage of communication tools, the degree of participant interaction, and how much the learning experience is customized. Cognitive Presence represents the level of a learner's ability to make sense of the course material and to engage in critical analysis and research. This involves problem-solving exercises, data analysis, and evidence evaluation, as well as the utilization of reflection exercises and group work among students. Teaching Presence represents the degree to which the instructor designs and facilitates the learning environment to foster social and cognitive presence. This involves tasks such as giving learners clear instructions and feedback, organizing conversations, and offering support and direction.
The CoI framework is used to measure the efficacy of online learning environments by rating the quality of the above three components. Educators and researchers can use the framework to identify the strengths and weaknesses of online course design and to direct the creation of strategies to improve the learning process. The framework has been widely used in research studies to assess how well online learning environments support learners' cognitive involvement. For instance, Garrison et al. [6] demonstrated that courses with higher levels of social and cognitive presence are more successful at encouraging learners to engage in deep learning and critical thinking. The CoI framework is thus a useful resource for comprehending and evaluating the quality of online learning environments, as well as for developing and assessing efficient methods for encouraging cognitive engagement among students.
The coding scheme for the CoI framework is shown in Table 1. The first component, cognitive presence, represents the degree to which students create and validate meaning through sustained reflection and discourse, with an emphasis on helping them develop critical and higher-order thinking skills [7]. Cognitive presence comprises four dimensions in its learning cycle: (a) triggering event (conceptualization), (b) exploration (thought production), (c) integration (knowledge synthesis), and (d) resolution (application of knowledge and vicarious testing). The second component, social presence, represents the growth of social interactions among members of the learning community while preserving a supportive social environment. Social presence has four dimensions: affective expression, group cohesion, course organization, and acknowledgement of receipt. The third component, teaching presence, describes course organization and design, direct instruction, facilitation, and the instructor's role both before and during teaching. Teaching presence plays an important role in enhancing student satisfaction, perceived learning, and sense of community.
Table 1: Coding scheme of the CoI framework

| Elements | Categories | Indicators |
| --- | --- | --- |
| Cognitive presence | Triggering event | New subject; message expresses student's confusion. |
| Cognitive presence | Exploration | Messages lack a logical conclusion but contain information regarding the source of the misunderstanding. |
| Cognitive presence | Integration | Connecting ideas; messages make logical, well-supported recommendations about how to address the situation. |
| Cognitive presence | Resolution | Applying new concepts; messages frequently take the form of new constructions and debate, test, or apply the earlier conclusions. |
| Social presence | Affective expression | Apology, thanks, emotions. |
| Social presence | Group cohesion | Social, greeting, encouraging, and community building. |
| Social presence | Course organization | Technical, class management, time out. |
| Social presence | Acknowledge receipt | Nodding. |
| Teaching presence | Design and organization | Time out, technical, course management, class management. |
| Teaching presence | Facilitating discourse | Clarification, specific query, giving task, confirmation, understanding, direction. |
| Teaching presence | Direct instruction | Explanation of content, additional explanation, definition. |
2.3 KCSA Framework
The KCSA framework is used for assessing the quality of online discussions in terms of the cognitive processes that students engage in within course discussion forums. This framework uses four stages of students' cognitive engagement in online discussions: exploration, conceptualization, application, and reflection. At the exploration stage, students explore the topic of discussion by brainstorming ideas, asking questions, and sharing their initial thoughts and experiences. This stage is characterized by open-ended and exploratory discussions, and the emphasis is on generating a wide range of ideas. At the conceptualization stage, students begin to develop a shared understanding of the topic by clarifying concepts, defining terms, and identifying patterns and themes. This stage is characterized by more focused and structured discussions, and the emphasis is on developing a deeper understanding of the topic. At the application stage, students apply what they have learned about the subject to real-world problems. The emphasis at this stage is on making connections between abstract concepts and actual circumstances, demonstrated using examples and case studies. At the reflection stage, students analyze their comprehension and reasoning as they consider the lessons learned during the conversation. This stage focuses on gaining a deeper understanding of one's thought processes and is characterized by metacognitive processes such as self-evaluation and self-reflection.
The coding scheme for the KCSA framework is shown in Table 2. This coding scheme is used to examine online discussions and determine the level of cognitive presence of students in an online learning environment. Researchers often use it to rate the effectiveness of online discussions and identify areas where online learning environments can be improved. KCSA categorizes students' learning activity as cognitive (asking questions, giving answers, and giving information), affective, regulative, and rest. The affective, regulative, and rest categories are not considered when determining a student's level of cognitive involvement because they do not entail cognitive components.
Within the cognitive category, question-asking covers questions with and without explanation and requests for agreement (CHV1, CHV2, and CHVER), whereas answer-giving covers replies with and without explanation (CHG1 and CHG2). Giving information covers providing information with or without elaboration, referring to earlier information, assessing the content, accepting a statement with or without elaboration, and rejecting a statement with or without elaboration (CI1, CI2, CIT, CIE, ACCEPT-, ACCEPT+, NACCEPT-, and NACCEPT+). These codes are used to identify the level of cognitive engagement of online students.
Table 2: Coding scheme for the KCSA framework

| Category | Code | Description |
| --- | --- | --- |
| Asking questions | CHV1 | The student inquires about a fact or simply asks what the answer is; the question does not provoke any explanation. |
| Asking questions | CHV2 | Posing inquiries that demand an explanation (comprehension or explication). |
| Asking questions | CHVER | The student poses a question to determine whether others concur with his or her statement, conclusion, or response. |
| Giving answers | CHG1 | The student responds to the question but does not elaborate or give more details. |
| Giving answers | CHG2 | The student answers the question elaborately and gives more explanation. |
| Giving information | CI1 | Giving information (a concept or idea) without further explanation. |
| Giving information | CI2 | The student formulates an idea or thought and then explains or elaborates on it. |
| Giving information | CIT | The student refers to earlier remarks or previously formulated information. |
| Giving information | CIE | The student assesses, paraphrases, or concludes the materials. |
| Giving information | ACCEPT- | The student acknowledges another participant's contribution without adding anything additional. |
| Giving information | ACCEPT+ | The student acknowledges another participant's contribution and elaborates on it. |
| Giving information | NACCEPT- | The student declines to acknowledge the contribution of a participant and gives no further justification. |
| Giving information | NACCEPT+ | The student refuses to acknowledge the contribution of a participant and gives elaboration for the refusal. |
2.4 Some Key Differences Among ICAP, CoI, and KCSA Frameworks
Some key differences among ICAP [4], CoI [3], and KCSA [5] frameworks are summarized in Table 3.
These three frameworks provide a structured way for analyzing student behavior and modifying teaching
techniques to improve student learning outcomes. These frameworks can assist instructors in identifying
the cognitive involvement of students during online learning. Identifying cognitive engagement is essential
for instructors to encourage student learning, to spot students who are having difficulty in learning, and to
enhance the standard of online learning environments. Educational researchers widely use these
frameworks to enhance pedagogical approaches and evaluate the efficacy of teaching strategies in many
learning contexts.
Table 3: Some key differences among the ICAP [4], CoI [3], and KCSA [5] frameworks

| Category | Differences |
| --- | --- |
| Focus | ICAP focuses on the learning processes, whereas CoI focuses on the social processes of learning. KCSA, on the other hand, focuses on the cognitive processes involved in learning. |
| Coding scheme | ICAP uses a coding scheme that classifies student interactions into four categories: interactive, constructive, active, and passive. CoI categorizes student interactions into three types of presence: cognitive, social, and teaching. KCSA categorizes student interactions into cognitive, affective, regulative, and rest categories. |
| Application | CoI has been used in a variety of learning contexts, while ICAP has been primarily used in STEM education. The KCSA framework has been used in both conventional classroom settings and online learning environments. |
| Level of granularity | The ICAP framework offers a broad overview of student engagement, while the CoI framework offers a more in-depth look at social processes. A more thorough examination of cognitive processes is offered by the KCSA framework. |
3. Methods for Cognitive Engagement Detection Analyzing Course Discussion Posts
Researchers have devoted significant effort to understanding and detecting cognitive engagement in
online learning environments, using diverse frameworks and techniques to analyze and interpret students’
cognitive activities. This section provides a comprehensive review of the literature on cognitive engagement
detection, with a focus on analyzing course forums using artificial intelligence, machine learning, and natural
language processing (NLP) techniques. It also examines the previously discussed three widely recognized
educational frameworks, ICAP, CoI, and KCSA, which offer valuable perspectives for exploring the depth
and quality of learners' engagement. These frameworks illuminate the dynamic interplay between cognitive
processes and online learning experiences.
3.1 Methods for Cognitive Engagement Detection using ICAP Framework
In this subsection, we discuss noteworthy studies that utilize the ICAP framework for cognitive engagement detection. These studies employed machine learning and deep learning methods in conjunction with the ICAP framework and NLP to automatically categorize and analyze student engagement, providing valuable insights into online learning environments and pedagogical strategies.
Hayati et al. [8] presented an approach that analyzes students' social interactions in online discussion forums using text mining to predict learners' cognitive involvement. The approach incorporated the ICAP framework to classify whether a student is a passive, active, constructive, or interactive participant based on a corpus of posts in a software engineering course delivered through an online learning platform. The authors used an SVM to classify learners' cognitive engagement into the four ICAP categories. The reported results showed high classification accuracy and Cohen's kappa, demonstrating the effectiveness of the suggested technique.
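The general pipeline behind such an approach can be sketched as follows. This is a minimal illustration rather than the authors' implementation: the feature extraction settings, classifier configuration, and toy data are assumptions.

```python
# Minimal sketch of an ICAP post classifier in the style of [8]:
# TF-IDF features with a linear SVM. Posts and labels are hypothetical.
from sklearn.pipeline import Pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, cohen_kappa_score

posts = [
    "I read the lecture notes.",                                                  # passive
    "Here are my notes summarizing the sorting lecture.",                         # active
    "I think quicksort degrades because of pivot choice, here is my analysis.",   # constructive
    "Replying to Sam: your point about pivots also explains the O(n^2) case.",    # interactive
] * 25  # repeat to give the classifier something to fit
labels = ["P", "A", "C", "I"] * 25

X_train, X_test, y_train, y_test = train_test_split(
    posts, labels, test_size=0.2, random_state=42, stratify=labels)

clf = Pipeline([
    ("tfidf", TfidfVectorizer(ngram_range=(1, 2), min_df=2)),
    ("svm", LinearSVC()),
])
clf.fit(X_train, y_train)

pred = clf.predict(X_test)
print("accuracy:", accuracy_score(y_test, pred))
print("Cohen's kappa:", cohen_kappa_score(y_test, pred))
```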
Liu et al. [9] presented a text classification model to automatically detect students' emotional and cognitive engagement and to investigate their relationships with student achievement. The method used an interpretable NLP model called BERT-CNN (bidirectional encoder representations from transformers combined with a convolutional neural network) [10] to analyze learners' discussions in a MOOC forum [11]. The authors found that co-occurring emotion and cognition indicators have a combined effect on learning achievement. The study also offered insights for enhancing MOOC completion rates and learning outcomes, such as introducing engaging and meaningful discussion tasks to evoke positive or thought-provoking emotions.
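Architecturally, a BERT-CNN classifier runs a convolution over BERT's token embeddings before pooling and classification. The sketch below, which assumes the Hugging Face transformers library and PyTorch, shows one plausible layout; the kernel size, channel count, and class count are illustrative and not the configuration used in [9].

```python
# Illustrative BERT-CNN text classifier: BERT token embeddings -> 1D convolution
# -> global max pooling -> linear layer. Hyperparameters are assumptions.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class BertCNN(nn.Module):
    def __init__(self, num_classes=4, conv_channels=128, kernel_size=3):
        super().__init__()
        self.bert = AutoModel.from_pretrained("bert-base-uncased")
        hidden = self.bert.config.hidden_size  # 768 for bert-base
        self.conv = nn.Conv1d(hidden, conv_channels, kernel_size, padding=1)
        self.fc = nn.Linear(conv_channels, num_classes)

    def forward(self, input_ids, attention_mask):
        out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        x = out.last_hidden_state      # (batch, seq_len, hidden)
        x = x.transpose(1, 2)          # (batch, hidden, seq_len) for Conv1d
        x = torch.relu(self.conv(x))   # (batch, channels, seq_len)
        x = x.max(dim=2).values        # global max pooling over tokens
        return self.fc(x)              # (batch, num_classes) logits

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = BertCNN()
batch = tokenizer(["I wonder why my regression underfits here."],
                  return_tensors="pt", padding=True, truncation=True)
logits = model(batch["input_ids"], batch["attention_mask"])
print(logits.shape)  # torch.Size([1, 4])
```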
Atapattu et al. [12] employed the ICAP framework to measure cognitive engagement in a professional
development MOOC. Using word embeddings, they automated the identification of instructors' community
contributions as either "active," involving the manipulation of course materials, or "constructive," involving
the generation of new knowledge. They also explored individual variations in engagement across different
units. Additionally, the study incorporated a manual content analysis to examine constructive contributions.
Out of 67 cases analyzed, all but one were identified as containing "constructive knowledge," providing a
robust foundation for replicating their methodology to analyze cognitive engagement in community-centric
MOOC models. Their findings highlighted that participants' cognitive engagement is influenced by the
nature of MOOC tasks.
Liu et al. [13] presented a semi-supervised cognitive engagement classification method called B-LIWC-UDA, which incorporated dual feature embeddings from BERT [10] and the LIWC [14] cognitive lexicon. The method used advanced data augmentation and fused generic semantic features with implicit cognitive features to optimize performance [15], [16]. The results showed that the cognitive features taken from the LIWC cognitive lexicon could efficiently characterize cognitive engagement for both the ICAP and CoI frameworks. B-LIWC-UDA used a data augmentation module called EDA [17], a noise insertion approach that can produce diverse and effective augmented samples without changing the labels of the original data samples. B-LIWC-UDA outperformed the other models on major evaluation metrics, such as accuracy and resource consumption. By investigating cognitive engagement in forum discussions, the method can help instructors monitor learners' learning status and intervene earlier with at-risk learners, potentially increasing online learning retention.
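EDA-style augmentation produces label-preserving variants of a post through simple noise operations. The sketch below implements two of the four standard EDA operations (random swap and random deletion); it is a generic illustration, not the module used in B-LIWC-UDA.

```python
# Two of the four EDA operations [17]: random swap and random deletion.
# Synonym replacement and random insertion (the other two) need a thesaurus
# such as WordNet and are omitted here for brevity.
import random

def random_swap(words, n_swaps=1):
    words = words[:]
    for _ in range(n_swaps):
        i, j = random.sample(range(len(words)), 2)
        words[i], words[j] = words[j], words[i]
    return words

def random_deletion(words, p=0.1):
    kept = [w for w in words if random.random() > p]
    return kept if kept else [random.choice(words)]  # never return an empty post

post = "I think the pivot choice explains why quicksort degrades".split()
for _ in range(3):
    print(" ".join(random_swap(post)), "|", " ".join(random_deletion(post)))
```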
Gorgun et al. [18] presented a predictive model for automatically identifying cognitive engagement in online discussion posts by employing ICAP and Bloom's taxonomy. The authors used the ICAP coding scheme to label students' posts according to their level of cognitive engagement. The labelled data were used to train three machine learning classifiers, decision tree (DT), random forest (RF), and support vector machine (SVM), of which the SVM performed best. The features for the classifiers were extracted by applying Coh-Metrix to student posts, combined with non-linguistic contextual features (e.g., number of replies). The authors planned to explore new features, such as AWL [19] and Flesch-Kincaid [20], in future work. Together, the above studies show that automated models in conjunction with the ICAP framework hold promise for effectively categorizing and predicting cognitive engagement in online learning.
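The feature setup of such a model can be mimicked by concatenating text-derived features with contextual counts before training several classifiers. The sketch below substitutes crude readability-style statistics for the Coh-Metrix indices (which require the Coh-Metrix tool itself) and uses entirely hypothetical data.

```python
# Sketch: linguistic features (stand-ins for Coh-Metrix indices) concatenated
# with a contextual feature, fed to DT, RF, and SVM classifiers for comparison.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 200
word_count = rng.integers(5, 300, n)       # stand-in linguistic feature
avg_sentence_len = rng.uniform(5, 30, n)   # stand-in linguistic feature
num_replies = rng.integers(0, 10, n)       # non-linguistic contextual feature
X = np.column_stack([word_count, avg_sentence_len, num_replies])
y = rng.integers(0, 4, n)                  # hypothetical ICAP labels 0..3

for name, clf in [("DT", DecisionTreeClassifier()),
                  ("RF", RandomForestClassifier(n_estimators=100)),
                  ("SVM", SVC(kernel="rbf"))]:
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"{name}: mean CV accuracy = {scores.mean():.2f}")
```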
3.2 Methods for Cognitive Engagement Detection using CoI Framework
In this subsection, we examine methods that used NLP and machine learning models in conjunction with the CoI framework for detecting cognitive engagement. Numerous studies have explored various aspects of cognitive engagement in online learning environments, emphasizing its critical role in shaping effective pedagogy and improving student learning outcomes within the CoI framework.
Lee et al. [21] proposed a method to automatically measure and enhance the quality of interactions in discussion forums. The study presented a machine learning model that identifies the phases of cognitive presence exhibited by a student in their forum posts and suggested several future applications of such a model to help online students develop higher-order thinking. The authors collected discussion posts from two online courses that used Piazza [22], an online discussion tool. The posts were manually coded based on the CoI coding scheme to explore trends in cognitive presence within and across the courses, and the coded data were used to analyze the relationship between students' observed cognitive presence and course grades. To build the model, the study used BERT, pretrained on a large text corpus and fine-tuned on the coded posts. The results suggest that deeper cognitive engagement with course concepts, as expressed by higher cognitive presence, is associated with better learning outcomes in both course settings. This research is expected to benefit online instructors and curriculum developers in higher education.
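A standard way to fine-tune BERT for this kind of phase classification is via a sequence-classification head. The sketch below, using the Hugging Face transformers library, is a generic recipe under assumed labels, data, and hyperparameters, not the authors' exact setup.

```python
# Sketch: fine-tuning BERT to classify posts into cognitive presence phases.
# Labels and example posts are hypothetical.
from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                          Trainer, TrainingArguments)
from datasets import Dataset

phases = ["triggering", "exploration", "integration", "resolution"]
posts = ["I'm confused about recursion here.",
         "Maybe the base case is the issue? Let me look at examples.",
         "Combining both hints, the bug is the off-by-one in the base case.",
         "I applied the fix and all tests pass now."] * 20
labels = [0, 1, 2, 3] * 20

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=len(phases))

ds = Dataset.from_dict({"text": posts, "label": labels}).map(
    lambda b: tokenizer(b["text"], truncation=True,
                        padding="max_length", max_length=128),
    batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="coi-bert", num_train_epochs=1,
                           per_device_train_batch_size=8),
    train_dataset=ds,
)
trainer.train()
```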
Villanueva et al. [23] looked at how social and teaching presences interacted in blended learning in K–12 settings. The study examined three blended learning classes based on data collected through surveys, student focus group discussions, teacher interviews, class observations, and archived data. When participating in reflective, community-building collaborative activities, students rated their perceived learning highly at the integration and resolution levels of cognitive presence. Owing to shared metacognition, defined as "an awareness of one's learning in the process of constructing meaning and creating understanding associated with self and others" [24], students' self-regulation and co-regulation techniques were affected by group activity, and students were held more accountable for meeting their obligations and managing their learning time. This study demonstrated the development of learning communities and the relevance of the CoI framework in the K–12 educational setting.
Ozogul et al. [25] investigated how to promote cognitive presence in an online graduate course. To gauge cognitive presence, the authors employed CoI-based self-reports and the Linguistic Inquiry and Word Count (LIWC) software for tracking actual behavior. The study also looked at how cognitive presence interacted with behavioral presence in an educational setting. According to the findings, students perceived and demonstrated a higher level of cognitive presence in their discussion board behaviors, and perceived cognitive presence was strongly predicted by teaching and social presence. The authors emphasized several strategies that promoted cognitive engagement in the asynchronous online course: the instructor's active participation in discussion posts and efforts to foster meaningful dialogue, designing course assignments as interactive, hands-on online projects, interviewing guest speakers on course-related topics, providing weekly recap and orientation videos, offering timely feedback, facilitating case-based discussions, and maintaining a strong instructor presence throughout the course. Together, these methods helped students stay actively engaged in the learning process.
Liu et al. [10] utilized a BERT model to detect students' levels of cognitive presence in discussion posts. Existing machine learning models for automatically categorizing the degree of cognitive presence [26], [8], [27] served as inspiration for the authors. The findings showed that students' cognitive presence may vary depending on the course type and design; higher levels of cognitive presence were observed in sessions where students were asked to discuss their assignments. According to the results, higher degrees of cognitive presence are strongly correlated with participants' actual final course grades, which is in line with the findings of [28].
Alwafi [29] examined how students' cognitive presence and interactions during asynchronous discussion can be affected by elaboration feedback based on learning analytics. While the results of the first online discussion did not show any differences between the two groups, the results of successive discussions showed that the experimental group performed better in terms of raising the levels of cognitive presence and network density. Because they were aware of the quality of their participation and the linkages between their classmates, students who received learning analytics-based elaboration feedback felt that their motivation and engagement had grown. The study investigated the effects of a structured online interactive peer review system for Graduate Communication Capstone students. The levels of cognitive presence were coded as triggering, exploration, integration, and resolution. The authors noted that during the organized peer review process, students actively shared outside resources and cited reliable sources to support their arguments. Using the principles of cognitive apprenticeship (such as modelling, coaching, scaffolding, articulation, reflection, and exploration), the study demonstrated how computer-based cognitive tools may foster, assist, and extend learning and collaboration.
Hind et al. [30] proposed a method to make teachers aware of students' critical thinking and cognitive behavior so they can take appropriate actions to ensure effective learning. The method automatically evaluates students' CoI-coded cognitive presence in relation to their social interaction in asynchronous online discussion posts. The authors analyzed student transcripts using NLP and machine learning approaches, categorizing them into five classes of cognitive presence: triggering event, exploration, integration, resolution, and a none-category. They applied standard preprocessing steps and doc2vec feature extraction followed by a Naive Bayes classifier to classify posts into these five classes.
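The doc2vec-plus-classifier pattern can be sketched with gensim and scikit-learn as below. This is an illustrative pipeline under assumed settings (vector size, epochs, toy posts), not the authors' configuration.

```python
# Sketch: doc2vec embeddings of forum posts fed to a Naive Bayes classifier.
from gensim.models.doc2vec import Doc2Vec, TaggedDocument
from sklearn.naive_bayes import GaussianNB

posts = ["why does my proof fail here",
         "i explored two possible causes of the error",
         "combining both ideas gives a consistent argument",
         "applying the theorem resolves the original question"] * 10
labels = ["triggering", "exploration", "integration", "resolution"] * 10

corpus = [TaggedDocument(words=p.split(), tags=[i]) for i, p in enumerate(posts)]
d2v = Doc2Vec(vector_size=50, min_count=1, epochs=40)
d2v.build_vocab(corpus)
d2v.train(corpus, total_examples=d2v.corpus_count, epochs=d2v.epochs)

X = [d2v.infer_vector(p.split()) for p in posts]
nb = GaussianNB().fit(X, labels)
print(nb.predict([d2v.infer_vector("what causes this confusion".split())]))
```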
Hu et al. [31] presented a deep learning approach to automatically classify online discussion posts into CoI categories. They utilized discussion posts from two different courses and examined the generalizability and interpretability of a convolutional neural network (CNN) model for this classification problem. The performance of the CNN model was compared with a random forest classifier that used linguistic features of psychological processes and cohesion. In addition to analyzing feature importance, visualizations of the CNN's interpretability offer unique insights into distinguishing the stages of cognitive presence in posts.
Liu et al. [32] proposed MOOC-BERT, which uses discussion posts from MOOCs to identify students' cognitive presence. Unlabeled textual data from MOOCs on various topics were used to pre-train MOOC-BERT. The authors found that MOOC-BERT considerably outperformed the original BERT [33] and other baseline models, such as SVM, RF, NB, DPCNN, and FastText, in terms of cross-course generalizability, which indicates its potential for use in different types of courses and educational settings. The study also showed that pretraining followed by fine-tuning is even more important when the model lacks high-quality labeled data but sufficient large-scale unlabeled data are available. Additionally, the study identified performance issues in the resolution category of CoI, which remain to be addressed.
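Continued pretraining of this kind, i.e., masked-language-model training on unlabeled domain text before fine-tuning, can be sketched with the Hugging Face transformers library as below. The model name, stand-in corpus, and hyperparameters are assumptions for illustration, not the MOOC-BERT recipe itself.

```python
# Sketch: domain-adaptive masked-LM pretraining on unlabeled forum posts,
# in the spirit of MOOC-BERT. Settings here are illustrative.
from transformers import (AutoTokenizer, AutoModelForMaskedLM,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)
from datasets import Dataset

posts = ["unlabeled MOOC discussion post one ...",
         "unlabeled MOOC discussion post two ..."] * 100  # stand-in corpus

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")

ds = Dataset.from_dict({"text": posts}).map(
    lambda b: tokenizer(b["text"], truncation=True, max_length=128),
    batched=True, remove_columns=["text"])

collator = DataCollatorForLanguageModeling(tokenizer, mlm_probability=0.15)
trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="mooc-bert-mlm", num_train_epochs=1,
                           per_device_train_batch_size=8),
    train_dataset=ds,
    data_collator=collator,
)
trainer.train()  # afterwards, fine-tune the encoder on labeled posts
```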
Garrison et al. [34] developed a German-language cognitive presence classifier based on posts manually coded from an online university course, additional learning traces, and linguistic analysis with the LIWC tool. Each student's posts were classified independently into different classes of cognitive presence. The CoI coding scheme for cognitive presence (i.e., triggering event, exploration, integration, and resolution) was developed based on [35], which the authors translated into German and tailored to the course context. They considered various stages of cognitive presence for a post. For learning traces, they incorporated terminology from the course glossary, tagging, and attachments to posts in addition to the LIWC features. Finally, they employed K-nearest neighbor, random forest, and multilayer perceptron classifiers. The authors thereby contributed the first fully CoI-coded German dataset for cognitive presence analysis.
Darabi et al. [36] compared four different text analysis methods (Dictionary Method (DM), Coh-Metrix
Tool (CMT), Interaction Analysis Model (IAM), and CoI) and examined their effectiveness in identifying
indicators of cognitive presence in online learning. The authors conducted a comparative analysis of these four
methods using a dataset of asynchronous online discussions. They examined the similarities and differences in
the indicators of cognitive presence identified by each method and evaluated the effectiveness of each
approach. The findings of the study revealed that the CoI and the IAM model were more effective in
detecting cognitive presence compared to the DM and the CMT. The CoI, in particular, provided a
comprehensive framework for understanding cognitive presence by considering not only the cognitive aspects
but also the social and teaching presences within online discussions.
3.3 Methods for Cognitive Engagement Detection using KCSA Framework
Hind et al. [37] proposed a method to predict learners' level of cognitive engagement based on their involvement in online discussion posts using the KCSA framework. The method combined OWL (Web Ontology Language) ontologies with LSA (Latent Semantic Analysis) and performed a semantic classification of forum posts in a particular context selected by the course instructor. To derive qualitative information from learners' forum posts on the e-learning platform, all collected posts were analyzed using a content analysis method and coded along the cognitive dimension based on the KCSA coding scheme. Along with these data, the authors used numerical statistics, such as the number of posts, views, and responses, with an SVM classifier to predict learners' cognitive engagement in their online courses.
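One way to realize this combination is to build LSA features via truncated SVD over TF-IDF vectors and concatenate them with the numerical statistics before the SVM. The sketch below is a generic reconstruction under assumed dimensions and toy data, not the authors' pipeline.

```python
# Sketch: LSA (TF-IDF + truncated SVD) features concatenated with
# numerical post statistics, classified with an SVM.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.svm import SVC

posts = ["can someone explain this concept",
         "here is my elaborated answer with reasons",
         "short reply", "i summarize and evaluate the thread"] * 15
views = np.random.default_rng(1).integers(1, 100, len(posts))
replies = np.random.default_rng(2).integers(0, 8, len(posts))
labels = ["low", "high", "low", "high"] * 15  # hypothetical engagement levels

tfidf = TfidfVectorizer().fit_transform(posts)
lsa = TruncatedSVD(n_components=10, random_state=0).fit_transform(tfidf)
X = np.column_stack([lsa, views, replies])

svm = SVC().fit(X, labels)
print(svm.score(X, labels))  # training accuracy on the toy data
```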
Kew and Tasir [1] examined course forum posts and analyzed how students' cognitive engagement changed over time in an online learning platform. The authors investigated the association among students' cognitive engagement, gender, and the quantity of forum posts using inferential statistics, and applied the KCSA coding scheme to classify students' cognitive engagement into high-level and low-level involvement. They found that only around half of the students provided any justification for their posts, which indicated a low level of cognitive engagement, and emphasized the need to explore more options to encourage and raise students' cognitive engagement, which could also improve the quality of learning. Applying Fisher's exact test [38] and Spearman's correlation, the authors found that students' level of cognitive engagement was unaffected by gender and that there was only a very small correlation between the number of posts and students' cognitive engagement.
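Both tests are available in scipy.stats; the sketch below shows how such an analysis might be run on a hypothetical contingency table and paired measurements. All numbers are invented for illustration.

```python
# Sketch: Fisher's exact test (gender vs. high/low engagement) and
# Spearman's rank correlation (post count vs. engagement level).
from scipy.stats import fisher_exact, spearmanr

#                high engagement, low engagement
table = [[12, 18],   # female
         [10, 20]]   # male
odds_ratio, p_fisher = fisher_exact(table)
print(f"Fisher's exact: OR={odds_ratio:.2f}, p={p_fisher:.3f}")

post_counts = [3, 7, 2, 9, 5, 12, 4, 6]
engagement = [1, 2, 1, 2, 1, 2, 1, 1]  # 1 = low, 2 = high
rho, p_spear = spearmanr(post_counts, engagement)
print(f"Spearman: rho={rho:.2f}, p={p_spear:.3f}")
```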
4. Discussions
4.1. Findings on Research Questions
Analyzing students’ cognitive engagement on e-learning platforms through course discussion posts has
helped researchers and instructors address a wide range of questions related to online learning. Some
intriguing research questions and their potential solutions, based on existing literature, are discussed below:
RQ 1: What is the level of cognitive engagement among students in online learning discussion forums?
The level of cognitive engagement among students in online learning discussion forums varies depending
on several factors, including the design of the course, the quality of the discussion prompts, and the level of
student participation. Research suggests that students may engage in different types of cognitive activities
in online discussion forums, including elaborating on ideas, questioning assumptions, and analyzing
information. However, a study by Kew and Tasir [1] has found that students may also engage in superficial
or low-level discussions, such as summarizing information or expressing agreement or disagreement with
others. To improve the level of cognitive engagement in online learning discussion forums, instructors can
use strategies such as providing clear and open-ended prompts, modelling high-level discussion, and
encouraging active participation from all students [1].
RQ2: How does the level of cognitive engagement differ between students with different demographic
characteristics (e.g., age, gender, academic background)? The level of cognitive engagement in online
learning may differ based on various demographic characteristics of students, such as age, gender, and
academic background. Richardson and Newby [39] found that age is a significant predictor of cognitive
engagement in online learning, with older students exhibiting higher levels of engagement compared to
younger students. However, another study [40] found no significant differences in cognitive engagement
based on age. Gender has also been studied as a potential predictor of cognitive engagement in online
learning, where Hartono et al. [41] found that female students exhibited higher level of engagement
compared to male students, while others have found no significant differences based on gender. Moreover,
academic background has also been investigated as a potential predictor of cognitive engagement in online
learning. Kew and Tasir [1] found that students with a higher level of academic achievement or experience
in the subject matter exhibit higher levels of engagement compared to students with lower levels of
academic achievement or experience.
RQ3: How does the level of cognitive engagement change over time throughout the course? The level
of cognitive engagement in online learning can change over time throughout the course, influenced by
various factors such as the course design, the instructional strategies used, and the level of student
motivation. Singh et al. [42] investigated changes in cognitive engagement over time in online courses and found that cognitive engagement tends to be higher at the beginning of the course, when students are more motivated and excited about the new learning experience.
However, as the course progresses, cognitive engagement tends to decrease due to factors such as boredom,
frustration, or lack of engagement with the course material or activities. Richardson and Newby [39] found that
cognitive engagement may vary depending on the specific course activities or modules being studied. For
example, students may exhibit higher levels of engagement during more interactive or challenging
activities, while engagement may decrease during more passive or repetitive activities. Gray et al. [6] also
stated that the level of cognitive engagement in online learning is dynamic and may change over time
throughout the course. To promote sustained engagement, course designers and instructors may need to
incorporate a variety of instructional strategies and activities that challenge and motivate students
throughout the course, while also providing opportunities for reflection and feedback. Additionally, regular
communication and support from the instructor can also help to maintain students’ cognitive engagement
over time.
RQ4: What types of cognitive engagement are most prevalent among students in online learning
discussion forums? Online learning discussion forums can provide students with opportunities to engage in
various types of cognitive activities, such as reflection, critical thinking, and problem-solving [37]. Several
studies have investigated the types of cognitive engagement that are most prevalent among students in
online learning discussion forums. One type of cognitive engagement that is often observed in online
learning discussion forums is reflective thinking. Students may engage in reflective thinking by sharing
personal experiences, making connections between course concepts and real-world situations, and
discussing their learning processes and progress. Another type of cognitive engagement commonly
observed in online learning discussion forums is critical thinking. Students may engage in critical thinking
by analyzing and evaluating course concepts, questioning assumptions, and providing evidence to support
their arguments or perspectives. Problem-solving is also a common type of cognitive engagement observed
in online learning discussion forums. Students may engage in problem-solving by applying course concepts
to real-world problems, collaborating with peers to find solutions, and evaluating the effectiveness of
different solutions.
RQ5: How do different types of cognitive engagement relate to student learning outcomes (e.g., grades,
knowledge retention, satisfaction)? Several studies have shown a positive relationship between cognitive
engagement and student learning outcomes in online learning environments [43], [44], [6], [45]. For
example, Wei et al. [44] found that higher levels of cognitive engagement, particularly in the areas of
critical thinking and higher-order thinking, were positively correlated with higher grades and better
performance on exams. Richardson and Swan [43] found that higher levels of cognitive engagement,
specifically in the areas of critical thinking and reflective thinking, were positively related to student
satisfaction with the course and their perceived learning outcomes. Similarly, Cho and Heron [45] found
that higher levels of cognitive engagement, particularly in the areas of critical thinking and self-regulated
learning, were positively correlated with knowledge retention. These studies suggest that promoting higher
levels of cognitive engagement among students in online learning environments can lead to improved
learning outcomes, including higher grades, better exam performance, increased satisfaction with the
course, and better retention of knowledge.
RQ6: How does the level of cognitive engagement differ across different types of online learning
activities (e.g., discussion forums, quizzes, group projects)? The level of cognitive engagement can differ
across different types of online learning activities, as each type of activity may require different cognitive
processes or may be conducive to certain types of engagement. Discussion forums are often used to
promote reflective thinking, critical thinking, and social learning. Students may engage in a variety of
cognitive processes during forum discussions, including analyzing, evaluating, and synthesizing
information, as well as sharing personal experiences and perspectives. Therefore, discussion forums may be
particularly conducive to cognitive engagement that promotes deep learning and knowledge construction.
Quizzes are often used to promote retrieval practice, which can enhance memory retention and knowledge
acquisition. However, the cognitive engagement required during quizzes may be more focused on recall
and recognition rather than higher-order thinking or knowledge construction. Another important aspect is
group projects. Group projects can promote collaboration, communication, and problem-solving skills [46].
The cognitive engagement required during group projects may be more focused on applying knowledge to
real-world problems, generating new ideas, and synthesizing information from multiple sources. Overall,
the level and type of cognitive engagement required during different types of online learning activities can
vary widely, and each type of activity may have unique strengths and limitations for promoting cognitive
engagement and learning outcomes. Understanding the specific types of cognitive engagement required for
each activity can help instructors design more effective online learning experiences and promote deeper
levels of cognitive engagement among students.
RQ7: How do different instructional designs and pedagogical strategies impact cognitive engagement
in online learning? Different instructional designs and pedagogical strategies can significantly impact
cognitive engagement in online learning [27]. For example, instructional designs that incorporate active
learning strategies such as problem-based learning, case-based learning, and collaborative learning have
been found to enhance cognitive engagement. These strategies encourage students to take an active role in
their learning, work collaboratively with peers, and apply their knowledge to real-world problems.
Similarly, pedagogical strategies such as providing timely feedback, setting clear learning goals, and using
multimedia resources can also promote cognitive engagement. Providing timely feedback can help students
understand their progress and identify areas where they need to improve. Setting clear learning goals can
help students stay focused and motivated while using multimedia resources can provide a variety of
engaging learning experiences. On the other hand, instructional designs that rely heavily on lecture-based
formats and passive learning strategies have been found to limit cognitive engagement. These formats often
prioritize the delivery of information over active engagement and fail to provide opportunities for students
to apply their knowledge.
RQ8: How can instructors use cognitive engagement data to optimize their teaching strategies and
improve student learning outcomes? Instructors can use cognitive engagement data to optimize their
teaching strategies and improve student learning outcomes in several ways [25]. By analyzing cognitive
engagement data, instructors can identify areas where students may be struggling or disengaged and need
improvement. This information can help instructors adjust their teaching strategies to better meet the needs
of their students. Cognitive engagement data can also help instructors provide targeted feedback to
students. For example, if students are struggling with a particular concept or activity, instructors can
provide personalized feedback to help students improve. In addition, based on cognitive engagement data,
instructors can adjust their instructional strategies to better engage students. For example, if students are
disengaged during lectures, instructors can incorporate more interactive activities to increase engagement.
Cognitive engagement data can also be used to personalize learning experiences for students. Instructors
can use data to identify individual student strengths and weaknesses and provide tailored learning
experiences that better meet their needs. By monitoring cognitive engagement data over time, instructors
can track student progress and adjust as needed to ensure students are meeting learning objectives. Overall,
cognitive engagement data can provide valuable insights for instructors to optimize their teaching strategies
and improve student learning outcomes.
4.2 Scalability Challenge in Cognitive Engagement Frameworks
The ICAP framework demonstrates scalability, particularly in data-driven environments such as online
discussion forums, MOOCs, and large-scale asynchronous platforms. Its strength lies in its behavioural
taxonomy, which maps well onto automated classification techniques using machine learning and natural
language processing (NLP). For example, Hayati et al. [8] applied text mining and SVM classifiers to
effectively detect ICAP engagement levels in an online software engineering course. Similarly, Liu et al.
[9] utilized the BERT-CNN model to classify cognitive and emotional engagement in MOOC forums,
showing robust results at scale. The framework’s categorical structure (i.e., Passive, Active, Constructive,
Interactive) allows researchers to annotate and automate classification pipelines with a relatively high
degree of reliability. However, ICAP has certain limitations. The framework predominantly captures
observable behavioural engagement, potentially overlooking deeper cognitive or metacognitive dimensions
such as reflection or emotion-driven learning. Additionally, high-fidelity classification still relies on expert-labeled datasets for training, which may not generalize well across varying disciplines or cultures.
Therefore, while ICAP is scalable across large textual data sources, such as forum posts or chat logs, it may
be less effective in multimodal or nuanced instructional contexts unless further adapted.
The CoI framework is inherently scalable across diverse educational contexts, including higher
education, K–12, and blended or fully online courses. Its tripartite structure—comprising cognitive
presence, social presence, and teaching presence—has been validated in numerous studies as a
comprehensive lens for evaluating learning engagement. Lee et al. [21] applied a BERT-based model to
automatically classify forum posts into CoI categories, demonstrating the framework’s applicability to large
corpora. Similarly, Liu et al. [32] developed MOOC-BERT, a domain-specific language model trained on
unlabeled MOOC discussions and achieved high cross-course generalizability for detecting cognitive
presence. Despite its scalability, CoI presents some challenges when applied to a complex or high-volume
environment. The framework’s nuanced dimensions require substantial annotation effort and domain
knowledge, which can hinder rapid deployment without significant computational or human resources.
Furthermore, reliance on self-reported data in some implementations can limit its accuracy, and its
application may be less suited to non-traditional learning platforms where structured interactions (e.g.,
instructor facilitation) are minimal or absent. Nonetheless, with proper infrastructure and automation, CoI
remains a powerful framework for large-scale analysis of cognitive engagement.
The KCSA framework offers a detailed, cognitively focused structure for analyzing student discussions.
It segments learning into key phases: exploration, conceptualization, application, and reflection. For
example, Hind et al. [37] and Kew & Tasir [1] used KCSA to classify cognitive engagement levels using
both statistical and semantic methods (e.g., OWL, LSA, and SVM) on forum post data. These
implementations show the framework’s potential for scale, especially in well-structured LMS environments
or instructor-moderated forums. However, KCSA’s granularity and complexity can limit its scalability
across broader or less structured datasets. The coding scheme involves multiple subcategories (e.g., CHV1,
CI1, CIE, REJECT+), which require intensive manual annotation or sophisticated automation. Unlike ICAP
or CoI, KCSA has seen less adoption in large-scale MOOC contexts, and its application beyond forum-
based environments is relatively limited. Therefore, while KCSA is valuable for detailed cognitive analysis,
particularly in small to medium-sized courses, it may not yet be optimized for real-time or large-scale
adaptive learning environments.
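A minimal sketch of the semantic-similarity style of classification mentioned above (e.g., LSA) appears below; the anchor sentences and the coarse high/low elaboration labels are illustrative stand-ins for KCSA’s actual coding scheme, not the published subcategories.

from sklearn.decomposition import TruncatedSVD
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

posts = ["I think recursion terminates because each call shrinks the input.",
         "Thanks, that makes sense!"]
# Hypothetical exemplar sentences standing in for KCSA coding categories.
anchors = {"high_elaboration": "explains reasoning and justifies an answer with evidence",
           "low_elaboration": "gives a short social acknowledgement without new content"}

# Project posts and anchor exemplars into a shared latent semantic (LSA) space.
vectorizer = TfidfVectorizer()
tfidf = vectorizer.fit_transform(posts + list(anchors.values()))
lsa = TruncatedSVD(n_components=2, random_state=0).fit_transform(tfidf)

# Label each post with its most similar anchor category.
similarities = cosine_similarity(lsa[:len(posts)], lsa[len(posts):])
for post, row in zip(posts, similarities):
    print(list(anchors)[row.argmax()], "->", post)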
4.3 Domain Specificity and Context Sensitivity
Domain specificity is an essential factor in assessing the effectiveness of cognitive engagement models
in various learning environments. The ICAP framework is broadly applicable, proving useful across fields
such as STEM, the social sciences, business, and the humanities. Its hierarchical structure for
engagement enables researchers to categorize involvement across a wide array of online learning
environments. However, ICAP is limited in areas where real-time collaboration is needed, as it mainly
concentrates on individual student actions rather than on the shared construction of knowledge.
The CoI framework is most appropriate for educational settings that involve discussions, reflective
dialogue, argumentation, and collaborative knowledge building. It is most effective in the humanities and
social sciences, where instructor guidance (teaching presence) and student interaction (social presence) play
crucial roles in participation. Nevertheless, CoI is not as effective in STEM disciplines, where engagement
in learning typically occurs through experiments, problem-solving tasks, and coding exercises instead of
text-focused discussions. Consequently, CoI does not consistently indicate participation in highly
interactive or simulation-driven learning settings [48].
The KCSA framework is highly domain-specific, proving most effective in argumentation-driven fields
such as law, business, and political science. Its emphasis on cognitive processing,
problem-solving, and decision-making makes it particularly well-suited for group project work and debate
courses. It is less suitable in technical fields like computer science or engineering, where participation is
typically demonstrated through practical work rather than written communication. Additionally, the model does not
align well with self-paced learning, making it unsuitable for adaptive learning platforms [49].
4.4 Adaptability to Evolving AI-Driven Online Learning Environments
The adaptability of cognitive engagement frameworks to modern AI-driven learning environments is
crucial for maintaining their ongoing relevance. The ICAP framework has been widely used in machine
learning-based engagement detection, with models such as SVM, Random Forest, and neural networks
successfully classifying ICAP-based engagement levels. However, ICAP does not account for emotional
and social engagement, which are highly relevant in modern AI-driven learning. Moreover, its
applicability is limited to text-based interactions, making it difficult to integrate with video-based learning,
multimodal interactions, and real-time adaptive learning systems [47], [50].
The CoI framework has shown some adaptability to modern NLP-based analysis, with studies
employing BERT-CNN models, sentiment analysis, and deep learning classifiers to evaluate cognitive
presence. However, its strong reliance on discussion-based interactions limits its applicability to self-paced
learning environments, where learners primarily engage with AI tutors rather than human peers.
Additionally, CoI faces challenges with multimodal engagement detection, as it was not designed to
incorporate speech, visual cues, or behavioral data [7], [48].
The KCSA framework encounters the most significant challenges in adaptability. Unlike ICAP and
CoI, which have been integrated into NLP-driven engagement detection models, KCSA’s dependence on
structured argumentation and decision-making processes makes it difficult to automate. Furthermore, it
does not align well with real-time engagement tracking systems, making it unsuitable for modern AI-driven
adaptive learning environments [49].
4.5 Cognitive Engagement Detection Models: Strengths and Weaknesses
The identification of cognitive engagement in online education has greatly advanced through the
integration of Natural Language Processing (NLP) and Deep Learning (DL) methods. Conventional
techniques, like rule-based language analysis and statistical NLP systems, have gradually been replaced by
transformer-based frameworks and neural embedding models that provide enhanced contextual
understanding and scalability. Early engagement detection methods mainly depended on lexicon-based
strategies like Linguistic Inquiry and Word Count (LIWC) and Coh-Metrix, which examine sentiment,
lexical variety, and cohesion in discussion contributions. Although these techniques provide an
interpretable way to evaluate engagement, they face challenges in scalability and adaptability in various
learning settings because they depend on predefined dictionaries and linguistic rules. Similarly,
machine learning classifiers, such as Support Vector Machines (SVM), Random Forests (RF), and
Logistic Regression (LR), have been applied to engagement classification using manually engineered
linguistic and contextual features. While these models offer improved accuracy over lexicon-based
methods, they require significant feature engineering and fail to capture deeper cognitive
engagement patterns beyond surface-level text analysis.
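To illustrate the feature-engineering burden these classifiers carry, the sketch below derives a few hand-crafted linguistic features and trains a Random Forest on them. The feature set and toy labels are assumptions; published studies use far richer LIWC- or Coh-Metrix-style feature sets.

import re
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def handcrafted_features(post):
    # Each feature must be designed and validated by hand, which is the
    # engineering cost noted above.
    words = post.split()
    return [
        len(words),                                   # post length
        post.count("?"),                              # questioning behavior
        len(re.findall(r"\b(because|therefore|however)\b", post.lower())),  # reasoning markers
        np.mean([len(w) for w in words]) if words else 0.0,  # word-length proxy for lexical sophistication
    ]

posts = ["Why does this converge? I think because the gradient keeps shrinking.",
         "ok thanks",
         "However, the proof fails when n is zero, therefore we need a base case.",
         "nice"]
labels = [1, 0, 1, 0]  # toy labels: 1 = cognitively engaged, 0 = not

X = np.array([handcrafted_features(p) for p in posts])
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, labels)
print(clf.predict(X))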
The introduction of deep learning models, especially transformer-based architectures like BERT, GPT,
and XLNet, has greatly enhanced engagement detection by identifying hidden cognitive engagement
patterns and contextual dependencies in conversations. These models, pre-trained on large corpora, exhibit
excellent generalizability across various learning environments, making them suitable for ICAP, CoI, and
KCSA frameworks. Despite their enhanced accuracy, however, deep learning models present
computational difficulties, requiring large datasets and high processing power, which constrains their
feasibility for real-time engagement monitoring in online learning settings. To tackle these challenges,
hybrid AI models have emerged, blending conventional NLP heuristics with neural embeddings. For
example, the BERT-CNN and BERT-LIWC models integrate deep contextual embeddings with structured
knowledge frameworks to improve both interpretability and computational efficiency. These hybrid
methods surpass individual models by providing enhanced generalization and decreasing overfitting to
discussion-specific language.
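As a hedged sketch of the hybrid idea, the PyTorch model below passes BERT’s token embeddings through a one-dimensional convolution before classification, in the spirit of the BERT-CNN models cited above. The layer sizes and the four-class output are illustrative assumptions rather than a published architecture.

import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class BertCNN(nn.Module):
    def __init__(self, n_classes, channels=128, kernel=3):
        super().__init__()
        self.bert = AutoModel.from_pretrained("bert-base-uncased")
        self.conv = nn.Conv1d(self.bert.config.hidden_size, channels, kernel)
        self.head = nn.Linear(channels, n_classes)

    def forward(self, input_ids, attention_mask):
        # Contextual token embeddings: (batch, seq_len, hidden_size).
        hidden = self.bert(input_ids, attention_mask=attention_mask).last_hidden_state
        # Conv1d expects (batch, channels, seq_len), so swap the last two axes.
        conv_out = torch.relu(self.conv(hidden.transpose(1, 2)))
        pooled = conv_out.max(dim=2).values  # max-pool over the sequence
        return self.head(pooled)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
batch = tokenizer(["I disagree, because the evidence points the other way."],
                  return_tensors="pt", padding=True, truncation=True)
model = BertCNN(n_classes=4)  # e.g., four engagement levels (an assumption)
logits = model(batch["input_ids"], batch["attention_mask"])
print(logits.shape)  # torch.Size([1, 4])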
5. Conclusion
The ICAP, CoI, and KCSA frameworks have proven important for detecting the cognitive engagement of
students in online learning, and they can serve cognitive engagement detection methods in several ways.
Analyzing and categorizing student behavior during online learning with these frameworks imposes
structure on the data, making it simpler for instructors to comprehend and evaluate. By considering
several aspects of learning, such as social presence, cognitive presence, and specific categories of
cognitive processes, these frameworks provide a holistic view of student participation, giving instructors
insights into how students interact with the information and activities in their courses. They also allow
instructors to gather information on student behavior and use it to support data-driven decisions for
improving learning outcomes, covering adjustments to learning materials, teaching strategies, and course
design. Finally, these frameworks can help instructors ensure that their teaching methods align with their
intended learning outcomes, fostering a learning environment optimized for student success by focusing
on specific forms of engagement or aspects of learning.
These frameworks offer a structured, thorough, and data-driven approach to understanding student
behavior, making them crucial in the detection of students’ cognitive engagement during online learning.
This can result in more effective teaching strategies and better learning outcomes for students. While the
frameworks are useful for detecting students’ cognitive engagement during online learning, some
limitations may affect their accuracy and applicability, and there are instances in which they may fail.
These frameworks focus on the examination of
text-based data from written assignments, chat logs, and discussion forum entries. They might not fully
capture the context of the interaction, such as nonverbal cues, social dynamics, and cultural
considerations, all of which can affect students’ cognitive engagement. The coding categories of these
frameworks may be subjective and open to different interpretations, making the classification of
engagement types and the coding procedure inconsistent. These frameworks may also not account for
all forms of cognitive involvement, including metacognitive, creative, and emotional engagement, and
might therefore present an incomplete picture of how students engage cognitively in online learning
environments. These frameworks rely heavily on NLP and machine learning techniques to analyze large
amounts of text data; however, the accuracy of these techniques varies with the quality and complexity
of the text data, which in turn affects the reliability of cognitive engagement detection. Finally,
individual variations in cognitive engagement styles and preferences might not be considered; for
instance, some students may prefer in-depth reflection, while others favor learning activities that
require greater participation and activity. Overall, while these frameworks can be valuable tools for detecting
the cognitive engagement of students during online learning, it is important to consider their limitations and
potential biases to ensure that they provide an accurate and comprehensive understanding of students’
learning experiences.
It is important to note that each framework has its strengths and weaknesses, and the choice of the
framework should be based on the research question and the educational context. From the preceding
discussions, we extracted the advantages and disadvantages of each framework in detecting cognitive engagement in online
learning. The ICAP framework is accessible to a broad spectrum of academics and educators because it
is simple to use and understand. It has been validated in a variety of educational environments and
focuses on the direct observation of learning behaviors, making it valuable for identifying areas where
interventions may be required. However, it addresses only cognitive engagement and ignores other
elements that might affect student learning, such as motivation or emotional involvement. ICAP also
requires skilled observers to implement, which can limit its widespread adoption, and it may not be
sensitive enough to detect slower, more gradual changes in cognitive activation.
In contrast, the CoI framework views student participation holistically, considering the interactions among
teaching presence, social presence, and cognitive presence. It has been extensively researched and
validated in numerous educational settings, and it can be used practically to guide the design and
administration of online courses. However, CoI can be complex and difficult to implement, requiring a
thorough understanding of the framework and its elements. It also relies heavily on self-reported data,
which does not always accurately reflect student participation, and it may be less suited to evaluating
participation in other online learning contexts, such as massive open online courses (MOOCs).
A significant part of student learning is knowledge building, which is the focus of the KCSA framework.
It has been validated in diverse educational environments and is relatively simple to use and put into
practice. The methodology evaluates knowledge construction exclusively and does not address other
facets of student interaction, and the researcher or instructor must exercise some degree of subjectivity
in evaluating the data. Additionally, it may be less suited to gauging participation in other online
learning environments, such as massive open online courses (MOOCs). The framework has the
advantage of offering a systematic technique for assessing the cognitive processes that students use
during online discussions: by determining the stage of cognitive processing students are in, educators
and researchers can create interventions that support and enhance cognitive engagement and foster
deeper learning. However, one
potential drawback of the framework is that using the coding scheme to analyze online discussions is time-
consuming. Furthermore, the framework might fall short of capturing the dynamic and complex nature of
online discussions, which can involve several stages of cognitive processing at once.
The ICAP, CoI, and KCSA frameworks have been used in this review paper to shed light on the
challenging task of identifying cognitive engagement among online learners. Our investigation has
highlighted the multidimensional nature of cognitive engagement and stressed how crucial it is to
creating successful online learning environments. The fusion of deep learning and machine learning
methods has ushered in a new era of automatic engagement detection, enabling targeted interventions
and improved instructional design. Although the frameworks provide a thorough foundation, difficulties
remain: continued research is required to establish uniform metrics, address ethical issues in data
utilization, and conduct thorough cross-framework analysis. Combining these frameworks offers an
opportunity for a more comprehensive method of detecting cognitive engagement, enabling teachers to
design precise interventions that consider various learning preferences.
In the future, the dynamic nature of online learning will necessitate models that are adaptable and
account for changing engagement patterns. Detection accuracy may be improved by incorporating
multimodal data, such as visual and behavioral cues. Additionally, real-time feedback mechanisms can
establish a loop that dynamically adjusts instructional strategies as engagement evolves. Research into
hybrid models that combine the strengths of multiple frameworks may also yield a more thorough
understanding of cognitive engagement, ultimately producing a cohesive paradigm that incorporates its
cognitive, social, and emotional facets. Continued attention to ethical issues, particularly learner
autonomy and data privacy, is also important; it is crucial to strike a balance between raising
engagement and protecting learners’ privacy. In essence, this review emphasizes the central role of
cognitive engagement in enabling successful online learning. By fusing theoretical frameworks with
cutting-edge technologies, we foresee a future in which automated detection seamlessly informs
instructional strategies, providing an educational environment that is engaging, responsive, and
adaptive.
Acknowledgement: We acknowledge the support of the Natural Sciences and Engineering Research
Council of Canada, Alberta Innovates, and Athabasca University, Canada.
References
[1] S. N. Kew and Z. Tasir, "Analysing students' cognitive engagement in e-learning discussion forums through content analysis," Knowledge Management & E-Learning, vol. 13, pp. 39-57, 2021.
[2] S. Lv, C. Chen, W. Zheng and Y. Zhu, "The relationship between study engagement and critical thinking among higher vocational college students in China: A longitudinal study," Psychology Research and Behavior Management, 2022.
[3] Z. Akyol, J. B. Arbaugh, M. Cleveland-Innes, D. R. Garrison, P. Ice, J. C. Richardson and K. Swan, "A response to the review of the community of inquiry framework," Journal of Distance Education, vol. 23, no. 2, pp. 123-135, 2009.
[4] M. T. Chi, "Active-constructive-interactive: A conceptual framework for differentiating learning activities," Topics in Cognitive Science, vol. 1, no. 1, pp. 73-105, 2009.
[5] H. A. T. van der Meijden, "Knowledge construction through CSCL: Student elaborations in synchronous and three-dimensional learning environments," Radboud University Nijmegen, Houtlaan, Netherlands, 2005.
[6] J. A. Gray and M. DiLoreto, "The effects of student engagement, student satisfaction, and perceived learning in online learning environments," International Journal of Educational Leadership Preparation, 2016.
[7] D. R. Garrison, T. Anderson and W. Archer, "Critical inquiry in a text-based environment: Computer conferencing in higher education," The Internet and Higher Education, vol. 2, pp. 87-105, 1999.
[8] H. Hayati, K. M. Idrissi and S. Bennani, "Automatic classification for cognitive engagement in online discussion forums: Text mining and machine learning approach," in Artificial Intelligence in Education, Ifrane, Morocco, 2020.
[9] S. Liu, S. Liu, Z. Liu, X. Peng and Z. Yang, "Automated detection of emotional and cognitive engagement in MOOC discussions to predict learning achievement," Computers and Education, vol. 181, no. C, 2022.
[10] J. Devlin, M.-W. Chang, K. Lee and K. Toutanova, "BERT: Pre-training of deep bidirectional transformers for language understanding," arXiv preprint arXiv:1810.04805, 2018.
[11] I. Galikyan, W. Admiraal and L. Kester, "MOOC discussion forums: The interplay of the cognitive and the social," Computers & Education, vol. 165, Art. no. 104133, 2021.
[12] T. Atapattu, M. Thilakaratne, R. Vivian and K. Falkner, "Detecting cognitive engagement using word embeddings within an online teacher professional development community," Computers and Education, vol. 140, 2019.
[13] Z. Liu, W. Kong, X. Peng, Z. Yang, S. Liu, S. Liu and C. Wen, "Dual-feature-embeddings-based semi-supervised learning for cognitive engagement classification in online course discussions," Knowledge-Based Systems, vol. 259, 2023.
[14] J. W. Pennebaker, M. E. Francis and R. J. Booth, "Linguistic inquiry and word count: LIWC 2001," Mahwah: Lawrence Erlbaum Associates, vol. 71, 2001.
[15] R. L. Moore, K. M. Oliver and C. Wang, "Setting the pace: Examining cognitive processing in MOOC discussion forums with automatic text analysis," Interactive Learning Environments, vol. 27, pp. 655-669, 2019.
[16] M. Wen, D. Yang and C. Rose, "Linguistic reflections of student engagement in massive open online courses," in Proceedings of the International AAAI Conference on Web and Social Media, vol. 8, 2014.
[17] J. Wei and K. Zou, "EDA: Easy data augmentation techniques for boosting performance on text classification tasks," arXiv preprint arXiv:1901.11196, 2019.
[18] G. Gorgun, S. N. Yildirim-Erbasli and C. Demmans, "Predicting cognitive engagement in online course discussion forums," in International Educational Data Mining Society, 2022.
[19] A. Coxhead, "A new academic word list," TESOL Quarterly, vol. 34, pp. 213-238, 2000.
[20] J. Kincaid, R. Fishburne, R. Rogers and B. Chissom, "Derivation of new readability formulas (automated readability index, fog count, and Flesch reading ease formula) for Navy enlisted personnel," Naval Air Station Memphis, Chief of Naval Technical Training, Research Branch Report, 1975.
[21] J. Lee, F. Soleimani, J. Hosmer, M. Y. Soylu, R. Finkelberg and S. Chatterjee, "Predicting cognitive presence in at-scale online learning: MOOC and for-credit online course environments," Online Learning, vol. 26, no. 1, pp. 58-79, 2022.
[22] Piazza Technologies, "Piazza," [Online]. Available: https://piazza.com/product/overview. [Accessed: 25 Nov. 2024].
[23] J. A. R. Villanueva, P. Redmond and L. Galligan, "Manifestations of cognitive presence in blended learning classes of the Philippine K-12 system," Online Learning, vol. 26, no. 1, pp. 19-37, 2022.
[24] D. R. Garrison, "E-Learning in the 21st Century: A Community of Inquiry Framework for Research and Practice," 2017.
[25] G. Ozogul, M. Zhu and T. M. Phillips, "Perceived and actual cognitive presence: A case study of an intentionally-designed asynchronous online course," Online Learning, vol. 26, no. 1, pp. 38-57, 2022.
[26] M. T. Chi and R. Wylie, "The ICAP framework: Linking cognitive engagement to active learning outcomes," Educational Psychologist, vol. 49, no. 3, pp. 219-243, 2014.
[27] V. Kovanovic, S. Joksimovic, Z. Waters, D. Gasevic, K. Kitto, M. Hatala and G. Siemens, "Towards automated content analysis of discussion transcripts: A cognitive presence case," in Proceedings of the Sixth International Conference on Learning Analytics & Knowledge, 2016, pp. 15-24.
[28] A. Sadaf, S. Y. Kim and Y. Wang, "A comparison of cognitive presence, learning, satisfaction, and academic performance in case-based and non-case-based online discussions," American Journal of Distance Education, vol. 35, pp. 214-227, 2021.
[29] E. M. Alwafi, "Designing an online discussion strategy with learning analytics feedback on the level of cognitive presence and student interaction in an online learning community," Online Learning, vol. 26, no. 1, pp. 80-92, 2022.
[30] H. Hind, M. K. Idrissi and S. Bennani, "Automatic assessment of CoI-cognitive presence within asynchronous online learning," in IEEE International Conference on Information Technology Based Higher Education and Training, 2018.
[31] Y. Hu, R. F. Mello and D. Gasevic, "Automatic analysis of cognitive presence in online discussions: An approach using deep learning and explainable artificial intelligence," Computers and Education: Artificial Intelligence, 2021.
[32] Z. Liu, X. Kong, H. Chen, S. Liu and Z. Yang, "MOOC-BERT: Automatically identifying learner cognitive presence from MOOC discussion data," IEEE Transactions on Learning Technologies, 2023.
[33] L. Yao, Z. Jin, C. Mao, Y. Zhang and Y. Luo, "Traditional Chinese medicine clinical records classification with BERT and domain specific corpora," Journal of the American Medical Informatics Association, vol. 26, pp. 1632-1636, 2019.
[34] V. Dornauer, M. Netzer, E. Kaczko, L.-M. Norz and E. Ammenwerth, "Automatic classification of online discussions and other learning traces to detect cognitive presence," International Journal of Artificial Intelligence in Education, pp. 1-21, 2023.
[35] D. R. Garrison, T. Anderson and W. Archer, "Critical thinking, cognitive presence, and computer conferencing in distance education," American Journal of Distance Education, vol. 15, pp. 7-23, 2001.
[36] A. Darabi, M. C. Arrastia, D. W. Nelson, T. Cornille and X. Liang, "Cognitive presence in asynchronous online learning: A comparison of four discussion strategies," Journal of Computer Assisted Learning, vol. 27, pp. 216-227, 2011.
[37] H. Hind, M. K. Idrissi and S. Bennani, "Applying text mining to predict learners' cognitive engagement," in Mediterranean Symposium on Smart City Application, 2017.
[38] G. J. Upton, "Fisher's exact test," Journal of the Royal Statistical Society: Series A (Statistics in Society), vol. 155, pp. 395-402, 1992.
[39] J. C. Richardson and T. Newby, "The role of students' cognitive engagement in online learning," American Journal of Distance Education, vol. 20, pp. 23-37, 2006.
[40] D. C. Park, J. Lodi-Smith, L. Drew, S. Haber, A. Hebrank, G. N. Bischof and W. Aamodt, "The impact of sustained engagement on cognitive function in older adults: The Synapse Project," Psychological Science, vol. 25, pp. 103-112, 2014.
[41] F. P. Hartono, N. Umamah, R. Sumarno and P. N. Puji, "The level of student engagement based on gender and grade on history subject of senior high school students in Jember Regency," International Journal of Scientific and Technology Research, vol. 8, pp. 21-26, 2019.
[42] M. Singh, P. James, H. Paul and K. Bolar, "Impact of cognitive-behavioral motivation on student engagement," Heliyon, vol. 8, 2022.
[43] J. C. Richardson, "Examining social presence in online courses in relation to students' perceived learning and satisfaction," State University of New York at Albany, 2001.
[44] W. Li, J.-Y. Huang, C.-Y. Liu, J. C. Tseng and S.-P. Wang, "A study on the relationship between student's learning engagements and higher-order thinking skills in programming learning," Thinking Skills and Creativity, vol. 49, 2023.
[45] M.-H. Cho and M. L. Heron, "Self-regulated learning: The role of motivation, emotion, and use of learning strategies in students' learning experiences in a self-paced online mathematics course," Distance Education, vol. 36, pp. 80-99, 2015.
[46] J. Kim, "Influence of group size on students' participation in online discussion forums," Computers & Education, vol. 62, pp. 123-129, 2013.
[47] M. T. Chi and N. S. Boucher, "Applying the ICAP framework to improve classroom learning," in In Their Own Words: What Scholars and Teachers Want You to Know About Why and How to Apply the Science of Learning in Your Academic Setting, pp. 94-110, 2023.
[48] Z. Akyol and D. R. Garrison, "Understanding cognitive presence in an online and blended community of inquiry: Assessing outcomes and processes for deep approaches to learning," British Journal of Educational Technology, vol. 42, pp. 233-250, 2011.
[49] N. Vos, H. van der Meijden and E. Denessen, "Effects of constructing versus playing an educational game on student motivation and deep learning strategy use," Computers & Education, vol. 56, pp. 127-137, 2011.
[50] C. M. Thurn, P. A. Edelsbrunner, M. Berkowitz, A. Deiglmayr and L. Schalk, "Questioning central assumptions of the ICAP framework," npj Science of Learning, vol. 8, p. 49, 2023.