ENGAGING LEARNING ANALYTICS IN MOOCS: THE GOOD, THE BAD,
AND THE UGLY
Mohammad Khalil, Behnam Taraghi & Martin Ebner
Educational Technology, Graz University of Technology (Austria)
Abstract
Learning Analytics is an emerging field within the broad areas of Educational Technology and Technology
Enhanced Learning (TEL). It provides tools and techniques that enable researchers to analyze, study, and
benchmark institutions, learners, and teachers, as well as online learning environments such as MOOCs.
Massive Open Online Courses (MOOCs) are considered a very active and innovative way of bringing
educational content to a broad community. Because they are free and accessible to the public, MOOCs
have attracted a large number of heterogeneous learners who differ in education level, gender, and age.
However, there are pressing demands to assure the quality of the hosted courses, as well as to address the
high dropout ratio and the lack of interaction. With the help of Learning Analytics, it is possible to contain
such issues. In this publication, we discuss the principles of engaging Learning Analytics in MOOC
learning environments and review its potential and capabilities (the good), constraints (the bad), and
fallacy analytics (the ugly) based on our experience in recent years.
Keywords: Learning Analytics, MOOCs, pedagogy, potential, dilemma.
1. Introduction
Since 2008, Massive Open Online Courses (MOOCs) have shown significant potential to scale
education in distance learning environments. Their benefits shine when thousands of students can
participate in a course that no ordinary classroom could hold. Because they are free, available to the
public, and require no predefined level of participation, MOOCs have attracted a large number of learners
from all over the world, regardless of their educational background, gender, or age. Institutions of Higher
Education (HE) have started to think seriously about adopting MOOCs and making use of Open Educational
Resources (OER) principles. Well-known MOOC platforms include Coursera, established by Stanford
University, and edX, founded by the Massachusetts Institute of Technology and Harvard. Both platforms
provide various courses to university students. Moreover, MOOCs are not reserved for university and
college participants only, but are also offered to primary school children, for example through courses
provided by the Austrian MOOC provider iMooX (www.imoox.at).
Typically, MOOCs are based on video lectures, multiple-choice quizzes or peer-review
assessments, discussion forums and documents (Khalil & Ebner, 2016c; Lackner, Ebner & Khalil, 2015).
Lessons are delivered on a weekly basis, and students commit to attending during the week. Additionally,
students can solve assignments and then share and discuss their views in forums or on social media
networks. Further, teachers post questions and can communicate with students toward creating a domain
of presence (Khalil & Ebner, 2013). Nevertheless, frequent studies and reports point to the low
completion rate, the lack of interaction (Lackner, Ebner & Khalil, 2015), the difficulty of keeping
learners motivated, engagement issues, and, last but not least, cheating and gaming the MOOC systems
(Khalil & Ebner, 2015a; Khalil, Kastl & Ebner, 2016). As a result, mining student actions in distance
learning environments makes it easier for educationists and researchers to understand learner behaviors
and address such concerns.
The term “Big Data” has recently been taken up and explored in the field of education.
Two main research communities have formed around discovering new meaning in educational
datasets: the Educational Data Mining and the Learning Analytics communities (Papamitsiou &
Economides, 2014). In this paper, the focus lies mainly on Learning Analytics. We will discuss the
potential and capabilities as well as the constraints and negative sides of the field, with a strong focus on
MOOCs. These criteria are established based on our experience over the last couple of years of
implementing Learning Analytics prototypes and strategies in the Austrian iMooX platform.
International Conference on Education and New Developments 2016
2. Learning analytics potentiality in MOOCs (the good)
Learning Analytics is the analysis of student data in online environments in order to reveal hidden
patterns and discover paradigms of activity. In 2011, the Society for Learning Analytics Research
defined it as “… the measurement, collection, analysis and reporting of data about learners and
their contexts, for purposes of understanding and optimizing learning and the environment in which
it occurs”. The need for Learning Analytics emerged from the wish to optimize learning and benchmark
learning environments. Khalil and Ebner (2015b, 2016c) discussed the various promises of employing
Learning Analytics in MOOC platforms. Another recent study by Khalil and Ebner (2016b),
surveying Learning Analytics techniques from 2013 to 2015, shows that the combination of
Learning Analytics and MOOC-related topics scored the highest number of citations in Google Scholar
(http://scholar.google.com) during that period.
Online distance learning environments such as MOOCs provide a rich source of opportunities for
knowledge mining. By logging mouse clicks, forum activity, quiz performance, login frequency, time
spent on tasks and interactions with videos, Learning Analytics researchers can build an enormous
amount of data logs. This body of information, if interpreted appropriately, can help researchers from
diverse disciplines such as computer science, pedagogy, statistics, and machine learning to intervene
directly toward student success. The benefits of Learning Analytics in MOOCs are far-reaching. In the
following, we list the primary benefits of applying Learning Analytics in MOOCs:
Prediction: One of the most popular objectives pursued by both Learning Analytics and
Educational Data Mining. Techniques are used to predict when a participant is likely to drop out of an
online course. This can be done by analyzing student behavior, exam performance, and video skips.
Storing numerous records of previous students’ activities in specific modules helps researchers
predict prospective actions, such as dropping out of a course, or detect students at risk. Additionally,
Learning Analytics is used to predict performance and motivation (Edtstadler, Ebner & Ebner, 2015).
Further forecasting, such as of video watching in a course or of relative activity in discussion forums, is
also feasible.
Recommendation: Actions on MOOC platforms can be mined for recommendation purposes.
An example is a MOOC provider recommending learning materials to students based on their
previously registered courses. In addition, recommendations can be generated to suggest that a student
answer a specific question in the discussion forums.
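A minimal sketch of such a course recommender, based on co-enrollment counts over invented student and course names (none of which come from a real platform), could look like:

```python
from collections import Counter

# Hypothetical enrollment histories; students and course IDs are made up.
enrollments = {
    "alice": {"python101", "stats101", "ml201"},
    "bob": {"python101", "ml201"},
    "carol": {"python101", "stats101"},
    "dave": {"stats101", "ml201"},
}

def recommend(student, enrollments, k=1):
    """Suggest the k courses most often co-taken with the student's courses."""
    own = enrollments[student]
    scores = Counter()
    for other, courses in enrollments.items():
        if other == student:
            continue
        overlap = len(own & courses)          # shared courses weight the vote
        for course in courses - own:          # only courses not yet taken
            scores[course] += overlap
    return [course for course, _ in scores.most_common(k)]

print(recommend("carol", enrollments))  # ['ml201']
```

Production recommenders would use richer signals (clicks, ratings, content features), but the co-occurrence idea above is the common starting point.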
Visualization: Tracking the previously mentioned actions through Learning Analytics creates a large
number of records. Visualizations can be presented to participants via dashboards. Verbert and her colleagues
(2014) discussed how dashboards support awareness, reflection, and sense-making. At the same time,
analyzing data by visualizing them in plots helps researchers reveal patterns (Khalil &
Ebner, 2016c) and ultimately provides feedback and reflection to MOOC participants.
Entertainment: Gaming tools were considered a Learning Analytics technique in the work of Khalil and
Ebner (2016b). The survey illustrates how gamification makes learning in MOOCs more
entertaining, which results in increased motivation and completion rates among students. Such tools can
be badges (Wüster & Ebner, 2016), reward points, progress bars, or colorful gauges.
Benchmarking: Benchmarking is an evaluation process in which courses, videos,
assignments, and MOOC platforms can be assessed using Learning Analytics. Hence, we can identify
learning difficulties as well as weak points in online courses or stalling segments in video lectures.
Accordingly, constructive feedback is generated, which results in an enhanced educational system.
Personalization: Learners can shape their personal experience in a MOOC. Through
different types of Learning Analytics techniques (Khalil & Ebner, 2016b), developers can build a set of
personalized items into the MOOC platform. For example, a student can favorite a part of a video or
bookmark an article or a document. Further, (s)he can customize notifications and add annotations to videos.
Enhance Engagement: Engagement has recently become an attractive topic in MOOC research.
Learning Analytics employing data mining techniques such as clustering was used in (Kizilcec,
Piech & Schneider, 2013; Khalil, Kastl & Ebner, 2016). Expected results include grouping participants into
subpopulations of students, or classifying interactions with videos, assignments and quizzes for the purposes
of future interventions in MOOC designs or of studying students’ needs.
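Such a clustering step can be sketched with a tiny hand-rolled k-means (k = 2) over invented engagement vectors; this is an illustration of the general technique, not the procedure used in the cited studies:

```python
# Hypothetical engagement vectors per learner: (videos watched, quiz attempts).
learners = [(9, 8), (10, 9), (8, 10), (1, 0), (0, 2), (2, 1)]

def two_means(points, iters=10):
    """Tiny k-means with k = 2 and deterministic initialisation.

    Assumes both clusters stay non-empty, which holds for well-separated data.
    """
    # Spread the initial centers: least-active and most-active learner.
    centers = [min(points, key=sum), max(points, key=sum)]
    for _ in range(iters):
        groups = ([], [])
        for p in points:
            dists = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centers]
            groups[dists.index(min(dists))].append(p)
        # Recompute each center as the mean of its group.
        centers = [tuple(sum(v) / len(g) for v in zip(*g)) for g in groups]
    return groups

passive, active = two_means(learners)
print(passive)  # the three low-engagement learners
```

Real analyses would standardise features, pick k by validation, and use a library implementation; the sketch only shows how raw activity logs become learner subpopulations.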
Communication Information: Learning Analytics involves collecting data from various sources and
processing them. It is further used to report information in the form of statistical analyses to different MOOC
stakeholders. Similar to web analytics, students can check their activities and review general statistics
using dashboards, for example. In addition, teachers and decision makers can build an overview of a
MOOC using descriptive statistics.
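Such a descriptive overview can be as simple as the following sketch; the scores and the pass mark of 60 are invented for illustration:

```python
import statistics

# Hypothetical quiz scores logged for one course unit (invented numbers).
scores = [55, 70, 85, 90, 40, 75, 80, 65]

overview = {
    "learners": len(scores),
    "mean": statistics.mean(scores),
    "median": statistics.median(scores),
    "stdev": round(statistics.pstdev(scores), 1),
    # Pass mark of 60 is an assumption for this example.
    "pass_rate": sum(s >= 60 for s in scores) / len(scores),
}
print(overview)
```

A dashboard would render these figures per unit and over time, but the underlying aggregation is exactly this kind of summary.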
Cost Saving: Since Learning Analytics provides tools for data analysis, it opens the door to
broad examination services that make it possible to determine the weak sections of a MOOC. Therefore,
decision makers can allocate resources effectively.
ISBN: 978-989-99389-8-4 © 2016
3. The negative side of learning analytics in MOOCs (the bad)
Although Learning Analytics achieves several benefits when applied to educational
data streams, rising constraints have been identified lately (Papamitsiou & Economides, 2014; Khalil &
Ebner, 2015b). The large scale of data collection and processing confronts Learning Analytics with questions
related to privacy and ethical issues. An atmosphere of uncertainty among practitioners of Learning
Analytics as well as decision makers decelerates its steep growth (Drachsler & Greller, 2016). Based on
our experience, we encourage educational organizations to adopt the CIA security model, which stands
for Confidentiality, Integrity, and Availability. In this section, we list major concerns of implementing
Learning Analytics in MOOC platforms:
Security: The stored records of students in databases that belong to Learning Analytics
applications lie at the heart of their private information. However, maintaining a secure database
configuration is not always a priority for organizations. As a result, breaches of confidential information
can happen.
Privacy: Learning Analytics can reveal personal information about learners. MOOC datasets may
hold sensitive information such as emails, names or addresses. Privacy has been considered a threat in
Learning Analytics (Papamitsiou & Economides, 2014) and a constraint (Khalil & Ebner, 2015b).
Different solutions can be proposed, such as anonymization approaches (Khalil & Ebner, 2016a),
encryption, or increased restrictions.
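One common anonymization building block is pseudonymization: replacing direct identifiers with a salted hash before analysis. The sketch below is a generic illustration of that idea (the record, salt, and field names are invented), not the specific de-identification procedure of Khalil and Ebner (2016a):

```python
import hashlib

# Hypothetical raw log record containing direct identifiers.
record = {"name": "Jane Doe", "email": "jane@example.org", "quiz_score": 87}

SALT = b"course-secret-salt"  # would be stored apart from the research dataset

def pseudonymise(record, id_fields=("name", "email")):
    """Replace direct identifiers with a salted hash; keep analytic fields."""
    key = "|".join(str(record[f]) for f in id_fields).encode()
    out = {k: v for k, v in record.items() if k not in id_fields}
    # Same learner always maps to the same opaque ID, enabling longitudinal
    # analysis without exposing the identity.
    out["learner_id"] = hashlib.sha256(SALT + key).hexdigest()[:12]
    return out

safe = pseudonymise(record)
print("email" in safe, "quiz_score" in safe)  # False True
```

Note that pseudonymization alone does not guarantee anonymity; quasi-identifiers left in the data can still allow re-identification, which is why it is usually combined with aggregation or suppression.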
Ownership: Questions related to who owns the analyzed data of MOOCs can emerge
at any time. Participants like to keep their information confidential, but at the same time, a consent policy is
essential to ensure transparency. Further, MOOC providers are encouraged to delete or de-identify the
personal information of their participants.
Consent: Consent is closely related to the ownership of data. Not every MOOC provider clearly declares
how students’ data are used. Policies with legislative frameworks should include rules for the collection of
personal information and a description of how the information is used, for instance for research purposes
or for selling information to third parties.
Transparency: Secret processes can hide unfair decision making when analytics is applied to
educational datasets (Sclater, 2014). By the same token, when Learning Analytics is applied to MOOCs,
providers need to disclose their approach to collecting, analyzing, and using participants’ data. At the
same time, a point of balance must be found when the Learning Analytics algorithms or tools are
proprietary. Sclater discussed different codes of practice regarding transparency.
Storage: As long as MOOCs are open to the public, a single course can attract thousands of
students. Storing big data can be costly, overloading, and complex, as well as hard to manage.
Furthermore, according to the European Directive 95/46/EC¹, personal data must be stored no longer
than necessary.
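A retention rule of this kind can be enforced with a periodic sweep over the log store. The sketch below is a generic illustration with an assumed 365-day window; the actual lawful retention period depends on the purpose of processing and is not specified by this paper:

```python
from datetime import datetime, timedelta

RETENTION = timedelta(days=365)  # assumed policy window, for illustration only

# Hypothetical event log records.
logs = [
    {"learner_id": "a1", "event": "video_play", "ts": datetime(2015, 1, 10)},
    {"learner_id": "b2", "event": "quiz_submit", "ts": datetime(2016, 2, 20)},
]

def sweep(logs, now):
    """Keep only records younger than the retention window."""
    return [r for r in logs if now - r["ts"] <= RETENTION]

kept = sweep(logs, now=datetime(2016, 3, 1))  # the 2015 record is dropped
```

In a real system the sweep would run as a scheduled job against the database, and deletion would be logged for auditability.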
4. The dark side of learning analytics in MOOCs (the ugly)
Data quality is an important factor in Learning Analytics. When data records contain incomplete
segments or polluted information, Learning Analytics is negatively affected. Moreover, a holistic overview
of students in online courses cannot be harvested from the traces they leave on MOOCs alone. Are there
any guarantees regarding the results of Learning Analytics? What about their accuracy? In this section,
we summarize some of the worst-case outcomes that Learning Analytics can produce when employed
in MOOCs.
False Positives: Making decisions, whether by analysts or directors, based on a small subset of
data can lead to hasty judgments and hence trigger “false positives”. Consequently, the accuracy of any
forthcoming decision in a MOOC system will be influenced. For instance, if a group of students were
“gaming the system” and an analyst builds a prediction model for all students based on MOOC indicator
fulfillment, then false positives are triggered. As a matter of fact, Learning Analytics is not only
based on numbers and statistics; the judgments and opinions of researchers play a major role. There is,
for example, ongoing disagreement about activity in MOOC discussion forums and its correlation with
performance: some researchers found that more social activity in forums reflects positively on performance,
while others argue against this theory. In light of that, Learning Analytics is not always accurate.
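The small-subset pitfall can be made concrete with a toy cohort (all numbers invented): estimating a completion rate only from forum-active learners inflates the figure relative to the whole course, so a decision based on that subset alone would misjudge the general population:

```python
# Hypothetical cohort of 100 learners: (forum_active, completed).
cohort = ([(True, True)] * 40 + [(True, False)] * 10
          + [(False, True)] * 5 + [(False, False)] * 45)

def completion_rate(rows):
    """Fraction of learners in `rows` who completed the course."""
    return sum(completed for _, completed in rows) / len(rows)

overall = completion_rate(cohort)                           # 45/100 = 0.45
active_only = completion_rate([r for r in cohort if r[0]])  # 40/50 = 0.80
```

Here forum activity and completion are correlated by construction, so the convenient subset nearly doubles the apparent completion rate; any intervention calibrated on it would overestimate success for the silent majority.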
Fallacy Analytics: Analytics can fail, and thus mistaken interventions or predictions occur.
Failures can happen during any of the main processes of the Learning Analytics cycle. Wrong actions in
collecting data from MOOCs, errors in processing or filtering, and mistaken interpretation of data are
possible scenarios of fallacy analytics. Additionally, presenting results through visualizations can fall
into the same category. Visualizations are a great way to report information, but playing with scales or
using 3D figures can mislead the end user (student, teacher, decision maker). Fallacy analytics may
be accidental rather than intentional; however, using data interpreted on the basis of fallacy analytics can
be dangerous to different stakeholders and uneconomical to the MOOC business. Fallacy analytics through
the misuse of statistics as a method of Learning Analytics corrupts and pollutes research records as well
as wasting the time and energy of other researchers (Gardenier & Resnik, 2002).
¹http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:31995L0046:en:HTML (last visited: March, 2016)
Bias: Learning Analytics can show significant results in prediction and recommendation. It
can also appear to prove hypotheses, such as a relation between activity in discussion forums and
performance, or between watching videos and passing MOOCs. Collected data may seem to support such
conclusions, but this often traces back to the intentions of the researcher or decision maker. A bias
towards a certain hypothesis, and the inner determination to prove a theory from students’ data, leads to
biased Learning Analytics.
Meaningful data: Papamitsiou and Economides (2014) note that Learning Analytics
mostly uses quantitative research results; qualitative methods have not yet shown significant results.
Learning Analytics can be ineffective and a waste of effort if meaningful data are hard to extract. Dringus
(2012) raised two main concerns regarding meaningful data in Learning Analytics: 1) the collected data
may have no impact on improving or changing education; 2) the data may offer no meaningful evidence,
for instance due to a lack of clarity about what to measure to obtain meaningful information.
5. Conclusion
Learning Analytics provides various tools and techniques to optimize learning. In this paper, we reviewed
the principles of engaging Learning Analytics in Massive Open Online Courses (MOOCs). We discussed
the capabilities (the good), the dilemmas (the bad), and the out-of-bounds situations (the ugly).
Figure 1. The advantages and disadvantages of Learning Analytics in MOOCs
Figure 1 summarizes our results. Generally speaking, the combination of MOOCs and Learning Analytics
holds high potential. Nevertheless, a code of practice should be considered by all stakeholders in order to
achieve the optimum outcomes.
References
Drachsler, H. & Greller, W. (2016). Privacy and Learning Analytics: it’s a DELICATE issue. In
Proceedings of the Sixth International Conference on Learning Analytics and Knowledge
(LAK ’16), Edinburgh, United Kingdom. ACM.
Dringus, L. P. (2012). Learning Analytics Considered Harmful. Journal of Asynchronous Learning
Networks, 16(3), (pp. 87-100).
Edtstadler, K., Ebner, M., Ebner, M. (2015). Improved German Spelling Acquisition through Learning
Analytics. eLearning Papers, 45, (pp. 17-28).
Gardenier, J., & Resnik, D. (2002). The misuse of statistics: concepts, tools, and a research agenda.
Accountability in Research: Policies and Quality Assurance, 9(2), (pp. 65-74).
Khalil, H., & Ebner, M. (2013). Interaction Possibilities in MOOCs – How Do They Actually Happen? In
International Conference on Higher Education Development (pp. 1-24).
Khalil, M., & Ebner, M. (2015a). A STEM MOOC for school children – What does Learning Analytics
tell us? In Proceedings of the 2015 International Conference on Interactive Collaborative Learning
(ICL) (pp. 1217-1221). IEEE.
Khalil, M., & Ebner, M. (2015b). Learning Analytics: Principles and Constraints. In Proceedings of
World Conference on Educational Multimedia, Hypermedia and Telecommunications
(pp. 1326-1336).
Khalil, M., & Ebner, M. (2016a). De-Identification in Learning Analytics. Journal of Learning Analytics,
3(1).
Khalil, M., & Ebner, M. (2016b). What is Learning Analytics about? A Survey of Different Methods
Used in 2013-2015. In Proceedings of the 8th e-Learning Excellence Conference. Dubai, UAE.
Khalil, M. & Ebner, M. (2016c). What Massive Open Online Course (MOOC) Stakeholders Can Learn
from Learning Analytics? Learning, Design, and Technology: An International Compendium of
Theory, Research, Practice, and Policy. Springer.
Khalil, M., Kastl, C., & Ebner, M. (2016). Portraying MOOCs Learners: a Clustering Experience Using
Learning Analytics. In Proceedings of the European Stakeholder Summit on experiences and best
practices in and around MOOCs (EMOOCs 2016). Khalil, M., Ebner, M., Kopp, M., Lorenz, A. &
Kalz. M. (Eds.). BookOnDemand, Norderstedt. (pp. 265-278).
Kizilcec, R. F., Piech, C., & Schneider, E. (2013). Deconstructing disengagement: analyzing learner
subpopulations in massive open online courses. In Proceedings of the third international
conference on Learning Analytics and knowledge (pp. 170-179). ACM.
Lackner, E., Ebner, M., & Khalil, M. (2015). MOOCs as granular systems: design patterns to foster
participant activity. eLearning Papers, 42, (pp. 28-37).
Papamitsiou, Z. K., & Economides, A. A. (2014). Learning Analytics and Educational Data Mining in
Practice: A Systematic Literature Review of Empirical Evidence. Educational Technology &
Society, 17(4), (pp. 49-64).
Sclater, N. (2014). Code of practice for Learning Analytics: A literature review of the ethical and
legal issues. Available at: http://repository.jisc.ac.uk/5661/1/Learning_Analytics_A-_Literature_Review.pdf.
Verbert, K., Govaerts, S., Duval, E., Santos, J. L., Van Assche, F., Parra, G., & Klerkx, J. (2014).
Learning dashboards: an overview and future research opportunities. Personal and Ubiquitous
Computing, 18(6), (pp. 1499-1514).
Wüster, M. & Ebner, M. (2016). How to integrate and automatically issue Open Badges in MOOC
platforms. In Proceedings of the European Stakeholder Summit on experiences and best practices
in and around MOOCs (EMOOCs 2016). Khalil, M., Ebner, M., Kopp, M., Lorenz, A. & Kalz. M.
(Eds.). BookOnDemand, Norderstedt, (pp. 279-286).
... Privacy has been considered as a threat in LA (Papamitsiou, 2014) and as a constraint (Hoel & Chen, 2018). Some solutions have been proposed such as to anonymize the traceable data such as names and emails or encryption, or increasing restrictions (Khalil, Taraghi, & Ebner 2016). Scholars have also proposed principles that could be used to further develop an educational maxim for data privacy in learning analytics including privacy and data protection in LA that could be achieved by negotiating data sharing with each student (Hoel & Chen, 2018). ...
... One of the key concerns in analytics is to rely on the available data sets (Khalil, Taraghi, & Ebner 2016). Often the datasets are limited with limited sample size compared to the population data and the key question is whether the data provide enough representation to the population? ...
Conference Paper
Full-text available
Network analysis simulations were used to guide decision-makers while configuring instructional spaces on our campus during COVID-19. Course enrollment data were utilized to estimate metrics of student-to-student contact under various instruction mode scenarios. Campus administrators developed recommendations based on these metrics; examples of learning analytics implementation are provided.
... In-depth data analysis: By utilising advanced Learning Analytics techniques, future studies could more thoroughly analyse the data generated by MOOC, allowing for the identification of patterns in participant engagement, learning, and success [57,58]. ...
Article
Full-text available
In the context of 21st-century educational transformation, Massive Open Online Courses have emerged as a global and flexible learning opportunity. This study explores the impact of MOOC implemented at Instituto Super Técnico on student engagement and attendance, focusing on their effectiveness in higher education. A descriptive approach was applied, selecting three MOOC as the empirical field, combining quantitative data from participation logs and qualitative feedback from student surveys. The key characteristics, the challenges, and the effectiveness of educational resources were evaluated, particularly concerning student interaction and perceptions of course quality. The pedagogical analysis highlighted both successful strategies and areas for improvement. The findings offer valuable insights into the influence of MOOC on students and the institution, contributing to the broader understanding of Portugal’s educational landscape. This study also proposes a framework for enhancing MOOC strategies in higher education globally, considering the potential of technological innovations in the 21st century.
... The objective has been to better understand the learner and thus gain insights which may potentially help optimise learning. Data that tracks learning activities, used as a basis for learning analytics, is one of the biggest and the most reliable information sources associated with online participants [19,24]. ...
Article
Full-text available
Analysing learners’ behaviours in MOOCs has been used to identify predictive features associated with positive outcomes in engagement and learning success. Early methods predominantly analysed numerical features of behaviours such as the page views, video views, and assessment grades. Analysing extracted numeric features using baseline machine learning algorithms performed well to predict the learners’ future performance in MOOCs. We propose categorising learners by likely English language proficiency and extending the range of data to include the content of comment texts. We compare results to a model trained with a combined set of extracted features. Not all platforms provide this rich variety of data. We analysed a series of a FutureLearn language focused MOOCs. Our data were from discussions embedded into each lesson’s content. Analysing whether we gained any additional insights, over 420,000 comments were used to train the algorithm. We created a method for identifying one’s possible first language from their country. We found that using comments alone is a weaker predictive approach than using a combination including extracted features from learners’ activities. Our study contributes to research on generalisability of learning algorithms. We replicated the method across different MOOCs—the performance varies on the model though it always remained over 50%. One of the deep learning architecture, Bidirectional LSTM, trained with discussions on the language learning 73% successfully predicted learners’ performance on a different MOOC.
Article
Full-text available
Collecting and analyzing log data can provide students with individualized learning to maintain their motivation and engagement in learning activities and reduce dropout in Massive Open Online Courses (MOOCs). As online learning becomes more and more important, the demand for learning analytics is surging to design a variety of interventions that can achieve learning success and achieve individual learning goals and targets. In response to significant demand, we intended to derive data standards for learning analytics by specifying more the factors influencing MOOC completion suggested in previous research results. Therefore, this study aims to compare the event logs of students who have achieved scores adjacent to the minimum passing score of Korean Massive Open Online Course (K-MOOC) completion by dividing them into the completion (C) group and the non-completion (NC) group. As a result of analyzing the log data accumulated on the 60 K-MOOCs, what is interesting in the results of this study is that there was no significant difference between the C group and the NC group in video viewing, which is considered the main learning activity on the MOOC platform. On the other hand, there was a statistically significant difference between the C group and the NC group for textbook interactions in the percentage of learners who performed and the average number of logs per learner, as well as problem interactions in the average number of logs per learner. Students’ assertive activities such as textbook interaction and problem interaction might have greater value for MOOC completion than passive activities such as video watching. Therefore, MOOC instructors and developers should explore more specific design guidelines on how to provide problems with individualized hints and feedback and offer effective digital textbooks or reference materials for the large number of students. 
The results suggest that collecting and analyzing MOOC students’ log data on interactions, for understanding their motivation and engagement, should be investigated to create an individualized learning environment and increase their learning persistence in completing MOOCs. Future studies should focus on investigating meaningful patterns of the event logs on learning activities in massive quantitative and qualitative data sets.
Article
Full-text available
MOOCs are an innovative way to conduct teaching-learning in online, with the help of internet technology. The objectives of the study are to Know about the user friendly characteristics of MOOCS. Know the role of MOOCS as an alternative to traditional education, role of MOOCs in higher education. Explore the benefits of MOOCS and to know about the disadvantages of MOOCS. It is a qualitative research. The study is conducted on the bases of researcher observation and critically analysis of the providing facilities of MOOCS in higher education. MOOCs is so popular because it is based on learner centered pedagogy, select learner interest topic, Value oriented, Language is no barrier here, globally use, any time anywhere, Learning is so enjoyable. MOOCs are no alternative to traditional formal education but it is modified then traditional learning. The special role of MOOCs in higher education is it is a free platform, Access for all, Professional development, Provide new courses. Including above benefits there are some limitations of MOOCs are- absence of physical classroom, absence of teacher physical interaction, Here individual attention by the teacher is absent, in remote area where internet connection is week where it is a big problem, most courses are provided by English that’s why language is one of the barriers, to assessment the students is difficult, interaction with teacher and other students is less. Keywords: advantages, disadvantages, higher learning, MOOCs, online learning.
Chapter
Online learning has proved its effectiveness in the last few years among a wide range of learners. Massive Open Online Courses (MOOCs) have revolutionized the shape of learning because they are considered to be a substitutional tool to the conventional educational system for many reasons, such as flexibility in timing and eliminating the economic and geographical constraints to the learners. MOOCs also enable learners from different cultures to communicate and share their knowledge through forums. Nevertheless, MOOCs are encountering several challenges that are required to be addressed, such as the higher dropout rates among learners at different phases of the course, and reduction in participation level of learners. In this chapter, we aim to address the most familiar four challenges and enhance the MOOCs experience through providing a framework of integrating a Learning Analytics technique and Intelligent Conversational Agent (LAICA) to improve the MOOCs experience for learners and educators.
Chapter
Learning Analytics provides researchers with the opportunity to evaluate, monitor, and compare institutions, learners, and instructors, as well as online learning environments such as MOOCs. Massive Open Online Courses (MOOCs) are a trendy and creative way to deliver educational content to a large audience. Because of being free and publicly available, MOOCs attracted many heterogeneous learners who vary in education, gender, and age rates. However, there are pressing demands to change the courses’ standards and monitor the high dropout ratio and the lack of interaction. Those issues can be handled with the aid of Learning Analytics. This chapter addresses the major challenges of integrating Learning Analytics in learning environments within MOOCs and examining their potential benefits and limitations.
Article
This paper focuses on quality assurance in language massive open online courses (LMOOCs). It is a qualitative study that adopts the grounded theory method and analyses evaluative comments on the quality of LMOOCs from learners’ perspectives. Using data collected from 1,000 evaluations by English as a second language (ESL) learners on China’s biggest MOOC platform, “iCourse”, this study examines what has influenced learners’ perceptions of LMOOCs and identifies specific quality criteria for five types of LMOOC: ESL courses for speaking, reading, writing, cultural studies, and integrated skills. The results of the study lay a foundation for establishing a quality criteria framework for LMOOCs and provide insights into design principles for effective online language courses tailored to the diverse needs of a massive number of language learners.
Chapter
Full-text available
Massive open online courses (MOOCs) provide anyone with Internet access the chance to study at university level for free. In such learning environments, and due to their ubiquitous nature, learners produce vast amounts of data representing their learning process. Learning Analytics (LA) can help identify, quantify, and understand these data traces. Within the implemented web-based tool, called LA Cockpit, basic metrics were defined to capture learners’ activity on the Austrian MOOC platform iMooX. Data is aggregated through behavioral and web analysis and paired with state-of-the-art visualization techniques to build an LA dashboard. It is intended as a suitable tool to bridge the distant nature of learning in MOOCs. Together with its extensible design, the LA Cockpit shall serve as a future-proof framework to be reused and improved over time. Aimed at administrators and educators, the dashboard contains interactive widgets that let users explore the datasets themselves rather than being presented with fixed categories. This supports data literacy and improves the understanding of the underlying key figures, helping users generate actionable insights from the data. The web-analytical feature of the LA Cockpit captures mouse activity in individual course-wide heatmaps to identify regions of learner interest and to help separate structure from content. Activity over time is aggregated in a calendar view, making recurring temporal patterns visible that could otherwise not be deduced. Through the additional feedback from the LA Cockpit on learners’ behavior within the courses, it becomes easier to improve the teaching and learning process by tailoring the provided content to the needs of the online learning community.
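The calendar-view aggregation described above can be illustrated with a minimal sketch: folding timestamped activity events onto a (weekday, hour) grid so that weekly recurring patterns become visible. The event data and function name here are invented for illustration and are not taken from the LA Cockpit implementation.

```python
from collections import Counter
from datetime import datetime

def calendar_view(timestamps):
    """Aggregate event timestamps into (weekday, hour) activity counts --
    the kind of grid a calendar-style dashboard widget could render."""
    return Counter((t.strftime("%A"), t.hour) for t in timestamps)

# hypothetical click events from a course log
events = [datetime(2016, 5, 2, 10, 15),   # Monday
          datetime(2016, 5, 9, 10, 40),   # the following Monday
          datetime(2016, 5, 3, 18, 5)]    # Tuesday evening
view = calendar_view(events)
# The two Monday-morning events fold onto the same cell, exposing a weekly pattern.
```

Rendering the counts as a heatmap (e.g. weekdays on one axis, hours on the other) would then surface the "timely reoccurring patterns" the abstract mentions.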
Article
Full-text available
The area of Learning Analytics has developed enormously since the first International Conference on Learning Analytics and Knowledge (LAK) in 2011. It is a field that combines different disciplines such as computer science, statistics, psychology, and pedagogy to achieve its intended objectives. Its main goals are to create appropriate interventions in learning and its environment, and ultimately to optimize outcomes for the stakeholders of the learning domain. Because the field has matured and is now adopted in diverse educational settings, we believe there is a pressing need to catalogue its research methods and to specify its objectives and dilemmas. This paper surveys publications from the Learning Analytics and Knowledge conference from 2013 to 2015 and lists the significant research areas in this sphere. We profile the methods used and classify them into seven categories, with a brief description of each. Furthermore, we show the most cited method categories using Google Scholar. Finally, the authors raise the challenges and constraints that affect the field's ethical approach through the meta-analysis study. It is believed that this paper will help researchers identify the common methods used in Learning Analytics and will assist in forecasting future research work that takes into account the privacy and ethical issues of this rapidly emerging field.
Chapter
Full-text available
Massive open online courses (MOOCs) are the road that led to a revolution and a new era of learning environments. Educational institutions have come under pressure to adopt new models that ensure openness in their distribution of education. Nonetheless, there is still debate about the pedagogical approach and the proper delivery of information to students. On the other hand, Learning Analytics makes powerful tools available that mainly aim to enhance learning and improve learners’ performance. In this chapter, the development phases of a Learning Analytics prototype and the experiment of integrating it into a MOOC platform called iMooX are presented. This chapter explores how MOOC stakeholders may benefit from Learning Analytics; it also reports an exploratory analysis of some of the offered courses and demonstrates use cases as a typical evaluation of this prototype, in order to discover hidden patterns, inform future decisions, and optimize learning with applicable and convenient interventions.
Article
Full-text available
Learning analytics has secured its position as an important field in the educational sector. However, the large-scale collection, processing, and analysis of data has pushed the field beyond its borders and into an abundance of ethical breaches and constraints. Revealing learners’ personal information and attitudes, as well as their activities, can lead to individuals being personally identified. Yet de-identification can keep the process of learning analytics moving forward while reducing the risk of inadvertent disclosure of learners’ identities. In this paper, the authors discuss de-identification methods in the context of the learning environment and propose a first prototype of a conceptual approach that combines anonymization strategies with learning analytics techniques.
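One common de-identification building block is pseudonymization: replacing learner identifiers with salted one-way hashes so that records remain linkable for analysis without directly exposing identities. The sketch below is a generic illustration of that idea, not the conceptual approach proposed in the paper; the salt value and record format are invented for the example.

```python
import hashlib

def pseudonymize(user_id: str, salt: str) -> str:
    """Replace a learner identifier with a salted one-way hash token."""
    return hashlib.sha256((salt + user_id).encode("utf-8")).hexdigest()[:16]

# hypothetical activity records keyed by learner e-mail
events = [("alice@example.org", "video_play"),
          ("alice@example.org", "quiz_submit"),
          ("bob@example.org", "forum_post")]
salt = "per-course-secret"  # assumed secret, kept separate from the dataset

# same learner -> same token, so engagement analysis still works,
# but the dataset no longer carries the raw identifier
anonymized = [(pseudonymize(uid, salt), action) for uid, action in events]
```

Keeping the salt secret (and rotating it per analysis run) limits re-identification via dictionary attacks, which is one reason plain unsalted hashing is usually considered insufficient.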
Chapter
Full-text available
Though MOOC platforms offer quite good online learning opportunities, the skills and knowledge gained there are not appropriately recognized. They also fail to maintain learners’ initial motivation to complete a course. Mozilla’s Open Badges, which are digital artifacts with embedded metadata, could help to solve these problems. An Open Badge contains, besides its visual component, data to trustworthily verify its receipt. In addition, badges of different granularity can not only certify successful course completion, but also help steer learners’ learning process through formative feedback during the course. Therefore, a web application was developed that enables iMooX to issue Open Badges for formative feedback as well as summative evaluation. A course about Open Educational Resources served as the prototype evaluation, which confirmed the application's aptitude for use in other courses as well.
Conference Paper
Full-text available
Massive Open Online Courses are remote courses distinguished by the heterogeneity and sheer number of their students. Due to this massiveness, the large datasets generated by MOOC platforms require advanced tools to reveal hidden patterns for enhancing learning and educational environments. This paper offers an interesting study on using one of these tools, clustering, to portray learners’ engagement in MOOCs. The research study analyses a mandatory university MOOC, which was also open to the public, in order to classify students into appropriate profiles based on their engagement. We compared the clustering results across MOOC variables and, finally, evaluated our results against a students’ motivation scheme from the 1980s to examine the contrast between classical classes and MOOC classes. Our research points out that MOOC participants strongly follow the Cryer's scheme of Elton (1996).
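The clustering step described above can be sketched with a minimal k-means over hypothetical per-learner engagement features. The feature choice (videos watched, quiz attempts, forum posts), the data, and the two-profile split are invented for illustration; the paper's actual variables and algorithm configuration may differ.

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Minimal k-means: returns final centroids and the clustered points."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # assign each learner to the nearest centroid (squared distance)
            idx = min(range(k),
                      key=lambda i: sum((a - b) ** 2 for a, b in zip(p, centroids[i])))
            clusters[idx].append(p)
        # move each centroid to the mean of its cluster (keep it if the cluster is empty)
        centroids = [tuple(sum(dim) / len(c) for dim in zip(*c)) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids, clusters

# hypothetical per-learner features: (videos_watched, quiz_attempts, forum_posts)
learners = [(10, 8, 5), (9, 9, 4), (1, 0, 0), (0, 1, 1)]
centroids, clusters = kmeans(learners, k=2)
```

On such well-separated data the two resulting clusters would correspond to "engaged" and "disengaged" profiles; in practice one would standardize the features and choose k via a criterion such as the elbow method.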
Conference Paper
Full-text available
The widespread adoption of Learning Analytics (LA) and Educational Data Mining (EDM) has somewhat stagnated recently, and in some prominent cases has even been reversed, following concerns by governments, stakeholders, and civil rights groups about privacy and ethics in the handling of personal data. In this ongoing discussion, fears and realities are often indistinguishably mixed up, leading to an atmosphere of uncertainty among potential beneficiaries of Learning Analytics, as well as hesitation among institutional managers who aim to innovate their institution's learning support by implementing data and analytics with a view to improving student success. In this paper, we try to get to the heart of the matter by analysing the most common concerns and the propositions made by the LA community to address them. We conclude the paper with an eight-point checklist named DELICATE that can be applied by researchers, policy makers, and institutional managers to facilitate a trusted implementation of Learning Analytics.
Article
Full-text available
Many pupils struggle with the acquisition of German orthography. To address this struggle, a web-based platform for German-speaking countries is currently being developed. This platform aims to motivate pupils aged 8 to 12 to improve their writing and spelling competences. On the platform, pupils can write texts in the form of blog entries about everyday events or special topics. Since the core of the platform is an intelligent dictionary focussing on different categories of misspellings, students can improve their own spelling skills by trying to correct their mistakes according to the feedback of the system. Teachers are informed about specific orthographic problems of a particular student through a qualitative analysis of the misspellings from this intelligent dictionary. The article focuses on the development of the intelligent dictionary, with details concerning the requirements, the categorization, and the wordlist used. Furthermore, necessary background on German orthography, spelling competence in general, and the platform itself is given. By implementing methods of learning analytics, it is expected to gain deeper insight into the process of spelling acquisition, serving as a basis for developing better materials in the long run.
Conference Paper
Full-text available
Massive Open Online Courses (MOOCs) have spread tremendously across the Science, Technology, Engineering, and Mathematics (STEM) academic disciplines. These MOOCs have served a wide variety of learner groups across the world. The leading MOOC platform in Austria, iMooX, offers such courses. This paper highlights the authors’ experience of applying Learning Analytics to examine the participation of secondary school pupils in one of its courses, called “Mechanics in everyday life”. We identified various patterns and observations and, contrary to the positive results expected of any educational MOOC, we show that pupils seemingly decided to treat it not as a genuinely motivating learning route, but rather as optional homework.
Article
This essay is written to present a prospective stance on how learning analytics, as a core evaluative approach, must help instructors uncover the important trends and evidence of quality learner data in the online course. A critique is presented of strategic and tactical issues of learning analytics. The approach to the critique is taken through the lens of questioning the current status of applying learning analytics to online courses. The goal of the discussion is twofold: (1) to inform online learning practitioners (e.g., instructors and administrators) of the potential of learning analytics in online courses and (2) to broaden discussion in the research community about the advancement of learning analytics in online learning. In recognizing the full potential of formalizing big data in online courses, the community must address this issue also in the context of the potentially "harmful" application of learning analytics.