Companion Proceedings 8th International Conference on Learning Analytics & Knowledge (LAK18)
Creative Commons License, Attribution - NonCommercial-NoDerivs 3.0 Unported (CC BY-NC-ND 3.0)
AWA-Tutor: A Platform to Ground Automated Writing Feedback in Robust Learning Design
Antonette Shibani
University of Technology Sydney, Australia
antonette.shibani@gmail.com
Increasingly, the importance of aligning learning analytics with learning design is being recognized as a way to uphold the field's core aim of improving educational practice, while also collecting meaningful data about learners' activities that can be interpreted in context (Lockyer, Heathcote, & Dawson, 2013). In light of this, a writing analytics tool, AWA-Tutor, has been developed that integrates analytics with pedagogy. AWA-Tutor is a web-based tool developed as an extension of the Academic Writing Analytics (AWA) tool, which provides automated feedback on students' writing based on rhetorical structures in the text (Shibani, Knight, Buckingham Shum, & Ryan, 2017).
AWA-Tutor extends AWA by scaffolding an entire writing improvement activity. Students are guided through a series of tasks, such as understanding the instructor's rubric, improving a sample text, reviewing exemplar improvements, self-assessing their work, and reflecting on the quality of the automated feedback. The tool is designed in a modular fashion to support an instructor's learning design: the instructor can select which task components to include and personalize the feedback experience for different students. AWA-Tutor captures detailed activity traces: the time students take to complete each task of the activity, snapshots of drafts at customizable time intervals, students' requests for automated feedback and the feedback received, and feedback survey responses. Thus, the processes of drafting and revising, which were previously hard to study, become reconstructable for subsequent analysis in other tools such as R, for which a suite of analyses has been developed.
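The trace capture described above can be sketched as a simple event log that flattens to CSV for downstream analysis. This is a minimal illustration only; the class and field names are assumptions for demonstration and do not reflect AWA-Tutor's actual implementation.

```python
import csv
import io
import time
from dataclasses import dataclass, field

@dataclass
class ActivityTrace:
    """Hypothetical store for the trace types AWA-Tutor records:
    task timings, periodic draft snapshots, and feedback events."""
    events: list = field(default_factory=list)

    def log(self, student_id, event_type, payload, timestamp=None):
        self.events.append({
            "student_id": student_id,
            "event_type": event_type,  # e.g. "task_start", "snapshot", "feedback_request"
            "payload": payload,
            "timestamp": timestamp if timestamp is not None else time.time(),
        })

    def to_csv(self):
        """Flatten events to CSV for analysis in external tools such as R."""
        buf = io.StringIO()
        writer = csv.DictWriter(
            buf, fieldnames=["student_id", "event_type", "payload", "timestamp"])
        writer.writeheader()
        writer.writerows(self.events)
        return buf.getvalue()

# Example session: a task start, a one-minute draft snapshot, a feedback request.
trace = ActivityTrace()
trace.log("s01", "task_start", "rubric_matching", timestamp=0.0)
trace.log("s01", "snapshot", "Draft text at minute 1", timestamp=60.0)
trace.log("s01", "feedback_request", "rhetorical_moves", timestamp=95.0)
```

Timestamped events of this shape are enough to reconstruct task durations and revision timelines after the fact.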
AWA-Tutor has been evaluated with undergraduate law students in authentic classroom settings, tackling tasks co-designed with the law academic and performing the activity individually or in pairs. Both student and instructor feedback on the tool has been positive over the two semesters the intervention was run, although there is certainly scope for improvement.
Demonstration movie: https://www.youtube.com/watch?v=K212XabCL5w&feature=youtu.be
Keywords: AWA-Tutor, writing analytics, tool, learning design, pedagogy integration
REFERENCES
Lockyer, L., Heathcote, E., & Dawson, S. (2013). Informing pedagogical action: Aligning learning
analytics with learning design. American Behavioral Scientist, 57(10), 1439-1459.
Shibani, A., Knight, S., Buckingham Shum, S., & Ryan, P. (2017). Design and Implementation of a
Pedagogic Intervention Using Writing Analytics. Paper presented at the 25th International
Conference on Computers in Education, New Zealand.
... The task started with an introductory reading or a video in which the instructor explained the concept of rhetorical moves and discourse markers and their application to the students' own writing genre. The students then logged in to an online platform called AWA-Tutor/AWAT (Shibani, 2018), which facilitated the writing activities, as described in Shibani et al. (2017). 1. ...
Article
Full-text available
Written communication is an important skill across academia, the workplace, and civic participation. Effective writing incorporates instantiations of particular text structures (rhetorical moves) that communicate intent to the reader. These rhetorical moves are important across a range of academic styles of writing, including essays and research abstracts, as well as in forms of writing in which one reflects on learning gained through experience. However, learning how to effectively instantiate and use these rhetorical moves is a challenge. Moreover, educators often struggle to provide feedback supporting this learning, particularly at scale. Where effective support is provided, the techniques can be hard to share beyond single implementation sites. We address these challenges through the open-source AcaWriter tool, which provides feedback on rhetorical moves, with a design that allows feedback customization for specific contexts. We introduce three example implementations in which we have customized the tool and evaluated it with regard to user perceptions, and its impact on student writing. We discuss the tool's general theoretical background and provide a detailed technical account. We conclude with four recommendations that emphasize the potential of collaborative approaches in building, sharing and evaluating writing tools in research and practice.
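The idea of detecting rhetorical moves via discourse markers can be illustrated with a toy matcher. The real AcaWriter parser is far richer (it analyzes sentence structure with NLP), so the marker patterns and move names below are purely illustrative assumptions.

```python
import re

# Illustrative marker patterns per rhetorical move; these are assumptions
# for demonstration, not AcaWriter's actual rules.
MOVE_PATTERNS = {
    "emphasis":    re.compile(r"\b(importantly|significantly|crucially)\b", re.I),
    "contrast":    re.compile(r"\b(however|in contrast|on the other hand)\b", re.I),
    "perspective": re.compile(r"\b(we argue|in my view|I believe)\b", re.I),
}

def tag_moves(sentence):
    """Return the set of rhetorical moves whose markers appear in a sentence."""
    return {move for move, pat in MOVE_PATTERNS.items() if pat.search(sentence)}

def feedback(text):
    """Per-sentence move tags: the raw material for formative feedback."""
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]
    return [(s, sorted(tag_moves(s))) for s in sentences]

result = feedback("However, we argue this matters. The sky is blue.")
```

A sentence-level report like this is the kind of signal a feedback tool can surface to students, flagging sentences where no move is detected.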
... In the law subject, the instructor co-designed the intervention as a single task to be delivered in class. In this task, students completed writing tasks such as an exercise matching the assessment criteria for their assignment to sample sentences, viewing exemplar assignments, assessing a draft against the criteria, and using the automated feedback tool to improve the quality of the draft using an online platform (Shibani, 2018), with additional peer discussions facilitated as part of the activity. They were then encouraged to use the tool to support their own assignment submission. ...
Article
Full-text available
Failing to understand the perspectives of educators, and the constraints under which they work, is a hallmark of many educational technology innovations' failure to achieve usage in authentic contexts, and sustained adoption. Learning Analytics (LA) is no exception, and there are increasingly recognised policy and implementation challenges in higher education for educators to integrate LA into their teaching. This paper contributes a detailed analysis of interviews with educators who introduced an automated writing feedback tool in their classrooms (triangulated with student and tutor survey data), over the course of a three-year collaboration with researchers, spanning six semesters' teaching. It explains educators' motivations, implementation strategies, outcomes, and challenges when using LA in authentic practice. The paper foregrounds the views of educators to support cross-fertilization between LA research and practice, and discusses the importance of cultivating educators' and students' agency when introducing novel, student-facing LA tools.
... First, an induction was provided to the task using an introductory reading or video which was created by the instructor to explain the goals of the activity. Then students logged in to an online platform which facilitates the writing tasks online and collects activity data [33]. The first task was a matching exercise in which students were asked to identify sample sentences from an essay that would match elements of the instructor's marking rubric, so that they would better understand the rubric facets in the assessment criteria by learning from exemplars. ...
Conference Paper
Full-text available
A major promise of learning analytics is that through the collection of large amounts of data we can derive insights from authentic learning environments, and impact many learners at scale. However, the context in which the learning occurs is important for educational innovations to impact student learning. In particular, for student-facing learning analytics systems like feedback tools to work effectively, they have to be integrated with pedagogical approaches and the learning design. This paper proposes a conceptual model to strike a balance between the concepts of generalizable scalable support and contextualized specific support by clarifying key elements that help to contextualize student-facing learning analytics tools. We demonstrate an implementation of the model using a writing analytics example, where the features, feedback and learning activities around the automated writing feedback tool are tuned for the pedagogical context and the assessment regime in hand, by co-designing them with the subject experts. The model can be employed for learning analytics to move from generalized support to meaningful contextualized support for enhancing learning.
... To study the features of revision, the revised essays were marked by tutors on a scale of 0-3 (0 = degraded, 1 = no change, 2 = minor improvement, 3 = major improvement), based on which the essays were characterized as improved or degraded. Drafts from students' revisions were captured every minute (unobtrusively) for collecting revision data using the AWA-Tutor tool, which scaffolds the tasks in the intervention; students' usage of automated feedback was also recorded [9]. ...
Chapter
Text revision is regarded as an important process in improving written products. To study the process of revision activity from authentic classroom contexts, this paper introduces a novel visualization method called Revision Graph to aid detailed analysis of the writing process. This opens up the possibility of exploring the stages in students’ revision of drafts, which can lead to further automation of revision analysis for researchers, and formative feedback to students on their writing. The Revision Graph could also be applied to study the direct impact of automated feedback on students’ revisions and written outputs in stages of their revision, thus evaluating its effectiveness in pedagogic contexts.
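A Revision Graph of the kind described can be approximated by diffing consecutive draft snapshots. The minute-by-minute snapshot capture and the graph terminology follow the text above; the edge metrics (word-level insertions, deletions, similarity ratio) are assumptions chosen for this sketch.

```python
import difflib

def revision_edges(snapshots):
    """Build the edges of a simple revision graph: each consecutive pair of
    draft snapshots becomes an edge labelled with a similarity ratio and
    counts of inserted/deleted words."""
    edges = []
    for i in range(len(snapshots) - 1):
        a, b = snapshots[i].split(), snapshots[i + 1].split()
        sm = difflib.SequenceMatcher(a=a, b=b)
        inserted = deleted = 0
        for op, i1, i2, j1, j2 in sm.get_opcodes():
            if op in ("replace", "delete"):
                deleted += i2 - i1    # words removed from the older draft
            if op in ("replace", "insert"):
                inserted += j2 - j1   # words added in the newer draft
        edges.append({"from": i, "to": i + 1,
                      "similarity": round(sm.ratio(), 2),
                      "inserted": inserted, "deleted": deleted})
    return edges

# Three one-minute snapshots of a (toy) draft:
snaps = ["the cat sat", "the cat sat down", "the dog sat down"]
edges = revision_edges(snaps)
```

Aggregating such edges over a session distinguishes bursts of heavy rewriting from stretches of minor polishing, which is the kind of stage analysis the Revision Graph supports.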
... The intervention consisted of a set of tasks which were completed by the students individually in a classroom during a tutorial session. To carry out the pedagogical intervention, I developed a tool called AWA-Tutor as an extension of AWA which guides students through different activities to learn rhetorical writing (Shibani, 2018). The tool also collects data when guiding the students through the tasks to enable learning analytics to collect evidence for the learning design. ...
Conference Paper
Full-text available
Academic writing can be supported by the provision of formative feedback in many forms, including instructor feedback, peer feedback and automated feedback. However, for these feedback types to be effective, they should be applied in well-designed pedagogic contexts. In my pilot study, automated feedback from a writing analytics tool has been implemented in pedagogic interventions, which integrate learning analytics technologies into existing practice in an educational context. To improve the learning design and to study the use of human insights in this context, a peer discussion module is planned to be added. This kind of peer discussion can augment automated feedback applications by making students aware of the limitations of such artificial-intelligence-powered feedback, and develop writing literacy by providing additional contextual feedback for their peers. The learning analytics intervention design, when tested across different disciplines, can validate the usefulness of this approach to improve students' academic writing in authentic pedagogic contexts. The design can be implemented using a learning analytics tool which is developed to facilitate the intervention and provide analytic capabilities by collecting learner data.
Conference Paper
Full-text available
Academic writing is a key skill required for higher education students, which is often challenging to learn. A promising approach to help students develop this skill is the use of automated tools that provide formative feedback on writing. However, such tools are not widely adopted by students unless useful for their discipline-related writing, and embedded in the curriculum. This recognition motivates an increased emphasis in the field on aligning learning analytics applications with learning design, so that analytics-driven feedback is congruent with the pedagogy and assessment regime. This paper describes the design, implementation, and evaluation of a pedagogic intervention that was developed for law students to make use of an automated Academic Writing Analytics tool (AWA) for improving their academic writing. In exemplifying this pedagogically aligned learning analytic intervention, we describe the development of a learning analytics platform to support the pedagogic design, illustrating its potential through example analyses of data derived from the task.
Article
Full-text available
This article considers the developing field of learning analytics and argues that to move from small-scale practice to broad scale applicability, there is a need to establish a contextual framework that helps teachers interpret the information that analytics provides. The article presents learning design as a form of documentation of pedagogical intent that can provide the context for making sense of diverse sets of analytic data. We investigate one example of learning design to explore how broad categories of analytics—which we call checkpoint and process analytics—can inform the interpretation of outcomes from a learning design and facilitate pedagogical action.