Companion Proceedings 8th International Conference on Learning Analytics & Knowledge (LAK18)
Creative Commons License, Attribution - NonCommercial-NoDerivs 3.0 Unported (CC BY-NC-ND 3.0)
Developing a Learning Analytics Intervention Design and Tool for Writing Instruction
Antonette Shibani
University of Technology Sydney
antonette.aileenshibani@student.uts.edu.au
ABSTRACT: Academic writing can be supported by the provision of formative feedback in many forms, including instructor feedback, peer feedback and automated feedback. However, for these feedback types to be effective, they must be applied in well-designed pedagogic contexts. In my pilot study, automated feedback from a writing analytics tool was implemented in pedagogic interventions that integrate learning analytics technologies into existing practice in an educational context. To improve the learning design and to study the use of human insight in this context, a peer discussion module will be added. Such peer discussion can augment automated feedback applications by making students aware of the limitations of artificial-intelligence-powered feedback, and can develop writing literacy as students provide additional contextual feedback to their peers. Testing the learning analytics intervention design across different disciplines can validate the usefulness of this approach for improving students' academic writing in authentic pedagogic contexts. The design can be implemented using a learning analytics tool developed to facilitate the intervention and provide analytic capabilities by collecting learner data.
Keywords: academic writing, automated feedback, peer feedback, learning design, learning
analytics
1. INTRODUCTION
Academic writing is challenging for many students to learn, and they can be supported by the provision of formative feedback on their writing. Formative feedback helps students become aware of where their current work stands relative to their goals, and of how to make progress by determining the way forward (Sadler, 1989). Formative feedback on students' drafts can help them improve their writing by applying the feedback in revisions of their text. Because instructor-provided feedback on student drafts is time-consuming, especially in large cohorts, alternative forms of feedback such as peer feedback and automated feedback have been studied as ways to help improve students' writing skills.
Automated tools use computational techniques to provide immediate feedback on students' drafts. They reduce waiting time and human effort, ensure consistency, and encourage students to practise writing and revision by providing feedback multiple times while drafts are being written (Shermis, Raymat, & Barrera, 2003). Writing analytics tools that offer timely automated feedback on students' writing are thus good examples of learning analytics applications in pedagogy.
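To make the mechanism concrete, below is a minimal sketch of how a tool might flag rhetorical moves from surface discourse markers. It is purely illustrative: the cue lists and function names are invented for this example, and AWA's actual rhetorical analysis (cf. Simsek et al., 2015) is far richer than keyword matching.

```python
import re

# Hypothetical discourse-marker cues for a few rhetorical move types.
# These lists are invented for illustration; real writing analytics
# tools use much more sophisticated NLP than keyword matching.
MOVE_CUES = {
    "contrast": [r"\bhowever\b", r"\bin contrast\b", r"\bon the other hand\b"],
    "emphasis": [r"\bimportantly\b", r"\bnotably\b", r"\bit is crucial\b"],
    "summary":  [r"\bin conclusion\b", r"\bto summarise\b", r"\boverall\b"],
}

def flag_rhetorical_moves(text: str) -> list[tuple[str, str]]:
    """Return (move_type, sentence) pairs for sentences containing a cue."""
    flagged = []
    for sentence in re.split(r"(?<=[.!?])\s+", text):
        for move, patterns in MOVE_CUES.items():
            if any(re.search(p, sentence, re.IGNORECASE) for p in patterns):
                flagged.append((move, sentence))
    return flagged

draft = ("Many students struggle with feedback. However, timely automated "
         "feedback can change this. In conclusion, tool support matters.")
for move, sent in flag_rhetorical_moves(draft):
    print(f"[{move}] {sent}")
```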
However, these technologies must be embedded in the curriculum through well-designed contexts to improve uptake by students and to address existing pedagogical issues using learning analytics. This move from developing learning analytics technologies to integrating them into a larger educational context can be achieved through 'pedagogical learning analytics intervention design', which refers to the "systematic efforts to incorporate the use of analytics as a productive part of teaching and learning practices in a given educational context" (Wise, 2014). This adds value to learning by closing the gap between the potential and actual use of technologies. Aligning learning analytics with learning design also provides a contextual framework to document the pedagogic intent of analytics applications and to collect data as evidence for it (Lockyer, Heathcote, & Dawson, 2013).
When working with learning analytics tools and dashboards, the human context is often emphasized as central to interpreting and making sense of the analytics (Siemens, 2012). This is because learning is a complex activity involving social processes. Sense-making and interpretation are hence important in writing analytics tools if students are to understand and act on the automated feedback provided on their writing. One way of providing sense-making support is through peer feedback and discussion, where students can interpret, discuss and critique automated feedback on writing with their peers. This approach of combining peer discussion with automated feedback has two core benefits. First, peer discussion overcomes limitations of automated feedback by complementing it with contextual feedback from peers, capturing features missed by the tool. This brings in a human context that automated feedback lacks, and enhances the social and cognitive processes involved in writing. Students also learn from each other while providing feedback on each other's writing, by making judgements about their peers' performance (Allal, Lopez, Lehraus, & Forget, 2005). Second, automated feedback may address a concern in peer feedback regarding students' ability to provide meaningful feedback, by scaffolding this feedback and provoking discussion around the identified features.
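One way to picture this combination is as two feedback streams anchored to the same text spans, with peer commentary attached to the automated annotation it critiques. The sketch below is hypothetical (these class and field names are not AWA-Tutor's actual data model); it only illustrates the pairing described above.

```python
from dataclasses import dataclass, field

@dataclass
class FeedbackItem:
    """One piece of feedback anchored to a span of the student's draft."""
    source: str            # "automated" or "peer" (assumed labels)
    span: tuple[int, int]  # character offsets in the draft
    comment: str

@dataclass
class DiscussionThread:
    """An automated annotation plus the peer commentary it provoked."""
    automated: FeedbackItem
    peer_responses: list[FeedbackItem] = field(default_factory=list)

    def add_peer_comment(self, span: tuple[int, int], comment: str) -> None:
        self.peer_responses.append(FeedbackItem("peer", span, comment))

# Peers discuss, confirm or contest what the tool flagged on a shared span.
thread = DiscussionThread(
    automated=FeedbackItem("automated", (120, 188),
                           "Possible 'contrast' move detected here."))
thread.add_peer_comment((120, 188),
                        "Agreed, but the contrast isn't backed by evidence.")
```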
I propose a design that combines known effective practices like peer feedback and discussion with automated feedback, so that they complement each other. The design will be applicable in pedagogic contexts where learning analytics can augment existing learning designs to improve students' writing skills. To represent the design in a theoretical and practical way for implementation in practice, abstractions such as conjecture maps (Sandoval, 2014) and design patterns (Goodyear, 2005) will be developed. Thus, the main aim of my research is to develop a pedagogic learning analytics intervention design that combines automated and peer feedback to improve students' academic writing, together with design patterns to help implement it in classroom settings across different pedagogic contexts. In doing so, I will study the following as part of my research:
• The impact of automated feedback on student writing
• The impact of including a peer discussion component with automated feedback
• The implementation of the learning design across different contexts
2. CURRENT STATUS OF WORK
In the first part of my doctoral study, a learning design aligned with learning analytics was developed by embedding the automated feedback tool Academic Writing Analytics (AWA) in a law subject (Shibani, Knight, Buckingham Shum, & Ryan, 2017). This aimed to help students understand the role of rhetorical structures, and the use of automated feedback on them, in a way that could be applied to their subject essay writing. Rhetorical structures guide the reader through the argument structure of a text and are a key part of academic writing (Hyland, 2005), with their presence having some (small) relationship to essay quality (Simsek et al., 2015). An intervention grounded in pedagogy was designed for students to learn essay writing and revision skills based on rhetorical moves in the context of their subject curriculum, augmenting existing practice with learning analytics. The intervention consisted of a set of tasks completed individually by students during a classroom tutorial session. To carry out the pedagogical intervention, I developed a tool called AWA-Tutor as an extension of AWA, which guides students through different activities to learn rhetorical writing (Shibani, 2018). The tool also collects data while guiding students through the tasks, enabling learning analytics to gather evidence for the learning design. The tasks in the design and the data collected are listed in Table 1 in the Appendix.
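The data collection described here might be implemented as simple append-only event logging keyed by student, task and timestamp. The sketch below is an assumption for illustration only (AWA-Tutor's actual logging format is not described in this paper); it merely shows the kinds of records listed in Table 1.

```python
import json
import time

class ActivityLogger:
    """Append-only log of learner events during the intervention tasks.

    Hypothetical sketch: the event names and JSONL format are assumptions,
    not AWA-Tutor's actual implementation.
    """
    def __init__(self, student_id: str, path: str):
        self.student_id, self.path = student_id, path

    def log(self, task: str, event: str, payload: dict | None = None) -> None:
        record = {"student": self.student_id, "task": task,
                  "event": event, "t": time.time(), "payload": payload or {}}
        with open(self.path, "a") as f:
            f.write(json.dumps(record) + "\n")

logger = ActivityLogger("s042", "events.jsonl")
logger.log("matching_exercise", "task_started")
logger.log("revision", "feedback_requested", {"type": "automated"})
logger.log("revision", "draft_saved", {"word_count": 612})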
From this intervention, the impact of the different feedback types was studied using the essays revised by students in the assigned groups, and by the automated feedback group in particular (Shibani et al., 2017). This was studied because the impact of automated feedback tools on students' writing has not been researched extensively; the focus has mainly been on the accuracy of such tools. The analytic data collected from the tool also enabled analysis of students' revision processes, which were previously hard to track. Studying this revision data and these processes in the context of student essays provides insights into the processes involved in student writing (Shibani, Knight, & Buckingham Shum, 2018, submitted for review).
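As a simple illustration of the kind of revision analysis this data enables, successive drafts can be compared word-by-word to summarise what changed between saves. The sketch below uses Python's standard difflib and is deliberately a toy; it is far simpler than the revision-graph analysis cited above and is not the study's actual method.

```python
import difflib

def revision_summary(prev_draft: str, next_draft: str) -> dict:
    """Summarise one revision step as counts of word-level edit operations."""
    prev_words, next_words = prev_draft.split(), next_draft.split()
    counts = {"insert": 0, "delete": 0, "replace": 0, "equal": 0}
    matcher = difflib.SequenceMatcher(a=prev_words, b=next_words)
    for op, i1, i2, j1, j2 in matcher.get_opcodes():
        # Count affected words on whichever side the operation touches.
        counts[op] += max(i2 - i1, j2 - j1)
    return counts

before = "Feedback helps students improve their writing over time."
after = "Timely feedback helps students substantially improve their essays."
print(revision_summary(before, after))
```

Applied to the sequence of drafts saved during the revision task, a summary like this goes beyond raw word counts toward characterising how, not just how much, a student revised.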
Thus, the writing analytics tool was embedded within a curriculum as a pedagogic intervention, using several tasks to improve students' writing ability. This provided an intervention design for students to learn writing skills and a platform to deliver the activity, with integrated learning analytics collecting data for instructors and researchers and delivering instant feedback to students. All the tasks in this intervention were completed by students individually. The next design introduces a peer discussion component alongside automated feedback to study its effectiveness (Shibani, 2017). This design has been piloted, with some improvements made to the design and the tool. The design included a new component at the beginning of the activity explaining the use of rhetorical moves and discourse markers to students, delivered as supporting material read aloud during the tutorial. The upgraded tool included descriptions of the rhetorical moves identified by the automated feedback component, with example sentences. Preliminary analysis of this data has shown increased acceptance and understanding of the activity among students, although no significant difference was noted across the conditions. The next study will examine the effects in detail, based on a design that encompasses automated feedback and peer discussion at the student level to study their differences, with some improvements made to the automated feedback as well. The learning design is improved across iterations, based on observations from the previous implementation and feedback from the instructor and students. When it is stabilized for wide usage, a design package consisting of the design patterns and a tool to help implement the design in pedagogical contexts will be developed for practitioners adopting this writing instruction approach.
3. GOALS
The main goal of my research is to develop a learning design, together with abstractions and a tool to help implement it, for use by researchers and practitioners in writing instruction. The design will also be tested in a discipline other than the one it was originally developed for, to validate its transferability. The abstractions and design patterns that evolve can be used by practitioners to implement the design in their classrooms. Using this design, I also aim to study research questions that have not been extensively studied in past literature.
3.1 Research Questions
The overall aim of my research is to develop an effective learning analytics intervention design that can be used across disciplines to study and improve students' writing in authentic contexts, making use of both automated and peer feedback. The specific research questions are:
1. What is the impact of automated feedback on student writing?
• What are students’ perceptions of automated feedback?
• What is the impact of automated feedback on student revisions?
2. What is the impact of automated feedback, when combined with peer discussion, on student writing and revisions?
• Do students produce higher quality texts when a peer feedback component is added
to automated feedback?
• How do peer discussion dynamics impact the outcome?
• What kinds of automated and peer feedback do students act on?
• What is the student self-reported value of peer feedback in combination with
automated writing feedback?
3. How does the intervention design transfer across disciplines?
• What are the abstractions to be developed to help a practitioner implement the
learning analytics intervention design in their discipline?
• What is the student self-reported value of the intervention in the other discipline?
3.2 Methodology
The overarching methodology used in this research is design-based research (DBR), "a systematic, but flexible methodology aimed to improve educational practices through iterative analysis, design, development, and implementation, based on collaboration among researchers and practitioners in real world settings, and leading to contextually-sensitive design principles and theories" (Wang & Hannafin, 2005, pp. 6–7). Changes are made to the design of the intervention and the tool, based on feedback from the earlier implementation and discussion with the instructor, until the design stabilizes.
The earlier pilot studies identified further improvements that could be made to the automated feedback format for better student uptake. New ways of helping students, by providing contextual feedback and improving social sense-making in writing, were also trialed by including peer discussion in the design. The proposed design will hence extend the previous studies, making changes to the tool and the design in addition to including a peer discussion component, as shown in Figure 1.
Figure 1: Proposed learning analytics intervention design for rhetorical writing instruction
This design will be used to study the impact of automated feedback and dyadic peer discussion on students' writing by comparing texts revised individually with texts revised after peer discussion, under conditions with and without automated feedback. The peer discussion dynamics, which are likely to affect the outcomes, will be studied using qualitative analysis building on work in discourse-centric learning analytics (Knight & Littleton, 2015). Based on the results for the effect of the peer discussion component, the design may be updated in the future to run with or without peer discussion.
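For illustration only, a first-pass comparison of revised-text quality across conditions might look like the sketch below. The scores are invented and the simple two-sample t-test is an assumption for the sketch, not the study's declared analysis; a real analysis would check its assumptions and account for dependencies within peer dyads.

```python
from scipy import stats

# Hypothetical rubric-style quality scores for revised essays,
# invented purely to illustrate a between-condition comparison.
auto_plus_peer = [72, 80, 68, 75, 83, 77, 70, 79]
peer_only      = [65, 74, 70, 69, 76, 66, 71, 68]

# Two-sample t-test as a placeholder analysis; dyad-level effects
# would likely require a multilevel model instead.
t_stat, p_value = stats.ttest_ind(auto_plus_peer, peer_only)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```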
The current design aims to improve students' rhetorical writing by teaching the structure of text in terms of rhetorical structures and discourse markers, with automated feedback on them provided by the AWA tool. However, it could also be applied to writing instruction using tools that provide other types of feedback, e.g., on cohesion. The design is also pedagogically sound to implement even without automated feedback and analytics capabilities.
4. CONCLUSION
Students' academic writing can be supported by the use of resources and tools in most pedagogic contexts. My doctoral research hence focusses on developing an effective learning design that combines automated and peer feedback for writing instruction. A learning analytics pedagogic intervention design, together with automated tools to help carry out the intervention, should enable new ways to embed learning analytics applications in authentic contexts and, ultimately, improve writing. The validated design could potentially be transferable across contexts with the development of standardized abstractions.
ACKNOWLEDGEMENTS
I would like to thank my supervisors Prof. Simon Buckingham Shum and Dr. Simon Knight for their
constant support and guidance in my research.
REFERENCES
Allal, L., Lopez, L. M., Lehraus, K., & Forget, A. (2005). Whole-class and peer interaction in an activity of writing and revision. In Writing in Context(s) (pp. 69–91). Springer.
Goodyear, P. (2005). Educational design and networked learning: Patterns, pattern languages and design practice. Australasian Journal of Educational Technology, 21(1).
Hyland, K. (2005). Metadiscourse. Wiley Online Library.
Knight, S., & Littleton, K. (2015). Discourse-centric learning analytics: Mapping the terrain. Journal of Learning Analytics, 2(1), 185–209.
Lockyer, L., Heathcote, E., & Dawson, S. (2013). Informing pedagogical action: Aligning learning analytics with learning design. American Behavioral Scientist, 57(10), 1439–1459.
Sadler, D. R. (1989). Formative assessment and the design of instructional systems. Instructional Science, 18(2), 119–144.
Sandoval, W. (2014). Conjecture mapping: An approach to systematic educational design research. Journal of the Learning Sciences, 23(1), 18–36.
Shermis, M. D., Raymat, M. V., & Barrera, F. (2003). Assessing writing through the curriculum with automated essay scoring. ERIC Document Reproduction Service No. ED 477 929.
Shibani, A. (2017). Combining automated and peer feedback for effective learning design in writing practices. In W. Chen et al. (Eds.), Proceedings of the 25th International Conference on Computers in Education. New Zealand.
Shibani, A. (2018). AWA-Tutor: A platform to ground automated writing feedback in robust learning design. Proceedings of the 8th International Conference on Learning Analytics & Knowledge (LAK18), Sydney.
Shibani, A., Knight, S., Buckingham Shum, S., & Ryan, P. (2017). Design and implementation of a pedagogic intervention using writing analytics. In W. Chen et al. (Eds.), Proceedings of the 25th International Conference on Computers in Education. New Zealand.
Shibani, A., Knight, S., & Buckingham Shum, S. (2018). Understanding students' revisions in writing: From word counts to the revision graph. Manuscript submitted for review.
Siemens, G. (2012). Sensemaking: Beyond analytics as a technical activity. Presentation at EDUCAUSE ELI.
Simsek, D., Sándor, Á., Buckingham Shum, S., Ferguson, R., De Liddo, A., & Whitelock, D. (2015). Correlations between automated rhetorical analysis and tutors' grades on student essays. Proceedings of the Fifth International Conference on Learning Analytics and Knowledge (pp. 355–359). New York.
Wang, F., & Hannafin, M. J. (2005). Design-based research and technology-enhanced learning environments. Educational Technology Research and Development, 53(4), 5–23.
Wise, A. F. (2014). Designing pedagogical interventions to support student use of learning analytics. Proceedings of the Fourth International Conference on Learning Analytics and Knowledge (pp. 203–211).
Appendix
Table 1: Design of tasks in iteration 1
Task: Matching exercise
Purpose: Enable students' understanding of the instructor's rubric elements (assessment criteria) for assessing rhetorical essays, by matching sample rhetorical moves to the corresponding rubric elements.
Tool design: Interactive drag-and-drop interface (augmented by analytics for immediate feedback).
Data collected: Time taken to complete the task.

Task: Viewing exemplar revisions
Purpose: Enable learning to revise using exemplars, and provide understanding of the current activity requirements, by giving students a sample revised essay with the instructor's improving changes highlighted.
Tool design: Display of a tracked version of a sample improved essay in PDF format, showing revisions made using rhetorical moves and their rationale.
Data collected: Nil.

Task: Essay assessment
Purpose: Assess the given low-quality essay by understanding the assessment criteria and identifying possible revisions that could improve its quality.
Tool design: Guiding questions to enable assessment and reflection.
Data collected: Student responses.

Task: Revision & self-assessment (main task)
Purpose: Practise revising the assessed essay to improve its quality using feedback (if provided, based on group assignment). In this first iteration, students were assigned to one of the following feedback groups to study the feedback effects: AWA Feedback Group, Instructor Feedback Group and No Feedback Group.
Tool design: A revision editor for all groups, and an interface for the AWA feedback group to receive automated feedback on request (augmented by analytics for immediate feedback).
Data collected: Drafts during the revision process; revised essays; student responses to assessment questions; requests for automated feedback.

Task: Task evaluation
Purpose: Receive feedback from students on the whole activity to evaluate the task design. Students also download an instructor's sample revised essay and their own revised essay for reflection.
Tool design: Evaluation questions to provide feedback.
Data collected: Student responses; click history tracking the download of revised essays.
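To illustrate the "augmented by analytics for immediate feedback" entries above, the matching exercise could be checked automatically against an answer key. The sketch below is hypothetical: the rubric element names and the key are invented and do not reflect AWA-Tutor's actual implementation.

```python
# Hypothetical answer key: sample rhetorical move -> rubric element.
ANSWER_KEY = {
    "move_1": "clear_thesis",
    "move_2": "counter_argument",
    "move_3": "supporting_evidence",
}

def check_matches(submitted: dict[str, str]) -> dict[str, bool]:
    """Mark each submitted move-to-rubric match against the key."""
    return {move: submitted.get(move) == rubric
            for move, rubric in ANSWER_KEY.items()}

feedback = check_matches({"move_1": "clear_thesis",
                          "move_2": "supporting_evidence"})
print(feedback)  # {'move_1': True, 'move_2': False, 'move_3': False}
```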