Educator Perspectives on Learning Analytics in Classroom Practice
Antonette Shibani,a,b,* Simon Knight,a Simon Buckingham Shumb
a Faculty of Transdisciplinary Innovation, University of Technology Sydney, PO Box 123, Broadway NSW, 2007, Australia
b Connected Intelligence Centre, University of Technology Sydney, PO Box 123, Broadway NSW, 2007, Australia
* Corresponding author. e-mail: antonette.shibani@uts.edu.au
Abstract
Failing to understand the perspectives of educators, and the constraints under which they work, is a hallmark of many
educational technology innovations’ failure to achieve usage in authentic contexts, and sustained adoption. Learning Analytics
(LA) is no exception, and there are increasingly recognised policy and implementation challenges in higher education for
educators to integrate LA into their teaching. This paper contributes a detailed analysis of interviews with educators who
introduced an automated writing feedback tool in their classrooms (triangulated with student and tutor survey data), over the
course of a three-year collaboration with researchers, spanning six semesters’ teaching. It explains educators’ motivations,
implementation strategies, outcomes, and challenges when using LA in authentic practice. The paper foregrounds the views of
educators to support cross-fertilization between LA research and practice, and discusses the importance of cultivating
educators’ and students’ agency when introducing novel, student-facing LA tools.
Keywords: learning analytics; writing analytics; participatory research; design research; implementation; educator
1. Introduction
Large-scale institutional adoption of learning
analytics comes with challenges that call for new
adaptive forms of leadership, collaboration, policy
development and strategic planning (Macfadyen,
Dawson, Pardo, & Gaševic, 2014; Tsai & Gasevic,
2017). For Learning Analytics (LA) to be impactful, a
transformative shift from exploratory studies to
evaluative research of the impact of LA at an
institutional level is necessary (Dawson, Joksimovic,
Poquet, & Siemens, 2019). This shift will facilitate the
move of LA applications from laboratory-based
environments to authentic classroom settings for use
by practitioners. The authenticity is ensured by
designing LA that is tied to authentic assessments and
teaching practices, to solve existing pedagogical
problems (Knight, Shibani, & Buckingham Shum,
2018). Student-facing LA tools are a new form of
educational technology, but we know already that
failing to understand the perspectives of educators,
and the constraints under which they work, is a
hallmark of many educational technology innovations’
failure to achieve usage in authentic contexts, and
sustained adoption (Scanlon et al., 2013).
As LA-powered tools become more mainstream
and widely applied in practice, a particular interest for
higher education is LA in classroom practice that can
be impactful in supporting users in their learning.
Therefore, an increasing number of
studies are using LA in convergence with Learning
Design (LD) (Hernández‐Leo, Martinez‐Maldonado,
Pardo, Muñoz‐Cristóbal, & Rodríguez‐Triana, 2019;
Kitto, Lupton, Davis, & Waters, 2017; Mangaroska &
Giannakos, 2018; Rodríguez‐Triana, Martínez‐
Monés, Asensio‐Pérez, & Dimitriadis, 2015; Shibani,
Knight, & Buckingham Shum, 2019). Similarly, the
concept of ‘classroom orchestration’ as a way of
understanding teachers' real-time management and
design of classroom activities with technology
(Dillenbourg, 2013) has been applied to LA (Prieto,
Rodríguez-Triana, Martínez-Maldonado, Dimitriadis,
& Gašević, 2019). These design approaches
emphasize the role of the educators and the learning
contexts in implementing technology to support
learning.
However, enabling this shift is challenging. The
bulk of empirical LA studies focus on student
outcomes and performance measures, with
relatively little attention to the barriers to adopting
these tools, such as the degree of practitioner involvement (Klein,
Lester, Rangwala, & Johri, 2019). Among the key
groups of stakeholders, including Learners, Educators,
Researchers, and Administrators (Romero & Ventura,
2013), there is limited research on the role of educators
in integrating learning analytics in authentic practice.
Two approaches have been taken in the notable studies
that do explore teacher interaction with LA. First, a set
of studies has investigated teacher design and inquiry
processes, for example in a professional development
design workshop on designing for learning using
learning analytics (Alhadad & Thompson, 2017). By
grounding teacher inquiry in the process of
authentic design, these studies have called for analytics-enabled ‘teaching
as design’ and structured collaboration between
researchers and teachers (Alhadad,
Thompson, Knight, Lewis, & Lodge, 2018; Thompson
et al., 2018). In the second approach, the ‘completing
the loop’ project investigated how learning analytics
from Learning Management Systems (LMS) could be
delivered to support teachers (Corrin et al., 2016). This
was accomplished by interviewing university teachers
to understand what analytics they would find useful,
and implementing a tool to deliver meaningful
analytics.
Other studies have investigated educator
perspectives and sense-making on learning analytics,
for example regarding information from LA focused
on online collaborative learning (Alhadad &
Thompson, 2017; van Leeuwen, van Wermeskerken,
Erkens, & Rummel, 2017). These explore the
processes involved in teachers’ use of LA, i.e., the
implementation. Similarly, in a study by Holstein et al.,
teacher expectations were explored to see how
Intelligent Tutoring Systems (ITS) with real-time
dashboards could be designed for blended classrooms
(Holstein, McLaren, & Aleven, 2017). This study used
design interviews including generative card sorting
exercises, semi-structured interviews, directed
storytelling, and Speed Dating sessions to understand
teacher notions before designing an ITS for them.
More generally, the sub-field of Teaching Analytics
has focused on capturing and analyzing teacher actions
to help teachers improve educational designs prior to
their delivery (Prieto, Sharma, Kidzinski, Rodríguez‐
Triana, & Dillenbourg, 2018; Sergis & Sampson,
2017). By examining the needs of teachers and
the issues that may arise, these studies report on
educators’ motivations and challenges when using LA.
Other studies have examined teacher views to
qualitatively evaluate the usefulness of LA
applications after they are implemented (Echeverria et
al., 2018; Koh, Shibani, Tan, & Hong, 2016). Here, the
perspectives of educators in terms of outcomes are
studied. As discussed earlier, adoption of learning
analytics has also been studied at classroom and
institutional levels (Macfadyen et al., 2014; Tsai &
Gasevic, 2017).
In all of the above studies, specific components
relating to educators’ use of LA are explored –
motivations, challenges, implementation, outcomes,
and adoption strategies. While they give insight into
how educators reflect on specific LA applications,
they generally do not provide detail about long-term
usage, or the ways that educators adopt and adapt
learning analytics to their specific context end to end.
Thus, across this body of work, coverage of teachers’
perspectives on learning analytics and its use in
existing practice remains partial rather than holistic.
To enable greater opportunities for cross-
fertilization between research and practice in learning
analytics, this paper presents a case study on the
perspectives of educators, after their engagement in
co-designing learning analytics in extended classroom
practice. It uncovers the key motivations, outcomes,
implementation strategies, challenges and support
required for educators, and reports on lessons for
adopting LA in their authentic practice from a writing
analytics case, triangulated with student and tutor
survey data. By foregrounding these important, but
underrepresented views, this paper expands our
understanding of learning analytics in authentic
classroom practice. The lessons learned from the case
study provide useful pointers for LA researchers
working towards classroom practice.
2. Research Context
This study is part of a larger research project in
which educators used a writing analytics
tool called ‘AcaWriter’ (previously called ‘AWA’) to
improve the writing skills of students in their subjects
(Knight et al., Forthcoming). This application of
learning analytics was employed to provide
personalized feedback at scale on students’ writing in
a higher education institution. Educators worked in
a co-design process (Prieto-Alvarez, Martinez-
Maldonado, & Anderson, 2018) with LA researchers
to bring AcaWriter to their classrooms in trials that ran
over many semesters. Implementation involved
designing learning analytics pedagogic interventions
that align learning analytics with learning design,
integrating the analytics into the larger educational context (Lockyer,
Heathcote, & Dawson, 2013; Wise, 2014). The
researchers were part of an innovation centre at a large
Australian metropolitan university where instructors
came forward for collaboration if they were interested
in using the available technologies in their classrooms.
The LA-LD alignment was possible because of the
researchers’ ability to tune the LA for relevant
subjects, since the tool was developed in-house and
co-designed with the educators. These interventions
introduced students to the use of AcaWriter for
automated writing feedback using pedagogically
grounded writing tasks in authentic practice (Knight,
Buckingham Shum, Ryan, Sándor, & Wang, 2018;
Shibani, Knight, Buckingham Shum, & Ryan, 2017).
The writing interventions for students to improve
their writing using AcaWriter were implemented in
two disciplinary contexts – Law and Accounting, and
are explained in detail in earlier work (Shibani et al.,
2019; Shibani et al., 2017). In the law subject, the
instructor co-designed the intervention as a single task
to be delivered in class. In this task, students
completed activities such as matching the assessment
criteria for their assignment to sample sentences,
viewing exemplar assignments, assessing a draft text
against the criteria, and using the automated feedback
tool on an online platform to improve the quality of
the draft (Shibani, 2018), with additional peer
discussions facilitated as part of the activity. Students
were then encouraged to use the tool to
support their own assignment submission. In the
accounting context, the instructors co-designed similar
writing tasks to those in law, but extended over
several sessions (5 weeks). These included homework
tasks completed online, in-class discussion the
following week, and application of automated
feedback and self-assessment of students’ own
assignments in later weeks.
When using AcaWriter for their own assignments,
students were provided prompts to reflect on the
feedback and engage with it in a scaffolded way. In
both contexts, in-class activities were facilitated by
either tutors or the instructors themselves depending
on the timetable allocated. The tutors thus acted as
intermediaries, bringing the writing intervention to life
in face-to-face settings with students in some
classrooms.
By adopting Learning Design practices through
integration of tools and supporting existing good
practice of educators, the researchers enabled
augmentation of pedagogic writing practice with
learning analytics (Knight, Shibani, et al., 2018). They
used design patterns (Goodyear, 2005) to represent
and share designs that can bridge research and practice
by connecting educators and researchers in
applications of learning analytics. A Contextualizable
Learning Analytics Design model emerged from this
work which identified key elements of LA and LD that
can be flexibly tuned for the educational context where
LA designers and educators work together (Shibani et
al., 2019). While the details of implementation and
evaluation of outcomes were previously presented in
those studies, educator perspectives were yet to be
examined, and provide the novel contribution of this
paper. We propose that these perspectives, while
pertaining to writing analytics, provide useful lessons
for learning analytics more broadly.
3. Methodology
The current study uses qualitative research
methods to explore the perspectives of educators who
employed learning analytics in their classrooms –
more specifically, the Writing Analytics (WA)
application described in the previous section. There is
a concern that the favoring of quantitative approaches
in learning analytics may lead to a focus on finding
generalizable structural relations rather than
understanding nuanced processes in learning (Wise &
Cui, 2018). The value of qualitative research methods
is that they provide deeper insights through rich,
interpretive descriptions (Erickson, 1985).
Such rich insights are hard to obtain from quantitative
methods that analyze large numbers of participants
providing short responses. Hence, this article
employs a qualitative case study to explore educator
perspectives in detail. It also uses other data sources
(student feedback, tutor feedback, revisions made, and
marks scored) to validate the qualitative findings by
triangulation where appropriate. Based on this prior
literature, the research questions that guide the study
are as follows:
In authentic writing practice,
1) RQ1 – Motivations: What are the educators’
motivations and expectations of using LA?
2) RQ2 – Implementation: What implementation
strategies are helpful for adopting LA in
the classroom?
3) RQ3 – Challenges: What are the challenges
in implementing LA?
4) RQ4 – Outcomes: What are the outcomes
gained by applying LA?
Based on these questions, we distill lessons from
our work to guide the implementation of effective
learning analytics applications in classroom practice.
3.1. Data Collection
Data for this study come from semi-structured
interviews conducted with educators who worked with
the writing analytics application in their classrooms
(the interview guide, approved by the institutional
human research ethics committee, is provided in Appendix A).
These educators are instructors from different
disciplines who were involved in the co-design of
specific feedback modules in AcaWriter, and writing
tasks for students to make use of the automated
feedback on their writing. These interviews were
conducted after in-class implementations of the tool
over a number of semesters, and invited the instructors
to reflect on the whole process. The interviews
highlight issues and considerations in the alignment
between the aspirations for LA and its practical
implementation.
3.2. Participants
Interviewed instructors were lead educators with
responsibility for designing and teaching portions of a
large (250-500 student) course unit. The instructors
were supported by tutors, who facilitated the writing
intervention for students in some classes. Descriptions
of the interview participants are provided below:
I1 (First Instructor) – Law Academic
I2 (Second Instructor) – Business School
Academic
I3 (Third Instructor) – Business School
Academic
I1 was the first educator who co-designed the
intervention in class with researchers over five
semesters, developing a stable learning design for law
students. I1 was interviewed online. I2 and I3 worked
together to implement an adapted version of I1’s
learning design by tuning it for their classes in
Accounting. They were interviewed together in a
face-to-face session with the researcher. Each
interview lasted approximately one hour.
3.3. Data Analysis
The two one-hour interviews were video- and
audio-recorded, and transcribed for analysis. The
interview data were analyzed qualitatively by
identifying key themes that emerged from the data: a
deductive approach was used to extract key issues and
insights related to the specific research questions,
while an inductive approach was used to create themes
within these overarching questions (Eisenhardt, 1989;
Thomas, 2003). Analysis was conducted by the authors, led by
the first author, with key cases and themes discussed
to reach agreement. The aim of this approach is not to
create an exhaustive coding scheme, but to provide an
interpretive typology to understand the interview case
data. The main themes identified from the data
addressing each research question are discussed in the
following sections. Excerpts from the instructor
interviews are provided to exemplify the identified
themes. (Quotes are given verbatim, although they are
not exhaustive of related content. Where words have
been removed for brevity this is indicated […]; in all
such cases we do not believe that the removal has
changed the meaning of the quotation. Detail is added
in some places using square brackets, for example, to
expand the acronyms that academics use in their
everyday discourse.) Data for triangulation come
from tutor responses to a questionnaire (Appendix B)
completed by three tutors (1 from law and 2 from
accounting), analysed deductively in light of interview
analysis. Prior work analysing student data (detailed in
Shibani, 2019) is also used to triangulate the
outcomes: feedback from students on the writing
intervention, their revisions made to a given sample
text as part of the task, and marks scored for
improvements made to the text.
4. RQ1: What are the educators’ motivations and
expectations of using LA?
The first research question aims to understand the
educators’ motivations and expectations of using LA
in classroom writing practice. While the motivations
that drove the instructors to use LA were discussed
in earlier co-design sessions, the interviews helped
tease out their expectations in a more formal and
systematic way. Themes from the interview data fell
into five main categories, detailed below.
4.1. Improving students’ written communication and
self-assessment
The primary motivation for instructors was to target
an existing pedagogical need in their classrooms,
namely, to improve students’ disciplinary written
communication. They wanted to provide support to
students in developing this skill, and to learn to self-
assess their writing better. One instructor was
particularly keen to teach students how to assess their
own work, that is, to build their “evaluative
judgement” (Boud, Lawson, & Thompson, 2015) and
reduce the number of re-mark requests received. Such
aims that are not typically the explicit targets of LA
interventions. By learning how to self-assess their
work better, they hoped that students would improve
their writing:
“part of the issue was throughout the accounting
degree they don’t necessarily get support and so
they land in the subject without having lots of
practice and developing their [writing] skills. So,
we wanted to really push that.” [I3]
“their writing skills are generally pretty high, but
the reason that I wanted to use AcaWriter was
because I found, when I first started teaching this
really large cohort, that there were an
extraordinary number of requests from students
after they had received the mark for the essay for
either a re-mark or an explanation as to why they
had not achieved a better mark […] I just thought
wouldn't it be great if we could find a way to get
them to do some self-assessment and reflect on their
essays a little more thoughtfully so that they could
reach the conclusions themselves as to how they
could have improved.” [I1]
4.2. Providing formative feedback for students
In addition to the opportunity to improve students’
writing skills, the instructors valued the formative,
automated feedback that AcaWriter offered.
Students could receive this automated
feedback directly from the tool anytime, which was
intended to aid their drafting and revision process
before submitting their final writing. They were ready
to test the idea that there were certain aspects of
writing amenable to immediate, automated feedback.
“we wanted to provide students with the formative
feedback before their summative assessment, so
that they had an opportunity to recognise parts that
they can improve and build on […] In engaging
with AcaWriter we’ve helped the students get
feedback directly on the piece of work they’re
working on and they can use the tool directly. […]
the broader motivation was […] to provide
feedback to students on their written
communication that did not require the tutors to
have to mark-up reports and provide that back.”
[I2]
One instructor also felt that this would remove the
need for tutors to develop expertise in giving feedback
on writing (as opposed to the curriculum):
“And, also wasn’t necessarily reliant on the tutors
having to develop expertise around providing
feedback around written communication because
that’s not necessarily their core expertise. So, I was
really open to the idea of using a tool that could
actually do that for us and that the students could
use themselves at any time” [I3]
They particularly thought the automated feedback
would be useful to teach students how to better
structure their writing, while the disciplinary content
knowledge could be provided by the tutors or subject
experts. They also thought that it would be
complementary to the other kinds of feedback students
receive on their writing from other tools, peers or
tutors:
“assessing the merits of their arguments is
something that is very difficult to automate. But
assessing whether or not the essay has certain
features and follows a certain structure to me is
more mechanical. So just as you might use
Grammarly or a grammar checker with Word or
Turnitin to check the originality score of your
essay, I thought wouldn't it be great if we could use
this tool, this writing analysis tool, to automate
some of the feedback.” [I1]
“in terms of written communication you’ve got
content versus structure and I think AcaWriter is
bringing more of the structural improvements,
whereas we have other activities in-class so we talk
about ideas around the topic and that’s more
content.” [I3]
4.3. Saving time
Another major motivation for instructors was to
save time when supporting large student cohorts.
Requests for further feedback and re-marks require
considerable resources. The instructors hoped that the
intervention and the tool would help reduce these
numbers and save time:
“But we can’t afford to do that [giving formative
feedback] when we have 400 students because it
already takes us maybe about 20 hours to mark one
class [~35 students] of these assignments and so we
can’t have the tutors spend that time again giving
formative feedback. So, we had to do it in a way that
is time-efficient.” [I1]
“The sorts of numbers we're dealing with can be
anything between 280 students to 420 […]. And I
would say prior to introducing AcaWriter in the
most meaningful way, which was in the last year or
so, we would have up to I would say 20% wanting
more information, wanting a re-mark, complaining
about their mark, wanting their essay redone. […]
maybe as few as a tenth, but never fewer than one
in ten wanting more information. In a cohort of 280
students, you've got 28, often 35 students wanting
more information.” [I1]
4.4. Instructors’ knowledge of, and motivation to
support, writing
The instructors who trialled writing analytics in
their classrooms were particularly self-motivated to
improve student experience. Their interest in
delivering student-centered learning put them at the
forefront of innovative practices when
opportunities arose. Hence, the instructors’ knowledge
of and motivation to support student writing was
identified as the next important factor.
“it’s really left to us to drive that and it’s left to
individuals to drive it. We weren’t given any
encouragement to go or push to go do this. An
opportunity came up and we thought it was a good
idea, so we went for it.” [I2]
“It’s nice to get some external recognition [a
teaching award] but that’s not really part of the
motivation to do it. The motivation is really student-
centered, trying to figure out how best to develop
students’ written communication and support that
in different ways. And being just really open to
technology as a potential solution.” [I3]
4.5. Being open to the role of technology
The instructors were curious to explore the
potential of writing analytics technology to develop
student writing. They had prior experience in using
technology in the classroom, and were comfortable in
using educational technology to innovate in their
classrooms. This made them more open to trying new
techniques and motivated them to use LA to improve
their teaching practices.
“That’s the first time I’ve used a writing analysis
tool, but I have used a lot of technology in the
classroom for various purposes, sometimes to give
the students an opportunity to engage in or practice
authentic way with the kind of tech values in
practice […] I just wanted to convey here that I’m
pretty techy, like I’m comfortable with tech” [I1]
“I think we’ve got a long history in our subject in
trialling and experimenting various different
innovations in our teaching. And that could be in
terms of activities or it could be in terms of trying
different ways to teach particular concepts […].
we’re definitely very open to trialling different
technologies and to putting them in place and
seeing how they go. […] I was really curious about
the role of writing analytics” [I2]
5. RQ2: What implementation strategies are
helpful for adopting LA in the classroom?
To better understand the strategies that can help
implement LA in classroom practice, instructors’
reflections on their whole process of adopting writing
analytics were examined. The instructors talked about
the key points to note in the process of co-designing
and implementing the intervention in their subjects,
and the support mechanisms that helped them.
5.1. Co-designing LA implementation
The instructors talked about co-designing with
researchers as a key strategy that helped them adopt
writing analytics in their classrooms, and reflected on
its process. The design process was used to iteratively
and incrementally test and evaluate new ideas. They
identified areas where they could improve their
existing pedagogic writing practices using the
analytics offered by AcaWriter. This follows the
strategy of augmenting existing, pedagogically sound
practice with the affordances of LA for better
adoption, rather than revolutionizing those practices
(Knight, Shibani, et al., 2018). The design process of
the intervention was different for law and accounting
disciplines, with more iterations of the current task
design in law when compared to accounting. There
were several failures in the initial law trials, which led
to a stable design iteration that was found to be
effective. These iterative design stages – rarely
reported in research literature – are an important but
understudied phenomenon for understanding the
practice of designing LA for authentic scenarios.
In the first design implementation trial, without the
tool, the students were simply asked to self-assess their
writing alongside submission of the assignment.
However, many students did not participate:
“I knew that some 10% did it, seriously 10%, so it
was pretty hopeless. It was all a bit of a disaster,
but that was just the first semester.” [I1]
In the following semester, the instructor decided to
introduce AcaWriter to a voluntary group of students
with incentives to increase response rates:
"I incentivized that by giving them a promise of a
little script of text [for their CV] […] So that
control group was fantastic. I had about 30
students and they were very critical. They were
really critical of the whole app, the use of it.” [I1]
To improve students’ understanding of the tool, the
instructor then created a tutorial which explained
rhetorical moves by taking into account feedback from
linguistic experts.
“So, then what we did was had a bit of a discussion.
I was a bit disheartened. I presented an initial set
of results to my colleagues […] she said, […] the
reason they don’t understand what the tool is doing
is because they don’t understand the [rhetorical
moves]. She’s a linguist and she’s an expert in all
things linguistic, and I just took it on board” [I1]
However, the instructor explained briefly that this
small addition of a short video tutorial on rhetorical
moves made little difference. The instructor at this
point came up with a document to explain rhetorical
moves, and collaborated with the analytics team to co-
design a writing intervention, detailed in an earlier
work (Shibani et al., 2017). As the instructor noted,
implementing AcaWriter in her teaching practice
was thus a multi-faceted intervention.
“You’ve got the intervention of explaining
[rhetorical moves], the intervention of demanding
the self-assessment in order to get the remark, the
intervention of your benchmarking activity, the
intervention of using AcaWriter itself.” [I1]
The accounting instructors then took their existing
practices, and integrated approaches from law to
implement the intervention based on that evaluation.
This transfer from one context to another is an
effective method for scaling LA adoption to more
students in a context-sensitive way (Shibani et al.,
2019). As in law, the accounting instructors had
previously trialled AcaWriter in the following ways:
• As part of a benchmarking exercise teaching
self-assessment through assessing exemplars,
with AcaWriter feedback marking up
exemplars for one group. However, at that
stage “it wasn’t actually integrated into their
assignment” [I2].
• This exemplar marking exercise was retained,
but without AcaWriter use, in subsequent
semesters.
• Subsequently (for two semesters), they asked
students to use a technology to support self-
assessment of a draft prior to submission, with
AcaWriter as one of the feedback options.
Most recently, they adopted the approach from
Law with writing activities, and the use of
AcaWriter as a feedback tool that students use to
improve their drafts:
“then at the beginning of this year I think we
started to reconnect in terms of some of the
things that have been happening in law […]. It
was really pleasing to see that this model of
having Week 1 activities in class where
students are doing benchmarking. So, they’re
getting exposed to marking criteria, they’re
thinking about the marking rubric, they’re
looking at samples of students’ work. And then
they’re actually using these sorts of tools
amongst others to improve a draft of their
assignment before they finally submit. That
sort of model seemed to fit really nicely around
some of the innovation that’s happened since
then in Law.” [I2]
Throughout the implementations in the two
disciplinary contexts, the instructors’ involvement and
agency in the co-design process thus helped them
improve the design to make it more effective.
5.2. Designing an authentic experience and
explaining the relevance of the technology
The next strategy used by the instructors in
implementing writing analytics was the design of an
authentic experience for students and explaining its
relevance. A constructive student experience was
important for the instructors, who emphasized not
wanting students to engage in activities solely for
research purposes. With this motivation, they co-
designed authentic experiences for students by
aligning formative, analytics-augmented writing tasks
with their existing assessments, with help from LA
researchers. In that way, they intended that students
could apply the skills learnt practically to their subject
assignments:
“I had to make sure that when the students were
using AcaWriter and trying to get feedback to self-
assess that it was a genuinely constructive
experience for them in the writing process […]. I
did not change the essay or the essay question and
I only changed the criteria in the slightest way. The
marking criteria only changed so that the
explanation and description was clearer but the
actual criteria against which they were being
assessed did not change.” [I1]
While designing this authentic experience, the
instructors realized how important it was to explain the
rationale for introducing the technology, so that
students understood its relevance. The instructors
created videos explaining why lawyers and
accountants needed to master these moves in their
communication, and aligned the intervention closely
to their assessment criteria. Without this alignment,
they were concerned students would disengage and
rebel against doing the activities:
“the reason why I had to be so explicit with the
students about that is because these guys are very
complementary and they’re very, very judgmental
of the way that they’re taught. We’re talking about
expert learners who think about the way they’re
being taught and can be very critical and they will
complain if they feel that they’re being used as
guinea pigs in a project that has nothing to do with
their learning. They’ll quickly rebel, and you don’t
want to lose them. So that’s how I got the buy-in
from the students.” [I1]
“I think that’s sometimes the risk with new
technologies coming into the curriculum, is people
just go, great, that’s an excellent technology. […]
The power in using these technologies is having
that really close alignment between your outcomes
and how your tool helps solve those outcomes. And
that thought-process, it takes a while to get that fit.
And if there’s no fit, the students just go, pfff, what’s
the point? And so, it’s getting that alignment that is
the key.” [I3]
“we had to figure out what the specific alignment
was going to be in terms of the rhetorical moves
that AcaWriter could identify and how that married
up to the assessment task, to the report we’re asking
them to produce. As well as what sort of feedback
we want to be providing to students. Even on the
other side as well, it’s preparing a script to explain
this to students, having to record that as well and
figuring out the sequence of those and then having
to fit that.” [I2]
5.3. Being empowered and receiving responsive
support
It should go without saying that academics and
other instructors should be in the driving seat when it
comes to designing any activity with their students:
they are the subject matter experts, they know their
students, and it is their reputations that suffer when
things go wrong and students have a poor experience.
However, all too often, when new technologies are
introduced by institutions, those on the front line with
students feel disempowered, having little or no voice
in the way that the technology is conceived, invented,
purchased and deployed (Buckingham Shum,
Ferguson, & Martinez-Maldonado, 2019). Thus, the
instructors expressed strong appreciation for the level
of agency they had in designing the intervention, and
the technical support available to them:
“I always felt as though the design, the content, the
pace, the way we engage with the students, I felt I was
deferred to incredibly respectfully the whole way as
the lead on this even though in truth we were all
collaborators of equal input. But I felt I was given an
enormous amount of agency because these were my
students and my subject, and I really appreciated it.
And I felt the outcomes were better, all the better for
that.” [I1]
They felt supported by the researchers and valued
the collaborative nature of the research project and co-
design. They believed that this support encouraged
them to build authentic writing support for students
with enthusiasm:
“I felt enormously supported […]. The entire
[analytics team…] were absolutely instrumental in
solving some of the key problems I faced. But I just
want to be clear that I was supported the whole way
in keeping this really authentic for the
students.”[I1]
“it was really good. So, I think it was really
collaborative and I think what was really
encouraging was the enthusiasm to get these things
into place. I found working with the researchers,
they were really responsive, we could have weekly
meetings. We divided up the tasks and so it was a
pretty ambitious project.” [I2]
One instructor thought that it was instrumental to
have a responsive team to remedy glitches as and
when they occurred. These quick responses to
technical issues helped maintain students’ motivation
and the intervention’s credibility:
“One of the strengths of the project team was how
responsive the researchers were, particularly in
troubleshooting. […] We’ve got 400 students who
we’re trying to maximise their motivation to do
their homework and to do their assignments. And
any glitch needs to be remedied straight away,
otherwise we risk losing credibility and the
motivation of our students.” [I2]
5.4. Supporting future wider adoption
As early adopters of the AcaWriter tool with
experience in implementing writing interventions for
students, the instructors suggested possible routes for
wider adoption of the tool by other academics. One instructor
thought that it was important to explain what
AcaWriter does and make it clear that it is available
for use, which would then encourage academics to try
to solve their problem using the tool. It was also
suggested that disciplines be identified where similar
text analytics technology might already be in use in
industry, making use of the tool an even more
authentic practice:
“I think the first step is to first of all, all of those
academics who don’t know what it does or how it
does it need to have that explained. And then it’s
important […] to listen to those academics and find
out what’s the particular problem they would like
to solve and then take it from there. […] I think it’s
more important to say to as many academics as
possible, we’ve got this tool. This is how law used
it. This is what it’s done for law, but there are many
other problems it could solve. Do you want to go
away and think about whether you could use a
writing analysis tool? […] I would also try and find
out how the particular industry that they support,
that that faculty delivers graduates into is already
using writing analysis software to give it some
practice or authentic meaning.” [I1]
Another instructor suggested the creation of
adoption packages for academics so they could easily
use different versions of AcaWriter – with or without
the fully designed intervention. It was also
recommended to explain the requirements with a list
of items the academics needed to prepare, so they
would be aware of their commitment:
“I think you could put together a couple of options
in terms of the packages, what it would mean to
adopt AcaWriter….. Because I think probably the
biggest hurdle for adoption is people not having a
sense of what it is they’re committing to in terms of
getting it in place. So, if you can be really upfront
in terms of, okay, if you want to have it just sitting
there for an assessment task then they key thing
you’re going to have to do is just help us with the
mapping and the providing of feedback. But if you
want to have an online tutorial then you’re going to
need to do all this. And so, then they have a sense
of what it is exactly they need to do in order to
customise it for them.” [I2]
An instructor mentioned that the benefits of being
involved should be explained to academics.
Academics who are responsible for improving
students’ written communication could be approached
first, since the tool offers a solution to a problem they own:
“And also just make sure that you explain the
benefits, both for the academic and the student,
because obviously we’re here to benefit students
but if there’s nothing in it for the academic they’re
like, why bother? Why should I do it? [...] I think
you could tap into academics that do have to, like
as a first port of call, that are responsible for
written communication […] And so, you go to the
ones that have that need.” [I2]
The instructor also added that academics might
be interested in being part of a broader research
project. Being able to see how the trials work across
faculties would help academics be aware of what
others are doing, and encourage them to use the tool
in their own subjects:
“The other thing would be potentially bundling this
all up as part of a broader project or research
project that’s looking at how we support written
communication in different disciplines. And so, for
them to see the broader project and to see the value
of that, I think would be attractive to some
academics so they feel that they’re not just doing
this by themselves but they’re doing it as.” [I2]
6. RQ3: What are the challenges in implementing
LA?
The challenges in implementing the writing
analytics tool and interventions were explained by the
instructors as follows. These are useful to note so that
educators can be better supported to overcome them
in the future.
6.1. Putting in additional effort
The instructors thought that the setup of the
intervention and related work took a lot of time and
effort. This was in addition to their normal teaching
load, and increased the number of things they had to
prepare before, during, and after delivering the
intervention. The effort involved in doing this
additional work was a challenge for academics, who
are often time-deprived even with their normal
teaching responsibilities:
“It took a lot […] I would say the main challenges
were the ones that were at the real cutting edge of
what we were trying to achieve, and that’s where
the excitement was and that was where the magic
was, so that was a labour of love […]. There was a
lot of work involved in the design process and
dealing with the students, dealing with emails,
recording instructions, writing them up, trying to
get buy-in from my tutors, from the tutors who also
teach this subject […] And I would say I spent at
least six hours just on the proposal, but I would say
no less than about 40 hours a semester was spent
on this, which, if you think about it, is huge.” [I1]
For the accounting instructors, a number of
materials had to be prepared to adapt the law
intervention for their context, the set-up cost of
which was high. These included:
• Finding samples of student work, getting
permission to use them as exemplars, and
de-identifying them
• Marking the samples and providing example
feedback for students to learn from
• Creating a sample text that the students could
improve through redrafting
• Conceptually aligning AcaWriter feedback
with the assignment needs
• Preparing a script to explain the relevance of
the technology to students
“I probably underestimated the setup cost […]
if I think about all the things that an academic
has to do to adapt to do” [I2]
6.2. Involving the tutors
Involving all stakeholders who contribute to the
implementation of LA in classrooms and getting them
as committed as the subject designers is another
challenge. The instructors noted that tutor involvement
was a potential factor affecting how the intervention
was delivered to students. Tutors facilitate activities
in some classes, and their involvement thus plays a key
part in students’ engagement:
“I’m just wondering to what extent do we, maybe
need to get the tutor team more involved in the sell.
I don’t know, I always just go back to that level
because that’s the intermediate level that we forget
quite often. But if the tutors can’t sell it, given that
they’re going to be the face to face, more so than
even a lecture. We did obviously let them know but
to the extent of whether that could have perhaps
been stronger, I don’t know.” [I3]
Responses from the tutors themselves who
answered the questionnaire (see Appendix B) were
encouraging. For the question aimed at understanding
the role of AcaWriter and the writing intervention in
developing students’ written communication, two out
of three tutors said that they fully understood its role,
and one said that they somewhat understood it. All three
of them said that they did not require more training to
facilitate AcaWriter feedback discussion in class. In
terms of the time and effort involved in facilitating the
intervention in class, one tutor said that it was very
easy, and two said that it was moderately easy. All three
of them said that they required less than an hour to
understand and implement the writing intervention
with AcaWriter feedback. Finally, all three tutors
said that they were interested in using the intervention
and AcaWriter again in future semesters. These
findings show that the tutors were reasonably well
inducted by the educators to facilitate the writing
intervention in our case. For effective classroom
practice of LA, this challenge of involving all
stakeholders has to be dealt with.
Despite the critical role they play in mediating
student-facing LA tools (or in other LA contexts, using
them to track student progress), tutors are typically
part-time, contracted staff, whose time is paid on an
hourly basis. Thus, in LA interventions, engagement
and support beyond the primary academic/instructor
will be important, but this remains under-studied in the
LA literature, and risks being overlooked in LA pilots.
6.3. Working with an imperfect tool
The next challenge for instructors was working
with the limitations of the tool; for instance, it did not
always provide the right feedback. The
instructors noted some flaws in the automated
feedback from AcaWriter in terms of correctly
identifying rhetorical moves in student writing:
“I don’t think it does it particularly well yet. I still
think it’s flawed, but I think that the fact of it as an
intervention and the fact that it does it partially well
is pretty amazing, and I think it’s really profound
and impactful. So, yes, it’s a really valuable tool
and I think if you took it away, we would lose
something of value in the step towards improving
student writing. But, obviously, it’s not perfect. I
actually think the fact that it’s not perfect, which,
let’s face it, spell check isn’t perfect, Grammarly
isn’t perfect. All they ask you to do is think about it
[…] And I know what Grammarly’s doing, and I
know why I would override what Grammarly
suggests. Now if that’s what the students are doing,
well, more power to them, but at least they
understand what their text is doing and how it’s
behaving.” [I1]
A specific example provided by an instructor was
AcaWriter’s inability to distinguish between use of the
word ‘innovation’ with reference to content (e.g.
“Their major innovation was…”) and rhetoric (e.g.
“We provide an innovative tool”):
“I am still slightly concerned with how AcaWriter
recognises things and so I think one of my feedback
was one of the rhetorical moves when it recognises
you’ve got a new idea in your writing.” [I3]
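To illustrate the limitation the instructor describes, the following is a minimal, hypothetical sketch in Python. It is not AcaWriter’s actual implementation, and the cue list is invented for illustration; it simply shows why a purely lexical matcher would flag both the content use and the rhetorical use of ‘innovation’, so that further contextual analysis would be needed to separate them.

    # Hypothetical illustration only; AcaWriter's rhetorical move detection
    # is more sophisticated than this simple keyword matcher.
    import re

    # Naive cue list for an "emphasising novelty" rhetorical move (invented)
    NOVELTY_CUES = [r"\binnovat\w*\b", r"\bnovel\b", r"\bnew idea\b"]

    def flags_novelty_move(sentence: str) -> bool:
        """Return True if any novelty cue word appears in the sentence."""
        return any(re.search(p, sentence, re.IGNORECASE) for p in NOVELTY_CUES)

    # A content use of 'innovation' (describing the subject matter)...
    print(flags_novelty_move("Their major innovation was the double-entry ledger."))   # True
    # ...and a rhetorical use (signalling the writer's own contribution).
    print(flags_novelty_move("We provide an innovative tool for writing feedback."))    # True
    # Both sentences are flagged, showing why surface keywords alone cannot
    # distinguish content from rhetoric, as the instructor observed.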
Even though the instructors recognized the tool’s
imperfections, they still believed that its feedback
facilitated students’ understanding of their own
writing and taught them to think about it critically.
This ability to critique automated feedback was found
to be of value to students because it led to a deeper
understanding of writing concepts: students learned to
look for these concepts in their texts, and identified
ways to signal elements of writing using those newly
learnt concepts:
“I think the real value for us is actually telling the
students that it’s an imperfect tool. It’s a tool that
can provide them with one source of feedback, but
they need to use that feedback critically. And I think
the temptation to just use it as an algorithmic
assessment takes away the critical distance they
have from the feedback.” [I2]
“And part of what we want to do in the subject is
for them to develop their critical thinking skills of
questioning what’s in front of them and saying
actually I don’t agree with that. And so, I think it’s
really important to keep that level of thinking.” [I3]
6.4. Fostering appropriate use of automated feedback
A related challenge arising from the imperfect tool
was teaching students how to use its feedback
effectively. Not all students used the feedback
appropriately, since their levels of understanding
varied. An instructor
noticed that some students engaged with the feedback
at a surface level, spotting the presence or absence of
moves only, and did not engage at a deeper level with
a critical eye:
“they’re engaging with the highlighting that
AcaWriter can give them in terms of the different
rhetorical moves. But they didn’t really engage
much with the additional feedback that AcaWriter
popped out” [I2]
There were also students who were too critical and
dismissive of the feedback. A balance is therefore
needed in explaining the value of automated feedback,
even when it is imperfect:
“they’re probably still learning how to. It’s a tricky
thing to balance in terms of we want them to
actually be critical of the feedback they get from
AcaWriter, because by being critical of the
feedback we actually force them to justify their
position even more. Yet, we don’t want them to be
dismissive of the feedback they get from AcaWriter
and I think we have to strike that balance […]. So,
I think there’s still a bit of work to be done in terms
of students actually engaging with the written
communication, the value of the product.” [I2]
Another instructor thought that the associated risks
and imperfections of automated feedback should be
explained so that students use it with caution, and
emphasized that students should learn to apply their
own human intelligence. They noted that students
should be taught how to engage properly with the
feedback in order to fully understand the value of the
tool for improving their writing:
“I would say to students in the future that they
should see it as one of the four main tools that they
would use in technology, always thinking about
writing analysis and how if you were going to rely
on it for producing an outcome that was
deterministic rather than probabilistic, that you’ve
got to think about the risk associated with that
because of its imperfections at this time, but that it
should be used. My idea would be that it should
always be used in conjunction with human
intelligence.” [I1]
Note that the other three of the four tools referred to
above are spelling, grammar, and plagiarism checking.
6.5. Disrupting disciplinary teaching and research
One instructor noted another challenge: the
intervention was disrupting the established teaching
patterns of a core discipline-related subject, by
increasing its focus on writing. While professional
writing is a competency that they want students to
develop as part of their subject, they also have core
disciplinary skills to target, which can be displaced
when time is allocated to teaching writing.
“As to whether it made things more streamlined,
whether it improved efficiencies, no, it is the biggest
disruption. Because you’re talking about taking a
cohort of students and basically having to fill in
their learning with almost a different subject. It’s
like I’m having to teach them English and English
language parts of speech.” [I1]
Relatedly, within the academics’ own disciplines,
the work they undertook to implement AcaWriter –
as educational research – was undervalued,
giving educators less academic recognition
than conducting and publishing discipline-specific
research.
“there is a very strong sentiment in the law schools
in Australia […] against writing up teaching
pedagogic, teaching research, educational
research. We are supposed to be writing up our
research as experts in our particular substantive
area of law.” [I1]
Universities may address this issue by incentivising
teaching and learning innovation and scholarship,
thereby supporting learning analytics innovation and
implementation. In the work involving AcaWriter, it
should be noted that both groups of academics were
supported by teaching and learning grants, and
received citations/awards recognizing their work.
7. RQ4: What are the outcomes gained by
applying LA?
The instructors identified a number of outcomes
that emerged while implementing the intervention. We
triangulate these findings using other sources of data
from the overarching research work to strengthen the
arguments. These outcomes affected stakeholders
including their students, themselves, and the research
team, and fall under the following themes (aligned
with instructor motivations).
7.1. Improving students’ self-assessment and written
communication
The main outcome that the educators thought they
achieved was with regard to their students’ writing –
they found improvements in students’ self-assessment
and written communication, explained further below.
7.1.1. Improving self-assessment
Instructors from both subjects mentioned that they
saw improvements in students’ self-assessment of
their writing, and that students were more aware of the
assessment criteria. Preliminary findings from student
interactions with AcaWriter in the interventions have
also shown varied levels of student engagement with
the activities in the online system and in the reflective
activity (where students engage with AcaWriter
feedback using prompts – see Section 2) (Knight et al.,
Forthcoming; Shibani, 2019; Shibani et al., 2017):
“once they truly understood what the tool was
doing […], the only comments we were getting in
the feedback were things like, oh, it’s interesting to
note that AcaWriter didn’t pick up what I would
have thought was quite a good rhetorical shift
because I said, it’s interesting to note, but
AcaWriter didn’t pick it up, but I’m not quite sure
why. And suddenly you realize, bang, they’ve got it.
They’re all over it. And they are now self-assessing.
They’re now reading their essays really critically
looking for something that prior to what we had
created they were never looking for and they didn’t
even really know to look for it at all.” [I1]
“they can reflect on the criteria that they’re being
assessed on or at least be aware of the criteria
they’re being assessed on, particularly in terms of
written communication. So, they’re starting to think
about what we’re looking for when it comes to good
written communication.” [I2]
“we can see, and you’ve shown us some of the data
as well in terms of the level of engagement with the
system. So, at base level, we’re getting the students
to do the online tutorials, they’re engaging with in
class. Most of them are actually engaging with as
part of a reflective writing exercise to improve their
final reports and most are starting to see some of
the value. Although, I think again, they’re probably
still learning how to…” [I2]
Students also seemed to reflect on their writing
more generally, which could lead to improvements in
their written communication in the long run:
“they have a more personal, reflective outcome or
a personal reflection about their own skills. And
about areas more generally where they might look
to improve their written communication. So, they
might identify a particular weakness in their
written communication more generally that they
could then think to address over the longer term….
if we go back to a scenario where you’re just asking
students to hand in an assignment and hope for the
best, absolutely our students are much more
reflective about their written communication.” [I2]
A similar outcome was also observed from tutor
responses. One tutor said that students’ paragraphs
were structured better, and since it was included as a
compulsory part of an assignment (in Accounting),
students reflected on and included rhetorical moves in
their writing. Another tutor said that although it was
hard to say if students improved their writing skills as
a direct result of the intervention, she observed that
students reflected more on their writing when
examining rhetorical moves, even though AcaWriter
did not always identify all rhetorical moves in their
writing. The third tutor said that marginal
improvements were observed, and there appeared to
be more self-assessment in student writing, but
students were still not including many rhetorical
moves.
7.1.2. Improving student writing
One instructor noted an improvement in students’
written communication performance over the
semesters, as indicated by an overall increase in
grades, although the long-term trend needs to be
examined further. The instructor from law noticed an
increased use of discourse markers in writing, which
helped students provide a well-reasoned view, an
opinion or a conclusion using explicit terms (Knight et
al., Forthcoming):
“Overall, since we’ve been working with [the
analytics team] around written communication
over the course of the last four of five semesters,
we have seen marked improvement in students’
written communication […] overall their
individual assignment pass-rate is going up.
Overall their mark is going up, but slowly.
Marginally, but slowly.” [I2]
“they were now using it in a more meaningful
way, they were producing better final versions
of their essays. I noticed a change and it was
profound that suddenly the discourse markers
were everywhere […] And suddenly you can see
they’re using the language that we were using in
the benchmarking activity because they’re smart
and they’re competitive and they know how to,
they’re good adopters if they’re told explicitly
what to do. And suddenly I noticed their essays
were better. And they will be better in court and
they will be better lawyers for it.” [I1]
This outcome agrees with findings from related work, which found improvements in student writing with the use of AcaWriter, as measured by the marks scored by students (Knight et al., Forthcoming; Shibani, 2019).
7.2. Providing formative feedback for students
Another outcome for the instructors was the ability to provide students with feedback directly through the tool, without students having to wait for tutors to respond. One instructor also thought that students liked using technology and that it helped them receive feedback at any time of the day:
“In engaging with AcaWriter we’ve helped the
students get feedback directly on the piece of work
they’re working on and they can use the tool
directly.” [I2]
“I think that students like using tech because they
can do it any time of night, in the middle of the
morning [...] And once they understood enough
about what it does to find the feedback meaningful,
I got a lot of positive feedback from students about
the fact that it was automated and I think that’s an
important part of the story because what we’re
talking about is saying to the students, paste your
essay into this tool and then it’s going to give you
some feedback because your lecturer doesn’t have
enough time to do that for you.” [I1]
The number of student requests for feedback seen in AcaWriter logs indicates that students are making use of this technology to receive formative feedback on their writing, to varied levels (Shibani, 2019).
7.3. Saving time
One instructor thought that a lot of time was saved due to the reduction in re-mark requests from students, attributing this to students’ improved understanding of the marking criteria and their learning to self-assess as a result of the intervention:
“It’s just an enormous commitment and, of course,
this was all driven by me with two potential
outcomes. One, could the students stop asking for
re-marking? Well, I achieved that [...] We got
there. I say it was costing me 40 hours per semester,
but if you look at getting, the re-marking takes 20
minutes per paper. If you’ve got 30 students asking
for a re-mark, that’s a lot of time […] But anyway,
in an accumulative way, it should, over a 10-year
period of teaching, deliver a massive saving on that
time” [I1]
7.4. Increasing instructors’ knowledge of writing and motivation to support it
The instructors felt that, as an outcome of their involvement in the project, they had learned more about the use of rhetorical moves in writing, about writing analytics technology, and about how students write. They also thought that it led them to reflect more deeply on their domain-specific writing strategies while mapping their assessment criteria to rhetorical moves:
“The other thing it was delivering for me is I was
learning more. It was informing my research and
my learning, my understanding of how to be explicit
with students, how to use writing analysis
technology, what it does, how students learn. As an
academic, that’s a very important process. I
realized where the gaps were in their
understanding and my explanation […] I learned
to be more explicit and I think I have become better
at pausing and asking students the right questions
to work out where their learning is up to before I
then assume anything or take the next steps, so it’s
been an intervention for me as much as it has for
the students.” [I1]
“I guess there was a reflection around the domain
specific writing strategies […] Because we’re
having to do that mapping between the rhetorical
moves and the components of the assessment, that’s
actually forcing us to think through what are the
discipline-specific rhetorical moves that need to be
tailored. So, that’s one thing that I definitely think
I reflected on.” [I2]
Furthermore, one instructor mentioned that the
process had helped them facilitate a dialogue with
students to teach them how multifaceted written
communication was:
“It has been helpful in facilitating that dialogue
with students […] Written communication is as
expansive as appropriate referencing, spelling,
grammar, use of tables, graphs. It’s around the
overall presentation, it’s around the underlying
coherence of the arguments, the rhetorical
strategies. It is about the appropriate use of
terminology. There is a lot in that […] it’s actually
difficult sometimes to communicate to students how
multi-faceted written communication is […] And
so, by having these different tools and different
exercises you can actually start to unpack what
good, professional written communication is for
students and they can see that they’re actually
distinct elements.” [I2]
The process also made them more mindful of the considerations involved in delivering to diverse student cohorts with varying needs:
“up one end we’ve got students who are fantastic
in terms of their vocabulary. But they might write
in a way that’s just really superfluous and so you’re
having to communicate to them that they need to be
writing more simply and thinking more carefully
about their structure. And down the other end you
have students that are struggling with spelling and
grammar, so whenever we’re designing these
activities, we need to be also be able to design
mindful that we’re delivering to a really diverse
cohort and we need tools that all of them are going
to get value out of.” [I2]
Tutor responses confirmed our understanding of this key outcome for educators. All three tutors said that it added value to their teaching, as the following comments illustrate:
“It made their assignments easier to read as the
communication was more succinct and better.
While I consider I am quite good at writing,
detailing the rhetorical moves made me
understand writing better.”
“It definitely added value to my teaching,
especially with my marking.”
“I am applying rhetorical moves during
discussions to provide context and enough
justification on a concept.”
7.5. Contributing to writing research with
technological advances
The instructors understood the value of research on writing, because they realized the importance of teaching this core skill to students. They appreciated this outcome of aiding research on writing, even where it did not directly impact their teaching. Given their openness to piloting writing analytics technology in the classroom, they were also willing to contribute to advancing the field of writing research. They had no concerns about the use of the student data collected in the current intervention, but did express concerns about the possible automation of assessment using such tools. They thought that the tools cannot exercise professional judgement as a human can, and may fail to recognize outliers. This recognition of the limitations of LA is encouraging, given that such an understanding is likely to transfer to their students.
“For me, it is a core skill. We need to understand
as educators what they understand about writing,
what they see when they’re writing things, what
they’re thinking about, and what they understand
are the different parts of a written piece of text and
I feel that this has been really valuable, and it has
justified the time and effort taken, including all the
data gathering we’ve done. Now, we’ve de-
identified their essays. You can’t identify the
students. There’s nothing there. As a matter of just
protecting their privacy and from an ethical
standpoint, I have no qualms about collecting all
that data and the analysis we did of it I think has
been highly technical and quite clever […] writing
analysis is used more and more now across all
parts of the law. I have my concerns with the
algorithmic bias, but that’s just a completely
different argument and isn’t something that arose
with this particular activity” [I1]
“one of the big risks and concerns in this area more
generally is that at some point you’re going to
replace, like this could be used for assessment, you
could have automated assessment or algorithmic
assessment of students work using some sort of
writing analysis tool that actually just can assess
the students written communication.” [I2]
“Because the thing with algorithms is that they
revert to a mean or an average. […] if you are not
average, if you’re doing something that’s really
outside of the box, which might be brilliant, you’re
going to be cut down by these algorithms that can’t
necessarily recognise those outliers.” [I3]
As detailed elsewhere, the writing analytics implementations have helped us to advance research on the evaluation of such applications (Shibani, 2019; Shibani, Knight, & Buckingham Shum, 2018). This is made possible by data science techniques that analyse fine-grained writing process data, such as the quantity and quality of revisions made in response to automated feedback.
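To illustrate the kind of analysis this refers to, the sketch below shows one simple way revision activity could be quantified from successive drafts captured each time a student requests automated feedback. It is a minimal sketch under our own assumptions: the revision_activity function, the draft log structure, and the naive sentence splitting are hypothetical and are not the project’s actual pipeline, which the cited work describes using richer representations such as revision graphs (Shibani, Knight, & Buckingham Shum, 2018).

```python
# A minimal, illustrative sketch only: the record structure, field names and
# sentence-splitting heuristic below are assumptions made for illustration,
# not the analysis pipeline used in the reported studies.
import difflib
from typing import Dict, List

def revision_activity(drafts_by_student: Dict[str, List[str]]) -> Dict[str, int]:
    """Count blocks of inserted, deleted or replaced sentences between
    consecutive drafts submitted by each student for automated feedback."""
    activity = {}
    for student, drafts in drafts_by_student.items():
        changed_blocks = 0
        for before, after in zip(drafts, drafts[1:]):
            before_sents = before.split(". ")
            after_sents = after.split(". ")
            matcher = difflib.SequenceMatcher(None, before_sents, after_sents)
            # Every opcode other than 'equal' marks a revised block of sentences.
            changed_blocks += sum(
                1 for tag, *_ in matcher.get_opcodes() if tag != "equal"
            )
        activity[student] = changed_blocks
    return activity

if __name__ == "__main__":
    # Hypothetical log of two successive drafts from one student.
    logs = {
        "student_1": [
            "This essay argues X. The cases support this view.",
            "This essay argues X. It is interesting to note that the cases support this view.",
        ],
    }
    print(revision_activity(logs))  # {'student_1': 1}
```

The sketch relies only on Python’s standard difflib rather than a bespoke diff; a real analysis would also need proper sentence segmentation and would link each revision to the specific feedback message that preceded it.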
To summarise, the analysis of interview data
identified four key themes (visual snapshot in Figure
1) around instructors’ motivation to be involved in the
intervention design, its implementation in their
classrooms, the challenges experienced in delivering
it, and the outcomes gained as a result.
8. Discussion
Reflecting on this three-year, longitudinal intervention with writing analytics, we have documented numerous insights gained from the educator interviews. From these, we wish to highlight two key principles that draw together many of the detailed points reported: encouraging teacher agency and student agency in the way that LA tools are conceived,
deployed and evaluated. We show how these findings
connect to the wider literature on LA adoption factors,
and future trajectories.
Figure 1: Snapshot of themes from the study
8.1. Educator agency: co-design builds trust and
leads to better outcomes
The uptake of new technology by students is often
based on how the institutional stakeholders
(educators) present the benefits of the technology to
them, and integrate it into their curriculum (Bakharia
et al., 2016; Prieto et al., 2019; Shibani et al., 2017).
In this way, educators play a significant role in bringing learning analytics applications into classroom practice and ensuring their effective use by learners.
For an educator not familiar with the field, the only
LA they may have seen (or heard about) is a student
activity dashboard in the university’s learning
management system. Since these products typically
log the simplest of student interactions (e.g. logins,
page views, forum posts, quiz results) many educators
would be justified in concluding that LA offers limited
insights into student learning (Jivet, Scheffel, Specht,
& Drachsler, 2018). Alternatively, LA may be
presented as a new form of staff surveillance, as
educator actions are similarly logged. Surveys of educator (and student) attitudes to LA that do not first make respondents aware of the rich variety of LA risk perpetuating a cycle of uninformed scepticism.
In contrast, we have evidenced in detail how the
respectful engagement of academics in the co-design
of LA that is closely aligned to their teaching built
trust, with productive outcomes. Educators’ readiness
to work with the analytics team to mutually align their
teaching and assessment with the novel capabilities of
the writing analytics tool proved pivotal to its
successful implementation. Educators greatly
appreciated the level of agency we gave them over the
design and implementation of the learning analytics
application in their classrooms.
We have subsequently seen these educators become ambassadors to their peers, who are now approaching us; this is by far the most effective strategy for growing institution-wide adoption and impact. Indeed, this
focus on augmenting (rather than revolutionizing)
existing practice with LA by providing agency to
educators enhances practice and improves adoption
(Ferguson et al., 2014; Knight, Shibani, et al., 2018).
For instance, existing self-assessment practices in the
Law classroom set up by the instructor are now
augmented by the use of annotated exemplars
produced as support materials for the LA tool’s
introduction, and existing drafting and revision
practices are now augmented by the provision of
automated feedback (see Knight, Shibani, et al., 2018
for details).
Engaging in Learning Design practices is another way of giving agency to educators to align practice with LA. In our implementations in the two disciplines, the instructors used AcaWriter to design transferable tasks in the writing intervention that were relevant to their curricula. This can support the adoption of LA not only by making LA pedagogically relevant, but also by enabling a shared common language to represent good pedagogy. Representations like design patterns can be used to document the methods and theory behind implementations in a common structure (Goodyear, 2005). They facilitate the transfer of patterns from one learning context or discipline to another by preserving the theoretical underpinnings and practical considerations of an LA implementation (Shibani et al., 2019). Resources generated as part of the process can be shared via a repository to help other practitioners (http://heta.io/resources/), reducing the cost of developing resources from scratch. Furthermore, such theory-oriented LA, including the use of design-based approaches, helps move LA closer to foundational research on learning (Reimann, 2016).
Trust and ethical issues surrounding the use of data are known concerns for learning analytics practice (Slade & Prinsloo, 2013). In our research, a key
finding is that educators greatly appreciate having
agency over the design and implementation of the
learning analytics application in their classrooms. A
lack of agency leads to a lack of trust in the LA
application, hindering adoption. Within the LA field
more broadly, giving educators a genuine voice in the
design of relevant LA is now recognised as a first-order
priority for LA research and practice (Buckingham
Shum et al., 2019; Hernández‐Leo et al., 2019). For
instance, co-design methods seek to give all
stakeholders a voice through skillful facilitation and
the use of accessible design tools that empower non-
technical participants (Holstein et al., 2017; Prieto-
Alvarez et al., 2018). Aiding educators to understand
the complexities of artificial intelligence technologies
such as automated feedback tools should also translate
into better understanding by their students, since staff
will bring a deeper understanding of why the system
behaves as it does.
Pragmatically, co-design is only feasible when there is a high level of control over the LA software, so that it can be tuned, and when educators are in turn ready to adjust their learning design to integrate the tool tightly into an assignment. While it is expensive to
develop and validate the initial LA-augmented
learning design pattern, it can then be adopted/adapted
by others with less work. This requires universities to
consider capability-related questions such as: (i) what
capability do we have to modify the LA software?
(whether home-grown, or an external open source or
commercial product), (ii) what capability do
educators have to adapt their learning designs? and
(iii) how can we resource innovation pilots to couple
early-adopter educators with LA teams?
Elsewhere, we discuss strategies to build such
institutional capacity and achieve impact
(Buckingham Shum & McKay, 2018; Knight, Gibson,
& Shibani, In press). Part of the induction process that
we now use is to run training workshops (Shibani &
Abel, 2018) and develop online resources (see the AcaWriter orientation website: https://acawriter.uts.edu.au/) designed to
help educators understand what AcaWriter can (and
cannot) do, show them examples of their colleagues’
work, and assist them in thinking through how they
might integrate the tool with their teaching.
8.2. Student agency: building evaluative judgement
and feedback literacy are key to using automated
feedback effectively
The educators reported that one motivation for their
adoption of AcaWriter was to develop students’ self-
assessment ability, as a method to support their
learning. This ability to self-assess one’s work is a lifelong capacity referred to as “evaluative judgement” by Boud et al. (2015), relevant not only to formal study, but also to future learning and employment. Once a learner can assess their own work as well as an expert can, they have demonstrated a sound understanding of the success criteria and the ability to calibrate themselves.
The emergence of LA and AI in education brings a new challenge: understanding how the nature of feedback could change from what humans are capable of providing. In turn, as we come to understand what feedback may mean in these new contexts, educators must teach students the critical use of such technologies. We
observed this as a challenge faced by educators in the
current study, where teaching students how to
effectively use the feedback from the tool was
necessary. A previous study has shown the importance
of students’ ability to view and interpret LA
dashboards, as it impacted students’ motivation
towards the subject, and helped guide them in their
progress and performance in learning activities and
assessments (Corrin & De Barba, 2015). Feedback
literacy (Sutton, 2012) becomes even more important
when students are asked to engage with automated
feedback that is different to human feedback on
higher-order constructs of learning like writing. AI-based ‘reading’ and ‘annotation’ tools such as AcaWriter offer relentlessly consistent annotation of text, at a speed, scale and granularity that humans cannot
match. While feedback from AcaWriter is designed to
address structures in writing that proficient writers are
adept at identifying, the tool’s analysis is based only
on the linguistic properties of the text that it can
identify. Such is the complexity of writing that
feedback from the tool is guaranteed to be imperfect,
due to missing contextual knowledge that a human
marker would have. (We would note of course, that
human feedback is also far from perfect, but in other
ways.)
Educators commented on the fact that while
imperfect, AcaWriter was still useful. Elsewhere we
discuss the important role that imperfect LA can play
in scaffolding lifelong learning competencies, if a
desired student outcome is critical reflection (Kitto,
Buckingham Shum, & Gibson, 2018). Educators also
reported how important it was that students be
reminded of the non-authoritative nature of automated
feedback, and be given the responsibility (through
clear classroom induction, and in the design of the
automated feedback messages) to critique and
disregard the feedback if they deem it to be not useful.
Students were reminded in the introductory briefings,
and in the tool’s user interface which displays a clear
caution at the top of the feedback: “Computers don’t
read writing like humans. So, if you’re sure your
writing’s good, it's fine to disagree with AcaWriter’s
feedback, just like you’d ignore a poor grammar
suggestion.”
Elsewhere, we have documented how students’ literacy in making sense of the feedback ranged from shallow to deep, leading to varied levels of engagement with it (Shibani, 2019). Since
this feedback literacy is crucial for the uptake and
optimal use of automated feedback, it needs to be
studied in more detail in the future to support effective
use of such tools by students in authentic settings. The
approach of augmenting well-designed student tasks
with LA provides a safety net: the students are still
engaging in a meaningful activity even if the
technology is imperfect (Knight, Shibani, et al., 2018).
In sum, just as the emergence of AI in other
contexts provokes debate about what makes us ‘truly
human’ and how we should relate to machines, the
emergence of AI-powered feedback adds new
dimensions to the concept of what ‘good feedback’
looks like: it may no longer be only what humans can
provide. It follows that the meaning of ‘feedback
literacy’ may also need to expand to take into account
the machine now in the feedback loop.
8.3. Limitations of this study
We recognise that a limitation of this qualitative
analysis is that it explored the perspectives of only
three educators who were early adopters of the LA
technology. These instructors, who engaged in the co-design, were self-motivated individuals who were generally enthusiastic about supporting writing beyond what was required of them by the university. Engaging educators with less experience or motivation to innovate in their classrooms is a challenge for the wider implementation programs now being rolled out at the university. As we have reported, however, early adopters have a key role to play in encouraging their colleagues. They are now
approaching us to pilot the software, and we are
designing similar design-based research projects with
them.
9. Conclusion
As learning analytics approaches its first decade as a distinctive community and emerging field,
there is rightly critical reflection on the factors that
enable and impede the adoption of LA tools at scale.
The focus of this paper has been on the educators: if
they are not enthusiastic champions of learning
analytics, technical innovations will have very limited
impact in universities. This work has foregrounded
educators’ experiences of, and reflections on, being
active participants in the co-design of a student-facing
writing analytics tool. Based on interviews with
educators who over three years implemented the
writing analytics application in their classrooms, key
themes emerged around motivation, implementation, challenges, and outcomes. We conclude that the level of agency that educators and students are granted in the conception and deployment of the tool is a key enabler of the effective introduction of such novel tools. These insights have
arisen from our work to bridge the gap between theory
and practice by implementing effective writing
analytics in classroom practice, and the current case
study adds to that knowledge. These findings should
in principle apply more widely to other learning
analytics applications and contexts.
Acknowledgments
We are indebted to the academics who trusted us with
their students, co-designed the writing analytics
deployments described in this paper, and gave so
generously of their time in reflecting on the lessons
learned: Philippa Ryan, Nicole Sutton and Raechel
Wight. Thanks also to the subject tutors and students
involved in the study.
The paper draws from interviews previously reported
in the PhD thesis: Shibani, 2019.
References
Alhadad, S. S., & Thompson, K. (2017). Understanding the mediating role of teacher inquiry when connecting learning analytics with design
for learning. Interaction, Design, & Architecture (s), 33, 54-74.
Alhadad, S. S., Thompson, K., Knight, S., Lewis, M., & Lodge, J. M. (2018). Analytics-enabled teaching as design: Reconceptualisation and
call for research. In Proceedings of the 8th International Conference on Learning Analytics and Knowledge.
Bakharia, A., Corrin, L., de Barba, P., Kennedy, G., Gašević, D., Mulder, R., . . . Lockyer, L. (2016). A conceptual framework linking
learning design with learning analytics. In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge.
Boud, D., Lawson, R., & Thompson, D. G. (2015). The calibration of student judgement through self-assessment: disruptive effects of
assessment patterns. Higher Education Research & Development, 34(1), 45-59.
Buckingham Shum, S., Ferguson, R., & Martinez-Maldonado, R. (2019). (Eds.) Human-Centred Learning Analytics: Design Frameworks,
Tools and Techniques. Journal of Learning Analytics (Special Issue).
Buckingham Shum, S., & McKay, T. (2018). Architecting for learning analytics. Innovating for sustainable impact. EDUCAUSE review,
53(2), 25-37.
Corrin, L., & De Barba, P. (2015). How do students interpret feedback delivered via dashboards? In Proceedings of the fifth international
conference on learning analytics and knowledge.
Corrin, L., Kennedy, G., de Barba, P., Lockyer, L., Gaševic, D., Williams, D., & Bakharia, A. (2016). Completing the loop: Returning
meaningful learning analytic data to teachers. Sydney: Australian Office for Learning and Teaching.
Dawson, S., Joksimovic, S., Poquet, O., & Siemens, G. (2019). Increasing the Impact of Learning Analytics. Paper presented at the Ninth
International Conference of Learning Analytics and Knowledge (LAK19), Tempe, Arizona.
Dillenbourg, P. (2013). Design for classroom orchestration. Computers & Education, 69, 485-492.
doi:https://doi.org/10.1016/j.compedu.2013.04.013
Echeverria, V., Martinez-Maldonado, R., Shum, S. B., Chiluiza, K., Granda, R., & Conati, C. (2018). Exploratory versus Explanatory Visual
Learning Analytics: Driving Teachers’ Attention through Educational Data Storytelling. Journal of Learning Analytics, 5(3), 72-97.
Eisenhardt, K. M. (1989). Building Theories from Case Study Research. The Academy of Management Review, 14(4), 532-550.
doi:10.2307/258557
Erickson, F. (1985). Qualitative methods in research on teaching: Citeseer.
Ferguson, R., Clow, D., Macfadyen, L., Essa, A., Dawson, S., & Alexander, S. (2014). Setting learning analytics in context: Overcoming the
barriers to large-scale adoption. In Proceedings of the Fourth International Conference on Learning Analytics And Knowledge.
Goodyear, P. (2005). Educational design and networked learning: Patterns, pattern languages and design practice. Australasian Journal of
Educational Technology, 21(1).
Hernández‐Leo, D., Martinez‐Maldonado, R., Pardo, A., Muñoz‐Cristóbal, J. A., & Rodríguez‐Triana, M. J. (2019). Analytics for learning
design: A layered framework and tools. British Journal of Educational Technology, 50(1), 139-152.
Holstein, K., McLaren, B. M., & Aleven, V. (2017). Intelligent tutors as teachers' aides: exploring teacher needs for real-time analytics in
blended classrooms. In Proceedings of the seventh international learning analytics & knowledge conference.
Jivet, I., Scheffel, M., Specht, M., & Drachsler, H. (2018). License to evaluate: Preparing learning analytics dashboards for educational
practice. In Proceedings of the 8th International Conference on Learning Analytics and Knowledge.
Kitto, K., Buckingham Shum, S., & Gibson, A. (2018). Embracing imperfection in learning analytics. In Proceedings of the 8th International
Conference on Learning Analytics and Knowledge.
Kitto, K., Lupton, M., Davis, K., & Waters, Z. (2017). Designing for student-facing learning analytics. Australasian Journal of Educational
Technology, 33(5), 152-168.
Klein, C., Lester, J., Rangwala, H., & Johri, A. (2019). Technological barriers and incentives to learning analytics adoption in higher
education: insights from users. Journal of Computing in Higher Education, 1-22.
Knight, S., Buckingham Shum, S., Ryan, P., Sándor, Á., & Wang, X. (2018). Designing academic writing analytics for civil law student self-
assessment. International Journal of Artificial Intelligence in Education, 28(1), 1-28.
Knight, S., Gibson, A., & Shibani, A. (2020, in press). Implementing learning analytics for learning impact: Taking tools to task. The
Internet and Higher Education. doi:https://doi.org/10.1016/j.iheduc.2020.100729
Knight, S., Shibani, A., Abel, S., Gibson, A., Ryan, P., Sutton, N., . . . Buckingham Shum, S. (Forthcoming). AcaWriter: A Learning
Analytics Tool for Formative Feedback on Academic Writing. Journal of Writing Research.
Knight, S., Shibani, A., & Buckingham Shum, S. (2018). Augmenting Formative Writing Assessment with Learning Analytics: A Design
Abstraction Approach. Paper presented at the 13th International Conference of the Learning Sciences, London, United Kingdom.
Koh, E., Shibani, A., Tan, J. P.-L., & Hong, H. (2016). A pedagogical framework for learning analytics in collaborative inquiry tasks: An
example from a teamwork competency awareness program. In Proceedings of the Sixth International Conference on Learning
Analytics & Knowledge.
Lockyer, L., Heathcote, E., & Dawson, S. (2013). Informing pedagogical action: Aligning learning analytics with learning design. American
Behavioral Scientist, 57(10), 1439-1459.
Macfadyen, L. P., Dawson, S., Pardo, A., & Gaševic, D. (2014). Embracing big data in complex educational systems: The learning analytics
imperative and the policy challenge. Research & Practice in Assessment, 9, 17-28.
Mangaroska, K., & Giannakos, M. (2018). Learning analytics for learning design: A systematic literature review of analytics-driven design to
enhance learning. IEEE Transactions on Learning Technologies.
Prieto-Alvarez, C. G., Martinez-Maldonado, R., & Anderson, T. D. (2018). Co-designing learning analytics tools with learners. Learning
Analytics in the Classroom: Translating Learning Analytics for Teachers.
Prieto, L. P., Rodríguez-Triana, M. J., Martínez-Maldonado, R., Dimitriadis, Y., & Gašević, D. (2019). Orchestrating learning analytics
(OrLA): Supporting inter-stakeholder communication about adoption of learning analytics at the classroom level. Australasian
Journal of Educational Technology.
Prieto, L. P., Sharma, K., Kidzinski, Ł., Rodríguez‐Triana, M. J., & Dillenbourg, P. (2018). Multimodal teaching analytics: Automated
extraction of orchestration graphs from wearable sensor data. Journal of computer assisted learning, 34(2), 193-203.
Reimann, P. (2016). Connecting learning analytics with learning research: The role of design-based research. Learning: Research and
Practice, 2(2), 130-142.
Rodríguez‐Triana, M. J., Martínez‐Monés, A., Asensio‐Pérez, J. I., & Dimitriadis, Y. (2015). Scripting and monitoring meet each other:
Aligning learning analytics and learning design to support teachers in orchestrating CSCL situations. British Journal of
Educational Technology, 46(2), 330-343.
Romero, C., & Ventura, S. (2013). Data mining in education. Wiley Interdisciplinary Reviews: Data Mining and Knowledge Discovery, 3(1),
12-27.
Scanlon, E., Sharples, M., Fenton-O’Creevy, M., Fleck, J., Cooban, C., Ferguson, R., . . . Waterhouse, P. (2013). Beyond Prototypes:
Enabling Innovation in Technology-Enhanced Learning. Retrieved from http://oro.open.ac.uk/41119/1
Sergis, S., & Sampson, D. G. (2017). Teaching and learning analytics to support teacher inquiry: A systematic literature review. In Learning
analytics: Fundaments, applications, and trends (pp. 25-63): Springer.
Shibani, A. (2018). AWA-Tutor: A Platform to Ground Automated Writing Feedback in Robust Learning Design. Paper presented at the 8th
International Conference on Learning Analytics & Knowledge (LAK18), Sydney.
Shibani, A. (2019). Augmenting Pedagogic Writing Practice with Contextualizable Learning Analytics. University of Technology Sydney,
Sydney, Australia.
Shibani, A., & Abel, S. (2018). Using AcaWriter to help develop student writing. Retrieved from https://futures.uts.edu.au/events/using-
acawriter-help-develop-student-writing/
Shibani, A., Knight, S., & Buckingham Shum, S. (2018). Understanding Revisions in Student Writing through Revision Graphs. Paper
presented at the 19th International Conference on Artificial Intelligence in Education, London, UK.
Shibani, A., Knight, S., & Buckingham Shum, S. (2019). Contextualizable Learning Analytics Design: A Generic Model, and Writing
Analytics Evaluations. Paper presented at the The 9th International Conference on Learning Analytics and Knowledge (LAK’19),
Tempe, Arizona.
Shibani, A., Knight, S., Buckingham Shum, S., & Ryan, P. (2017). Design and Implementation of a Pedagogic Intervention Using Writing
Analytics. Paper presented at the 25th International Conference on Computers in Education, New Zealand.
Slade, S., & Prinsloo, P. (2013). Learning analytics: Ethical issues and dilemmas. American Behavioral Scientist, 57(10), 1510-1529.
Sutton, P. (2012). Conceptualizing feedback literacy: knowing, being, and acting. Innovations in Education and Teaching International,
49(1), 31-40.
Thomas, D. R. (2003). A general inductive approach for qualitative data analysis. Paper presented at the School of Population Health,
University of Auckland.
Thompson, K., Alhadad, S. S., Buckingham Shum, S., Howard, S., Knight, S., Martinez-Maldonado, R., & Pardo, A. (2018). Connecting
expert knowledge in the design of classroom learning experiences. In From data and analytics to the classroom: Translating
learning analytics for teachers.
Tsai, Y.-S., & Gasevic, D. (2017). Learning analytics in higher education-challenges and policies: a review of eight learning analytics
policies. In Proceedings of the seventh international learning analytics & knowledge conference.
van Leeuwen, A., van Wermeskerken, M., Erkens, G., & Rummel, N. (2017). Measuring teacher sense making strategies of learning
analytics: a case study. Learning: Research and Practice, 3(1), 42-58.
Wise, A. F. (2014). Designing pedagogical interventions to support student use of learning analytics. In Proceedings of the Fourth
International Conference on Learning Analytics And Knowledge.
Wise, A. F., & Cui, Y. (2018). Envisioning a Learning Analytics for the Learning Sciences. Paper presented at the 13th International
Conference of the Learning Sciences, London, United Kingdom.
Appendix A - Interview Guide for Instructor Interviews
The purpose of this interview is to understand the views of the instructors who have used AcaWriter to help their
students improve academic writing. This will help researchers to improve the feedback from AcaWriter and to
better support academics in using it for their subjects. The interview will be fairly unstructured but guided by
some sample questions given below. The interview will be recorded and transcribed for research purposes.
Motivation, experience, and expectations
We want to understand a little about the learning context, and your involvement in the project, so we have a few
questions about that…
What was the writing context you wanted to use AcaWriter for?
What is your prior experience in the use of technological innovations in your classroom?
What was the motivation for you to be involved in the project?
o What did you expect to use AcaWriter for? What did you think it would be helpful for? What
were your expected outcomes?
o What were the new affordances you thought AcaWriter would add to your existing teaching
plan?
Implementation and Usage
To introduce AcaWriter in your subject, you worked with researchers in [centrename] to implement a learning
design, so we have a few questions about the design and implementation…
Can you tell us a bit about how you designed and implemented the use of AcaWriter?
o For the law instructor: Can you tell us a bit about how you designed the tasks in the writing
intervention for your context? How did they evolve in the many iterations over time?
o For the accounting instructors: In your subject, we provided the example intervention in law,
with various steps. Can you tell us a bit about how you adopted and adapted the learning design
from that context to yours?
How did you find the process of working with researchers?
Did you face any problems or constraints during the implementation? How easy was this process of
working with researchers and implementing the innovative approach in your class?
o How much time did you spend preparing for it?
o How much effort did it require to implement the intervention?
Did you feel like you had enough agency/power in how the intervention was designed for your class?
Findings, value added and future usage
The next few questions are focused on the impact of the intervention, and how effective the AcaWriter tool is...
What impact do you think the intervention has had on learning?
o Do you think students learned/ engaged more in their writing?
o Did AcaWriter help your students become more aware of and reflect on their writing?
What value did you think it added to your teaching?
o Did the tool and/or intervention encourage you to reflect on your previous teaching of academic
writing?
o Do you think AcaWriter improved the efficiency of your teaching?
o Did you learn anything new from this intervention?
How do you see the role of student data and analysis in academic writing?
o Is it clear how and why the writing data is being analysed?
o Does the AcaWriter tool help make student learning (e.g. key features of academic writing)
visible? (to them and/or you?)
o Do you have any concerns, e.g. about whether the AcaWriter tool makes it clear what data is being collected?
Can you see any improvements that could be made to the tool and/or intervention?
o What changes would you make in the future?
o Will you use it again in future semesters?
o What could we do to support other academics to adopt the tool?
Appendix B – Tutor Feedback Questionnaire
To trial AcaWriter and study if it helps students learn to reflect more on their writing, we worked closely with
[subject co-ordinator names] to create meaningful writing tasks with AcaWriter in [subject name]. We are
researching the effectiveness of these interventions, and would like to receive your feedback (both positive and
negative). As tutors who facilitate these activities in class, you play a key role in bringing these technologies to
students for effective use. We’d like to hear your thoughts on how these interventions can positively influence
student writing, as well as limitations or challenges you’ve noticed, and any support you feel would help tutors in
the future. Data is collected as part of the Academic Essay Self Evaluation and Revision project ([Ethics number], approved by Ethics Review); email [email ID] with any questions. By completing this form, you consent to the use of this data for research. We'd appreciate it if you could provide detailed responses to the questions below.
[Section 1 for Law:] AcaWriter and Writing Intervention: To help improve students’ written communication in
Law essays, a writing intervention was designed for students with online activities including self-assessment and
revision with exemplar essays in a tutorial session. Students also used the AcaWriter software, which provides automated feedback on rhetorical moves. Below are a few questions about that:
[Section 1 for Accounting:] AcaWriter and Writing Intervention: To help improve students’ written
communication in business reports, a writing intervention was designed for students with homework activities in
Week 1, in-class discussion in Week 2, and a self-evaluation exercise for the assignment, using the AcaWriter software, which provides automated feedback on rhetorical moves. Below are a few questions about that:
1. How familiar are you with the use of Writing analytics software (E.g. Turnitin, Grammarly, AcaWriter) in
general?
1 (Not at all familiar) to 3 (Very familiar)
2. To what level do you understand the role of AcaWriter and the writing intervention in developing students’
written communication?
I fully understand
I somewhat understand
I do not understand at all
3. Would you require more training to facilitate AcaWriter feedback discussion in class?
Yes
No
Any other comments?
_______________________________________
[Section 2] Time and Effort: We would like to know about the time and effort required to facilitate the intervention
in class. So here are some questions about that:
1. How easy was it to implement the writing intervention in your tutorial? (Was it a lot of effort?)
1 (Not at all easy) to 3 (Very easy)
2. How much time did it require for you to understand and implement the writing intervention with AcaWriter
feedback?
Less than an hour
About an hour
More than an hour
Any other comments?
_______________________________________
[Section 3] Outcomes: We would like to know if there were noticeable outcomes in your students that you observed
while in class and/or in marking their assignments. The below questions focus on that:
1. How useful was the writing intervention with AcaWriter for students to improve their writing?
1 (Not at all useful) to 5 (Very useful)
2. Did you notice improvements in students’ writing skills? Why/ why not?
_______________________________________
3. Did you notice changes in students’ engagement with writing (E.g. Did they learn to self-assess/ reflect more on
their writing/ include more rhetorical moves?) Why/ why not?
_______________________________________
4. What value did you think it added to your teaching? Did you learn anything new?
_______________________________________
[Section 4] Future Directions: We have some final questions about possible future interventions to improve student
writing:
1. Would you be interested in using the intervention and AcaWriter again in future semesters?
Yes
No
2. What changes would you make in the future?
_______________________________________
Is there anything else you would like to know more about?
_______________________________________