Developing a Learning Analytics Intervention Design and Tool for Writing Instruction
Antonette Shibani
University of Technology Sydney
antonette.aileenshibani@student.uts.edu.au
ABSTRACT: Academic writing can be supported by the provision of formative feedback in many forms, including instructor feedback, peer feedback and automated feedback. However, for these feedback types to be effective, they should be applied in well-designed pedagogic contexts. In my pilot study, automated feedback from a writing analytics tool has been implemented in pedagogic interventions, which integrate learning analytics technologies into existing practice in an educational context. To improve the learning design and to study the use of human insights in this context, a peer discussion module is planned as an addition. Such peer discussion can augment automated feedback applications by making students aware of the limitations of artificial intelligence powered feedback, and can develop writing literacy as students provide additional contextual feedback for their peers. When tested across different disciplines, the learning analytics intervention design can validate the usefulness of this approach for improving students' academic writing in authentic pedagogic contexts. The design can be implemented using a learning analytics tool developed to facilitate the intervention and to provide analytic capabilities by collecting learner data.
Keywords: academic writing, automated feedback, peer feedback, learning design, learning
analytics
1. INTRODUCTION
Academic writing is challenging to learn for many students, who could be supported by the provision of formative feedback on their writing. Formative feedback helps students gain awareness of where they stand relative to the goals of their current work and how to improve by determining the way forward (Sadler, 1989). Formative feedback on students' drafts can help them improve their writing by applying the feedback in revisions of their text. Because instructor-provided feedback on student drafts is time-consuming, especially in large cohorts, alternative forms of feedback such as peer feedback and automated feedback have been studied as ways to help improve students' writing skills.
Automated tools use computational techniques to provide immediate feedback on students' drafts. They reduce waiting time and human effort, ensure consistency, and encourage students to practise writing and revision by providing feedback multiple times while drafting (Shermis, Raymat, & Barrera, 2003). Writing analytics tools that offer timely automated feedback on students' writing are thus good examples of learning analytics applications in pedagogy. However, these technologies have to be embedded in the curriculum by implementing them in well-designed contexts, both for better uptake by students and for solving existing pedagogical issues using learning analytics. This move from developing learning analytics technologies to integrating them into a larger educational context can be achieved using 'pedagogical learning analytics intervention
design’ which refers to the systematic efforts to incorporate the use of analytics as a productive part
of teaching and learning practices in a given educational context” (Wise, 2014). This adds value to
learning by closing the gap between potential and actual use of technologies. The alignment of
learning analytics to learning design also provides a contextual framework to document the pedagogic
intent of analytics applications and to collect data for its evidence (Lockyer, Heathcote, & Dawson,
2013).
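To make this concrete, the following toy sketch illustrates, in Python, the simplest form such automated writing feedback could take: flagging sentences that contain discourse markers often associated with rhetorical moves. This is an illustration only, with made-up marker lists; AWA itself relies on much richer linguistic parsing (Simsek et al., 2015).

import re

# Illustrative marker lists only; real rhetorical parsing is far richer.
MARKERS = {
    "contrast": ["however", "in contrast", "on the other hand"],
    "emphasis": ["importantly", "notably", "crucially"],
    "summary": ["in conclusion", "to summarise", "overall"],
}

def flag_rhetorical_moves(text):
    """Return (sentence, move) pairs for sentences containing a known marker."""
    flagged = []
    for sentence in re.split(r"(?<=[.!?])\s+", text.strip()):
        lowered = sentence.lower()
        for move, markers in MARKERS.items():
            if any(marker in lowered for marker in markers):
                flagged.append((sentence, move))
    return flagged

print(flag_rhetorical_moves(
    "Many tools exist. However, few are embedded in the curriculum. "
    "In conclusion, design matters."))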
While working with learning analytics tools and dashboards, the human context is often emphasized as central to interpreting and making sense of the analytics (Siemens, 2012), because learning is a complex activity involving social processes. Sense-making and interpretation are hence important in writing analytics tools if students are to understand and act on the automated feedback provided on their writing. One way of providing sense-making support is through peer feedback and discussion, where students interpret, discuss and critique automated feedback on writing with their peers. This approach of combining peer discussion with automated feedback has two core benefits. First, peer discussion overcomes limitations of automated feedback by complementing it with contextual feedback from peers that captures features missed by the tool. This brings in a human context that automated feedback lacks and enhances the social and cognitive processes involved in writing. Students also learn from each other while providing feedback on each other's writing by making judgements about their peers' performance (Allal, Lopez, Lehraus, & Forget, 2005). Second, automated feedback may address a concern in peer feedback regarding students' abilities to provide meaningful feedback, by scaffolding that feedback and provoking discussion around the identified features.
I propose a design that combines known effective practices, namely peer feedback and discussion, with automated feedback, so that each complements the other. The design will be applicable in pedagogic contexts where learning analytics can augment existing learning designs to improve students' writing skills. To represent the design in a theoretical and practical way for implementation in practice, abstractions such as conjecture maps (Sandoval, 2014) and design patterns (Goodyear, 2005) will be developed. Thus, the main aim of my research is to develop a pedagogic learning analytics intervention design that combines automated and peer feedback to improve students' academic writing, together with design patterns to help implement it in classroom settings across different pedagogic contexts. In doing so, I will study the following as part of my research:
- The impact of automated feedback on student writing
- The impact of including a peer discussion component with automated feedback
- The implementation of the learning design across different contexts
2. CURRENT STATUS OF WORK
In the first part of my doctoral study, a learning design aligned with learning analytics was developed by embedding the automated feedback tool Academic Writing Analytics (AWA) in a law subject (Shibani, Knight, Buckingham Shum, & Ryan, 2017). The aim was to help students understand the role of rhetorical structures, and the use of automated feedback on them, in a way that could be applied to their subject essay writing. These rhetorical structures guide the reader through the argument structure of a text and are a key part of academic writing (Hyland, 2005), with their presence having some (small) relationship to essay quality (Simsek et al., 2015). An intervention grounded in pedagogy
was designed for students to learn essay writing and revision skills based on rhetorical moves in the context of their subject curriculum, augmenting existing practice with learning analytics. The intervention consisted of a set of tasks completed by students individually in a classroom during a tutorial session. To carry out the pedagogical intervention, I developed a tool called AWA-Tutor as an extension of AWA, which guides students through different activities to learn rhetorical writing (Shibani, 2018). The tool also collects data while guiding students through the tasks, enabling learning analytics to gather evidence for the learning design. The tasks comprising the design and the data collected are listed in Table 1 in the Appendix.
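As an indication of the kind of learner data involved, the sketch below shows one plausible event-record structure for task-level logging; the field names are hypothetical assumptions for illustration and do not describe AWA-Tutor's actual storage format.

import json
from dataclasses import asdict, dataclass
from datetime import datetime, timezone

@dataclass
class TaskEvent:
    student_id: str   # pseudonymised learner identifier (hypothetical field)
    task: str         # e.g. "matching_exercise", "revision"
    action: str       # e.g. "start", "submit", "request_feedback"
    payload: dict     # task-specific details, e.g. draft length
    timestamp: str    # ISO 8601 time of the event

def log_event(path: str, event: TaskEvent) -> None:
    """Append one event as a JSON line, a simple analytics-friendly format."""
    with open(path, "a") as f:
        f.write(json.dumps(asdict(event)) + "\n")

log_event("events.jsonl", TaskEvent(
    student_id="s042", task="revision", action="request_feedback",
    payload={"draft_chars": 1834},
    timestamp=datetime.now(timezone.utc).isoformat()))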
From this intervention, the impact of the types of feedback provided was studied using the essays revised by students in the different assigned groups, and by the automated feedback group in particular (Shibani et al., 2017). This focus was chosen because the impact of automated feedback on students' writing has not been researched extensively for many tools; the focus has mainly been on the accuracy of such tools. The analytic data collected from the tool also enabled analysis of students' revision processes, which were previously hard to track. Studying this revision data in the context of student essays helps gain insight into the processes involved in student writing (Shibani, Knight, & Buckingham Shum, 2018, submitted for review).
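As one hedged illustration of how successive drafts might be compared when analysing revisions, the following sketch counts word-level edit operations between two drafts using Python's difflib; this is a simplified stand-in for such analysis, not the revision graph method reported in the cited work.

import difflib

def summarise_revision(draft_before: str, draft_after: str) -> dict:
    """Count word-level edit operations between two successive drafts."""
    before, after = draft_before.split(), draft_after.split()
    counts = {"equal": 0, "insert": 0, "delete": 0, "replace": 0}
    for tag, i1, i2, j1, j2 in difflib.SequenceMatcher(
            a=before, b=after).get_opcodes():
        counts[tag] += max(i2 - i1, j2 - j1)
    return counts

print(summarise_revision(
    "The law is complex.",
    "However, the law is notably complex and contested."))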
Thus, the writing analytics tool was embedded within a curriculum as a pedagogic intervention for improving students' writing ability through a series of tasks. This provided an intervention design for students to learn writing skills and a platform to deliver the activity, with integrated learning analytics for collecting data for instructors and researchers and for delivering instant feedback to students. All tasks in this intervention were completed by students individually. The next design introduces a peer discussion component alongside automated feedback to study its effectiveness (Shibani, 2017). This design has been piloted, with some improvements made to both the design and the tool. The design included a new component at the beginning of the activity explaining the use of rhetorical moves and discourse markers to students, delivered as supporting material read aloud during the tutorial. The upgraded tool included descriptions of the rhetorical moves identified by the automated feedback component, with example sentences. Preliminary analysis of these data has shown increased acceptance and understanding of the activity among students, although no significant difference was noted across the conditions. The next study will examine the effects in detail, based on a design that contrasts automated feedback and peer discussion at the student level, with some improvements made to the automated feedback as well. The learning design is improved across iterations, based on observations from the previous implementation and feedback from the instructor and students. When it is stabilized for wide usage, a design package consisting of the design patterns and a tool to help implement the design in pedagogical contexts will be developed for practitioners adopting this writing instruction approach.
3. GOALS
The main goal of my research is the development of a learning design, together with abstractions and a tool that help implement the design, for use by researchers and practitioners in writing instruction. The design will also be tested in a discipline other than the one it was originally developed for, to validate its transferability. The abstractions and design patterns that evolve can be used by practitioners to implement the design in their classrooms. Using this design, I also aim to study research questions that have not been extensively examined in past literature.
3.1 Research Questions
The overall aim of my research is to develop an effective learning analytics intervention design that can be used across disciplines to study and improve students' writing in authentic contexts, making use of both automated and peer feedback. The specific research questions are:
1. What is the impact of automated feedback on student writing?
   - What are students' perceptions of automated feedback?
   - What is the impact of automated feedback on student revisions?
2. What is the impact of automated feedback, when combined with peer discussion, on student writing and revisions?
   - Do students produce higher quality texts when a peer feedback component is added to automated feedback?
   - How do peer discussion dynamics impact the outcome?
   - What kinds of automated and peer feedback do students act on?
   - What is the student self-reported value of peer feedback in combination with automated writing feedback?
3. How does the intervention design transfer across disciplines?
   - What abstractions need to be developed to help a practitioner implement the learning analytics intervention design in their discipline?
   - What is the student self-reported value of the intervention in the other discipline?
3.2 Methodology
The overarching methodology used in this research is design-based research (DBR), which is "a systematic, but flexible methodology aimed to improve educational practices through iterative analysis, design, development, and implementation, based on collaboration among researchers and practitioners in real world settings, and leading to contextually-sensitive design principles and theories" (Wang & Hannafin, 2005, pp. 6-7). Changes are made to the design of the intervention and the tool, based on feedback from the earlier implementation and discussion with the instructor, until the design is stabilized.
The earlier pilot studies identified further improvements that could be made to the automated feedback format for better student uptake. New ways of helping students, by providing contextual feedback and improving social sense-making in writing, were also trialled by including peer discussion in the design. The proposed design is hence an extension of the previous studies, making changes to the tool and the design in addition to including a peer discussion component, as shown in Figure 1.
Figure 1: Proposed learning analytics intervention design for rhetorical writing instruction
This design will be used to study the impact of automated feedback and dyadic peer discussion on students' writing by comparing individually revised texts with texts revised after peer discussion, in conditions with and without automated feedback. The peer discussion dynamics, which are likely to affect the outcomes, will be studied using qualitative analysis building on work in discourse-centric learning analytics (Knight & Littleton, 2015). Based on the results for the effect of the peer discussion component, the design may be updated in future to run with or without peer discussion.
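For illustration, one plausible way to analyse such a two-by-two comparison (automated feedback crossed with peer discussion) is a factorial model over essay quality gains, sketched below with synthetic data; the variable names are hypothetical assumptions, and the actual analysis may well differ.

import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Synthetic data standing in for real essay quality gains (illustrative only).
rng = np.random.default_rng(0)
n = 40
df = pd.DataFrame({
    "automated": rng.choice(["yes", "no"], n),
    "peer_discussion": rng.choice(["yes", "no"], n),
    "quality_gain": rng.normal(0.5, 1.0, n),
})

# Two-way factorial model: main effects of each condition plus their interaction.
model = ols("quality_gain ~ C(automated) * C(peer_discussion)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))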
The current design aims to improve students' rhetorical writing by teaching the structure of text in terms of rhetorical structures and discourse markers, with automated feedback on these provided by the AWA tool. However, it could also be applied to writing instruction using tools that provide other types of feedback (e.g., on cohesion). The design is also pedagogically sound to implement even without automated feedback and analytics capabilities.
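As a toy illustration of one such alternative feedback signal, the sketch below computes a naive local cohesion score as the average lexical overlap between adjacent sentences; real cohesion feedback tools use far more sophisticated indices, so this is an assumption-laden simplification.

import re

def local_cohesion(text: str) -> float:
    """Mean lexical (Jaccard) overlap between adjacent sentences."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    word_sets = [set(re.findall(r"[a-z']+", s.lower())) for s in sentences]
    if len(word_sets) < 2:
        return 0.0
    overlaps = [len(a & b) / len(a | b) if (a | b) else 0.0
                for a, b in zip(word_sets, word_sets[1:])]
    return sum(overlaps) / len(overlaps)

print(local_cohesion("Feedback helps writing. Writing improves with feedback."))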
4. CONCLUSION
Students' academic writing can be supported by the use of resources and tools in most pedagogic contexts. My doctoral research hence focusses on developing an effective learning design that combines automated and peer feedback for writing instruction. A learning analytics pedagogic intervention design, and automated tools to help carry out the intervention, should enable new ways to embed learning analytics applications in authentic contexts and, ultimately, improve writing. The validated design could potentially be transferable across contexts with the development of standardized abstractions.
ACKNOWLEDGEMENTS
I would like to thank my supervisors Prof. Simon Buckingham Shum and Dr. Simon Knight for their
constant support and guidance in my research.
REFERENCES
Allal, L., Lopez, L. M., Lehraus, K., & Forget, A. (2005). Whole-class and peer interaction in an activity of writing and revision. In Writing in Context(s) (pp. 69-91). Springer.
Goodyear, P. (2005). Educational design and networked learning: Patterns, pattern languages and
design practice. Australasian Journal of Educational Technology, 21(1).
Hyland, K. (2005). Metadiscourse. Wiley Online Library.
Knight, S., & Littleton, K. (2015). Discourse-centric learning analytics: mapping the terrain. Journal of
Learning Analytics, 2(1), 185-209.
Lockyer, L., Heathcote, E., & Dawson, S. (2013). Informing pedagogical action: Aligning learning
analytics with learning design. American Behavioral Scientist, 57(10), 1439-1459.
Sadler, D. R. (1989). Formative assessment and the design of instructional systems. Instructional Science, 18(2), 119-144.
Sandoval, W. (2014). Conjecture mapping: An approach to systematic educational design research. Journal of the Learning Sciences, 23(1), 18-36.
Shermis, M. D., Raymat, M. V., & Barrera, F. (2003). Assessing Writing through the Curriculum with Automated Essay Scoring. ERIC Document Reproduction Service No. ED 477 929.
Shibani, A. (2017). Combining automated and peer feedback for effective learning design in writing practices. In W. Chen et al. (Eds.), Proceedings of the 25th International Conference on Computers in Education. New Zealand.
Shibani, A. (2018). AWA-Tutor: A Platform to Ground Automated Writing Feedback in Robust Learning
Design. Proceedings of the 8th International Conference on Learning Analytics & Knowledge
(LAK18), Sydney.
Shibani, A., Knight, S., Buckingham Shum, S., & Ryan, P. (2017). Design and Implementation of a Pedagogic Intervention Using Writing Analytics. In W. Chen et al. (Eds.), Proceedings of the 25th International Conference on Computers in Education. New Zealand.
Shibani, A., Knight, S., & Buckingham Shum, S. (2018, submitted for review). Understanding Students' Revisions in Writing: From Word Counts to the Revision Graph.
Siemens, G. (2012). Sensemaking: Beyond analytics as a technical activity. Presentation at the
EDUCAUSE ELI.
Simsek, D., Sándor, Á., Buckingham Shum, S., Ferguson, R., De Liddo, A., & Whitelock, D. (2015). Correlations between automated rhetorical analysis and tutors' grades on student essays. Proceedings of the Fifth International Conference on Learning Analytics and Knowledge (pp. 355-359). New York.
Wang, F., & Hannafin, M. J. (2005). Design-based research and technology-enhanced learning environments. Educational Technology Research and Development, 53(4), 5-23.
Wise, A. F. (2014). Designing pedagogical interventions to support student use of learning analytics. Proceedings of the Fourth International Conference on Learning Analytics and Knowledge (pp. 203-211).
Appendix
Table 1: Design of tasks in iteration 1
Task: Matching exercise
Purpose: Enable students' understanding of the instructor's rubric elements (assessment criteria) for assessing rhetorical essays by matching sample rhetorical moves to the corresponding rubric elements.
Tool design: Interactive drag-and-drop interface (augmented by analytics for immediate feedback).
Data collected: Time taken to complete the task.

Task: Viewing exemplar revisions
Purpose: Enable learning to revise from exemplars and convey the current activity's requirements by giving students a sample revised essay with the instructor's improving changes highlighted.
Tool design: Display of a tracked-changes version of a sample improved essay in PDF format, showing revisions made using rhetorical moves and their rationale.
Data collected: Nil.

Task: Essay assessment
Purpose: Assess the given low-quality essay by understanding the assessment criteria and identifying possible revisions that could improve its quality.
Tool design: Guiding questions to enable assessment and reflection.
Data collected: Student responses.

Task: Revision & self-assessment (main task)
Purpose: Practise revising the assessed essay to improve its quality using feedback (if provided, based on the group assignment). In this first iteration, students were assigned to one of the following feedback groups to study the feedback effects: AWA Feedback Group, Instructor Feedback Group and No Feedback Group.
Tool design: A revision editor for all groups, and an interface through which the AWA feedback group could request automated feedback (augmented by analytics for immediate feedback).
Data collected: Drafts during the revision process; revised essays; student responses to assessment questions; requests for automated feedback.

Task: Task evaluation
Purpose: Receive feedback from the students on the whole activity for evaluation of the task design. Students also get to download an instructor's sample revised essay and their own revised essay for reflection.
Tool design: Evaluation questions to provide feedback.
Data collected: Student responses; click history tracked to record the download of revised essays.