Figure - uploaded by Antonette Shibani

Source publication
Conference Paper
Academic writing can be supported by the provision of formative feedback in many forms including instructor feedback, peer feedback and automated feedback. However, for these feedback types to be effective, they should be applied in well-designed pedagogic contexts. In my pilot study, automated feedback from a writing analytics tool has been implem...

Similar publications

Article
Research that integrates Learning Analytics (LA) with formative feedback has been shown to enhance student individual learning processes and performance. Debates on LA-based feedback highlight the need to further understand what data sources are appropriate for LA, how soon the feedback should be sent to students and how different types of feedback...
Article
Teachers’ work is increasingly augmented with intelligent tools that extend their pedagogical abilities. While these tools may have positive effects, they require use of students’ personal data, and more research into student preferences regarding these tools is needed. In this study, we investigated how learning strategies and study engagement are...

Citations

... (Wise et al., 2014): this article shows the design of LA interventions for students' participation in discussions. 3. The Design of an Intervention Model and Strategy based on the Behavior Data of Learners: A Learning Analytics Perspective (Wu et al., 2015): this paper presents an intervention model involving both the means and the content of intervention. 4. Integrated Representations and Small Data towards Contextualized and Embedded Analytics Tools for Learners (Harrer & Göhnert, 2015): this paper describes an approach to support learners through visualization and contextualization of LA interventions. 5. A Conceptual Framework Linking Learning Design with Learning Analytics (Bakharia et al., 2016): this paper presents an LA conceptual framework that supports enquiry-based evaluation of learning designs. (Shibani, 2018): this paper proposes a Learning Analytics intervention design for rhetorical writing instruction, providing automated feedback from a writing analytics tool. In this research, the sample is the population: all Year Two undergraduate students who enrolled in a computer-based course were selected as participants (N = 50), comprising male (N = 20) and female (N = 30) students. All had experience with e-learning since Year One and were active in e-learning. ...
Article
The emergence of Learning Analytics has brought benefits to the educational field, as it can be used to analyse authentic data from students to identify the problems encountered in e-learning and to provide interventions to assist students. However, much is still unknown about developing Learning Analytics interventions that provide personalised learning materials to students, meeting their needs in order to enhance their learning performance. Thus, this study aims to develop a Learning Analytics intervention in e-learning to enhance students' learning performance. To develop the intervention, the four stages of the Learning Analytics Cycle proposed by Clow (learner, data, metrics and intervention) were carried out, integrated with two well-known models, the Felder-Silverman model and Keller's Attention, Relevance, Confidence and Satisfaction (ARCS) model, to develop various Learning Objects in e-learning. A case study was then carried out to assess this intervention using validated research instruments. A quantitative approach with a one-group pre-test–post-test experimental design was adopted, with a population of 50 undergraduate students enrolled in the Information System Management in Education course. The results indicated that the Learning Analytics intervention is useful, as it helped the majority of students to enhance their motivation, academic achievement, cognitive engagement and cognitive retention in e-learning. From this study, readers can understand how to implement a Learning Analytics intervention, which was shown to have a positive impact on students' learning achievement (Cohen's d = 5.669).
Lastly, this study contributes significant new knowledge to the current understanding of how a Learning Analytics intervention can optimize students' learning experience, and serves to fill a gap in Learning Analytics research, namely the lack of interventions developed to assist students.
... reader). In preliminary work, peers have been asked to provide feedback to each other on their work, making use of the AWA tool to particularly foreground rhetorical structures in their comments (see, for example, Shibani 2017, 2018; Shibani et al. 2017). ...
Book
This book is the first to explore the big question of how assessment can be refreshed and redesigned in an evolving digital landscape. There are many exciting possibilities for assessments that contribute dynamically to learning. However, the interface between assessment and technology is limited. Often, assessment designers do not take advantage of digital opportunities. Equally, digital innovators sometimes draw from models of higher education assessment that are no longer best practice. This gap in thinking presents an opportunity to consider how technology might best contribute to mainstream assessment practice. Internationally recognised experts provide a deep and unique consideration of assessment’s contribution to the technology-mediated higher education sector. The treatment of assessment is contemporary and spans notions of ‘assessment for learning’, measurement and the roles of peer and self within assessment. Likewise the view of educational technology is broad and includes gaming, learning analytics and new media. The intersection of these two worlds provides opportunities, dilemmas and exemplars. This book serves as a reference for best practice and also guides future thinking about new ways of conceptualising, designing and implementing assessment.
Chapter
Learning analytics as currently deployed has tended to consist of large-scale analyses of available learning process data to provide descriptive or predictive insight into behaviours. What is sometimes missing in this analysis is a connection to human-interpretable, actionable, diagnostic information. To gain traction, learning analytics researchers should work within existing good practice particularly in assessment, where high quality assessments are designed to provide both student and educator with diagnostic or formative feedback. Such a model keeps the human in the analytics design and implementation loop, by supporting student, peer, tutor, and instructor sense-making of assessment data, while adding value from computational analyses.