Ilka Wolter’s research while affiliated with Leibniz Institute for Educational Trajectories and other places


Publications (4)


Behavioral trace data in an online learning environment as indicators of learning engagement in university students
  • Article

October 2024 · 57 Reads · Frontiers in Psychology

Julia Mendzheritskaya · [...]
Learning in asynchronous online settings (AOSs) is challenging for university students. However, the construct of learning engagement (LE) represents a possible lever to identify and reduce challenges while learning online, especially in AOSs. Learning analytics provides a fruitful framework to analyze students' learning processes and LE via trace data. The study therefore addresses the questions of whether LE can be modeled with the sub-dimensions of effort, attention, and content interest, and by which trace data, derived from behavior within an AOS, these facets of LE are represented in self-reports. Participants were 764 university students attending an AOS. The results of best-subset regression analysis show that a model combining multiple indicators can account for a proportion of the variance in students' LE (highly significant R² between 0.04 and 0.13). The identified set of indicators is stable over time, supporting its transferability to similar learning contexts. The results of this study can contribute to both research on learning processes in AOSs in higher education and the application of learning analytics in university teaching (e.g., modeling automated feedback).
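The best-subset regression the abstract refers to can be sketched as follows: fit an ordinary least squares model for every subset of candidate trace-data indicators and keep the subset with the highest adjusted R². This is a minimal illustration on synthetic data; the indicator names (`video_time`, `clicks`, etc.) are hypothetical and not those used in the study.

```python
# Sketch of best-subset regression: exhaustively fit OLS on every subset of
# candidate indicators and select by adjusted R^2. Synthetic data; indicator
# names are illustrative only.
from itertools import combinations
import numpy as np

rng = np.random.default_rng(0)
n = 200
indicators = {
    "video_time": rng.normal(size=n),
    "clicks": rng.normal(size=n),
    "quiz_attempts": rng.normal(size=n),
    "noise": rng.normal(size=n),
}
# Simulated self-reported engagement: depends on two indicators plus noise.
y = (0.5 * indicators["video_time"]
     + 0.3 * indicators["quiz_attempts"]
     + rng.normal(size=n))

def adjusted_r2(y, X):
    """Fit OLS with an intercept and return the adjusted R^2."""
    A = np.column_stack([np.ones(len(y)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    r2 = 1 - (resid @ resid) / (((y - y.mean()) ** 2).sum())
    p = X.shape[1]
    return 1 - (1 - r2) * (len(y) - 1) / (len(y) - p - 1)

best = max(
    ((adjusted_r2(y, np.column_stack([indicators[k] for k in subset])), subset)
     for r in range(1, len(indicators) + 1)
     for subset in combinations(indicators, r)),
    key=lambda t: t[0],
)
print(best)  # highest adjusted R^2 and the subset that achieved it
```

Adjusted R² penalizes extra predictors, so the exhaustive search tends to drop indicators that contribute only noise; the study's reported R² of 0.04 to 0.13 corresponds to the same selection idea applied to real behavioral traces.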



Exploring Learners’ Self-reflection and Intended Actions After Consulting Learning Analytics Dashboards in an Authentic Learning Setting

September 2024 · 66 Reads

Learning Analytics Dashboards (LADs) have been developed as feedback tools to help students self-regulate their learning (SRL), using the large amounts of data generated by online learning platforms. Despite extensive research on LAD design, there remains a gap in understanding how learners make sense of information visualised on LADs and how they self-reflect using these tools. We address this gap through an experimental study in which a LAD delivered personalised SRL feedback based on interactions and progress to a treatment group, and minimal feedback based on the average scores of the class to a control group. Following the feedback, students were asked to state in writing how they would change their study behaviour. Using a coding scheme covering learning strategies, metacognitive strategies, and learning materials, three human coders coded 1,251 self-reflection texts submitted by 417 students at three time points. Our results show that learners who received personalised feedback intend to focus on different aspects of their learning than learners who received minimal feedback, and that the content of the dashboard influences how students formulate their self-reflection texts. Based on our findings, we outline areas where support is needed to improve learners' sense-making of feedback on LADs and self-reflection in the long term.