Companion Proceedings 10th International Conference on Learning Analytics & Knowledge (LAK20)
Creative Commons License, Attribution - NonCommercial-NoDerivs 3.0 Unported (CC BY-NC-ND 3.0)
Engaging Students as Co-Designers of Learning Analytics
Juan Pablo Sarmiento, Fabio Campos, Alyssa Wise
New York University
jp.sarmiento@nyu.edu, fabioc@nyu.edu, alyssa.wise@nyu.edu
ABSTRACT: As Learning Analytics (LA) moves from theory into practice, researchers have called
for increased participation of stakeholders in design processes. The implementation of such
methods, however, still requires attention and specification. In this report, we share strategies
and insights from a co-design process that involved university students in the development of
a LA tool. We describe the participatory design workshops and highlight three strategies for
engaging students in the co-design of learning analytics tools.
Keywords: participatory design, student-facing dashboards, higher education.
1. INTRODUCTION
While the use of Learning Analytics (LA) in higher education institutions is increasing, not many of
these tools are intended for students’ direct use (Bodily and Verbert, 2017). When such student-facing
tools are created, research shows that there can be misalignment between designers’ intentions,
students’ perceptions, and institutional limitations, with learners often distrusting the tools as a result
(de Quincey et al., 2019). This may be because the majority of such tools are developed without the
direct involvement of students (Bodily and Verbert, 2017). In response, researchers have called for
increased participation of stakeholders in the design processes of these tools (Ahn, Campos, Hays &
DiGiacomo, 2019; Buckingham Shum, Ferguson & Martinez-Maldonado, 2019).
At New York University we have recently embarked on such a participatory design effort, developing
a learning analytics tool with and for students. This report describes a series of co-design workshops
held as part of the participatory process, highlighting key strategies for engaging students in co-designing LA tools for their own use.
2. INSTITUTIONAL BACKGROUND
New York University prides itself on being an innovator in higher education. With over 50,000 students
in both undergraduate and graduate programs, it has begun developing multiple LA initiatives to
support its diverse and growing student body. This includes the creation of a LA faculty service as part
of the university-wide instructional technology offerings, and the creation of an organization devoted
to LA research, the NYU-LEARN research network. These two entities have been collaborating on a variety
of projects, and in November 2017 embarked on a joint project for the research and design of a
student-facing learning analytics tool, with the intent of an eventual rollout to the entire student body.
3. PROCESS FRAMING AND PARTICIPATORY DESIGN WORKSHOPS
Our team, composed of members from both the LA service in IT and the research network, was tasked
with leading the process of developing a student-facing LA tool which would address challenges for
student learning at our university. The team formed a steering committee, composed of stakeholders
from NYU-LEARN, Faculty of Arts and Science and student advisors, which was responsible for setting
the major boundaries of the project and participant sampling strategies. The committee decided to
target students with diverse backgrounds, focusing on first-generation participants as “extreme
users,” whose needs are often underserved when designing for an “average” user (Pullin & Newell,
2007).
Our design process utilized elements of Human-Centered Design, bringing stakeholders in not only as
sources of information but as co-designers. The aim was to understand their experiences, needs and
points of view, while taking into account learning challenges previously identified by the university.
Prior to the co-design sessions, we conducted in-depth interviews with 13 students, three faculty and
six advisors to surface the specific needs and challenges faced by students which LA could potentially
address. These were used to develop user profiles ("personas") that outlined salient problems for
students’ learning experiences. Students were then invited to take part in co-design sessions to
develop LA solutions that tackled these user personas’ challenges.
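The persona artifacts described above could be recorded in a simple structure such as the following minimal sketch. The `Persona` fields and the example values are hypothetical illustrations of the kind of information a persona might capture, not data from the study.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a user persona derived from interviews.
# Field names and example values are illustrative only.
@dataclass
class Persona:
    name: str
    year: str
    challenges: list = field(default_factory=list)  # salient problems for learning
    goals: list = field(default_factory=list)       # what the student wants to achieve

example = Persona(
    name="First-generation sophomore",
    year="second year",
    challenges=["balancing work and coursework", "navigating campus resources"],
    goals=["keep track of deadlines across courses"],
)
```

In a co-design session, a structure like this would be fleshed out collaboratively (e.g., through empathy mapping) rather than fixed in advance.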
Participants were recruited through both an open call and individual emails which targeted a list of
106 first-generation students identified by advisors as potentially interested in design. Recruitment
was challenging due to time constraints of students at the end of term. Although more than 20
signalled interest, only 10 were able to participate. Most participants came back for multiple design
workshops and many indicated explicitly that they were happy with the experience. Students who
participated said they did so because of an interest in design and/or big data.
We conducted three 5-hour design workshops, in which a total of 10 freshman, junior, and graduate
students (between 4 and 7 in each session) learned about design and LA and worked alongside
researchers and a professional designer. The workshops consisted of: (1) Ice breakers; (2) Short lecture
on Design Thinking and Human-Centered Design; (3) Collaborative fleshing out of student personas
through empathy mapping; (4) Collaborative mapping of data types and representations relevant to
the personas; (5) Individual ideation of solutions through brain-writing; (6) Selection of promising
solutions; (7) Collaborative ideation and paper prototyping; (8) Iteration and development of new
prototypes (Fig. 1). Facilitators and a designer worked side by side with participants in these activities,
iterating between sketching designs, sharing them, providing feedback and redesigning.
Figure 1: Structure of co-design workshops
To respond to varying levels of data literacy among participants, part 4 of the session was devoted to
mapping, alongside students, the types of data the university could collect that might be relevant to
the challenges participants identified. Similarly, as a starting point for an ideation brainwriting session
(where participants wrote long lists of ideas for solutions), students were presented with a set of cards
(Fig. 2), which had (a) inferences that could potentially be made based on the available data (for
example “which readings have you opened?”), and (b) action phrases which suggested potential uses
for such information (such as “compare”; “help you understand”). Participants randomly took a couple
of the cards from each group and put them together, to stimulate lateral thinking and simultaneously
ground their ideas in the available data. Participants then shared their favorite ideas and iterated on them in teams.
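The card-pairing prompt described above can be sketched in a few lines of code. The two quoted card texts come from the paper; the additional card texts and the `draw_prompt` helper are hypothetical additions for illustration.

```python
import random

# Illustrative sketch of the random card-pairing ideation prompt.
# The first entry in each list is quoted from the paper; the rest are hypothetical.
inference_cards = [
    "which readings have you opened?",
    "how much time you spend on each course site",  # hypothetical
    "which assignments are due soon",               # hypothetical
]
action_cards = [
    "compare",
    "help you understand",
    "remind you about",  # hypothetical
]

def draw_prompt(n_pairs=2, rng=random):
    """Randomly take a couple of cards from each group and put them together."""
    inferences = rng.sample(inference_cards, n_pairs)
    actions = rng.sample(action_cards, n_pairs)
    return list(zip(actions, inferences))

for action, inference in draw_prompt():
    print(f"{action} + {inference}")
```

Random combinations like these are intended to spark lateral thinking while keeping ideas grounded in data the university could actually collect.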
The resulting designs ranged broadly, from a tool to aid students in finding “study buddies” to systems
where users who took a given course could share information with new students. Students’ solutions
were often similar to those described in the literature (like tools for monitoring course progress or
using peer activity as a motivator) and sometimes pushed in directions new to the field (like leveraging
social networks and emotion as central elements of design).
Participants also brought their identities and interests into the discussion. For instance, one student
advocated for platforms which took into account variables such as wellbeing and mental health, after
having themselves been diagnosed with depression. Some ideas were combined in playful ways: for example, a "hive" metaphor, in which students visualized competing deadlines and responsibilities from multiple courses, was merged with a timeline and with a system for monitoring peer resource activity (Fig. 2). Students also identified qualitative input from other students (such as advice,
tips or warnings) as a valuable source of information for how to successfully navigate personal and
academic challenges. While LA typically concerns itself with the presentation of data for insights, this
suggests that systems that pair analysis with the communication of know-how information through a
human network can have special value for students.
Figure 2: Ideation cards, student sketches and first prototype of the Hive interface.
4. KEY STRATEGIES FOR ENGAGING STUDENTS
Below we highlight three key strategies we found useful in the participatory design workshops:
Explicitly Address Power Dynamics. Power imbalances, such as those between students and
researchers or designers, can hinder participatory processes (Dollinger, 2018). Developing trust
between participants can be a way to overcome this problem. Throughout the workshops, facilitators
reminded participants that it was safe to share opinions and to be provocative and encouraged
students to challenge facilitators’ views and assumptions. The strategy seemed to succeed, in that
participants scrutinized the very objective of the session. They pointed out that questions of happiness
and meaning may be more relevant to students than academics, prompting a discussion about the role of a university in students' lives.
Keep the Problem Space Flexible. Traditional design thinking advocates for a clear problem framing, but when working with students we saw value in allowing the group to return to problematizing the question, and we refrained from limiting students' ideation even while hunting for solutions. When students pointed to unexpected problem spaces, such as happiness versus academics, or the role of family and motivation, these were honored. We believe this played a role in their framing and
integrating solutions in novel ways, such as tools which used interactions between students' academic and emotional data to provide suggestions.
Use Vulnerability to Develop Psychological Safety. Psychological safety can be an important
precondition to creativity (Hunter et al., 2007), though one that is hard to achieve among strangers. As
facilitators, we used sharing vulnerability as a means to create a safe space, through ice breaker
exercises where participants worked together, engaged in physical contact and were silly in front of
each other. Likewise, participants were encouraged to share personal experiences and stories. The strategy's success is evident in video recordings of the workshops, which show participants shifting from hesitating to contribute to laughing and enjoying working together. This environment likely played a role in students becoming increasingly comfortable sharing unconventional paths to a solution, such as a tool in which achievement was monitored through the "health" of a virtual pet (like a Tamagotchi). Several of these more unconventional solutions emphasized affective and playful design, an area that has not yet been central in LA tool creation.
5. CONCLUSION AND NEXT STEPS
As some in the field have argued, having students actively participate in the process of co-creating LA tools can have a positive impact on ownership, understanding and use (de Quincey et al., 2019; Dollinger, 2018; Buckingham Shum et al., 2019). Our experience with participatory design suggests that addressing issues of power imbalance and developing an environment conducive to creative thinking can uncover innovative designs with the potential to improve students' experience and learning. At the time of this writing, we are preparing to test this hypothesis through user experience trials of prototypes developed from these workshops, to understand whether the process indeed produces student-facing LA tools whose purpose is valued by students and which are seen as aligning with their authentic needs.
REFERENCES
Ahn, J., Campos, F., Hays, M., & DiGiacomo, D. (2019). Designing in Context: Reaching beyond Usability
in Learning Analytics Dashboard Design. Journal of Learning Analytics, 6(2), 70-85.
Bodily, R., & Verbert, K. (2017). Trends and issues in student-facing learning analytics reporting
systems research. In Proceedings of the Seventh International Learning Analytics & Knowledge
Conference (pp. 309-318).
Buckingham Shum, S., Ferguson, R., & Martinez-Maldonado, R. (2019). Human-Centred Learning
Analytics. Journal of Learning Analytics, 6(2), 1-9.
Dollinger, M. (2018). Technology for the scalability of co-creation with students. Open Oceans:
Learning without borders, 346-350.
Hunter, S. T., Bedell, K. E., & Mumford, M. D. (2007). Climate for creativity: A quantitative review. Creativity Research Journal, 19(1), 69-90.
Pullin, G., & Newell, A. (2007, July). Focussing on extra-ordinary users. In Proceedings International
Conference on Universal Access in Human-Computer Interaction (pp. 253-262).
de Quincey, E., Briggs, C., Kyriacou, T., & Waller, R. (2019, March). Student Centred Design of a
Learning Analytics System. In Proceedings of the 9th International Conference on Learning
Analytics & Knowledge (pp. 353-362).
... This was consolidated in a special issue of the Journal of Learning Analytics on Human-Centred Learning Analytics ) and four PhDs have recently been completed with explicit attention to participatory design for LA, reflecting interest in the emerging generation of researchers (Dollinger, 2019;Echeverria, 2020). Moreover, the best practitioner paper at LAK20 focused on co-designing with learners (Sarmiento et al., 2020). A version of this workshop was successfully held in LAK20 and in ECTEL21. ...
... Holstein et al.(2019)featured a number of co-design techniques, namely card sorting, directed storytelling, semi-structured interviews, prototyping and behavioural mapping, to co-design a classroom analytics innovation with teachers. Whilst some examples of LA design processes have focused on engaging with students, these are just starting to emerge(Chen & Zhu, 2019; de Quincey et al., 2019;Prieto-Alvarez et al., 2018, Prieto et al., 2019Sarmiento et al., 2020). ...
Conference Paper
Full-text available
Commonly, time limits are a necessary part of every exam, and they may introduce an unintended influence such as speededness on the item and ability parameter estimates as they have not been accounted for while modeling the latent ability. The changepoint analysis (CPA) method can be used to obtain more accurate parameters by detecting speeded examinees, the location of change-point, removing speeded responses, and reestimating parameters. In the current study, several examinees were detected as speeded across five sections of a 250-item exam, and two main patterns were observed. In addition, speededness was further investigated using response times (RTs) per item and two patterns were observed for examinees with a decrease in performance after the estimated changepoint. Recommendations for practitioners, limitations, and future research were discussed in the conclusion section.
... For example, Dollinger and Lodge [13] and Prieto-Alvarez et al. [3] incorporated principles of co-creation and co-design (respectively) to give an active voice to teachers, learners and other educational stakeholders in decisions that would shape a LA innovation. Wise and Jung [46] and Sarmiento et al. [34] also adopted co-design methodologies, such as the use of ideation cards, sketching and prototyping to understand teachers' and learners' authentic needs, respectively, to allow educational stakeholders to define characteristics of the tools that they will end up using. Holstein et al. [20], through a longitudinal design project in which several design tools were used, demonstrated how co-design approaches can lead to unexpected technological innovations and can increase the likelihood of technology acceptance and adoption. ...
... Inspired by the emphasis on "question formulation" as a key step in the LA sensemaking process [32,33,44,46], and the need for HCD practices to address authentic needs of stakeholders in LA [20,34,36,46] we propose a question-driven LA design approach to ensure that the end-user LA interface explicitly addresses teachers' questions. We particularly focus on teachers to connect to the inquiry cycle that they commonly engage in to reflect on their practice. ...
Conference Paper
Full-text available
One of the ultimate goals of several learning analytics (LA) initiatives is to close the loop and support students’ and teachers’ reflective practices. Although there has been a proliferation of end-user interfaces (often in the form of dashboards), various limitations have already been identified in the literature such as key stakeholders not being involved in their design, little or no account for sense-making needs, and unclear effects on teaching and learning. There has been a recent call for human-centred design practices to create LA interfaces in close collaboration with educational stakeholders to consider the learning design, and their authentic needs and pedagogical intentions. This paper addresses the call by proposing a question-driven LA design approach to ensure that end-user LA interfaces explicitly address teachers’ questions. We illustrate the approach in the context of synchronous online activities, orchestrated by pairs of teachers using audio-visual and text-based tools (namely Zoom and Google Docs). This study led to the design and deployment of an open-source monitoring tool to be used in real-time by teachers when students work collaboratively in breakout rooms, and across learning spaces.
... Stakeholders must be involved in deriving actionable intelligence as the interpretation of data needs to consider the context that generated it [12], as does identifying the questions and conversations that data can inform [11,20,26]. The importance of stakeholder involvement in both identifying relevant questions and resulting actions has been highlighted in a number of studies analysing LAD usage [20,23]. ...
... Stakeholders must be involved in deriving actionable intelligence as the interpretation of data needs to consider the context that generated it [12], as does identifying the questions and conversations that data can inform [11,20,26]. The importance of stakeholder involvement in both identifying relevant questions and resulting actions has been highlighted in a number of studies analysing LAD usage [20,23]. In addition, Aguilar and Baek [1] found that students' reaction to a LAD was influenced by the design of its presentation, so both designers and educators need to consider the role of an LAD within their learning context, and how its presence may influence student behaviour, positively and negatively. ...
... (Hilliger et al., 2020) recognized desired course-level indicators as mostly descriptive and refered to the rejection of predictive indicators mentioned in former papers. Sarmiento et al. (2020) described their approach of a series of co-design workshops for learning analytics tools with and for students at New York University (USA). Based on in-depth interviews, personas were developed to map the biggest challenges of the learning experiences. ...
... Part 2 provides students an opportunity to think about the impact of machine learning algorithms on users, which fits well in such a course. Further, by integrating the discussion into the course and awarding points for participation, student time constraints as in (Sarmiento et al., 2020) were overcome. ...
Conference Paper
Selecting courses that optimally fit a student's situation can help reduce the risk of dropping out. Data exploration and performance prediction approaches can be applied to help students make these decisions. To ensure that an enrollment support system meets the needs of students, they should be involved as early as possible in the development process. This paper presents an initial assessment of some functionalities of a novel course enrollment support system based on student performance data. The results include a collection of indicators and sources of information, as well as an overview of needs and concerns. The insights gathered will help to develop a system that has the trust of students.
... Stakeholders must be involved in deriving actionable intelligence as the interpretation of data needs to consider the context that generated it [12], as does identifying the questions and conversations that data can inform [11,20,26]. The importance of stakeholder involvement in both identifying relevant questions and resulting actions has been highlighted in a number of studies analysing LAD usage [20,23]. ...
... Stakeholders must be involved in deriving actionable intelligence as the interpretation of data needs to consider the context that generated it [12], as does identifying the questions and conversations that data can inform [11,20,26]. The importance of stakeholder involvement in both identifying relevant questions and resulting actions has been highlighted in a number of studies analysing LAD usage [20,23]. In addition, Aguilar and Baek [1] found that students' reaction to a LAD was influenced by the design of its presentation, so both designers and educators need to consider the role of an LAD within their learning context, and how its presence may influence student behaviour, positively and negatively. ...
... In learning analytics, similar conversations have arisen about the need to bring stakeholders into our design processes [16] as part of a larger shift towards Human-Centered Learning Analytics [59]. Such efforts could go further by specifically thinking about groups of stakeholders that could be most negatively impacted by a learning analytics application and designing ways to intentionally include them in our discussions and processes of design [55]. ...
... For example, in Action Research, practitioners such as teachers become leads or co-leads of research [6,37]; in Participatory Action Research researchers pair with learners and practitioners to improve their spaces together and generate theory from these initiatives [3,19,21]; and in Youth Participatory Action Research, as students are included in the decision making and inquiry, the process becomes a learning experience as well as the site for theory building [12]. While learning analytics has begun to involve intended users as participants in processes of design [16,55,59], we can move further to consider subverting the top-down research and design hierarchies that shape the ways we work. ...
Conference Paper
This paper puts forth the idea of a subversive stance on learning analytics as a theoretically-grounded means of engaging with issues of power and equity in education and the ways in which they interact with the usage of data on learning processes. The concept draws on efforts from fields such as socio-technical systems and critical race studies that have a long history of examining the role of data in issues of race, gender and class. To illustrate the value that such a stance offers the field of learning analytics, we provide examples of how taking a subversive perspective can help us to identify tacit assumptions-in-practice, ask generative questions about our design processes and consider new modes of creation to produce tools that operate differently in the world.
... Like Carrillo et al. collaborating with five teachers during a design workshop to get informed about the educational uses of mind maps in the class in order to design a teachers' dashboard to aid them monitor learners' engagement during mind mapping activities [17]. And others share strategies and insights from a co-design process that involved university students in the development of a learning analytics tool [18]. Prieto-Alvarez et al. illustrated how a series of tools and techniques can be applied for co-designing learning analytics tools. ...
... Learners need to be engaged in the decision making if such systems are to be accepted and adopted. With the rise of human-centred learning analytics (HCLA) (Buckingham Shum et al., 2019) and student-centred learning analytics (Ochoa and Wise, 2020), more and more authors seek to engage critical stakeholders, identify their needs and design tools that address those needs through questionnaires (Hilliger et al., 2020), focus groups (Bennett and Folley, 2019) or co-design sessions (de Quincey et al., 2019;Sarmiento et al., 2020;Dollinger and Lodge, 2018). ...
Thesis
Learning dashboards are learning analytics (LA) tools built to make learners aware of their learning performance and behaviour and supporting self-reflection. However, most of the existing dashboards follow a “one size fits all” philosophy disregarding individual differences between learners, e.g., differences that stem from diverse cultural backgrounds, different motivations for learning or different levels of self-regulated learning (SRL) skills. In this thesis, we challenge the assumption that impactful learning analytics should be limited to making learners aware of their learning, but rather should encourage and support learners in taking action and changing their learning behaviour. We thus take a learner-centred approach and explore what information learners choose to receive on learning analytics dashboards and how this choice relates to their learning motivation and their SRL skills. We also investigate how dashboard designs support learners in making sense of the displayed information and how learner goals and level of SRL skills influence what learners find relevant on such interfaces. The large-scale experiments conducted with both higher education students and with MOOC learners bring empirical evidence as to how aligning the design of learning analytics dashboards with the learners’ intentions, learning motivation and the level of self-regulated learning skills influences the uptake and impact of such tools. The outcomes of these studies are synthesised in eleven recommendations for learning analytics dashboard design grouped according to the phase of the dashboard life-cycle to which they apply: (i) methodological aspects to be considered before designing dashboards, (ii) design requirements to be considered during the design phase and (iii) support offered to learners after the dashboard has been rolled out.
Article
Learning analytics has drawn the attention of academics and administrators in recent years as a tool to better understand students' needs and to tailor appropriate and timely responses. While many value the potential of learning analytics, it is not without critics, especially with regards to ethical concerns surrounding the level and type of data gathered, and scepticism on data's ability to measure something useful and actionable. This paper gathers the thoughts of key stakeholders', including staff and students, and their expectations of learning analytics, their priorities for using student data, and how they should be supported to act on the data, with the aim of aiding institutions with their plans to implement learning analytics. For this analysis we explored stakeholders' awareness, concerns, priorities and support needs with respect to effectively accessing, interpreting and utilising learning data through the use of surveys and focus groups. These were adapted from the previously published SHEILA framework protocols with additional topics added relating to awareness, uses of data, and support. The focus groups were used to capture prevalent themes, followed by surveys to gain perspectives on these themes from a wider stakeholder audience. Overall, results suggest there are significant differences in the perspectives of each of the stakeholders. There is also a strong need for both additional training and ongoing support to manage and realise stakeholders' understanding and goals around learning analytics. Further research is necessary to explore the needs of other stakeholders not captured in this study especially around differently abled students.
Conference Paper
Full-text available
Network analysis simulations were used to guide decision-makers while configuring instructional spaces on our campus during COVID-19. Course enrollment data were utilized to estimate metrics of student-to-student contact under various instruction mode scenarios. Campus administrators developed recommendations based on these metrics; examples of learning analytics implementation are provided.
Article
Full-text available
Researchers and developers of learning analytics (LA) systems are increasingly adopting human-centred design (HCD) approaches, with growing need to understand how to apply design practice in different educational settings. In this paper, we present a design narrative of our experience developing dashboards to support middle school mathematics teachers’ pedagogical practices, in a multi-university, multi-school district, improvement science initiative in the United States. Through documentation of our design experience, we offer ways to adapt common HCD methods — contextual design and design tensions — when developing visual analytics systems for educators. We also illuminate how adopting these design methods within the context of improvement science and research–practice partnerships fundamentally influences the design choices we make and the focal questions we undertake. The results of this design process flow naturally from the appropriation and repurposing of tools by district partners and directly inform improvement goals.
Conference Paper
Full-text available
Student-staff co-creation is a growing topic in higher education research. Framed as a mechanism for universities to better modify and meets the needs and expectations of students, student co-creation has a wealth of potential benefits. However, with the expansion of research, many scholars have stumbled upon a similar limitation, the scalability of co-creation. This issue is due to co-creation currently occurring in face-to-face (f2f) interactions (e.g. pedagogical consultants). However, co-creation can also arise in online spaces, enabled by technology, that could allow for greater scalability. In this paper, three strategies supported with technology to enhance the scalability of co-creation will be discussed including, crowdsourcing, customisation and prosumer behaviour with relevant industry examples for each as well as suggestions for practice in higher education. The limitations, benefits, and new directions for research will further be discussed. It is the aim of the paper to provoke ideas on how co-creation can be made more accessible to all students.
Conference Paper
Full-text available
Universal Access" is often focused on modifying main-stream products to respond to the demands of older and disabled people - which implies an extremely wide range of user characteristics. "Accessible" system design can produce systems which may be "accessible" but are in no sense "usable". Many system developers also seem to believe that a consideration of older and disabled people mean the abandonment of exciting and beautiful designs. In contrast, we recommend driving inclusive design from the margins not the centre, and that designers should consider a number of "extra-ordinary users" in depth as individual people, rather than as representatives of an age group and/or disability, and design for their desires, and tastes as well as their needs. This provides a reasonable design brief, and the consideration of extremes acts as an effective provocation within the design process. A number of case studies will illustrate the effectiveness of this approach. Ways in which communication with extreme users can be most effectively conducted are also described.
Article
The design of effective learning analytics extends beyond sound technical and pedagogical principles. If these analytics are to be adopted and used successfully to support learning and teaching, their design process needs to take into account a range of human factors, including why and how they will be used. In this editorial, we introduce principles of human-centred design developed in other, related fields that can be adopted and adapted to support the development of Human-Centred Learning Analytics (HCLA). We draw on the papers in this special section, together with the wider literature, to define human-centred design in the field of learning analytics and to identify the benefits and challenges that this approach offers. We conclude by suggesting that HCLA will enable the community to achieve more impact, more quickly, with tools that are fit for purpose and a pleasure to use.
Conference Paper
Current Learning Analytics (LA) systems are primarily designed with university staff members as the target audience; very few are aimed at students, and almost none have been developed with direct student involvement or have undergone comprehensive evaluation. This paper describes a HEFCE-funded project that employed a variety of methods to engage students in the design, development, and evaluation of a student-facing LA dashboard. LA was integrated into the delivery of 4 undergraduate modules with 169 student sign-ups. The design of the dashboard uses a novel approach: it seeks to understand the reasons why students want to study at university and maps their engagement and predicted outcomes to these motivations, with weekly personalised notifications and feedback. Students are also given the choice of how to visualise the data, either via a chart-based view or by being represented as themselves. A mixed-methods evaluation showed that students' feelings of dependability and trust in the underlying analytics and data were variable. However, students were mostly positive about the usability and interface design of the system, and almost all students, once signed up, did interact with their LA. The majority of students could see how the LA system could support their learning and said that it would influence their behaviour; in some cases, this had a direct impact on their levels of engagement. The main contribution of this paper is the transparent documentation of a User Centred Design approach that has produced forms of LA representation, recommendation, and interaction design that go beyond those used in current similar systems and have been shown to motivate students and impact their learning behaviour.
Conference Paper
We conducted a literature review on systems that track learning analytics data (e.g., resource use, time spent, assessment data) and provide a report back to students in the form of visualizations, feedback, or recommendations. This review included a rigorous article search process; 945 articles were identified in the initial search. After filtering out articles that did not meet the inclusion criteria, 94 articles were included in the final analysis. Articles were coded on five categories chosen based on previous work done in this area: functionality, data sources, design analysis, perceived effects, and actual effects. The purpose of this review is to identify trends in the current student-facing learning analytics reporting system literature and provide recommendations for learning analytics researchers and practitioners for future work.
Article
Hunter, S. T., Bedell, K. E., & Mumford, M. D. (2007). Climate for creativity: A quantitative review. Creativity Research Journal, 19(1), 69-90.