Conference Paper

User Requirements for Learning Analytics Dashboard in Maritime Simulator Training


Abstract

This study investigates user requirements for the design of a Learning Analytics Dashboard (LAD) tailored for assessment in maritime simulator training. User requirements for LAD components and visualization elements were examined, and perceptions toward the integration of a LAD in performance assessment were explored using Likert-scale questions. Data were collected from three Nordic maritime institutions. Situational awareness emerged as the most important component of a maritime LAD, with heat maps preferred for visualization. Both teachers and students hold positive perceptions toward the use of LADs. Disparities in user requirements and perceptions toward LAD use across universities, study levels, and simulator modality experience were also explored. These insights are pivotal for the advancement and tailoring of LADs in maritime simulator training contexts.
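
As a rough illustration of the respondents' preferred visualization, the sketch below renders Likert-scale ratings of LAD components as a heat map. It is a minimal sketch only: the component names, respondent groups, and scores are hypothetical and do not come from the study's data.

```python
# Minimal sketch: Likert-scale ratings of LAD components as a heat map,
# the visualization style the respondents preferred. All component
# names, groups, and scores below are hypothetical.
import matplotlib.pyplot as plt
import numpy as np

components = ["Situational awareness", "Communication", "Navigation accuracy"]
groups = ["Teachers", "BSc students", "MSc students"]
ratings = np.array([  # hypothetical mean Likert scores (1-5)
    [4.6, 4.4, 4.5],
    [4.1, 3.9, 4.0],
    [4.3, 4.2, 3.8],
])

fig, ax = plt.subplots()
im = ax.imshow(ratings, cmap="viridis", vmin=1, vmax=5)
ax.set_xticks(range(len(groups)))
ax.set_xticklabels(groups)
ax.set_yticks(range(len(components)))
ax.set_yticklabels(components)
for i in range(len(components)):        # annotate each cell with its score
    for j in range(len(groups)):
        ax.text(j, i, f"{ratings[i, j]:.1f}", ha="center", va="center", color="w")
fig.colorbar(im, ax=ax, label="Mean Likert rating (1-5)")
plt.tight_layout()
plt.show()
```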


... Previous research also highlights the simulator instructors' practice of providing instructions and feedback to the students as central for simulation-based training to facilitate professional learning (e.g., Hontvedt & Arnseth, 2013). In recent years, technologies such as learning analytics (LA), multimodal learning analytics (MMLA), and intelligent learning systems (ILS) have gained interest in MET, providing novel ways of automating feedback in simulation-based training and, as a result, offering new opportunities for remote training, skill acquisition, and competence development for maritime students (e.g., Smith et al., 2023; Munim et al., 2023; Yang et al., 2021). ...
Article
Full-text available
Collaborative learning in high-fidelity simulators is an important part of how master mariner students prepare for their future careers at sea by becoming part of a ship's bridge team. This study aims to inform the design of multimodal learning analytics to be used for providing automated feedback to master mariner students engaged in collaborative learning activities in high-fidelity navigation simulators. Through a design ethnographic approach, we analyze video records of everyday training practices at a simulator center in Scandinavia, exploring (a) how feedback is delivered to students during collaborative activities in full-mission simulators and (b) which sensors are needed, and why, for capturing the multimodal nature of professional performance, communication, and collaboration in simulation-based collaborative learning. Our detailed analysis of two episodes from the data corpus shows how the delivery of feedback during simulations consists of recurring, multidimensional, and multimodal feedback cycles, comprising instructors' close monitoring of students' actions to continuously assess the fit between the learning objectives and the ongoing task. Through these embedded assessments, feedback that draws on the rich semiotic resources of the simulated environment, while considering aspects of realism and authenticity, is provided. Considering the multidimensional and multimodal nature of feedback in professional learning contexts, we identify technologies and sensors needed for capturing professional performance in simulated environments.
Chapter
Full-text available
Developing an objective assessment approach in maritime simulator training can be a highly challenging task due to the complexity of simulating realistic scenarios, capturing relevant performance indicators, and establishing good assessment protocols. This study provides a synthesis of simulation scenario contexts, data collection tools, and data analysis approaches in published simulator training studies. A systematic literature review (SLR) approach was followed for identifying the relevant studies for in-depth content analysis. The findings reveal that the reviewed studies focused on full-mission simulator-based assessment using data collected from various tools, including surveys, eye-tracking, ECG, and video or voice recording. The findings hold relevance for the development of learning analytics for facilitating objective assessment in maritime simulator training.
Keywords: Navigation simulator; Navigation competence; Training assessment; Learning analytics
Conference Paper
Full-text available
Performance assessment is fundamental for skill and competence development in professional education and training. This paper reviews the methods and metrics for performance assessment during maritime simulator-based education and training, and establishes the current state of knowledge and tools that could be used to systematically evaluate students' learning progress and achievement, to support learning technology development and theoretically informed maritime education. Performance assessment under desktop simulator-, full-scale simulator-, VR/AR-, and cloud simulator-based training processes has been selected as the focus area of this study. High-quality professional education and training is important for improving workforce skills and sustainable development. This paper aims to provide an opportunity to reflect on learning assessment issues and on opportunities to elevate the future of learning.
Conference Paper
Full-text available
Performance evaluation is fundamental for skill formation and competence development in nautical education and training. Accurate observation and evaluation of students' performance during simulation processes are essential for providing targeted feedback, identifying areas for improvement, and assessing whether the necessary knowledge and skills required to operate vessels safely have been acquired. Instructors are typically the primary assessors; differences in judgement could arise due to variations in their preferences and experience with the tasks, which can in turn influence how performance is perceived and potentially result in variations in assessment results. This study reviews current methods for performance evaluation used in nautical training and puts forth a technology-assisted method that uses a combination of control inputs and audio, visual, and motion data to classify behaviours and provide feedback for improvement. This can be further developed into an objective and automatic performance evaluation method that serves as a viable supplement to relying solely on human judgment.
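
To make the proposed approach concrete, here is a minimal sketch of behaviour classification from fused control-input, audio, and motion features. All feature names, data, and labels are hypothetical stand-ins; a real system would extract such features from simulator logs and sensor streams, and the random forest is an assumed model choice, not the authors' method.

```python
# Minimal sketch: classifying performance from fused control-input,
# audio, and motion features. Features and labels are simulated.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 200
rudder_dev = rng.normal(size=n)      # e.g., mean rudder-angle deviation
speech_rate = rng.normal(size=n)     # e.g., speech rate from bridge audio
head_motion = rng.normal(size=n)     # e.g., head-motion variance
X = np.column_stack([rudder_dev, speech_rate, head_motion])
# hypothetical label: 1 = proficient, 0 = needs improvement
y = (-0.8 * rudder_dev + 0.4 * head_motion + rng.normal(0, 0.5, n) > 0).astype(int)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```
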
Article
Full-text available
With the exponential growth of educational data, increasing attention has been given to student learning supported by learning analytics dashboards. Related research has indicated that dashboards relying on descriptive analytics are deficient compared to more advanced analytics. However, there is a lack of empirical data demonstrating the performance of, and differences between, different types of analytics in dashboards. To investigate these, the study used a controlled, between-groups experimental method to compare the effects of descriptive and prescriptive dashboards on learning outcomes. Based on the learning analytics results, the descriptive dashboard describes the learning state, while the prescriptive dashboard provides suggestions for learning paths. The results show that both descriptive and prescriptive dashboards can effectively promote students' cognitive development. The advantage of the prescriptive dashboard over the descriptive dashboard lies in its promotion of learners' learning strategies. In addition, learners' prior knowledge and learning strategies determine the extent of the impact of dashboard feedback on learning outcomes.
Article
Full-text available
In order to successfully implement learning analytics (LA), we need a better understanding of student expectations of such services. Yet, there is still a limited body of research about students' expectations across countries. Student expectations of LA have been predominantly examined from a view that treats students as a group of individuals holding homogeneous views. This study examines students' ideal expectations (i.e., representing their wanted outcomes) and predicted expectations (i.e., unveiling what they realistically expect the LA service is most likely to be) of LA by employing a person-centered approach that allows exploring the heterogeneity that may be found in student expectations. We collected data from 132 students in the setting of Swedish higher education by means of an online survey. Descriptive statistics and Latent Class Analysis (LCA) were used for the analysis. Our findings show that students' ideal expectations of LA were considerably higher than their predicted expectations. The results of the LCA show that the Swedish students' expectations of LA were heterogeneous, both regarding their privacy concerns and their expectations of LA services. The findings of this study can be seen as a baseline of students' expectations, or a cross-sectional average, and can be used to inform student-centered implementation of LA in higher education.
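
As a rough illustration of the person-centered analysis, the sketch below clusters hypothetical ideal-versus-predicted expectation scores into latent groups. True LCA for categorical survey items is usually fit with dedicated packages; a Gaussian mixture is used here only as a simplified stand-in, and all numbers are invented.

```python
# Simplified stand-in for Latent Class Analysis: cluster hypothetical
# ideal-vs-predicted expectation scores with a Gaussian mixture and
# inspect the resulting class profiles.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
# columns: mean ideal-expectation score, mean predicted-expectation score (1-7)
low = rng.normal([4.0, 3.0], 0.5, size=(60, 2))
high = rng.normal([6.2, 4.5], 0.5, size=(72, 2))
X = np.clip(np.vstack([low, high]), 1, 7)

gmm = GaussianMixture(n_components=2, random_state=1).fit(X)
labels = gmm.predict(X)
for k in range(2):
    ideal, predicted = gmm.means_[k]
    print(f"class {k}: n={np.sum(labels == k)}, "
          f"ideal≈{ideal:.1f}, predicted≈{predicted:.1f}")
```
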
Article
Full-text available
During the COVID-19 pandemic, all Sri Lankan universities delivered lectures in fully online mode using Virtual Learning Environments. In fully online mode, students cannot track their performance level, their progress in the course, or their performance compared to the rest of the class. This paper presents research conducted at the University of Colombo School of Computing (UCSC), Sri Lanka, to solve the above problems and facilitate students' learning in fully online and blended learning environments using Learning Analytics. The research objective is to design and create a Technology Enhanced Learning Analytics (TELA) dashboard for improving students' motivation, engagement, and grades. The Design Science research strategy was followed to achieve the objectives of the research. Initially, a literature survey was conducted analyzing features and limitations of current Learning Analytics dashboards. Then, current Learning Analytics plugins for Moodle were studied to identify their drawbacks. Two surveys with 136 undergraduate students and interviews with 12 lecturers were conducted to determine the required features of the TELA system. The system was designed as a Moodle plugin. Finally, an evaluation of the system was done with third-year undergraduate students of the UCSC. The results showed that the TELA dashboard can improve students' motivation, engagement, and grades. As a result of the system, students could track their current progress and performance compared to their peers, which helps to improve their motivation to engage more in the course. Also, the increased engagement in the course enhances students' self-confidence, since students can see continuous improvement of their progress and performance, which in turn improves their grades.
Article
Full-text available
This study investigates current approaches to learning analytics (LA) dashboarding while highlighting challenges faced by education providers in their operationalization. We analyze recent dashboards for their ability to provide actionable insights which promote informed responses by learners in making adjustments to their learning habits. Our study finds that most LA dashboards merely employ surface-level descriptive analytics, while only a few go beyond and use predictive analytics. In response to the identified gaps in recently published dashboards, we propose a state-of-the-art dashboard that not only leverages descriptive analytics components but also integrates machine learning in a way that enables both predictive and prescriptive analytics. We demonstrate how emerging analytics tools can be used to enable learners to adequately interpret predictive model behavior, and more specifically to understand how a predictive model arrives at a given prediction. We highlight how these capabilities build trust and satisfy emerging regulatory requirements surrounding predictive analytics. Additionally, we show how data-driven prescriptive analytics can be deployed within dashboards to provide concrete advice to learners, and thereby increase the likelihood of triggering behavioral changes. Our proposed dashboard is the first of its kind in terms of the breadth of analytics it integrates, and is currently deployed for trials at a higher education institution.
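
A minimal sketch of the predictive-plus-prescriptive pattern described above: a model predicts a learner's chance of passing from engagement data, and a simple rule turns that prediction into concrete advice. The features, threshold, and advice text are illustrative assumptions, not the authors' implementation.

```python
# Predictive step: estimate pass probability from engagement features.
# Prescriptive step: map the prediction to actionable advice.
# All data and rules below are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n = 300
hours_online = rng.gamma(2.0, 2.0, n)     # weekly hours in the LMS
quizzes_done = rng.integers(0, 10, n)     # quizzes completed
X = np.column_stack([hours_online, quizzes_done])
y = ((0.3 * hours_online + 0.5 * quizzes_done + rng.normal(0, 1, n)) > 3).astype(int)

model = LogisticRegression().fit(X, y)

def prescribe(hours, quizzes, threshold=0.6):
    """Return (pass probability, advice string) for one learner."""
    p = model.predict_proba([[hours, quizzes]])[0, 1]
    if p >= threshold:
        return p, "On track - keep your current study routine."
    return p, "At risk - try completing two more quizzes this week."

print(prescribe(hours=1.5, quizzes=2))
```
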
Article
Full-text available
For service implementations to be widely adopted, it is necessary for the expectations of the key stakeholders to be considered. Failure to do so may lead to services reflecting ideological gaps, which will inadvertently create dissatisfaction among its users. Learning analytics research has begun to recognise the importance of understanding the student perspective towards the services that could be potentially offered; however, student engagement remains low. Furthermore, there has been no attempt to explore whether students can be segmented into different groups based on their expectations towards learning analytics services. In doing so, it allows for a greater understanding of what is and is not expected from learning analytics services within a sample of students. The current exploratory work addresses this limitation by using the three-step approach to latent class analysis to understand whether student expectations of learning analytics services can clearly be segmented, using self-report data obtained from a sample of students at an Open University in the Netherlands. The findings show that student expectations regarding ethical and privacy elements of a learning analytics service are consistent across all groups; however, those expectations of service features are quite variable. These results are discussed in relation to previous work on student stakeholder perspectives, policy development, and the European General Data Protection Regulation (GDPR).
Article
Full-text available
The advances in technology to capture and process unprecedented amounts of educational data have boosted interest in Learning Analytics Dashboard (LAD) applications as a way to provide meaningful visual information to administrators, parents, teachers and learners. Despite the frequent argument that LADs are useful for supporting target users and their goals to monitor and act upon the information provided, little is known about LADs' theoretical underpinnings and the alignment (or lack thereof) between LADs' intended outcomes and the measures used to evaluate their implementation. However, this knowledge is necessary to illuminate more efficient approaches in the development and implementation of LAD tools. Guided by the self-regulated learning perspective and using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) framework, this systematic literature review addressed this gap by examining whether and how learner-facing LADs' target outcomes align with the domain measures used to evaluate their implementations. Out of the 1297 papers retrieved from 15 databases, 28 were included in the final quantitative and qualitative analysis. Results suggested an intriguing lack of alignment between LADs' intended outcomes (mostly cognitive domain) and their evaluation (mostly affective measures). Based on these results and on the premise that LADs are designed to support learners, a critical recommendation from this study is that LADs' target outcomes should guide the selection of measures used to evaluate the efficacy of these tools. This alignment is critical to enable the construction of more robust guidelines to inform future endeavours in the field.
Practitioner notes
What is already known about this topic:
- There has been increased interest and investment in learning analytics dashboards to support learners as end-users.
- Learner-facing learning analytics dashboards are designed with different purposes, functionalities and types of data in an attempt to influence learners' behaviour, achievement and skills.
What this paper adds:
- This paper reports trends and opportunities regarding the design of learner-facing learning analytics dashboards, contexts of implementation, and types and features of learner-facing learning analytics dashboard studies.
- The paper discusses how affect and motivation have been largely overlooked as target outcomes in learner-facing learning analytics dashboards.
Implications for practice and/or policy:
- Based on the evidence gathered through the review, this paper makes recommendations for theory (eg, inclusion of motivation as an important target outcome).
- The paper makes recommendations related to the design, implementation and evaluation of learning analytics dashboards.
- The paper also highlights the need for further integration between learner-facing learning analytics dashboards and open learner models.
Article
Full-text available
We use a randomised experiment to study the effect of offering half of 556 freshman students a learning analytics dashboard and a weekly email with a link to their dashboard, on student behaviour in the online environment and final exam performance. The dashboard shows their online progress in the learning management systems, their predicted chance of passing, their predicted grade and their online intermediate performance compared with the total cohort. The email with dashboard access, as well as dashboard use, has positive effects on student behaviour in the online environment, but no effects are found on student performance in the final exam of the programming course. However, we do find differential effects by specialisation and student characteristics.
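
The core analysis of such a randomized experiment can be sketched in a few lines: compare final-exam scores between the dashboard and control arms. The scores below are simulated for illustration; the study's actual data and estimator are not reproduced here.

```python
# Minimal sketch: estimating the treatment effect of dashboard access
# with a two-sample t-test on simulated exam scores.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
control = rng.normal(6.4, 1.2, 278)   # hypothetical exam scores (0-10)
treated = rng.normal(6.5, 1.2, 278)   # dashboard arm

effect = treated.mean() - control.mean()
t, p = stats.ttest_ind(treated, control)
print(f"estimated effect: {effect:.2f} points (t={t:.2f}, p={p:.3f})")
```
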
Article
Full-text available
Learning analytics (LA) tools promise to improve student learning and retention. However, adoption and use of LA tools in higher education is often uneven. In this case study, part of a larger exploratory research project, we interviewed and observed 32 faculty and advisors at a public research university to understand the technological incentives and barriers related to LA tool adoption and use. Findings indicate that lack of a trustworthy technological infrastructure, misalignment between LA tool capabilities and user needs, and the existence of ethical concerns about the data, visualizations, and algorithms that underlie LA tools created barriers to adoption. Improving tool integration, clarity, and accuracy, soliciting the technological needs and perspectives of LA tool users, and providing data context may encourage inclusion of these tools into teaching and advising practice.
Article
Full-text available
The ability to participate in and contribute to debates is important for informal and formal learning. Especially when highly complex topics are addressed, it can be difficult to support learners engaged in effective group discussion and to keep track of all the information generated collectively during the discussion. Technology can help with engagement and reasoning in such large debates; for example, it can monitor how healthy a debate is and provide indicators on the distribution of participation. A particular framework that aims to harness the intelligence of small to very large groups with the support of structured discourse and argumentation tools is Contested Collective Intelligence (CCI). CCI tools provide a rich source of semantic data that, if processed appropriately, can yield sophisticated analyses of online discourse. This study presents a visualization dashboard with several visual analytics that display important aspects of online debates facilitated by CCI discussion tools. The dashboard was designed to improve sense-making and engagement in online debates and has been evaluated in two studies, a laboratory experiment and a field study, in the context of two higher education institutions. This article reports the results of a usability evaluation of the visualization dashboard. The descriptive findings suggest that participants with little experience in using analytic visualizations were able to perform well on certain tasks. This is a promising result for the application of such visualization technologies, as discourse-centric learning analytics interfaces can help support learners' engagement and reasoning in complex online debates.
Article
Full-text available
The importance of teachers in online learning is widely acknowledged to effectively support and stimulate learners. With the increasing availability of learning analytics data, online teachers might be able to use learning analytics dashboards to facilitate learners with different learning needs. However, deployment of learning analytics visualisations by teachers also requires buy-in from teachers. Using the principles of the technology acceptance model, in this embedded case study, we explored teachers' readiness for learning analytics visualisations amongst 95 experienced teaching staff at one of the largest distance learning universities by using an innovative training method called the Analytics4Action Workshop. The findings indicated that participants appreciated the interactive and hands-on approach, but at the same time were skeptical about the perceived ease of use of the learning analytics tools they were offered. Most teachers indicated a need for additional training and follow-up support for working with learning analytics tools. Our results highlight a need for institutions to provide effective professional development opportunities for learning analytics.
Conference Paper
Full-text available
In this empirical study, we investigate the role of national cultural dimensions as distal antecedents of the intensity of use of e-tutorials, which constitute the digital component of a blended learning course. Taking advantage of the context of a dispositional learning analytics application, we investigate cognitive processing strategies and metacognitive regulation strategies, motivation and engagement variables, and learning emotions as proximal antecedents of tool use and tool performance. We find that cultural diversity explains a substantial part of the variation in learning dispositions and tool use. The design of personalized learning paths will, therefore, profit from including national cultural dimensions as a relevant design factor.
Article
Full-text available
This paper presents a systematic literature review of the state-of-the-art of research on learning dashboards in the fields of Learning Analytics and Educational Data Mining. Research on learning dashboards aims to identify what data is meaningful to different stakeholders and how data can be presented to support sense-making processes. Learning dashboards are becoming popular due to the increased use of educational technologies, such as Learning Management Systems (LMS) and Massive Open Online Courses (MOOCs). The initial search of five main academic databases and GScholar resulted in 346 papers out of which 55 papers were included in the final analysis. Our review distinguishes different kinds of research studies as well as various aspects of learning dashboards and their maturity regarding evaluation. As the research field is still relatively young, most studies are exploratory and proof-of-concept. The review concludes by offering a definition for learning dashboards and by outlining open issues and future lines of work in the area of learning dashboards. There is a need for longitudinal research in authentic settings and studies that systematically compare different dashboard designs.
Article
Full-text available
The Learning Analytics Dashboard (LAD) is an application that shows students' online behavior patterns in a virtual learning environment. This supporting tool works by tracking students' log files, mining massive amounts of data to find meaning, and visualizing the results so they can be comprehended at a glance. This paper reviews previously developed applications to analyze their features. Based on the implications from the review of previous studies, as well as a preliminary investigation of the need for such tools, an early version of the LAD was designed and developed. In order to improve the LAD, a usability test incorporating a stimulated recall interview was conducted with 38 college students in two blended learning classes. Evaluation of the tool was performed in an experimental research setting with a control group, and additional surveys were conducted asking students about perceived usefulness, conformity, level of understanding of graphs, and their behavioral changes. The results indicated that this newly developed learning analytics tool did not significantly impact learning achievement. However, lessons learned from the usability and pilot tests support that visualized information impacts students' level of understanding, and that overall satisfaction with the dashboard acts as a covariate that impacts both the degree of understanding and students' perceived change of behavior. Taking into account the results of the tests and students' open-ended responses, a scaffolding strategy to help them understand the meaning of the information displayed was included in each subsection of the dashboard. Finally, this paper discusses future directions for improving the LAD so that it better supports students' learning performance, which might be helpful for those who develop learning analytics applications for students.
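
The covariate finding can be illustrated with a minimal ANCOVA-style sketch: regress understanding on dashboard use while controlling for satisfaction. The variables and data below are hypothetical, not the study's.

```python
# ANCOVA-style sketch: satisfaction as a covariate when estimating the
# effect of dashboard use on understanding. All data are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 80
df = pd.DataFrame({
    "group": rng.integers(0, 2, n),           # 1 = used the dashboard
    "satisfaction": rng.normal(3.5, 0.7, n),  # 1-5 satisfaction score
})
df["understanding"] = (2 + 0.4 * df["group"] + 0.5 * df["satisfaction"]
                       + rng.normal(0, 0.5, n))

model = smf.ols("understanding ~ group + satisfaction", data=df).fit()
print(model.params)  # group coefficient = adjusted effect estimate
```
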
Chapter
Learning analytics dashboards help instructors track and supervise students in online or hybrid education. To meet the needs of teachers and better understand their preferences and their expectations of such dashboards, an online questionnaire was administered to Moroccan teachers in higher education to determine their needs and uses of a dashboard in a blended learning scenario, and to learn more about the indicators they deem most relevant to their teaching activities.
Article
Technological advancements have generated a strong interest in exploring learner behavior data through learning analytics to provide both learner and instructor with process-oriented feedback in the form of dashboards. However, little is known about the typology of dashboard feedback relevant for different learning goals, learners and teachers. While most dashboards and the feedback that they give are based only on learner performance indicators, research shows that effective feedback needs also to be grounded in the regulatory mechanisms underlying learning processes and an awareness of the learner’s learning goals. The design artefact presented in this article uses a conceptual model that visualizes the relationships between dashboard design and the learning sciences to provide cognitive and behavioral process-oriented feedback to learners and teachers to support regulation of learning. A practical case example is given that demonstrates how the ideas presented in the paper can be deployed in the context of a learning dashboard. The case example uses several analytics/visualization techniques based on empirical evidence from earlier research that successfully tested these techniques in various learning contexts.
Article
This article is a comprehensive literature review of student-facing learning analytics reporting systems that track learning analytics data and report it directly to students. This literature review builds on four previously conducted literature reviews in similar domains. Out of the 945 articles retrieved from databases and journals, 93 articles were included in the analysis. Articles were coded based on the following five categories: functionality, data sources, design analysis, student perceptions, and measured effects. Based on this review, we need research on learning analytics reporting systems that targets the design and development process of reporting systems, not only the final products. This design and development process includes needs analyses, visual design analyses, information selection justifications, and student perception surveys. In addition, experiments to determine the effect of these systems on student behavior, achievement, and skills are needed to add to the small existing body of evidence. Furthermore, experimental studies should include usability tests and methodologies to examine student use of these systems, as these factors may affect experimental findings. Finally, observational study methods, such as propensity score matching, should be used to increase student access to these systems but still rigorously measure experimental effects.
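
As a sketch of the suggested observational approach, the code below fits a propensity model for dashboard use on a single hypothetical confounder, matches each user to the nearest non-user by propensity score, and compares outcomes. It is a toy illustration of propensity score matching, not the review's procedure.

```python
# Toy propensity score matching: model treatment (dashboard use) from a
# confounder, match treated to nearest control, compare outcomes.
# All variables and effect sizes are invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)
n = 400
gpa_prior = rng.normal(3.0, 0.4, n)  # confounder: prior GPA
used_dashboard = (rng.random(n) < 1 / (1 + np.exp(-(gpa_prior - 3)))).astype(int)
outcome = 0.8 * gpa_prior + 0.1 * used_dashboard + rng.normal(0, 0.3, n)

ps_model = LogisticRegression().fit(gpa_prior.reshape(-1, 1), used_dashboard)
scores = ps_model.predict_proba(gpa_prior.reshape(-1, 1))[:, 1]

treated = np.where(used_dashboard == 1)[0]
control = np.where(used_dashboard == 0)[0]
# nearest-neighbor match on the propensity score (with replacement)
matches = control[np.abs(scores[control][None, :]
                         - scores[treated][:, None]).argmin(axis=1)]
att = (outcome[treated] - outcome[matches]).mean()
print(f"matched estimate of the effect of dashboard use: {att:.3f}")
```
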
Article
More and more learning in higher education settings is being facilitated through online learning environments. Students’ ability to self-regulate their learning is considered a key factor for success in higher education. Learning analytics offer a promising approach to supporting and understanding students’ learning processes better. The purpose of this study was to investigate students’ expectations toward features of learning analytics systems and their willingness to use these features for learning. A total of 20 university students participated in an initial qualitative exploratory study. They were interviewed about their expectations of learning analytics features. The findings of the qualitative study were complemented by a quantitative study with 216 participating students. Findings show that students expect learning analytics features to support their planning and organization of learning processes, provide self-assessments, deliver adaptive recommendations, and produce personalized analyses of their learning activities.