© The Author(s) 2024. Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
RESEARCH ARTICLE
Delcker et al. Int J Educ Technol High Educ (2024) 21:18
https://doi.org/10.1186/s41239-024-00452-7
International Journal of Educational Technology in Higher Education
First-year students' AI-competence as a predictor for intended and de facto use of AI-tools for supporting learning processes in higher education
Jan Delcker1*, Joana Heil1, Dirk Ifenthaler1,2, Sabine Seufert3 and Lukas Spirgi3
*Correspondence: delcker@uni-mannheim.de
1 University of Mannheim, L4, 1, 68161 Mannheim, Germany
2 Curtin University, Perth, Australia
3 University of St. Gallen, St. Gallen, Switzerland
Abstract
The influence of Artificial Intelligence on higher education is increasing. As important drivers for student retention and learning success, generative AI-tools like translators, paraphrasers and most lately chatbots can support students in their learning processes. The perceptions and expectations of first-year students related to AI-tools have not yet been researched in-depth. The same can be stated about necessary requirements and skills for the purposeful use of AI-tools. The research work examines the relationship between first-year students' knowledge, skills and attitudes and their use of AI-tools for their learning processes. Analysing the data of 634 first-year students revealed that attitudes towards AI significantly explain the intended use of AI-tools. Additionally, the perceived benefits of AI-technology are predictors for students' perception of AI-robots as cooperation partners for humans. Educators in higher education must facilitate students' AI-competencies and integrate AI-tools into instructional designs. As a result, students' learning processes will be improved.
Keywords: Artificial Intelligence, Higher education, Learning process, AI tool, Chatbot
Introduction
AI-robots are agents programmed to fulfill tasks traditionally done by humans (Dang & Liu, 2022). The number of interactions between humans and AI-robots is increasing, which is a strong indicator of the integration of AI-technology into the lives of humans (Kim et al., 2022). A popular example is the deployment of chatbots on a website. These AI-robots can guide users and respond to basic user requests (Larasati et al., 2022). The technology behind semi-automated and fully automated human-like task fulfillment is based on AI-methods and AI-algorithms (Gkinko & Elbanna, 2023). These AI-methods and -algorithms form the main programming characteristics of AI-robots (Woschank et al., 2020). The features lead to an increasing similarity in the performance of humans and AI-robots (Byrd et al., 2021). Additionally, the appearance and behavior of AI-robots are becoming more human-like (Hildt, 2021). While most machines are easily distinguishable from humans, AI-robots might be hard to identify (Desaire et al., 2023), and
the ability to identify AI-robots is one of the many challenges accompanying these new technologies. As a result, humans even begin to attribute human-like understanding and mental capacities to AI-robots (Roesler et al., 2021).
Accordingly, new and changing demands in humans' digital competencies are required to deal with the various applications of AI-robots in all sectors of human life (Seufert & Tarantini, 2022). One of these fields is higher education, which is strongly affected by the introduction of AI-technology and AI-robots (Ouyang et al., 2022; Popenici & Kerr, 2017). Future applications for AI-technology can be found at all levels of higher education (Ocaña-Fernández et al., 2019). On the student level, virtual AI teaching assistants (Kim et al., 2020; Liu et al., 2022) and intelligent tutoring systems (Azevedo et al., 2022; Latham, 2022) have the capability to guide individual learner paths (Brusilovsky, 2023; Rahayu et al., 2023). Educators might implement automated grading and assessment tools (Heil & Ifenthaler, 2023; Celik et al., 2022) or create educational content with generative AI (Bozkurt & Sharma, 2023; Kaplan-Rakowski et al., 2023). The administration of higher education institutions has to adapt its policies to the new technology (Chan, 2023), while incorporating learning analytics tools to improve study conditions, reduce drop-out rates, and adapt study programs (Aldowah et al., 2019; Ifenthaler & Yau, 2020; Ouyang et al., 2023; Tsai et al., 2020). These developments are embedded into national policy-making processes, such as creating ethics guidelines (Jobin et al., 2019) and competence frameworks (Vuorikari et al., 2022) for AI-technology.
According to recent studies, first-year students enter institutions of higher learning
with various perceptions and expectations about university life, for instance, in terms
of social aspects, learning experiences, and academic support (Houser, 2004). While
students’ generic academic skills appear to be well-established for coping with higher
education requirements, their competencies related to AI seem to be limited (Ng et al., 2023).
As of now, there are no conceptual frameworks that cover the use of human-like AI-technology with a focus on first-year students within the context of higher education. Thus, this study targets this research gap. For this purpose, seven functionalities of AI-tools have been conceptualized for their application in the context of higher education. This conceptualization is a helpful differentiation to analyze the intent and frequency of use, as well as possible indicators that might affect intent and frequency of use. As a result, implications for the further implementation of AI-tools in higher education learning processes will be derived.
Background
First‑year students
First-year students’ perceptions and expectations and how they cope with academic
requirements in higher education have been identified as important factors for learning
success and student retention (Mah & Ifenthaler, 2018; Tinto, 1994; Yorke & Longden,
2008). Several studies identified a mismatch between first-year students’ perceptions and
academic reality (Smith & Wertlieb, 2005). Furthermore, research indicates that many
first-year students do not know what is expected at university and are often academically unprepared (Mah & Ifenthaler, 2017; McCarthy & Kuh, 2006). Students' preparedness is particularly relevant concerning generic skills such as academic competencies,
which they should possess when entering university (Barrie, 2007). Numerous aspects,
including sociodemographic features, study choices, cognitive ability, motivation, per-
sonal circumstances, and academic and social integration, have been linked to first-year
students’ learning success and retention in higher education (Bean & Eaton, 2020; Sanavi
& Matt, 2022). Mah & Ifenthaler (2017) identified five academic competencies for suc-
cessful degree completion: time management, learning skills, self-monitoring, technol-
ogy proficiency, and research skills. Accordingly, coping with academic requirements is an important driver of student retention in higher education (Thomas, 2002). Moreover, students' perceptions of their first year can affect student success (Crisp et al., 2009).
More recently, it has been argued that competencies related to AI are an important driver for student retention and learning success (Bates et al., 2020; Mah, 2016; Ng et al., 2023). Nonetheless, first-year students' perceptions, expectations, and academic competencies for coping with academic requirements related to AI-tools have not yet been researched in-depth.
Conceptualization of AI-tools in higher education
Dang and Liu (2022) propose a differentiation of AI-robots, which is also used in this study. They categorize AI-robots into "mindful" tools (AI-robots with increasingly human characteristics) and "mindless" tools (AI-robots with machine characteristics). The so-called mindful AI-robots can perform more complex tasks, react to the prompts of the users in a more meaningful way, and are designed to act and look like humans. On the other hand, mindless AI-robots perform less complex tasks and appear more like machines. In the following, a short overview of AI-tools is provided, including their main functionality and examples for practical use in higher education learning processes:
Mindless AI‑robots
1) Translation text generators: These tools use written text as input and translate the text into a different language. Translation text generators can help to quickly translate text into the language a student is most familiar with or to translate into a language that is required by the assignment. Many study programs require students to hand in (some) papers in a language different from the study program's language (Galante, 2020). Two of the most prominent translation text generators are Google Translate and DeepL (Martín-Martín et al., 2021).
2) Summarizing/rephrasing text generators: These tools use written text as input and can change the structure of the text. On the one hand, they are used to extract critical information, keywords, or main concepts out of structured text, reducing the complexity of the input text. In this way, they help the user focus on the input text's most important aspects, allowing them to get a basic understanding of complex frameworks. Summarizing text, such as research literature or lecture slides, is an important learning strategy in the context of higher education (Mitsea & Drigas, 2019). On the other hand, these text generators can rephrase text input, an important task when writing research papers: In most cases, written research assignments include some theoretical chapter based on existing research literature. Students must rephrase and restructure existing research literature to show their understanding of concepts and theories (Aksnes et al., 2019). Quillbot is an example of such a rephrasing tool (Fitria, 2021).
3) Writing assistants: Writing assistants can enhance the quality of written text. These tools automatically check for grammar and spelling mistakes while the text is being created. Furthermore, these tools can give recommendations to the writer to improve the language used: they can provide suggestions for alternative formulations to avoid colloquial language and unnecessary repetitions. Writing assistants are usually a part of word processors (e.g., Microsoft Word), but standalone programs or extensions such as Grammarly also exist (Koltovskaia, 2020).
4) Text generators: These tools can automatically generate written text. Text generators take short prompts as input and produce text based on this input. The output text is mainly used for blog entries, text-based social media posts, or Twitter messages. They can be differentiated from chatbots, as they cannot produce more complex pieces of text. WriteSonic is an example of such a text generator tool (Almaraz-López et al., 2023).
Mindful AI‑robots
5) Chatbots: Chatbots are applications that simulate human interactions (Chong et al., 2021). In the context of business, they are generally used to answer customer questions automatically. In education, these chatbots help to guide learners through online environments or administrative processes. With the release of ChatGPT, a new kind of chatbot was introduced. These chatbots can produce various output formats, including working algorithms, presentations, or pictures, based on prompts that are very similar to human interactions (Almaraz-López et al., 2023; Fauzi et al., 2023; Fuchs, 2023). Students can use chatbots to automatically produce content that is traditionally used as part of instructional designs, especially final assessments.
6) Virtual avatars: Virtual avatars are digital representations of living beings. They can be used in online classroom settings to represent teachers and learners alike. In these classroom settings, virtual representations, such as Synthesia, have been shown to improve students' learning performance compared to classes without virtual representation (Herbert & Dołżycka, 2022).
7) Social-humanoid robots: These tools not only simulate human behavior and perform human tasks, but in many cases, social-humanoid robots are also built close to human complexity, featuring hands, legs, and faces (van Pinxteren et al., 2019). They can perform human-like mimicry to various degrees. Currently, these social-humanoid robots are used as servers in restaurants and are being tested in medical and educational institutions (Henschel et al., 2021).
AI-competencies and AI-ethics
The European DigComp Framework 2.2 is a comprehensive framework that organizes different components of digital competencies deemed essential for digitally competent citizens (Vuorikari et al., 2022). Within this framework, AI literacy can be found in three dimensions: knowledge, proficiency, and attitudes. Basic ideas about the functionality and application areas of AI technology are allocated to the knowledge dimension. This dimension also holds theoretical knowledge about AI laws and regulations, such as the European data protection regulation. The ability of a person to take advantage of AI and use it to improve various aspects of their life can be found in the proficiency dimension. Successfully deploying AI technology to solve problems requires the capability to
choose adequate tools and consequently control these chosen tools. Competent citizens must be able to form an opinion on AI technology's benefits, risks, and disadvantages. This allows them to participate in political and social decision-making processes. Through a meta-analysis of guidelines, Jobin et al. (2019) identified eleven ethical principles which must be considered when working with AI, such as transparency, justice, fairness and trust. Hands-on examples are the guidelines by Diakopoulos et al. (2016) as well as Floridi et al. (2018). The attitude dimension holds these competencies. As with many technological advancements, higher education will be one of the main drivers for facilitating digital AI-competencies (Cabero-Almenara et al., 2023; Ehlers & Kellermann, 2019).
Furthermore, AI technology will change the various learning processes within higher education (Kim et al., 2022). This includes the perspective of educators (Kim et al., 2022), learners (Zawacki-Richter et al., 2019), and administration alike (Leoste et al., 2021). Although research indicates these impacts, research on AI-robots in higher education is scarce, mainly because higher education institutions rarely use the different applications broadly (Kim et al., 2022; Lim et al., 2023).
The functionalities of the different tools offer students various potential applications for learning processes. Following the Unified Theory of Acceptance and Use of Technology (UTAUT), the intent to use new digital tools as well as the actual usage of technology might be influenced by the expectation of performance, the expectation of effort, social influence, and facilitating conditions (Venkatesh et al., 2003). Strzelecki (2023) states that the assumptions made by UTAUT also hold for AI-tools, more specifically ChatGPT, although he could not identify a significant effect from facilitating conditions. In accordance with the DigComp 2.2 framework, this study focuses on students' attitudes, proficiency, and knowledge regarding AI-technology as additional constructs influencing the intent to use and actual usage of AI-tools.
Furthermore, the study builds on the considerations by Dang and Liu (2022) and examines which constructs influence students' perception of AI-technology as competition and cooperation for humans: Research in the field of AI uncovers a range of possible outcomes from increasing AI integration into human society (Einola & Khoreva, 2023). Some argue that AI technology will compete with humans in the workplace, leading to massive job losses (Zanzotto, 2019) and the deskilling of human workers (Li et al., 2023). On the other hand, AI has the potential to be a cooperation partner for humans by automating processes (Bhargava et al., 2021; Joksimovic et al., 2023) or relieving humans from physical and psychological stress (Raisch & Krakowski, 2021).
Hypotheses
This research project aims to better understand first-year students' perceptions as well as the intended and de facto use of AI-tools. While AI-competencies are understood as an essential driver for learning success and student retention (Ng et al., 2023), the following hypotheses emerge from the research gaps identified for the context of higher education:
Hypothesis 1 The underlying constructs of AI-competencies (skills, attitude, knowledge) have a positive effect on the intention to use AI-robots, while the intention to use AI-robots has a positive effect on the actual use of AI-robots.
Hypothesis 2a Students’ AI-competencies and the perceived benefits of AI-technology
are predictors for students’ perception of AI-robots as cooperation partners for humans.
Hypothesis 2b Students’ AI-competencies and the perceived risks of AI-technology
are predictors for students’ perception of AI-robots as competition for humans.
Method
Data collection and participants
An online questionnaire was designed to collect data from first-year students at a German and a Swiss university. Possible participants were asked to take part in the survey through an e-mail, which was sent through the universities' e-mail systems. In total, N = 638 first-year students participated in the survey. On average, they were 20.62 years old, with a standard deviation of 2.25 years. Of the N = 638 students, N = 309 identified as male, N = 322 as female, and N = 7 as non-binary. The lowest average use of the mindless tools could be found in paraphrasing and summarizing tools (M = 1.13, SD = 1.51). The use of online writing assistants was slightly higher (M = 1.94, SD = 1.76), and the highest average usage could be found in online translation tools (M = 3.53, SD = 1.18). The average use of mindless robots was relatively low (M = 2.2, SD = 1.05). The willingness to use the robots ranged from the lowest in virtual avatars (M = 2.23, SD = 1.13) to the highest in online translation tools (M = 3.16, SD = 1.17).
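As an illustration of how such descriptive statistics can be obtained, the sketch below computes per-tool means and standard deviations in R. The data frame `students` and the item names (use_translation, use_paraphrasing, use_writing_assistant) are hypothetical placeholders, not the variable names of the original dataset.

```r
# Minimal sketch, assuming usage items are numeric columns of `students`
usage_items <- c("use_translation", "use_paraphrasing", "use_writing_assistant")

# Mean (M) and standard deviation (SD) for each usage item
sapply(students[usage_items], function(x) {
  c(M = mean(x, na.rm = TRUE), SD = sd(x, na.rm = TRUE))
})
```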
Instrument
The online questionnaire consists of three parts. The instrument's first part comprises questions regarding knowledge, skills, and attitudes regarding AI-technology (Vuorikari et al., 2022). The different AI-robots are presented in part 2 of the questionnaire. For each tool, current and intended usage was gathered, following the unified theory of acceptance and use of technology (UTAUT) (Venkatesh et al., 2003). The items were formulated to match the different tools with those tasks that are relevant for students, such as writing assignments or preparing for exams. In addition, ethical considerations for each tool were prompted (Vuorikari et al., 2022). The actual use of the robots by the participants was evaluated with a 6-point Likert scale and their potential willingness to use them with the help of a 5-point Likert scale. The third part of the instrument summarizes items that collect demographic data. The instrument can be found in Additional file 1.
Analysis
A path analysis was conducted based on the factors of AI-competence taken from the DigComp 2.2 framework (skills, attitude, knowledge), in combination with the UTAUT model's assumption that the intention to use technology influences the actual use of AI-tools. A visualization of the model can be found in Fig. 1. The path analysis was done with RStudio, more specifically, the package lavaan (Rosseel, 2012).
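To illustrate the analysis described above, a minimal lavaan sketch of such a path model is given below. The data frame `students` and the composite score names (skills, attitude, knowledge, intention, actual_use) are assumptions for illustration only; the original variable names and scoring follow the instrument in Additional file 1.

```r
library(lavaan)

# Path model: AI-competence facets predict the intention to use AI-tools,
# and intention predicts de facto use (assumed composite score names)
model <- '
  intention  ~ skills + attitude + knowledge
  actual_use ~ intention
'

fit <- sem(model, data = students)

# Standardized path coefficients and global fit indices (chi-square, CFI, TLI, RMSEA)
summary(fit, standardized = TRUE, fit.measures = TRUE)
fitMeasures(fit, c("chisq", "df", "pvalue", "cfi", "tli", "rmsea"))
```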
Multiple linear regression analyses were conducted in RStudio to answer Hypotheses
2a and 2b.
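The corresponding regressions for Hypotheses 2a and 2b can be sketched as follows; again, the outcome and predictor names are hypothetical stand-ins for the composite scores used in the study.

```r
# Hypothesis 2a: AI-competence and perceived benefits predicting the rating of
# AI as a cooperation partner for humans (assumed variable names)
m_cooperation <- lm(cooperation_rating ~ ai_competence + perceived_benefits,
                    data = students)

# Hypothesis 2b: AI-competence and perceived risks predicting the rating of
# AI as competition for humans
m_competition <- lm(competition_rating ~ ai_competence + perceived_risks,
                    data = students)

summary(m_cooperation)  # coefficients, t-tests, R-squared for H2a
summary(m_competition)  # coefficients, t-tests, R-squared for H2b
```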
Results
Hypothesis 1: the influence of skills, attitude, and knowledge on the intended use of AI-tools
The model has a relatively good fit, with a non-significant chi-square, χ2(3, N = 638) = 7.3, p = 0.06, and the Comparative Fit Index (CFI) = 0.96 is above the respective cut-off value of 0.95. The Tucker-Lewis Index (TLI) = 0.91 is slightly lower than 0.95. The RMSEA = 0.05 is below 0.08.
The results indicate a significant positive influence of attitude (β = 0.26, p < 0.01) and a significant negative influence of skills (β = −0.1, p = 0.02) on the intention to use the tools. Knowledge seems to have no significant impact (β = −0.06, p = 0.19). Furthermore, the intention to use the AI-tools significantly predicts their actual use (β = 0.33, p < 0.01, R2 = 0.11). The path analysis is shown in Fig. 1.
Fig. 1 Path analysis: skills, attitudes, and knowledge as predictors for intended and de facto use of AI-tools
Hypothesis 2a: perceived benefits as indicators for AI as cooperation partner
A multiple linear regression was conducted to analyze the influencing factors on students' rating of AI as cooperation partners. Concerning students' rating of AI as a cooperation opportunity, the influence of AI-competence and the perceived benefits of AI were included in the analysis. Both factors are significant predictors and explain 15.41% of the variation in the estimation of AI as a cooperation possibility for humans, F(2, 635) = 57.84, p < 0.01. Both AI-competence, β = 0.22, p < 0.01, t(637) = 5.9, and perceived benefits, β = 0.27, p < 0.01, t(637) = 7.2, are significant predictors.
Hypothesis 2b: perceived risks as indicators for AI as competition
A multiple linear regression was conducted to analyze the influencing factors on students' rating of AI as a competitor for humans. When considering the influence of
perceived risks and AI-competence on students' rating of AI as competition, both factors explain 2.26% of the variation in the dependent factor, F(2, 635) = 7.33, p < 0.01. While AI-competence is a significant predictor, β = 0.09, t(637) = 10.2, p < 0.01, perceived risk is not, β = 0.03, t(637) = 1.64, p = 0.1.
Discussion
Findings
The analyzed data provide insights into the actual use and implementation of AI-tools in students' learning processes during their entry phase. So far, mindless AI-tools are favored by the participants compared to mindful tools. These mindless AI-tools provide useful functionalities regarding tasks that can be considered typical for higher education programs, such as written papers, presentations, or reports (Flores et al., 2020; Medland, 2016). These functionalities include translations (Einola & Khoreva, 2023) or summaries (Fitria, 2021). The analysis results show that the intention to use these tools is affected by students' perceived skills, knowledge, and attitudes (Venkatesh et al., 2003). A positive attitude has a positive effect on the intended use of AI-tools. A positive attitude includes a general interest in and openness about AI technology, but also a strong interest in a critical discussion about AI technology. Students' curiosity about the new technology leads to actual testing and might give students a better understanding of what the AI-tools have to offer them in practice, reflecting on the challenges and opportunities of AI-technology. The findings of the path analysis indicate that proficiency in controlling the tools does not have a positive effect on the intended use. This result can be explained through the aforementioned importance of attitude towards AI-technology (Almaraz-López et al., 2023; Vuorikari et al., 2022). Students' curiosity for the new technology might outweigh their perceived need for a distinct AI proficiency.
Additionally, many AI-tools can be easily accessed and give the impression of being easy to use. The same argument holds for the construct of knowledge. The students' intention to use AI-tools for learning processes appears to be independent of their theoretical knowledge of the systems' internal functionalities. While this knowledge might help students to better understand the results they receive from AI-tools or increase their ability to formulate adequate prompts (Zamfirescu-Pereira et al., 2023), the absence of theoretical knowledge does not present itself as a barrier to the intended use.
Implications
These findings have important implications for the further implementation of AI-tools in higher education learning processes (Heil & Ifenthaler, 2023; Celik et al., 2022; Kaplan-Rakowski et al., 2023; Latham, 2022; Liu et al., 2022; Ocaña-Fernández et al., 2019). At first glance, using AI-tools does not require prior practical and theoretical training from students. At the same time, students might not be able to fully apprehend the possibilities of AI-tools or effectively use them to improve their learning processes (Alamri et al., 2021; Børte et al., 2023). Educators should, therefore, integrate these tools into their instructional design practices and pair them with additional practices to facilitate the students' AI-competencies (Lindfors et al., 2021; Sailer et al., 2021; Zhang et al., 2023). As a result, students
will be able to use AI-tools to improve their learning processes, while simultaneously being
able to critically reflect on the input, output, and influence of the respective AI-tools.
The results of Hypotheses 2a and 2b show a significant effect of AI-competence and the perceived benefits of AI-tools on the expected cooperation potential of AI technology (Bhargava et al., 2021; Raisch & Krakowski, 2021). Instructional designers and other stakeholders in higher education need to provide best-practice examples of how AI-tools can be used to positively influence learning processes if they want to facilitate the usage of the respective tools.
Limitations and outlook
ChatGPT was not yet openly accessible when the data for this survey was collected. The overall usage of AI-tools has likely increased since ChatGPT was introduced to a broader user base (Strzelecki, 2023). The presence of ChatGPT in media and scientific discussions might have led students to look into other AI-tools, such as DeepL (Einola & Khoreva, 2023) or Quillbot (Fitria, 2021), as well. The composition of the student sample also limits the study's results. While the university in Switzerland is more open towards the usage of AI technology, policymakers in German universities tend to be more restrictive towards the use of AI (von der Heyde et al., 2023). To overcome the limitations of the sample, future studies will include students from a broader range of academic years. As a result, the generalizability of the results will be improved.
The present discussion about ChatGPT and the influence of AI-tools in general on higher education underlines the need to educate learners about AI and to foster their respective AI-competencies (Almaraz-López et al., 2023; Chong et al., 2021; Fauzi et al., 2023). A second study is currently being conducted to analyze how the introduction of ChatGPT to the public sphere has changed students' attitudes toward AI and their use of AI-tools, both intended and factual. It can be assumed that the powerful tool leads to an increasing awareness of AI, as well as a broad usage across different study programs and for various tasks within higher education programs. Further studies should include additional research approaches to collect additional data about students' experiences and usage of AI-tools, such as a think-aloud study or interviews with students. These approaches give insights into the teaching strategies which might help students to develop AI-competences and improve their learning outcomes through AI-tools. An example of such a strategy is a class that teaches students to write scientific texts with the support of ChatGPT. A comprehensive understanding of necessary competencies and pedagogical concepts is the foundation for holistic AI literacy programs. These programs need to be accessible for all students and flexible enough to adhere to different levels of prior knowledge and learning preferences. Another important task for ongoing research projects is the analysis of the relationship between AI-competencies, pedagogical concepts and the learning outcomes of students, especially regarding the different tools which might be used in the future. Additionally, longitudinal studies might be best suited to gather detailed data throughout AI-supported learning processes.
Conclusion
The increasing capabilities of AI-tools offer a wide range of possible applications in higher education institutions. Once the gap between theoretical opportunities and applicable solutions is closed, multiple stakeholders, such as administrators, educators and
students, will be able to benefit from individualized learning paths, automated feedback or data-based decision-making processes. Lately, an increasing number of research works has been published to close this gap. The introduction of ChatGPT to the general public has fueled the discussions about AI technology, especially in the field of higher education institutions. One of the challenges in the implementation of AI into learning processes is the facilitation of students' AI-competencies. Students need the practical skills, theoretical knowledge and comprehensive attitudes to unlock the potential of AI-technology for their learning processes. Educators and higher education institutions have the responsibility to create safe learning environments which foster points of contact with AI as well as possibilities to actively engage with AI. These learning environments must provide students with access to relevant AI-tools and must be founded on holistic legal frameworks and regulations.
Supplementary Information
The online version contains supplementary material available at https://doi.org/10.1186/s41239-024-00452-7.
Additional file 1. AI-Competence Instrument.
Acknowledgements
Not applicable.
Author contributions
All authors participated in planning the study, designing the data collection tools, collecting and analyzing data for the
study. The first author (corresponding author) led the writing up process, with contributions from the second and third
authors. All authors read and approved the final manuscript.
Funding
Not applicable.
Availability of data and materials
The data supporting this study’s findings are available on request from the corresponding author. The data are not
publicly available due to privacy or ethical restrictions.
Declarations
Competing interests
The authors declare no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper. The authors declare no conflict of interest.
Received: 22 September 2023 Accepted: 1 March 2024
References
Aksnes, D. W., Langfeldt, L., & Wouters, P. (2019). Citations, citation indicators, and research quality: An overview of basic
concepts and theories. SAGE Open, 9(1), 215824401982957. https:// doi. org/ 10. 1177/ 21582 44019 829575
Alamri, H. A., Watson, S., & Watson, W. (2021). Learning technology models that support personalization within blended
learning environments in higher education. TechTrends, 65(1), 62–78. https:// doi. org/ 10. 1007/ s11528- 020- 00530-3
Aldowah, H., Al-Samarraie, H., & Fauzy, W. M. (2019). Educational data mining and learning analytics for 21st century
higher education: a review and synthesis. Telematics and Informatics, 37, 13–49. https:// doi. org/ 10. 1016/j. tele. 2019.
01. 007
Almaraz-López, C., Almaraz-Menéndez, F., & López-Esteban, C. (2023). Comparative study of the attitudes and perceptions
of university students in business administration and management and in education toward artificial intelligence.
Education Sciences, 13(6), 609. https:// doi. org/ 10. 3390/ educs ci130 60609
Azevedo, R., Bouchet, F., Duffy, M., Harley, J., Taub, M., Trevors, G., Cloude, E., Dever, D., Wiedbusch, M., Wortha, F., & Cerezo,
R. (2022). Lessons learned and future directions of metaTutor: leveraging multichannel data to scaffold self-regu-
lated learning with an intelligent tutoring system. Frontiers in Psychology. https:// doi. org/ 10. 3389/ fpsyg. 2022. 813632
Barrie, S. C. (2007). A conceptual framework for the teaching and learning of generic graduate attributes. Studies in Higher
Education, 32(4), 439–458. https:// doi. org/ 10. 1080/ 03075 07070 14761 00
Bates, T., Cobo, C., Mariño, O., & Wheeler, S. (2020). Can artificial intelligence transform higher education? International
Journal of Educational Technology in Higher Education, 17(1), 42. https:// doi. org/ 10. 1186/ s41239- 020- 00218-x
Bean, J. P., & Eaton, S. B. (2020). A Psychological Model of College Student Retention. https:// api. seman ticsc holar. org/ Corpu
sID: 22493 7248
Bhargava, A., Bester, M., & Bolton, L. (2021). Employees’ perceptions of the implementation of robotics, artificial intelli-
gence, and automation (RAIA) on job satisfaction, job security, and employability. Journal of Technology in Behavioral
Science, 6(1), 106–113. https:// doi. org/ 10. 1007/ s41347- 020- 00153-8
Børte, K., Nesje, K., & Lillejord, S. (2023). Barriers to student active learning in higher education. Teaching in Higher Educa-
tion, 28(3), 597–615. https:// doi. org/ 10. 1080/ 13562 517. 2020. 18397 46
Bozkurt, A., & Sharma, R. (2023). Generative AI and prompt engineering: The art of whispering to let the genie out of the algo-
rithmic world. 18, i–vi. https:// doi. org/ 10. 5281/ zenodo. 81749 41
Brusilovsky, P. (2023). AI in education, learner control, and human-AI collaboration. International Journal of Artificial Intel-
ligence in Education. https:// doi. org/ 10. 1007/ s40593- 023- 00356-z
Byrd, K., Fan, A., Her, E., Liu, Y., Almanza, B., & Leitch, S. (2021). Robot vs human: Expectations, performances and gaps
in off-premise restaurant service modes. International Journal of Contemporary Hospitality Management, 33(11),
3996–4016. https:// doi. org/ 10. 1108/ IJCHM- 07- 2020- 0721
Cabero-Almenara, J., Gutiérrez-Castillo, J. J., Guillén-Gámez, F. D., & Gaete-Bravo, A. F. (2023). Digital Competence of Higher
Education Students as a Predictor of Academic Success. Technology, Knowledge and Learning, 28(2), 683–702. https://
doi. org/ 10. 1007/ s10758- 022- 09624-8
Celik, I., Dindar, M., Muukkonen, H., & Järvelä, S. (2022). The promises and challenges of artificial intelligence for teachers:
A systematic review of research. TechTrends, 66(4), 616–630. https:// doi. org/ 10. 1007/ s11528- 022- 00715-y
Chan, C. K. Y. (2023). A comprehensive AI policy education framework for university teaching and learning. International
Journal of Educational Technology in Higher Education, 20(1), 38. https:// doi. org/ 10. 1186/ s41239- 023- 00408-3
Chong, T., Yu, T., Keeling, D. I., & de Ruyter, K. (2021). AI-chatbots on the services frontline addressing the challenges and
opportunities of agency. Journal of Retailing and Consumer Services, 63, 102735. https:// doi. org/ 10. 1016/j. jretc onser.
2021. 102735
Crisp, G., Palmer, E., Turnbull, D., Nettelbeck, T., Ward, L., LeCouteur, A., Sarris, A., Strelan, P., & Schneider, L. (2009). First year
student expectations: Results from a university-wide student survey. Journal of University Teaching and Learning
Practice, 6(1), 16–32. https:// doi. org/ 10. 53761/1. 6.1.3
Dang, J., & Liu, L. (2022). Implicit theories of the human mind predict competitive and cooperative responses to AI robots.
Computers in Human Behavior, 134, 107300. https:// doi. org/ 10. 1016/j. chb. 2022. 107300
Desaire, H., Chua, A. E., Isom, M., Jarosova, R., & Hua, D. (2023). Distinguishing academic science writing from humans
or ChatGPT with over 99% accuracy using off-the-shelf machine learning tools. Cell Reports Physical Science, 4(6),
101426. https:// doi. org/ 10. 1016/j. xcrp. 2023. 101426
Diakopoulos, N., Friedler, S., Arenas, M. et al. (2016). Principles for accountable algorithms and a social impact statement
for algorithms. FATML . http:// www. fatml. org/ resou rces/ princ iples- for- accou ntable- algor ithms.
Ehlers, U., & Kellermann, S. A. (2019). Future Skills - The Future of Learning and Higher Education. Results of the International Future Skills Delphi Survey.
Einola, K., & Khoreva, V. (2023). Best friend or broken tool? Exploring the co-existence of humans and artificial intelligence
in the workplace ecosystem. Human Resource Management, 62(1), 117–135. https:// doi. org/ 10. 1002/ hrm. 22147
Fauzi, F., Tuhuteru, L., Sampe, F., Ausat, A. M. A., & Hatta, H. R. (2023). Analysing the role of ChatGPT in improving student
productivity in higher education. Journal on Education, 5(4), 14886–14891. https:// doi. org/ 10. 31004/ joe. v5i4. 2563
Fitria, T. N. (2021). QuillBot as an online tool: Students’ alternative in paraphrasing and rewriting of English writing. Englisia:
Journal of Language, Education, and Humanities, 9(1), 183. https:// doi. org/ 10. 22373/ ej. v9i1. 10233
Flores, M. A., Brown, G., Pereira, D., Coutinho, C., Santos, P., & Pinheiro, C. (2020). Portuguese university students’ concep-
tions of assessment: Taking responsibility for achievement. Higher Education, 79(3), 377–394. https:// doi. org/ 10.
1007/ s10734- 019- 00415-2
Floridi, L., Cowls, J., Beltrametti, M., et al. (2018). AI4People—An ethical framework for a good AI Society: Opportunities,
risks, principles, and recommendations. Minds & Machines, 28, 689–707. https:// doi. org/ 10. 1007/ s11023- 018- 9482-5
Fuchs, K. (2023). Exploring the opportunities and challenges of NLP models in higher education: is Chat GPT a blessing or
a curse? Frontiers in Education. https:// doi. org/ 10. 3389/ feduc. 2023. 11666 82
Galante, A. (2020). Pedagogical translanguaging in a multilingual English program in Canada: Student and teacher per-
spectives of challenges. System, 92, 102274. https:// doi. org/ 10. 1016/j. system. 2020. 102274
Gkinko, L., & Elbanna, A. (2023). The appropriation of conversational AI in the workplace: A taxonomy of AI chatbot users.
International Journal of Information Management, 69, 102568. https:// doi. org/ 10. 1016/j. ijinf omgt. 2022. 102568
Heil, J., & Ifenthaler, D. (2023). Online Assessment in Higher Education: A Systematic Review. Online Learning. https:// doi.
org/ 10. 24059/ olj. v27i1. 3398
Henschel, A., Laban, G., & Cross, E. S. (2021). What makes a robot social? A review of social robots from science fiction to a
home or hospital near you. Current Robotics Reports, 2(1), 9–19. https:// doi. org/ 10. 1007/ s43154- 020- 00035-0
Herbert, C., & Dołżycka, J. D. (2022). Personalized avatars without agentic interaction: Do they promote learning per-
formance and sense of self in a teaching context? A pilot study. In A. González-Briones, A. Almeida, A. Fernandez,
A. El Bolock, D. Durães, J. Jordán, & F. Lopes (Eds.), Highlights in practical applications of agents, multi-agent systems,
and complex systems simulation. The PAAMS Collection. PAAMS 2022 (pp. 169–180). Springer. https:// doi. org/ 10. 1007/
978-3- 031- 18697-4_ 14
Hildt, E. (2021). What sort of robots do we want to interact with? Reflecting on the human side of human-artificial intel-
ligence interaction. Frontiers in Computer Science. https:// doi. org/ 10. 3389/ fcomp. 2021. 671012
Houser, M. L. (2004). We don’t need the same things! Recognizing differential expectations of instructor communication
behavior for nontraditional and traditional students. The Journal of Continuing Higher Education, 52(1), 11–24. https://
doi. org/ 10. 1080/ 07377 366. 2004. 10400 271
Ifenthaler, D., & Yau, J.Y.-K. (2020). Utilising learning analytics to support study success in higher education: a sys-
tematic review. Educational Technology Research and Development, 68(4), 1961–1990. https:// doi. org/ 10. 1007/
s11423- 020- 09788-z
Jobin, A., Ienca, M., & Vayena, E. (2019). The global landscape of AI ethics guidelines. Nature Machine Intelligence, 1(9),
389–399. https:// doi. org/ 10. 1038/ s42256- 019- 0088-2
Joksimovic, S., Ifenthaler, D., Marrone, R., De Laat, M., & Siemens, G. (2023). Opportunities of artificial intelligence for sup-
porting complex problem-solving: Findings from a scoping review. Computers and Education: Artificial Intelligence, 4,
100138. https:// doi. org/ 10. 1016/j. caeai. 2023. 100138
Kaplan-Rakowski, R., Grotewold, K., Hartwick, P., & Papin, K. (2023). Generative AI and teachers’ perspectives on its imple-
mentation in education. Journal of Interactive Learning Research, 34(2), 313–338.
Kim, J., Lee, H., & Cho, Y. H. (2022). Learning design to support student-AI collaboration: Perspectives of leading
teachers for AI in education. Education and Information Technologies, 27(5), 6069–6104. https:// doi. org/ 10. 1007/
s10639- 021- 10831-6
Kim, J., Merrill, K., Xu, K., & Sellnow, D. D. (2020). My teacher is a machine: Understanding students’ perceptions of AI
teaching assistants in online education. International Journal of Human-Computer Interaction, 36(20), 1902–1911.
https:// doi. org/ 10. 1080/ 10447 318. 2020. 18012 27
Koltovskaia, S. (2020). Student engagement with automated written corrective feedback (AWCF) provided by Grammarly:
A multiple case study. Assessing Writing, 44, 100450. https:// doi. org/ 10. 1016/j. asw. 2020. 100450
Larasati, P. D., Irawan, A., Anwar, S., Mulya, M. F., Dewi, M. A., & Nurfatima, I. (2022). Chatbot helpdesk design for digital
customer service. Applied Engineering and Technology, 1(3), 138–145. https:// doi. org/ 10. 31763/ aet. v1i3. 684
Latham, A. (2022). Conversational intelligent tutoring systems: The state of the art. In A. E. Smith (Ed.), Women in engineer-
ing and science (pp. 77–101). Springer. https:// doi. org/ 10. 1007/ 978-3- 030- 79092-9_4
Leoste, J., Jõgi, L., Õun, T., Pastor, L., López, S. M. J., & Grauberg, I. (2021). Perceptions about the future of integrating
emerging technologies into higher education—the case of robotics with artificial Intelligence. Computers., 10(9),
110. https:// doi. org/ 10. 3390/ compu ters1 00901 10
Li, C., Zhang, Y., Niu, X., Chen, F., & Zhou, H. (2023). Does artificial intelligence promote or inhibit on-the-job learning?
Human reactions to AI at work. Systems, 11(3), 114. https:// doi. org/ 10. 3390/ syste ms110 30114
Lim, W. M., Gunasekara, A., Pallant, J. L., Pallant, J. I., & Pechenkina, E. (2023). Generative AI and the future of education:
Ragnarök or reformation? A paradoxical perspective from management educators. The International Journal of
Management Education, 21(2), 100790. https:// doi. org/ 10. 1016/j. ijme. 2023. 100790
Lindfors, M., Pettersson, F., & Olofsson, A. D. (2021). Conditions for professional digital competence: The teacher educators’
view. Education Inquiry, 12(4), 390–409. https:// doi. org/ 10. 1080/ 20004 508. 2021. 18909 36
Liu, J., Zhang, L., Wei, B., & Zheng, Q. (2022). Virtual teaching assistants: Technologies, applications and challenges. In
Humanity driven AI (pp. 255–277). Springer International Publishing. https:// doi. org/ 10. 1007/ 978-3- 030- 72188-6_ 13.
Mah, D.-K. (2016). Learning analytics and digital badges: Potential impact on student retention in higher education.
Technology, Knowledge and Learning, 21(3), 285–305. https:// doi. org/ 10. 1007/ s10758- 016- 9286-8
Mah, D.-K., & Ifenthaler, D. (2017). Academic staff perspectives on first-year students’ academic competencies. Journal of
Applied Research in Higher Education, 9(4), 630–640. https:// doi. org/ 10. 1108/ JARHE- 03- 2017- 0023
Mah, D.-K., & Ifenthaler, D. (2018). Students’ perceptions toward academic competencies: The case of German first-year
students. Issues in Educational Research, 28, 120–137.
Martín-Martín, A., Thelwall, M., Orduna-Malea, E., & Delgado López-Cózar, E. (2021). Google Scholar, Microsoft Academic,
Scopus, Dimensions, Web of Science, and OpenCitations’ COCI: A multidisciplinary comparison of coverage via cita-
tions. Scientometrics, 126(1), 871–906. https:// doi. org/ 10. 1007/ s11192- 020- 03690-4
McCarthy, M., & Kuh, G. D. (2006). Are students ready for college? Phi Delta Kappan, 87(9), 664–669. https:// doi. org/ 10.
1177/ 00317 21706 08700 909
Medland, E. (2016). Assessment in higher education: Drivers, barriers and directions for change in the UK. Assessment &
Evaluation in Higher Education, 41(1), 81–96. https:// doi. org/ 10. 1080/ 02602 938. 2014. 982072
Mitsea, E., & Drigas, A. (2019). A journey into the metacognitive learning strategies. International Journal of Online and
Biomedical Engineering (IJOE), 15(14), 4. https:// doi. org/ 10. 3991/ ijoe. v15i14. 11379
Ng, D. T. K., Su, J., & Chu, S. K. W. (2023). Fostering secondary school students’ AI literacy through making AI-driven recy-
cling bins. Education and Information Technologies. https:// doi. org/ 10. 1007/ s10639- 023- 12183-9
Ocaña-Fernández, Y., Valenzuela-Fernández, L. A., & Garro-Aburto, L. L. (2019). Artificial Intelligence and its implications in
higher education. Propósitos y Representaciones. https:// doi. org/ 10. 20511/ pyr20 19. v7n2. 274
Ouyang, F., Wu, M., Zheng, L., Zhang, L., & Jiao, P. (2023). Integration of artificial intelligence performance prediction and
learning analytics to improve student learning in online engineering course. International Journal of Educational
Technology in Higher Education, 20(1), 4. https:// doi. org/ 10. 1186/ s41239- 022- 00372-4
Ouyang, F., Zheng, L., & Jiao, P. (2022). Artificial intelligence in online higher education: A systematic review of empirical
research from 2011 to 2020. Education and Information Technologies, 27(6), 7893–7925. https:// doi. org/ 10. 1007/
s10639- 022- 10925-9
Popenici, S. A. D., & Kerr, S. (2017). Exploring the impact of artificial intelligence on teaching and learning in higher educa-
tion. Research and Practice in Technology Enhanced Learning, 12(1), 22. https:// doi. org/ 10. 1186/ s41039- 017- 0062-8
Rahayu, N. W., Ferdiana, R., & Kusumawardani, S. S. (2023). A systematic review of learning path recommender systems.
Education and Information Technologies, 28(6), 7437–7460. https:// doi. org/ 10. 1007/ s10639- 022- 11460-3
Raisch, S., & Krakowski, S. (2021). Artificial intelligence and management: The automation–augmentation paradox. Acad-
emy of Management Review, 46(1), 192–210. https:// doi. org/ 10. 5465/ amr. 2018. 0072
Roesler, E., Manzey, D., & Onnasch, L. (2021). A meta-analysis on the effectiveness of anthropomorphism in human-robot
interaction. Science Robotics, 6(58), eabj5425. https:// doi. org/ 10. 1126/ sciro botics. abj54 25
Rosseel, Y. (2012). lavaan: An R package for structural equation modeling. Journal of Statistical Software. https:// doi. org/ 10.
18637/ jss. v048. i02
Sailer, M., Schultz-Pernice, F., & Fischer, F. (2021). Contextual facilitators for learning activities involving technology in
higher education: The C♭-model. Computers in Human Behavior, 121, 106794. https:// doi. org/ 10. 1016/j. chb. 2021.
106794
Sanavi, S., & Matt, J. (2022). The influence of the first-year seminar participation on student retention. Journal of Education
and Training Studies, 10(4), 90. https:// doi. org/ 10. 11114/ jets. v10i4. 5669
Seufert, S., & Tarantini, E. (2022). Gestaltung der digitalen Transformation in Schulen: Ein Reifegradmodell für die Berufsbil-
dung. MedienPädagogik: Zeitschrift Für Theorie Und Praxis Der Medienbildung, 49(Schulentwicklung), 301–326. https://
doi. org/ 10. 21240/ mpaed/ 49/ 2022. 07. 15.X
Smith, J. S., & Wertlieb, E. C. (2005). Do first-year college students’ expectations align with their first-year experiences?
NASPA Journal, 42(2), 153–174. https:// doi. org/ 10. 2202/ 1949- 6605. 1470
Strzelecki, A. (2023). To use or not to use ChatGPT in higher education? A study of students’ acceptance and use of tech-
nology. Interactive Learning Environments. https:// doi. org/ 10. 1080/ 10494 820. 2023. 22098 81
Thomas, L. (2002). Student retention in higher education: The role of institutional habitus. Journal of Education Policy,
17(4), 423–442. https:// doi. org/ 10. 1080/ 02680 93021 01402 57
Tinto, V. (1994). Leaving college: Rethinking the causes and cures of student attrition. University of Chicago Press. https://
doi. org/ 10. 7208/ chica go/ 97802 26922 461. 001. 0001
Tsai, Y.-S., Rates, D., Moreno-Marcos, P. M., Muñoz-Merino, P. J., Jivet, I., Scheffel, M., Drachsler, H., Delgado Kloos, C., &
Gašević, D. (2020). Learning analytics in European higher education—Trends and barriers. Computers & Education,
155, 103933. https:// doi. org/ 10. 1016/j. compe du. 2020. 103933
van Pinxteren, M. M. E., Wetzels, R. W. H., Rüger, J., Pluymaekers, M., & Wetzels, M. (2019). Trust in humanoid robots: Implica-
tions for services marketing. Journal of Services Marketing, 33(4), 507–518. https:// doi. org/ 10. 1108/ JSM- 01- 2018- 0045
Venkatesh, V., Morris, M. G., Davis, G. B., & Davis, F. D. (2003). User acceptance of information technology: Toward a unified view. MIS
Quarterly, 27(3), 425–478. https:// doi. org/ 10. 2307/ 30036 540
von der Heyde, M., Goebel, M., Zoerner, D., & Lucke, U. (2023). Integrating AI tools with campus infrastructure to support
the life cycle of study regulations. Proceedings of European University, 95, 332–344.
Vuorikari, R., Kluzer, S., & Punie, Y. (2022). DigComp 2.2, The Digital Competence framework for citizens—With new examples
of knowledge, skills and attitudes. Publications Office of the European Union. https:// doi. org/ 10. 2760/ 115376.
Woschank, M., Rauch, E., & Zsifkovits, H. (2020). A review of further directions for artificial intelligence, machine learning,
and deep learning in smart logistics. Sustainability, 12(9), 3760. https:// doi. org/ 10. 3390/ su120 93760
Yorke, M., & Longden, B. (2008). The first-year experience of higher education in the UK—Final report. The Higher Education
Academy.
Zamfirescu-Pereira, J. D., Wong, R. Y., Hartmann, B., & Yang, Q. (2023). Why Johnny can’t prompt: how non-AI experts try
(and fail) to design LLM prompts. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems,
1–21. https:// doi. org/ 10. 1145/ 35445 48. 35813 88.
Zanzotto, F. M. (2019). Viewpoint: Human-in-the-loop Artificial Intelligence. Journal of Artificial Intelligence Research, 64,
243–252. https:// doi. org/ 10. 1613/ jair.1. 11345
Zawacki-Richter, O., Marín, V. I., Bond, M., & Gouverneur, F. (2019). Systematic review of research on artificial intelligence
applications in higher education – where are the educators? International Journal of Educational Technology in Higher
Education, 16(1), 39. https:// doi. org/ 10. 1186/ s41239- 019- 0171-0
Zhang, C., Schießl, J., Plößl, L., Hofmann, F., & Gläser-Zikuda, M. (2023). Acceptance of artificial intelligence among pre-
service teachers: A multigroup analysis. International Journal of Educational Technology in Higher Education, 20(1), 49.
https:// doi. org/ 10. 1186/ s41239- 023- 00420-7
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Jan Delcker is a post doc researcher at the Chair of Learning, Design and Technology at University of
Mannheim, Germany. Jan’s research focuses on the transformation of educational institutions through the
implementation of digital technology.
Joana Heil is a PhD candidate at the Chair of Learning, Design and Technology at University of Man-
nheim, Germany. The design and development of online assessment, adaptive feedback and learning ana-
lytics are the focus of Joana’s research.
Dirk Ifenthaler is Professor and Chair of Learning, Design and Technology at University of Mannheim,
Germany and UNESCO Deputy Chair on Data Science in Higher Education Learning and Teaching at Curtin
University, Australia. Dirk’s research focuses on the intersection of cognitive psychology, educational tech-
nology, data analytics, and organisational learning.
Sabine Seufert is Professor of Business Education and heads the Institute for Educational Management and
Technologies at the University of St. Gallen. Her research focuses on digital transformation and artificial
intelligence in education.
Lukas Spirgi is a research associate and PhD student at the Institute for Educational Management
and Technologies at the University of St. Gallen. He conducts research in the field of digital transformation
and artificial intelligence in education.