Computers and Education Open 2 (2021) 100055
Available online 17 October 2021
2666-5573/© 2021 The Authors. Published by Elsevier Ltd. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/).
Cheating in the wake of COVID-19: How dangerous is ad-hoc online testing
for academic integrity?
Stefan Janke a,*, Selma C. Rudert b, Änne Petersen a, Tanja M. Fritz c, Martin Daumiller c
a University of Mannheim, School of Social Sciences, Germany
b University of Koblenz and Landau, Germany
c University of Augsburg, Germany
* Corresponding author. E-mail address: stefan.janke@uni-mannheim.de (S. Janke).
https://doi.org/10.1016/j.caeo.2021.100055
Received 26 April 2021; Received in revised form 10 September 2021; Accepted 15 October 2021
ARTICLE INFO
Keywords:
Academic dishonesty
Cheating
Examination
Online testing
ABSTRACT
Worldwide, higher education institutions made quick and often unprepared shifts from on-site to online exam-
ination in 2020 due to the COVID-19 health crisis. This sparked an ongoing debate on whether this development
made it easier for students to cheat. We investigated whether students did indeed cheat more often in online than
in on-site exams and whether the use of online exams was also associated with higher rates of other behaviors
deemed as academic dishonesty. To answer our research questions, we questioned 1608 German students from a
wide variety of higher education institutions about their behavior during the summer semester of 2020. The
participating students reported that they cheated more frequently in online than in on-site exams. Effects on
other measures of academic dishonesty were negligible. These results speak for the notion that the swift
application of ad-hoc online testing during 2020 has led to negative consequences for academic integrity.
The COVID-19 pandemic forced many higher education institutions
across the globe to apply new ways of teaching and testing to mitigate
health risks for instructors and students. Especially the increased reli-
ance on online exams has ignited debates on whether this method of
performance assessment comes with higher risks for academic integrity
than on-site exams, as students supposedly have more opportunities to
cheat [11,22,42]. While prior research on academic dishonesty is
inconclusive regarding the question of whether online testing fosters
dishonest behavior [26], the extensive and sudden shift from on-site to
online testing in the higher education sector in the wake of the pandemic
represents an unparalleled event. More than the previous small steps
towards digitalization, this large-scale change in performance assess-
ment provided a challenge for maintaining academic integrity. Higher
education practitioners often had little time to prepare for the transition
[4]. As a result, educators relied on a range of ad-hoc solutions to testing,
many of which were characterized by low accountability and less than
optimal procedures to monitor students' behavior during the assessment
situation [15]. From the students' perspective, the anticipation of less
accountability for cheating in combination with difficulties to prepare
for the online exams (e.g., due to new family and community obligations
emerging during COVID-19, unfamiliarity with new exam formats, or a
lack of necessary self-regulated learning skills, [42]), might have
increased the appeal to cheat in online exams. In the present work, we
investigate whether students did indeed cheat more during the
COVID-19 pandemic when examined through online exams compared to
on-site exams, and whether a shift in the mode of examination was also
associated with elevated rates of further dishonest behaviors (e.g., more
plagiarism, lying or bribing of university instructors for better grades).
1. Academic dishonesty in virtual environments
Broadly dened, academic dishonesty refers to a set of behaviors that
can be understood as an intentional breaking of academic rules for
personal gain (e.g., plagiarism, lying, and falsications). Such a deni-
tion is best reected in multifaceted models of academic dishonesty (see,
for instance, [7]). However, academic dishonesty can also be dened
more narrowly in terms of singular clear-cut behaviors such as cheating
in exams (e.g., [21]). The terms academic dishonesty and cheating are as
a result sometimes used interchangeably within the literature. Here, we
use the term academic dishonesty or dishonest behavior when referring to
the broader multifaceted construct. In contrast, we use the term cheating
when referring to the more narrowly dened behavior of using unal-
lowed material or unallowed assistance during exams.
Both academic dishonesty in general and cheating during exams in
particular are highly prevalent in higher education institutions. Prior
research has repeatedly shown that the majority of students admit to
having engaged in academic dishonesty during their studies [9,38,52].
This nding has inspired a multitude of research frameworks aiming to
explain which personal and environmental factors can elevate or reduce
cheating rates. Exemplary theories on such factors include social norms
theory, explaining cheating as a result of injunctive and descriptive
norms [5,40], deterrence theory that focuses on how expectations about
punishment impact cheating [14], and cognitive theories focusing on
the impact of neutralization techniques and feelings of entitlement [50].
The present contribution relies on broader theoretical approaches
that aim to integrate several singular factors into more general frameworks
and provide a comprehensive picture of relevant impact factors. Particularly, we
build on models that explain human behavior as a function of outcome ex-
pectations, outcome value, and costs (e.g., [56]). In terms of such models,
academic dishonesty may be understood as a function of students' evaluation
of the potential outcome of dishonest behavior, their perceived ability to
succeed without dishonesty, as well as of the (external and internal) costs tied
to the expected likelihood of potential sanctions [43].
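One illustrative way to formalize this expectancy-value-cost logic (our own shorthand for the models cited above, not a formula taken from them) is

  U(\text{cheating}) \approx p_{\text{undetected}} \cdot V_{\text{outcome}} \; - \; p_{\text{detected}} \cdot C_{\text{sanctions}} \; - \; C_{\text{internal}},

where cheating becomes more attractive the higher the value of the outcome, the lower the perceived detection probability, and the lower the perceived chance of reaching the outcome through honest effort (which lowers the comparison utility of not cheating).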
Under normal conditions (i.e., outside of the pandemic), the imple-
mentation of online exams will likely neither impact students' expectations
to succeed in an exam by regular means nor influence the outcome value of a
good grade in the exam. However, online exams may lower the expected
costs of cheating, given that both faculty members [42,47] as well as students
[16,29] share the belief that it is easier to get away with cheating in online
exams. This could facilitate higher cheating rates in online exams compared
to on-site exams even under normal conditions, that is, without an acceler-
ation of digitalization due to a worldwide health crisis.
Empirically, the number of studies comparing the frequency of
cheating between on-site and online exams is limited to less than a dozen
studies [26]. The results of these studies are inconclusive, with some
demonstrating higher cheating rates [28,30,34] and others showing
steady [33,54] or even lower rates [23,51]. Holden and colleagues [26]
argue that these inconsistencies could partly be due to different oper-
ationalizations of cheating. To this end, previous studies differed in
whether they explicitly focused on cheating in online exams or
compared a multitude of aspects of academic dishonesty depending on
the mode of teaching (online vs. on-site).
Studies that explicitly focused on the impact of the examination mode
on cheating during the exam showed that online exams were associated
with increased cheating compared to on-site exams [28,30]. This could be
due to instructors implementing examination environments that do not
sufciently allow to shape and monitor the physical testing environment.
As a result, students may feel less accountable and less likely to be caught
cheating during online exams. In contrast, studies that focused on the
question whether the mode of teaching impacted wider arrays of
dishonest behavior typically yielded smaller effects [23,54], possibly
because the teaching process itself may not impact students' perception of
accountability. In sum, we assume that the mode of examination impacts
cheating rates to a stronger degree than the shift to online teaching.
Practitioners in higher education have proposed solutions to address
reduced accountability for cheating in online exams. One such solution
is the use of proctoring, that is, ensuring academic integrity with methods
such as live observation via webcam or delayed checks for fraudulent
behavior through recordings. In fact, results from empirical research
indicate that proctoring online exams reduced the inflated performance
rates that can be observed in online exams without proctoring [2,19,46],
while also enhancing the perceived accountability for academic
dishonesty [27]. A further solution to reduce cheating is to make con-
sequences for cheating more salient, for instance by letting students read
and copy warning statements about potential sanctions [53]. Finally,
researchers have promoted changes in the overall design of online exams
by advocating open-ended questions over multiple-choice [6],
open-book exams [57], or even collaborative exams [48]. These changes
make cheating either less feasible (open-ended questions) or even
encourage behaviors that are typically considered as cheating
(collaboration and use of textbooks/internet resources during the exam)
to facilitate deeper processing of the learning material.
However, several factors can deter instructors from using strategies
to mitigate cheating, starting with a lack of knowledge about these
procedures, a lack of motivation to implement them, and also technical
issues such as a lack of access to proctoring software. This was likely a
problem during the onset of the COVID-19 pandemic, when higher ed-
ucation instructors with little experience and restricted time had to
familiarize themselves with online testing [44], while higher education
institutions often still lacked both the technical expertise as well as the
legal foundation for certain methods such as proctoring [13].
1.1. COVID-19 and online exams: ad-hoc solutions as a response to rapid
changes
The COVID-19 pandemic led to a severe disruption of teaching
throughout higher education institutions around the world [37]. Uni-
versity instructors had to find swift responses and often shifted from
on-site to online teaching in a matter of days [17]. This meant that higher
education practitioners were fully occupied by setting up virtual learning
environments, exploring tools they had never used before, and searching
for ways to engage with their students, which was often perceived as a
straining experience [4]. Regarding testing, instructors had to rely on a
wide range of local ad-hoc solutions often bound to official guidelines that
higher education institutions and government agencies had developed
under strong time pressure. While some early adopters that already had
experience with online testing prior to COVID-19 may have coped better
with this transition, it stands to reason that this applied only to a small
minority of institutions and instructors [15].
Differences in approaches to online testing might be further
complicated by political developments and structures, as can be exem-
plied by the situation in the higher education sector of Germany in
2020. Like most of Western Europe, Germany was hit by a spike of
COVID-19 infections during spring 2020. The first response to slow
down the spread of the infections (e.g., regulations regarding social
contacts, lockdown of educational institutions) was coordinated
throughout the country. However, Germany is a decentralized country
with federal states and municipalities having relatively high autonomy.
Thus, as the rst wave of the pandemic decreased, some of those mea-
sures were adapted or changed on a state or even municipal level during
summer 2020. In the higher education sector, this led to vastly different
approaches to teaching and examination between federal states
(Länder) and even between institutions [17]. Some institutions chose
to administer on-site exams (for instance to mitigate risks for academic
integrity). Other institutions focused more strongly on minimizing
health risks and either used a mixture of online and on-site exams or
completely shifted examination procedures to online testing.
The impact of the rapid digital transformations in the higher education
sector on academic integrity is an emerging topic. Initial studies have already shed
some light on this issue. For instance, elevated concerns about cheating
among instructors [42] as well as among students [20], indicative data such
as a spike in internet searches for key terms during testing periods [11], or an
increase in performance compared to prior cohorts (also [11]) can be
interpreted as first evidence of a potential threat to academic integrity. To
supplement this research and advance our understanding of what actually
happened after the swift shift in examination procedures in summer 2020,
we investigate students' academic dishonesty through self-report measures.
A benefit of students' reports is that they directly capture students' experi-
ences, so that no further inferences are needed to conclude that cheating or
other kinds of dishonest behaviors happened.
1.2. Hypotheses on the impact of rapid digitalization of examinations on
academic integrity
We distinguish between three potential hypotheses regarding the
impact of sudden shifts in examination procedures on academic
dishonesty: (1) the unproblematic-digitalization hypothesis, (2) the
selective-behavior-change hypothesis and (3) the strong-threat-to-
integrity hypothesis. The unproblematic-digitalization hypothesis aligns
with research suggesting that online testing is associated with similar or
lower cheating rates than on-site testing [23,33,51,54] as well as the
notion that both on-site and online assessments can be tailored to combat
cheating [2,27]. This hypothesis builds on the assumption that in-
structors' abilities and efforts to uphold academic integrity do not sys-
tematically differ between modes of examination, and consequently
assumes that the shift to online exams does not impair academic integrity.
However, in contrast to a careful development of online testing
procedures, it seems more likely that sudden shifts to online testing may
have led to a rather chaotic implementation of less secure ad-hoc solu-
tions, which in turn reduced instructorsability to detect cheating dur-
ing examination. In terms of overarching theories of cheating, we would
assume that teacherslowered capability and/or motivation to detect
cheating in online exams taken together with studentsassumptions that
cheating is easier in online exams [16] may have reduced anticipated
costs of cheating. This in turn can be expected to lead to higher rates of
cheating in online compared to on-site exams in times of higher uncer-
tainty during the pandemic. One may assume two different scenarios for
this rise in dishonesty: The first scenario is that a shift to less secure
online testing procedures yields higher rates of cheating during tests, but
not in other academic situations. This selective-behavior-change hypothesis
aligns well with the notion of Holden and colleagues [26] that most
empirical studies found online exams to be associated with an increase
in using specic means of cheating during the exam itself, but not with
an increase in more general measures of academic dishonesty.
A second scenario is that the chaotic shift of paradigms during the
pandemic affected instructors and students even more severely: On the
one hand, instructors might have been overwhelmed and absent, and not
been willing or able to invest more time into implementing optimal
testing procedures. On the other hand, students might have experienced
increased fear of failure due to unfamiliar modes of teaching, testing,
and the higher importance of self-regulated learning [24]. If this were
true, students would have both experienced a lower accountability for
any kind of dishonest behavior, paired with a low expectation to succeed
without relying on such behavior. This might have led to a strong threat
to academic integrity that manifested itself in an increase of further
dishonest behavior beyond the scope of cheating in exams (e.g.,
enhanced rates of plagiarism, fabrication in essays, or bribing of uni-
versity instructors for better grades).
Within the present contribution, we aim to shed light on the question
of whether the rapid change from on-site to online testing during the
academic summer semester 2020 (beginning of April to end of
September) in Germany was (a) unproblematic regarding academic
dishonesty in general and cheating in particular, or whether (b) the
introduced online exams were characterized by elevated cheating rates
compared to on-site exams, or (c) whether a change in mode of exami-
nation was associated not only with more cheating but also higher rates
of academic dishonesty overall. Additionally, we investigated whether
experience of an institution with online exams moderated the associa-
tion between mode of examination and academic dishonesty.
2. Material and methods
We conducted a nation-wide online assessment throughout Germany
in November/December 2020 (two months after the end of the summer
semester) in which we contacted students through official mailing lists
of all higher education institutions willing to participate in the study. As
participants were promised full anonymity due to the sensitivity of the
subject matter, we did not assess any information about participants'
specific institution and thus cannot state how many and which different
institutions distributed our survey. However, we asked the participants
in which federal state they were studying and about the type of higher
education institution they were enrolled in (see participants section for
more information). As all sixteen federal states and several types of
higher education institutions were represented, we infer that our sample
is characterized by a strong diversity of educational institutions.
2.1. Participants
In total, 3005 students started filling out the questionnaire. For our an-
alyses, we decided to exclude all students who (a) stopped answering the
questionnaire before filling out any items on academic dishonesty (presented
at the very end of the survey), (b) were not enrolled at university in the prior
semester (summer semester 2020), as they had no chance to commit
dishonest actions in the assessed period, and (c) failed to
correctly answer an attention check that was included in the questionnaire
("please use the option furthest to the right side of the screen to answer this item"),
as this indicated that the participants were not paying attention when
answering the items. Moreover, we excluded students who had not taken any
written exams in the summer semester because the measures assessing
cheating during online examination pertained to written exams. Most of the
students (80.60%) indicated that their performance had been assessed
through some sort of written exam during the summer semester 2020.
Overall, the net sample included 1608 students (Age: M =23.08 years,
SD =3.86 years; Gender: 68.7% female, 0.8% diverse; Study duration: M
=4.41 semesters, SD =2.16 semesters; Bachelor level: 78.5%, Master
level: 12.4%, other study programs including diploma or Staatsexamen
programs: 8.8%). Most students were enrolled at research-focused uni-
versities (41.2%), but also at universities of applied sciences (Fach-
hochschulen; 24.4%), cooperative state universities (Duale
Hochschulen; 29.0%), universities of education (Pädagogische Hochschulen; 2.4%),
distance learning universities (Fernuniversitäten; 0.5%), and other higher
education institutions (2.4%). Most of the
participating students had enrolled in business and economic programs
(31.6%) followed by education (10.8%), mechanical engineering (8.6%),
social management (4.9%), computer science (4.9%), medicine (4.8%),
German studies (4.7%), mathematics (3.7%), and a wide array of further
programs (33 categories <3.0% of the sample). The sample was, thus,
characterized by a broad variety of disciplines and institutions.
The majority of the investigated students (82.5%) indicated that they
were exclusively taught through online courses during the summer se-
mester 2020. Additionally, 16.6% of the participants indicated that their
faculty had employed a mixture of online and on-site teaching (likely
due to semesters disrupted by lockdown measures), whereas only 0.9%
indicated that they were taught exclusively on-site.
2.2. Measures
2.2.1. Mode of examination
Participants indicated whether their performance had been assessed
via exams during the summer semester 2020. If the students indicated
that this was the case, we asked them about the total number of on-site
and/or online exams that they had participated in during the semester.
We used this data to code the mode of examination (1 =on-site only; 2 =
online only; 3 =mixed mode of examination).
2.2.2. Academic dishonesty
All students indicated how frequently they had engaged in deviations
from academic integrity during the summer semester 2020. In total, we
assessed 19 different behaviors with a frequency scale ranging from 1 (never)
to 7 (very frequently). The benefit of using a set of pre-defined behaviors
rather than simply asking students about whether they perceived their
behavior as dishonest or unethical is that students' assessment of whether
certain behaviors are acceptable or not is characterized by low objectivity
and validity [12]. The behaviors assessed within our questionnaire were
largely adopted from pre-existing measures of academic dishonesty [7,39]
and included behaviors such as plagiarism, fabrication, tampering with
learning material, lying, and bribing (see Table 2 for a full depiction). Two of
the items specically aimed at behavior during online courses with live
communication (engaging in non-course related activities while being log-
ged in; letting others log into a course as proxies for one's own participation)
and were only administered to students who had taken online courses during
the summer semester (n =1593 students). Besides these items, we also
measured with three items whether students had assisted other students in
cheating. We did not include these items in our analyses because the
respective behaviors likely constitute a different kind of dishonest behavior
that is focused less on personal gain. For all but the descriptive analyses, we
aggregated all 19 items to a mean score for academic dishonesty (α = 0.79).
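As an illustration of this scoring step, a minimal R sketch is given below (the data frame d and the item names ad_01 to ad_19 are hypothetical placeholders, not the variable names used in our materials):

  library(psych)

  # columns holding the 19 academic dishonesty items (hypothetical names)
  items <- d[, paste0("ad_", sprintf("%02d", 1:19))]

  # internal consistency of the 19-item scale (reported above: alpha = .79)
  psych::alpha(items)

  # frequency measure: person-wise mean of the raw 1-7 ratings
  d$dishonesty_freq <- rowMeans(items, na.rm = TRUE)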
2.2.3. Cheating in exams
Students who had written exams as performance assessments
answered additional questions regarding cheating in exams depending
on whether they were examined on-site (5 additional items; n =1223
students) or online (3 additional items; n =734 students). These items
were once again based on existing measures [1,39]. If students' per-
formance was assessed using both modes of examination, they received
all eight additional items. The items were measured with the same
seven-point-scale that had been used to assess the above-mentioned
general aspects of academic dishonesty. Two items were
content-identical between the two modes of examination and as such
suitable for comparisons. These items were "During on-site/online
exams, I used additional materials or the internet to solve exam ques-
tions without permission to do so" (unallowed assistance) and "During
on-site/online exams, I exchanged ideas about possible answers to exam
questions with others" (direct exchange). We aggregated the two items to
mean scores indicating cheating during examination for further com-
parisons (on-site reliability: ρ = 0.66, online reliability: ρ = 0.76). The
remaining four items that were unique for either testing situation were
not aggregated to any score, but we still analyzed the frequency of those
behaviors to assess their relative importance (see Table 2).
2.2.4. Institutional experience with online exams
Participants indicated whether their university had experience with
online exams with one item ("Did your university use online exams
before the COVID-19 pandemic?"). The students could use three options
to answer the question (1 = not at all; 2 = partly; 3 = exclusively).
However, the number of students indicating that their institutions had
exclusively used online exams was very low (n =7 participants). Thus,
we decided to merge categories 2 and 3 to generate a new dichotomous
measure (0 =no experience; 1 =some experience).
2.3. Analyses
First, we calculated descriptive statistics for single behavioral variables to
shed light on the frequency of the different aspects of academic dishonesty
during the pandemic. In general, the items were strongly skewed with an
average of 79.8% of the respondents using the first scale point (never), 6.8%
the second scale point, and 3.2% the third scale point per item when indi-
cating the frequency of their behavior. This means that the first three of seven
scale points accounted for roughly 90% of the answer pattern. While we still
used the mean scores based on the frequency scale for subsequent analyses
(labeled as frequency measure in the following section), we also investigated
the robustness of our findings by dichotomizing the respective items post-hoc
(0 =never, 1 =at least once) and then adding them into a sum score (labeled as
amount measure in the following section as it indicates the total amount of
different behaviors shown by the participants).
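Continuing the hypothetical R sketch from above, the amount measure could be derived as follows (again, d and ad_01 to ad_19 are placeholder names):

  items <- d[, paste0("ad_", sprintf("%02d", 1:19))]

  # dichotomize each item (0 = never, 1 = at least once) and sum across items,
  # yielding the total number of different dishonest behaviors reported
  d$dishonesty_amount <- rowSums(items > 1, na.rm = TRUE)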
Second, we investigated whether online testing was associated with
higher rates of academic dishonesty in general (full range of 19 behaviors
answered by all participants) compared to on-site testing. To investigate
this question, we conducted an ANOVA in which we analyzed main effects
of the mode of examination on academic dishonesty.
Third, we investigated whether the mode of examination was asso-
ciated with different frequencies of cheating in exams (specific behavior;
2-item measure). As participants' tendency to cheat during exams was
either measured only once (mode of examination = exclusively on-site
or exclusively online) or twice (mode of examination = mixture of
both variants), we relied on linear mixed models using the lme4 package
(Version 1.1.21) [8] and the lmerTest package (Version 3.1.1) [32] in R to estimate
an overall effect of the mode of examination. We included cheating in
exams as the dependent variable, mode of examination as a fixed effect,
and added a random intercept for participants to the model. In an
additional analysis, we included institutional experience with online
exams to investigate whether the effect of mode of examination was
moderated by this variable. Furthermore, we ran a repeated measure
ANCOVA to investigate whether we could find any effects of the mode of
examination among students who had been examined with both testing
procedures, indicating a shift in behavior between testing situations. In
this analysis, we additionally controlled for the ratio of online to on-site
exams to account for the possibility that the dominance of one test form
led to elevated rates of cheating. Note that this control for test ratio was
not possible in the previously conducted mixed model analysis as the
number of online exams would have been confounded with the respec-
tive mode of examination.
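A minimal R sketch of this mixed model is given below (assuming a long-format data frame named long with one row per student-by-examination-mode observation and hypothetical columns id, mode, cheating, and experience; this illustrates the described analysis and is not our original script):

  library(lme4)
  library(lmerTest)  # adds F tests with Satterthwaite degrees of freedom

  # cheating in exams predicted by mode of examination (fixed effect),
  # with a random intercept per participant
  fit <- lmer(cheating ~ mode + (1 | id), data = long)
  anova(fit)  # overall test of the mode-of-examination effect

  # moderation by institutional experience with online exams
  fit_mod <- lmer(cheating ~ mode * experience + (1 | id), data = long)
  anova(fit_mod)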
3. Results
3.1. Descriptives
In absolute numbers, 874 of the participants reported that they took
only on-site exams, while 385 of the participants reported that they took
only online exams during the summer semester 2020. An additional
number of 349 participants reported that they took both online and on-
site exams. This shows the wide variety of solutions that had been
implemented by higher education institutions during the first months of
the pandemic. We provide further descriptive statistics for the three
groups (characterized by the respective examination mode) in Table 1.
The number of exams differed significantly based on the mode of ex-
amination. Overall, students in the online-only group took fewer exams
than students who were only tested on-site, t(811.13) = 8.05, p < .001,
d = 0.47 (M_on-site = 4.02 tests, SD = 2.36; M_online = 2.93 tests, SD = 2.13).
Furthermore, students who took both types of exams also took more on-
site than online exams, F(1, 348) = 26.54, p < .001, η² = 0.07
(M_on-site = 2.75 tests, SD = 1.98; M_online = 2.02 tests, SD = 1.35).
Overall, 90.0% of the students indicated that their institution had not
used online exams prior to the pandemic at all and 9.7% indicated that
their institution had at least partly or in rare cases exclusively used
online exams. Interestingly, whether an institution had used online
exams in the past was not predictive for whether participants took online
exams during summer 2020, χ²(2) = 4.05, p = .132.
Table 1
Descriptive statistics for the different modes of examination.

General Descriptive Statistics
Sample size: on-site exams n = 874; online exams n = 385; mixed testing n = 349
% female: on-site 65.3%; online 78.0%; mixed 67.3%
Age: on-site M = 22.90, SD = 3.32; online M = 23.51, SD = 4.54; mixed M = 23.07, SD = 4.26
Number of semesters: on-site M = 4.52, SD = 2.31; online M = 4.08, SD = 1.78; mixed M = 4.50, SD = 2.10
Number of on-site exams: on-site M = 4.02, SD = 2.36; mixed M = 2.75, SD = 1.98
Number of online exams: online M = 2.93, SD = 2.13; mixed M = 2.02, SD = 1.35

Academic Dishonesty
Frequency of academic dishonesty: on-site M = 1.43, SD = 0.42; online M = 1.51, SD = 0.49; mixed M = 1.47, SD = 0.49
Frequency of cheating in on-site exams: on-site M = 1.33, SD = 0.81; mixed M = 1.65, SD = 1.07
Frequency of cheating in online exams: online M = 2.57, SD = 1.81; mixed M = 2.43, SD = 1.78
More specifically, 87.2% of the students who took online exams only or a mix of online and
in-person exams indicated that their institutions had not applied such
assessment procedures in prior semesters.
Detailed frequencies of academic dishonesty in our sample can be
found in Table 2. It is noteworthy that three out of five of the most
frequently reported behaviors (M >2.00) were directly tied to digital
learning and online exams (i.e., absenteeism in online classes while being
logged in, exchange with others during online tests, use of unallowed
materials during online tests). While the mean scores indicate low fre-
quencies in academic dishonesty, we also found that students showed a
substantial number of different dishonest behaviors. Overall, only 4.9% of
the participants reported that they had not engaged in any of the inves-
tigated behaviors during the summer semester 2020 at all, whereas 12.4%
indicated that they had engaged in only one of the behaviors. In contrast,
the vast majority of participants (82.7%) indicated that they had engaged
in multiple behaviors during the time in question (M =4.79 behaviors, SD
=3.81 behaviors). As particularly some of the more common behaviors
might be considered as minor infractions or less severe (e.g., engaging in
other activities during course time), we discarded the five most frequent
behaviors with M_frequency > 2.00 in an exploratory fashion. Still, only
28.6% of the students indicated that they had not engaged in any of the
remaining, more severe, behaviors during the past semester and another
18.2% indicated that they had engaged in only one of the behaviors. In
contrast, the majority of the participants (53.2%) indicated that they had
engaged in multiple remaining behaviors during the time in question (M
=2.60 behaviors, SD =3.09 behaviors).
3.2. Impact of examination mode on academic dishonesty
We found very small effects of the mode of examination on the fre-
quency measure of academic dishonesty, F(2, 1605) = 4.46, p = .012,
η² = 0.006. Paired post-hoc tests revealed that this main effect of exami-
nation mode was qualified through significant mean differences be-
tween pure online testing (M = 1.51, SD = 0.49) and on-site testing
(M = 1.43, SD = 0.42), t(1257) = −3.00, p = .003, d = −0.18. In contrast,
the rate of dishonesty for mixed testing (M = 1.47; SD = 0.49) neither
differed significantly from online testing, t(732) = 1.17, p = .243, nor
from on-site testing, t(1221) = −1.41, p = .158. However, neither the
overall effect, F(2, 1605) = 1.82, p = .162, nor a post-hoc test for online
versus on-site testing, t(1257) = −1.76, p = .078, reached significance
when using the amount measure for academic dishonesty.
3.3. Impact of mode of examination on cheating in exams
The descriptive data shows trends for the postulated difference be-
tween modes of examination. More specifically, 31.7% of all students
who had written on-site exams indicated that they had used unallowed
assistance and/or engaged in direct exchange with other students during
the assessment. For online testing, the number of persons engaging in
these behaviors was almost twice as high with 61.4% reporting that they
had engaged in either behavior.
These descriptive trends were qualified through mixed-model ana-
lyses showing that participants more frequently reported cheating in
online tests (M_frequency = 2.22, SD = 1.71) than in offline tests
(M_frequency = 1.50, SD = 0.99), F(1, 1193) = 62.72, p < .001 (see also Fig. 1). Using
an approximation of Cohen's d as suggested by Westfall et al. [55], we
calculated an overall effect size of d = 0.55 for the effect of mode of
examination on cheating frequency. These effects were robust when
using the amount measure for cheating in exams, F(1, 1162.7) = 68.78,
p < .001, d = 0.39. The effect still remained stable when including the
institution's experience with online exams as a potential moderator. We
neither found a significant main effect of institutions' experience with
online exams, F(1, 1643.8) = 0.09, p = .766, nor a significant interaction
effect, F(1, 1193) = 0.51, p = .476. The same was true when using the
amount measure.
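For readers who want to reproduce this type of effect size, one common reading of the Westfall et al. [55] approach divides the fixed-effect difference by the square root of the sum of all variance components of the mixed model. A minimal R sketch based on the hypothetical model object fit from above (the coefficient name depends on the factor coding and is a placeholder here):

  vc <- as.data.frame(VarCorr(fit))  # participant and residual variance components
  d_approx <- fixef(fit)["modeonline"] / sqrt(sum(vc$vcov))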
Finally, a repeated-measure ANOVA also showed that students who
had been tested using both modes of examination reported a higher
frequency of cheating in online exams (M_frequency = 2.43, SD = 1.79)
compared to on-site exams (M_frequency = 1.66, SD = 1.08), F(1, 327) =
92.54, p < .001, η² = 0.22 (see also Fig. 1). The effect remained stable
when using the amount measure for cheating in exams, F(1, 327) = 30.89,
p < .001, η² = 0.09. Controlling for the ratio between online and on-site
exams did not change the pattern of the results substantially, F(1, 326) =
94.64, p < .001, η² = 0.23 (for the frequency measure); F(1, 326) = 27.61,
p < .001, η² = 0.08 (for the amount measure).
Table 2
Prevalence of academic dishonesty, sorted by relative frequency.

Academic Dishonesty
Logging into an online course and engaging in other activities during course time. M = 4.45, SD = 2.04, 88.1%
Solving tasks together with other students that were meant as individual assignments. M = 2.87, SD = 1.91, 63.7%
Referencing sources in term papers that one has not read. M = 1.68, SD = 1.31, 29.8%
Copying content from the internet, a book, or an article without naming the source. M = 1.50, SD = 1.01, 26.4%
Deliberately listing sources in the bibliography that were not used in writing the text. M = 1.44, SD = 1.10, 20.2%
Copying entire passages from other sources when writing term papers without indicating through proper references. M = 1.37, SD = 0.94, 19.5%
Having others do individual assignments and handing them in as own work. M = 1.41, SD = 1.05, 19.4%
Making up excuses for missing a deadline. M = 1.32, SD = 0.99, 13.5%
Modifying information from scientific sources so that they better fit one's line of argumentation. M = 1.23, SD = 0.71, 12.7%
Letting someone else sign a course attendance sheet to cover up not being present in the course. M = 1.29, SD = 0.95, 11.3%
Convincing someone else to log into an online course under a false alias to mimic participation. M = 1.19, SD = 0.83, 6.7%
Submitting the same work as a learning assignment for different courses. M = 1.13, SD = 0.63, 6.0%
Handing in or presenting an entire piece of work of another person as own work. M = 1.08, SD = 0.48, 4.0%
Deliberately trying to manipulate an instructor through display of emotions (e.g., crying) to get deadline extensions or better grades. M = 1.07, SD = 0.45, 3.5%
Changing a response after a performance assessment was graded, then reporting that the assessment had been misgraded and requesting credit for the altered response. M = 1.06, SD = 0.47, 2.3%
Trying to bribe an instructor to get deadline extensions or better grades. M = 1.05, SD = 0.39, 2.1%
Paying others to do one's own learning assignments. M = 1.05, SD = 0.40, 2.1%
Hiding or damaging books in the library to prevent others from using them. M = 1.03, SD = 0.31, 1.4%
Deleting parts of online resources to prevent others from using them. M = 1.02, SD = 0.29, 1.1%

Cheating during on-site exams (only participants with at least one on-site exam)
Using previous exams for preparation without permission to do so. M = 2.56, SD = 2.13, 44.5%
Exchanging ideas with others about possible answers during an examination.* M = 1.55, SD = 1.20, 23.7%
Copying answers from someone else during an examination. M = 1.38, SD = 0.96, 18.7%
Solving exam questions by using additional materials or the internet without permission.* M = 1.45, SD = 1.13, 18.4%
Using cheat sheets during an examination. M = 1.41, SD = 1.11, 16.5%

Cheating during online exams (only participants with at least one online exam)
Solving exam questions by using additional materials or the internet without permission.* M = 2.51, SD = 1.98, 48.6%
Exchanging ideas with others about possible answers during an examination.* M = 2.50, SD = 2.02, 45.9%
Faking technical difficulties during examinations to gain an advantage (e.g., more time, repetition of the exam). M = 1.12, SD = 0.63, 4.5%

Notes. Scales ranged from 1 (never) to 7 (very frequently). Time of reference was the summer semester of 2020. The percentage indicates the relative frequency of students who indicated that they had engaged in the behavior by answering the item with a value >1. * Items were used to compare cheating during in-person and online tests.
4. Discussion
Overall, our ndings indicate that the sudden shifts from on-site to
online testing in German higher education institutions during the
COVID-19 pandemic in summer 2020 may have posed at least some
threat to academic integrity. Students reported elevated rates of cheat-
ing in online exams compared to on-site exams. Furthermore, students
whose performance was tested through both modes of examination re-
ported cheating more frequently during online exams despite having
taken a lower number of online than on-site exams (with the average
difference being around one exam). Importantly, this finding contradicts
the alternative explanation that our results might be merely due to
differences between institutional cultures or students having more op-
portunities to cheat in online exams due to a higher number of online
exams. In fact, as students took more on-site exams and thus had
objectively more opportunities to cheat on-site, our results can be
deemed as a rather conservative test. However, beyond these differences
in cheating frequency during testing, a shift in the mode of examination
to online testing during the pandemic did not seem to drastically in-
crease the number of dishonest behaviors students engaged in, although
it slightly increased the frequency of academic dishonesty. With that
being said, we found that a high rate of the participating students re-
ported to have engaged in behavior that can be labeled as academic
dishonesty. More specically, between 71.4% to 93.8% of the partici-
pating students (depending on whether we included the most frequent
behaviors) reported that they had engaged in behaviors that infringe
rules of academic integrity during this single critical semester. At this
point, we cannot say whether these high rates are due to the specifics of
our sample, are typical for Germany (comparative data are lacking), or
reflect a more general threat to academic integrity in the wake of the
pandemic. It is noteworthy though, that most of the prevalent critical
behaviors were directly bound to online teaching or online exams.
4.1. Theoretical implications
Our ndings provide further cumulative evidence for the debate on
whether online exams pose a risk for academic integrity. More specif-
ically, we found that the implementation of online exams that likely had
to be prepared in a short timeframe during summer 2020 was associated
with increased cheating behavior in German higher education in-
stitutions (compared to on-site exams). First and foremost, this is a
strong warning sign that (an ad-hoc shift to) online testing can make it
difcult to uphold principles of academic integrity. Nevertheless, we
found that the threat to academic integrity was mostly bound to the
testing situation, while effects on other forms of academic dishonesty
during the semester were small. This, in turn, speaks against the premise
that ad-hoc online testing fosters a climate of dishonesty. Our study,
thus, underlines the notion that ad-hoc online testing mostly affects
behavior in the testing situation, while effects on the overall climate and
broader measures of dishonesty are likely more negligible (in line with
[26]). This also means that it is important to rely on measures tailored to
assess behavior in the concrete situation rather than on omnibus mea-
sures of academic dishonesty when investigating the impact of modes of
examination on academic integrity.
On a more practical note, we would like to highlight that the par-
ticipants indicated that their institutions had little experience with on-
line exams. While we did not nd that the experience of an institution
moderated the effect of the mode of examination on cheating, the
number of participants from institutions with prior experience with
online exams was rather small in all groups (accounting only for 12.8%
of all students that were tested through online exams). It should be
noted, though, that students might not be aware of all examination
modes that are practiced at their university, thus, assessing institutional
experience with a more objective measure might be commendable in
future studies. However, even institutions or instructors that had already
used online exams in the past were likely unprepared for the challenges
evolving in the wake of a global pandemic. The simultaneously
emerging needs to adapt teaching, communication, and examination
were a strong challenge for instructors and administration alike that was
often experienced as straining and distressful [4]. In such a climate,
maintaining academic integrity likely often became a secondary priority
compared to maintaining a minimum of instruction and managing
limited resources.
At the same time, students were likely highly stressed as well, among
other things due to being exposed to a multitude of novel online
teaching and examination methods and to unclear communication from
educators and universities. This may have decreased expectations to
succeed in exams, which could explain further why students considered
cheating as viable behavior under conditions of low accountability (i.e.,
during online exams). Related to this, we would like to emphasize that
we do not mean to imply that "helpless" educators were victims of
opportunistic students during the COVID-19 pandemic. Instead, the
elevated rates of cheating were probably the result of a multitude of
factors that increased stress on both sides due to the demands of a highly
unpredictable situation, in combination with impaired accountability in
swiftly implemented ad-hoc online examinations.
4.2. Potential avenues to reduce cheating rates in online exams
While our research provides important empirical insights into the
prevalence of cheating in ad-hoc online exams, it is not suitable to
provide evidence on the impact of different practical solutions for the
presented problem. However, existing research may inspire at least
three promising avenues to address the issue: Increasing accountability
through proctoring, applying examination modes that reward deep un-
derstanding, and shifting teaching from a focus on performance to a
stronger focus on learning.
First, increasing accountability is a method to counteract perceptions
that cheating online has a high likelihood of remaining undetected.
Particularly, proctoring online exams could mitigate the effects that we
found in our sample [2,3,27]. However, at this point, most higher ed-
ucation institutions (in general and likely also in our sample) still lack
experience, equipment, or clear guidelines to use proctoring in online
exams [13]. Closing this gap in testing methodology could thus be an
important step forward for higher education institutions around the
world as online exams are increasingly used for performance assess-
ment, particularly as the pandemic continues.
Fig. 1. Frequency of cheating in exams depending on examination mode.
Note. Scale ranged from 1 (never) to 7 (very frequently). The left side of the figure
indicates the overall effect derived from the mixed-model analysis (full sample),
while the right side indicates the effect derived from a repeated-measure
ANCOVA for those students who had taken both online and on-site exams
(mixed testing group; n = 349 students).

Second, it is important to clarify that the perceived feasibility of
cheating not only depends on accountability, but also on the complexity
of the exam. Recent laboratory research has shown that the way per-
formance is evaluated in a performance test can impact cheating rates.
More specically, students were less likely to cheat in a performance test
if they assumed that the instructor would later grade the process of
solving the question rather than just the correct results. This was the case
even though good performance was incentivized through monetary re-
wards and the responsible examiner left the room for a prolonged time
during the experiment [6]. The observed reduction of cheating likely
reflects that correctly copying a consistent work process is more diffi-
cult than just copying the answers from other sources (such as fellow
students' answers). For practitioners, this could mean that to decrease
cheating, it might be beneficial to rely on exams that are more strongly
characterized by open questions that require the students to truly reflect
about the subject matter and show through their writing that they have
gained a deep understanding of the learning contents. In contrast, closed
questions, multiple choice items, and simple reproduction of knowledge
may make it easier to evaluate whether a question is answered right or
wrong, but could also be more susceptible to fraud.
Third, a substantial body of research has shown that students who
have developed strong learning goals (defined as the striving for
competence development) are less likely to cheat in exams (subsumed in
a recent meta-analysis, [31]). This is probably because students with a
strong learning focus consider cheating as a costly shortcut that un-
dermines true understanding. Fostering learning goals in students makes
it necessary to provide instruction that takes students' interests and
needs into account. Particularly, research has shown that giving students
autonomy over their learning process, providing valuable feedback and
recognition as well as a collaborative learning climate can evoke
learning goals in students [36]. It should be noted that enacting such
learning structures might be challenging but that it might also be an
effective way to yield lower cheating rates regardless of the mode of
examination.
Finally, while we note an increase in infringement against academic
rules, those rules themselves can be subject to debate. For instance,
universities and instructors could allow students to use older exams for
preparation, as this material may foster a deeper understanding of the
learning subject. In the same vein, one might argue that the ability to
memorize information is less valuable than knowing how to research
and integrate information. As such, open-book online exams may make it
possible to lift the ban on using helpful material during examination, which could
foster deep processing of the learning content over surface learning [57].
Similar arguments could be made for collaborative exams that allow
students to foster their abilities to cooperate with others and to explore
the learning content together [48]. While both open-book exams and
collaborative exams may be less feasible on-site due to constraints of
the examination situation (necessary space, availability of technology,
level of noise due to cooperation), the feasibility of such examination
procedures could become a true asset of wide(r) spread online exami-
nation procedures. In sum, we argue that uncovering deviations from
academic norms does not necessarily mean that universities have to
further restrict examination situations to uphold these norms but could
also spark debate about the meaningfulness of some of these
norms.
4.3. Limitations and suggestions for future research
Our data strongly reect an unprecedented situation shaping a spe-
cial learning environment that challenged instructors [4] and students
[24,35] alike. This is both a strength and a limitation of our observations
and ndings. While our study can contribute to understanding some of
the challenges that the rapid digitalization in the wake of COVID-19
imposed on educational institutions, the results also reect this special
situation. For instance, it was striking that such a high rate of students
admitted having engaged in some sort of critical behavior during a
single semester. This might partly reect that students struggled with
the demands of a new virtual learning environment such as the need for
digital literacy and the ability to structure and organize one's learning
process in a more self-regulated way than before the pandemic [24]. As a
result, even students who normally might not have engaged in academic
dishonesty, may have been more inclined to do so under the special
circumstances. This may have been especially true when online exams
were used as an examination method due to (anticipated) reduced
accountability.
Additionally, it is possible that instructors were ambiguous in their
communication of academic standards regarding cheating in online
environments, as the distressing situation during the pandemic may
have impeded the development of a clear communication strategy.
Ambiguous communication and standards that varied between in-
structors, however, might have enhanced students' inclination to cheat
in online environments, as students typically rely on the information
given by faculty to evaluate whether their behavior is justifiable [29,
45]. In sum, the special situation during the onset of the pandemic could
have fostered insecurities in students as well as instructors, which could
have led to elevated cheating rates in online environments. Whether a
sudden shift to online testing under normal conditions (in-person
teaching, no existential threat bound to a deadly global pandemic)
evokes such a spike in cheating behavior as observed in this study must
be further tested in future studies.
As we relied on self-reports and due to academic dishonesty being
considered socially undesirable, we assume that the behavior reported
by the students represents an underestimation of the actual magnitude
of academic dishonesty [10]. Besides social desirability, one may argue
that students acting dishonestly in exams may also be more likely to lie
in online surveys. However, dishonest behavior is typically directed
towards potential gains that are bound to the respective context. This is
why situational factors strongly influence dishonest behavior (see [39];
also [43]). While (successful) dishonest behavior has direct positive
consequences in the context of examination (i.e., good grades), this does
not apply to an anonymous online survey in which dishonesty does not
result in substantial benets. The high reported rates of dishonest
behavior that we observed in our study further underline this notion, as
we have little reason to believe that students at large were inclined to
exaggerate their engagement in socially unaccepted behaviors.
However, discussing alternate measures for academic cheating may
be helpful to get a better understanding on the value and limitations of
using self-reports for research into dishonest behavior. In this regard,
some researchers have relied on more indirect measures (e.g., concerns
of instructors, [42]; increase in performance compared to prior cohorts,
[11]). While undoubtedly insightful, these alternative assessments of
cheating have in common that they rely on additional inferences, for
instance that elevated concerns by educators are indeed valid or that
better performance must be at least partly rooted in dishonest behavior.
There is a plethora of factors like the credibility of instructors' judg-
ments about cheating or changes in the difficulty of exams during the
pandemic that complicate such inferences. If individuals admit to
cheating in an exam via self-reports, however, additional inferences are
not needed. Furthermore, even though social desirability bias may
impact the observed cheating rates, it is unlikely that this bias would
impact reports of cheating in on-site versus online exams differently.
Importantly, we assessed whether students had engaged in actual
behavior (retrospective admission of critical behavior) rather than
measuring their personal willingness to do so (prospective inference of
future behavior) because it is yet unclear how the latter translates into
behavior (see [25,49] for diverging positions).
An alternative assessment of cheating would be observations of
actual cheating behavior in the situation. This, however, comes with the
caveat of additional ethical and practical considerations regarding the
feasibility of directly observing cheating without informing student ser-
vices and without having the observation interfere with students'
behavior (and thus breaching anonymity as well as increasing
accountability). This often limits observation of cheating to laboratory
research which allows for deception of participants under avoidance of
ethical pitfalls (i.e., [5]). Laboratory research, in turn, is limited
regarding external validity, as the experimental tasks that participants
work on hardly have the same relevance as their actual exams. Taken
together, this underscores a) the value of data from student self-reports,
which should b) always be interpreted in a larger research framework
that makes use of a multitude of designs to compensate for the limita-
tions of this measure.
Even though our sample is not representative, it is unlikely that our
sampling strategy resulted in a sample of students who are more prone to
dishonesty than their peers. Particularly, we did not disclose that we
were specically interested in dishonesty when advertising the study
and used a systematic recruitment procedure through official mailing
lists. Additionally, empirical research has shown that voluntary survey
studies are more likely to lead to underestimations of cheating rates
[41]. Against this background, we consider the rather high rates of ac-
ademic dishonesty in our sample to be concerning, as a majority of the
participating students reported that they had infringed rules of academic
integrity over the course of only one semester. While this may not be a
direct effect of a shift to online testing (non-signicant effect on number
of behaviors), the elevated rates could reect a shift in culture due to
more absent and overwhelmed instructors and unclear rules in the
hastily assembled online learning environments. Unfortunately, we
cannot test this assumption as we lack a comparison group indicating
academic dishonesty during a regular semester. Future studies might
thus consider comparing COVID-19 data on academic dishonesty (as
presented in this study) with rates of academic dishonesty in the up-
coming years. Furthermore, we think it is important to qualify our
ndings across different countries that chose to enact large-scale digi-
talization in their higher education sector to infer whether our results
can be generalized.
Another issue with the frequency-based approach to measuring dishonest behavior was that the judgment of the relative frequency of the different behaviors was made by the students themselves, which makes it more prone to subjective interpretation. The same cannot be said for the analyses with the amount measures, as the judgment of whether one engaged in a certain behavior at all is likely far less biased. The general magnitude of the effect sizes, and the fact that we found the association between mode of examination and cheating on both the frequency measure and the amount measure, as well as for students whose performance was examined through both online and on-site exams, enhances our confidence in the robustness of our findings.
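To illustrate the within-student comparison described above, the following minimal R sketch shows how self-reported cheating could in principle be related to examination mode with random intercepts per student, using the lme4 and lmerTest packages cited in the reference list [8,32]. This is an illustrative sketch only, not our actual analysis script; the data frame and variable names (exam_data, cheat_freq, exam_mode, student) are hypothetical placeholders.

# Illustrative sketch under assumed data structure: one row per student and
# examination mode, with cheat_freq (self-reported cheating frequency rating),
# exam_mode (factor: "on-site" vs. "online"), and student (participant ID).
library(lme4)      # linear mixed-effects models [8]
library(lmerTest)  # Satterthwaite tests for fixed effects [32]

model <- lmer(cheat_freq ~ exam_mode + (1 | student), data = exam_data)
summary(model)     # the fixed effect of exam_mode captures the online vs. on-site difference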
Finally, it is important to note that the present research focused strongly on the impact of examination mode on academic dishonesty. Researchers have postulated a multitude of additional factors bound to the individual (e.g., achievement motivation, [43], or personality, [25]) and to the situation (e.g., social norms, [40]) that impact dishonest behavior (see [18] for an overview). These factors may at least partly explain the observed associations between examination mode and cheating (as we discussed, for instance, regarding students' expectations of reduced costs for cheating in online examinations; in line with [16,29]). It is an important avenue for future research to investigate whether additional personal and situational factors discussed in the literature mediate or moderate the impact of examination mode on academic cheating.
5. Conclusions
The worldwide COVID-19 pandemic has posed severe challenges for societies across the globe. One of these emerging challenges was the need for rapid digitalization of all life domains, including education. Our study shows that the resulting pressure on the education system also put a strain on academic integrity. We found high rates of self-admitted academic dishonesty reported by students for the summer semester of 2020. Moreover, the rate of cheating in exams during this time depended on the examination mode, with online exams being more prone to the use of unpermitted assistance than on-site exams. An upside to these findings is that they challenge higher education institutions to find answers to challenges that were already emerging and were merely accelerated by the pandemic. Online testing will likely remain relevant and even gain in importance. This development carries both dangers and opportunities for performance assessment in higher education. The broad implementation and rigorous testing of procedures that mitigate cheating through higher accountability or complexity, as well as the discussion of academic norms that may hinder the implementation of new and innovative modes of examination (e.g., open-book and collaborative exams), is therefore an important task for educational research and practice.
Declaration of Competing Interest
The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
Acknowledgments
The presented research was made possible through a research grant by the German Research Foundation to Stefan Janke (JA 3137/11) and Martin Daumiller (DA 2392/11). We thank Elisabeth Limberg, Danielle Schrepfer as well as Paula Schmelzer for their assistance with programming and data collection and Caroline Tremble for her valuable services in language editing.
References
[1] Akbulut Y, Şendağ S, Birinci G, Kılıçer K, Şahin MC, Odabaşı HF. Exploring the types and reasons of Internet-triggered academic dishonesty among Turkish undergraduate students: development of Internet-Triggered Academic Dishonesty Scale (ITADS). Comput Educ 2008;51(1):463–73. https://doi.org/10.1016/j.compedu.2007.06.003.
[2] Alessio HM, Malay N, Maurer K, Bailer AJ, Rubin B. Examining the effect of proctoring on online test scores. Online Learn 2017;21(1):146–61.
[3] Arnold IJ. Cheating at online formative tests: does it pay off? Internet High Educ 2016;29:98–106. https://doi.org/10.1016/j.iheduc.2016.02.001.
[4] Daumiller M, Rinas R, Hein J, Janke S, Dickhäuser O, Dresel M. Shifting from face-to-face to online teaching during COVID-19: the role of university faculty achievement goals for attitudes towards this sudden change, and their relevance for burnout/engagement and student evaluations of teaching quality. Comput Hum Behav 2021;118:106677. https://doi.org/10.1016/j.chb.2020.106677.
[5] Daumiller M, Janke S. Effects of performance goals and social norms on academic dishonesty in a test. Br J Educ Psychol 2020;90(2):537–59. https://doi.org/10.1111/bjep.12310.
[6] Daumiller M, Janke S. The impact of performance goals on cheating depends on how performance is evaluated. AERA Open 2019;5(4):2332858419894276. https://doi.org/10.1177/2332858419894276.
[7] Bashir H, Bala R. Development and validation of academic dishonesty scale (ADS): presenting a multidimensional scale. Int J Instr 2018;11(2):57–74. https://doi.org/10.12973/iji.2018.1125a.
[8] Bates D, Maechler M, Bolker B, Walker S. Fitting linear mixed-effects models using lme4. J Stat Softw 2015;67(1):1–48. https://doi.org/10.18637/jss.v067.i01.
[9] Bernardi RA, Metzger RL, Bruno RGS, Hoogkamp MAW, Reyes LE, Barnaby GH. Examining the decision process of students' cheating behavior: an empirical study. J Bus Eth 2004;50(4):397–414. https://doi.org/10.1023/B:BUSI.0000025039.47788.c2.
[10] Bernardi RA, Adamaitis KL. Data contamination by social desirability response bias: an international study of students' cheating behavior. Res Prof Responsib Eth Account 2007;11:157–84. https://doi.org/10.1016/S1574-0765(06)11008-0.
[11] Bilen E, Matros A. Online cheating amid COVID-19. J Econ Behav Org 2020;182:196–211. https://doi.org/10.1016/j.jebo.2020.12.004.
[12] Burrus RT, McGoldrick K, Schuhmann PW. Self-reports of student cheating: does a definition of cheating matter? J Econ Educ 2007;38(1):3–16. https://doi.org/10.3200/JECE.38.1.3-17.
[13] Butler-Henderson K, Crawford J. A systematic review of online examinations: a pedagogical innovation for scalable authentication and integrity. Comput Educ 2020. https://doi.org/10.1016/j.compedu.2020.104024. Advance online publication.
[14] Chirikov I, Shmeleva E, Loyalka P. The role of faculty in reducing academic dishonesty among engineering students. Stud High Educ 2020;45(12):2464–80. https://doi.org/10.1080/03075079.2019.1616169.
[15] Clark TM, Callam CS, Paul NM, Stoltzfus MW, Turner D. Testing in the time of COVID-19: a sudden transition to unproctored online exams. J Chem Educ 2020;97(9):3413–7. https://doi.org/10.1021/acs.jchemed.0c00546.
[16] Costley J. Student perceptions of academic dishonesty at a cyber-university in South Korea. J Acad Eth 2019;17(2):205–17. https://doi.org/10.1007/s10805-018-9318-1.
[17] Crawford J, Butler-Henderson K, Rudolph J, Malkawi B, Glowatz M, Burton R, Magni P, Lam S. COVID-19: 20 countries' higher education intra-period digital pedagogy responses. J Appl Learn Teach 2020;3(1):1–20. https://doi.org/10.37074/jalt.2020.3.1.7.
[18] Crown DF, Spiller SM. Learning from the literature on collegiate cheating: a review of empirical research. J Bus Eth 1998;17:683–700. https://doi.org/10.1023/A:1017903001888.
[19] Daffin LW, Jones AA. Comparing student performance on proctored and non-proctored exams in online psychology courses. Online Learn 2018;22(1):131–45. https://doi.org/10.24059/olj.v22i1.1079.
[20] Daniels LM, Goegan LD, Parker PC. The impact of COVID-19 triggered changes to instruction and assessment on university students' self-reported motivation, engagement and perceptions. Soc Psychol Educ 2021;24:299–318. https://doi.org/10.1007/s11218-021-09612-3.
[21] Davis SF, Grover CA, Becker AH, McGregor LN. Academic dishonesty: prevalence, determinants, techniques, and punishments. Teach Psychol 1992;19(1):16–20. https://doi.org/10.1207/s15328023top1901_3.
[22] Gamage KA, Silva EKd, Gunawardhana N. Online delivery and assessment during COVID-19: safeguarding academic integrity. Educ Sci 2020;10(11):301. https://doi.org/10.3390/educsci10110301.
[23] Grijalva TC, Kerkvliet J, Nowell C. Academic honesty and online courses. Coll Stud J 2006;40(1):180–5.
[24] Hamdan KM, Al-Bashaireh AM, Zahran Z, Al-Daghestani A, Samira AH, Shaheen AM. University students' interaction, Internet self-efficacy, self-regulation and satisfaction with online education during pandemic crises of COVID-19 (SARS-CoV-2). Int J Educ Manag 2021. https://doi.org/10.1108/IJEM-11-2020-0513.
[25] Heck DW, Thielmann I, Moshagen M, Hilbig BE. Who lies? A large-scale reanalysis linking basic personality traits to unethical decision making. Judgm Decis Mak 2018;13(4):356–71.
[26] Holden O, Kuhlmeier VA, Norris M. Academic integrity in online testing: a research review. PsyArXiv 2020. https://doi.org/10.31234/osf.io/rjk7g.
[27] Hylton K, Levy Y, Dringus LP. Utilizing webcam-based proctoring to deter misconduct in online exams. Comput Educ 2016;92:53–63. https://doi.org/10.1016/j.compedu.2015.10.002.
[28] Khan ZR, Balasubramanian S. Students go click, flick and cheat... e-cheating, technologies and more. J Acad Bus Ethics 2012;6:1–26.
[29] King CG, Guyette RW, Piotrowski C. Online exams and cheating: an empirical analysis of business students' views. J Educ Online 2009;6(1):1–11.
[30] King DL, Case CJ. E-cheating: incidence and trends among college students. Issues Inf Syst 2014;15(1):20–7. https://doi.org/10.48009/1_iis_2014_20-27.
[31] Krou MR, Fong CJ, Hoff MA. Achievement motivation and academic dishonesty: a meta-analytic investigation. Educ Psychol Rev 2020;33:427–58. https://doi.org/10.1007/s10648-020-09557-7.
[32] Kuznetsova A, Brockhoff PB, Christensen RHB. lmerTest package: tests in linear mixed effects models. J Stat Softw 2017;82(13):1–26. https://doi.org/10.18637/jss.v082.i13.
[33] Ladyshewsky RK. Post-graduate student performance in 'supervised in-class' vs. 'unsupervised online' multiple choice tests: implications for cheating and test security. Assess Eval High Educ 2015;40(7):883–97. https://doi.org/10.1080/02602938.2014.956683.
[34] Lanier MM. Academic integrity and distance learning. J Crim Justice Educ 2006;17(2):244–61. https://doi.org/10.1080/10511250600866166.
[35] Lestari W, Aisah L, Nurafifah L. What is the relationship between self-regulated learning and students' mathematical understanding in online lectures during the COVID-19 pandemic? Paper presented at the 2nd International Seminar on Applied Mathematics and Mathematics Education (2nd ISAMME), Cimahi, Indonesia; 2020.
[36] Lüftenegger M, Van De Schoot R, Schober B, Finsterwald M, Spiel C. Promotion of students' mastery goal orientations: does TARGET work? Educ Psychol (Lond) 2014;34(4):451–69. https://doi.org/10.1080/01443410.2013.814189.
[37] Marinoni G, Van't Land H, Jensen T. The impact of COVID-19 on higher education around the world. International Association of Universities; 2020.
[38] McCabe DL. It takes a village: academic dishonesty & educational opportunity. Lib Educ 2005;91(3):26–31.
[39] McCabe DL, Trevino LK. Individual and contextual influences on academic dishonesty: a multicampus investigation. Res High Educ 1997;38(3):379–96. https://doi.org/10.1023/A:1024954224675.
[40] McCabe DL, Trevino LK, Butterfield KD. Dishonesty in academic environments: the influence of peer reporting requirements. J High Educ 2001;72(1):29–45. https://doi.org/10.1080/00221546.2001.11778863.
[41] Miller A, Shoptaugh C, Parkerson A. Under reporting of cheating in research using volunteer college students. Coll Stud J 2008;42(2):326–39.
[42] Moralista R, Oducado RM. Faculty perception toward online education in higher education during the coronavirus disease 19 (COVID-19) pandemic. Univ J Educ Res 2020;8(10):4736–42. https://doi.org/10.13189/ujer.2020.081044.
[43] Murdock TB, Anderman EM. Motivational perspectives on student cheating: toward an integrated model of academic dishonesty. Educ Psychol 2006;41(3):129–45. https://doi.org/10.1207/s15326985ep4103_1.
[44] Nguyen JG, Keuseman KJ, Humston JJ. Minimize online cheating for online assessments during COVID-19 pandemic. J Chem Educ 2020;97(9):3429–35. https://doi.org/10.1021/acs.jchemed.0c00790.
[45] Raines DA, Ricci P, Brown SL, Eggenberger T, Hindle T, Schiff M. Cheating in online courses: the student definition. J Effect Teach 2011;11(1):80–9.
[46] Richardson R, North M. Strengthening the trust in online courses: a common sense approach. J Comput Sci Coll 2013;28(5):266–72.
[47] Rogers CF. Faculty perceptions about e-cheating during online testing. J Comput Sci Coll 2006;22(2):206–12. https://doi.org/10.5555/1181901.1181936.
[48] Shen J, Hiltz SR, Bieber M. Learning strategies in online collaborative examinations. IEEE Trans Prof Commun 2008;51(1):63–78. https://doi.org/10.1109/TPC.2007.2000053.
[49] Steger D, Schroeders U, Wilhelm O. Caught in the act: predicting cheating in unproctored knowledge assessment. Assessment 2021;28(3):1004–17. https://doi.org/10.1177/1073191120914970.
[50] Stiles BL, Wong NCW, LaBeff EE. College cheating thirty years later: the role of academic entitlement. Deviant Behav 2018;39(7):823–34. https://doi.org/10.1080/01639625.2017.1335520.
[51] Stuber-McEwen D, Wiseley P, Hoggatt S. Point, click, and cheat: frequency and type of academic dishonesty in the virtual classroom. Online J Distance Learn Adm 2009;12(3). Retrieved from http://www.westga.edu/~distance/ojdla/fall123/stuber123.html.
[52] Teixeira AA, Rocha MF. Cheating by economics and business undergraduate students: an exploratory international assessment. High Educ 2010;59(6):663–701. https://doi.org/10.1007/s10734-009-9274-1.
[53] Varble D. Reducing cheating opportunities in online test. Atl Market J 2014;3(3):131–49.
[54] Watson G, Sottile J. Cheating in the digital age: do students cheat more in on-line courses? Online J Distance Learn Adm 2008;13(1). Retrieved from https://mds.marshall.edu/cgi/viewcontent.cgi?article=1000&context=eft_faculty.
[55] Westfall J, Kenny DA, Judd CM. Statistical power and optimal design in experiments in which samples of participants respond to samples of stimuli. J Exp Psychol Gen 2014;143(5):2020–45. https://doi.org/10.1037/xge0000014.
[56] Wigfield A, Eccles JS. Expectancy-value theory of achievement motivation. Contemp Educ Psychol 2000;25(1):68–81. https://doi.org/10.1006/ceps.1999.1015.
[57] Williams JB, Wong A. The efficacy of final examinations: a comparative study of closed-book, invigilated exams and open-book, open-web exams. Br J Educ Technol 2009;40(2):227–36. https://doi.org/10.1111/j.1467-8535.2008.00929.x.