RESEARCH ARTICLE  Open Access
Learning pathology using collaborative vs.
individual annotation of whole slide
images: a mixed methods trial
Michael Sahota, Betty Leung, Stephanie Dowdell and Gary M. Velan*
Abstract
Background: Students in biomedical disciplines require understanding of normal and abnormal microscopic
appearances of human tissues (histology and histopathology). For this purpose, practical classes in these disciplines
typically use virtual microscopy, viewing digitised whole slide images in web browsers. To enhance engagement,
tools have been developed to enable individual or collaborative annotation of whole slide images within web
browsers. To date, there have been no studies that have critically compared the impact on learning of individual
and collaborative annotations on whole slide images.
Methods: Junior and senior students engaged in Pathology practical classes within Medical Science and Medicine
programs participated in cross-over trials of individual and collaborative annotation activities. Students’
understanding of microscopic morphology was compared using timed online quizzes, while students' perceptions
of learning were evaluated using an online questionnaire.
Results: For senior medical students, collaborative annotation of whole slide images was superior for
understanding key microscopic features when compared to individual annotation; whilst being at least equivalent
to individual annotation for junior medical science students. Across cohorts, students agreed that the annotation
activities provided a user-friendly learning environment that met their flexible learning needs, improved efficiency,
provided useful feedback, and helped them to set learning priorities. Importantly, these activities were also
perceived to enhance motivation and improve understanding.
Conclusion: Collaborative annotation improves understanding of microscopic morphology for students with
sufficient background understanding of the discipline. These findings have implications for the deployment of
annotation activities in biomedical curricula, and potentially for postgraduate training in Anatomical Pathology.
Keywords: Collaborative learning, Virtual microscopy, Annotation, eLearning
Background
Histology and histopathology are the studies of micro-
scopic morphology of normal and abnormal tissues, re-
spectively. Traditionally, the learning and teaching of
both histology and histopathology required the use of
glass slides and light microscopes. Students would exam-
ine slides and attempt to compare what they saw down
their microscope to the examples provided by their in-
structor. In the context of increasing class sizes in med-
ical schools, the traditional model of learning histology
and histopathology using light microscopy (LM) has be-
come impractical.
Virtual microscopy (VM) is the use of computer tech-
nology to view digitised versions of glass slides as whole
slide images (WSIs) [1]. To enable this approach, WSIs
are typically served via the Internet and viewed with a
web browser. An important advantage of VM is that it
allows access to WSIs outside of scheduled class times
[1–5]. It has been demonstrated that students access
VM materials throughout the day [6]. Additionally, VM
may also enhance students' learning [7, 8]. However, it is
worth noting that VM can be expensive to set up and is
prone to technical difficulties [1–4, 8–10].
* Correspondence: g.velan@unsw.edu.au
Department of Pathology, School of Medical Sciences, Faculty of Medicine,
UNSW Australia, Sydney, Australia
As students view identical tissue sections using VM,
the risk of variability in learning materials is eliminated
[1, 2, 4, 8, 11–13]. However, this may promote the ten-
dency for students to undervalue the importance/exist-
ence of variation in tissue sections [14]. It has been
suggested by Helle and colleagues [15] that learning
using virtual microscopy results in greater learning gains
for high-performing students, compared with lower-
performing students. However, aggregated analysis of all
participants in that study showed no significant differ-
ence between students learning with VM and LM.
In contrast to the studies promoting the benefits of
VM for learning, several trials comparing VM with trad-
itional methods of learning histology and histopathology
found no net improvement in student performance [1, 6,
10, 16–19]. These findings are unsurprising, because ac-
tivities that promote interaction and engagement with
microscopic morphology are needed to improve learning
outcomes, irrespective of the medium employed. Never-
theless, a meta-analysis of virtual microscopy using mul-
tiple standardised comparative studies concluded that,
on balance, VM is superior to LM for learning [14].
From 2002, UNSW Australia (UNSW) transitioned
from using LM to VM in histology and pathology clas-
ses. This innovation significantly enhanced students’
learning experiences [18]. Each year over 3000 biomed-
ical students in UNSW undertake 100,000 student hours
of practical work in subjects that require an understand-
ing of microscopic appearances of human tissue. As a
member of the Biomedical Education Skills and Training
(BEST) Network established in 2013, UNSW adopted
the use of Slice™ [20], a biomedical image repository and
image viewer, to enhance the efficiency and availability
of viewing WSIs.
VM has enabled the establishment of ‘digital laborator-
ies’, which allow students and teachers to build unique and
personalised learning materials, i.e. annotations [3–6, 8,
12, 21]. Such annotations may enhance learning by pro-
moting longer and more meaningful interactions between
students and learning resources [21].
Recently, the functionality of Slice was enhanced to
enable users to collaboratively annotate WSIs in real-
time (Fig. 1). This facilitates the implementation of col-
laborative learning activities involving both students and
teachers. Using Slice, learners can collaborate by sharing
annotations on a common image layer using any device,
from any location.
Collaborative learning is an educational methodology
that focuses upon the interactions between participants
in the learning process, creating a sense of community.
Specifically, a sense of community promotes opportun-
ities for active engagement and interaction, which in
turn result in improved levels of self-perceived learning,
skill enhancement, enjoyment, engagement and other
learning outcomes for individuals [22–26]. Furthermore,
factors such as group dynamics and student perceptions
of learning can potentiate the learning outcomes of col-
laboration [27, 28].
It is therefore unsurprising that collaborative learning
modalities have been globally employed across many dif-
ferent fields. Collaborative approaches have been progres-
sively incorporated into virtual environments that have
been shown to further improve students' engagement and
learning outcomes [29–34]. Such forms of learning are
categorised as computer-supported collaborative learning
(CSCL) environments. By utilising computers, more ave-
nues of achieving and displaying collaborative interaction
are available for exploration [35].
Additionally, CSCL environments can aid in the navigation of complex tasks by reducing cognitive load. This is achieved by enabling students to focus on sub-elements to address a bigger and more complex problem [36]. In such socio-constructivist environments, the role of the teacher changes from a dispenser of knowledge to a facilitator of knowledge exchange and co-creation.
Fig. 1 Screenshot showing the interface and capabilities of Slice for annotation of WSIs. The ability to annotate and invite others to the
annotation layer is shown via the blue links on the top left of the figure. Users of the collaborative layer and their respective annotations are
shown in a list to the left of the figure. All annotations in the current viewing area (indicated by the navigation box at the top right) are
represented simultaneously on the screen (centre)
Thus, students readily accept both VM and collabora-
tive learning approaches. The existing literature indicates
that both approaches independently benefit student
learning. However, very few studies evaluate the qualita-
tive and quantitative impact on student learning of col-
laborative learning using VM [14].
Implementing a virtual collaborative learning environ-
ment has been shown to result in improved learning effi-
ciency for students with no degradation in summative
examination performance. Students also provided posi-
tive feedback regarding the ‘opportunity to collaborate’
when utilising VM [5, 6, 16, 19, 37–40]. However, one of
the rare studies that described collaborative annotations
on WSI [6], while demonstrating enhanced engagement
by students, did not show any improvement in learning
outcomes.
Thus, it remains to be determined whether collaborative
learning using VM is significantly better for knowledge ac-
quisition than individual annotation. The present study
aimed to evaluate the quantitative and qualitative differ-
ences in learning with individual and collaborative annota-
tion of WSIs for cohorts of junior and senior students.
Methods
Participants
Students enrolled in Medicine and Medical Science pro-
grams at UNSW were recruited to participate in this
study during their Pathology practical classes. The stu-
dent cohorts were:
1. Junior students in Year 2 of a Bachelor of Medical
Science program, who were enrolled in an
introductory Pathology course (n= 119);
2. Senior students enrolled in a selective course known as Rational Use of Investigations in the final year (Year 6) of the Medicine program (n= 12).
These students were chosen for this study because
they represent novices (Year 2 Medical Science) and ex-
perienced users of virtual microscopy in Pathology
respectively.
Students were advised that they could opt-out of study
participation at any time. To encourage participation,
the annotation activities were integrated into existing
class structures.
Trial design
Formal instruction and regular classwork occurred
within the first half of each 2-h Pathology practical class. For the remainder of each class, the sequential as-
sessment of individual and collaborative annotation
activities took place. All students in each class annotated
a set of WSIs under the same initial (either collaborative
or individual) conditions. The entire class then crossed
over to the alternate condition (either individual or col-
laborative) to annotate a second set of WSI. The initial
condition alternated between consecutive iterations of
the same class, while identical sets of WSIs were used
(Table 1). This protocol aimed to control for potential
carry-over effects, whereby the order of individual and
collaborative annotation activities might have affected
quiz performance. Annotation activities were 30 min in
duration. No formal feedback was provided to students
during the intervention. However, following the inter-
vention, all students received automated feedback on
their performance in timed online quizzes.
The trials for junior medical science students enrolled
in an introductory course in Pathology were conducted
with four WSIs. The first class (Class 1, n= 63) com-
pleted individual annotation activities on WSIs showing
a tubular adenoma of the colon and a colorectal carcin-
oma. After a brief 30-min break from intervention con-
ditions (washout period) consisting of traditional
classwork, the class then completed collaborative anno-
tation activities on WSIs showing squamous cell carcin-
oma of the tongue and invasive ductal carcinoma of the
breast. All students then individually completed the
questionnaire and a timed quiz online.
The second junior class (Class 2, n= 56) followed the same sequence of WSIs and class structure as Class 1, but annotated the
WSIs showing tubular adenoma and the colorectal
adenocarcinoma collaboratively, then proceeded to indi-
vidually annotate the WSIs showing squamous cell car-
cinoma of the tongue and invasive ductal carcinoma of
the breast. These students then completed the same
questionnaire and timed quiz as students in Class 1.
The trial for senior medical students enrolled in a
Pathology selective course also utilised four WSIs. The
class first annotated a WSI showing diffuse alveolar
damage individually. After a brief washout period of
traditional classwork, students then collaboratively an-
notated a WSI showing a pulmonary carcinoid tumour.
All students then completed the online questionnaire
and a timed quiz on the features of the relevant WSIs.
Table 1 Sequential Cross-Over Trial Structure

                              Class 1         Class 2
WSI 1 (Slides A & B)          Collaborative   Individual
30 min of normal classwork
WSI 2 (Slides C & D)          Individual      Collaborative
5-min questionnaire
10-min online quiz
The same sequential cross-over protocol was repli-
cated with the same group of students 1 week later, this
time employing WSIs showing endometrial adenocarcin-
oma and herpes oesophagitis, followed by a timed quiz
assessing those topics.
These tasks were supported by written and spoken task
descriptions, a tailored digital learning environment in
which to perform the annotation tasks and instructor feed-
back on student performance at the end of the task. Task
control factors refer to the extent to which learners can
control the task - specifically, the path, pace, content and
instruction. Student task control in this study was limited
by time and classroom constraints.
Annotation activities
Once allocated to either individual or collaborative anno-
tation conditions, students were asked to annotate WSIs,
focusing on a small number of selected features. These
activities were designed to provide visual cues to assist
students' understanding of the microscopic features of
disease processes. Such visual cues trigger the interpretive
process, resulting in improved pattern recognition,
student performance, productivity and efficiency on diag-
nostic tasks [5, 38, 41].
Collaborative annotation
Students allocated to the collaborative annotation condi-
tion were asked to identify microscopic features by an-
notating WSIs in randomly self-allocated groups of 3–5
students. This number of students working collabora-
tively was both logistically suitable and in accordance
with existing literature [16]. Students engaged in collab-
orative annotation activities could view the annotations
of their peers, thus enabling them to review and update
their own annotations based upon peer feedback.
A CSCL environment was employed to facilitate con-
ditions under which students would be more likely to engage in collaboration with peers to improve learning outcomes, i.e. engage in discussion, negotiation and self-regulation. This was achieved by each student creating
digital artefacts (annotations) from their own computer,
then comparing their annotations in real-time to those
created by their peers. The primary collaborative elem-
ent of the intervention is the ability for students to edit
their own annotations based upon seeing and discussing the annotations made by their peers. By influen-
cing and learning from one another’s annotations,
students were then able to produce a final product or
‘consensus annotation’ based on the shared understand-
ing of the group.
Individual annotation
When allocated to the individual annotation condition,
students were asked to work alone to identify and
annotate pre-selected histological features of a set of
WSIs. During this period, students could not view the
annotations that their peers created and, except for tech-
nical assistance, instructor feedback was minimised.
Students who were working individually did not have
access to the annotations created by their peers, nor did
they have access to peer-based discussion outside of the
information provided to both groups by the instructor.
Whole slide image selection
The WSIs chosen for the study were derived from the
curriculum pertaining to each student cohort and were
accessed via the Slice™ image database [20]. None of the WSIs selected for this study had been previously examined by the students, and thus all contained novel material.
WSIs and associated annotation activities were reviewed
by subject matter experts to ensure that the level of diffi-
culty was suitable for each cohort.
Evaluation of knowledge and perceptions of learning
Design of online quizzes
After completing both individual and collaborative anno-
tation activities, participants in each cohort attempted
tailored 10-min time-limited online quizzes, created
using the Adaptive eLearning Platform (AeLP) devel-
oped by Smart Sparrow Ltd (Sydney, Australia) [42].
Quizzes were linked securely from the university’s
Learning Management System. The quizzes for each co-
hort related to the WSIs explored during the preceding
annotation activities. There were a total of nine items in
the junior quiz, and 13 items in the quizzes for senior
students. Items included feature identification using drag
and drop (Fig. 2) and drop-down lists, as well as image-
based multiple choice items. These questions primarily
focused on the correct identification of histopathological
features of specific disease entities by utilising previously
unseen WSI showing the same pathological processes
explored in class.
To minimise bias in favour of either the individual or
collaborative annotation conditions, subject matter ex-
perts designed and reviewed all quiz items to ensure that
they were of comparable difficulty, and were appropriate
for the level of the students [43]. To facilitate compari-
sons of performance between conditions, care was taken
to ensure that all quizzes contained an equal number of
items related to each WSI, and that all items were of
similar difficulty. Thus, students in each cohort were
given equal opportunity to display their understanding
of microscopic features studied either collaboratively or
individually.
User experience questionnaire
To evaluate students' perceptions of their learning ex-
perience using WSI annotation, an online questionnaire
was developed. The questionnaire items had previously
been validated as part of the Perceived Utility of Learn-
ing Technologies Scale (G. Velan, personal communica-
tion) and were employed to gather student feedback
regarding the Slice platform, individual and collaborative
annotation activities, as well as student perceptions of
understanding the microscopic features of diseases,
before and after annotation. The questionnaires were pre-
sented to participants online, immediately preceding the
timed quiz. This order was employed to maximise the
likelihood of participants completing the questionnaire.
Data analysis
Quiz and questionnaire data were extracted from the
AeLP as comma delimited text files, which were opened
with Microsoft Excel® (Microsoft Software Inc., Redmond,
Washington). The data was de-identified and sorted be-
fore being imported into GraphPad Prism 6 v6.04 (Graph-
Pad Software Inc., San Diego, California) for statistical
analysis.
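As an illustrative sketch only (the study itself used Microsoft Excel and GraphPad Prism for these steps), the snippet below shows how a comma-delimited quiz export could be de-identified and sorted prior to analysis. The file name and column names are hypothetical and do not reflect the actual AeLP export format.

```python
# Illustrative sketch of the de-identification and sorting step.
# File name and column names are hypothetical, not the actual AeLP export.
import csv

def deidentify_and_sort(in_path, out_path):
    with open(in_path, newline="") as f:
        rows = list(csv.DictReader(f))

    # Keep only the variables needed for analysis; drop student identifiers.
    kept_fields = ["class", "condition", "score_pct"]
    cleaned = [{k: r[k] for k in kept_fields} for r in rows]

    # Sort by class and condition to group records before import.
    cleaned.sort(key=lambda r: (r["class"], r["condition"]))

    with open(out_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=kept_fields)
        writer.writeheader()
        writer.writerows(cleaned)

if __name__ == "__main__":
    deidentify_and_sort("aelp_quiz_export.csv", "quiz_deidentified.csv")
```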
Quantitative analysis
Quiz data Comparisons between groups of students
within the same cohort were performed using unpaired
t-tests.
Comparisons of individual students' performance in quiz
questions related to the collaborative and individual anno-
tation conditions were performed using paired t-tests.
All quiz data are expressed as mean percentage scores
± 95% confidence interval (CI). Differences between
groups or conditions were considered statistically signifi-
cant when the p-value was observed to be less than 0.05.
When significant differences were detected, effect sizes
were calculated using Cohen's d.
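For readers wishing to reproduce this style of analysis outside GraphPad Prism, the sketch below shows equivalent unpaired and paired t-tests, together with one common formulation of Cohen's d, in Python with SciPy. The score arrays are placeholders, not the study data.

```python
# Illustrative sketch of the comparisons described above (placeholder data).
import numpy as np
from scipy import stats

# Unpaired t-test: overall quiz scores of two classes within a cohort.
class1_scores = np.array([55.0, 60.0, 62.5, 58.0, 57.0, 61.0])
class2_scores = np.array([59.0, 61.0, 63.0, 60.5, 58.5, 62.0])
t_unpaired, p_unpaired = stats.ttest_ind(class1_scores, class2_scores)

# Paired t-test: each student's scores on items relating to individually
# vs collaboratively annotated WSIs.
individual = np.array([57.0, 60.0, 55.0, 62.0, 58.0, 59.0])
collaborative = np.array([70.0, 75.0, 68.0, 80.0, 72.0, 74.0])
t_paired, p_paired = stats.ttest_rel(individual, collaborative)

def cohens_d(a, b):
    # Cohen's d using the pooled standard deviation of the two conditions;
    # the exact formulation used in the study is not specified.
    pooled_sd = np.sqrt((a.var(ddof=1) + b.var(ddof=1)) / 2.0)
    return (b.mean() - a.mean()) / pooled_sd

print(f"unpaired: t={t_unpaired:.3f}, p={p_unpaired:.4f}")
print(f"paired: t={t_paired:.3f}, p={p_paired:.4f}, "
      f"d={cohens_d(individual, collaborative):.2f}")
```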
Questionnaire data Comparisons of students' perceived
understanding before and after collaborative annotation
activities were performed using a Mann-Whitney U test.
Data derived from Likert scale items and the ranking
item in the questionnaire are expressed as median ±
interquartile range (IQR).
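A corresponding sketch for the ordinal questionnaire data is shown below, again using placeholder responses; it applies the Mann-Whitney U test and reports medians with interquartile ranges as described above.

```python
# Illustrative sketch of the non-parametric questionnaire analysis
# (placeholder Likert-style responses, not the study data).
import numpy as np
from scipy import stats

perceived_before = np.array([4, 3, 2, 5, 4, 3, 2, 4, 5, 3])
perceived_after = np.array([6, 7, 5, 7, 6, 6, 5, 7, 8, 6])

u_stat, p_value = stats.mannwhitneyu(perceived_before, perceived_after,
                                     alternative="two-sided")

def median_iqr(x):
    # Median with the 25th and 75th percentiles (interquartile range).
    q1, median, q3 = np.percentile(x, [25, 50, 75])
    return median, (q1, q3)

print("U =", u_stat, "p =", p_value)
print("before:", median_iqr(perceived_before))
print("after:", median_iqr(perceived_after))
```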
Qualitative analysis
Responses to open-ended questionnaire questions were
analysed using a grounded theory approach [44, 45].
Briefly, this involved the collection and review of qualita-
tive data such that repeated ideas, concepts and themes
were codified and then grouped into overarching themes
that were then characterised as either positive or negative.
Saturation was deemed to have been achieved when no
new codes were detected. The analysis was performed in-
dependently by two investigators, and any differences were
resolved by consensus.
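The coding itself was performed manually by two investigators. Purely to illustrate the final tallying of agreed codes into theme counts (such as those reported in the Results), a minimal sketch with hypothetical comments and codes is shown below.

```python
# Illustrative tally of theme codes assigned to open-ended comments.
# The comments and codes are hypothetical examples, not the study codebook.
from collections import Counter

coded_comments = [
    {"comment": "Enjoyed annotating the slide itself",
     "themes": ["Annotation"]},
    {"comment": "Group discussion helped my understanding",
     "themes": ["Collaboration", "Improved understanding"]},
    {"comment": "Would have liked answers provided afterwards",
     "themes": ["Lack of feedback"]},
]

theme_counts = Counter(theme for c in coded_comments for theme in c["themes"])
for theme, count in theme_counts.most_common():
    print(f"{theme}: {count}")
```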
Results
Quantitative results
Class equivalency
In order to determine the equivalence of two classes of
second-year medical science students, overall quiz re-
sults were compared. There was no statistically signifi-
cant difference in mean quiz performance between
classes (Class 1 n= 63, mean = 57.53%, 95% CI = 54.00–
61.05%; Class 2 n= 56, mean = 60.74%, 95% CI = 57.35–
64.14%; t= 1.295, df = 335; P= 0.1961). Quiz results for all students in that cohort were therefore pooled for further analyses.
Fig. 2 Example of the authoring environment for creation of a drag and drop question using the Adaptive eLearning Platform
Quiz scores: individual vs collaborative
Junior students' mean quiz scores following individual
annotation did not differ significantly from mean quiz
scores following collaborative annotation (individual
mean = 61.51%, 95% CI = 57.54–65.49%; collaborative
mean = 57.98%, 95% CI = 53.62–62.34%; t= 1.549, df =
118; P= 0.1241; n= 119).
Amongst the senior (Phase 3 Medicine) cohort, there
was a statistically significant difference in favour of col-
laborative annotation over individual annotation, with a
large effect size (Cohen’s d = 1.37) (individual mean =
57.58%, 95% CI = 47.32–67.83%; collaborative mean =
77.78%, 95% CI = 69.53–86.02%; t= 3.416, df = 11; P=
0.006; n= 12).
Students' perceptions of understanding
There was a statistically significant improvement be-
tween students' median perceptions of understanding on
a scale from 1 to 10 before and after annotation for jun-
ior (median before annotation = 4 out of 10, interquartile
range 2–5; median after annotation = 6 out of 10, inter-
quartile range 3–7; P< 0.0001; n= 119) and senior stu-
dents (median before annotation = 3.5 out of 10,
interquartile range 2.25–5; median after annotation = 7
out of 10, interquartile range 5.25–8; P= 0.005; n=12).
From the questionnaire data, median responses to the
“Learning Effectiveness through Collaboration” item
were significantly higher for the senior cohort (Junior
median 4, interquartile range 3–5; Senior median 5,
interquartile range 4–6; P= 0.04).
Qualitative results
Positive feedback
Junior students' perceptions of learning using the annota-
tion activities were obtained from their open-ended ques-
tionnaire responses. Positive feedback comments (n=73)
from these students indicated that the ability to annotate
(n= 36), the user-friendly Slice™ interface (n=22), im-
proved understanding (n= 19), collaboration with peers
(n= 17) and the engagement/interactivity provided by the
activities (n= 15) were the most enjoyable aspects of the
intervention.
For senior students, positive comments (n= 9) focused
on the ability to annotate (n= 7), opportunity to collab-
orate (n= 1) and level of engagement/interaction (n= 1).
Representative positive feedback comments from each co-
hort are shown in Table 2 (for full feedback see Additional
files 1 and 2).
Negative feedback
Negative feedback on the annotation activities was also
gathered from open-ended questionnaire responses.
Junior students' negative comments (n= 82) indicated
concerns with insufficient guidance/feedback (n= 30),
insufficient integration of the intervention into the class
(n= 20), lack of prior knowledge to make best use of the
collaborative activities (n= 6) and issues with the tech-
nology (n= 6).
Senior students' negative feedback (n= 7) focused on a
need for more guidance/feedback (n= 4) as well as the
technological limitations of the software (n= 3).
Representative negative feedback by cohort is shown
in Table 3 (for full feedback see Additional files 1
and 2).
Discussion
For senior students in this study, collaborative WSI an-
notation resulted in significantly improved quiz scores
when compared with individual WSI annotation. In con-
trast, there were no significant differences in quiz scores
between collaborative and individual annotation condi-
tions for junior students.
The above findings align with previous studies, which
demonstrated that students with greater amounts of
knowledge and experience are more likely to benefit
from the collaborative learning process [26, 27, 46–48].
While this phenomenon is likely to be primarily related
to the extent of students' background knowledge of the
discipline area, there may be other contributory factors.
Specifically, as opposed to junior students, senior stu-
dents' extensive history of socialising and collaborating
is likely to have positively influenced their interactions
with one another, potentiating their collaborative learn-
ing outcomes [28]. Furthermore, in comparison to many junior Medical Science students, these senior medical students were high-achieving and highly motivated; such learners have been shown to benefit more from VM-based collaboration compared to their lesser-achieving counterparts [15, 27].
Table 2 Representative positive feedback comments from each cohort

Junior:
- "Being able to annotate on the slide itself, as well as being able to share slides with colleagues. High quality virtual slides are also better than textbook or lecture slides." (Themes: Annotation; Collaboration; Interface)
- "I enjoyed the group component of it. The discussion helped facilitate my understanding of the topic." (Themes: Collaboration; Improved understanding)
- "Collaboration allows me to pull on resources that I do not normally have in order to gain a better understanding of the topic." (Themes: Collaboration; Improved understanding)

Senior:
- "Annotation and the ability to collaborate" (Themes: Annotation; Collaboration)
- "Ease of use. Engaging way of arranging learning activities" (Themes: Interface; Engaging)
Both cohorts of students perceived significant
improvements in their understanding following partici-
pation in the annotation activities. This finding was ex-
pected as collaborative WSI annotation was designed to
increase student engagement, which is positively corre-
lated with perceptions of improved understanding [3–5,
7, 8, 12, 38, 41, 47, 48].
It is noteworthy that senior students perceived collab-
orative annotation to be significantly more effective for
learning than the junior cohort. Nevertheless, both co-
horts perceived significant improvements in their under-
standing following annotation activities. The discrepancy
between self-assessment of understanding and quiz
performance by the junior cohort in this study might be
related to a lack of regular self-assessment by students
[49, 50]. This, together with deliberate withholding of
teacher feedback prior to the quiz, could have affected
the accuracy of junior students’self-assessment [51].
Themes that recurred across each cohort regarding
students' positive perceptions of annotation activities on
WSIs included collaborating with peers, as well as in-
creased interactivity and engagement. These aspects are
consistent with previous studies of students' learning preferences, that is, interacting and engaging with
learning materials in a social environment [5, 6, 16, 19,
37–40, 52].
The negative feedback responses gathered from the
open-ended questionnaire focussed primarily on insuffi-
cient feedback and technical issues. The reported lack of
feedback is understandable, because all formal feedback
was withheld from students prior to the online quiz to
avoid biasing quiz outcomes. Criticisms of the Slice™
platform’s functionality have been utilised to inform fur-
ther developments.
Limitations
For senior students (n= 12), this study was underpow-
ered. Nevertheless, this study demonstrated a statistically
significant difference in quiz scores in favour of collab-
orative annotation for the senior cohort, with a large ef-
fect size. This is indicative of an important real-world
difference in learning between individual and collabora-
tive conditions for senior students.
It might have been helpful to employ a pre-annotation
test of understanding of microscopic Pathology for each
cohort, in order to better quantify the improvement
procured from the annotation exercises. However, such
pre-tests were not logistically possible, and may even
have biased the results of the study, via test-enhanced
learning [53].
Furthermore, the intervention was limited by the num-
ber and scope of WSI that were used for annotation and
assessment purposes, i.e. two WSI per condition, four
per cohort. While it might have been helpful to create
more data points, we were constrained by the time avail-
able in each class.
The extent to which students could control their
learning was limited by instructors, i.e. the path, overall
pace, content and instruction were predetermined by in-
structors and was subject to classroom and time con-
straints [54]. In this sense, the learning environment was
scaffolded and was not a truly free CSCL environment.
Participants were informed that no course credit
would be awarded for their performance in the know-
ledge quizzes. Therefore, lack of motivation to succeed
might have affected students' performance in the quiz-
zes. However, these factors are unlikely to have biased
results in favour of either individual or collaborative an-
notation conditions.
A risk inherent in cross-over studies is the potential
for carryover effects following the cross over, which can
reduce observed differences between groups. In particu-
lar, it is possible that those students who commenced
with collaborative annotation might have been advantaged
in subsequent individual annotation activities. In our se-
quential cross-over design, carryover effects were con-
trolled for by alternating the order of individual and
collaborative annotation activities between classes. Within
each cohort, analyses of quiz performance showed no significant difference between classes, thereby providing reassurance that carryover effects did not bias the outcomes of this study. A washout period, such as that provided in this study, might also have helped to reduce such carryover effects.
Table 3 Representative negative feedback comments from each cohort

Junior:
- "I would like if answers could be provided for the activity after we have conducted our own team or individual attempt, that way we can see whether our understanding is correct." (Theme: Lack of feedback)
- "More background information e.g. an image of a normal structure and orientation on things that we should be looking out for. Some annotation activities are particularly difficult without this, e.g. obscure tissue like breasts that we don’t often look at the histology of." (Theme: Not enough background knowledge)
- "It would be nice if there was a way for everyone in a group to be able to edit annotations/layers simultaneously because that would make it a lot easier to contribute information and receive other answers." (Theme: Interface)

Senior:
- "Once we start annotating the instructions are no longer available." (Theme: Interface)
- "To enhance self-study, perhaps some pre-annotated slices with features marked and explained." (Theme: Lack of feedback)
Conclusions
This is the first reported study that has critically evaluated
student knowledge acquisition using collaborative annota-
tion of WSI. The mix of quantitative and qualitative
methods utilised in this study provided a realistic overall
picture of student learning in that context. For senior med-
ical students in this study, collaborative WSI annotation
was superior for understanding key microscopic features
when compared to individual WSI annotation. Collabora-
tive annotation was equivalent to individual annotation for
junior medical science students. These findings have impli-
cations for the deployment of annotation activities in Medi-
cine and Medical Science, as well as a variety of other
disciplines that utilise images to facilitate the learning of
morphology.
This investigation showed that students positively per-
ceive collaborative learning, regardless of experience level.
However, in the discipline of histopathology, collaborative
annotation of WSIs was shown to be objectively beneficial
only for senior students with sufficient background know-
ledge and experience.
Future research
It might be beneficial to replicate this study, while correl-
ating students' prior academic performance with their
quiz scores. This would enable exploration of whether
high-performing students benefit more or less from a col-
laborative approach than low-performing students, as has
been suggested previously [15]. The administration of a
pre-intervention test would also provide a baseline level
for comparison with the post-intervention results [55].
Studies to evaluate knowledge retention rates over
time may be valuable in providing a longitudinal view of
student learning. Such studies might provide crucial evi-
dence of the long-term benefit of collaborative annota-
tion of WSIs.
Finally, collaborative annotation on WSIs might have
potential to optimise learning for Anatomical Pathology
trainees. If further studies in such settings validate the
positive impact of collaborative annotation, this could
have implications for specialist training in Anatomical
Pathology.
Additional files
Additional file 1: Junior student raw data. This spreadsheet contains all of
the raw quiz score and qualitative data obtained from junior medical science
students. This data underpins the findings in this report. (XLS 395 kb)
Additional file 2: Senior student raw data. This spreadsheet contains all
of the raw quiz score and qualitative data obtained from senior medicine
students. This data underpins the findings in this report. (XLS 52 kb)
Abbreviations
AeLP: Adaptive eLearning Platform; BEST: Biomedical Education Skills &
Training; CI: Confidence interval; CSCL: Computer-supported collaborative
learning; IQR: Interquartile range; LM: Light microscopy; UNSW: University of
New South Wales; VM: Virtual microscopy; WSI: Whole slide image
Acknowledgements
Not applicable.
Funding
Not applicable.
Availability of data and materials
The dataset(s) supporting the conclusions of this article is (are) included
within the article (and its Additional files).
Authors' contributions
The ethics application and the subsequent modifications of it were
submitted by SD. The annotation activities and online quizzes were created
by BL and MS. All authors were involved in the design of the study and
review of assessment materials. The data was analysed by MS and GV. All
authors contributed to the writing and review of the manuscript. All authors
read and approved the final manuscript.
Competing interests
The authors declare that they have no competing interests.
Consent for publication
Not applicable.
Ethical approval and consent to participate
Ethical approval for this study was provided by the Biomedical Panel of the
UNSW Australia Human Research Ethics Committee (HC15140).
All participants consented to participation within the study.
Received: 17 June 2016 Accepted: 29 November 2016
References
1. Kumar RK, Velan GM, Korell SO, Kandara M, Dee FR, Wakefield D. Virtual
microscopy for learning and assessment in pathology. J Pathol. 2004;204(5):
613–8.
2. Farah CS, Maybury TS. The e-evolution of microscopy in dental education. J
Dent Educ. 2009;73(8):942–9.
3. Maybury TS, Farah CS. Electronic blending in virtual microscopy. J Learn
Des. 2010;4(1):41–51.
4. Pinder KE, Ford JC, Ovalle WK. A new paradigm for teaching histology
laboratories in Canada’s first distributed medical school. Anat Sci Educ. 2008;
1(3):95–101.
5. Tian Y, Xiao W, Li C, Liu Y, Qin M, Wu Y, Xiao L, Li H. Virtual microscopy
system at Chinese medical university: an assisted teaching platform for
promoting active learning and problem-solving skills. BMC Med Educ. 2014;
14(1):74.
6. Triola MM, Holloway WJ. Enhanced virtual microscopy for collaborative
education. BMC Med Educ. 2011;11(1):4.
7. Bergman EM, Prince KJ, Drukker J, van der Vleuten CP, Scherpbier AJ. How
much anatomy is enough? Anat Sci Educ. 2008;1(4):184–8.
8. Fónyad L, Gerely L, Cserneky M, Molnár B, Matolcsy A. Shifting gears higher - digital slides in graduate education - 4 years experience at Semmelweis University. Diagn Pathol. 2010;5:73.
9. Braun V, Clarke V. Using thematic analysis in psychology. Qual Res Psychol.
2006;3(2):77–101.
10. Scoville SA, Buskirk TD. Traditional and virtual microscopy compared
experimentally in a classroom setting. Clin Anat. 2007;20(5):565–70.
11. Dee FR, Lehman JM, Consoer D, Leaven T, Cohen MB. Implementation of
virtual microscope slides in the annual pathobiology of cancer workshop
laboratory. Hum Pathol. 2003;34(5):430–6.
12. Husmann PR, O’Loughlin VD, Braun MW. Quantitative and qualitative
changes in teaching histology by means of virtual microscopy in an
introductory course in human anatomy. Anat Sci Educ. 2009;2(5):218–26.
13. Weaker FJ, Herbert DC. Transition of a dental histology course from light to
virtual microscopy. J Dent Educ. 2009;73(10):1213–21.
14. Wilson AB, Taylor MA, Klein BA, Sugrue MK, Whipple EC, Brokaw JJ. Meta-
analysis and review of learner performance and preference: virtual versus
optical microscopy. Med Educ. 2016;50(4):428–40.
15. Helle L, Nivala M, Kronqvist P, Gegenfurtner A, Bjork P, Saljo R. Traditional
microscopy instruction versus process-oriented virtual microscopy
instruction: a naturalistic experiment with control group. Diagn Pathol. 2011;
6 Suppl 1:S8.
16. Goldberg HR, Dintzis R. The positive impact of team-based virtual
microscopy on student learning in physiology and histology. Adv Physiol
Educ. 2007;31(3):261–5.
17. Harris T, Leaven T, Heidger P, Kreiter C, Duncan J, Dick F. Comparison of a
virtual microscope laboratory to a regular microscope laboratory for
teaching histology. Anat Rec. 2001;265(1):10–4.
18. Kumar RK, Freeman B, Velan GM, De Permentier PJ. Integrating histology
and histopathology teaching in practical classes using virtual slides. Anat
Rec B New Anat. 2006;289(4):128–33.
19. Van Es SL, Kumar RK, Pryor WM, Salisbury EL, Velan GM. Cytopathology
whole slide images and adaptive tutorials for postgraduate pathology
trainees: a randomized crossover trial. Hum Pathol. 2015;46(9):1297–305.
20. About Slice. https://www.best.edu.au/slice/. Accessed 2 Dec 2016.
21. Helle L, Nivala M, Kronqvist P. More technology, better learning resources,
better learning? Lessons from adopting virtual microscopy in
undergraduate medical education. Anat Sci Educ. 2013;6(2):73–80.
22. Alavi M. Computer-mediated collaborative learning: An empirical evaluation.
MIS Q. 1994;18(2):159–74.
23. Dillenbourg P. What do you mean by collaborative learning? In: Collaborative-
learning: Cognitive and Computational Approaches. 1999. p. 1–19.
24. Gokhale AA. Collaborative learning enhances critical thinking. 1995.
25. Whipple WR. Collaborative learning: Recognizing it when we see it. AAHE
Bull. 1987;4:6.
26. Springer L, Stanne ME, Donovan SS. Effects of small-group learning on
undergraduates in science, mathematics, engineering, and technology: A
meta-analysis. Rev Educ Res. 1999;69(1):21–51.
27. Chan CK, Chan Y-Y. Students’views of collaboration and online
participation in Knowledge Forum. Comput Educ. 2011;57(1):1445–57.
28. Janssen J, Erkens G, Kirschner PA, Kanselaar G. Influence of group
member familiarity on online collaborative learning. Comput Hum
Behav. 2009;25(1):161–70.
29. Papastergiou M. Digital game-based learning in high school computer
science education: Impact on educational effectiveness and student
motivation. Comput Educ. 2009;52(1):1–12.
30. Prensky M. Digital game-based learning, vol. 1. St. Paul: Paragon House; 2007.
31. Sung H-Y, Hwang G-J. A collaborative game-based learning approach to
improving students’learning performance in science courses. Comput Educ.
2013;63:43–51.
32. Gregory S, Gregory B, Campbell M, Farley HS, Sinnappan S, Kennedy-Clark S,
Craven D, Murdoch D, Lee MJ, Wood D. Australian higher education
institutions transforming the future of teaching and learning through 3D
virtual worlds. In: Proceedings ASCILITE 2010: 27th Annual Conference of
the Australasian Society for Computers in Learning in Tertiary Education:
Curriculum, Technology and Transformation for an Unknown Future: 2010.
University of Queensland: Brisbane; 2010. p. 399–415.
33. Mutter D, Dallemagne B, Bailey C, Soler L, Marescaux J. 3D virtual reality and
selective vascular control for laparoscopic left hepatic lobectomy. Surg
Endosc. 2009;23(2):432–5.
34. Zyda M. From visual simulation to virtual reality to games. Computer. 2005;
38(9):25–32.
35. Dillenbourg P, Järvelä S, Fischer F. The evolution of research on computer-
supported collaborative learning. In: Technology-enhanced learning. edn.
Springer; 2009. p. 3–19.
36. Lehtinen E. Computer-supported collaborative learning: An approach to
powerful learning environments. In: Powerful learning environments:
Unravelling basic components and dimensions. 2003. p. 35–54.
37. Avila RE, Samar ME, Sugand K, Metcalfe D, Evans J, Abrahams PH. The First
South American Free Online Virtual Morphology Laboratory: Creating
History. Creat Educ. 2013;4(10):6.
38. Bridge P, Trapp JV, Kastanis L, Pack D, Parker JC. A virtual environment for
medical radiation collaborative learning. Australas Phys Eng Sci Med. 2015;
38(2):369–74.
39. Cogdell B, Torsney B, Stewart K, Smith RA. Technological and Traditional
Drawing Approaches Encourage Active Engagement in Histology Classes for
Science Undergraduates. Biosci Educ. 2012;19:1-15.
40. Leifer Z. The use of virtual microscopy and a wiki in pathology education:
Tracking student use, involvement, and response. J Pathol Inform. 2015;6:30.
41. Nivala M, Säljö R, Rystedt H, Kronqvist P, Lehtinen E. Using virtual
microscopy to scaffold learning of pathology: A naturalistic experiment on
the role of visual and conceptual cues. Instr Sci. 2012;40(5):799–811.
42. Smart Sparrow. https://www.smartsparrow.com. Accessed 2 Dec 2016.
43. Powell KC, Kalina CJ. Cognitive and social constructivism: Developing tools
for an effective classroom. Education. 2009;130(2):241.
44. Charmaz K, Mitchell RG. Grounded theory in ethnography. In: Handbook of
ethnography. 2001. p. 160–74.
45. Corbin JM, Strauss A. Grounded theory research: Procedures, canons, and
evaluative criteria. Qual Sociol. 1990;13(1):3–21.
46. Kirschner PA, Sweller J, Clark RE. Why Minimal Guidance During Instruction
Does Not Work: An Analysis of the Failure of Constructivist, Discovery,
Problem-Based, Experiential, and Inquiry-Based Teaching. Educ Psychol.
2006;41(2):75–86.
47. Palincsar AS. Social constructivist perspectives on teaching and learning. In: An introduction to Vygotsky. 2005. p. 285.
48. Woo Y, Reeves TC. Meaningful interaction in web-based learning: A social
constructivist interpretation. Internet High Educ. 2007;10(1):15–25.
49. Boud D, Lawson R, Thompson DG. The calibration of student judgement
through self-assessment: disruptive effects of assessment patterns. High
Educ Res Dev. 2015;34(1):45–59.
50. Brown GTL, Andrade HL, Chen F. Accuracy in student self-assessment: directions and cautions for research. Assess Educ: Princ Policy Pract. 2015;22(4):444–57.
51. Rosman T, Mayer A-K, Krampen G. Combining self-assessments and
achievement tests in information literacy assessment: empirical results and
recommendations for practice. Assess Eval High Educ. 2014;40(5):740–54.
52. Diaz DP, Cartnal RB. Students’learning styles in two classes: Online distance
learning and equivalent on-campus. Coll Teach. 1999;47(4):130–5.
53. Larsen DP, Butler AC, Roediger III HL. Test‐enhanced learning in medical
education. Med Educ. 2008;42(10):959–66.
54. Kirschner P, Strijbos J-W, Kreijns K, Beers PJ. Designing electronic collaborative
learning environments. Educ Technol Res Dev. 2004;52(3):47–66.
55. Mione S, Valcke M, Cornelissen M. Evaluation of virtual microscopy in
medical histology teaching. Anat Sci Educ. 2013;6(5):307–15.