ORIGINAL ARTICLE
Ubiquitous testing using tablets: its impact on medical
student perceptions of and engagement in learning
Kyong-Jee Kim and Jee-Young Hwang
Department of Medical Education, Dongguk University School of Medicine, Gyeongju, Korea
Purpose:
Ubiquitous testing has the potential to affect medical education by enhancing the authenticity of the assessment using
multimedia items. This study explored medical students’ experience with ubiquitous testing and its impact on student learning.
Methods:
A cohort (n=48) of third-year students at a medical school in South Korea participated in this study. The students were
divided into two groups and were given different versions of 10 content-matched items: one in text version (the text group) and
the other in multimedia version (the multimedia group). Multimedia items were delivered using tablets. Item response analyses were
performed to compare item characteristics between the two versions. Additionally, focus group interviews were held to investigate
the students’ experiences of ubiquitous testing.
Results:
The mean test score was significantly higher in the text group. Item difficulty and discrimination did not differ between
text and multimedia items. The participants generally showed positive responses to ubiquitous testing. Still, they felt that the lectures
that they had taken in preclinical years did not prepare them sufficiently for this type of assessment and that clinical encounters during
clerkships were more helpful. To be better prepared, the participants felt that they needed to engage more actively in learning in
clinical clerkships and have more access to multimedia learning resources.
Conclusion:
Ubiquitous testing can positively affect student learning by reinforcing the importance of being able to understand
and apply knowledge in clinical contexts, which drives students to engage more actively in learning in clinical settings.
Key Words: Educational measurement, Multimedia, Ubiquitous testing, Assessment
Received: September 18, 2015 •Revised: November 17, 2015 •Accepted: December 8, 2015
Corresponding Author: Jee-Young Hwang (http://orcid.org/0000-0003-1491-8413)
Department of Medical Education, Dongguk University School of Medicine, 123 Dongdae-ro,
Gyeongju 38066, Korea
Tel: +82.54.770.2415 Fax: +82.54.770.2447 email: hwangmd@dongguk.ac.kr
Korean J Med Educ 2016 Mar; 28(1): 57-66.
http://dx.doi.org/10.3946/kjme.2016.1028.1.57
eISSN: 2005-7288
Ⓒ The Korean Society of Medical Education. All rights reserved.
This is an open-access article distributed under the terms of the
Creative Commons Attribution Non-Commercial License (http://
creativecommons.org/licenses/by-nc/3.0/), which permits unrestricted
non-commercial use, distribution, and reproduction in any medium,
provided the original work is properly cited.
Introduction
Authentic assessment is central to medical licensure
examinations to ensure that examinees’ clinical compe-
tencies are assessed. The adoption of computer-based
testing using multimedia such as images, sounds, and
video clips allows the presentation of clinical findings in
a more authentic and undigested format than when they
are presented in text items [1]. Multimedia-based
presentation of clinical findings in assessment items,
henceforth multimedia items, enhances the authenticity
of the assessment, which leads to more valid assessment
of examinees’ clinical competencies [2]. Multimedia
items have been implemented in some medical licensure
exams such as the United States Medical Licensing
Examination and studies have shown that assessment
using multimedia items is feasible and can measure some
constructs different from what text items can measure
[2,3].
In South Korea, multimedia items are to be introduced
in the national medical licensure exam by the year 2020.
With computer-based testing not yet introduced in the
medical licensing exam in South Korea, the use of tablets
is under consideration. Using tablets for assessment is
expected to be efficient because they are ubiquitous and
thus require less financial resources and space [4].
Therefore, ubiquitous testing, in which assessment is
delivered by ubiquitous computing technologies such as
smartphones and tablets, is practical in nationwide tests
such as healthcare personnel licensure exams.
The most salient change likely with the use of ubi-
quitous testing in assessment in medical education is the
adoption of multimedia items, which is expected to
influence how students learn because assessment drives
and even enhances learning [5,6]. In particular, it is
speculated that teaching and learning in clinical contexts
will be more emphasized with the adoption of multi-
media items, as examinees will need to be able to
interpret clinical findings in an authentic format similar
to those encountered in clinical practice [7]. Yet,
relatively little is known about students’ experience with this
new type of assessment, and little empirical evidence exists to
support this assumption. Therefore, this study investi-
gated students’ experience with ubiquitous testing using
tablets and its impact on their perceptions of and
engagement in learning. Our research questions were (1)
Do the item characteristics differ across text items and
multimedia items using ubiquitous testing? (2) What are
students’ perceptions of the benefits and challenges of
ubiquitous testing? (3) Do students’ perceptions of and
engagement in learning change after they experience
ubiquitous testing?
It is expected that the present study will help enhance our
understanding of the validity of ubiquitous testing. To
adopt an assessment tool, we need to evaluate its
validity. Yet, there is a paucity of evidence to support
the validity of multimedia items when they are delivered
in the form of ubiquitous testing. One of the standards
in establishing the validity of an educational assessment
is to collect evidence on its impact on examinees and on
teaching and learning [8]. Therefore, this study aimed to
gain more insight into the feasibility of ubiquitous
testing by exploring its impact on medical education.
Subjects and methods
A cohort of all Year 3 students (n=48) in the 4-year
medical program at Dongguk University Medical School
(DUMS) in South Korea participated in this study. The
curriculum at DUMS consists of mostly lecture-driven
preclinical courses for the first 2 years, followed by 2
years of clinical clerkships. The participants were
attending core clinical clerkships in one of the two
academic medical centers affiliated with the university
when this study was conducted. Both undergraduate-
entry (n=18) and graduate-entry (n=30) students were
included in the study cohort.
Participants were given 10 content-matched items in
either multimedia or text versions in multiple-choice
question format. These items were in the domain of
clinical knowledge for general practitioners and were
developed by one of the authors. The 10 items consisted
of two items on clinical anatomy on laparoscopic
operation (organ, artery), four items on physical
examination findings (ear, knee, brain, and muscle), two
items on real-time sonography findings (heart, fetus),
one item on chest auscultation finding (heart and lung),
and one item on patient encounter (abdominal pain). Fig.
1 shows a sample item on auscultation in text and
multimedia formats.
The participants were randomly assigned to one of two
groups according to their placement of clinical rotations
at the time of the study.
Fig. 1. A Sample Item on Auscultation in Text (Top) and Multimedia (Bottom) Formats
Fig. 2. A Picture of a Student Taking the Ubiquitous Test Using a Tablet
The control group (the text
group) took the test in the conventional text format and
the experimental group (the multimedia group) was
given multimedia items. Nine video clips and one sound
clip of chest auscultation were incorporated into the
multimedia items. The multimedia items were delivered
on 10.1-inch Android tablets provided by the school’s
medical education committee (Fig. 2). The ubiquitous
testing system was developed by a vendor, NSDevil
(Seoul, Korea), and was equipped with features that let
examinees view the elapsed time, navigate between items,
mark skipped items, and take notes.
The multimedia group used headphones to listen to audio
clips of auscultation findings and video clips of inter-
views or physical examinations, whereas for the text
group these findings were described in texts using
standard terminology.
The participants took the test at their clinical rotation
placements in September 2014. Students at DUMS
had taken tests in the pencil-and-paper format, and thus
had not encountered any multimedia items prior to
participation in this study. This was a pilot test and
students’ test scores did not count towards their course
grades. Institutional Review Board (IRB) approval was
not requested for the present study, because it fell under
the general exemption from our IRB for educational
outcomes data.
Item analyses (test score, item difficulty, item discri-
mination, and response time) were performed, and
Student’s t-test was used to compare characteristics
between text items and multimedia ones. For baseline
comparison, participants’ cumulative grade point aver-
ages through Years 1 and 2 were compared across the
two groups using Student’s t-test. Additionally, distri-
butions of participant backgrounds (i.e., gender, entry
level, and age) were compared across the two groups
using chi-square analysis. IBM SPSS version 20 (IBM
Corp., Armonk, USA) was used, and the significance level
was set at 0.05 for the statistical analyses.
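For readers who wish to reproduce the item analyses outside of SPSS, the following sketch in Python illustrates one common convention: item difficulty computed as the proportion of correct responses, item discrimination as the corrected item-total (point-biserial) correlation, independent-samples t-tests for the group comparisons, and a chi-square test for a demographic distribution. The response matrices and variable names are illustrative assumptions, not the study data, and the exact procedures in the original SPSS analysis may have differed.

# A minimal sketch of the item analyses; the study itself used IBM SPSS version 20.
# The 0/1 response matrices below are hypothetical stand-ins for the real answer data.
import numpy as np
from scipy import stats

def item_statistics(responses):
    """responses: examinees x items array of 0 (incorrect) / 1 (correct)."""
    difficulty = responses.mean(axis=0)        # proportion correct per item
    total = responses.sum(axis=1)
    discrimination = np.empty(responses.shape[1])
    for j in range(responses.shape[1]):
        rest = total - responses[:, j]         # corrected item-total score
        discrimination[j] = np.corrcoef(responses[:, j], rest)[0, 1]
    return difficulty, discrimination

rng = np.random.default_rng(0)
text = (rng.random((24, 10)) < 0.63).astype(int)        # 24 examinees x 10 items
multimedia = (rng.random((24, 10)) < 0.51).astype(int)

diff_text, disc_text = item_statistics(text)
diff_mm, disc_mm = item_statistics(multimedia)

# Independent-samples t-tests mirroring the comparisons reported in Table 2.
print(stats.ttest_ind(text.sum(axis=1), multimedia.sum(axis=1)))  # test scores
print(stats.ttest_ind(diff_text, diff_mm))                        # item difficulty
print(stats.ttest_ind(disc_text, disc_mm))                        # item discrimination

# Chi-square test of the sex distribution across groups (counts from Table 1).
print(stats.chi2_contingency([[13, 11], [14, 10]]))

Note that with only 10 items and 24 examinees per group such statistics are unstable, which is consistent with the authors' caution that the study was not powered for the quantitative comparisons.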
Additionally, we conducted focus group interviews of
the participants in the multimedia group to investigate
their experiences of ubiquitous testing. The focus group
interview is a qualitative research method for collecting
data on the phenomenon being studied by analyzing
conversations among study participants [9]. We used this
qualitative method as it is known to be useful
in eliciting student perspectives [10]. Although all 24
participants in the multimedia group took part in the
focus groups to ensure a diversity of opinions, they
were divided into three groups to keep each group
small enough to ensure input from all
participants.
The interviews were conducted in a semi-structured
format with 12 questions: three related to participant
perceptions of item characteristics of multimedia items,
five to students’ emotional responses to ubiquitous test-
ing, which are known to be linked to student learning and
performance (e.g., enjoyment, anxiety, and boredom)
[11], and four to changes in participant perceptions of
and engagement in learning after experiencing ubi-
quitous testing. The interview questions are presented in
Appendix 1.
The interviews were conducted 3 weeks after the
participants took the test to investigate the impact of
their experience with ubiquitous testing on their learning
behaviors. Each interview session took 40 to 45 minutes.
Although one of the authors attended the
interview sessions, he did not play an active role;
instead, one of the participants in each group
facilitated the discussions while the author observed
the sessions, to minimize the possibility of his
influencing the discussions.
All the interviews were audio recorded and were
transcribed by the authors. The transcripts were analyzed
and emerging themes were identified from the data using
the thematic analysis method.
Results
1. Study participants
Table 1 shows the study participants’ demographics
and the results of comparisons of baseline performance
across the two groups. There was no statistically
significant difference in participants’ academic perfor-
mance or distributions of participant demographics
between the two groups.
Table 1. Cross-Tabulation of Participant Demographics and Academic Performance
Variable Text group (n=24) Multimedia group (n=24) p-value
Entry-level
Undergraduate 9 9 1.00
Graduate 15 15
Sex
Male 13 14 0.77
Female 11 10
Age 26.54±3.24 26.21±3.11 0.72
GPA 3.18±0.59 3.10±0.48 0.61
Data are presented as number or mean±standard deviation.
GPA: Grade point average.
Table 2. Mean Scores, Item Difficulty, and Item Discrimination for Text and Multimedia Items
Item Text group (n=24) Multimedia group (n=24) p-value
Test score (10 items) 6.25±1.54 5.08±1.56 0.012
Item difficulty 0.63±0.20 0.51±0.25 0.261
Item discrimination 0.33±0.18 0.33±0.25 0.984
Data are presented as mean±standard deviation.
2. Item analyses
Table 2 compares item responses across the two
groups. The mean test score of the multimedia group was
significantly lower than that of the text group. The mean
item difficulty of the text items was higher than that of
multimedia items and the mean item discrimination was
comparable between the two groups. Still, there were no
statistically significant differences in item difficulty and
discrimination. The mean total response time for the 10
items was longer in the multimedia group than in the
text group (14.98 minutes vs. 5.79 minutes).
3. Benefits and challenges of ubiquitous
testing using tablets
Participants generally showed some positive responses
to ubiquitous testing. Most mentioned they were a little
anxious about this new type of assessment in the
beginning, but they quickly became comfortable because
they were used to mobile devices. The participants felt
using tablets was more convenient than pencil-and-
paper tests, as the display quality is much better on tablets
than on paper and they did not have to mark answers on
an optical mark recognition sheet, on which they
sometimes made errors. Additionally, some participants
pointed out that the feature in the tablet system that showed
the elapsed time was helpful.
Moreover, participants pointed out they would likely
make fewer mistakes in ubiquitous testing than in tradi-
tional tests. One student stated, “I have more chances to
make mistakes in taking a test when a lot of items are
presented. In this type of (ubiquitous) testing, only one
item is shown on each page, so that helps me concentrate
on the question one by one, and that helps me make
fewer mistakes.” Other students also agreed with the
statement that they could pay better attention to each
item on the test in ubiquitous testing and felt that would
make them less likely to make mistakes. Additionally,
some even found it interesting to view video clips in the
items because they felt like they were watching what was
really happening in the clinical settings. Yet, some
participants mentioned that they felt more time pressure
in ubiquitous testing because they needed time to view
video clips in the items.
Most of the participants pointed out that ubiquitous
testing would be beneficial in enhancing their clinical
competencies. One student stated, “Physical exams are
fundamental in clinical competencies, and we can’t learn
physical exams merely by seeing some images. So, if we
could learn how physical exams are performed by
watching some clinical videos and if we were tested that
way, we could learn better about physical exams and that
could enhance our clinical competencies. That way we
could be assessed on our clinical competencies more
effectively.”
4. Impact of ubiquitous testing on student
perceptions of and engagement in learning
The participants reflected on how the teaching and
learning in preclinical years prepared them for ubi -
quitous testing. Most of the participants felt that what
they had learned from lectures in the preclinical years
was insufficient for them to perform well in ubiquitous
testing. One student noted “I lost confidence (while
taking the ubiquitous testing) feeling that I knew only
superficially. There was a video clip of a physical exam
in one of the items. I knew what that exam was, but I
did not know its procedures well enough, so that was a
difficult question for me.” Another student added, “Most
of the images presented in the tests that I had taken
(before the ubiquitous testing) were familiar ones
because they were similar to those in textbooks. But,
those in multimedia items were not something that I had
seen in textbooks or something that I saw during
clinical rotations, so these were difficult for me.”
Most of the participants mentioned that what they
learned from clinical encounters during their clerkships
helped them more than lectures to prepare for the
ubiquitous testing. One student mentioned, “I think
lectures that I took in Year 2 courses did not help much.
What I learned during clerkships, when the preceptors
had us do auscultations and explained clinical images to
us so we could interpret them, and by observing surgery in
the operating room, was more helpful.”
The participants also highlighted the need for more
multimedia learning resources to help them prepare for
ubiquitous testing. One stated, “We viewed some
clinical videos during lectures when we were Year 2. I
did not pay a lot of attention to those videos then and
skimmed them over, because they would not be in the
exams. But, now with ubiquitous testing, such video clips
will be very helpful in preparing for this kind of test.”
Yet, most of the participants pointed out the lack of
available multimedia learning resources. As one student
mentioned, “I could do well on the test about ausculta-
tions and anatomical structures that I saw from actual
cases that I encountered during clinical clerkships, but I
don’t think I can do well on others. I cannot come across
every case that I need to know during clinical clerkships,
so it would be more helpful if there are more multimedia
learning resources available for us.”
The participants reflected on their engagement in
clinical rotations during the 3 weeks after they
experienced ubiquitous testing. Some participants men-
tioned that such an experience did not influence their
learning behaviors because they knew multimedia items
would be introduced after they graduate from medical
school. Still, some participants mentioned that their
experience with ubiquitous testing made them think
about their learning behaviors in clinical clerkships. One
participant noted that “This (ubiquitous testing) would
have been a much more difficult test if I had taken it in
Year 1 or 2, but I could answer some of the questions
from what I learned during clinical rotations. So, I was
thinking that I will need to change my attitude toward
learning during clinical clerkships if I take this kind of
test.”
In addition, some participants reported that they
engaged in learning in clinical clerkships more actively
after they experienced ubiquitous testing. One student
mentioned that “I attended clinical rotations in oto-
rhinolaryngology last week and there were a lot of phy-
sical exams done there. Those were something that could
be in multimedia items, so I tried to learn more about the
principles behind those exams. So, I asked the preceptors
questions more often about why the physical exam is
performed and how it is done and I tried to observe the
anatomical structures more closely when I was attending
surgery in the operating room. Now, I try to learn things
more precisely and do not want to skip things that are
not clear to me from what I see during clinical
rotations.”
Discussion
Our study found that the mean test score of the
multimedia group was lower than that of the text group,
although there were no significant differences in item
difficulty or discrimination between the text and multi-
media items. These findings are similar to those found
from studies of computer-based testing [2,3]. Further-
more, we found that the item difficulty was generally
higher in the text items, but it was higher in the
multimedia format in three items, which pertained to the
sonographic structure of the heart, muscle examination, and
patient encounter. These findings are consistent with
those from previous studies that multimedia items are
more difficult than text items when text descriptions
using textbook terminologies are replaced by multimedia
presentations, such as in auscultation findings, and that
multimedia items are easier when multimedia presen-
tations provide richer information on the conditions of
the patient than in text descriptions, such as in a video
clip portraying hip flexion [2,3].
The total response time was longer in the multimedia
group, which is consistent with findings from previous
studies [2,3]. We speculated that the students needed to
devote more time and effort to interpreting information
from multimedia presentations due to their unfamiliarity
with such presentations. This experience may have
motivated the students to reflect upon their learning
behaviors and prompted them to think about how they
should change their learning behaviors to adapt to this
new type of assessment.
Students generally showed positive responses to ubi-
quitous testing. They quickly became comfortable with
this new technology, some even found it interesting to
take a test involving clinical videos, and some felt their
clinical competence could be assessed better in this
testing format. These findings are consistent with those
of Roh et al. [4] who found that students showed positive
reactions to their experience with ubiquitous testing.
However, our study revealed student perceptions that
traditional teaching and learning methods have not
adequately prepared them for this type of assessment.
This highlights the need for medical schools to support
student learning by providing more multimedia re -
sources.
Our study confirms our speculation that ubiquitous
testing can positively affect student learning by re-
inforcing the importance of being able to understand and
apply knowledge in clinical contexts, which drives
students to engage more actively in learning in clinical
settings. Possibly, the method of assessment rather than
the delivery medium influenced the students’ perceptions
and attitudes on learning. Whether it is computer-based
testing or ubiquitous testing, multimedia presentations in
test items make assessment more authentic, which
stimulates student awareness of the importance of being
able to understand and apply knowledge in clinical
contexts. This would help move students up the ladder in
their learning and performance from “knowing” to
“knowing how” in Miller’s pyramid [12].
Three study limitations can be discerned. First, this
study was performed with a small number of students in
one medical school who had had no experience with
computer-based testing and little clinical exposure dur-
ing preclinical years. The extent to which students are
experienced in computer-based testing and exposed to
clinical settings from early on in their medical studies
may differ across medical schools. Therefore, readers
should take such differences into consideration when
generalizing our findings to their own contexts. Second,
participants in our study generally felt comfortable with
ubiquitous testing because they were already familiar
with ubiquitous technologies. Hence, our findings may
not be generalized to students who are new to mobile
technology or in locations with limited access to Wi-Fi or
mobile networks for displaying multimedia content.
Third, the item analyses were conducted with a small
sample of students using a small number of items. We
did not attempt to ensure the statistical power in our
analysis of the quantitative data as it was a pilot study
conducted for the subsequent qualitative study of the
students’ experiences with ubiquitous testing. Future
research is warranted through larger-scale studies to
ensure statistical power and enhance the generalizability
of our findings.
Although our study indicates students’ positive re-
sponses to ubiquitous testing and its potential to make a
positive impact on their perceptions of learning, our
study did not investigate whether such an impact actually
leads to better clinical performance, as this was beyond
the scope of our study. Therefore, future research is
recommended that investigates whether the changes in
student perceptions of and engagement in learning
driven by the adoption of ubiquitous testing lead to
better learning outcomes.
Acknowledgements: None.
Funding: None.
Conflicts of interest: None.
References
1. Peterson MW, Gordon J, Elliott S, Kreiter C. Computer-
based testing: initial report of extensive use in a medical
school curriculum. Teach Learn Med 2004; 16: 51-59.
2. Shen L, Li F, Wattleworth R, Filipetto F. The promise
and challenge of including multimedia items in medical
licensure examinations: some insights from an empirical
trial. Acad Med 2010; 85(10 Suppl): S56-S59.
3. Holtzman KZ, Swanson DB, Ouyang W, Hussie K,
Allbee K. Use of multimedia on the Step 1 and Step 2
clinical knowledge components of USMLE: a controlled
trial of the impact on item characteristics. Acad Med
2009; 84(10 Suppl): S90-S93.
4. Roh H, Lee JT, Rhee BD. Ubiquitous-based testing in
medical education. Med Teach 2015; 37: 302-303.
5. Wood T. Assessment not only drives learning, it may also
help learning. Med Educ 2009; 43: 5-6.
6. Larsen DP, Butler AC, Roediger HL 3rd. Test-enhanced
learning in medical education. Med Educ 2008; 42:
959-966.
7. Huh S. Can computerized tests be introduced to the
Korean medical licensing examination? J Korean Med
Assoc 2012; 55: 124-130.
8. Downing SM. Validity: on meaningful interpretation of
assessment data. Med Educ 2003; 37: 830-837.
9. Stalmeijer RE, McNaughton N, Van Mook WN. Using
focus groups in medical education research: AMEE Guide
No. 91. Med Teach 2014; 36: 923-939.
10. Barbour RS. Making sense of focus groups. Med Educ
2005; 39: 742-750.
11. Pekrun R. The control-value theory of achievement
emotions: assumptions, corollaries, and implications for
educational research and practice. Educ Psychol Rev
2006; 18: 315-341.
12. Miller GE. The assessment of clinical skills/competence/
performance. Acad Med 1990; 65(9 Suppl): S63-S67.
Appendix 1. Focus Group Interview Questions (Only with Multimedia Group Participants)
1. Do you think multimedia items were easier or more difficult than text items?
2. Which do you think can better assess your clinical competence — text or multimedia items?
3. What type of multimedia items were the most difficult for you?
4. In which format do you think you can do better in the test — text or multimedia items?
5. Did you become more or less confident about the test after you took ubiquitous testing?
6. Do you think you can make more mistakes when you take ubiquitous testing than in traditional tests?
7. Did you get more nervous when you took the ubiquitous testing?
8. Did you feel more interested when you took the ubiquitous testing?
9. How did you feel about taking the test using tablets? What were the advantages and challenges?
10. Do you feel the courses that you had taken in the medical school prepared you well for ubiquitous testing? If not, what were those
courses, and what courses helped you the most?
11. Do you think you would prepare for ubiquitous testing the same way as you did with traditional tests? What would you do differently?
12. Do you think the experience of ubiquitous testing has affected your learning behavior in clinical clerkships?