The OSCE

The Objective Structured Clinical Examination (OSCE) as a Determinant of Veterinary Clinical Skills

Margery H. Davis • Gominda G. Ponnamperuma • Sean McAleer • Vicki H.M. Dale
ABSTRACT
The Objective Structured Clinical Examination (OSCE) has become an excellent tool for evaluating many elements of a student's clinical skills, including communication with the patient (human medicine) or client (veterinary medicine); eliciting clinical information from these conversations; some aspects of the physical examination; and many areas of clinical evaluation and assessment. One key strength is that the examination can be structured so that different students' abilities can be compared.
INTRODUCTION
The Objective Structured Clinical Examination (OSCE) has
been used in medicine for 30 years. Since its introduction at
Dundee Medical School, in Scotland, its use has spread
throughout the world. A recent unpublished literature
search carried out in Dundee identified more than 700
published articles on the OSCE. The utility of the OSCE in
medicine is undoubted—but does it have potential for
veterinary medicine?
The concept of OSCEs for veterinary education is not new.
It has been over a decade since the OSCE was recommended
as an alternative to traditional examinations in veterinary
education, which typically stimulate only factual recall.[1]
Nevertheless, only recently has the OSCE been used in some
UK veterinary schools.
This article describes the OSCE and highlights its relevance
to veterinary education.
THE OSCE: A DESCRIPTION
The exam consists of a number of stations, usually 14 to 18,[2] that candidates rotate through. At each station the candidate
is asked to carry out a specific task within an allotted period.
The exam is finished when all candidates have completed
all stations. Figure 1 shows the general format of a 15-station
OSCE. OSCEs with a larger or smaller number of stations
have a similar format.
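To make the rotation logistics concrete, the sketch below (not from the article; the station and candidate counts are illustrative) shows the scheduling rule implied by a circuit of stations: with N stations and at most N candidates per run, candidate c starts at station (c mod N) and moves forward one station per time slot, so every candidate completes every station exactly once.

```python
# Illustrative sketch only: the rotation logic behind an OSCE circuit.
# Candidate c starts at station (c % n_stations) and advances one station
# after each time slot, so all candidates visit all stations once.

def rotation_schedule(n_stations: int, n_candidates: int) -> list[list[int]]:
    """Return schedule[slot][candidate] = station index (0-based)."""
    if n_candidates > n_stations:
        raise ValueError("a single run cannot take more candidates than stations")
    return [
        [(candidate + slot) % n_stations for candidate in range(n_candidates)]
        for slot in range(n_stations)
    ]

if __name__ == "__main__":
    # 15 stations and 15 candidates, as in the Figure 1 example.
    for slot, placements in enumerate(rotation_schedule(15, 15), start=1):
        print(f"slot {slot:2d}:", placements)
```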
The OSCE may need to be run on several sites at the same
time, or repeated, to accommodate the total number of
candidates. It is advisable to restrict the number of runs to
two for test-security reasons. The candidates for the second
run should be sequestered before the end of the first run;
they can be briefed while the first run is in progress. This
will prevent contamination of the second-run candidates
with exam information from first-run candidates.
All candidates go through the same stations. The OSCE
stations are selected to reflect the curriculum content and
course outcomes and are planned using an assessment
blueprint.[3] The purpose of the blueprint is to ensure that all
outcomes are assessed and that the curriculum content is
appropriately sampled. An example of an exam blueprint
for the final-year small-animal clinical studies course at the
University of Glasgow, Scotland, is shown in Table 1. Ideally
there should be at least one checkmark in each column and
each row.
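A blueprint of this kind can be checked mechanically. The sketch below is illustrative only (the content areas and outcomes are hypothetical, abbreviated versions of those in Table 1); it flags any outcome row or content-area column that no planned station covers.

```python
# Illustrative sketch: check that every outcome (row) and every content
# area (column) of an assessment blueprint is sampled by at least one station.

def blueprint_gaps(blueprint: dict[str, set[str]], content_areas: list[str]):
    """blueprint maps each outcome to the set of content areas its stations cover."""
    empty_rows = [outcome for outcome, areas in blueprint.items() if not areas]
    covered = set().union(*blueprint.values()) if blueprint else set()
    empty_columns = [area for area in content_areas if area not in covered]
    return empty_rows, empty_columns

# Hypothetical, partial example for illustration only.
areas = ["Cardiopulmonary", "Neurology", "Dermatology", "Dentistry"]
plan = {
    "Communication skills": {"Cardiopulmonary", "Dermatology"},
    "Practical skill": {"Dentistry"},
    "Data and image interpretation": set(),
}
rows, cols = blueprint_gaps(plan, areas)
print("outcomes with no station:", rows)     # -> ['Data and image interpretation']
print("content areas never sampled:", cols)  # -> ['Neurology']
```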
Each station is built around a specific task that the candidate
must accomplish within a standard period.[4] Most stations
last between five and 10 minutes, but stations of up to 20
minutes are not uncommon.
If a particular task cannot be accomplished in the standard
station time, linked or double stations may be employed.
Linked stations are sequential stations at the first of which
the candidate carries out a task or prepares for the second
station, where he or she builds on the findings of the first
station or answers questions related to it. Double stations
require doubling of resources, including examiners, and
take twice the standard station time. A staggered start is
required for both linked and double stations, as candidates cannot start halfway through the task.

Figure 1: The format of a 15-station OSCE

Table 1: An assessment blueprint for an OSCE in veterinary medicine. The rows of the blueprint are the competences to be assessed: communication skills (history taking; client education; explanation of a condition, treatment, or investigation); clinical exam, technique, or interpretation (on a live animal or cadaver, from photos or video); practical skill (theater and surgical skills, urine exam, microscopy, fine needle aspirate, lab techniques); data and image interpretation (biochemistry, hematology, urinalysis, ECG, radiographs, ultrasound); and other. The columns are the content areas: internal medicine, cardiopulmonary, neurology, soft tissue, ophthalmology, orthopedics, dermatology, diagnostic imaging, oncology, anesthesia/intensive care/fluid therapy, dentistry, behavioral problems, caged pets and exotics, vaccination/parasite control/zoonoses, legislation/prescription writing/ethics, and other. Checkmarks in the body of the table indicate the content areas sampled for each competence.
Candidates may be observed by one or more examiners
while carrying out the task at the station; this is a manned
station. Each examiner is equipped with a checklist or a
rating scale. The checklist is constructed of items that the
candidate should carry out at the station. The checklist
should not be overly long, as it becomes unwieldy to
complete; it is important to remember that only the key or
core items, as identified by the subject experts, should be
included in the checklist. Global rating scales may also be
used, as well as or instead of checklists, and have been
shown to be more reliable than checklists.[5] Rating scales give predetermined descriptors for each point on the scale, which provide an accurate description of candidate behavior for a given rating.[5] The use of checklists and rating scales for each station is the reason for the objective descriptor in the name "OSCE."
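As an illustration of how a manned station's paperwork might be scored (a hypothetical sketch, not the authors' or any school's actual marking scheme), the snippet below pairs a key-item checklist with a global rating whose points carry predetermined descriptors.

```python
# Hypothetical sketch: combining a key-item checklist with a global rating
# whose scale points have predetermined behavioural descriptors.

from dataclasses import dataclass

GLOBAL_SCALE = {                     # illustrative descriptors only
    1: "unsafe / key steps omitted",
    2: "borderline",
    3: "competent",
    4: "fluent and well organised",
}

@dataclass
class StationResult:
    checklist: dict[str, bool]       # key item -> carried out?
    global_rating: int               # 1-4 on the scale above

    def checklist_score(self) -> float:
        """Fraction of key items completed."""
        return sum(self.checklist.values()) / len(self.checklist)

# Marking one (invented) candidate at one station.
result = StationResult(
    checklist={"washes hands": True, "restrains animal safely": True,
               "explains procedure to client": False},
    global_rating=3,
)
print(f"checklist: {result.checklist_score():.0%}, "
      f"global: {GLOBAL_SCALE[result.global_rating]}")
```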
There may, however, be stations where the candidate is
unobserved (unmanned stations); here a paper-based answer
is required, or the candidate provides responses at the next
station (linked station). The candidates may be given
answer books for unmanned stations, which they carry
with them round the exam and submit for marking at the
end. Alternatively, they can complete an answer sheet at
each unmanned station and put it into a "post box" at the
station, usually a cardboard box with a posting slit that
allows input of answer sheets but not their removal.
The sampling of curriculum content and outcomes as demonstrated by the blueprint, the use of checklists and rating scales, and the fact that all candidates see the same patients contribute to the objective nature of the exam; and because a wide range of clinical skills can be assessed within the OSCE framework, the OSCE is a clinical exam.
The OSCE is unique among examination formats because it
provides a framework to assess the hands-on skills of the
candidate in a simulated setting. Hence, it assesses the
candidate at the level of competence, as opposed to assessing
skills in real-life settings, which is assessing at the level of
performance.[6, 7]
The following description of the OSCE considers the six
questions that must be addressed by any assessment: why;
what; when; where; by whom; and how.[8]
Why?
Traditionally, clinical skills in medicine were assessed using
long and short cases.
In the long case, the candidate spends a fixed period of time
with a real patient, as opposed to a simulated patient, to
take a comprehensive clinical history and carry out a
physical examination. The examiners later assess the
candidate using oral questions based on the patient that
the candidate examined. Different candidates see different
patients, and different examiners assess different candi-
dates. As a result, the questions put to candidates vary.
Some candidates may be presented with an "easy" patient, while others receive a more "difficult" one; that is, the level of difficulty of the assessment varies. Appropriate sampling of
the curriculum is not possible with one long case, which
contributes to the lack of reliability and lack of content
validity of the results of this type of test.
In the short case, one or two examiners directly observe the
candidate carrying out a particular task (e.g., eliciting a
physical sign or examining an organ system). All candi-
dates, however, are assessed neither with the same patients
nor by the same examiners. Though sampling of the
curriculum content can be improved by using multiple
short cases, the variability of cases seen by the candidates
adversely affects test reliability.
In 1969 the fairness of long- and short-case exams for
individual candidates was queried.[9] In a landmark article,
Wilson and colleagues showed that an individual candidate
carrying out the same task was scored differently by
different examiners. The range of scores assigned to the
same candidate by different examiners viewing the same
actions (i.e., poor inter-rater reliability) rendered the assess-
ment unreliable.
The OSCE was introduced to overcome the above problems
pertaining to exam content variability, subjectivity, and
fairness of the long and the short case.[10] These issues were
addressed by structuring OSCE stations to ensure that all
candidates go through the same test material, or test
material of a similar level of difficulty, and are scored
using the same criteria (e.g., pre-validated checklists or
rating scales) by the same, or similar, examiners.
It has recently been recognized that some traditional
examinations in veterinary medicine lack the reliability
and objectivity of the OSCE, and, therefore, some schools in
the UK have implemented the OSCE as a fairer form of
assessment.
What?
What can be tested in the OSCE is limited only by the
imagination of the test designers. Stations can be developed
to test course outcomes such as clinical skills (e.g., chest
examination); practical procedures (e.g., urine testing);
patient investigations (e.g., interpretation of a chest radio-
graph); patient management (e.g., completing a prescription
sheet); communication skills (e.g., breaking bad news); IT
(information technology) skills (e.g., accessing a database);
and providing patient education (e.g., lifestyle advice).
Knowledge is invariably assessed by the OSCE.[11, 12] Though
marks are not usually given for answering knowledge-
based questions, without the underlying knowledge and
understanding the candidate cannot carry out the instruc-
tions at each station. Critical thinking and clinical judgment
are the other cognitive skills that can be assessed in an
OSCE.
The OSCE can also be used to assess attitudes[13] and professionalism, but both of these can be more conveniently assessed using the portfolio framework.[14]
The OSCE has been described in a range of medical specialties: family medicine,[15] general practice,[16] surgery,[17] pediatrics,[18] internal medicine,[19] obstetrics and gynecology,[20] emergency medicine,[21] psychiatry,[22] oncology,[23] and anesthesiology.[24] As basic training in medicine becomes increasingly integrated,[25, 26] so does its assessment, and the OSCE framework supports integrated assessment.
In the UK, given the similarities[27] between the General Medical Council's "principles of professional practice"[28] and the Royal College of Veterinary Surgeons' "10 guiding principles," it seems that the OSCE has as much relevance to the assessment of professional skills in veterinary education as it does in medical education.
Shown below are examples of OSCE stations for veterinary
students at the University of Glasgow, first introduced in the
2003/2004 academic session. Each example has four parts:
instructions to the candidate; instructions to the examiner;
checklist; and marking scheme. The second example also
includes an equipment list, an essential component when
equipment is required at a station.
The first example (Figure 2) assesses communication skills
related to a clinical scenario. The second example (Figure 3)
assesses candidate competence in a practical procedure.
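The four-part station structure described above (plus the optional equipment list) can be captured in a simple data structure. The sketch below is a hypothetical illustration, not the Glasgow template; the field names and example content are invented.

```python
# Hypothetical sketch of the four-part OSCE station structure described in
# the text: candidate instructions, examiner instructions, checklist, and
# marking scheme, plus an optional equipment list.

from dataclasses import dataclass, field

@dataclass
class OSCEStation:
    candidate_instructions: str
    examiner_instructions: str
    checklist: list[str]                 # key/core items only
    marking_scheme: dict[str, int]       # itemized marks per checklist item
    equipment: list[str] = field(default_factory=list)

    def max_marks(self) -> int:
        return sum(self.marking_scheme.values())

# Invented station outline for illustration only.
station = OSCEStation(
    candidate_instructions="Explain to the client how to give the prescribed tablets.",
    examiner_instructions="Observe only; score each checklist item as done/not done.",
    checklist=["greets client", "checks understanding", "explains dosing schedule"],
    marking_scheme={"greets client": 1, "checks understanding": 2,
                    "explains dosing schedule": 2},
    equipment=["tablet bottle (prop)"],
)
print("maximum station marks:", station.max_marks())   # -> 5
```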
When?
The OSCE can be used as a pre-test, in-course test, or
post-test.
The pre-test assesses baseline student competence before the
course begins and can provide a student profile. Courses
can then be customized to meet individual student needs.
Comparing the results of the pre-test with those of
subsequent tests provides information about the course,
the student, and the teaching.
The OSCE has been a useful in-course assessment, from the
standpoint of both formative assessment (i.e., providing
feedback to the student) and course evaluation (i.e.,
providing feedback to the instructor). OSCE checklists can
be used by students for peer and self-assessment (i.e.,
formative assessment). Involving students in developing OSCE checklists may be one way of helping them learn and encouraging them to engage in self- and peer assessment. A mid-course OSCE will also provide feedback to instructors on how successful their teaching has been and what areas they should concentrate on in future.
The OSCE has been used mostly as a post-test in medical
education. Over the years its objective and structured nature
has ensured its appropriateness for summative assessment
purposes (i.e., where passing or failing the exam has
consequences for the candidate).
Within the continuum of medical education, the OSCE has been used in undergraduate,[29] post-graduate,[30] and continuing[31] medical education.
In health professional education, a variety of professions such as medicine, dentistry,[32, 33] veterinary medicine, nursing,[34] midwifery,[35] para-medical services,[36] medical laboratory technology, pharmacy,[37] and radiography have used the OSCE framework for their assessment purposes.
Relevant articles in this issue demonstrate the use of the
OSCE mainly as a post-test, with in-course (mock) tests used
to prepare students at various undergraduate levels in
veterinary medicine for their end-of-year examination.
Where?
Initially, OSCEs were held in hospital wards. However,
experiences such as patient cardiac arrests during the exam
have moved the OSCE away from the wards to Clinical
Skills Centres, empty outpatient departments (OPDs), or
specially designed test centers. What all these venues have
in common is that they can accommodate several stations,
such that candidates can rotate around the different stations
in sequence without overhearing the candidates at adjacent
stations. If the stations are set up within small cubicles,
they can simulate the real-life environment of a clinic or
OPD and ensure privacy for intimate procedures. Some test
centers have simulated emergency rooms or patient home
environments.
One UK veterinary school, the Royal Veterinary College in London, already has its own clinical skills facility for students to practice their skills, which doubles as an OSCE site.[28, 38] Other UK veterinary schools are following this example.
By whom?
Test designers, on-site examiners, site organizer(s), time-
keeper(s), secretarial staff for computerized collation of
results (e.g., data entry, handling optical mark reader), and
portering staff are all involved in the OSCE process.
Depending on the content tested at the station, OSCE
examiners may be clinicians, pre-clinical instructors, stan-
dardized or simulated patients, or other health care
professionals (e.g., nurses, paramedics, social workers).
While clinicians are needed for OSCE stations assessing
clinical skills, patients and other health professionals are
often used for stations assessing outcomes such as commu-
nication skills, empathy, attitudes, and professionalism.
There are two important prerequisites for all examiners:
they need to be trained in using the checklist or rating scale
and familiar with the topic assessed at the station.
How?
Box 1 outlines the steps to be followed in designing an
OSCE station. The guidelines for implementing an OSCE are
shown in Box 2.
BOX 1: A GUIDE TO DEVELOPING AN OSCE STATION

Selecting a Topic for an OSCE Station
• Clearly define the "clinical task" (e.g., interpreting an x-ray) that is to be assessed by the station. Have a clear idea of its place in the assessment blueprint (i.e., what are the content area(s) and outcome(s) that the "exam task" is assessing?).
• Identify a clinical scenario that will provide sufficient information for the candidate to carry out the clinical task within the stipulated time limit.
• Identify the different assessment tools (e.g., checklist, rating scale, global rating) that can be used to assess the candidate and choose the most suitable tool.

Developing an OSCE Station
• Document the layout of the station (if needed, with clear diagrams).
• Construct the question(s) that the candidate will be asked.
• Develop instructions to the candidate to explain what the candidate is required to do at the station.
• Develop the checklist/rating scale and validate it, with the help of one or more colleagues.
• Develop a script for the patient/standardized patient/actor (if necessary).
• Develop instructions to the patient/standardized patient/actor.
• Construct a marking scheme for the checklist, with itemized marks for each step.
• Design a marking sheet, paying attention to ease of use.
• Develop instructions to the examiner(s).
• List the equipment needed for the station (e.g., mannequin, IV drip set, ophthalmoscope).
• Test-run (pilot) the station.

Before the Exam
These are mainly responsibilities of the central examination unit, in consultation with the examiners.
• Agree the question weightings and pass mark with the exam board (i.e., set standards).
• Develop instructions to the supervisors, invigilators, or exam implementers to
  - sequence the stations (i.e., decide the station numbers; e.g., it may be convenient to place a hand-washing and rest station after a station involving an intimate examination, and if two stations are linked, one should immediately follow the other numerically);
  - configure the equipment/furniture;
  - identify how the arrows to and from each exam station should be positioned (it is important that the stations form a circuit, with no crossover points);
  - identify linked and double stations that require a staggered start;
  - decide when to distribute answer books, candidate instructions, and examiner checklists/rating scales and when to transport patients to the exam venue;
  - decide how and when the marking sheets will be collected (it is sometimes useful to collect checklists from examiners during the exam to facilitate data entry).
• Draw up a map of the examination venue, indicating the position of each exam station, with arrows indicating the direction in which candidates should proceed.
• Draw up a timetable for the whole exam.

BOX 2: A GUIDE TO ORGANIZING AND IMPLEMENTING THE OSCE

Before the OSCE
• Start early.
• Design a blueprint.
• Identify topics.
• Identify the number of candidates to be assessed and the number of venues and runs; allocate candidates to venues/runs.
• Fix the time and date.
• Book venues.
• Route the exam.
• Notify the candidates: when and where to turn up; nature of the exam.
• Appoint examiners and brief them (remember to appoint stand-by examiners).
• Brief support staff.
• Appoint venue coordinators/site managers and brief them.
• Arrange resources for cases within stations and portering of the resources (equipment, real or standardized patients, radiographs, etc.).
• Prepare documentation: note the format of the exam material within stations, draw up the exam site plan, copy and collate paperwork, label candidate details.
• Provide a complete set of exam material for site coordinators and brief them.
• Set up the stations well in advance of the exam.
• Remember to order refreshments for examiners, candidates, and patients.

On the Day of the OSCE (Tasks for the Site Coordinator)
• Arrive at least one hour before the exam is due to start.
• Check each station for completeness.
• Note candidate absentees.
• Brief candidates.
• If there is a second run, sequester the second-run candidates by taking them in for briefing before the end of the first run to prevent contamination.
• Start the exam and the timer.
• Oversee the conduct of the exam.
• Collect scoring sheets.
• Gather preliminary examiner feedback.
• Oversee dismantling of the exam and the safe return of patients.

Figure 2: Example OSCE Station 1, from a fourth-year companion-animal studies examination

Figure 3: Example OSCE Station 2, from a final-year small-animal clinical-studies examination

ADVANTAGES AND DISADVANTAGES
The main advantages of the OSCE over other clinical exams are that it provides
• a framework to assess a representative sample of the curriculum, in terms of content and outcomes, within a reasonable period of time;
• the same assessment material to all candidates;
• an opportunity to score all candidates using the same assessment criteria (i.e., the same pre-validated and structured checklists and/or rating scales);
• an assessment that uses trained examiners.
These advantages make the OSCE high in validity and reliability.[39] Thus, it is suitable for high-stakes summative examinations.
The main disadvantage of the OSCE compared with the traditional long case is that the OSCE does not allow assessment of the candidate's holistic approach to a clinical case or patient.[7] The OSCE is thus criticized for fragmentation of skills. It is important that a learner's ability to carry out a full history and physical examination is assessed and that the trainee be given feedback; the Clinical Evaluation Exercise (CEX) or mini-CEX[40] provides an instrument for such assessment.
The other main disadvantage is the cost of organizing an
OSCE. These costs mainly relate to the examiners’ and test
designers’ time and that of the simulated patients (SPs), if
paid SPs are used. If the rental of a test centre is necessary,
these costs may also be substantial. Organizing the
examination involves considerable effort. Studies on the
OSCE have shown, however, that the logistics are achievable, even for national examinations.[41] Although the exam is
costly, its cost effectiveness is high in terms of the
information it provides.
SUMMARY
Van der Vleuten's formula[42] can be used to evaluate the
utility of the OSCE. The formula suggests that the utility of
an assessment is the function of its validity; reliability;
acceptability; educational impact; cost effectiveness; and
practicability.
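The article states the formula in words only. A common way of writing van der Vleuten's utility equation, adapted here (as an assumption) to the six criteria the authors list, is a weighted product, so that a very low value on any one criterion drags the overall utility down:

$$U \;=\; w_V V \times w_R R \times w_A A \times w_E E \times w_C C \times w_P P$$

where $V$, $R$, $A$, $E$, $C$, and $P$ denote validity, reliability, acceptability, educational impact, cost effectiveness, and practicability, and the weights $w$ express the relative importance attached to each criterion in a particular context.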
If suitably designed to test candidate competence in a range
of curriculum outcomes, the OSCE is a valid assessment.
However, since it is conducted under simulated examina-
tion conditions, it does not provide valid information on the
candidate’s ability to perform the skill in real-life situations.
The performance level of Dutch general practitioners, for
example, has been shown to be lower than their competence
level as assessed via OSCE.[43]
The reliability of an exam tends to be related to the length of
testing time.[44] The structured examination format, with
wide sampling, lasting approximately one-and-a-half to
two-and-a-half hours, makes the OSCE more reliable than
the long and short case formats.
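The relationship between testing time and reliability referred to here is conventionally quantified by the Spearman-Brown prophecy formula (a standard psychometric result, not given explicitly in this article): if a test with reliability $\rho$ is lengthened by a factor $k$, for example by adding further stations of comparable quality, the predicted reliability is

$$\rho_k \;=\; \frac{k\rho}{1 + (k-1)\rho},$$

which increases toward 1 as $k$ grows.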
Owing to its structured nature, which allows every
candidate to be tested under the same conditions, the
OSCE is highly acceptable to students, who appreciate its
fairness.[45] It therefore has high face validity.
The OSCE requires the examiners to directly observe the
candidate carrying out clinical skills. Thus, the examination
highlights the importance of clinical skills to students. Since
the exam material represents a wide sample of the
curriculum and a vast range of skills can potentially be
assessed, students cannot risk ignoring clinical skills. It thus
has a positive educational impact.
The OSCE is demanding in terms of resources. The high cost, however, is justified by the information the exam provides about the clinical competence of the candidate, making it cost effective.
Designing an OSCE station is a skilled activity, and
organizing the examination involves considerable effort.
The returns, however, have proved the OSCE to be a
worthwhile exercise, as judged by the many institutions
from Europe, North and South America, Asia, Australasia,
and Africa reporting their OSCEs in the literature.
REFERENCES
1 Weeks BR, Herron MA, Whitney MS. Pre-clinical
curricular alternatives: method for evaluating student
performance in the context of clinical proficiency. J Vet Med
Educ 20: 9–13, 1993.
2 American Council for Graduate Medical Education
[ACGME], American Board of Medical Specialties [ABMS].
Objective structured clinical examination (OSCE). In Toolbox
of Assessment Methods, version 1.1 <http://www.acgme.org/
Outcome/assess/Toolbox.pdf>. Accessed 10/03/06.
ACGME Outcomes Project, 2000 p7.
3 Newble D, Dawson B. Guidelines for assessing clinical
competence. Teach Learn Med 6: 213–220, 1994.
4 Selby C, Osman L, Davis M, Lee M. How to do it: set up
and run an objective structured clinical exam. Brit Med J 310:
1187–1190, 1995.
5 Hodges B, McIlroy JH. Analytical global OSCE ratings
are sensitive to level of training. Med Educ 37: 1012–1016,
2003.
6 Miller G. The assessment of clinical skills/competence/
performance. Acad Med 65: S63–S67, 1990.
7 Van der Vleuten CPM. Validity of final examinations in
undergraduate medical training. Brit Med J 321: 1217–1219,
2000.
8 Harden RM. How to ...Assess students: an overview.
Med Teach 1: 65–70, 1979.
9 Wilson GM, Lever R, Harden RMcG, Robertson JIS,
MacRitchie J. Examination of clinical examiners. The Lancet
January 4: 37–40, 1969.
10 Harden RM, Stevenson M, Downie WW, Wilson GM.
Assessment of clinical competence using objective
structured examination. Brit Med J 1: 447–451, 1975.
11 Coovadia HM, Moosa A. A comparison of traditional
assessment with the objective structured clinical examina-
tion (OSCE). S Afr Med J 67: 810–812, 1985.
12 Norman G. Editorial: inverting the pyramid. Adv Health
Sci Educ 10: 85–88, 2005.
13 Davis MH, Harden RM, Pringle S, Ledingham I.
Assessment and curriculum change: a study of outcome
[abstract]. In: Abstract book: The 7th Ottawa International
Conference on Medical Education and Assessment, 25–28 June
1996, Faculty of Medicine, University of Limburg. Maastricht:
Faculty of Medicine, University of Limburg/Dutch
Association for Medical Education, 1996:104.
14 Davis MH, Ponnamperuma GG. Portfolio assessment.
J Vet Med Educ 32: 279–284, 2005.
15 Chessman AW, Blue AV, Gilbert GE, Carey M,
Mainous AGIII. Assessing students’ communication and
interpersonal skills across evaluation settings. Fam Med 35:
643–648, 2003.
16 Kramer AW, Jansen KJ, Dusman H, Tan LH, van der
Vleuten CP, Grol RP. Acquisition of clinical skills in
post-graduate training for general practice. Brit J Gen Pract
53: 677–682, 2003.
17 Yudkowsky R, Alseidi A, Cintron J. Beyond fulfilling
the core competencies: an objective structured clinical
examination to assess communication and interpersonal
skills in a surgical residency. Current Surgery 61: 499–503,
2004.
18 Hafler JP, Connors KM, Volkan K, Bernstein HH.
Developing and evaluating a residents’ curriculum. Med
Teach 27: 276–282, 2005.
19 Auewarakul C, Downing SM, Praditsuwan R,
Jaturatamrong U. Item analysis to improve reliability for an
internal medicine undergraduate OSCE. Adv Health Sci Educ
10: 105–113, 2005.
20 Windrim R, Thomas J, Rittenberg D, Bodley J, Allen V,
Byrne N. Perceived educational benefits of objective
structured clinical examination (OSCE) development and
implementation by resident learners. J Obstet Gynaecol
Canada 26: 815–818, 2004.
21 Johnson G, Reynard K. Assessment of an objective
structured clinical examination (OSCE) for undergraduate
students in accident and emergency medicine. J Accid Emerg
Med 11: 223–226, 1994.
22 Park RS, Chibnall JT, Blaskiewicz RJ, Furman GE,
Powell JK, Mohr CJ. Construct validity of an
objective structured clinical examination (OSCE) in
psychiatry: associations with the clinical skills
examination and other indicators. Acad Psychiatr 28:
122–128, 2004.
23 Reddy S, Vijayakumar S. Evaluating clinical skills of
radiation oncology residents: parts I and II. Int J Cancer 90:
1–12, 2000.
24 Hanna MN, Donnelly MB, Montgomery CL, Sloan PA.
Perioperative pain management education: a short struc-
tured regional anesthesia course compared with traditional
teaching among medical students. Region Anesth Pain Med
30: 523–528, 2005.
25 General Medical Council [GMC]. Tomorrow's Doctors: Recommendations on Undergraduate Medical Education. London: GMC, 1993.
26 GMC. Tomorrow's Doctors: Recommendations on Undergraduate Medical Education. London: GMC, 2003.
27 Quentin-Baxter M, Spencer JA, Rhind SM. Working in parallel, learning in parallel? Vet Rec 157: 692–695, 2005.
28 GMC. Good Medical Practice. London: GMC, 2001.
29 Davis MH. OSCE: the Dundee experience. Med Teach 25:
255–261, 2003.
30 Taylor A, Rymer J. The new MRCOG Objective
Structured Clinical Examination: the examiners evaluation. J
Obstet Gynaecol 21: 103–106, 2001.
31 Harrison R. Revalidation: the real life OSCE. Brit Med J
325: 1454–1456, 2002.
32 Mossey PA, Newton JP, Stirrups DR. Scope of the OSCE
in the assessment of clinical skills in dentistry. Brit Dent J
190: 323–326, 2001.
33 Schoonheim-Klein M, Walmsley AD, Habets L, van der
Velden U, Manogue M. An implementation strategy for
introducing an OSCE into a dental school. Europ J Dent Educ
9: 143–149, 2005.
34 Bartfay WJ, Rombough R, Howse E, Leblanc R.
Evaluation: the OSCE approach in nursing education. Can
Nurs 100: 18–23, 2004.
35 Govaerts MJ, van der Vleuten CP, Schuwirth LW.
Optimising the reproducibility of a performance-based
assessment test in midwifery education. Adv Health Sci Educ
7: 133–145, 2002.
36 Rao SP, Bhusari PS. Evaluation of disability knowledge
and skills among leprosy workers. Indian J Leprosy 64:
99–104, 1992.
37 Sibbald D, Regehr G. Impact on the psychometric
properties of a pharmacy OSCE: using 1st-year
students as standardized patients. Teach Learn Med 15:
180–185, 2003.
38 Yamagishi BJ, Welsh PJK, Pead MJ. The first veterinary clinical skills centre in the UK. Res Vet Sci 78(Suppl. A): 8–9, 2005.
39 Roberts J, Norman G. Reliability and learning from the
objective structured clinical examination. Med Educ 24:
219–223, 1990.
40 Norcini JJ, Blank LL, Duffy FD, Fortna GS. The
mini-CEX: a method for assessing clinical skills. Ann Intern
Med 138: 476–481, 2003.
41 Reznick R, Smee S, Rothman A, Chalmers A, Swanson D, Dufresne L, Lacombe G, Baumber J, Poldre P,
Levasseur L, Cohen R, Mendez J, Patey P, Boudreau D,
Berard M. An objective structured clinical examination for
the licentiate: report of the pilot project of the Medical
Council of Canada. Acad Med 67: 487–494, 1992.
42 Van der Vleuten CPM. The assessment of professional
competence: developments, research and practical
implications. Adv Health Sci Educ 1: 41–67, 1996.
43 Rethans J, Sturmans F, Drop M, van der Vleuten C.
Assessment of performance in actual practice of general
practitioners by use of standardised patients. Brit J Gen Pract
41: 97–99, 1991.
44 Swanson DB. A measurement framework for
performance-based tests. In Hart IR, Harden RM, eds.
Further Developments in Assessing Clinical Competence
[proceedings of the international conference, June 27–30,
1987, Congress Centre, Ottawa, Canada]. Montreal:
Can-Heal Publications, 1987:13–45.
45 Pierre RB, Wierenga A, Barton M, Branday JM,
Christie CD. Student evaluation of an OSCE in paediatrics at the University of the West Indies, Jamaica <http://www.biomedcentral.com/1472-6920/4/22>. BMC Med Educ 4(22), 2004.
AUTHOR INFORMATION
Margery Davis, MD, MBChB, FRCP, ILTM, is Professor of
Medical Education and Director of the Centre for Medical
Education at the University of Dundee, Tay Park House, 484
Perth Road, Dundee DD2 1LR Scotland, UK. E-mail:
m.h.davis@dundee.ac.uk.
Gominda Ponnamperuma, MBBS, Dipl. Psychology, MMedEd,
is Lecturer in Medical Education at the Faculty of Medicine,
University of Colombo, P.O. Box 271, Kynsey Road, Colombo 8,
Sri Lanka, and Researcher at the Centre for Medical Education,
University of Dundee, Tay Park House, 484 Perth Road, Dundee
DD2 1LR Scotland, UK.
Sean McAleer, BSc, DPhil, ILTM, is Senior Lecturer in Medical
Education at the University of Dundee, Tay Park House, 484
Perth Road, Dundee DD2 1LR Scotland, UK. E-mail:
j.p.g.mcaleer@dundee.ac.uk.
Vicki H.M. Dale, BSc, MSc, ILTM, is an Educational
Technologist at the Faculty of Veterinary Medicine, University
of Glasgow, Glasgow G61 1QH Scotland, UK. E-mail:
v.dale@vet.gla.ac.uk.